- ...of experience building data pipelines in cloud environments ~4+ years of experience with Big Data technologies (e.g., Spark, Hadoop) and cloud architecture ~3+ years of experience with reporting and analytics tools (e.g., Tableau, Power BI) ~ Hands-on... (Long term contract)
- ...Preferred qualifications: Experience with streaming APIs such as Kafka. Understanding of Big Data/Data Lake technologies (Spark, Hadoop, Databricks, etc.). Understanding of design patterns and clean coding. Understanding of technical aspects of analytic applications. Familiarity... (Local area)
- ...Familiarity with market risk concepts including VaR, Greeks, scenario analysis, and stress testing. ~ Hands-on experience with Hadoop, Spark. ~ Proficiency with Git, Jenkins, and CI/CD pipelines. ~ Excellent problem-solving skills and strong mathematical and analytical...
- ...Develop and optimize ETL/ELT processes, data flows, and infrastructure leveraging AWS, SQL, and platforms such as Databricks, Redshift, Hadoop, and Airflow. Assemble and manage large, complex datasets to meet functional and non-functional business requirements.... (Long term contract; Remote work)
- ...~ Hands-on experience with AWS services (S3, Glue, Redshift, Lambda, EMR) ~ Experience with big data tools like Apache Spark, Hadoop, or Kafka ~ Knowledge of Machine Learning concepts and model lifecycle ~ Experience with ETL/ELT frameworks and data pipeline... (Remote work)
- ...or technically related discipline. Minimum of 5+ years of relevant industry experience. Minimum 5 years of experience developing in the Hadoop ecosystem (Spark, PySpark, MapReduce, Hive, Impala). Minimum 5 years of experience with common application frameworks (JEE, Spring... (Remote work)
- ...SQL; relational (PostgreSQL, MySQL) and NoSQL (MongoDB) experience. Cloud & Big Data: AWS/Azure/Google Cloud Platform, Spark, Hadoop, scalable storage (S3, Blob, HDFS). Ready to Apply? Take the next step in your data engineering career with this exciting opportunity... (Contract work)
- ...Must be local to Reston; no relocation. On-site 3 days a week. Top 5 technical skills: Python (big data pipeline), AWS, Hadoop, Spark, Hive, EMR, Terraform. Job description: Strong Python development to build a big data pipeline for data processing and analysis. Need strong experience... (Contract work; Work experience placement; Local area; 3 days per week)
- ...solutions in virtualized environments. ~ Experience in one or more of the following: Networking, DevOps, Security, Compute, Storage, Hadoop, Kubernetes, or SRE. ~ Networking: Experience delivering end-to-end networking designs and architectures that include DNS,... (Contract work; Remote work)
- ...specifically AWS. Preferred qualifications: Experience with MLOps tools (e.g., MLflow, Kubeflow). Knowledge of big data technologies (Spark, Hadoop, Databricks). Background in NLP, computer vision, or other advanced AI techniques. Relevant certifications (Coursera, edX, AWS,...
- ...amounts of real-world data. Experience retrieving and manipulating data from a variety of data sources, including DB2, Oracle, SQL Server, Hadoop, and flat files. Experience with database management systems (e.g., PostgreSQL, MySQL, SQLite, etc.). Excellent analytical... (Contract work; Work experience placement)
$118.7k - $218.6k
...Microsoft SQL Server, Oracle, MongoDB o Reporting Tools (e.g., Business Objects, Reporting Services) o Big Data Tools (e.g., Hadoop, Hive, Pig, Impala, Mahout) o Machine Learning/Cognitive (e.g., Python, Mahout, CognitiveScale) o Data Visualization Tools (e.g... (Visa sponsorship)
- ...Senior Software Engineer experience in the Hadoop ecosystem (Spark, PySpark, MapReduce, Hive, Impala). Strong background in application frameworks such as JEE, Spring Boot, Struts, and Hibernate. Proficiency with relational databases (MS SQL Server preferred, Oracle,...
- ...Develop the best possible, most robust, and extensible solutions from feature requests. Work with big data technology (Kafka, Hadoop, Spark, etc.). Work with data scientists to develop rich value-added features. Work with DBAs to create ETL and data warehouse... (Remote work)
- ...technically related discipline. Minimum of 5+ years of relevant industry experience. Minimum 5 years of experience developing in the Hadoop ecosystem (Spark, PySpark, MapReduce, Hive, Impala). Minimum 5 years of experience with common application frameworks (JEE... (Remote work)
$144.2k - $265.6k
...profitability analysis. Experience with advanced analytics or programming, including but not limited to: Big Data tools (Hadoop, Hive, Pig, Impala); Reporting tools (Business Objects, Reporting Services); Machine learning/cognitive (Python, Mahout,... (Visa sponsorship)
- ...Experience with natural language processing (NLP), computer vision, or other AI techniques. Familiarity with big data technologies (Hadoop, Spark) and cloud platforms (AWS, Azure, Google Cloud). Strong analytical skills and the ability to work with complex datasets.... (Remote work)
- ...Spring Batch. Strong experience with Google Cloud Platform (GKE, BigQuery, Cloud Storage, IAM). Hands-on with Dataproc (Spark/Hadoop) and Composer (Airflow). Strong experience in Angular. Expertise in distributed systems, scalability, and system design... (Remote work)
- ...platforms. The ideal candidate will have deep expertise in distributed systems, cloud platforms, and modern big data technologies such as Hadoop and Spark. Responsibilities: Design, develop, and maintain large-scale data processing pipelines using Big Data technologies... (Work experience placement; Local area; 3 days per week)
$65.05 per hour
...a quantitative discipline. Preferred skills: Background in enterprise stress testing. Experience with Elastic, Hadoop, Teradata, or other distributed big data ecosystems. Knowledge of cloud or distributed computing environments. Prior work... (Contract work)
- ...Foundation: Strong understanding of statistics, linear algebra, and calculus as applied to ML. Big Data: Experience with Spark, Hadoop, or similar distributed computing frameworks. Cloud Platforms: Familiarity with AI services on AWS, Azure, or Google Cloud...
- ...Airflow) Knowledge of real-time inference and batch processing systems Familiarity with big data technologies (Spark, Kafka, Hadoop) Experience with containerization (Docker, Kubernetes) Background in applied AI domains (NLP, computer vision,... (Remote work)
- ...~ Solid Angular frontend experience ~ Hands-on AWS experience (Lambda, ECS, S3, RDS, EMR) ~ Big Data experience is required (Hadoop, Spark, Presto, or EMR) ~ Strong SQL skills with performance tuning ~ Experience with scripting (Python, Unix shell, Groovy,... (Contract work)
- ...Familiarity with market risk concepts including VaR, Greeks, scenario analysis, and stress testing. ~ Hands-on experience with Hadoop, Spark. ~ Proficiency with Git, Jenkins, and CI/CD pipelines. ~ Excellent problem-solving skills and strong mathematical and analytical... (Contract work)
$118.7k - $218.6k
...standard PaPM capabilities. Analytical tools: 3+ years' experience using analytics tools and languages (SQL Server, Oracle, Hadoop, Python, SAP Analytics Cloud, Tableau, Power BI). ERP: 3+ years' experience with large ERP systems, particularly SAP and Oracle.... (Visa sponsorship)
- Hadoop Developer / Lead / Architect / PCON: Implify, Inc. is a global IT solutions and services firm. Since its inception, Implify, Inc. has been providing high-quality, cost-effective IT solutions to Fortune 1000 companies, mid-range companies, and upcoming companies via... (Permanent employment; Full time; Work experience placement)
- ...processes and data pipelines. Knowledge of cloud platforms (AWS, Azure, or GCP). Familiarity with big data tools (Spark, Hadoop). Experience in a specific domain (finance, healthcare, e-commerce, marketing, etc.). What We Offer: Opportunity to work...
- A global IT solutions firm is seeking qualified candidates for Hadoop Developer and Hadoop/Big Data Architect roles. Candidates should possess a Bachelor's degree or equivalent and extensive experience in the Big Data ecosystem. Key skills include Hadoop, Java, and SQL tuning... (Permanent employment; Full time)
$55k - $138k
...company's success. - Develop, maintain, and expand a complex, data-centric Treasury Management payment network currently hosted on Hadoop and Neo4j. - Work on complex data requirements independently. - Collaborate with data scientists, architects, and other key... (Full time; Temporary work; Part time; Work experience placement; Work at office)
