Average salary: $110,109 per year
- ...and best practices. Maintain a focus on customer service, efficiency, quality, and growth. Perform other duties as needed. Plus Hadoop knowledge/experience. Qualifications: Bachelor's degree with at least three to five years of related experience. Additional...
- A leading software development firm in Dearborn is seeking a Technical Business Analyst to bridge the gap between business needs and technology solutions. The role involves documenting business requirements, collaborating on system design, and conducting data analysis. ...
- ...Minimum 5 years' experience in the field of data engineering involving analytics-focused data warehouse environments such as AWS, Snowflake, Hadoop, Oracle, etc. Minimum 2 years' working experience in AWS utilizing services such as S3, AWS CLI, and DynamoDB. Deep working...
$130.88k - $169.54k
...is required: 1. Utilizing SQL for data extraction and feature engineering from multiple platforms including Google Cloud Platform, Hadoop, Teradata, PC, and Mainframe. 2. Communicating statistical and technical topics to non-technical business partners. 1 year of experience...
- Job Summary: Primary and secondary skill set - GCP (BigQuery, DataFlow, DataFusion, DataProc), Hadoop ecosystem. Roles/Responsibilities: Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successful...
$153k - $227k
...preferred; AWS or GCP also considered. Solid understanding of CI/CD principles and tools. Familiarity with big data technologies such as Hadoop, Hive, HBase, object storage (ADLS/S3), and event queues. Strong understanding of performance optimization techniques such as...
- ...testing. Responsibilities: Design and development of applications in Java/J2EE/Python/Spring Boot/PCF/Unix/Power BI/Cassandra/Kafka/Hadoop. Experience in cloud/edge hosting of services and apps. Interoperability of apps and services between cloud and data centers. Top...
$120k - $150k
...Supermicro® is a Top Tier provider of advanced server, storage, and networking solutions for Data Center, Cloud Computing, Enterprise IT, Hadoop/Big Data, Hyperscale, HPC, and IoT/Embedded customers worldwide. We are the #5 fastest growing company among the Silicon Valley Top...
- ...explorer, Azure Data Factory. Data warehouse solutions: Redshift, Snowflake, Postgres, Data Lake. Big data technologies: Azure, AWS, Hadoop, Spark, Hive, Kafka, Flume, NoSQL stores (HBase, Cassandra, DynamoDB, MongoDB). Cloud storage: S3, GCS, ADLS, Blob. Machine...
$140k - $160k
...Python / Scala / Java / C#). Minimum 3 years of Big Data and Big Data tools in one or more of the following: batch processing (e.g., Hadoop distributions, Spark), real-time processing (e.g., Kafka, Flink/Spark Streaming). Minimum of 2 years' experience with AWS or...
- ...Liferay, CoreOS, or other CMS/portal technologies. Experience in the Automotive or Customer Analytics domain is a plus. Exposure to the Hadoop ecosystem and big data technologies. Understanding of design patterns and system optimization techniques. Certifications: Java...
- ...Learning, Natural Language Processing (NLP), SVM, XGBoost, Random Forest, Decision Trees, Clustering. Data Engineering: Databricks, Hadoop, SQL, Data Pipelines, Data Preprocessing & Feature Engineering. Cloud & Big Data Platforms: Preferred Microsoft Azure (Data Lake,...
- ...experience with ETL/ELT tools (e.g., Informatica, Talend, Apache NiFi). Experience with big data technologies (e.g., Apache Spark, Hadoop ecosystem). Proficiency in data warehousing concepts (e.g., Snowflake, Redshift, BigQuery). Experience with cloud...
- ...monitoring, performance tuning, and capacity planning. Distributed Systems: Strong hands-on experience with Spark, Flink, and Kafka. Hadoop Ecosystem: Proficiency in Hadoop cluster administration and operations. Cloud & Containers: Deep understanding of AWS and...
- ...information retrieval concepts, relevance metrics, and evaluation methods. Familiarity with large-scale data processing (e.g., Spark, Hadoop) is a plus. Key Responsibilities: Design, develop, and implement search ranking models using Learn to Rank approaches....
- ...Actions, Azure DevOps). Experience with HIL/SIL testing environments. Nice to have: AUTOSAR (Classic/Adaptive), cybersecurity (ISO 21434), performance profiling and embedded optimization, big data tools (Spark, Databricks, Hadoop, SQL)...
$60 - $65 per hour
...Develop and maintain data pipelines using Informatica Big Data Management (BDM). Implement mappings, workflows, and transformations for Hadoop-based and cloud data platforms. Integrate Informatica BDM with Hadoop, Hive, Spark, and cloud storage. Optimize BDM jobs for...
- ...Strong data engineering/analytics skills: Python (pandas, NumPy), time-series processing, MDF readers, and one or more of Spark/Databricks/Hadoop/SQL. Hands-on with Git, CI/CD (e.g., Jenkins/GitHub Actions/Azure DevOps), static analysis (e.g., Polyspace/Cppcheck), and issue...
$125.5k - $230.2k
...advanced analytics needs. Oversee cloud-based data management platforms and technologies such as Databricks, Snowflake, Azure, AWS, and Hadoop for ingestion, storage, and processing. Drive the design and implementation of analytical models, data pipelines, and...
- ...experimental design (A/B testing). Strong data engineering capabilities including SQL/NoSQL database programming, distributed computing tools (Hadoop, Spark, Kafka), data pipeline development, and experience with cloud platforms (AWS, Azure, Google Cloud Platform). Production ML...
- ...programming and 3+ years of MLOps experience in production environments. 5+ years with Big Data platforms such as BigQuery or Hadoop and 3+ years with PySpark. 2+ years building APIs, preferably with FastAPI, and integrating with Google Cloud Platform/Azure or...
- ...with cloud platforms (AWS, Azure, or Google Cloud Platform). Knowledge of ETL/ELT pipelines and big data technologies (Spark, Hadoop). Experience with AI/ML frameworks (TensorFlow, PyTorch, or similar). Strong programming skills (Python, SQL, or Scala)...
- ...Tech stack. Experienced with Infrastructure as Code (IaC). Experience with big data technologies such as Apache Spark or Hadoop. Stay informed about the ethical implications of machine learning, e.g., selection bias. Model Training, Data Analytics...
- ...drift monitoring. Use Azure Data Factory (ADF) and Azure Databricks for orchestrated, scalable data processing; use AWS EMR for Hadoop/Spark workloads supporting AI features. Build Agentic AI solutions: design secure tool-calling and multi-agent orchestration...
- ...Data Engineer (PySpark, Hadoop, Scala) - Walmart - Sunnyvale, CA (hybrid). Skill set: PySpark, Hadoop, Scala, ETL. Day to day: Working on Walmart signals and tables. Preparing and processing raw data from users and creating tables for Walmart...
- ...Develop and optimize ETL/ELT processes, data flows, and infrastructure leveraging AWS, SQL, and platforms such as Databricks, Redshift, Hadoop, and Airflow. Assemble and manage large, complex datasets to meet functional and non-functional business requirements...
- ...with Google Cloud Platform, especially BigQuery. Prior experience with Teradata. Familiarity with the Hadoop ecosystem. Exposure to tools such as Dremio and distributed storage systems. Cloud certifications (Google Cloud Platform preferred...
- ...public API. Deliver robust and extensible solutions from feature requests. Work with big data technologies such as Kafka, Hadoop, and Spark. Collaborate with Data Scientists to deliver value-added features. Partner with DBAs to create ETL and Data...
- ...Experience with natural language processing (NLP), computer vision, or other AI techniques. Familiarity with big data technologies (Hadoop, Spark) and cloud platforms (AWS, Azure, Google Cloud). Strong analytical skills and the ability to work with complex datasets...