Average salary: $110,109 per year

  •  ...and best practices. Maintain a focus on customer service, efficiency, quality, and growth. Performs other duties as needed. Plus: Hadoop knowledge/experience. Qualifications: Bachelor’s degree with at least three to five years’ related experience. Additional... 
    Suggested

    360 IT Professionals

    Dearborn, MI
    2 days ago
  • A leading software development firm in Dearborn is seeking a Technical Business Analyst to bridge the gap between business needs and technology solutions. The role involves documenting business requirements, collaborating on system design, and conducting data analysis. ...
    Suggested

    360 IT Professionals

    Dearborn, MI
    2 days ago
  •  ...Minimum 5 years’ experience in field of data engineering involving analytics-focused data warehouse environments such as AWS, Snowflake, Hadoop, Oracle, etc. Minimum 2 years working experience in AWS utilizing services such as S3, AWS CLI, and DynamoDB Deep working... 
    Suggested
    Work experience placement

    MSR Technology Group

    Michigan
    16 days ago
  • $130.88k - $169.54k

     ...is required: 1. Utilizing SQL for data extraction and feature engineering from multiple platforms including Google Cloud Platform, Hadoop, Teradata, PC, and Mainframe. 2. Communicating statistical and technical topics to non-technical business partners. 1 year of experience... 
    Suggested
    Full time
    Immediate start
    Work from home
    Flexible hours

    Motorsport Hackers

    Dearborn, MI
    1 day ago
  • Job Summary: Primary and secondary skill set - GCP (BigQuery, Dataflow, Data Fusion, Dataproc), Hadoop ecosystem. Roles/Responsibilities: Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successful... 
    Suggested
    Flexible hours

    TechDigital Group

    Dearborn, MI
    2 days ago
  • $153k - $227k

     ...preferred; AWS or GCP also considered. Solid understanding of CI/CD principles and tools. Familiarity with big data technologies such as Hadoop, Hive, HBase, Object Storage (ADLS/S3), Event Queues. Strong understanding of performance optimization techniques such as... 
    Suggested
    Local area
    Relocation
    Relocation package
    Flexible hours

    General Motors

    Warren, MI
    23 hours ago
  •  ...testing. Responsibilities: Design and development of applications in Java/J2EE/Python/Spring Boot/PCF/Unix/Power BI/Cassandra/Kafka/Hadoop. Experience in cloud/edge hosting of services and apps. Interoperability of apps and services between cloud and data centers. Top... 
    Suggested
    Work at office
    Local area

    TechDigital Group

    Warren, MI
    5 days ago
  • $120k - $150k

     ...Supermicro® is a Top Tier provider of advanced server, storage, and networking solutions for Data Center, Cloud Computing, Enterprise IT, Hadoop/Big Data, Hyperscale, HPC and IoT/Embedded customers worldwide. We are the #5 fastest growing company among the Silicon Valley Top... 
    Suggested
    Worldwide

    Super Micro Computer Spain, S.L.

    Detroit, MI
    1 day ago
  • $120k - $150k

     ...Supermicro is a Top Tier provider of advanced server, storage, and networking solutions for Data Center, Cloud Computing, Enterprise IT, Hadoop/Big Data, Hyperscale, HPC and IoT/Embedded customers worldwide. We are the #5 fastest growing company among the Silicon Valley Top... 
    Suggested
    Worldwide

    Supermicro

    Detroit, MI
    3 days ago
  •  ...explorer, Azure Data Factory. Data Warehouse Solutions: Redshift, Snowflake, Postgres, Data Lake. Big Data technologies: Azure, AWS, Hadoop, Spark, Hive, Kafka, Flume, NoSQL stores (HBase, Cassandra, DynamoDB, MongoDB). Cloud storage: S3, GCS, ADLS, Blob. Machine... 
    Suggested
    Temporary work
    Work experience placement

    Talent Management Plus, Inc.

    Detroit, MI
    5 days ago
  • $140k - $160k

     ...Python / Scala / Java / C#) Minimum 3 years of Big Data and Big Data tools in one or more of the following: Batch Processing (e.g. Hadoop distributions, Spark), Real time processing (e.g. Kafka, Flink/Spark Streaming) Minimum of 2 years' experience with AWS or... 
    Suggested
    Remote job
    Local area

    StockX Inc.

    Detroit, MI
    5 days ago
  •  ...Liferay, CoreOS, or other CMS/portal technologies. Experience in the Automotive or Customer Analytics domain is a plus. Exposure to Hadoop ecosystem and big data technologies. Understanding of design patterns and system optimization techniques. Certifications: Java... 
    Suggested

    Compunnel, Inc.

    Auburn Hills, MI
    4 days ago
  •  ...Learning, Natural Language Processing (NLP), SVM, XGBoost, Random Forest, Decision Trees, Clustering. Data Engineering: Databricks, Hadoop, SQL, Data Pipelines, Data Preprocessing & Feature Engineering. Cloud & Big Data Platforms: Preferred Microsoft Azure (Data Lake,... 
    Suggested

    General Motors

    Warren, MI
    2 days ago
  •  ...experience with ETL/ELT tools (e.g., Informatica, Talend, Apache NiFi). Experience with big data technologies (e.g., Apache Spark, Hadoop ecosystem). Proficiency in data warehousing concepts (e.g., Snowflake, Redshift, BigQuery). Experience with cloud... 
    Suggested

    Openkyber

    Wyoming, MI
    4 hours ago (new)
  •  ...monitoring, performance tuning, and capacity planning. Distributed Systems: Strong hands-on experience with Spark, Flink, and Kafka . Hadoop Ecosystem: Proficiency in Hadoop Cluster Administration and Operations. Cloud & Containers: Deep understanding of AWS and... 
    Suggested
    Work at office
    Home office

    Openkyber

    Wyoming, MI
    5 hours ago (new)
  •  ...information retrieval concepts, relevance metrics, and evaluation methods. Familiarity with large-scale data processing (e.g., Spark, Hadoop) is a plus. Key Responsibilities: Design, develop, and implement search ranking models using Learn to Rank approaches.... 

    Openkyber

    Wyoming, MI
    1 day ago
  •  ...Actions, Azure DevOps). Experience with HIL/SIL testing environments. Nice to have: AUTOSAR (Classic/Adaptive), cybersecurity (ISO 21434), performance profiling and embedded optimization, big data tools (Spark, Databricks, Hadoop, SQL)... 
    Full time

    Wise Equation Solutions Inc.

    Novi, MI
    8 days ago
  • $60 - $65 per hour

     ...Develop and maintain data pipelines using Informatica Big Data Management (BDM) Implement mappings, workflows, and transformations for Hadoop-based and cloud data platforms Integrate Informatica BDM with Hadoop, Hive, Spark, and cloud storage Optimize BDM jobs for... 
    Hourly pay
    Full time
    Work at office
    Local area
    Remote work
    Worldwide

    Computer Aid, Inc.

    Wyoming, MI
    5 days ago
  •  ...Strong data engineering/analytics skills: Python (pandas, NumPy), time-series processing, MDF readers, and one or more of Spark/Databricks/Hadoop/SQL. Hands-on with Git, CI/CD (e.g. Jenkins/GitHub Actions/Azure DevOps), static analysis (e.g. Polyspace/Cppcheck) and issue... 
    Full time

    Cays Inc

    Novi, MI
    13 days ago
  • $125.5k - $230.2k

     ...advanced analytics needs. Oversee cloud-based data management platforms and technologies such as Databricks, Snowflake, Azure, AWS, and Hadoop for ingestion, storage, and processing. Drive the design and implementation of analytical models, data pipelines, and... 
    Summer holiday
    Work at office
    Flexible hours

    Ernst & Young Oman

    Detroit, MI
    1 day ago
  •  ...experimental design (A/B testing) Strong data engineering capabilities including SQL/NoSQL database programming, distributed computing tools (Hadoop, Spark, Kafka), data pipeline development, and experience with cloud platforms (AWS, Azure, Google Cloud Platform) Production ML... 

    Openkyber

    Wyoming, MI
    1 day ago
  •  ...programming and 3+ years of MLOps experience in production environments. 5+ years with Big Data platforms such as BigQuery or Hadoop and 3+ years with PySpark. 2+ years building APIs, preferably with FastAPI, and integrating with Google Cloud Platform/Azure or... 
    Internship
    3 days per week

    Openkyber

    Wyoming, MI
    1 day ago
  •  ...with cloud platforms (AWS, Azure, or Google Cloud Platform). Knowledge of ETL/ELT pipelines and big data technologies (Spark, Hadoop). Experience with AI/ML frameworks (TensorFlow, PyTorch, or similar). Strong programming skills (Python, SQL, or Scala).... 
    Full time
    Remote work

    Openkyber

    Wyoming, MI
    1 day ago
  •  ...Tech stack. Experienced with Infrastructure as Code (IaC). Experience with big data technologies such as Apache Spark or Hadoop. Stay informed about the ethical implications of machine learning, e.g. selection bias. Model Training, Data Analytics... 
    Contract work

    Openkyber

    Wyoming, MI
    1 hour ago (new)
  •  ...drift monitoring. Use Azure Data Factory (ADF) and Azure Databricks for orchestrated, scalable data processing; use AWS EMR for Hadoop/Spark workloads supporting AI features. Build Agentic AI Solutions Design secure tool-calling and multi-agent orchestration... 
    Local area

    Openkyber

    Wyoming, MI
    2 days ago
  •  ...Data Engineer (PySpark, Hadoop, Scala) - Walmart - Sunnyvale, CA (hybrid). Skill set: PySpark, Hadoop, Scala, ETL. Day to day: Working on Walmart signals and tables. Preparing and processing raw data from users and creating tables for Walmart.... 

    Openkyber

    Wyoming, MI
    2 days ago
  •  ...Develop and optimize ETL/ELT processes, data flows, and infrastructure leveraging AWS, SQL, and platforms such as Databricks, Redshift, Hadoop and Airflow. Assemble and manage large, complex datasets to meet functional and non-functional business requirements.... 
    Long term contract
    Remote work

    Openkyber

    Wyoming, MI
    3 days ago
  •  ...with Google Cloud Platform, especially BigQuery. Prior experience with Teradata. Familiarity with the Hadoop ecosystem. Exposure to tools such as Dremio and distributed storage systems. Cloud certifications (Google Cloud Platform preferred... 
    Contract work
    Visa sponsorship

    Openkyber

    Wyoming, MI
    4 days ago
  •  ...public API. Deliver robust and extensible solutions from feature requests. Work with big data technologies such as Kafka, Hadoop, and Spark. Collaborate with Data Scientists to deliver value-added features. Partner with DBAs to create ETL and Data... 
    Hourly pay
    Contract work
    Local area
    Remote work

    Eliassen Group

    Lansing, MI
    1 day ago
  •  ...Experience with natural language processing (NLP), computer vision, or other AI techniques. Familiarity with big data technologies (Hadoop, Spark) and cloud platforms (AWS, Azure, Google Cloud) Strong analytical skills and the ability to work with complex datasets.... 
    Local area
    Remote work

    Openkyber

    Wyoming, MI
    2 days ago