Average salary: $130,990 per year
- ...Title: Hadoop Developer. Location: 100% remote. Type: W2, contract-to-hire. Rate: open. Job description: Hadoop development within the database; maintaining and monitoring the Hadoop environment; addressing vulnerabilities... (Remote job, Contract work)
- ...(MMM) or Multi-Touch Attribution (MTA) models for marketing attribution. Experience with big data technologies (e.g., Spark, Hadoop, Flink). Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and their data science services. Experience with notebook... (Remote job)
- ...at scale, with a deep understanding of data infrastructure and distributed systems. ~ Expertise in big data technologies such as Hadoop, Spark, Kafka, and other distributed computing systems. ~ Experience designing, building, and operating large-scale systems using... (Full time, Remote work, Worldwide)
- ...conception to development. Experience working with cloud platforms (AWS, GCP, or Azure). Knowledge of big data technologies like Spark or Hadoop. Understanding of financial services concepts such as credit scoring, portfolio risk, or customer lifetime value. What We Offer You... (Work visa, H1B, Flexible hours)
- ...tuning required. Programming: Python (strong), production-quality ETL + ML support. Big Data / Hadoop: admin-level familiarity with cluster behavior, resource management, and job tuning. Orchestration: Cloud Composer...
- ...Experience with any major public cloud, preferably AWS or Azure, and their data services. Knowledge of big data technologies such as Hadoop, Spark, Flink, Kafka, Databricks, Synapse, etc. Proficient in SQL and experienced in query planning and optimization... (Full time, Part time, Internship, Seasonal work)
- ...Oracle, Teradata, Netezza, Greenplum, etc.) Experience with non-relational platforms and tools for large-scale data processing (e.g. Hadoop, HBase). Familiarity and experience with common data integration and data transformation tools (e.g. Informatica, DataStage,... (Remote job, Flexible hours)
- ...Proficiency in multiple programming languages and middleware technologies (MuleSoft). · Experience with databases (SQL, NoSQL/HBase/Hadoop) and distributed systems. · Strong problem-solving skills with the ability to analyze complex technical challenges and implement... (Remote job, Temporary work)
- ...Epidemiology, Biostatistics, Computer Science, or another subject with high statistical and programming content. Experience with the Hadoop database platform and Impala or Hive SQL. Experience with the Databricks programming environment (Spark, Python, R). Reporting and... (Remote work)
$176k - $230k
- ...their impact on detection efficacy. Big Data Technologies: familiarity with big data processing frameworks like Apache Spark, Hadoop, Flink, or similar, for handling and analyzing massive volumes of detection data. Data Pipelines (ETL/ELT): expertise in designing... (Remote job)
- ...engineering and modern data warehouse build-outs. ~1+ years of experience building Medallion Architecture in a “Big Data” platform (Hadoop, MPP, etc.) ~Demonstrated expertise implementing Delta Lake, data warehouse, and data mart solutions. ~Strong proficiency in... (Work experience placement, Local area)
- ...operation of Security Groups, KMS Keys, VPC NACLs, and SCPs. Familiar with ETL and big data tool-chains such as those provided by Hadoop/EMR, Glue, Spark, Impala, or similar. Understanding of relational database systems and how applications interact with them... (Full time, Contract work, Part time, Internship, Seasonal work, Remote work)
- ...and ETL tools, such as Apache Kafka, Apache Airflow, or Informatica. Familiarity with big data processing frameworks, such as Hadoop, Spark, or Flink. Knowledge of cloud platforms, such as AWS, Azure, or GCP, and experience with data storage and processing services... (Remote job, Full time, Work from home)
- ...confident and comfortable using our platform. Part of this exciting journey will also expose you to other tools such as Spark, Hadoop, Hive, and Kubernetes, with our platform being hosted on Google Cloud (GCP). You will learn how to create entities and networks and... (Full time, Temporary work, Remote work, Work visa)
- ...automating data pipelines ~ Good experience using business intelligence/visualization tools (such as Tableau), data frameworks (such as Hadoop, DataFrames, RDDs, Dataclasses) and data formats (CSV, JSON, Parquet, Avro, ORC) ~ Advanced knowledge of R, SQL and Python;... (Remote job, Full time)
- ...AI domain. ~ Experience developing advanced language models ~ Experience applying Generative AI-based tools ~ Experience with Hadoop and NoSQL-related technologies such as MapReduce, Hive, HBase, MongoDB, Cassandra. ~ Experience modifying and applying advanced... (Remote job, Flexible hours)
- ...both unstructured and relational databases. Collaborative; low ego and high pride in your work. Bonus points: experience with Hadoop, Storm, or Spark; past work with Docker and/or Ansible; background in statistics; background in UX/HCI/Design; ERP systems... (Remote job, Full time, Flexible hours)
- ...Step Functions, Glue, RDS, EKS, DMS, EMR, etc. Industry experience with different big data platforms and tools such as Kafka, Hadoop, Hive, Spark, Cassandra, Airflow, etc. Industry experience working with relational and NoSQL databases in a production environment... (Remote job, Full time, Flexible hours)
- ...programming skills in Python or similar. Excellent proficiency with traditional SQL and big data technologies such as Apache Spark, Hadoop, and distributed computing frameworks. Ability to analyze large datasets and derive business insights using BI tools like... (Remote job, Full time, Work at office, Flexible hours)
- ...Experience using version control systems (e.g., Git). Basic knowledge of data science topics such as big data analytics (machine learning, Hadoop, MapReduce, Hive, Spark, etc.). Basic knowledge of cloud computing environments and tooling (AWS, Azure). Basic SQL skills with... (Remote job, Full time, Work experience placement, Worldwide, Flexible hours)
- ...pandas) ~ Solid understanding of machine learning algorithms and statistical modeling ~ Experience with big data technologies (e.g., Hadoop, Spark, Kafka) ~ Proven ability to create and implement big data solutions from scratch ~ Comfortable setting up and managing... (Remote job, Full time)
- ...Develop solutions to query, aggregate, and analyze time-series data in real-time and batch modes. Grow our analysis into a Hadoop-scale big data job. Support visualization efforts in Tableau. Build forecasting using AI models along with other team members... (Remote job, Full time, Temporary work)
- ...Optimize ETL/ELT processes for data ingestion, transformation, and storage. Work with big data technologies (e.g., Databricks, Spark, Hadoop) and cloud platforms (Azure). Exercises independent judgment in developing, approving, and maintaining a data hub as a central... (Remote job, Full time, Temporary work, Local area)
- ...systems, microservices architecture, and cloud computing (AWS, GCP, or Azure). ~ Experience with big data technologies, such as Hadoop, Spark, Kafka, or similar. ~ Strong knowledge of databases, both SQL and NoSQL (e.g., MySQL, PostgreSQL, MongoDB, Cassandra). ~... (Remote job, Full time)
- ...design workshops, and requirements-gathering activities. Proficient with big data technologies such as Kubernetes, Spark, and Hadoop. Familiar with enterprise security solutions such as LDAP or Kerberos. Strongly skilled in developing functional and technical... (Remote job, Full time, Flexible hours)
- ...LinkedIn Recruiter, Lattice, G Suite, Atlassian, AWS, Python, Java, Ruby, Go, Node.js, Temporal, Scala, Apache NiFi, Talend, Informatica, Hadoop, Hive, Spark, Pandas, Looker, Argo, Airflow, Luigi, Kubernetes, C#, JavaScript (for advanced concepts), ASP.NET MVC, .NET Core,... (Remote job, Full time)
- ...serverless, event-driven systems). + **Big Data & Streaming Technologies:** Familiarity with big data architectures and tools (e.g., Hadoop, Spark) and real-time data streaming technologies (e.g., Apache Kafka, Apache Flink). + **DevOps & Automation:** Familiarity with...
- ...data, both online and offline, via automated processes. In addition, knowledge of tools such as Spark, Hadoop, and/or R. Knowledge of building data visualization dashboards with tools such as Power BI or DataStudio... (Remote job)
- ...Technical Proficiency: Strong proficiency with modern data platforms and cloud technologies (e.g., AWS, Azure, GCP), big data frameworks (Hadoop, Spark/Kafka), and data warehousing solutions (Snowflake, Databricks). Programming & Tools: Proficient in programming and query... (Full time, Work experience placement, Remote work)
- ...# Data Warehousing (relational and dimensional data modeling); # Distributed data storage and large-data processing (Spark, EMR, Hadoop, Hive); # AWS data management (S3, Redshift, DynamoDB) and related tools (Athena, EMR, Glue); # Agile/scrum environment; # Talend... (Full time, Local area, Remote work)