Average salary: $130,990 per year
$116.1k - $170.28k
...The ideal candidate will have a strong background in developing batch processing systems, with extensive experience in the Apache Hadoop ecosystem (MapReduce, Oozie, Hive, Pig, HBase, Storm). This role involves working in Java and on Machine Learning pipelines...
Full time, Remote work, $80 per hour
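For a sense of the batch-processing work these Hadoop ecosystem listings describe, a minimal MapReduce word-count job in Java might look like the sketch below, written against the standard org.apache.hadoop.mapreduce API; the WordCount class name and the command-line input/output paths are illustrative assumptions, not taken from any posting.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Minimal word-count batch job: mappers emit (word, 1), reducers sum the counts.
public class WordCount {

  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);   // emit (word, 1) for each token
      }
    }
  }

  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);   // emit (word, total count)
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);  // combiner reuses the reducer for local aggregation
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));    // e.g. an HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output directory must not already exist
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Such a job would typically be packaged as a jar and launched with `hadoop jar`, with the paths supplied as arguments.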
...Title: Hadoop Platform Engineer Location: Remote (U.S.-based) Job Type: Contract (6 months) Industry: Retail Compensation: $80 – $150/hour (W2) Required Skills: Java, Hadoop, Hive, Spark, Hadoop Distributed File System (HDFS), Linux --- About...
Contract work, Work at office, Remote work
- ...Title: Hadoop developer Location: 100% Remote Type: W2 - Contract to hire Rate: $Open / hour Requirements / Job Description: Hadoop development within the database; maintaining and monitoring the Hadoop environment, addressing vulnerabilities...
Remote job, Contract work
- ...decisions. · Understanding of data processing, analytics, and business decision support. · Desirable knowledge: Spark, Hadoop, and the ability to read Cobol code. · Problem-solving skills and the ability to work independently or...
Remote job, Full time, Flexible hours
- ...time. Strong knowledge of programming languages/tools including Java, Scala, Spark, SQL, Hive, ElasticSearch. Most tools within the Hadoop Ecosystem are necessary, but we're mainly looking for Spark and Scala (Java if not Scala). Experience with streaming technologies...
Remote job, Full time, Work experience placement, H1b, Local area
$170k
...batch or streaming production data pipelines, ideally using one or more distributed processing frameworks such as Spark, Flink or Hive/Hadoop. Knowledge in data modeling and establishing data architecture across multiple systems. Thrive in a fast-paced environment, and...
Hourly pay, Full time, Immediate start, Flexible hours, $170k
...engineering projects. We value people over process. ~ Have experience building production data pipelines using Spark, Flink or Hive/Hadoop. ~ Have hands-on experience with schema design and data modeling. ~ Have programming proficiency in Python or Scala/Java. You have...
Hourly pay, Full time, Immediate start, Flexible hours, $116.1k - $170.28k
...batch data processing, and large-scale data pipelines. The ideal candidate has strong hands-on experience with Oozie, Pig, the Apache Hadoop ecosystem, and programming proficiency in Java (preferred) or Python. This role requires a deep understanding of data structures...
Remote job, Full time
- ...relevant field experience. Some background in deploying and supporting structured and large-scale database environments, such as the Hadoop ecosystem, Elastic Search, and Postgres/MySQL. About BlueVoyant: At BlueVoyant, we recognize that effective cyber security...
Remote job, Full time, Work at office, Local area, Flexible hours
$188.87k - $244.87k
...services which respond to batch and real-time data to safely roll out features and experiments using a technology stack of A/B testing, Hadoop, Spark, Flink, HBase, Druid, Python, Java, Distributed Systems, React and statistical analysis. Participate in all phases of...
Remote job, Full time, Local area, Relocation package, $84.5k - $204.6k
...and influence with insights. Familiarity with relevant machine learning frameworks and packages such as TensorFlow and PyTorch. GCP/Hadoop and big data experience is an advantage. Additional Job Description: Subsidiary: PayPal Travel Percent: 0 -...
Full time, Work at office, Local area, Flexible hours
- ...to Have Skill: Analytics: SQL, SAS, Python, Tableau, Qlikview, Splunk, SQL Server, Power BI. AI/ML: NLG tools, data aggregation tools. Data Management: Collibra, Informatica, Hadoop, Manta, ASG, Ab Initio, Solidatus, Linkurio. Experience: 6-9 years...
$140.03k - $180.58k
...Document data architecture, processes, and workflows. Technology and Tools: Leverage tools and technologies such as Apache Spark, Hadoop, Kafka, and Python/Java for data processing. Utilize cloud platforms (AWS, Azure, Google Cloud) for data storage and compute...
Full time, Contract work, Remote work, Monday to Friday
- ...problem-solving skills with strong verbal/written communication skills ~ Familiarity with big data open source tools (e.g. Spark, Hadoop, Kafka, etc.) and open source web frameworks and UI platforms (e.g. Flask, Shiny, Django, Dash) is a plus ~ Preferred:...
Remote work
- ...Advanced knowledge of cloud-specific data services (e.g., Databricks, Azure Data Lake). Expertise in big data technologies (e.g., Hadoop, Spark). Strong understanding of data security and governance principles. Experience in scripting languages (Python, SQL). Additional...
Contract work, Remote work
- ...of data warehousing concepts and tools (e.g., Redshift, Snowflake, Google BigQuery). Experience with big data platforms (e.g., Hadoop, Spark, Kafka). Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud). Expertise in ETL...
Full time, Remote work
$120k - $160k
...ability to work independently and collaboratively; and attention to detail for data accuracy. - Knowledge of big data tools (e.g., Hadoop, Kafka) and DevOps processes. Education & Experience - A bachelor's degree in computer science, information technology, or...
Remote job, Full time, Work from home, Home office, Flexible hours
- ...analysis ~ SQL, Excel, and data analysis experience ~ Python/R scripting experience ~ Experience with distributed computing (Hive/Hadoop) ~ Experience with Unix/Linux ~ Experience initiating and driving projects to completion with minimal guidance ~ Experience...
Remote job
- ...to migrate and/or scale cloud data solutions. Build pipelines and scalable analytic tools using leading technologies (for example: Hadoop, Spark, Kafka, Kubernetes, Terraform, Airflow, AWS, Azure, GCP, etc.). Regularly conduct peer code reviews to ensure code quality...
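As a hedged sketch of the kind of Spark batch pipeline step such listings mention, the Java program below reads raw CSV events, counts events per user per day, and writes partitioned Parquet. The DailyEventCounts class name, the column names (user_id, event_date), and the path arguments are assumptions made up for illustration, not taken from any posting.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.col;

// Small batch step: read raw CSV events, count events per user per day, write Parquet.
public class DailyEventCounts {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("daily-event-counts")
        .getOrCreate();

    // Assumed input layout: CSV files with header columns user_id, event_type, event_date.
    Dataset<Row> events = spark.read()
        .option("header", "true")
        .csv(args[0]);

    Dataset<Row> counts = events
        .groupBy(col("user_id"), col("event_date"))
        .count();   // one row per (user, day) with an event count

    // Write results as Parquet, partitioned by day, overwriting previous runs.
    counts.write()
        .mode(SaveMode.Overwrite)
        .partitionBy("event_date")
        .parquet(args[1]);

    spark.stop();
  }
}
```

A job like this would typically be launched with spark-submit against a YARN or Kubernetes cluster, or run locally for testing.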
- ...analysis and visual support. Skills: Experienced in either programming languages such as Python and/or R, big data tools such as Hadoop, or data visualization tools such as Tableau. Ability to communicate effectively in writing, including conveying complex...
Remote job, Flexible hours
- ...Azure Databricks, including cluster management, notebook development, and Delta Lake. Proficiency in big data technologies (e.g., Hadoop, Spark) and data processing frameworks (e.g., PySpark). Deep understanding of Azure services like Azure Data Lake, Azure Synapse...
- ...Epidemiology, Biostatistics, Computer Science, or other subject with high statistical and programming content. Experience with the Hadoop database platform and Impala or Hive SQL. Experience with the Databricks programming environment (Spark, Python, R). Reporting and...
Remote work
- ...cluster management, notebooks, Delta Lake). Strong SQL skills; experience with NoSQL databases. Proficiency in big data tech (Hadoop, Spark, PySpark). Strong understanding of Azure Data Lake, ADF, Synapse. Experience with ETL/ELT, data warehousing, data...
Long term contract, Remote work
- ...technologies such as Lucene, Solr, Elasticsearch, etc. Experience with big data and distributed computing technologies such as Hadoop, MapReduce, Spark, Storm, etc. Fluency in multiple languages is a plus. Demonstrable proficiency in more than one applicable programming...
Full time, Remote work
$160k - $190k
...and strong interpersonal and communication skills. The primary role will be to use a range of database technologies, including SQL, Hadoop and raw JSON data to support our team in performing research, implementing key metric calculations, and helping to develop...
Full time, Flexible hours
- ...both unstructured and relational databases. Collaborative. Low ego and high pride in your work. Bonus Points: Experience with Hadoop, Storm, or Spark. Past work with Docker and/or Ansible. Background in statistics. Background in UX/HCI/Design. ERP systems...
Remote job, Full time, Flexible hours
$5,000 - $5,830 per month
...analysis, and scripting. Experience retrieving and processing data from APIs. Hands-on experience with: Big data technologies (Hadoop, Spark, Kafka), Cloud platforms (AWS, Azure, or GCP), Orchestration tools (Airflow, Dagster), GitLab, DBT, Terraform, Liquibase....
Remote job, Full time, For contractors, Monday to Friday
- ...automating data pipelines ~ Good experience using business intelligence/visualization tools (such as Tableau), data frameworks (such as Hadoop, DataFrames, RDDs, Dataclasses) and data formats (CSV, JSON, Parquet, Avro, ORC) ~ Advanced knowledge of R, SQL and Python;...
Remote job, Full time
$156k - $195k
...data stack. Experience deploying and managing data pipelines in the cloud. Experience working with technologies like Airflow, Hadoop and Spark. Understanding of streaming technologies like Kafka, Spark Streaming. The listed Pay Range reflects the base salary range...
Full time, Flexible hours
- ...computing technologies such as OpenMP, CUDA, OpenACC, MPI. Experience with big data and distributed computing technologies such as Hadoop, MapReduce, Spark, Storm, etc. Demonstrable proficiency in more than one applicable programming language is a plus...
Full time, Remote work
