$55k - $138k
...Cleveland-OH, Birmingham-AL, or Dallas-TX. -Develop, maintain, and expand a complex, data-... ...Management payment network currently hosted on Hadoop and Neo4j -Work on complex data... ...Engineer Skills -Orchestration; scheduling -Spark/PySpark -Cloudera Hadoop (HIVE, HQL, Impala... Full time, Temporary work, Part time, Work experience placement
$193.4k - $220.7k
...Description Lead Data Engineer (Java, Python, Spark, AWS) Do you love building and... ...with and across Agile teams to design, develop, test, implement, and support technical... ...Distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or... Full time, Part time, Internship, H1b, Local area
$65k - $194.35k
...in both business and technology. - Demonstrate initiative to develop approaches to solutions as part of a team; review architecture... ...into key business performance metrics - Demonstrate command of Hadoop, Spark, Python, Kafka, and complex orchestration. - Communication and... Full time, Temporary work, Part time, Work experience placement
$63.25k - $158.7k
...within the PNC footprint. Job Requirements: - Strong foundation in Python programming and SQL. - Proficient in the Hadoop ecosystem (e.g., HDFS, Hive, Spark). - Proficient in the Neo4j ecosystem (e.g., Browser, Cypher, Bloom). - Familiarity with version control systems (... Full time, Temporary work, Part time, Work experience placement
- ...Process Specialist at Infosys Ltd. Infosys is seeking Lead Java Spark Developer. In this role, you will interface with key stakeholders and... ...batch frameworks. Experience programming Spark SQL, HIVE, Hadoop, and Java for Spark. Experience with Microservices architecture... Full time, Relocation
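Several of the PNC listings above ask for working knowledge of the Neo4j ecosystem, Cypher in particular. As a rough illustration only (nothing from the postings themselves; the connection details and the Account/SENT schema are invented), a minimal query against a hypothetical payments graph using the official Python neo4j driver might look like this:

```python
from neo4j import GraphDatabase

# Connection details are placeholders, not values from any of the postings above.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# Hypothetical schema: (:Account)-[:SENT {amount}]->(:Account)
query = """
MATCH (a:Account {id: $account_id})-[t:SENT]->(b:Account)
WHERE t.amount >= $min_amount
RETURN b.id AS counterparty, sum(t.amount) AS total
ORDER BY total DESC
"""

with driver.session() as session:
    result = session.run(query, account_id="ACC-123", min_amount=10_000)
    for record in result:
        print(record["counterparty"], record["total"])

driver.close()
```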
$65k - $165.6k
...data pipelines using cloud platforms and big data technologies -Developing and optimizing data warehousing solutions to ensure high... ...g., AWS, Azure, Google Cloud) and big data frameworks (e.g., Hadoop, Spark, Kafka) to build robust data architectures -Utilizing DevOps... Full time, Temporary work, Part time, Work experience placement, Work at office, Remote work
$55k - $158.7k
...Data Development Strong background in Apache Spark (batch and streaming) for ETL and analytics. Familiarity with Hadoop ecosystem (HDFS, Hive, YARN, MapReduce) for large... ...allows our employees to be heard, valued, and developed to do their best work. Top Reasons to Join PNC... Full time, Temporary work, Part time, Work experience placement, Work at office, Remote work
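Listings like the one above lean heavily on Spark for batch ETL over Hive/HDFS data. Purely as a sketch of that kind of work (the job name, source path, columns, and target table are all invented for illustration), a minimal PySpark batch job might look like:

```python
from pyspark.sql import SparkSession, functions as F

# Hive support lets the job read and write managed tables on the cluster.
spark = (
    SparkSession.builder
    .appName("daily_payments_etl")   # illustrative job name
    .enableHiveSupport()
    .getOrCreate()
)

# Source path and schema are assumptions for the sketch.
raw = spark.read.parquet("hdfs:///data/raw/payments/2024-06-01")

daily_totals = (
    raw.filter(F.col("status") == "SETTLED")
       .groupBy("merchant_id", F.to_date("event_ts").alias("event_date"))
       .agg(F.sum("amount").alias("total_amount"),
            F.count("*").alias("txn_count"))
)

# Overwrite the summary table; a real pipeline would partition and validate first.
daily_totals.write.mode("overwrite").saveAsTable("analytics.daily_merchant_totals")

spark.stop()
```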
$225.4k - $257.2k
...Do: Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full... ...with Distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL) ~4+ years' experience working on real-time... Full time, Part time, Internship, Local area
$110.5k - $171.93k
...the company's success. As a Data Engineer Senior (Hadoop/Big Data) within PNC's Technology organization on... ...position.* Technical Expertise: *Extensive experience developing within a Cloudera Hadoop ecosystem (PySpark, Spark Streaming, Hive, Kafka, Impala) *Strong... Full time, Temporary work, Part time, Work experience placement
- Job Overview Infosys is seeking a Spark and Scala Developer. In this role, you will enable digital transformation for our clients in a global delivery model, research technologies independently, recommend appropriate solutions, and contribute to technology-specific best... Relocation
$66 - $68.26 per hour
.../hr - $68.26/hr Immediate need for a talented Hadoop Developer. This is a 12-month contract opportunity with long-term potential... ...Hadoop Ecosystem Expertise ETL Development & Support Spark with Python (PySpark) Pyramid Consulting, Inc. provides equal... Contract work, Local area, Immediate start
- Overview ETL Datastage / Hadoop Developer role at Tata Consultancy Services. Must Have Technical/Functional Skills... Full time
- Job Title: Hadoop Developer Duration: Full Time Job Description Must Have Technical/Functional Skills Primary Skill: Hadoop Developer Secondary Skills: UI - Hive, Spark/Impala, Yarn, Unix, Shell Script Experience: Minimum 7 years Slowly Changing Dimension Design... Full time
- Infosys is seeking a Hadoop and PySpark Lead Developer. In this role, you will enable digital transformation for our clients in a global delivery model... ...At least 3 years of experience in Hadoop, Python, and Spark Good experience in end-to-end implementation of data... Relocation
- ...software code development + Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in... ...Terraform, EMR, EKS/ECS, Lambda, RDS and S3 + Experience with Spark and SQL + Overall knowledge of the Software Development Life Cycle...
- ...approaches to build solutions or break down technical problems Develops secure high-quality production code, and reviews and debugs... ...development on data platforms, including practical expertise with Apache Spark for large-scale data processing. Strong proficiency in SQL... Work experience placement
- ...environment • Building the steps to implement the new solution • Moving from Hadoop to the new open source software Must Have Technical Skills:... ...Languages: Python, Scala, Java • Data Processing Frameworks: Spark, Trino, Flink • Databases: SQL, NoSQL (e.g., Cassandra, DynamoDB... Contract work, For contractors, Flexible hours
$65 - $68 per hour
...Graph-based data workflows and working with Graph Analytics Extensive hands-on experience in designing, developing, and maintaining software frameworks using Kafka, Spark, Neo4J, Tiger Graph DB Hands-on experience in Java, Scala, or Python Design, build, and deploy... Hourly pay, Contract work, Temporary work, Immediate start, Remote work, Worldwide
- ...Infrastructure & Automation: Jenkins, GitLab CI/CD, Terraform, Helm, Kubernetes (K8s), OpenShift (OCP), Vault, Ansible Platform Components: Spark, Trino, Airflow, NiFi, Ranger, Iceberg, DataHub Security & Networking: IAM, SSO, RBAC, Kerberos, firewall configurations, and... Full time
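Roles like the two above pair Kafka with Spark for streaming data frameworks. As a rough, self-contained sketch only (the broker address, topic name, and message schema are all assumptions, not details from the postings), reading a Kafka topic with Spark Structured Streaming might look like:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka_stream_sketch").getOrCreate()

# Assumed JSON payload: {"account_id": "...", "amount": 12.5}
schema = StructType([
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
])

# Broker and topic are placeholders; requires the spark-sql-kafka package on the classpath.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "payments")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Running total per account, written to the console for demonstration purposes.
totals = events.groupBy("account_id").agg(F.sum("amount").alias("total"))

query = (
    totals.writeStream
    .outputMode("complete")
    .format("console")
    .start()
)
query.awaitTermination()
```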
- ...leadership to deliver enterprise-scale data platforms built on Apache Spark, Trino, Apache Iceberg, and cloud-native orchestration tools.... ...focusing on performance, reliability, and maintainability Develop technical documentation and runbooks for platform operations Transfer... Full time, Remote work, Flexible hours
- ...Summary: We are seeking a highly skilled Graph Data Platform Developer with strong expertise in TigerGraph or Neo4j, along with... ...or Neo4j Big Data & Streaming: Expertise with Kafka, Spark, YARN Programming: Proficiency in Java, Scala, or... Contract work, Local area
$105k
...challenges daily What to Expect (Job Responsibilities): - Design and develop backend services using Java, Spring Boot, J2EE, and REST APIs... ..., and JavaScript - Work with large-scale data platforms using Spark, Hadoop, Hive, Kafka, and Zookeeper - Deploy applications on cloud... Immediate start
- ...Big Data & DevOps Proficiency: Experience with large-scale data, CI/CD pipelines, IaC tools (Terraform/Pulumi), and exposure to Spark, Hadoop, Kafka. Cross-Functional Collaboration: Skilled in API integration, data modeling, reporting tools (Power BI/Tableau), and... Contract work
$140k - $160k
...difference in the world. Your Mission As a Senior Full Stack Developer at Hawk, your mission is to architect, build, and evolve our... ...familiarity with distributed systems or big data stacks (Kafka, Spark, Hadoop). Deep understanding of REST APIs, microservices architecture... Full time
- ...Design/Detailing of Processes. Experience as Full Stack Java Developer / Architect with one or more of the following skillsets:... ...Chef / Puppet. Experience with big data technologies such as Hadoop / HBase / Spark / Kafka / Hive. Solid understanding of Machine Learning... Full time, H1b
- ...like GCP with data engineering: Dataflow / Airflow, Pub/Sub / Kafka, Dataproc / Hadoop, BigQuery. ETL development experience with a strong SQL background, as well as Python/R, Scala, Java, Hive, Spark, Kafka. Strong knowledge of Python program development to build reusable... Relocation
- ...PySpark Expert (Onsite - Dallas, TX). Responsibilities Design, develop, and optimize ETL/ELT pipelines using SQL, Python, and... ...processing workflows in distributed data platforms (e.g., Hadoop, Databricks, or Spark environments) Partner with business stakeholders to...
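For the SQL-plus-Python pipeline work described in the listing above, here is a rough sketch only (the table names and transformation are invented, not taken from the posting) of expressing an ELT step with Spark SQL on a Databricks- or Hadoop-style platform:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("elt_sql_sketch").enableHiveSupport().getOrCreate()

# Source and target tables are assumptions for the example.
spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.customer_orders_summary AS
    SELECT customer_id,
           COUNT(*)         AS order_count,
           SUM(order_total) AS lifetime_value
    FROM   raw.orders
    WHERE  order_status = 'COMPLETED'
    GROUP  BY customer_id
""")

# Quick sanity check of the materialized summary.
spark.table("analytics.customer_orders_summary").show(5)
```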
$158.6k - $181k
...Do: + Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full... ...experience with Distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL) + 2+ years' experience working on real-... Full time, Part time, Internship, H1b, Local area
- ...Technologies: Familiarity with big data tools and frameworks like Apache Spark, Hadoop, or similar technologies. CI/CD: Understanding of Continuous... ...and integrate various data sources. Data Modeling: Develop and maintain data models and architecture that support business... Full time
- ...6. Aptitude to understand and adapt to newer technologies 7. Assist in the evaluation of new solutions for integration into the Hadoop Roadmap/Strategy 8. Motivate internal and external resources to deliver on project commitments 9. The desire to learn new soft and... Full time
