- ...approaches to build solutions or break down technical problems. Develops secure, high-quality production code, and reviews and debugs code... ...Technologies: Terraform, EMR, EKS/ECS, Lambda, RDS, S3, and Spark. Experience with SQL and Linux/Unix shell scripting. Proficient...
- Overview: Software Engineer III - Java, Scala, Spark, AWS role at JPMorganChase. This position is within Corporate Technology and involves... ...by software code development. Gather, analyze, synthesize, and develop visualizations and reporting from large data sets to support...
$133k - $185k
...software code development. Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service... ...Services: EMR, EKS/ECS, Lambda, RDS, Terraform. Experience with Spark and SQL. Overall knowledge of the Software Development Life...
- ...leading financial institution in building and optimizing large-scale data processing systems. This role involves working with Hadoop and Spark to ingest, transform, and analyze high-volume market and trading data. The engineer will contribute to the development and maintenance...
- ...Scala/Spark Developer. Duration: long term. Location: Broadway, NY (3 days onsite). Mandatory skills: Big Data, Scala, Spark, Core Java. Experience: As a Spark Scala Developer, you will play a critical role in the design, development, deployment, and optimization of data...
$90k - $130k
...Experience building and enhancing a Scala framework, plus Spark experience. Design and develop an automated testing framework to perform data validation... ...high-performing and scalable data pipeline platform using Hadoop, Apache Spark, MongoDB, and Kafka. Collaborate with application...
- ...Full Stack Developer. Location: Jersey City, NJ (hybrid onsite). Salary range: $45-49/hr. Required skills & qualifications... ...Boot. Experience with RESTful services and Oracle databases. Knowledge of Kafka and Big Data technologies like Hadoop, Sqoop, and Spark...
- ...analytics. Job description, mandatory: 3+ years of experience with Apache NiFi programming; 3+ years of experience with the Apache Hadoop and Spark ecosystems of open-source tools (our data processing and modeling pipelines are built on the Hortonworks platform, HDP); 2+ years...
- ...the business and its technical teams, contractors, and vendors. Develops secure and high-quality production code, and reviews and... ...data lake platforms, such as AWS Redshift, Glue, Databricks, Spark/Hadoop, and Snowflake. Familiarity with workflow orchestration tools...
- ...premise with distributed computing and emerging technologies. Develops secure and high-quality production code, and reviews and debugs... ...building ETL/data pipelines and data platforms (e.g., Databricks, Spark/Hadoop, and Snowflake). Knowledge of workflow orchestration tools (e....
- Quantifind is seeking a Senior Platform Engineer to enhance our Apache Spark pipeline, which generates risk signal results from diverse data sources. You will define and deliver data services and machine learning infrastructure, contributing to the backend that powers our...
- ...technical items within your domain of expertise. Experience developing or leading large or cross-functional teams of... ...lake, ETL processes, and big data technologies, such as Hadoop, Snowflake, Databricks, Apache Spark, Airflow, Apache Kafka, or equivalent technology stacks...
- ...Responsibilities: AI and Agentic Solutions Development: Design, develop, and implement agentic systems for real-time decision-making... ...design). Knowledge of distributed computing frameworks (e.g., Spark, Hadoop). Familiarity with versioning tools (e.g., Git), containerization...
$158.6k - $181k
...Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full... ...experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL). 2+ years' experience working on real...
- ...approaches to build solutions or break down technical problems. Develops secure, high-quality production code, and reviews and debugs code... ...Solutions Architect strongly preferred. Proficiency in Python, Spark, Databricks. About us: JPMorganChase, one of the oldest financial...
- ...stability, and scalability. As a key technical contributor, you will develop essential technology solutions across diverse technical domains... ...Advanced in one or more big data programming language(s): Spark, PySpark, and Kafka. Experience in engineering data products...
- ...of documents at scale. Contribute to improving and extending our Spark-based distributed data processing pipeline. Help enhance our... ...building data pipelines with distributed compute frameworks like Hadoop, Spark, or Dask. Knowledge of Linux/Unix systems, Docker/Kubernetes...
- ...SQL Hyperscale databases for high availability and performance. Develop and optimize Databricks workflows (PySpark, Delta Lake) for large... ...strategies, and partitioning for Azure SQL Hyperscale. Optimize Spark jobs (Databricks) for efficiency and cost-effectiveness. Monitor...
- ...is part of the job family responsible for developing and maintaining innovative software... ...dashboards. Design and development of ETL/Hadoop, including stored procedures, queries, performance... ...-scale data sources using SQL, Hadoop, Spark, Hive, Snowflake, Databricks, Teradata, or...
- ...initiatives. The ideal candidate will be responsible for designing, developing, and validating data pipelines and analytics workflows,... ...data and transformed layers. Optimize Databricks notebooks and Spark jobs for performance and cost efficiency. Implement CI/CD integration...
- ...(e.g., Langchain, LlamaIndex) and AI frameworks. Experience with REST APIs, FastAPI, and WebSockets; knowledge of Redis, Hadoop/Hive, Neo4J, Apache Spark, Kafka, and MongoDB is a plus. Excellent written and verbal communication skills; ability to translate business strategy into...
$140k - $145k
...Synechron is hiring a Java Developer / Lead with GitHub Copilot experience. Please share resumes to... ...experience is mandatory. Knowledge of Big Data technologies like Spark and PySpark. Experience applying data engineering principles in...
- ...customers. Design AWS Data Lake implementations. Design and develop AWS data quality and governance solutions. Become a deep... ...Tooling, services & libraries: Airflow, Kafka, Parquet, Spark, Metaflow, Git, Hadoop. AWS infrastructure scripting: CloudFormation, AWS CLI, AWS...
- ...container, and OS deployment, scaling, and management. Ability to develop in multiple programming languages such as Python, Bash, or... ...desired. Experience with big data technologies like: Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, ElasticSearch, Hive, Drill,...
- ...functional teams, provide technical leadership, and mentor junior developers, fostering a culture of continuous improvement and innovation.... ...efficient solutions. Experience working with Databricks, Spark, and the financial industry is a plus. Responsibilities: Lead...
- ...knowledge of object-oriented programming and constructing algorithms. Knowledge of big data tools and frameworks such as Apache Spark, Hadoop, Kafka, and Elasticsearch. A U.S. citizen, resident, green card holder, or holder of a valid H1B. Achieved a very good degree in...
- ...manner. Your role as a vital technical contributor will involve developing critical technology solutions across numerous technical domains... ...Familiarity with data processing frameworks such as Apache Spark or Flink. Experience building RESTful APIs and working with event...
- ...Tech Centers. Role overview: Design, develop, and maintain data pipelines to extract data... ...build and unit test applications on the Spark framework in Python. Build PySpark... ...require in-depth knowledge of Databricks/Hadoop. Experience working with storage frameworks...
- ...EvolutionaryScale's mission is to develop artificial intelligence to understand biology... ...processing pipelines using tools like Spark and Ray for acquiring biology datasets... ...processing systems using technologies such as Hadoop, Spark, or Ray. Knowledge of streaming...