- Core Java Developer / Software Engineer with AWS, Big Data/Hadoop. Role: Core Java Developer / Software Engineer. Duration: Open, Long Term. Interview: Phone... ...applications and supporting infrastructure, e.g., Spark, Redshift, and Kinesis. Needs to be built for...
- ...to join the Central Software (CSW) team. The role focuses on developing and maintaining robust cloud-based data pipelines and other big... ...-on experience with big data technologies such as Apache Spark, Hadoop, Kafka, Flink, or similar distributed systems. Expertise in relational... (Full time)
$179.84k - $247.28k
...experience building and scaling big data pipelines and architectures from scratch. Deep expertise in big data frameworks (Hadoop, Spark) and the JVM stack (Java, Scala). Strong software engineering fundamentals and ability to write efficient, high-quality code... (Full time, Worldwide)
- Senior Software Development Engineer (Scala, Data Warehouse, Spark) - Remote ...2 or 3, and related libraries such as Zio. 2+ years experience developing Data Warehouses using databases such as Delta Lake and Postgres... (Remote job)
- Senior Software Engineer - Core Data (Hadoop, HBase). Full-time SaaS company. Scaling web APIs to thousands of concurrent requests; working with distributed systems, gracefully handling failures and partitioning data for maximum throughput; designing databases and caching layers... (Full time)
- ...Job Description: A rapidly expanding educational institution is hiring a Senior Backend Developer for their team. As a Senior Backend Developer, you'll be on their development team in charge of designing, building, and implementing... (Full time)
$90k - $190k
...and other development team members to continue to support and develop a robust, high-quality software system that can be accessed on... ...experience with Cassandra, Kafka, Minio, Elasticsearch, HBase, Hadoop, Lucene, Python. Exposure to or knowledge of machine learning algorithms... (Work at office, Local area, Immediate start)
- Job Description: The Company: Tutor Intelligence is building the technology and processes to let robots go where they've never gone before: the average American factory. We understand that general-purpose and generally-intelligent robots are going... (Work at office)
- ...Hybrid with 2-3 days remote per week. Responsibilities: Port embedded software from Cortex-M3 to Linux environments. Develop Linux applications to interface with parallel and SPI hardware. Collaborate with customers and internal teams to define and execute... (Permanent employment, Contract work, Remote work, 2 days per week, 3 days per week)
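The embedded listing above mentions developing Linux applications that talk to SPI hardware. Purely as an illustrative sketch (not from the posting), the snippet below shows one common userspace approach on Linux, the spidev interface via the py-spidev package; the bus/device numbers, clock speed, and bytes transferred are hypothetical placeholders.

```python
# Illustrative sketch: talking to an SPI peripheral from Linux userspace
# through /dev/spidev* using the py-spidev package. Bus/device numbers,
# clock speed, and the bytes sent are hypothetical placeholders.
import spidev

spi = spidev.SpiDev()
spi.open(0, 0)                # open /dev/spidev0.0 (bus 0, chip-select 0)
spi.max_speed_hz = 1_000_000
spi.mode = 0                  # SPI mode 0 (CPOL=0, CPHA=0)

try:
    # Full-duplex transfer: send two bytes, receive two bytes back
    response = spi.xfer2([0x9F, 0x00])
    print("device responded:", [hex(b) for b in response])
finally:
    spi.close()
```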
- ...value they get from athenaOne. Job Responsibilities: Design and develop code on an Agile team of Engineers, a Scrum Master, a Product Owner... ...and talented employees, or athenistas, as we call ourselves, spark the innovation and passion needed to accomplish our goal. We... (Full time, Work at office, Remote work)
- ...provide include Scribe, Kafka, Hive, Presto, Spark, Flink, Pinot, and others. The team is... ...up to date. What you'll be doing: Develop best practices around cloud infrastructure... ...technologies/tools mentioned here: Big Data / Hadoop, Kafka, Spark, Airflow, Presto, Druid,... (Full time, Work experience placement, Local area, Remote work)
- ...TensorFlow, Keras, LangChain, and LlamaIndex. Experience with cloud platforms (e.g. Azure, AWS, GCP). Knowledge of big data tools (e.g. Spark, Hadoop). Strong understanding of data pipelines, APIs, and system integration. Healthcare and/or revenue cycle experience strongly... (Shift work)
- Job Description: Xometry (NASDAQ: XMTR) powers the industries of today and tomorrow by connecting the people with big ideas to the manufacturers who can bring them to life. Xometry's digital marketplace gives manufacturers the critical resources they...
- ...growing team, making a huge social impact, touching the lives of those we love and care about most. We are a fast-paced startup developing a game-changing technology that generates nitric oxide on demand, at the patient location, from the air we breathe. Our Third... (Full time, Part time, Local area)
$158.6k - $181k
...Overview: Senior Data Engineer (PySpark/Spark, AWS, Databricks, EMR, ETL/SQL). Do you... ...with and across Agile teams to design, develop, test, implement, and support technical solutions... ...data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL... (Full time, Part time, Internship, H1B, Local area) $140k - $225k
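This Senior Data Engineer listing (and the similar ones further down) centers on PySpark/Spark ETL over AWS-style storage. As a minimal, purely illustrative sketch of that kind of batch pipeline, assuming a hypothetical S3 bucket and column names not taken from any posting:

```python
# Minimal PySpark ETL sketch: read raw events, clean, aggregate, write Parquet.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: raw JSON events from an (assumed) S3 landing zone
raw = spark.read.json("s3://example-bucket/raw/events/")

# Transform: drop malformed rows, normalize a timestamp, aggregate per user/day
clean = (
    raw.dropna(subset=["user_id", "event_ts"])
       .withColumn("event_date", F.to_date("event_ts"))
)
daily = clean.groupBy("user_id", "event_date").agg(F.count("*").alias("event_count"))

# Load: partitioned Parquet output for downstream consumers (e.g. Hive/Athena)
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_user_events/"
)

spark.stop()
```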
...role: The Principal Platform Engineer shall have senior software engineering experience building and operating hybrid-infrastructure developer platforms (private and commercial cloud infrastructure) at an Individual Contributor level. A successful candidate has prior... (Work experience placement, Live in, Remote work, Flexible hours) $158.6k - $181k
Overview: Senior Data Engineer (PySpark/Spark, AWS, Databricks, EMR, ETL/SQL). Data Engineers... ...with and across Agile teams to design, develop, test, implement, and support technical... ...distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or... (Full time, Part time, Internship, Visa sponsorship) $70k - $120k
...of our company. What you'll do: ETL/ELT Development: Design, develop, and maintain robust, scalable data pipelines to support both... ...GCP, Azure). Experience with big data technologies like Apache Spark or Hadoop. Some knowledge of data visualization and business... (Full time, Flexible hours, Night shift, Weekend work) $230k - $368k
...strategy. Deep expertise in building and modernizing distributed data platforms using technologies such as Apache Spark, Kafka, Flink, NiFi, and Cloudera/Hadoop. Hands-on proficiency with modern data platforms and tools including Databricks, Snowflake, Delta Lake, and... (Full time, Part time, Worldwide, Flexible hours)
- Mandatory Skills: Big Data, Java, Scala, Apache Spark. Required Skills: 8+ years experience in Big Data with Java & Scala. Should have good experience... ...in Big Data technologies. Should have worked in the Apache Spark and Hadoop stack. Should have good communication skills. Should be an... (Work at office, Immediate start)
- ...key talents. Emergent technology staffing (UI/UX developers & designers, Mobility & Cloud Engineers, Big Data/Hadoop developers & architects… & more) remains our primary... ...Experience with Hadoop ecosystem tools such as Spark, MapReduce, or other Big Data platforms. Interest in...
$155.3k - $207k
...is mining & filtering the massive influx of fleet data by developing billions-scale data workflows and state-of-the-art mining... ...manipulation (e.g. Athena, Redshift, BigQuery). Experience with Spark, Beam, Kafka, Hadoop, or other data processing tools. Fluency in Python and... (Remote work) $120k - $190k
...are building an AWS edge platform for disconnected operations and must ensure a smooth software deployment process for applications developed on IL4 and delivered to IL6/SIPR. You will be responsible for ensuring observability, monitoring, and alerting operate as engineered... (Live in, Local area, Remote work, Flexible hours) $225.4k - $257.2k
...Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full... ...experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL). 4+ years experience working on real-time... (Full time, Part time, Internship, Local area)
- ...One. Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full... ...experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL). 4+ years experience working on real-... (Full time, Part time, Internship)
- Job Description: Join a cutting-edge engineering team developing advanced laser communication systems used across mission-critical environments. As an Embedded Software Engineer, you will be responsible for designing, coding, testing, and deploying embedded software... (Flexible hours)
$179.84k - $247.28k
...team. In this role, you will be responsible for architecting and developing large-scale, distributed data processing pipelines that power... ...scalable data processing pipelines using technologies like Apache Spark, Apache Beam, and Airflow. Optimize data processing systems... (Full time, Worldwide)
- ...experience building products and services that improve software developer feedback loops, time to production, rollback strategies, CI/... ...of big data technologies (e.g., Elasticsearch, Apache Hadoop, Spark, Kafka, etc.). Relevant Certifications: Certifications in Cloud... (Full time, Shift work)
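The first listing above pairs Spark/Beam pipelines with Airflow for orchestration. Below is a bare-bones Airflow (2.4+) DAG sketch showing how such a pipeline is typically wired; the DAG id, schedule, and task callables are hypothetical placeholders rather than anything from the posting.

```python
# Bare-bones Airflow DAG sketch: one extract task feeding one aggregation task.
# DAG id, schedule, and callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events(**context):
    # Placeholder: pull a day's worth of raw events (e.g. from S3 or Kafka).
    print("extracting events for", context["ds"])


def run_spark_aggregation(**context):
    # Placeholder: in practice this step would submit a Spark/Beam job
    # (e.g. via a Spark-submit or EMR/Dataflow operator).
    print("aggregating events for", context["ds"])


with DAG(
    dag_id="example_daily_events",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    aggregate = PythonOperator(task_id="aggregate_events", python_callable=run_spark_aggregation)

    extract >> aggregate
```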
- ...Terraform, ensuring security, scalability, and cost awareness. Develop and maintain CI/CD pipelines (Jenkins, Argo CD, GitHub Actions)... ...Airflow, Cloud Composer). Exposure to big-data frameworks (Spark, Flink) or modern data-lake architectures. Knowledge of cost-... (Full time)
- ...conceptual model for a data domain inside the central data platform. Develop and maintain data schemas and models for products within... ...in Cloud (GCP, AWS, ...), SQL and NoSQL databases, Java, Hadoop, Python, Spark, Data Lakes, Data Warehouses, and Data Marts preferred... (Work experience placement, Immediate start, Remote work, Free visa, Flexible hours)

