- Core Java Developer / Software Engineer with AWS, Big Data/Hadoop (Contract), USM Business Systems. Role: Core Java Developer / Software Engineer. Duration: Open... ...applications and supporting infrastructure; e.g., Spark, Redshift, and Kinesis. Needs to be built for... (Contract work)
- ...CloudFormation to manage and version-control cloud resources. Develop and maintain robust data orchestration workflows using modern... ...Actions. In-depth knowledge of big data technologies (e.g., Apache Spark, Kafka) and data orchestration tools (e.g., Apache Airflow,... (Full time, Local area)
- Senior Software Engineer - Core Data (Hadoop, HBase), full-time, SaaS company. Scaling web APIs to thousands of concurrent requests. Working with distributed systems: gracefully handling failures and partitioning data for maximum throughput. Designing databases and caching layers... (Full time)
- ...search, and digital ads within a Big Data environment; design, develop, and maintain high-performance applications and microservices... ...What is our technology stack? Java, Spring, Hibernate, Scala, Spark, Hadoop, Git, HBase, Unix, Docker, Maven, Ansible, Postgres and... (Local area, Flexible hours)
- ...and other development team members to continue to support and develop a robust, high-quality software system that can be accessed on... ...experience with Cassandra, Kafka, Minio, Elasticsearch, HBase, and Hadoop. Exposure to or knowledge of machine learning algorithms and... (Work at office, Local area, Immediate start)
- ...experience building products and services that improve software developer feedback loops, time-to-production, rollback strategies, CI/... ...of big data technologies (e.g., Elastic Search, Apache Hadoop, Spark, Kafka, etc.). Relevant Certifications: Certifications... (Full time, Shift work)
- ...provide include Scribe, Kafka, Hive, Presto, Spark, Flink, Pinot, and others. The team is... ...up to date. What you'll be doing: Develop best practices around cloud infrastructure... .../tools mentioned here: Big Data / Hadoop, Kafka, Spark, Airflow, Presto, Druid, Opensearch... (Full time, Work experience placement, Local area, Remote work)
- ...you will work with our Cloud Platform team designing and developing web services, device IoT services, and REST APIs... ...management systems such as Puppet or Chef. Experience with Hadoop ecosystem tools such as Spark, MapReduce, or other Big Data platforms. Interest in emerging... (Home office)
- ...provide include Scribe, Kafka, Hive, Presto, Spark, Flink, Pinot, and others. The team is... ...up to date. What you'll be doing: Develop best practices around cloud infrastructure... ...technologies/tools mentioned here: Big Data / Hadoop, Kafka, Spark, Airflow, Presto, Druid,... (Work experience placement, Local area)
- ...Full Stack/Backend Developer needed to support the development of cloud-hosted back-end elements of a data... ..., and trade-offs of common data warehouse (e.g. Apache Hadoop); workflow orchestration (e.g. Apache Beam); data extract, transform... (Remote work, Flexible hours)
- The ETL developers will be responsible for delivering high-performing, reliable, and scalable data warehousing and ETL solutions... ...Microstrategy) is a plus. Experience with Big Data technologies (Hadoop, MapReduce, Spark) is a plus. Experience with database technologies such as... (Local area, Flexible hours)
- ...and machine learning infrastructure. Hands-on experience with Spark, Ray, Kubernetes, Terraform, and AWS services such as EMR, S3,... ...Experience with SageMaker, Azure ML, or big data ecosystems (e.g., Hadoop, Hive, HBase). Background in payments, fintech, or risk-focused... (Permanent employment, Full time, Remote work)
- Mandatory Skills: Big Data, Java, Scala, Apache Spark. Required Skills: 8+ years of experience in Big Data with Java & Scala. Good experience... ...projects in Big Data technologies. Worked with Apache Spark and the Hadoop stack. Good communication skills. Individual contributor and... (Work at office, Immediate start)
- ...the third consecutive year. Join Our Team as a Full Stack Developer! Are you a coding wizard with a passion for solving... ...Elastic Search, plus a knack for big data technologies like Spark, HBase, HIVE, and Hadoop. Why Join Us: Competitive salary and comprehensive benefits... (Local area, Remote work, H1b, Flexible hours)
- ...We are looking for an experienced Quant Developer with Python experience to join our Research... ..., NoSQL databases. Distributed computing: Spark, Dask, or HPC. Understanding of/interest in... ...any of the following would be valuable: Hadoop, Spark, Kafka, and related technologies... (Full time, Local area)
- ...leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in... ...databases; familiarity with big data technologies like Hadoop, Spark, or Kafka is a plus; experience with machine learning and... (Full time, H1b)
- ...Role: Embedded IoT Developer. Work location: Burlington, MA (onsite). Contract. Rate: $50/hr on W2, $60/hr on C2C. Job Description: "Seeking a skilled Embedded IoT Developer to design, develop, and optimize firmware for connected consumer products." Key Responsibilities... (Contract work, Work at office)
- ...leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in... ...relational databases; familiarity with big data technologies like Hadoop, Spark, or Kafka is a plus; experience with machine learning and... (Full time, H1b)
- ...and a deep understanding of machine learning workflows. Key Responsibilities: Design, implement, and maintain CI/CD pipelines. Develop and manage automated pipelines for code deployment, testing, and monitoring, ensuring the seamless integration of new features and... (Permanent employment, Work at office)
- ...Engineer collaborates with economists, data scientists, analysts, developers, and vendors to deliver robust, scalable, secure... ...research teams. Familiarity with big data technologies like Hadoop, Spark, or similar platforms. Experience with high-performance computing... (Permanent employment, Full time, Temporary work, Part time, Work at office, Shift work, 3 days per week)
- ...bugs, quickly identifying and prioritizing bugs, and working with developers to resolve them. Maintain comprehensive product and support... ...(Bash/Python). Experience with databases such as Kafka, Spark, Cassandra, ElasticSearch, and Redis. Extensive experience with... (Work at office, Visa sponsorship, 3 days per week)
- ...System Development Division (SDD) is focused on delivering National Defense capabilities by driving mission-focused strategies to develop advanced technology systems enabling enduring products and solutions focused on achieving the customer vision. Our programs deliver... (Contract work, Work at office, Remote work)
- ...System Development Division (SDD) is focused on delivering National Defense capabilities by driving mission-focused strategies to develop advanced technology systems enabling enduring products and solutions focused on achieving the customer vision. Our programs deliver... (Contract work, Remote work)
- ...operations. Backed by top-tier investors, including Andreessen Horowitz, Spark Capital, and Renegade Partners, we are boldly investing in R&D and... ...will do: As part of our product engineering team, you will develop interfaces that are used in mission-critical operations to...
- ...development technologies, tools, and processes. Present your own designs to internal/external groups and review designs of others. Develop test strategies, design automation frameworks, and write unit/functional tests to drive up code coverage and automation metrics...
- ...applicable to unlock deeper insights from healthcare data. Develop and automate data analysis workflows and reporting mechanisms... ...Experience with data analytics tools and technologies (e.g., Spark, Hadoop, machine learning libraries). Proven track record of implementing...
- Join to apply for the Associate, Quantitative Developer role at Arrowstreet Capital, Limited Partnership... ...packages; high-performance computing; distributed computing; Hadoop, Spark, Kafka, and related technologies; SQL; Unix/Linux system tools... (Full time, Local area)
- ...Collections, multi-threading, memory management, and concurrency. Experience in large-scale cloud data migrations using Snowflake, Python, Spark, SQL. Good understanding of Agile software development frameworks. Strong communication and analytical skills. Ability to work in teams... (Relocation)
- ...networking and services in real-world environments. Architect cloud infrastructure solutions like Kubernetes, Kubeflow, OpenStack, Ceph, and Spark, either on-premises or in public cloud (AWS, Azure, Google Cloud). Architect and integrate popular open source software such as... (Local area, Remote work, Work from home, Worldwide)
- ...to apply for the Director, Quantitative Developer role at Arrowstreet Capital, Limited Partnership... ..., NoSQL databases. Distributed computing: Spark, Dask, or HPC. Ability to drive technical... ...any of the following would be valuable: Hadoop, Spark, Kafka, and related... (Local area)