- ...of experience building data pipelines in cloud environments. ~4+ years of experience with Big Data technologies (e.g., Spark, Hadoop) and cloud architecture. ~3+ years of experience with reporting and analytics tools (e.g., Tableau, Power BI). ~ Hands-on... (Long term contract)
- Cognizant Technologies delivers value-added, turnkey solutions that accelerate our clients' growth. Whether it's entering new markets, introducing innovative new products, or initiating transformational process changes, Cognizant Technologies will ...
- ...SQL; relational (PostgreSQL, MySQL) and NoSQL (MongoDB) experience. Cloud & Big Data: AWS/Azure/Google Cloud Platform, Spark, Hadoop, scalable storage (S3, Blob, HDFS). Ready to apply? Take the next step in your data engineering career with this exciting opportunity... (Contract work)
- ...Develop and optimize ETL/ELT processes, data flows, and infrastructure leveraging AWS, SQL, and platforms such as Databricks, Redshift, Hadoop, and Airflow. Assemble and manage large, complex datasets to meet functional and non-functional business requirements... (Long term contract, Remote work)
- ...Must be local to Reston; no relocation. On-site 3 days a week. Top 5 technical skills: Python (big data pipeline); AWS; Hadoop, Spark, Hive; EMR; Terraform. Job description: Strong Python development to build a big data pipeline for data processing and analysis. Need strong experience... (Contract work, Work experience placement, Local area, 3 days per week)
- ...solutions in virtualized environments. ~ Experience in one or more of the following: Networking, DevOps, Security, Compute, Storage, Hadoop, Kubernetes, or SRE. ~ Networking: experience delivering end-to-end networking designs and architectures that include DNS,... (Contract work, Remote work)
- ...Preferred qualifications: Experience with streaming APIs like Kafka. Understanding of Big Data/Data Lake technologies (Spark, Hadoop, Databricks, etc.). Understanding of design patterns and clean coding. Understanding of technical aspects of Analytic... (Local area)
- ...or technically related discipline. Minimum of 5+ years of relevant industry experience. Minimum 5 years of experience developing in the Hadoop ecosystem (Spark, PySpark, MapReduce, Hive, Impala). Minimum 5 years of experience with common application frameworks (JEE, Spring... (Remote work)
- ...~ Hands-on experience with AWS services (S3, Glue, Redshift, Lambda, EMR). ~ Experience with big data tools like Apache Spark, Hadoop, or Kafka. ~ Knowledge of Machine Learning concepts and model lifecycle. ~ Experience with ETL/ELT frameworks and data pipeline... (Remote work)
- ...Familiarity with market risk concepts including VaR, Greeks, scenario analysis, and stress testing. ~ Hands-on experience with Hadoop, Spark. ~ Proficiency with Git, Jenkins, and CI/CD pipelines. ~ Excellent problem-solving skills and strong mathematical and analytical...
- ...technically related discipline. Minimum of 5+ years of relevant industry experience. Minimum 5 years of experience developing in the Hadoop ecosystem (Spark, PySpark, MapReduce, Hive, Impala). Minimum 5 years of experience with common application frameworks (JEE... (Remote work)
$71.42 - $90.22 per hour
...infrastructure tooling, including Ansible, Chef, Terraform, Jenkins, Docker, Kubernetes. Experience with big data technologies: Apache Hadoop, Hive, Spark ecosystem. Familiarity with Google Cloud infrastructure and security. Experience with Java and Spring Boot,... (Contract work, Remote work)
- ...Senior Software Engineer experience in the Hadoop ecosystem (Spark, PySpark, MapReduce, Hive, Impala). Strong background in application frameworks such as JEE, Spring Boot, Struts, and Hibernate. Proficiency with relational databases (MS SQL Server preferred, Oracle,...
$65.05 per hour
...a quantitative discipline. Preferred skills: Background in enterprise stress testing. Experience with Elastic, Hadoop, Teradata, or other distributed big data ecosystems. Knowledge of cloud or distributed computing environments. Prior work... (Contract work)
- ...~ Solid Angular frontend experience. ~ Hands-on AWS experience (Lambda, ECS, S3, RDS, EMR). ~ Big Data experience is required: Hadoop, Spark, Presto, or EMR. ~ Strong SQL skills with performance tuning. ~ Experience with scripting (Python, Unix shell, Groovy,... (Contract work)
- ...Foundation: Strong understanding of statistics, linear algebra, and calculus as applied to ML. Big Data: Experience with Spark, Hadoop, or similar distributed computing frameworks. Cloud Platforms: Familiar with AI services on AWS, Azure, or Google Cloud...
- ...Airflow). Knowledge of real-time inference and batch processing systems. Familiarity with big data technologies (Spark, Kafka, Hadoop). Experience with containerization (Docker, Kubernetes). Background in applied AI domains (NLP, computer vision,... (Remote work)
- ...Kubernetes, Cloud (AWS/Azure/GCP). Experience in building solutions with Big Data tools and frameworks, some of: SQL DBs, Spark, Hadoop, Cassandra, Redis, Mongo, Elastic… ETL-like processing in a Big Data environment: S3, Redshift, DynamoDB, Kinesis, EMR, Athena,... (Permanent employment, Full time, Home office)
- A leading technology solutions provider in Minneapolis is seeking an experienced Software Engineer for a 3-6 month contract-to-hire opportunity. The candidate will perform all phases of software engineering, including designing reusable Java components and implementing ... (Contract work)
$99k - $225k
...development of algorithms leveraging R, Python, or SQL/NoSQL. Experience with distributed data or computing tools, including MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL. Experience with visualization packages, including Plotly, Seaborn, or ggplot2. TS/SCI... (Full time, Contract work, Part time, Local area, Remote work)
- ...SCI eligibility and active polygraph or ability to obtain a polygraph. Advanced knowledge of data science toolkits such as Apache Hadoop and SAS. Proficiency in Python, R, C++, SQL, and visualization technologies. Experience with intelligence data (e.g., telemetry)...
- ...Engineering, Data Science, Data Analytics, Solution Architecture, Data Architecture. 3+ years of experience with Big Data systems, including Hadoop, Spark. 3+ years of experience with Python, Java, Shell Scripting, SQL, or R. Experience with briefing senior leadership. Experience... (2 days per week, 3 days per week)
- ...management of AWS services including EC2, S3, and IAM. Work with the team to implement and optimize big data processing frameworks such as Hadoop and Spark. Help with the integration and use of various compute instances for specific data processing needs. Contribute to the...
- ...relevant field experience. Some background in deploying and supporting structured and large-scale database environments, such as the Hadoop ecosystem, Elasticsearch, and Postgres/MySQL. About BlueVoyant: At BlueVoyant, we recognize that effective cyber security requires... (Full time, Work at office, Local area, Remote work, Flexible hours)
$175k - $180k
...experience with relational and NoSQL databases. Extensive experience with distributed data processing frameworks (e.g., Apache Spark, Hadoop) and stream processing technologies (e.g., Apache Kafka, Flink). Strong understanding of data warehousing concepts, dimensional... (Full time, Remote work)
- ...Poly is required for access to classified environments and systems. Requirements: Subject Matter Expert in the use of Spark and the Hadoop distributed processing framework. Expert knowledge in utilizing different EC2 compute instance types for in-memory, Delta cache, and... (Live in, Relocation)
- ...and analytics platforms, including SQL and cloud platforms. Advanced programming knowledge and expertise in tools like Apache Hadoop. Proficiency with visualization tools such as Tableau, Power BI, or Kibana. Excellent organizational and problem-solving skills. Preferred...
- ...experience in job offered, Database Analyst, Database Administrator, or related. Experience in Python, Java, Scala, SQL, AWS, Azure, GCP, Hadoop, Spark, Kafka, PySpark, AWS S3, AWS Lambda, Redshift, Spark MLlib. Travel and relocation possible to unanticipated client locations... (Relocation)