Average salary: $122,333 per year
- ...industries. From large-scale mass communication projects to highly secure and confidential data analytics using Big Data frameworks like Hadoop and AWS Redshift, ODS has a broad set of proven technical capabilities that help companies make complex digital transformations...
- ...learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn). Experience with data processing tools and big data technologies (e.g., Hadoop, Spark). Demonstrated ability to lead and mentor technical teams, fostering a collaborative and innovative work environment...
- ...Strong project leader ~ Infrastructure build experience ~ Executive presentations ~ Critical thinking for remediation plans ~ Hadoop platform knowledge ~ Experience with Scrum and Jira ~ Knowledge of infrastructure technologies (physical server hosting,...
- ...results under tight deadlines. Strong understanding of SQL and an ability to identify data quality issues. Experience working with Hadoop, Teradata, Oracle, MS DBs, and Snowflake. Experience with reporting technologies like Tableau, SSRS, Alteryx, Toad, SQL Developer,...
- ...Experience with data analytics and/or visualization techniques (e.g., SQL, Python, Tableau, Alteryx), as well as big data technologies (e.g., Hadoop, Cassandra, AWS) ~ Excellent communication, with an ability to convey the strategic vision in a digestible manner for varying...
$60 - $70 per hour
- ..., or a related field. ~ 10+ years of relevant experience in Big Data engineering and development. ~ Expertise in Oracle PL/SQL, the Hadoop ecosystem, Hive tables, and Python/PySpark. ~ Experience in creating workflows and scheduling jobs using Autosys, with a strong background...
- ...financial services industry, specifically Wealth Management. Understanding of big data storage and processing technologies including Hadoop, Spark, HBase/Cassandra, Phoenix, Pig, and Hive. Demonstrated ability to meet challenging growth metrics and client/partner...
$193.4k - $220.7k
- ...experience with a public cloud (AWS, Microsoft Azure, Google Cloud) ~ 4+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL) ~ 4+ years of experience working on real-time data and streaming applications ~ 4+ years...
- ...algorithms. ~ Proficient in data science programming languages like Python, R, or Scala. ~ Experience with big-data technologies such as Hadoop, Spark, SparkML, etc., and familiarity with basic data table operations (SQL, Hive, etc.). ~ Demonstrated relationship building...
$158.6k - $181k
- ...experience with a public cloud (AWS, Microsoft Azure, Google Cloud) ~ 3+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL) ~ 2+ years of experience working on real-time data and streaming applications ~ 2+ years...
- ...Experience with data sources and Vector Store platforms such as Redis, Solr, Postgres DB, FAISS, Teradata, Oracle, SQL Server, Hadoop, etc. Experienced in using design patterns and following best software engineering practices. An understanding of fundamental...
- ...MongoDB, InfluxDB, or similar time-series DB) ~ SQL languages (SQL, NoSQL, Python) ~ ETL/batch processes (Spring Batch, Java) ~ Big Data (Hadoop, Apache Spark, Apache Parquet, or similar) ~ Excellent communication skills both vertically and horizontally within the...
- ...data sources to Trans Data Hub. Coding and testing. Technical Skills: SQL, CA7, Python. Flex Skills: Spark, Hadoop. Soft Skills: Great communication (team is mostly remote), suggesting solutions, critical thinking and problem solving. Degrees...
- ...design and maintenance. Direct software programming and development of documentation. Must-Have Technical Skills: ETL, Hadoop, Python, Teradata, SDLC. Education/Certifications: Bachelor's; will consider experience in lieu of education. Software testing...
- .../financial background preferred. Roles and Responsibilities: Candidate must have at least 8 years of experience with strong ETL/Hadoop data testing experience. He/she must have strong knowledge of SQL/HQL and be able to work independently with minimal supervision. Also...
- ...stakeholders across the organization and take complete ownership of deliverables. ● Experience in using big data technologies like Hadoop, Spark, Airflow, NiFi, Kafka, Hive, Neo4j, Elasticsearch ● Adept understanding of different file formats like Delta Lake, Avro,...
- ...that is not listed at this moment* Roles and Responsibilities: Candidate must have at least 8 years of experience with strong ETL/Hadoop data testing experience. He/she must have strong knowledge of SQL/HQL and be able to work independently with minimal supervision. Also...
- ...• Will be the Apache Iceberg SME • Designing the environment • Building the steps to implement the new solution • Moving from Hadoop to the new open-source software. Must-Have Technical Skills: Level 4 – 6+ years (will not have this with Apache Iceberg as it...