Average salary: $130,520 per year
Responsibilities
Production support role for the Hadoop ecosystem, including Sqoop, Spark, Hive, HDFS, Impala, and Kafka
Project-based work: upgrades and implementing new capabilities
Required Skills
Experience installing and working with Cloudera CDP 7.1.7 or higher
Red...
...industry knowledge and innovative strategies, we serve a wide range of sectors, specializing in both IT and Non-IT fields.
The Role
Hadoop:
Data extraction, data science/modeling, and the ability to mentor junior developers. 9+ years of experience on pure Hadoop....
...and experiences that matter. When you step into a career with AT&T, you won’t just imagine the future – you’ll create it.
As a Hadoop Platform Tester, you will play a crucial role in ensuring the quality and reliability of our cyber security platforms leveraging...
...Job Description
Big Data/Hadoop Engineer (the ideal candidate will be a Sr. Java developer currently working in Big Data and on the AWS cloud platform)
Responsibilities:
Work with big data analytics engineers to develop automated testing systems....
...Work closely with o9 Dev, DevOps, and project teams at all levels to help ensure the success of projects
Help design and implement Hadoop architectures and configurations for customers working with Cloud deployments
Write and produce technical documentation and...
...Position: Informatica with Hadoop
Location: Charlotte NC, Dallas TX
Role Type: Full Time Employment
10+ years of IT experience.
At least 6 years of experience in ETL, using the Informatica PowerCenter tool.
Should be strong in data warehousing...
...Supermicro is a Top Tier provider of advanced server, storage, and networking solutions for Data Center, Cloud Computing, Enterprise IT, Hadoop/ Big Data, Hyperscale, HPC and IoT/Embedded customers worldwide. We are the #5 fastest growing company among the Silicon Valley Top...
...Job Mode: Onsite
Required Skills:
4+ years of experience in a Big Data environment
GCP BigQuery, Hadoop architecture, HDFS commands, and designing and optimizing queries to build data pipelines.
3+ years of experience in building...
...knowledge.
Roles and responsibilities
Migration of data from in-house applications (primarily Oracle, Netezza, and Hadoop) to the Google Cloud Platform
Build data models using Google BigQuery on the Google Cloud Platform
Develop Python scripts for...
...product business and data science team to collect user stories and translate them into technical specifications
• Uses knowledge of Cloud & Hadoop architecture and HDFS commands, and experience designing & optimizing queries to build data pipelines
• Uses strong programming skills...
...engineering such as Python or PySpark
Knowledge of big data platforms like Snowflake, DBT, AWS Redshift, Postgres, MongoDB, and Hadoop
Experience with the Snowflake Cloud Data Warehouse and the DBT tool
Experience with data pipeline and workflow management tools
Advanced...
...complex stored procedures a plus
~ Preferred Skills:
o Denodo Experience a plus
Big Data experience a plus (Hadoop, MongoDB, Exadata)
Soft Skills:
Must have good communication skills, as they will be working with different business units
If...
...Cloud/On-Premises).
Experience working with cloud technologies such as AWS (Lambda, S3, Step Functions, SNS, SQS)
Experience with Hadoop architecture and HDFS commands, and experience designing & optimizing queries to build data pipelines
Benefits and Other Fun Stuff...
...data management platforms, including relational and non-relational data engines (MS SQL, MySQL, PostgreSQL, Amazon Redshift, MongoDB, Hadoop, Snowflake, BigQuery)
Understanding of and exposure/experience with advanced analytics/machine learning
Regards,
Manoj...
...Proficient in object oriented design and design patterns
• Experience in ETL, Data warehouse concepts
• Experience with DataStage, the Hadoop ecosystem, and the Control-M scheduling tool
• Experience with unit testing tools such as JUnit, TestNG,
• Can describe solutions...
...~ Experience and knowledge with the Data Warehouse ETL process
~ Exposure to cloud platforms and big data systems such as Hadoop, HDFS, and Hive is a plus
~ Experience with tools and concepts related to data and analytics, such as dimensional modeling, ETL,...
...will be responsible for building out the platform for processing and delivering large amounts of data using technologies like Spark, Hadoop, Kafka, MemSQL, Elastic, and Data Lake, alongside traditional RDBMS technologies.
We are a consumer driven team, with the goal to...
...Duration: Full Time
Required Skill:
The qualifying candidate should be able to work on one or more projects on Hadoop and RDBMS platforms, including technical deliverables as per business needs.
Worked in an onshore-offshore model with extensive...
...Excellent team player with the ability to work with multiple stakeholders
• Excellent ability to deliver under pressure
• Worked in a Hadoop environment as a developer
• Some hands-on experience writing MapReduce jobs in Java, Pig, or Hive
Knowledge of Test...
...) techniques relevant to telco (automation scripting, Terraform, etc.)
Learn model-driven operations with Juju charms for Kafka, Hadoop, PostgreSQL, MongoDB, NGINX, and more
Help customers adopt advanced Bare Metal, Public, Private and Hybrid Cloud solutions
Learn...
...Qualifications:
Experience in the Financial Service Industry
Experience in Cluster Computing and Big Data solutions: Spark, Hadoop, HDFS, XRS using public cloud
Degree in Computer Science
About Goldman Sachs
The Goldman...
...team and work on developing and implementing universal forecasting models, focusing on ML forecasting techniques using Python and Hadoop.
- DSI application is a unique, ML-driven platform that incorporates over 100 financial metrics to create a Universal Forecasting...
..., Athena, Glue, DynamoDB, Redshift Spectrum, Lambda, and CloudFormation
~6+ years of experience processing large data sets using Hadoop, HDFS, Spark, Kafka, Flume, HBase, Solr, or similar distributed systems
~5+ years of experience in E-R and dimensional modeling
~5+...
...to inform conclusions about the data
Skilled in creating and managing databases with the use of relevant software such as MySQL, Hadoop, or MongoDB
Programming including coding, debugging, and using relevant programming languages
Communication including...
...processes and outcomes, and inform decision-making
Develop solutions in R or Python
Develop production-grade solutions
Work in Hadoop, Redshift, and Spark
Translate business and product questions into analytics projects
Communicate clearly over written and...
...and self-motivated
Preferred Qualifications
Experience with microservice architectures (SOA)
Experience with Kafka, MongoDB, Hadoop, Cassandra
Experience with Payments systems (understanding of the various payment networks)
Experience with SQL databases (...
...SAS, R, Python, etc. tools to mine, manipulate & aggregate complex consumer- and transaction-level data on big data platforms such as Hadoop, Spark, etc.
Use complex statistical techniques such as decision trees, regression modeling, machine learning, testing techniques...
...technologies (e.g., SQL Server, PostgreSQL);
Unstructured data storage technologies (e.g., MongoDB, Neo4j);
Big data tools (e.g., Hadoop, Spark/PySpark, Kafka);
Testing tool experiences (e.g., Selenium, Cucumber, Postman);
Visualization tools (e.g., Tableau,...
...Financial Services Division.
Requirements:
~3-5+ years of experience in Data Engineering
~ Experience with Big Data technologies (Hadoop, Spark, Hive, etc.)
~ Experience working within cloud environments (AWS preferred)
~ Demonstrated backend work experience
~...