Hadoop Developer Job Description Template
Our company is looking for a Hadoop Developer to join our team.
Responsibilities:
- Identify technologies that can fulfill user-story requirements from an analytics perspective;
- Integrate multiple data sources using extract, transform, and load (ETL) processes;
- Lead a team of highly motivated data integration engineers;
- Build data lake and data marts using HDFS, NoSQL and Relational databases;
- Collect and process event data from multiple application sources, both internal and from external vendor products;
- Identify client organization’s strengths and weaknesses;
- Write high quality source code to program complete applications within deadlines;
- Understand business requirements and how they translate into application features;
- Create ETL pipelines using SQL, Python, Hive, and Spark to populate data models;
- Evaluate existing applications to reprogram, update and add new features;
- Work closely with team members to implement the requirements captured in Rally user stories;
- Apply a background in Spark, Scala, Python, and statistics;
- Conduct functional and non-functional testing.
Requirements:
- Bachelor’s Degree in Computer Science, Computer Engineering or a closely related field;
- Advanced knowledge of ETL/data routing and familiarity with tools such as NiFi and Kinesis;
- Exposure to data science-related projects;
- Background in the payments industry;
- Unix/Linux shell scripting (most command-line environments acceptable, e.g., Bourne/Bash/Korn/Perl/PowerShell);
- Experience in Scala; Java development experience is a must;
- Ability to write Java; a problem solver with an understanding of memory overhead, computational complexity, data structures, and performance;
- Basic knowledge of Applied Statistics;
- Knowledge of Hive/HBase/Phoenix;
- Excellent communication skills, as this is a client- and business-facing role;
- Experience administering Unix systems to host enterprise web applications;
- Hands-on expertise with UNIX;
- Experience with HBase, Hive, Sqoop, MapR, Pig, Splunk, and Talend;
- Bachelor’s degree in related field or equivalent experience;
- Python.