Big Data Developer Job Description Template

Our company is looking for a Big Data Developer to join our team.

Responsibilities:

  • Work closely with the Architecture group to deliver technical solutions;
  • Design and implement product features in collaboration with business and IT stakeholders;
  • Define Standard Output constructs which allow business users to extract their own data consistently across business units where needed;
  • Coordinate and interact with the client team's technical and business groups;
  • Build scalable data pipelines on top of Hive and Spark using the Airflow scheduler/executor framework;
  • Demonstrate substantial depth of knowledge and experience in a specific area such as Java, data streaming, big data, Spark, Scala, or Kubernetes;
  • Create complex views and materialized views and tune them for performance;
  • Define data/information architecture standards, policies and procedures for the organization;
  • Define Standard Integration points for data being captured into the Hadoop Data Platforms;
  • Run data aggregations directly from Kafka or from HBase (a minimal sketch follows this list);
  • Provide consultation on complex projects as a top-level contributor;
  • Apply Machine Learning models to forecast and visualize key revenue metrics and trends;
  • Rely on extensive experience and judgment to plan and accomplish goals within timelines and budgets agreed upon with business stakeholders;
  • Work with data and analytics experts to strive for greater functionality in our data systems.
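
For illustration only, here is a minimal sketch of the kind of Kafka-driven aggregation described above, written in Scala with Spark Structured Streaming. The broker address, topic name ("orders"), and comma-separated payload format are placeholder assumptions, not details from this posting; a production pipeline would use the platform's actual schema and write to a durable sink such as HBase rather than the console.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object OrderAggregationSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("order-aggregation-sketch")
      .getOrCreate()
    import spark.implicits._

    // Read a stream of order events from a Kafka topic.
    // Broker address and topic name are placeholders, not values from the posting.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "orders")
      .load()

    // Kafka values arrive as bytes; cast to string and split an assumed
    // "status,amount" payload (a real feed would define its own schema).
    val orders = events
      .selectExpr("CAST(value AS STRING) AS payload")
      .select(
        split($"payload", ",").getItem(0).as("status"),
        split($"payload", ",").getItem(1).cast("double").as("amount")
      )

    // Aggregate order amounts per status; "complete" output mode emits
    // the full running totals on every trigger.
    val totals = orders
      .groupBy($"status")
      .agg(sum($"amount").as("total_amount"))

    totals.writeStream
      .outputMode("complete")
      .format("console")
      .start()
      .awaitTermination()
  }
}
```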

Requirements:

  • Bachelor’s Degree in Computer Science, Computer Engineering or a closely related field;
  • Scala;
  • Spark;
  • HBase;
  • Extensive experience architecting and engineering big data processing applications using Spark and Hive;
  • Ability to participate in requirements gathering and create design documents;
  • Certification in Hadoop preferred;
  • Practical knowledge of Scala or a similar language;
  • Degree in a systems-related discipline or equivalent experience;
  • Good working knowledge of data warehousing, database design, Microsoft Office products, and Microsoft Visio;
  • Familiarity with both waterfall and iterative development methodologies;
  • Proficient in Kafka streaming, PostgreSQL, Spark, Scala, and Java;
  • 4+ years of IT experience;
  • Strong analytical and diagnostic skills;
  • Participate in day-to-day activities involving design, development, test support, deployment, and production monitoring for the Order Management platform.