Data Engineer III Job Description Template

Our company is looking for a Data Engineer III to join our team.

Responsibilities:

  • Create unified enterprise data models for analytics and reporting;
  • As part of an Agile development team, contribute to architecture, tooling, and development process improvements;
  • Promote data modeling standardization; define and drive adoption of those standards;
  • Coordinate data models, data dictionaries and other database documentation across multiple applications;
  • Design, implement, and support a platform providing access to large datasets;
  • Design and build robust, scalable data integration (ETL) pipelines using SQL, Python, and Spark (see the sketch after this list);
  • Lead design reviews of data deliverables such as models, data flows, and data quality assessments;
  • Support a Secure Data Extract (SDE) system for strategic accounts, ensuring jobs run efficiently and reliably and deliver data to clients;
  • Address data pull requests from various data consumers and/or oversee other data engineers fulfilling them.
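
As an illustration of the ETL work referenced above, here is a minimal PySpark sketch; the file paths, column names, and aggregation (orders.csv, order_ts, customer_id, amount) are hypothetical placeholders, not this team's actual pipeline.

    # Minimal extract-transform-load sketch with PySpark.
    # All paths and column names below are illustrative assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    # Extract: read a raw CSV export.
    orders = spark.read.csv("data/raw/orders.csv", header=True, inferSchema=True)

    # Transform: drop invalid rows and aggregate revenue per customer per day.
    daily_revenue = (
        orders
        .filter(F.col("amount") > 0)
        .withColumn("order_date", F.to_date("order_ts"))
        .groupBy("order_date", "customer_id")
        .agg(F.sum("amount").alias("revenue"))
    )

    # Load: write partitioned Parquet for downstream analytics and reporting.
    daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet("data/curated/daily_revenue")

    spark.stop()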

Requirements:

  • 2+ years of experience with Azure and AWS services, including Azure SQL, S3, Redshift, EMR, and RDS;
  • 5-7 years of experience with SQL Server, SSRS, SSIS, and T-SQL;
  • 5-7 years of experience supporting a large data platform and building data pipelines;
  • 2+ years of experience with big data technologies (Snowflake, data lakes, data warehouses);
  • 7+ years of relevant experience in one of the following areas: data engineering, business intelligence, or business analytics;
  • 2+ years of experience with schema design and dimensional data modeling (see the star-schema sketch after this list);
  • Ability to collaborate effectively and to communicate complex technical concepts to a broad range of audiences;
  • 3+ years of experience writing SQL and building custom ETL processes;
  • Experience analyzing data to identify deliverables, gaps, and inconsistencies;
  • Knowledge of Python and Java;
  • Experience with version control tools such as Mercurial or Git (e.g., GitHub);
  • 2+ years of dashboard development in Tableau;
  • 2+ years working with either a MapReduce or an MPP system;
  • 3+ years of experience in the data warehouse space.
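
For the schema design and dimensional modeling requirement above, the following is a minimal star-schema sketch; it uses Python's built-in sqlite3 purely so the example is self-contained, and every table and column name is an illustrative assumption rather than the employer's actual model.

    # Star-schema sketch: two dimension tables plus one fact table.
    # sqlite3 is used only for self-containment; all names are illustrative.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Dimension tables carry descriptive attributes under surrogate keys.
    cur.execute("""
        CREATE TABLE dim_customer (
            customer_key INTEGER PRIMARY KEY,
            customer_id  TEXT NOT NULL,
            segment      TEXT
        )
    """)
    cur.execute("""
        CREATE TABLE dim_date (
            date_key  INTEGER PRIMARY KEY,
            full_date TEXT NOT NULL,
            year      INTEGER,
            month     INTEGER
        )
    """)

    # The fact table stores measures plus foreign keys to the dimensions.
    cur.execute("""
        CREATE TABLE fact_sales (
            customer_key INTEGER REFERENCES dim_customer(customer_key),
            date_key     INTEGER REFERENCES dim_date(date_key),
            amount       REAL NOT NULL
        )
    """)

    conn.commit()
    conn.close()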