Sr. Data Engineer

Seattle Children's, Seattle, WA

About The Position

The Senior Data Engineer is responsible for building data processing pipelines that collect, connect, centralize, and curate data from various internal and external sources, using a variety of languages and tools, for the Enterprise Data Warehouse. This involves leveraging modern cloud platforms and services to develop highly scalable, secure, and reliable data engineering solutions for efficiently moving and transforming data across systems. The role also includes designing, implementing, testing, and deploying cloud-based data processing infrastructure; working within an Agile team; and breaking down, estimating, and providing just-in-time design for small increments of work. This position is central to the Seattle Children's Enterprise Analytics team's mission to transform healthcare for children through improved patient safety, predictive analytics in support of disease research, and lower treatment costs.

Requirements

  • Bachelor's Degree in computer science or related field, or equivalent combination of education and experience/technical training that demonstrates analytical and technical competency.
  • Minimum of five (5) years of technology industry or related experience, including building highly scalable, scale-out architectures on large-scale database platforms.
  • Experience working in a complex data infrastructure environment.
  • Five (5) years of experience in a data engineering role.
  • Extensive, in-depth data pipeline development experience with industry-standard data integration tools.
  • Experience building scalable data pipelines using Spark or Spark-SQL with Airflow scheduler/executor framework or similar scheduling tools.
  • Experience with cloud platforms; GCP preferred.
  • Experience with Google BigQuery or equivalent.
  • Advanced competency in SQL, with the ability to perform query optimization on large-scale database platforms.
  • Experience with the SDLC process: requirements gathering, analysis, architecture, design, implementation, testing, deployment, and technical support.
  • Experience with any industry-standard tools for source control and project management.
  • Experience writing test cases and test scripts for data quality assurance.
  • Experience creating stored procedures and functions.
  • Experience developing dimensional data models with any industry-standard tool.

Nice To Haves

  • Experience in healthcare or a related industry.
  • Experience with GCP cloud services and data warehouse stores such as BigQuery.
  • Experience productizing/automating predictive models that use R, SAS, Python, SPSS, etc.
  • Experience with version control and CI/CD (continuous integration and continuous delivery) tools.
  • Familiarity with the Agile framework and test-driven development methodology for analytic solutions.
  • API development.
  • Data visualization and/or dashboard development.

Responsibilities

  • Building data processing pipelines that collect, connect, centralize, and curate data from various internal and external sources, using a variety of languages and tools to integrate systems for the Enterprise Data Warehouse.
  • Leveraging modern cloud platforms and services to develop highly scalable, secure, and reliable data engineering solutions for efficiently moving and transforming data across systems.
  • Designing, implementing, testing, and deploying cloud-based data processing infrastructure.
  • Working in an Agile team setting.
  • Breaking down, estimating, and providing just-in-time design for small increments of work.

Benefits

  • Medical plans
  • Dental plans
  • Vision plans
  • 403(b)
  • Life insurance
  • Paid time off
  • Tuition reimbursement