Data Engineer III

Chewy
Bellevue, WA
Hybrid

About The Position

The Data Engineer III is responsible for contributing to and owning the data platform on the AWS cloud. This involves contributing to the architecture and building and maintaining infrastructure using cloud computing technology, as well as building and maintaining scheduling/workflow orchestration applications, containerized applications with microservices architecture, and continuous integration and continuous delivery pipelines. Key responsibilities include performance tuning of data pipelines, monitoring them for accuracy and completeness, reconciling data issues, and developing complex data ingestion and transformations from multiple sources. The role also involves designing and implementing dimensional data models, implementing the data platform strategy to support data-driven decision making, and leading the evaluation of new tools and technologies. Collaborating with cross-functional stakeholders to define requirements for data products and creating operational reports using visualization/business intelligence tools are also key aspects of this position.

Requirements

  • Bachelor's degree in Electrical Engineering, Computer Science, Computer Engineering, or a related field and 5 years of experience as a Database Architect or in a related position/occupation; or
  • Master's degree in Electrical Engineering, Computer Science, Computer Engineering, or a related field and 3 years of experience as a Database Architect or in a related position/occupation.
  • 3 years of experience with cloud data platforms such as AWS Glue, EMR, S3, SQS, SNS, and Step Functions.
  • Knowledge of Snowflake, Redshift, DynamoDB, PostgreSQL, and other data warehouses/databases.
  • Experience with Airflow for orchestration and automation of data workflows.
  • Understanding of data quality, governance, and observability principles.
  • Experience with CI/CD pipelines and Terraform for cloud infrastructure management.
  • Knowledge of data platforms such as the AWS data platform, Databricks, and Cloudera.
  • Experience designing and developing data pipelines using Python, Spark, Spark Streaming, and Kafka to build large-scale data processing frameworks.

Responsibilities

  • Own and contribute to the data platform on the AWS cloud.
  • Contribute to the architecture and build/maintain infrastructure using cloud computing technology.
  • Build and maintain scheduling/workflow orchestration applications.
  • Build containerized applications with microservices architecture.
  • Establish continuous integration and continuous delivery pipelines.
  • Performance tuning of data pipelines.
  • Monitor data pipelines for accuracy, missing data, enhancements, changes, and billing volumes to ensure all data is captured and processed accurately and on time.
  • Reconcile data issues and alerts between various systems, finding opportunities to innovate and drive improvements.
  • Develop and maintain complex data ingestion and transformations for data originating from multiple data sources (structured/unstructured).
  • Design and implement dimensional data models (Star, Snowflake, and Galaxy schemas) for on-premises and cloud data warehouse infrastructure.
  • Implement the strategy, design, execution, system configuration, and operations of the data platform to support data-driven decision making.
  • Lead the evaluation, implementation, and deployment of emerging tools and technologies.
  • Work with cross-functional stakeholders in defining and documenting requirements for building high-quality and impactful data products.
  • Create operational reports using visualization/business intelligence tools.

Benefits

  • 401k
  • new hire and annual equity grant
  • annual bonus (for C08+ positions)
  • medical/Rx insurance
  • vision insurance
  • dental insurance
  • life insurance
  • disability insurance
  • hospital indemnity insurance
  • critical illness insurance
  • accident insurance
  • parental leave
  • family services benefits
  • backup dependent care
  • flexible spending accounts
  • telemedicine
  • pet adoption reimbursement
  • employee assistance program
  • 10% off pet insurance
  • 20% off at Chewy.com
  • unlimited PTO (for exempt salary team members, subject to manager approval)
  • six paid holidays per year
  • paid sick and family leave (in compliance with applicable state and local regulations)