Data Engineer

Sennos
Durham, NC (Hybrid)

About The Position

The Data Engineer sits within the Data & Analytics organization and supports the development and ongoing improvement of Sennos' modern data platform. This role focuses on building and maintaining data pipelines, implementing transformations, and contributing to a reliable Snowflake-based warehouse that powers analytics, reporting, machine learning, and product capabilities. Working closely with senior data engineering leadership, data architecture, analytics engineering, and product teams, this role combines hands-on technical execution with growing exposure to data modeling, quality enforcement, and scalable platform development.

Requirements

  • Bachelor's degree in Computer Science, Data Science, Engineering, or related field (or equivalent years of professional experience)
  • 2–4 years of experience in data engineering or a related data-focused role
  • Experience working with ETL/ELT processes and structured warehouse data
  • Exposure to cloud-based data platforms (AWS preferred)
  • Strong SQL skills (joins, window functions, and query optimization fundamentals)
  • Proficiency in Python for data processing, scripting, or automation
  • Familiarity with version control systems (e.g., Git)
  • Strong attention to detail and commitment to data accuracy
  • Ability to troubleshoot and debug data workflows effectively
  • Strong written and verbal communication skills
  • Ability to collaborate across technical and non-technical teams

Nice To Haves

  • Experience working with Snowflake or similar cloud data warehouses
  • Exposure to dbt or similar transformation frameworks
  • Introductory experience with dimensional modeling concepts
  • Experience implementing data quality tests or validation frameworks
  • Exposure to data contracts or schema management practices
  • Familiarity with reverse ETL concepts
  • Introductory experience with workflow orchestration tools (e.g., Airflow, Dagster)
  • Familiarity with CI/CD practices for data workflows
  • Experience using AI-assisted tools to support debugging, pipeline development, or data engineering workflows
  • Exposure to BI tools (e.g., Power BI, Tableau, Looker)

Responsibilities

  • Build and maintain ETL/ELT pipelines using SQL and Python under the guidance of senior data engineering leadership
  • Develop and maintain transformations using dbt or similar tools within a Snowflake-based warehouse
  • Create and optimize datasets and views to support analytics, reporting, machine learning, and product feature development
  • Manage ad hoc data requests with accuracy and efficiency while maintaining data integrity and consistency
  • Implement and maintain data quality checks, validation rules, and testing processes to ensure reliability and trust in warehouse data
  • Support the enforcement of data contracts between source systems and the warehouse
  • Assist in reverse ETL workflows to operationalize warehouse data into downstream systems
  • Contribute to ML data preparation and feature pipeline workflows
  • Collaborate closely with Data Architecture, Analytics Engineering, Product, and Software Engineering teams
  • Contribute to documentation, governance practices, and continuous improvement of data engineering standards