About The Position

We’re looking for a Senior Data Engineer to serve as the lead developer on our internal data platform team. You’ll play a central, hands‑on role in building and maintaining a scalable data foundation built on Snowflake, dbt, and a modern medallion-style architecture. This position is ideal for an engineer who brings strong data modeling fundamentals, practical experience with modern transformation tooling, and a desire to build high‑quality, reusable data assets for analytics, operations, and administration, following DataOps practices.

Requirements

Core Technical Expertise

  • Snowflake
      • Practical hands‑on experience designing schemas, stages, tables, streams, and tasks (see the stream-and-task sketch after this list).
      • Strong understanding of warehouse performance and policies, data sharing, and cost-control practices.
      • Able to perform and automate administrative tasks.
      • Integration expertise with AWS, Fivetran, and BI tools.
  • dbt (Core or Cloud)
      • Expertise in developing models, macros, tests, selectors, and documentation (see the dbt model sketch after this list).
      • Familiarity with structuring dbt projects for maintainability across multiple layers.
  • Data Modeling
      • Strong command of star schema and dimensional modeling techniques.
      • Experience implementing SCDs (Types 1 & 2).
      • Understanding of data mart and data lake architecture.
  • Medallion Architecture
      • Hands-on experience building bronze/silver/gold layers with clear logical boundaries.
  • Data Engineering Skills
      • Excellent SQL skills with attention to performance and security.
      • Experience with orchestration tools.
      • Understanding of data governance, lineage, documentation, and discovery.
      • Understanding of multi-tenant data architecture.
      • Experience contributing to CI/CD workflows for data transformation pipelines.
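
To make the stream-and-task requirement concrete, here is a minimal Snowflake sketch of that pattern: a stream tracks row changes on a bronze table, and a scheduled task drains it only when changes exist. Every object name here (analytics, raw_orders, transform_wh, order_changes) is hypothetical, not something specific to this role.

```sql
-- Capture changes on a bronze table with a stream.
create or replace stream analytics.bronze.raw_orders_stream
  on table analytics.bronze.raw_orders;

-- A scheduled task that runs only when the stream has pending changes.
create or replace task analytics.bronze.merge_orders_task
  warehouse = transform_wh
  schedule = '5 minute'
when
  system$stream_has_data('ANALYTICS.BRONZE.RAW_ORDERS_STREAM')
as
  insert into analytics.silver.order_changes (order_id, customer_id, amount_usd, change_type)
  select order_id, customer_id, amount_usd, metadata$action
  from analytics.bronze.raw_orders_stream;

-- Tasks are created suspended; resume to start the schedule.
alter task analytics.bronze.merge_orders_task resume;
```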
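Similarly, the dbt work might look like the following minimal sketch of a silver-layer model, assuming a bronze source has already been declared in the project's sources file; the model, source, and column names are illustrative only.

```sql
-- models/silver/stg_orders.sql
-- Silver-layer model: light cleaning and typing over a bronze source.

{{ config(materialized='view') }}

select
    order_id,
    customer_id,
    cast(order_ts as timestamp_ntz) as ordered_at,
    nullif(trim(status), '')        as order_status,
    amount_usd
from {{ source('bronze', 'raw_orders') }}
where order_id is not null
```

A gold-layer mart would then select from {{ ref('stg_orders') }}, keeping each medallion layer's boundary explicit in dbt's dependency graph.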

Nice To Haves

  • Role-based access control (RBAC) and discretionary access control (DAC) implementation strategies for data warehouses and data lakes (see the Snowflake grants sketch after this list).
  • Familiarity with Data Vault 2.0 (hubs, links, satellites).
  • Experience working in an AWS environment (S3, IAM, Glue, Lambda, Step Functions, event-driven architectures).
  • Exposure to streaming/event ingestion frameworks (Kafka, Kinesis, etc.).
  • Strong CI/CD skill set with IaC tools such as Terraform and GitLab pipelines.
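
As a sketch of the RBAC strategies mentioned above, a read-only Snowflake role scoped to a gold-layer schema might be granted like this; the database, schema, and role names are hypothetical.

```sql
-- Create a read-only analyst role scoped to the gold layer.
create role if not exists analyst_ro;

grant usage on database analytics to role analyst_ro;
grant usage on schema analytics.gold to role analyst_ro;
grant select on all tables in schema analytics.gold to role analyst_ro;
-- Future grants keep newly created gold tables readable without re-granting.
grant select on future tables in schema analytics.gold to role analyst_ro;

-- Roll the new role up into the hierarchy so admins inherit its grants.
grant role analyst_ro to role sysadmin;
```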

Soft Skills & Mindset

  • Strong collaboration skills, especially across engineering, analytics, product, and ML/AI teams.
  • High attention to data quality, testing, and documentation.
  • Pragmatic and outcome-oriented, balancing robust architecture with timely delivery.
  • Curious and proactive about adopting modern data engineering patterns and tools.

Responsibilities

  • Build and maintain dbt models across bronze/silver/gold layers following medallion architecture best practices.
  • Design and maintain star and galaxy schemas, fact tables, and dimension models, including Slowly Changing Dimensions (SCDs); see the snapshot sketch after this list.
  • Develop, optimize, and operationalize Snowflake pipelines with a focus on performance, cost efficiency, and reliability.
  • Implement data quality and validation checks using dbt tests, schema tests, and custom macros.
  • Collaborate with analysts, engineers, and product teams to ensure models support analytics and operational requirements.
  • Build or refine ingestion, transformation, and semantic layers used by downstream applications.
  • Implement and maintain CI/CD workflows for data transformations, testing, and deployments, following DataOps practices.
  • Document models, lineage, and business logic for clarity and long-term maintainability.
  • Evaluate and introduce modern data engineering tools and practices when appropriate.
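
One common way to implement Type 2 SCDs in this stack, offered as a sketch rather than a prescription of the team's approach, is a dbt snapshot; the source and column names below are hypothetical.

```sql
-- snapshots/customers_snapshot.sql
-- Type 2 history tracking via a dbt snapshot: dbt adds dbt_valid_from /
-- dbt_valid_to columns and closes out rows when a tracked record changes.
{% snapshot customers_snapshot %}

{{
    config(
      target_schema='snapshots',
      unique_key='customer_id',
      strategy='timestamp',
      updated_at='updated_at'
    )
}}

select * from {{ source('bronze', 'raw_customers') }}

{% endsnapshot %}
```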