Data Engineer

The Scion Group
Chicago, IL (Onsite)

About The Position

We are seeking a hands-on Data Engineer to design, build, and maintain reliable ELT pipelines and production-grade analytics models in Snowflake. This role requires strong SQL expertise, deep dbt experience, and a demonstrated ability to independently build and troubleshoot end-to-end data pipelines, from ingestion (APIs, change data capture) through transformation and BI consumption. The ideal candidate takes ownership of data systems, proactively identifies failure points, and delivers production-ready solutions without sustained oversight.

Requirements

  • 3–6 years of experience in data engineering or analytics engineering.
  • Advanced SQL proficiency (window functions, performance tuning, complex joins).
  • Hands-on experience building dbt models in a production environment.
  • Experience working with Snowflake or a similar cloud data warehouse.
  • Experience building and maintaining ELT pipelines.
  • Experience implementing Change Data Capture (CDC).
  • Experience integrating and ingesting data from external APIs.
  • Demonstrated ability to independently diagnose and resolve pipeline and data quality issues.
  • Experience deploying production-ready data assets with minimal oversight.
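As a concrete illustration of the "advanced SQL" requirement above, the sketch below runs a window-function query (ROW_NUMBER over a partition) through Python's built-in sqlite3 module. The table and column names are hypothetical, chosen only for the example.

```python
import sqlite3

# Hypothetical orders table; rank each customer's orders by amount using
# a window function (requires SQLite >= 3.25, bundled with modern Python).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INTEGER, order_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10, 50.0), (1, 11, 75.0), (2, 12, 20.0);
""")
rows = conn.execute("""
    SELECT customer_id, order_id,
           ROW_NUMBER() OVER (
               PARTITION BY customer_id ORDER BY amount DESC
           ) AS rank_in_customer
    FROM orders
    ORDER BY customer_id, rank_in_customer
""").fetchall()
print(rows)  # each customer's highest-amount order gets rank 1
```

The same PARTITION BY / ORDER BY pattern carries over directly to Snowflake SQL and dbt models.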

Nice To Haves

  • Experience with Git-based workflows and CI/CD (e.g., GitHub Actions, dbt Cloud).
  • Experience designing data monitoring frameworks.
  • Experience with Looker or similar BI tools.
  • Familiarity with orchestration tools.

Responsibilities

ELT Pipeline Development

  • Design and implement end-to-end ELT pipelines into Snowflake.
  • Build and maintain ingestion pipelines from third-party APIs, SaaS platforms, and internal application databases.
  • Implement and maintain Change Data Capture (CDC) processes.
  • Ensure incremental loading, idempotency, and data consistency.
  • Diagnose and resolve ingestion failures independently.

Data Modeling & Transformation (dbt)

  • Design layered dbt models (staging, intermediate, marts).
  • Write high-quality, performant SQL transformations.
  • Implement incremental models and testing strategies.
  • Optimize transformations for cost and performance in Snowflake.
  • Ensure transformations are production-ready and validated prior to deployment.

Data Quality & Monitoring

  • Implement data validation and anomaly detection checks.
  • Design monitoring around pipeline failure modes.
  • Perform structured root cause analysis for data discrepancies.
  • Ensure production datasets are complete, timely, and trustworthy.

Production Ownership

  • Drive work from design through deployment.
  • Validate outputs prior to release.
  • Monitor deployed pipelines and models.
  • Proactively address data gaps or inconsistencies.
  • Document assumptions, transformations, and dependencies clearly.

BI & Collaboration

  • Partner with analytics and business stakeholders to translate requirements into scalable data solutions.
  • Support Looker semantic modeling and dashboard reliability.
  • Communicate technical tradeoffs clearly and concisely.
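The incremental-loading and idempotency responsibilities above can be sketched as an upsert keyed on a primary key, so replaying the same batch leaves the target in the same final state. This is a minimal illustration using sqlite3; the table, columns, and data are hypothetical, and in Snowflake the equivalent would typically be a MERGE statement or a dbt incremental model.

```python
import sqlite3

# Hypothetical target dimension table with a primary key.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT)"
)

def load_batch(conn, batch):
    # ON CONFLICT makes the load idempotent: re-running the same batch
    # updates rows in place instead of inserting duplicates.
    conn.executemany(
        """
        INSERT INTO dim_customer (customer_id, name) VALUES (?, ?)
        ON CONFLICT (customer_id) DO UPDATE SET name = excluded.name
        """,
        batch,
    )

batch = [(1, "Acme"), (2, "Globex")]
load_batch(conn, batch)
load_batch(conn, batch)  # replay the batch: no duplicates, same final state
rows = conn.execute(
    "SELECT * FROM dim_customer ORDER BY customer_id"
).fetchall()
print(rows)
```

Keying writes on a stable unique identifier is what makes retries and backfills safe, which is the core of the idempotency requirement.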

Benefits

  • Health Insurance
  • Dental Insurance
  • Vision Insurance
  • 401k Matching
  • Paid Maternity Leave
  • Discretionary annual end-of-year bonus
  • Paid Time Off

What This Job Offers

Job Type: Full-time
Career Level: Mid Level
Education Level: No education requirement listed
Number of Employees: 11-50 employees
