Data Engineer & ETL Developer

DAT | Denver, CO
$105,000 - $131,000 | Hybrid

About The Position

DAT is looking for a Data Engineer/ETL Developer to join our Data Engineering team working hybrid in Denver, Colorado.

Requirements

  • Experience: 3-5 years of professional experience in a Data Engineering, Analytics Engineering, or similar technical role (including relevant internship experience).
  • Education: Bachelor's Degree in Computer Science, Information Technology, Engineering, or a related quantitative field.
  • Technical Proficiency (required): Strong proficiency in Python and advanced knowledge of SQL.
  • Hands-on experience with a cloud data warehouse, preferably Snowflake.
  • Familiarity with data transformation concepts and tools, specifically dbt.
  • Basic experience creating or running jobs/workflows using an orchestration tool like Apache Airflow.
  • Soft Skills: Strong problem-solving abilities, excellent attention to detail, and a proactive, collaborative approach to teamwork.

Responsibilities

  • Data Pipeline Development & Engineering
  • ELT Process Implementation: Assist senior engineers in designing, building, testing, and maintaining cloud-native ELT (Extract, Load, Transform) data pipelines, ensuring data is reliably loaded into Snowflake.
  • Transformation with dbt: Develop and maintain data models using dbt (data build tool) for data cleaning, aggregation, and transformation within the Snowflake data warehouse.
  • SQL Development: Write, optimize, and review complex SQL queries for data manipulation, transformation, and performance tuning on Snowflake.
  • Python Scripting: Utilize Python to build custom data extraction scripts, implement monitoring tools, and contribute to general automation efforts (see the extraction-and-load sketch after this list).
  • Workflow Orchestration and Automation
  • Airflow DAGs: Learn to author, schedule, and monitor data workflows defined as Directed Acyclic Graphs (DAGs) in Apache Airflow.
  • Pipeline Scheduling: Integrate and orchestrate dbt runs and other pipeline tasks within Airflow to manage dependencies and execution timing (see the DAG sketch after this list).
  • Automation: Focus on automating repetitive tasks across the data lifecycle, reducing manual effort and improving pipeline efficiency.
  • Data Quality, Testing, and Monitoring
  • Data Quality: Implement data validation and testing frameworks using features of dbt (e.g., uniqueness, non-null checks) to ensure high data quality and accuracy within Snowflake data marts.
  • Troubleshooting: Monitor data pipeline health, troubleshoot failed Airflow tasks and dbt runs, and quickly resolve data flow issues.
  • Documentation: Maintain clear and current technical documentation for data models, Airflow DAGs, and pipeline logic.
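
As a rough illustration of the extraction-and-load work described above (not DAT's actual code; the endpoint, table, warehouse, and credential names are assumptions), a small Python script of this kind might look like:

import json
import os

import requests
import snowflake.connector

API_URL = "https://example.com/api/loads"  # hypothetical source endpoint


def extract() -> list[dict]:
    """Fetch raw records from the source API."""
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    return response.json()


def load(rows: list[dict]) -> None:
    """Insert raw records into an assumed Snowflake staging table."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",   # assumed warehouse
        database="RAW",        # assumed database
        schema="STAGING",      # assumed schema
    )
    try:
        cur = conn.cursor()
        cur.executemany(
            "INSERT INTO stg_loads_raw (id, payload) VALUES (%s, %s)",
            [(row["id"], json.dumps(row)) for row in rows],
        )
    finally:
        conn.close()


if __name__ == "__main__":
    load(extract())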
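
Similarly, a minimal sketch of an Airflow DAG that orchestrates dbt runs and tests (the DAG name, schedule, and project path are illustrative assumptions, not DAT's setup):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_refresh",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Build the dbt models in the Snowflake warehouse.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",   # assumed path
    )

    # Run dbt's schema tests (uniqueness, not-null, etc.) after the build.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",  # assumed path
    )

    dbt_run >> dbt_test  # tests only run after a successful build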

Benefits

  • Medical, Dental, Vision, Life, and AD&D insurance
  • Parental Leave
  • Flexible Vacation Time (FVT)
  • 10 additional paid holidays per calendar year
  • 401k matching (immediately vested)
  • Employee Stock Purchase Plan
  • Short- and long-term disability and sick leave
  • Flexible Spending Accounts
  • Health Savings Accounts
  • Employee Assistance Program
  • Additional programs - Employee Referral, Internal Recognition, and Wellness
  • Free TriMet transit pass (Beaverton Office)
  • Competitive salary and benefits package
  • Work on impactful projects in a cutting-edge environment
  • Collaborative and supportive team culture
  • Opportunity to make a real difference in the trucking industry
  • Employee Resource Groups