ETL Developer (Data Engineer Focus)

Access Development | West Valley City, UT
Hybrid

About The Position

We are seeking an experienced ETL Developer to join our data engineering team. You will design, build, and maintain robust data pipelines that extract, transform, and load data from various sources into our data warehouse and analytical systems. This role is critical in ensuring high-quality, reliable data flows to support business intelligence, analytics, and decision-making. The ideal candidate is a hands-on developer with deep expertise in Python and SQL, combined with practical experience in building scalable data pipelines and managing databases. A successful candidate will be data-driven, proactive, and enjoy producing creative solutions to complex data problems.

This is a full-time, office-based position currently operating under a hybrid model, with a minimum of Tuesdays and Thursdays in our Salt Lake City office. Applicants must have current authorization to work in the United States. Starting pay is dependent on experience, plus full employee benefits, including nationwide employee discounts.

Requirements

  • Bachelor's degree in a quantitative science; a master's degree is a plus.
  • Expert-level proficiency in Python for data processing, scripting, and automation (e.g., Pandas, NumPy, custom ETL scripts); a minimal sketch of this kind of work appears after this list.
  • Expert-level SQL skills, including advanced query writing, optimization, and experience with relational databases.
  • Proven experience building and maintaining data pipelines; strong preference for hands-on experience with Apache Airflow (especially in managed environments like Astronomer); see the DAG sketch after the Nice To Haves list.
  • Hands-on database administration experience with PostgreSQL and Amazon Redshift (or similar data warehousing solutions).
  • Strong understanding of data modeling, warehousing concepts, and best practices for data quality and governance.
  • Excellent problem-solving skills and the ability to work independently or as part of a team.
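
To give a flavor of the hands-on Python and SQL work described above, here is a minimal extract-transform-load sketch using Pandas and SQLAlchemy. The file path, table name, and connection string are hypothetical placeholders for illustration, not details of our actual stack.

    import pandas as pd
    from sqlalchemy import create_engine

    # Hypothetical warehouse connection; credentials are placeholders.
    engine = create_engine(
        "postgresql+psycopg2://etl_user:secret@warehouse-host:5432/analytics"
    )

    # Extract: read a raw export from an upstream system.
    raw = pd.read_csv("orders_export.csv", parse_dates=["order_date"])

    # Transform: drop incomplete rows, then aggregate to daily totals.
    clean = raw.dropna(subset=["order_id", "customer_id"])
    daily = (
        clean.groupby(clean["order_date"].dt.date)["amount"]
        .sum()
        .reset_index(name="total_amount")
        .rename(columns={"order_date": "order_day"})
    )

    # Load: append the result to a staging table in the warehouse.
    daily.to_sql("stg_daily_orders", engine, if_exists="append", index=False)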

Nice To Haves

  • Experience with cloud platforms (e.g., AWS) and related services (e.g., S3).
  • Knowledge of version control (Git) and CI/CD practices for data pipelines.
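
For candidates less familiar with Airflow, the orchestration work mentioned in the requirements typically looks something like the sketch below (Airflow 2.x syntax; the DAG id, schedule, and task bodies are illustrative assumptions, not our production pipeline).

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Placeholder task bodies; real tasks would call ETL code like the
    # Pandas sketch above.
    def extract():
        print("pull raw data from source systems")

    def transform():
        print("clean and reshape the extracted data")

    def load():
        print("write the transformed data to the warehouse")

    with DAG(
        dag_id="daily_orders_etl",  # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task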

Responsibilities

  • Design, develop, and optimize ETL/ELT processes and data pipelines to ingest, transform, and load data efficiently.
  • Write complex SQL queries for data extraction, transformation, and validation.
  • Implement and maintain data pipelines using scripting and orchestration tools.
  • Perform database administration tasks, including performance tuning, schema design, backup/recovery, and security management.
  • Collaborate with data analysts, engineers, and stakeholders to understand data requirements and ensure data quality and integrity.
  • Monitor, troubleshoot, and optimize existing pipelines for performance and reliability.
  • Document data processes, pipelines, and architectures.
  • Optimize the Redshift data warehouse schema (see the schema sketch after this list).
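
By way of illustration, "optimizing the schema" in Redshift often means choosing distribution and sort keys. The sketch below shows the general idea using psycopg2 (Redshift speaks the PostgreSQL wire protocol); the cluster endpoint, credentials, and table design are hypothetical.

    import psycopg2

    # Distribution and sort keys tuned for joins on customer_id and
    # date-range scans; the table and key choices are illustrative assumptions.
    DDL = """
    CREATE TABLE analytics.fact_orders (
        order_id    BIGINT,
        customer_id BIGINT,
        order_day   DATE,
        amount      DECIMAL(12, 2)
    )
    DISTKEY (customer_id)
    SORTKEY (order_day);
    """

    conn = psycopg2.connect(
        host="example-cluster.abc123.us-west-2.redshift.amazonaws.com",  # placeholder
        port=5439,
        dbname="analytics",
        user="etl_user",
        password="...",  # placeholder
    )
    # psycopg2's connection context manager commits the transaction on exit.
    with conn, conn.cursor() as cur:
        cur.execute(DDL)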

Benefits

  • Competitive salary and benefits package.
  • Opportunity to work on impactful data projects in a collaborative environment.
  • Professional growth and learning opportunities in a modern data stack.
  • Medical insurance.
  • Prescription drug coverage.
  • A lifestyle discount program for personal/family use.
  • A 401(k) and profit-sharing plan.
  • Paid holidays and personal time.