ETL Developer (Data Engineer Focus)

Access Development · Salt Lake City, UT
Hybrid

About The Position

At Access, we believe great things happen when people come together. Our discount and rewards solutions are designed to connect three very important groups: organizations, their members, and the merchant community. The result? Everyone wins. Merchants tap into the affinity members share with their organizations. Organizations create loyalty and generate revenue. And members are happy because they save money. Access employees enjoy a flexible, friendly, people-oriented work environment with all-employee parties, activities that include family and friends, employee recognition, a fantastic nationwide employee discount program, and a strong focus on career development.

Job Description

We are seeking an experienced ETL Developer to join our data engineering team. You will design, build, and maintain robust data pipelines that extract, transform, and load data from various sources into our data warehouse and analytical systems. This role is critical in ensuring that high-quality, reliable data flows to support business intelligence, analytics, and decision-making. The ideal candidate is a hands-on developer with deep expertise in Python and SQL, combined with practical experience building scalable data pipelines and managing databases. A successful candidate will be data-driven, proactive, and enjoy producing creative solutions to complex data problems. This is a full-time position, currently operating under a hybrid model with a minimum of Tuesdays and Thursdays in our Salt Lake City office. Applicants must have current authorization to work in the United States. Starting pay is dependent on experience, plus full employee benefits, including employee discounts nationwide.

Requirements

  • Bachelor's degree in a quantitative science; a master's degree is a plus.
  • Expert-level proficiency in Python for data processing, scripting, and automation (e.g., Pandas, NumPy, custom ETL scripts; see the illustrative sketch after this list).
  • Expert-level SQL skills, including advanced query writing, optimization, and experience with relational databases.
  • Proven experience building and maintaining data pipelines; strong preference for hands-on experience with Apache Airflow (especially in managed environments such as Astronomer).
  • Hands-on database administration experience with PostgreSQL and Amazon Redshift (or similar data warehousing solutions).
  • Strong understanding of data modeling, warehousing concepts, and best practices for data quality and governance.
  • Excellent problem-solving skills and ability to work independently or in a team.
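
For illustration only (not part of the job description): a minimal sketch of the kind of custom Pandas ETL script this role involves. The source file, connection string, and table/column names here are hypothetical placeholders, not Access Development's actual data.

    import pandas as pd
    from sqlalchemy import create_engine

    # Extract: read a raw CSV export (hypothetical source file).
    orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

    # Transform: de-duplicate, normalize column names, derive a revenue column.
    orders = orders.drop_duplicates(subset=["order_id"])
    orders.columns = [c.strip().lower() for c in orders.columns]
    orders["revenue"] = orders["quantity"] * orders["unit_price"]

    # Validate: fail fast if a required field is missing.
    assert orders["order_id"].notna().all(), "order_id must not be null"

    # Load: append into a PostgreSQL staging table (placeholder connection string).
    engine = create_engine("postgresql+psycopg2://user:pass@localhost:5432/warehouse")
    orders.to_sql("stg_orders", engine, if_exists="append", index=False)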

Nice To Haves

  • Experience with cloud platforms (e.g., AWS) and related services (e.g., S3).
  • Knowledge of version control (Git) and CI/CD practices for data pipelines.

Responsibilities

  • Design, develop, and optimize ETL/ELT processes and data pipelines to ingest, transform, and load data efficiently.
  • Write complex SQL queries for data extraction, transformation, and validation.
  • Implement and maintain data pipelines using scripting and orchestration tools (see the orchestration sketch after this list).
  • Perform database administration tasks, including performance tuning, schema design, backup/recovery, and security management.
  • Collaborate with data analysts, engineers, and stakeholders to understand data requirements and ensure data quality and integrity.
  • Monitor, troubleshoot, and optimize existing pipelines for performance and reliability.
  • Document data processes, pipelines, and architectures.
  • Optimize the Redshift data warehouse schema.
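
Again for illustration only: a minimal Apache Airflow DAG sketching the orchestration pattern behind the pipeline work described above, using Airflow 2.x's TaskFlow API. The task bodies, DAG name, and schedule are placeholders, not the company's actual pipeline.

    from datetime import datetime
    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def nightly_etl():
        @task
        def extract() -> list[dict]:
            # Placeholder: pull rows from a source system.
            return [{"order_id": 1, "quantity": 2, "unit_price": 9.99}]

        @task
        def transform(rows: list[dict]) -> list[dict]:
            # Placeholder: derive a revenue field on each row.
            return [{**r, "revenue": r["quantity"] * r["unit_price"]} for r in rows]

        @task
        def load(rows: list[dict]) -> None:
            # Placeholder: write the transformed rows to the warehouse.
            print(f"loading {len(rows)} rows")

        load(transform(extract()))

    nightly_etl()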

Benefits

  • Competitive salary and benefits package.
  • Opportunity to work on impactful data projects in a collaborative environment.
  • Professional growth and learning opportunities in a modern data stack.
  • Employee discounts nationwide.