Data Engineer III

GHX
Louisville, CO
$98,000 - $130,500

About The Position

The Data Engineer is responsible for developing and executing data solutions that support product and technology initiatives, including general application development activities such as unit testing, code review, code deployment, and technical documentation. This role also collaborates with Product and Engineering teams to design solutions and enable new data capabilities. Key responsibilities are detailed in the Responsibilities section.

Requirements

  • Thorough understanding of, and support for, Agile development methodologies
  • Ability to design, collect, and analyze large datasets
  • Ability to communicate technical concepts and designs to cross-functional and offshore teams who have varying levels of technical experience
  • Proven data engineering, problem-solving, and analysis skills
  • Strong demonstrable SQL and Python skills
  • Ability to adapt to changing conditions and lead others through change
  • Demonstrated organizational, prioritization, and time management skills
  • Attention to detail
  • Ability and willingness to travel nationally to remote offices and partners approximately 10% of the time
  • Bachelor’s degree in Computer Science, Mathematics, or a related field
  • 5+ years of data engineering experience building business intelligence applications with exceptional SQL, PL/SQL, and/or Python skills
  • 5+ years of experience in ETL development in a big data environment
  • 5+ years working in an agile development environment
  • Technical writing experience in relevant areas, including queries, reports, and presentations

Nice To Haves

  • Experience in a diverse set of Amazon Web Services services such as SNS/SQS, S3, Glue, Lambda, API Gateway
  • Strong development experience in Python, PySpark, and SQL
  • Experience developing in Snowflake
  • Knowledge of data governance, API security, and best practices for cloud-based systems
  • Application, system, or data architecture experience
  • Machine learning experience

Responsibilities

  • Lead and contribute to the backend and ETL development effort of our data platform using Python and SQL
  • Integrate and optimize data flows between AWS and Snowflake for application, analytics, and reporting use cases
  • Implement and manage data quality, security, and monitoring frameworks
  • Develop and maintain infrastructure-as-code using tools such as CloudFormation or CDK
  • Contribute to DevOps practices for CI/CD pipeline automation, version control, and deployment
  • Provide architectural guidance and development/build standards for the team
  • Troubleshoot and resolve issues in APIs, data pipelines, and infrastructure
  • Analyze business requirements and work with teammates to formulate supporting design and design documentation

Benefits

  • health, vision, and dental insurance
  • accident and life insurance
  • 401k matching
  • paid time off
  • education reimbursement