Data Engineer (AWS & Snowflake)

Capgemini · Atlanta, GA
$103,330 - $128,656 · Onsite

About The Position

The Data Engineer is responsible for designing, developing, and maintaining scalable data pipelines and ETL processes using AWS Glue. The role involves working closely with data engineers, analysts, and other IT professionals to ensure data is efficiently integrated, transformed, and made available for business use.

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • Proven experience in data engineering roles with a focus on Snowflake, AWS services, Python, and DBT.
  • Strong analytical and problem-solving skills.
  • Excellent communication and teamwork abilities.
  • 3+ years of experience working with AWS Glue and other AWS data services.
  • Proven experience in designing and maintaining ETL processes.
  • Strong knowledge of SQL and database management.
  • Familiarity with data warehousing concepts and tools.
  • Experience with Python or other scripting languages.

Nice To Haves

  • Snowflake Snowpark: Deep understanding of the Snowflake data warehousing platform and proficiency in using Snowpark for data processing and analytics.
  • DBT: Experience with DBT (Data Build Tool) for modeling data and building data transformation pipelines.
  • Apache Airflow: Hands-on experience orchestrating complex data workflows and pipelines with Airflow.
  • AWS Lambda: Proficiency in AWS Lambda for serverless computing and event-driven architectures.
  • AWS Glue: Well-versed in AWS Glue for ETL (Extract, Transform, Load) processes and data integration.
  • Fivetran HVR: Working knowledge of and hands-on experience with Fivetran HVR.
  • Technical certifications are a plus.

Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes using Snowflake, AWS Glue, Python, and DBT.
  • Collaborate with data scientists, engineers, and analysts to understand data requirements and implement solutions.
  • Develop and manage data extraction, transformation, and loading processes.
  • Optimize existing data pipelines and ETL workflows for performance, scalability, and reliability.
  • Ensure data quality and integrity during the transformation process.
  • Monitor, troubleshoot, and resolve issues with ETL jobs in a timely manner.
  • Maintain detailed documentation of data workflows and processes.
  • Stay current with the latest AWS services, tools, and data engineering best practices.

Benefits

  • Flexible work
  • Healthcare including dental, vision, mental health, and well-being programs
  • Financial well-being programs such as retirement savings plans (e.g., 401(k) in the U.S., RRSP in Canada) and an Employee Share Ownership Plan
  • Paid time off and paid holidays
  • Paid parental leave
  • Family building benefits like adoption assistance, surrogacy, and cryopreservation
  • Social well-being benefits like subsidized back-up child/elder care and tutoring
  • Mentoring, coaching and learning programs
  • Employee Resource Groups
  • Life and disability insurance
  • Employee assistance programs