USA_Developer

Varite - Dallas, TX
$55 - $60 per hour

About The Position

This position is for a Data Engineer with a focus on ETL/ELT processes and data warehousing using Snowflake. The role requires a strong background in Python development and experience in AWS environments. The candidate will be responsible for building data pipelines, processing structured and semi-structured data, and ensuring data quality and performance tuning.

Requirements

  • 5 years of hands-on Python development experience as an ETL/ELT Data Engineer.
  • Experience with data warehousing on Snowflake.
  • Hands-on coding experience with PySpark and Python DataFrames.
  • Proficiency in structured and semi-structured data processing.
  • Experience writing Snowflake SQL and stored procedures for ETL transformations.
  • Expertise in ELT using Snowflake SQL, Stored Procedures, Tasks/Streams, and Snowpark (see the sketch after this list).
  • Strong communication skills.
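
As a rough illustration of the ELT work called out above, the sketch below uses the Snowpark Python API to read a raw table, apply a simple transformation, and persist the result. The connection parameters, table names, and transformation logic are hypothetical and not part of this posting.

  # Minimal Snowpark ELT sketch; all names below are hypothetical.
  from snowflake.snowpark import Session
  import snowflake.snowpark.functions as F

  # Hypothetical connection parameters; real values come from your environment.
  connection_parameters = {
      "account": "<account>",
      "user": "<user>",
      "password": "<password>",
      "warehouse": "<warehouse>",
      "database": "<database>",
      "schema": "<schema>",
  }
  session = Session.builder.configs(connection_parameters).create()

  # Read raw orders, keep only new records, and derive a date column.
  orders = session.table("RAW.ORDERS")
  cleaned = (
      orders.filter(F.col("STATUS") == "NEW")
            .with_column("ORDER_DATE", F.to_date(F.col("CREATED_AT")))
  )

  # Persist the transformed data for downstream consumers.
  cleaned.write.mode("overwrite").save_as_table("ANALYTICS.ORDERS_CLEAN")
  session.close()

The same transformation could also be expressed as a Snowflake stored procedure or a Stream/Task pair; Snowpark simply keeps the logic in Python alongside the rest of the pipeline code.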

Nice To Haves

  • Working knowledge of job scheduling, preferably with Control-M, AutoSys, or Airflow (see the sketch after this list).
  • Experience with implementing data quality checks and error handling in data pipelines.
  • SnowPro Core Certification.
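
For the scheduling item above, a minimal Airflow sketch is shown below; the DAG id, schedule, and task body are hypothetical and only illustrate how an ETL step might be wired into a daily schedule.

  # Minimal Airflow DAG sketch; names and schedule are hypothetical.
  from datetime import datetime

  from airflow import DAG
  from airflow.operators.python import PythonOperator

  def run_etl():
      # Placeholder for the actual ETL/ELT step (e.g., a Snowpark or PySpark job).
      print("Running ETL step...")

  with DAG(
      dag_id="daily_orders_etl",
      start_date=datetime(2024, 1, 1),
      schedule="@daily",  # Airflow 2.4+; older releases use schedule_interval
      catchup=False,
  ) as dag:
      PythonOperator(task_id="run_etl", python_callable=run_etl)

Control-M and AutoSys cover the same ground with job definitions rather than Python code.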

Responsibilities

  • Develop and maintain ETL/ELT processes using Python and Snowflake.
  • Build and optimize data pipelines for data ingestion and processing.
  • Write and maintain complex SQL queries and stored procedures for data transformation.
  • Implement data quality checks and error handling in data pipelines (see the sketch after this list).
  • Communicate effectively with team members and stakeholders to articulate technical concepts.
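
As a sketch of the data quality and error handling responsibility above, the following Python example validates a batch before load; the column names, checks, and thresholds are hypothetical.

  # Minimal data quality check sketch; column names and rules are hypothetical.
  import logging

  import pandas as pd

  logger = logging.getLogger("pipeline.quality")

  class DataQualityError(Exception):
      """Raised when a batch fails a data quality check."""

  def validate_orders(df: pd.DataFrame) -> pd.DataFrame:
      """Run basic checks before loading a batch: required columns, nulls, duplicates."""
      required = {"order_id", "customer_id", "amount"}
      missing = required - set(df.columns)
      if missing:
          raise DataQualityError(f"Missing required columns: {sorted(missing)}")
      if df["order_id"].isna().any():
          raise DataQualityError("Null order_id values found")
      dup_count = int(df.duplicated(subset=["order_id"]).sum())
      if dup_count:
          logger.warning("Dropping %d duplicate order_id rows", dup_count)
          df = df.drop_duplicates(subset=["order_id"])
      return df

  # Example usage with a tiny in-memory batch.
  raw_batch = pd.DataFrame(
      {"order_id": [1, 2, 2], "customer_id": [10, 11, 11], "amount": [25.0, 40.0, 40.0]}
  )
  try:
      clean = validate_orders(raw_batch)
      logger.info("Batch passed checks with %d rows", len(clean))
  except DataQualityError as exc:
      logger.error("Batch rejected: %s", exc)
      raise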

Benefits

  • Competitive pay rate of $55 - $60/hr.
  • Remote work option with required availability in the EST or CST time zones.

What This Job Offers

Industry

Professional, Scientific, and Technical Services

Number of Employees

251-500 employees
