ETL Big Data

CTG (Computer Task Group, Inc.), New York, NY
Onsite

About The Position

CTG is seeking an experienced ETL Big Data Engineer to design and maintain ETL pipelines for our client in NYC.

Duration: 12 months
Location: Onsite, 3 days per week in person at the NYC Amex Tower

Key responsibilities, required skills, and qualifications are listed in the sections below.

CTG does not accept unsolicited resumes from headhunters, recruitment agencies, or fee-based recruitment services for this role.

To Apply

To be considered, please apply directly to this requisition using the link provided. For additional information, please contact the recruiter, Rebecca Olan, at [email protected]. Kindly forward this posting to any other interested parties. Thank you!

About CTG

CTG, a Cegeka company, is at the forefront of digital transformation, providing IT and business solutions that accelerate project momentum and deliver desired value. Over nearly 60 years, we have earned a reputation as a faster, more reliable, results-driven partner. Our vision is to be an indispensable partner to our clients and the preferred career destination for digital and technology experts. CTG leverages the expertise of over 9,000 team members in 19 countries to provide innovative solutions. Together, we operate across the Americas, Europe, and India, working in close cooperation with over 3,000 clients in many of today's highest-growth industries. For more information, visit www.ctg.com.

Our culture is a direct result of the people who work at CTG, the values we hold, and the actions we take. In other words, our people define our culture. It is a living, breathing thing that is renewed every day through the ways we engage with each other, our clients, and our communities. Part of our mission is to cultivate a workplace that attracts and develops the best people, reflected by our recognition as a Great Place to Work Certified company across many of our global operations.

CTG will consider for employment all qualified applicants, including those with criminal histories, in a manner consistent with the requirements of all applicable local, state, and federal laws. CTG is an Equal Opportunity Employer. CTG will assure equal opportunity and consideration to all applicants and employees in recruitment, selection, placement, training, benefits, compensation, promotion, transfer, and release of individuals without regard to race, creed, religion, color, national origin, sex, sexual orientation, gender identity and gender expression, age, disability, marital or veteran status, citizenship status, or any other characteristic protected by law. CTG is fully committed to promoting employment opportunities for members of protected classes.

Requirements

  • Python, SQL, RDBMS, Node.js, MS SQL
  • Strong ETL pipeline design and development experience
  • Expertise in OOP, data modeling, and workflow orchestration
  • Familiarity with Spark (a plus)
  • Proven experience with big data technologies and ETL pipelines
  • Hands-on with Airflow, BigQuery, Databricks Delta Lakehouse
  • Bachelor's in CS, IT, or related field (or equivalent experience)
  • Excellent verbal and written English communication skills and the ability to interact professionally with a diverse group

Nice To Haves

  • IBM Apptio

Responsibilities

  • Build and maintain Python-based ETL pipelines from scratch.
  • Develop reusable, efficient code following OOP principles.
  • Model data with schemas and entity relationships for data warehouses.
  • Write and optimize SQL queries across multiple database platforms (SQL Server, DB2, Oracle).
  • Develop Airflow DAGs for workflow orchestration (an illustrative sketch follows this list).
  • Programmatically ingest, cleanse, govern, and report data from BigQuery and Databricks Delta Lakehouse.
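
For illustration only, and not part of the client's requirements: a minimal sketch of the kind of Python-based ETL pipeline and Airflow DAG described above, assuming the Airflow 2.x TaskFlow API. All DAG, task, and data names here are hypothetical placeholders.

```python
# Illustrative sketch of an extract/transform/load DAG (Airflow 2.x TaskFlow API).
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder: in practice this would read from SQL Server, DB2, Oracle,
        # BigQuery, or a Databricks Delta table via the appropriate hook.
        return [{"id": 1, "amount": "42.50"}, {"id": 2, "amount": None}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Cleanse: drop records with missing amounts and normalize types.
        return [
            {"id": r["id"], "amount": float(r["amount"])}
            for r in rows
            if r["amount"] is not None
        ]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder for a warehouse write (e.g. a BigQuery insert or Delta merge).
        print(f"Loading {len(rows)} cleansed rows")

    load(transform(extract()))


example_etl_pipeline()
```

In a real pipeline, each task would typically delegate to Airflow hooks or operators for the specific source and target systems rather than passing in-memory lists between tasks.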

What This Job Offers

Career Level: Mid Level
Industry: Administrative and Support Services
Number of Employees: 1,001-5,000
