DBT/Informatica Lead

NTT DATA
Chicago, IL

About The Position

NTT DATA is seeking a DBT/Informatica Lead to drive the modernization of legacy Informatica ETL workloads onto dbt and Snowflake within the Azure ecosystem (ADF, ADLS, Synapse/Databricks, Key Vault, Azure DevOps/GitHub). The role combines architectural leadership — defining the target ELT architecture, modeling standards, and the migration roadmap — with hands-on development of dbt models, tests, and macros, and ownership of code quality practices such as code reviews, pull requests, and modular, reusable models and packages. The full scope is detailed under Responsibilities.
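The performance-optimized patterns the role calls for (incremental models, clustering) typically take this shape in dbt on Snowflake. A minimal sketch — the model and column names (`stg_orders`, `order_id`, `updated_at`) are illustrative, not from the posting:

```sql
-- models/core/fct_orders.sql
-- Sketch of a dbt incremental model targeting Snowflake; table and
-- column names are hypothetical examples.
{{
    config(
        materialized='incremental',
        unique_key='order_id',
        cluster_by=['order_date']
    )
}}

select
    order_id,
    customer_id,
    order_date,
    order_total,
    updated_at
from {{ ref('stg_orders') }}

{% if is_incremental() %}
  -- On incremental runs, only process rows newer than what is already loaded
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

On the first run dbt builds the full table; on subsequent runs the `is_incremental()` block restricts the scan to new rows, and `unique_key` lets Snowflake merge updates rather than append duplicates.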

Requirements

  • 8+ years total experience in Data Engineering / ETL / Data Warehousing
  • 3+ years hands-on experience with dbt (Core or Cloud) building production-grade pipelines
  • Proven experience leading an Informatica → dbt migration to Snowflake on Azure (or similar large-scale ETL modernization)
  • Strong Snowflake experience: designing and developing schemas, views, warehouses, and performance optimization
  • Solid working knowledge of Azure data stack: Azure Data Factory, ADLS, Azure DevOps/GitHub

Responsibilities

  • Define the target ELT architecture using dbt on Snowflake, integrated with Azure services (ADF, ADLS, Synapse/Databricks, Key Vault, Azure DevOps/GitHub)
  • Translate legacy Informatica mappings, workflows, and sessions into modular dbt models (staging, core, mart layers)
  • Establish modeling standards (naming conventions, layer design, folder/package structure) for staging, integration, and mart layers
  • Define and implement performance-optimized patterns in dbt and Snowflake (incremental models, clustering, partitioning logic, query tuning)
  • Lead the migration strategy, roadmap, and wave planning for converting Informatica jobs to dbt on Snowflake
  • Analyze existing ETL logic, dependencies, and schedules in Informatica and design equivalent or improved logic in dbt
  • Design a repeatable migration factory: templates, accelerators, mapping spreadsheets, and conversion playbooks for Informatica → dbt
  • Oversee conversion, unit testing, and parallel runs to validate that dbt models match legacy outputs (row counts, aggregates, business rules)
  • Lead hands-on development of dbt models, seeds, snapshots, tests, macros, and documentation
  • Define and implement testing strategy using dbt tests (schema tests, data tests, custom tests) and integrate with broader data quality checks
  • Set up and maintain dbt environments (dev/test/prod), profiles, and connections to Snowflake on Azure
  • Introduce and enforce code quality practices
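The testing strategy described above usually starts with declarative schema (generic) tests in YAML, which also serve as the validation layer when comparing dbt outputs against legacy Informatica results. A minimal sketch, with model and column names invented for illustration:

```yaml
# models/core/schema.yml
# Sketch of dbt generic tests; model/column names are placeholders.
version: 2

models:
  - name: fct_orders
    description: "Order facts migrated from a legacy Informatica mapping"
    columns:
      - name: order_id
        tests:
          - unique
          - not_null
      - name: customer_id
        tests:
          - not_null
          - relationships:
              to: ref('dim_customers')
              field: customer_id
```

Custom business-rule checks (row counts, aggregate reconciliation against legacy outputs) would layer on top of these as singular or packaged tests.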
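Environment setup (dev/test/prod) with dbt Core is driven by targets in `profiles.yml`, each pointing at a separate Snowflake database and warehouse. A hedged sketch — the account, role, warehouse, and database values below are placeholders, not details from the posting:

```yaml
# ~/.dbt/profiles.yml
# Sketch of dev/prod targets for Snowflake on Azure; all identifiers
# are hypothetical.
analytics:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: myorg-azure_eastus2        # placeholder account locator
      user: "{{ env_var('DBT_USER') }}"
      authenticator: externalbrowser      # e.g. Azure AD SSO for developers
      role: TRANSFORMER_DEV
      database: ANALYTICS_DEV
      warehouse: WH_DEV
      schema: dbt_dev
      threads: 4
    prod:
      type: snowflake
      account: myorg-azure_eastus2
      user: "{{ env_var('DBT_USER') }}"
      password: "{{ env_var('DBT_PASSWORD') }}"
      role: TRANSFORMER_PROD
      database: ANALYTICS_PROD
      warehouse: WH_PROD
      schema: analytics
      threads: 8
```

Keeping credentials in environment variables (or Azure Key Vault, as the posting suggests) rather than in the file itself is the usual practice.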


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 5,001-10,000 employees
