TATA Consulting Services • Posted about 1 month ago
$130,000 - $140,000/Yr
Full-time • Mid Level
Woodland Hills, CA
5,001-10,000 employees
Professional, Scientific, and Technical Services

We are seeking a highly skilled DBT Data Engineer / Developer to design, develop, and optimize scalable ELT pipelines and data models using DBT and Snowflake. You will play a key role in ensuring data quality, driving architectural decisions, and enabling analytics across the organization through well-managed, production-grade data transformations.

Responsibilities:
  • Design, develop, and optimize modular DBT data models (staging, intermediate, marts) using SQL and Jinja (a model sketch follows this list).
  • Build and maintain ELT pipelines using DBT and other ingestion tools.
  • Collaborate with data engineers, analysts, and business stakeholders to translate requirements into data transformations.
  • Implement and maintain automated testing, documentation, and version control for DBT projects.
  • Optimize Snowflake data warehouse performance for both storage and query cost efficiency.
  • Drive data modeling and architecture decisions using best practices and reusable design patterns.
  • Troubleshoot and resolve issues in DBT models and data pipelines.
  • Maintain up-to-date and well-organized DBT Docs and metadata.
  • Ensure data governance and lineage visibility through structured development.
  • Participate in agile ceremonies, sprint planning, and backlog grooming.
  • Support continuous integration and deployment using CI/CD pipelines for DBT projects.
  • Stay informed about modern data stack trends and contribute to innovation.
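
For illustration only, a minimal staging model of the kind described above might look like the following; the source name ('raw', 'orders') and the column names are hypothetical, not taken from this posting.

    -- models/staging/stg_orders.sql
    -- A minimal dbt staging model sketch: the source() macro resolves the
    -- raw table and records lineage, and config() sets the materialization.
    -- All table and column names here are assumptions for illustration.

    {{ config(materialized='view') }}

    with source as (

        select * from {{ source('raw', 'orders') }}

    ),

    renamed as (

        -- rename and type-cast raw columns into a consistent interface
        select
            id         as order_id,
            customer_id,
            status     as order_status,
            cast(amount as numeric(12, 2)) as order_amount,
            created_at as ordered_at
        from source

    )

    select * from renamed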

Required skills and experience:
  • Hands-on experience with DBT (Data Build Tool) in a production environment.
  • Strong SQL skills, including advanced joins, CTEs, window functions, and query optimization (see the example after this list).
  • Experience working with Snowflake (or similar cloud data warehouse platforms).
  • Understanding of ELT/ETL pipelines, data transformation, and data modeling principles.
  • Familiarity with Jinja macros, DBT configurations, and best practices for modular data design.
  • Knowledge of version control systems like Git.
  • Experience working in agile teams and cross-functional collaboration.
  • Strong analytical thinking, problem-solving abilities, and communication skills.
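
As a concrete, hypothetical example of the SQL level expected, the query below deduplicates a raw events table using a CTE and the row_number() window function, keeping the most recently loaded row per event_id; the table and column names are illustrative.

    -- Keep only the latest row per event_id from a raw events table.
    -- raw.events, event_id, and loaded_at are assumed names for illustration.

    with ranked_events as (

        select
            *,
            row_number() over (
                partition by event_id
                order by loaded_at desc
            ) as row_num
        from raw.events

    )

    select *
    from ranked_events
    where row_num = 1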

Preferred qualifications:
  • Experience with orchestration tools such as Apache Airflow, Dagster, or Prefect.
  • Exposure to Fivetran, Stitch, or other data ingestion tools.
  • Familiarity with data governance, data lineage, and metadata management frameworks.
  • Hands-on with CI/CD tools (e.g., GitHub Actions, GitLab CI, Jenkins) for DBT deployment automation.
  • Experience with BI tools (e.g., Looker, Tableau, Power BI) and understanding of analytics needs.
  • Knowledge of data contracts, observability, and monitoring tools (a test sketch follows this list).
  • Understanding of data security, access controls, and compliance in a cloud environment.
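
As a sketch of the testing and data-contract work mentioned above, a singular dbt test is simply a SQL file whose returned rows represent failures; the model name stg_orders and the non-negativity rule here are assumptions for illustration.

    -- tests/assert_order_amounts_non_negative.sql
    -- dbt runs this query during `dbt test`; any rows returned fail the test.
    -- stg_orders and the rule itself are hypothetical examples.

    select
        order_id,
        order_amount
    from {{ ref('stg_orders') }}
    where order_amount < 0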