Zendesk · Posted 2 days ago
Full-time • Mid Level
Hybrid • Austin, TX

Our Enterprise Data & Analytics (EDA) team is looking for an experienced Senior Data Engineer to join our growing data engineering team. You'll work in a collaborative Agile environment, using the latest engineering best practices and involved in all aspects of the software development lifecycle. You will craft and develop curated data products, applying standard architectural and data modeling practices to maintain the foundation data layer that serves as a single source of truth across Zendesk. You will primarily develop Data Warehouse solutions in BigQuery/Snowflake using technologies such as dbt, Airflow, and Terraform.

What you get to do every single day:

  • Collaborate with team members and business partners to collect business requirements, define successful analytics outcomes and design data models
  • Serve as the subject matter expert and spokesperson for data models, able to address questions quickly and accurately
  • Implement Enterprise Data Warehouse by transforming raw data into schemas and data models for various business domains using SQL & dbt
  • Design, build, and maintain ELT pipelines in Enterprise Data Warehouse to ensure reliable business reporting using Airflow, Fivetran & dbt
  • Optimize data warehousing processes by refining naming conventions, enhancing data modeling, and implementing best practices for data quality testing
  • Build analytics solutions that provide practical insights into customer 360, finance, product, sales and other key business domains
  • Build and promote engineering best practices in version control, CI/CD, code review, and pair programming
  • Identify, design, and implement internal process improvements, such as automating manual processes and optimizing data delivery
  • Work with data and analytics experts to strive for greater functionality in our data systems

What you bring to the role:

  • 5+ years of data engineering experience building, maintaining, and operating data pipelines and ETL processes in big data environments
  • 5+ years of experience in Data Modeling and Data Architecture in a production environment
  • 5+ years in writing complex SQL queries
  • 5+ years of experience with Cloud columnar databases (We use Snowflake)
  • 2+ years of production experience working with dbt and designing and implementing Data Warehouse solutions
  • Ability to work closely with data scientists, analysts, and other stakeholders to translate business requirements into technical solutions.
  • Strong documentation skills for pipeline design and data flow diagrams.
  • Intermediate experience with at least one programming language (Python, Go, Java, or Scala); we primarily use Python
  • Experience integrating with third-party SaaS application APIs such as Salesforce and Zuora
  • Ensure data integrity and accuracy by conducting regular data audits, identifying and resolving data quality issues, and implementing data governance best practices.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Hands-on experience with Snowflake data platform, including administration, SQL scripting, and query performance tuning
  • Good knowledge of both modern and classic data modeling approaches (Kimball, Inmon, etc.)
  • Demonstrated experience in one or many business domains (Finance, Sales, Marketing)
  • 3+ completed “production-grade” projects with dbt
  • Expert knowledge of Python