Senior Data Engineer

ZoomInfo Technologies LLC, Bethesda, MD (Hybrid)

About The Position

ZoomInfo is where careers accelerate. We move fast, think boldly, and empower you to do the best work of your life. You'll be surrounded by teammates who care deeply, challenge each other, and celebrate wins. With tools that amplify your impact and a culture that backs your ambition, you won't just contribute; you'll make things happen, fast.

We are looking for a highly skilled Senior Data Engineer to join our Enterprise Operations team. This role is ideal for someone who can seamlessly blend technical skills with business acumen: someone who can get the data, shape the data, and build tools to tell the story, and who proactively drives continuous improvement. The ideal candidate has a strong background in big data processing, pipeline orchestration, and data modeling, with a proven track record of delivering scalable, high-quality data solutions in fast-paced, data-centric product environments. Given the dynamic nature of emerging technologies, this role requires someone who excels at exploration and embraces continuous learning as core responsibilities. You'll constantly research and implement innovative solutions while integrating vast, diverse data sources into our AI-powered applications, including our industry-leading LLM-powered systems.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • 6+ years of progressive experience in data engineering and data analytics.
  • Expert-level SQL skills for building performant, scalable queries and transformations on large-scale datasets (see the sketch after this list).
  • Strong Python programming experience.
  • Hands-on experience with dbt (Data Build Tool) for advanced data modeling and transformations in a modern data stack.
  • Deep expertise in Snowflake, including data warehouse design, performance optimization, and cost modeling.
  • Strong understanding of data architecture concepts such as data lakes, ETL/ELT, event-driven architectures (e.g., Kafka), and data mesh.
  • Proven analytical skills, with a focus on translating business requirements into innovative, scalable solutions; experience with Tableau.
  • Proficiency with cloud platforms (GCP and/or AWS) and infrastructure-as-code tools (e.g., Terraform).
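
To make these core requirements concrete, here is a minimal sketch of the kind of Python-plus-SQL work the role involves: a window-function deduplication run against Snowflake via the snowflake-connector-python package. It is illustrative only; the table, column, warehouse, and connection names are hypothetical, not a description of our actual stack.

```python
# Minimal sketch: deduplicate a large table in Snowflake with a window
# function, driven from Python. All object names below are hypothetical.
import os
import snowflake.connector

DEDUP_SQL = """
CREATE OR REPLACE TABLE analytics.contacts_deduped AS
SELECT *
FROM (
    SELECT
        c.*,
        ROW_NUMBER() OVER (
            PARTITION BY email          -- one row per email address
            ORDER BY updated_at DESC    -- keep the most recent record
        ) AS rn
    FROM raw.contacts AS c
)
WHERE rn = 1
"""

def main() -> None:
    # Credentials come from the environment; never hard-code them.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",
        database="ANALYTICS_DB",
    )
    try:
        with conn.cursor() as cur:
            cur.execute(DEDUP_SQL)
    finally:
        conn.close()

if __name__ == "__main__":
    main()
```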

Nice To Haves

  • Familiarity with LLMOps, LangChain, or RAG (Retrieval Augmented Generation) pipelines (see the sketch after this list).
  • Experience implementing Model Context Protocol (MCP) or similar architectures to feed structured and unstructured data into LLM-powered systems.
  • Familiarity with other distributed systems and databases (e.g., Databricks).
  • Production experience with large-scale batch and streaming data processing, including exposure to data science modeling.
  • Hands-on experience with front-end and back-end application development.
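
For candidates newer to RAG, the sketch below shows the retrieval step in a library-agnostic way: embed a query, rank stored text chunks by cosine similarity, and assemble the best matches into an LLM prompt. The `embed` callable stands in for whatever embedding model a real pipeline would use; the corpus and prompt format are hypothetical.

```python
# Minimal, library-agnostic sketch of the retrieval step in a RAG pipeline.
from typing import Callable, List, Tuple
import numpy as np

def top_k_chunks(
    query: str,
    chunks: List[str],
    chunk_vecs: np.ndarray,               # shape (n_chunks, dim), precomputed
    embed: Callable[[str], np.ndarray],   # stand-in for a real embedding model
    k: int = 3,
) -> List[Tuple[float, str]]:
    q = embed(query)
    # Cosine similarity between the query and every stored chunk vector.
    sims = chunk_vecs @ q / (
        np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q) + 1e-9
    )
    best = np.argsort(sims)[::-1][:k]
    return [(float(sims[i]), chunks[i]) for i in best]

def build_prompt(question: str, context: List[str]) -> str:
    # The retrieved chunks become grounding context for the LLM call.
    joined = "\n---\n".join(context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {question}"
```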

Responsibilities

  • Partner with cross-functional teams to gather, validate, and translate business requirements into high-quality data solutions.
  • Design, develop, and maintain high-performance, product-centric data pipelines using Airflow, dbt, and Python (see the sketch after this list).
  • Build and maintain semantic data layers optimized for consumption by LLM-powered applications.
  • Define, monitor, and enforce data quality SLAs across all data pipelines and products, ensuring accuracy, reliability, and lineage.
  • Influence technical decisions by applying your experience with data architecture to improve pipelines, table design, and reporting structures, in collaboration with engineering.
  • Design and develop analytical tools to assess data quality, recommend remediation strategies, implement testing frameworks, and drive continuous improvement.
  • Mentor and coach junior engineers, promoting best practices in code quality, data architecture, and operational excellence.
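
As a concrete illustration of the pipeline and data quality responsibilities above, here is a minimal Airflow DAG sketch that runs dbt models and then enforces quality by failing the pipeline when `dbt test` fails. The DAG id, schedule, and project path are hypothetical.

```python
# Minimal sketch: orchestrate dbt with Airflow and gate downstream work
# on data quality tests. Names and paths are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="enterprise_ops_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    run_models = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/enterprise_ops && dbt run",
    )
    # `dbt test` exits non-zero on failures, which fails this task and,
    # with default trigger rules, halts anything downstream.
    test_models = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/enterprise_ops && dbt test",
    )
    run_models >> test_models
```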

Benefits

  • In addition to comprehensive benefits, we offer holistic mind, body, and lifestyle programs designed for overall well-being.