Senior Data Analytics Engineer

BlueVoyant · College Park, MD
Remote

About The Position

We are seeking a GTM Data Analytics Engineer to design, build, and scale our analytics platform from the ground up. You will own our Snowflake environment end to end, build robust data pipelines from core SaaS tools (Salesforce, Gong, Spiff, Clay, Common Room) and internal product usage data, and enable self-service analytics via Power BI or Tableau. You will partner closely with RevOps, Sales, CS, Product, and Finance, and regularly present to senior leadership.

Requirements

  • 5+ years of data engineering experience, with 3+ years hands-on in Snowflake and a track record of building a Snowflake environment from scratch.
  • Deep expertise with Snowflake features: virtual warehouses, RBAC, security policies, Time Travel, zero-copy cloning, Snowpipe, Streams, Tasks, materialized views, and performance tuning (see the Streams/Tasks sketch after this list).
  • Proven experience building pipelines from Salesforce, Gong, Spiff, Clay, and Common Room, plus internal product usage telemetry (e.g., events, logs). Comfortable with third-party connectors (e.g., Fivetran/Stitch), APIs/webhooks, and Python-based ingestion.
  • Strong SQL and Python; experience with dbt for transformation and testing (see the dbt-style model sketched after this list), Git-based workflows, and CI/CD.
  • Advanced data modeling skills (dimensional modeling/Kimball), data warehousing concepts, and building semantic layers for BI.
  • Extensive experience with Power BI or Tableau, including data modeling (e.g., DAX or Tableau calculations), dashboard design, and performance optimization with Snowflake as the backend.
  • Excellent communication skills and comfort presenting to senior leadership; ability to translate technical details into business value and actionable insights.
  • Experience in a major cloud (AWS, Azure, or GCP) including IAM/SSO, networking, secrets management, and logging/monitoring.
  • Strong focus on data quality, documentation, and operational excellence.
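
For context, the Streams/Tasks pattern referenced above looks roughly like the sketch below. Every object name (raw.salesforce.opportunity, transform_wh, analytics.dim_opportunity) is a hypothetical placeholder, not part of our actual environment.

    -- Sketch: capture changes on a (hypothetical) raw Salesforce table with a
    -- Stream, then merge them downstream on a schedule with a Task. Delete
    -- handling via METADATA$ACTION is omitted for brevity.
    CREATE OR REPLACE STREAM raw.salesforce.opportunity_stream
      ON TABLE raw.salesforce.opportunity;

    CREATE OR REPLACE TASK analytics.merge_opportunities
      WAREHOUSE = transform_wh   -- hypothetical warehouse
      SCHEDULE  = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('raw.salesforce.opportunity_stream')
    AS
      MERGE INTO analytics.dim_opportunity d
      USING raw.salesforce.opportunity_stream s
        ON d.opportunity_id = s.id
      WHEN MATCHED THEN UPDATE SET d.stage = s.stage_name, d.amount = s.amount
      WHEN NOT MATCHED THEN INSERT (opportunity_id, stage, amount)
        VALUES (s.id, s.stage_name, s.amount);

    ALTER TASK analytics.merge_opportunities RESUME;  -- tasks start suspended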
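
Similarly, a minimal dbt-style staging model of the kind the SQL/dbt bullet implies; the source, columns, and types are assumptions for illustration only.

    -- models/staging/stg_salesforce__opportunities.sql (hypothetical dbt model)
    -- Renames and types raw Salesforce opportunities into a clean staging layer;
    -- unique/not_null tests on opportunity_id would live in schema.yml.
    with source as (
        select * from {{ source('salesforce', 'opportunity') }}
    )
    select
        id                        as opportunity_id,
        account_id,
        stage_name                as stage,
        amount::number(18, 2)     as amount,
        close_date::date          as close_date,
        not is_deleted            as is_active
    from source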

Nice To Haves

  • Experience with RevOps data, CRM schemas, Gong audio/transcript metadata, compensation/commission data (Spiff), lead enrichment (Clay), and community/product signals (Common Room).
  • Familiarity with product analytics tools and event pipelines (Segment, Snowplow, Amplitude) and modeling product usage and adoption metrics (an example metric query follows this list).
  • SnowPro Core/Advanced Architect certification or equivalent cloud certifications.
  • Experience with metric layer tooling, data contracts, and governance frameworks.
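
As an illustration of the product-adoption modeling mentioned above, a weekly-active-accounts metric might be sketched like this (the event table and columns are assumed):

    -- Hypothetical weekly-active-accounts metric over an assumed event fact table.
    select
        date_trunc('week', event_ts)   as activity_week,
        count(distinct account_id)     as weekly_active_accounts
    from analytics.fct_product_events
    where event_ts >= dateadd('week', -12, current_date)
    group by 1
    order by 1;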

Responsibilities

  • Architect and stand up a Snowflake environment, including account setup, databases and schemas, virtual warehouse strategy, RBAC, security, data governance, cost controls, and monitoring.
  • Design and implement reliable ELT pipelines from Salesforce, Gong, Spiff, Clay, Common Room, and internal product data sources using connectors, APIs, and Snowflake-native capabilities (Snowpipe, Streams, Tasks).
  • Ingest and normalize semi-structured data (JSON/VARIANT; see the flattening sketch after this list), implement CDC, and build resilient orchestration with tools such as Airflow, Prefect, Dagster, or cloud-native schedulers.
  • Model data for analytics using best practices (dimensional modeling/Kimball, data marts, medallion architecture) to create trusted, documented datasets for GTM, Product, and Finance use cases.
  • Own dbt-based transformations, testing, and CI/CD, including version control, code reviews, and deployment workflows.
  • Establish data quality, reliability, and lineage standards with automated tests, monitoring, alerting, and SLAs; create runbooks and documentation.
  • Build and optimize BI assets in Power BI or Tableau, including semantic models, governed metrics, executive-ready dashboards, and query performance optimization against Snowflake.
  • Partner with stakeholders to define and operationalize key metrics (pipeline, conversion, ARR, retention, product adoption), and regularly present findings and recommendations to leadership.
  • Implement and maintain security and compliance controls (PII handling, data masking, role-based access, auditing, retention); a masking-policy sketch follows this list.
  • Monitor and tune performance and cost (warehouse sizing, caching, micro-partitioning, clustering, materialized views; see the cost-control sketch after this list); drive continuous improvement in reliability and efficiency.
  • Evaluate and integrate new tools and vendors; contribute to data platform roadmap and best practices.
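
A small sketch of the VARIANT normalization called out above, using Snowflake's LATERAL FLATTEN; the payload shape and field names are hypothetical:

    -- Sketch: normalize a semi-structured payload (e.g., a Gong call record)
    -- landed in a VARIANT column v; the payload shape is assumed.
    select
        v:callId::string            as call_id,
        v:started::timestamp_ntz    as started_at,
        p.value:speakerId::string   as speaker_id,
        p.value:affiliation::string as affiliation
    from raw.gong.calls,
         lateral flatten(input => v:parties) p;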
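
For the PII controls, a column-level masking policy is one standard Snowflake mechanism. This sketch assumes a hypothetical PII_ADMIN role and dim_contact table:

    -- Sketch: column-level masking so only a privileged role sees raw email PII.
    CREATE OR REPLACE MASKING POLICY governance.mask_email AS (val string)
      RETURNS string ->
      CASE
        WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val   -- hypothetical role
        ELSE '***MASKED***'
      END;

    ALTER TABLE analytics.dim_contact
      MODIFY COLUMN email SET MASKING POLICY governance.mask_email;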
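
And for the cost side, a resource monitor plus auto-suspend is a typical starting point; the quota and warehouse name are placeholders:

    -- Sketch: cap spend with a resource monitor and aggressive auto-suspend.
    CREATE OR REPLACE RESOURCE MONITOR rm_transform
      WITH CREDIT_QUOTA = 100   -- hypothetical monthly credit budget
      TRIGGERS ON 80 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND;

    ALTER WAREHOUSE transform_wh SET
      RESOURCE_MONITOR = rm_transform,
      AUTO_SUSPEND     = 60;   -- suspend after 60 seconds idle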