Senior Principal Data Engineer

UKG | Atlanta, GA
$184,300 - $230,000

About The Position

As a Senior Principal Data Engineer within the Enterprise Data & Analytics Office, you will lead the design, evolution, and operational excellence of Nexus, UKG’s unified, GCP-based data platform, enabling analytics, AI, and enterprise decision-making at scale. You will translate architectural vision into reliable, scalable, cost-efficient data systems, lead platform modernization efforts, and set the engineering bar for data ingestion, transformation, and operational analytics across UKG.

Requirements

  • 15+ years of experience in data engineering, data platforms, or large-scale analytics systems.
  • Proven leadership delivering enterprise-scale data platforms and migrations.
  • Deep hands-on experience operating large-scale (petabyte) data platforms.
  • GCP (required): BigQuery, GCS, Cloud Composer, Pub/Sub, Dataflow, Dataform, Dataproc, Dataplex.
  • Databricks: Delta Lake, Delta Live Tables, Spark, Unity Catalog.
  • Strong proficiency in Python, SQL, Spark, and distributed systems.
  • Expertise in data modeling, ELT patterns, CDC, and streaming architectures.
  • Experience supporting AI/ML pipelines, feature stores, and analytics consumption.
  • Strong understanding of Dimensional Modeling, ELT and modern data lakehouse patterns.
  • Proven ability to influence without authority across engineering, product, and architecture teams.
  • Strong communication skills with both technical and executive audiences.
  • Track record of mentoring senior engineers and leading complex initiatives.

Nice To Haves

  • Experience modernizing legacy warehouses and ETL platforms.
  • Familiarity with Data Mesh execution and domain-aligned data products.
  • Experience implementing FinOps practices for analytics platforms.
  • Exposure to AI/ML enablement, feature engineering, or agent-driven analytics.
  • Google Cloud certifications (Data Engineer, Architect).

Responsibilities

  • Drive the implementation of standardized medallion architecture (Bronze–Silver–Gold) across all domains.
  • Establish and enforce engineering patterns for batch, streaming, CDC, and event-driven pipelines.
  • Drive design patterns for Fact and Dimension modeling, data mesh, and change data capture (CDC).
  • Design and evolve scalable data pipelines using GCP-native services (BigQuery, GCS, Composer, Pub/Sub, Dataflow) and Databricks (Delta Lake, DLT).
  • Ensure platform reliability, performance, and scalability for petabyte-scale analytics workloads.
  • Lead engineering decisions around partitioning, clustering, query optimization, and cost controls (FinOps).
  • Own observability standards (freshness, SLAs, error handling, monitoring).
  • Define and enforce engineering standards for data quality, testing, CI/CD, and production readiness.
  • Implement automated data quality frameworks, lineage, and metadata integration in partnership with Data Governance and Architecture teams.
  • Ensure all pipelines meet SOX, security, and auditability requirements.
  • Partner with Architects to ensure engineering implementations align to enterprise reference architectures.
  • Partner with Product Managers, Product Owners, Architects, and Analytics leaders to deliver high-impact data capabilities.
  • Act as the technical escalation point for complex production issues and platform risks.
  • Influence roadmap decisions by balancing technical feasibility, delivery velocity, and long-term sustainability.
  • Support ESE Agile delivery by enabling architecture runway and reducing technical blockers.
  • Mentor Principal, Senior, and Mid-level Data Engineers, raising the overall engineering maturity of the organization.
  • Lead by example through design reviews, code reviews, and engineering forums.
  • Build reusable frameworks, libraries, and accelerators that scale engineering productivity.
  • Foster a culture of ownership, quality, and continuous improvement.