Principal Data Platform Engineer V

Nationwide Marketing Group LLC
Remote

About The Position

Nationwide Marketing Group (NMG) is seeking a Principal Data Platform Engineer to own the technical vision, architecture, and evolution of the enterprise data platform. This senior-level individual contributor role is responsible for designing, building, and operating scalable, secure, and cost-efficient data lake and data warehouse solutions that support analytics, product, and operational use cases across the organization. In this role, the Principal Data Platform Engineer serves as both architect and hands-on builder, setting platform standards while remaining deeply engaged in implementation and production support. The role partners closely with analytics, product, and business teams to translate data needs into durable platform capabilities, while mentoring data engineers and establishing best practices that enable reliable, self-service data consumption at scale.

Requirements

  • Bachelor’s Degree in Computer Science, Computer Engineering, Software Engineering, Data Engineering, Information Systems, Information Technology, or a related discipline.
  • Minimum 10 years of professional experience in software, data, or platform engineering roles.
  • Minimum of 6 years of hands-on experience building, operating, or leading data engineering or data platform solutions in production environments.
  • Demonstrated expertise in modern data platforms and engineering practices, including DevOps and DataOps methodologies.
  • Hands-on experience designing, implementing, and operating enterprise-scale data warehouses and data lakes.
  • Practical experience implementing operational observability, including metrics, logging, monitoring, and alerting for data systems.
  • Experience operating in small, fast-moving, or high-growth organizations where ownership and adaptability are required.
  • Demonstrated ability to lead technical design reviews, conduct code reviews, and make architectural decisions that impact multiple teams.
  • Proven experience leading or owning migrations from legacy data systems to modern, cloud-based data platforms.

Nice To Haves

  • Advanced, hands-on experience with Google Cloud Platform services, including BigQuery and Dataproc, in production environments.
  • Experience designing, operating, and optimizing relational and non-relational databases, including PostgreSQL, MySQL, and MongoDB.
  • Proficiency in at least one scripting or programming language commonly used in data platforms (e.g., Python, JavaScript).
  • Experience establishing and evolving DataOps practices to support reliable, scalable, production-grade data platforms.
  • Experience working with Kubernetes and containerized applications (e.g., Google Kubernetes Engine).
  • Familiarity with GraphQL-based data access patterns and APIs.
  • Experience building or maintaining CI/CD pipelines (e.g., GitHub Actions or similar).
  • Working knowledge of Spring Core and Spring Boot in data or platform-adjacent services.
  • Familiarity with Hasura or comparable data access or API orchestration frameworks.

Responsibilities

  • Owns the technical vision and architecture of the enterprise data platform, including data lake and data warehouse design.
  • Owns cost, performance, and capacity optimization for the data platform, including BigQuery query efficiency, storage lifecycle management, and workload isolation.
  • Designs and implements data ingestion, transformation, and modeling patterns, including dimensional modeling and slowly changing dimensions.
  • Designs and enforces data security, access controls, and governance standards, including IAM, data classification, retention, and auditing.
  • Defines and maintains data contracts, schemas, and interface standards to enable scalable, multi-team data production and consumption.
  • Leads migrations from legacy data systems to modern, cloud-native GCP architectures while maintaining operational stability.
  • Builds and operates ETL/ELT pipelines using managed tools (e.g., Fivetran or similar) and custom transformation logic.
  • Establishes DataOps practices, including data quality metrics, monitoring, alerting, and incident response.
  • Builds self-service tooling, templates, and reference implementations that enable analytics and product teams to onboard with minimal friction.
  • Ensures the data platform reliably supports mission-critical analytics and operational workflows.
  • Translates ideal-state architectures into phased, executable roadmaps aligned with business priorities.
  • Acts as the primary technical partner to analytics, product, and business stakeholders.
  • Provides technical mentorship, design guidance, and code-level support to data engineers.
  • Remains hands-on across design reviews, complex implementations, and production issue resolution.

Benefits

  • Competitive base pay and performance bonus, dependent on role
  • Medical, dental, and vision benefits with low-cost coverage options
  • Employer-paid basic life and AD&D
  • Employer-paid short-term and long-term disability
  • MetLife supplemental insurance options
  • Matching 401(k) with 100 percent vesting
  • Open PTO policy, paid holidays, and ten weeks of paid parental leave
  • Business casual work environment
  • Rewards and recognition platform where you can earn points and redeem them for merchandise
  • Discounts on electronics, cell phones, travel, wellness, health and auto, pet insurance, and more