Principal Data Engineer

AdvancedMD
South Jordan, UT (Hybrid)

About The Position

AdvancedMD is a unified cloud suite of medical office software, hosted on Amazon Web Services (AWS), that includes practice management, electronic health records, and patient engagement, and offers managed medical billing services for independent practices. AdvancedMD serves an expansive national footprint of 65,000 practitioners across 14,000 practices and 900 independent medical billing companies. 8.8M insurance claims are processed every month on the AdvancedMD billing platform!

Role Summary

Are you a senior data engineering leader who thrives on designing scalable, high-performance data platforms that drive real business impact? We are seeking a Principal Data Engineer to lead the architecture, development, and optimization of enterprise data solutions that power analytics, integrations, and data-driven decision making across the organization.

In this role, you’ll serve as a technical authority for cloud-based data platforms, leveraging AWS and Snowflake to build secure, reliable, and cost-effective data ecosystems. You’ll design modern ELT/ETL pipelines, define data modeling and governance standards, and enable seamless data exchange through RESTful and event-driven APIs. Partnering closely with data scientists, analysts, application teams, and business stakeholders, you’ll ensure the data platform aligns with long-term business objectives while supporting both real-time and batch use cases.

This is an exciting opportunity for a hands-on technical leader who enjoys mentoring engineers, shaping architectural direction, and continuously evolving a modern data stack. If you’re passionate about cloud data engineering, platform scalability, and building resilient data solutions in a highly collaborative environment, we’d love to meet you.

Requirements

  • Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field (Master’s preferred)
  • 10+ years of progressive experience in data engineering, data architecture, or a related technical discipline
  • 5+ years of hands-on experience designing and building data solutions on AWS (S3, Glue, Lambda, Redshift, Step Functions, IAM)
  • 3+ years of experience with Snowflake, including performance tuning, data sharing, and administration
  • Strong proficiency in SQL, Python, and at least one additional programming language (Java, Scala, or similar)
  • Demonstrated experience designing and consuming RESTful APIs, including API gateway management and integration patterns
  • Experience with Infrastructure as Code (Terraform or CloudFormation) and CI/CD pipeline development
  • Deep understanding of data modeling methodologies (dimensional, data vault, normalized)

Nice To Haves

  • Experience as a technical leader or mentor
  • Experience with Apigee API management platform for enterprise API governance, security policies, and developer portal management
  • AWS certifications (Solutions Architect Professional, Data Analytics Specialty, or Data Engineer Associate)
  • SnowPro Advanced certifications (Data Engineer or Architect)
  • Experience with dbt for data transformation and analytics engineering workflows
  • Familiarity with data observability and quality tools (Monte Carlo, Great Expectations, or similar)
  • Experience with Apache Kafka, Kinesis, or other streaming technologies for real-time data pipelines
  • Knowledge of data privacy regulations (HIPAA, GDPR, CCPA) and implementing compliant data architectures
  • Experience with containerization (Docker, ECS, EKS) and microservices architectures
  • Background in healthcare, financial services, or other highly regulated industries

Responsibilities

  • Architect and maintain enterprise data platforms using AWS services (S3, Glue, Lambda, Redshift, Step Functions, EventBridge) and Snowflake, ensuring high availability, scalability, and performance
  • Design and implement RESTful and event-driven API architectures to enable secure, governed data exchange across internal systems and external partners
  • Build and optimize ELT/ETL pipelines for batch and real-time data ingestion, transformation, and delivery using tools such as AWS Glue, dbt, and Snowpipe
  • Define and enforce data modeling standards, including dimensional modeling, data vault, and schema design best practices within Snowflake
  • Establish and maintain data governance frameworks, including data quality monitoring, lineage tracking, metadata management, and access controls
  • Lead cost optimization initiatives across AWS and Snowflake environments, including warehouse sizing, storage strategies, and compute resource management
  • Collaborate with data scientists, analysts, and application teams to design data products and self-service analytics capabilities
  • Develop and maintain CI/CD pipelines for data infrastructure using Infrastructure as Code (Terraform, CloudFormation) and version control best practices
  • Mentor and provide technical leadership to data engineers, conducting architecture reviews and establishing coding standards across the team
  • Evaluate and integrate emerging technologies and tools to continuously improve the data platform’s capabilities, reliability, and efficiency

Benefits

  • Competitive compensation and total rewards benefits
  • Comprehensive health, dental, and vision insurance
  • 401(k) with generous company match
  • Paid time off and holidays
  • Hybrid and remote work opportunities
  • Career growth and development support
  • Collaborative, team-oriented culture