Senior Data Engineer - Remote

UnitedHealth Group · Raleigh, NC
Remote · Posted 23 hours ago

About The Position

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Optum Insight partners with payers, providers, governments and life sciences companies to simplify and enhance clinical, administrative and financial processes through software-enabled services and analytics, while advancing value-based care. Our differentiated products, technology insights, clinical expertise and analytics support the entire health system, ultimately delivering better experiences for consumers. Optum Insight Technology and Engineering is a critical function in Optum Insight, driving the innovation and value we provide our customers and partners. This team is focused on products, solutions, platform and enabling-capability development, the product development lifecycle, engineering excellence and connectivity to Optum Technology.

About the Role: We're hiring a Senior Data Engineer to build data products on Azure and Databricks. We are looking for someone who can take ownership, drive initiatives with minimal supervision, and embrace responsibility. We value individuals with an open mindset for learning and adapting to new technologies, ensuring our solutions stay ahead in a rapidly evolving data and AI landscape. You'll design enterprise data models, transformations and serving layers that power analytics and AI agents while working within restricted-data environments and modern governance standards.

You'll enjoy the flexibility to work remotely from anywhere within the U.S. as you take on some tough challenges. For all hires in the Minneapolis or Washington, D.C. area, you will be required to work in the office a minimum of four days per week.

Requirements

  • 6+ years of SQL and data modeling experience
  • 3+ years of experience with Databricks/Azure, Python, and PySpark
  • 2+ years of hands-on experience with Azure Data Factory, Delta Lake, and Unity Catalog
  • 1+ years of experience deploying ML models and AI agents to production, with MLOps practices

Nice To Haves

  • Experience working with onshore and offshore teams cross-functionally
  • Experience working with PHI/PII and restricted client data in compliance with HIPAA or similar regulations
  • Proven data modeling and performance tuning skills; security-first mindset in regulated environments
  • Proven communication, stakeholder management, and collaboration skills

Responsibilities

  • Design enterprise data models and ETL/ELT pipelines using Azure Data Factory and Databricks Workflows
  • Develop high-performance PySpark jobs and Databricks SQL queries on Delta Lake
  • Enable AI/ML: build, deploy, and monitor AI agents and ML models (MLflow/Model Serving)
  • Leverage Databricks GenAI for no-code/low-code insights and use GitHub Copilot to accelerate delivery
  • Engineer solutions for clients with restricted data (PII/PHI), using row- and column-level security (RLS/CLS), Unity Catalog, and access controls
  • Design, develop, and deploy AI-powered solutions to address complex business challenges with emphasis on responsible use of AI
  • Leverage enterprise-approved AI tools to streamline workflows, automate tasks, and drive continuous improvement
  • Evaluate emerging trends to inform solution design and strategic innovation

Benefits

  • Comprehensive benefits package
  • Incentive and recognition programs
  • Equity stock purchase plan
  • 401(k) contribution