Sr. Data Engineer (Hybrid In-Office / Remote)

UnitedHealth Group
Little Rock, AR
Hybrid

About The Position

Optum Insight is improving the flow of health data and information to create a more connected system. We remove friction and drive alignment among care providers, payers, and, ultimately, consumers. Our deep industry expertise and innovative technology empower us to help organizations reduce costs while improving risk management, quality, and revenue growth. Ready to help us deliver results that improve lives? Join us and start Caring. Connecting. Growing together.

As a Senior Data Engineer on the Optum Data Management team, you will design, build, and operate scalable, metadata-driven data pipelines supporting the Arkansas Medicaid Decision Support System (DSS). The position focuses on developing and optimizing ETL/ELT workflows using Azure Data Factory, Snowflake SQL, and supporting technologies to enable reliable ingestion, integration, and delivery of state Medicaid and CMS data. The Senior Data Engineer leads efforts to resolve data quality and performance issues, supports automated inbound and outbound data extracts for intrastate agencies, and collaborates directly with state partners to implement new file-based and system integrations from detailed technical specifications. This role requires strong SQL proficiency, a deep understanding of enterprise data warehousing patterns, and the ability to operate independently while contributing to complex, regulated data environments.

You’ll enjoy the flexibility to work remotely from anywhere within the U.S. as you take on some tough challenges. For all hires in the Minneapolis or Washington, D.C. areas, a minimum of four days per week in the office is required.

Requirements

  • 5+ years of hands-on data engineering experience, with an emphasis on enterprise data warehousing and ETL architectures
  • 5+ years of ETL development experience designing and building pipelines and dataflows using Azure Data Factory and/or Snowflake, including parameterized and metadata-driven designs
  • 3+ years of experience with the Snowflake database, including building data ingestion and extraction through stored procedures and leveraging Snowflake native utilities (see the ingestion sketch after this list)
  • 3+ years of experience with Git/GitHub version control, including code merges and cross-environment deployments using CI/CD pipelines
  • 2+ years of scripting experience (Batch, PowerShell, or similar) supporting automation and operational tasks, specifically on the Azure platform
  • Experience processing both structured and semi-structured data (e.g., fixed-width files, delimited files, JSON, XML); the sketch after this list includes a JSON example
  • Demonstrated working knowledge of the NCPDP vF6, HL7, and FHIR healthcare standards, including ingestion and transformation of these formats
  • Reside in the Little Rock, AR area and be willing to work a hybrid schedule, in the office at least three days per week
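
To make the requirements above concrete, here is a minimal sketch of a parameterized, metadata-driven Snowflake ingestion pattern: a stored procedure the orchestrator (for example, an Azure Data Factory pipeline) calls with a file pattern, a COPY INTO transform for delimited files, and a VARIANT/FLATTEN query for JSON. All object names (CLAIMS_STAGE, RAW_CLAIMS, LOAD_DELIMITED_FILE, RAW_JSON) are hypothetical illustrations, not objects from the actual DSS environment.

    -- Minimal sketch, assuming an internal Snowflake stage; all names are hypothetical.
    CREATE OR REPLACE STAGE CLAIMS_STAGE;

    CREATE OR REPLACE FILE FORMAT FF_PIPE_DELIMITED
      TYPE = CSV
      FIELD_DELIMITER = '|'
      SKIP_HEADER = 1
      NULL_IF = ('', 'NULL');

    CREATE OR REPLACE TABLE RAW_CLAIMS (
      CLAIM_ID     STRING,
      MEMBER_ID    STRING,
      SERVICE_DATE DATE,
      PAID_AMOUNT  NUMBER(12,2),
      LOAD_FILE    STRING,
      LOAD_TS      TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
    );

    -- Parameterized loader: the orchestrator passes a file pattern,
    -- so one procedure can serve many inbound feeds.
    CREATE OR REPLACE PROCEDURE LOAD_DELIMITED_FILE(FILE_PATTERN VARCHAR)
    RETURNS VARCHAR
    LANGUAGE SQL
    AS
    $$
    DECLARE
      COPY_STMT VARCHAR;
    BEGIN
      COPY_STMT :=
        'COPY INTO RAW_CLAIMS (CLAIM_ID, MEMBER_ID, SERVICE_DATE, PAID_AMOUNT, LOAD_FILE)'
        || ' FROM (SELECT t.$1, t.$2, t.$3::DATE, t.$4::NUMBER(12,2), METADATA$FILENAME'
        || '       FROM @CLAIMS_STAGE t)'
        || ' PATTERN = ''' || FILE_PATTERN || ''''
        || ' FILE_FORMAT = (FORMAT_NAME = FF_PIPE_DELIMITED)'
        || ' ON_ERROR = ''ABORT_STATEMENT''';
      EXECUTE IMMEDIATE COPY_STMT;
      RETURN 'Loaded files matching ' || FILE_PATTERN;
    END;
    $$;

    CALL LOAD_DELIMITED_FILE('.*claims_.*[.]txt');

    -- Semi-structured variant: land JSON in a VARIANT column, then flatten it.
    CREATE OR REPLACE TABLE RAW_JSON (DOC VARIANT);

    SELECT d.DOC:claimId::STRING AS CLAIM_ID,
           f.VALUE:code::STRING  AS DIAGNOSIS_CODE
    FROM RAW_JSON d,
         LATERAL FLATTEN(INPUT => d.DOC:diagnoses) f;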

Nice To Haves

  • Azure or Snowflake certifications related to data engineering (e.g., Azure Data Engineer Associate)
  • 3+ years of data modeling experience in an enterprise data warehouse environment (dimensional, normalized, or hybrid models); see the star-schema sketch after this list
  • 2+ years of experience developing and maintaining GoAnywhere MFT workflows for inbound and outbound transfer processes
  • 2+ years of Python experience, particularly for data processing, orchestration, or validation workflows
  • 2+ years of experience working with Microsoft Purview / Macula Automate, building lineage from ADF and Snowflake code/objects
  • Experience designing or consuming APIs and service-based integrations (REST, RPC) within Snowflake or ADF-based workflows
  • Experience with Informatica Intelligent Cloud Services (IICS), particularly Data Integration
  • Experience supporting State Medicaid, Medicare, or other healthcare data systems, including regulatory or compliance-driven reporting
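
As a companion to the data modeling item above, here is a minimal star-schema sketch in Snowflake SQL: a Type 2 member dimension plus a claim fact keyed by surrogate key. DIM_MEMBER and FACT_CLAIM are hypothetical illustration names, not part of the actual DSS model.

    -- Minimal dimensional (star-schema) sketch; all names are hypothetical.
    CREATE OR REPLACE TABLE DIM_MEMBER (
      MEMBER_SK  NUMBER IDENTITY,        -- surrogate key
      MEMBER_ID  STRING NOT NULL,        -- natural/business key
      DOB        DATE,
      EFF_FROM   DATE,
      EFF_TO     DATE,                   -- SCD Type 2 validity window
      IS_CURRENT BOOLEAN DEFAULT TRUE
    );

    CREATE OR REPLACE TABLE FACT_CLAIM (
      CLAIM_ID     STRING NOT NULL,
      MEMBER_SK    NUMBER NOT NULL,      -- references DIM_MEMBER.MEMBER_SK
      SERVICE_DATE DATE,
      PAID_AMOUNT  NUMBER(12,2)
    );

    -- Typical consumption query: facts joined to current dimension rows.
    SELECT d.MEMBER_ID, SUM(f.PAID_AMOUNT) AS TOTAL_PAID
    FROM FACT_CLAIM f
    JOIN DIM_MEMBER d
      ON d.MEMBER_SK = f.MEMBER_SK AND d.IS_CURRENT
    GROUP BY d.MEMBER_ID;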

Responsibilities

  • Collaborate closely with product owners, business users, and stakeholders to translate data product and reporting requirements into scalable, reliable data engineering solutions, incorporating feedback iteratively throughout the delivery lifecycle
  • Design, develop, and maintain end-to-end data pipelines across staging, integration, and consumption layers, leveraging Azure Data Factory, Snowflake SQL, and metadata-driven frameworks
  • Develop and optimize high-performance SQL queries, views, and stored procedures to support analytics, reporting, and outbound data extracts, and provide guidance to analytics team members on effective and cost-efficient usage
  • Architect, develop, and support inbound and outbound data integration processes, including batch file ingestion, automated extracts, and scheduled deliveries to internal consumers, state partners, and external agencies
  • Develop and maintain GoAnywhere MFT workflows and monitors for inbound and outbound file transfers
  • Partner with internal stakeholders to understand business objectives, regulatory needs, and study requirements, and identify appropriate technical solutions, integration patterns, and data models that balance performance, scalability, and maintainability
  • Serve as a technical subject matter expert and mentor, providing training, design guidance, and code reviews for associates on data engineering best practices, Snowflake optimization, and ADF pipeline design
  • Author and maintain detailed technical documentation, including data mappings, record layouts, file specifications, interface contracts, workflow diagrams, and operational runbooks for data loads and extracts
  • Continuously evaluate and improve data quality, reconciliation controls, monitoring, and operational efficiencies across analytic and DSS data processes (see the reconciliation sketch after this list)
  • Establish and maintain partnerships with customers, data senders, and data consumers, acting as a subject matter expert for healthcare data domains
  • Collaborate with business and technical partners to support data acquisition initiatives, including onboarding new data sources, developing business justifications, and executing secure, compliant data transfers
  • Communicate complex technical concepts clearly through written documentation, presentations, and discussions with audiences ranging from technical teams to senior leadership
  • Foster a collaborative, inclusive environment that promotes knowledge sharing, continuous improvement, and professional development across the team
  • Demonstrate full appreciation of and compliance with regulatory, security, and data governance obligations, including HIPAA, state/federal Medicaid requirements, and controls related to external data sharing and file exchange
  • Work independently, manage multiple priorities, and adapt to evolving business and technology requirements
  • Apply strong analytical and troubleshooting skills to diagnose pipeline failures, data anomalies, and performance bottlenecks
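
As one concrete shape the reconciliation controls mentioned above can take, the query below compares per-file row counts and paid-amount totals between a staging table and an integration table, returning only the loads that disagree. RAW_CLAIMS and INT_CLAIMS are hypothetical names carried over from the earlier ingestion sketch.

    -- Minimal reconciliation sketch; RAW_CLAIMS / INT_CLAIMS are hypothetical.
    SELECT
      s.LOAD_FILE,
      s.ROW_CNT                                AS STAGING_ROWS,
      COALESCE(i.ROW_CNT, 0)                   AS INTEGRATED_ROWS,
      s.ROW_CNT - COALESCE(i.ROW_CNT, 0)       AS ROW_DELTA,
      s.PAID_TOTAL - COALESCE(i.PAID_TOTAL, 0) AS AMOUNT_DELTA
    FROM (SELECT LOAD_FILE, COUNT(*) AS ROW_CNT, SUM(PAID_AMOUNT) AS PAID_TOTAL
          FROM RAW_CLAIMS GROUP BY LOAD_FILE) s
    LEFT JOIN (SELECT LOAD_FILE, COUNT(*) AS ROW_CNT, SUM(PAID_AMOUNT) AS PAID_TOTAL
               FROM INT_CLAIMS GROUP BY LOAD_FILE) i
      ON i.LOAD_FILE = s.LOAD_FILE
    WHERE s.ROW_CNT <> COALESCE(i.ROW_CNT, 0)
       OR s.PAID_TOTAL <> COALESCE(i.PAID_TOTAL, 0);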

Benefits

  • A comprehensive benefits package
  • Incentive and recognition programs
  • An equity stock purchase plan
  • 401(k) contributions