Senior Software Engineer

UnitedHealth Group
Richardson, TX

About The Position

Optum Insight is improving the flow of health data and information to create a more connected system. We remove friction and drive alignment among care providers, payers, and ultimately consumers. Our deep industry expertise and innovative technology empower us to help organizations reduce costs while improving risk management, quality, and revenue growth. Ready to help us deliver results that improve lives? Join us to start Caring. Connecting. Growing together.

Our team delivers advanced analytics application development and support for Optum Payment Integrity, enabling data-driven identification of overpayments and recovery opportunities across Data Mining, Coordination of Benefits (COB), Fraud, Waste, Abuse & Error (FWAE), and Subrogation. In this role, you will design and build scalable analytics applications, enhance existing solutions, and support production systems using modern big data and cloud technologies. You will work across the full development lifecycle, partnering with business and technical teams to translate complex requirements into high-quality, production-ready solutions.

Requirements

  • Bachelor’s degree
  • 7+ years of IT work experience
  • 5+ years of DevOps experience in an Azure Cloud environment, including Azure Data Factory (ADF) and Databricks
  • 3+ years of experience developing with Python, PySpark, Spark, and Scala for big data processing
  • 1+ years of front-end development experience with Node.js or React
  • 1+ years of experience with MySQL and Snowflake
  • 1+ years of experience with VS Code and GitHub Copilot for AI-assisted development
  • Experience with Agile/Scrum methodology

Nice To Haves

  • Knowledge of healthcare compliance standards (HIPAA)
  • Familiarity with CI/CD pipelines and cloud deployment strategies
  • Proven exposure to AI/ML best practices for data engineering and automation
  • Proven experience unit testing data pipelines with PyTest
  • Proven Databricks expertise: cluster management, job optimization, and Delta Lake
  • Proven performance tuning for Spark jobs and distributed data pipelines
  • Local to Richardson, TX or Eden Prairie, MN

Responsibilities

  • Design, develop, test, and maintain scalable big data and cloud-based analytics solutions using Python, PySpark, Spark, and Scala
  • Build and enhance analytics rules engines that support payment integrity use cases
  • Develop and maintain applications and services on Azure, including Databricks and Azure Data Factory
  • Design and implement APIs to integrate with enterprise data sources
  • Identify gaps, risks, and optimization opportunities in existing solutions and data pipelines
  • Research, evaluate, and apply new tools, frameworks, and patterns to strengthen platform sustainability
  • Collaborate with product, architecture, QA, and engineering teams throughout the SDLC
  • Produce technical documentation and participate in production support activities
  • Create prototypes, proof of concepts, and participate in design and code reviews
  • Provide input on technical architecture and contribute to accurate development effort estimates
  • Continuously expand technical skills through self-directed learning

Benefits

  • A comprehensive benefits package
  • Incentive and recognition programs
  • Equity stock purchase plan
  • 401(k) contributions
© 2024 Teal Labs, Inc