Principal Data Engineer - Remote or Hybrid in MN and DC

UnitedHealth Group
Eden Prairie, MN
Hybrid

About The Position

Optum Tech is a global leader in health care innovation. Our teams develop cutting-edge solutions that help people live healthier lives and make the health system work better for everyone. From advanced data analytics and AI to cybersecurity, we use innovative approaches to solve some of health care’s most complex challenges. Your contributions here have the potential to change lives. Ready to build the next breakthrough? Join us to start Caring. Connecting. Growing together.

The CareData Platform (CDP) is a Microsoft Azure commercial cloud-based data platform. We are building the technology and data platform to enable OptumCare to pursue complementary strategies: developing local-market-focused value-based care arrangements and driving the quadruple aim at a national level. The CDP enables Care Delivery Organizations (CDOs) to easily manage their own data while also providing a mechanism for enterprise-wide aggregation and normalization of data.

Our team owns and operates one of the largest data ingestion pipelines feeding the CDP. We build and maintain automated data flows that move, reconcile, and validate data from CDOs into the national Snowflake instance.

You’ll enjoy the flexibility to work remotely from anywhere within the U.S. as you take on some tough challenges. For all hires in the Minneapolis or Washington, D.C. area, you will be required to work in the office a minimum of four days per week.

You’ll be rewarded and recognized for your performance in an environment that will challenge you, give you clear direction on what it takes to succeed in your role, and provide development for other roles you may be interested in.

Requirements

  • Bachelor’s degree in computer science, technology, or related field
  • 8+ years of experience designing, coding, and supporting distributed, data-intensive systems at scale
  • 8+ years of experience with relational database delivery (ETL, ELT)
  • 5+ years of experience working within the Software Development Life Cycle (SDLC)
  • 5+ years of experience with cloud-based data platforms (Google Cloud, Azure, or AWS)
  • 5+ years of experience in Agile delivery
  • 5+ years of experience working with relational databases (Oracle, SQL Server, MySQL, etc.)

Nice To Haves

  • Experience with DevOps automation tools (Oozie, Python, etc.)
  • Experience delivering data platforms
  • Hands-on experience with big data and streaming frameworks such as Kafka, Hadoop, Hive, Spark, and HDFS
  • Hands-on experience with PaaS offerings such as Kubernetes and OpenShift
  • Hands-on experience with CI/CD and monitoring platforms such as GitHub, Jenkins, Grafana, and Prometheus
  • Proven communication skills, with the ability to describe data and capability stories and explain their value to customers

Responsibilities

  • Maintain, improve, automate, and evaluate data pipelines
  • Improve data efficiency, reliability and quality
  • Anticipate needs and proactively develop solutions to meet them
  • Provide explanations and information to others on the most complex issues
  • Motivate and inspire other team members
  • Review work performed by others and provide recommendations

Benefits

  • A comprehensive benefits package
  • Incentive and recognition programs
  • Equity stock purchase
  • 401(k) contribution