Senior Data Engineer

UnitedHealth Group
Eden Prairie, MN (Remote)

About The Position

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Combine two of the fastest-growing fields on the planet with a culture of performance, collaboration, and opportunity, and this is what you get: leading-edge technology in an industry that's improving the lives of millions. Here, innovation isn't about another gadget; it's about making health care data available wherever and whenever people need it, safely and reliably. There's no room for error. Join us and start doing your life's best work.(sm)

Functions may include database architecture, engineering, design, optimization, security, and administration, as well as data modeling, big data development, Extract, Transform, and Load (ETL) development, storage engineering, data warehousing, data provisioning, and other similar roles. Responsibilities may include Platform-as-a-Service and cloud solutions with a focus on data stores and their associated ecosystems. Duties may include management of design services, providing sizing and configuration assistance, ensuring strict data quality, and performing needs assessments. This role analyzes current business practices, processes, and procedures, and identifies future business opportunities for leveraging data storage and retrieval system capabilities. It also manages relationships with software and hardware vendors to understand the potential architectural impact of different vendor strategies and data acquisition.
The role may design schemas, write SQL or other data scripting, and help support the development of analytics and applications that build on top of the data. It also selects, develops, and evaluates personnel to ensure the efficient operation of the function.

The Care Data Platform (CDP) is a Microsoft Azure commercial cloud-based data platform. We are building the technology and data platform to enable OptumCare to pursue complementary strategies of developing local-market-focused, value-based care arrangements and driving the quadruple aim at a national level. The CDP enables Care Delivery Organizations (CDOs) to easily manage their own data while also providing a mechanism for enterprise-wide aggregation and normalization of data. Our team owns and operates one of the largest data ingestion pipelines feeding the CDP. We build and maintain automated data flows that move, reconcile, and validate data from Care Delivery Organizations into the national Snowflake instance.

You'll enjoy the flexibility to telecommute from anywhere within the United States as you take on some tough challenges. You'll be rewarded and recognized for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role, as well as provide development for other roles you may be interested in.

Requirements

  • 5+ years of experience designing, coding and supporting distributed data intensive systems at scale
  • 5+ years of experience with relational database delivery (ETL, ELT)
  • 5+ years of experience working within the Software Development Life Cycle (SDLC)
  • 3+ years of experience with cloud-based data platforms (Google Cloud, Azure, or AWS)
  • 3+ years of experience in Agile Delivery
  • 3+ years of experience working with relational databases (Oracle, SQL Server, MySQL, etc.)

Nice To Haves

  • Undergraduate degree
  • Experience in DevOps Automation tools (Oozie, Python, etc.)
  • Experience in delivering Data Platforms
  • Hands-on experience with fully automated testing frameworks (unit and integration): Cucumber, Spock, Go's testing package, JUnit
  • Hands-on experience with big data and streaming frameworks: Kafka, Hadoop, Hive, Spark, HDFS
  • Hands-on experience with PaaS offerings such as Kubernetes and OpenShift
  • Hands-on experience with CI/CD and monitoring platforms, namely GitHub, Jenkins, Grafana, and Prometheus
  • Excellent communication skills, with the ability to describe data and capability stories and explain their value to customers

Responsibilities

  • Maintain, improve, automate, and evaluate data pipelines
  • Improve data efficiency, reliability, and quality
  • Build high-performance data pipelines
  • Ensure data integrity
  • Create and manage data stores at scale
  • Ensure data governance: security, quality, access, and compliance

Benefits

  • A comprehensive benefits package
  • Incentive and recognition programs
  • Equity stock purchase plan
  • 401(k) contribution