UnitedHealth Group • Posted 4 months ago
$89,900 - $160,600/Yr
Full-time • Senior
Remote • Eden Prairie, MN
Insurance Carriers and Related Activities

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

A successful candidate for this role will be a Snowflake-certified engineer with extensive experience coding in Python with Airflow. You will have deep knowledge of and hands-on experience coding in Python, and you will have experience working in a data products environment. The ideal candidate is skilled in building and supporting complex data pipelines and ETL/ELT processes, and has demonstrated experience supporting industry-standard data governance and data quality processes. Demonstrated ability to perform data modeling and to construct and support data models is also essential, as is the ability to performance-tune and optimize data pipelines. You'll enjoy the flexibility to work remotely from anywhere within the U.S. as you take on some tough challenges.

Primary Responsibilities:
  • Support the Information Security organization's Data Enablement team in defining, designing, constructing, processing, and supporting data pipelines, data products, and data assets.
  • Mature Snowflake processes and implementations, integrating with various cloud service provider (CSP) data storage offerings.
  • Integrate Airflow orchestration and scheduling with various internal CI/CD and DevOps packages.
  • Advance ongoing data strategy and solutions with a focus on Medallion data architecture.
  • Work with other engineers, analysts, and clients to understand business problems, design solutions, and support the resulting products.
Required Qualifications:
  • Undergraduate degree in Computer Science, Data Science, Data Analytics, Mathematics, Information Systems, or a related field with an emphasis on coding, analytics, or applied mathematics.
  • Current SnowPro Data Engineer certification.
  • 5+ years of hands-on experience coding complex data pipelines with Python, PySpark, Scala, SQL, and related languages.
  • 4+ years of experience in data engineering on Snowflake with Airflow.
  • 3+ years of experience working with and integrating pipelines and tech stack components with database products such as PostgreSQL, MySQL, Microsoft SQL Server, Oracle, MongoDB, and Cassandra.
  • Experience working with and operating on one of the following cloud storage technologies: AWS, GCP, or Azure.
  • Experience operating in an environment with modern data governance processes and protocols.
  • Experience building data quality monitoring solutions.
  • Experience working with modern CI/CD and DevOps/DataOps principles, including automated deployment.
  • Demonstrated problem-solving skills.
  • Demonstrated solid communication skills, both verbal and written.
Preferred Qualifications:
  • Advanced degree in computer science, math, analytics, data science, or a similarly technical field.
  • Experience with security data and information security.
  • Experience with healthcare data.
  • Experience implementing data privacy controls for a variety of data types, from HIPAA to PII to PCI.
  • Experience with streaming technologies such as Kafka or Kinesis.
  • Experience building and ingesting data from APIs.
Benefits:
  • Comprehensive benefits package.
  • Incentive and recognition programs.
  • Equity stock purchase.
  • 401(k) contribution.