UnitedHealth Group · posted 7 months ago
$89,800 - $176,700/Yr
Full-time • Mid Level
Remote • San Antonio, TX
Insurance Carriers and Related Activities

Opportunities at WellMed, part of the Optum family of businesses. We believe all patients are entitled to the highest level of medical care. Here, you will join a team that shares your passion for helping people achieve better health. With opportunities for physicians, clinical staff and non-patient-facing roles, you can make a difference with us as you discover the meaning behind Caring. Connecting. Growing together.

We want to achieve more in our mission of health care, so we have to be really smart about the business of health care. At Optum, we're changing the landscape of our industry. You will put your Data Engineering skills to work as you empower business partners and team members to improve healthcare delivery. You will be part of a newly formed team in our Dublin office focused on developing a new, cutting-edge big-data analytics platform that will support data pipelines that ultimately improve health outcomes for our members. You will be responsible for integrating multiple complex data sources into our data platform across a mix of platforms (e.g., Kubernetes, Hadoop) and various data applications (e.g., Spark, Hive, HBase, Airflow). You will also work closely with the Data Science and Business Intelligence teams to develop advanced solutions on that platform that improve health outcomes for our members. You'll be in the driver's seat on vital projects that have strategic importance to our mission of helping people live healthier lives. Yes, we share a mission that inspires. And we need your organizational talents and business discipline to help fuel that mission. You'll enjoy the flexibility to work remotely from anywhere within the U.S. as you take on some tough challenges.

  • Design and build data pipelines (mostly in Spark) to process terabytes of data
  • Orchestrate data tasks in Airflow to run on Kubernetes/Hadoop for the ingestion, processing, and cleaning of data
  • Design and build best-in-class processes to clean and standardize data
  • Troubleshoot production issues in our Elastic environment
  • Tune and optimize data processes
  • Model large-volume datasets to maximize performance for our Business Intelligence and Data Science teams
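To make the "clean and standardize data" responsibility above concrete, here is a minimal, dependency-free sketch of the kind of record-cleaning step such a pipeline performs. In the actual role this logic would run inside a Spark job; the field names, state map, and date formats here are invented for illustration only.

```python
# Hypothetical sketch of a clean-and-standardize step (not the actual
# WellMed/Optum pipeline). Field names and formats are assumptions.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class MemberRecord:
    member_id: str
    state: str                 # normalized to a two-letter code
    visit_date: Optional[str]  # normalized to ISO 8601 (YYYY-MM-DD)

# Tiny sample map of source spellings to canonical state codes.
STATE_CODES = {"texas": "TX", "tx": "TX", "ohio": "OH", "oh": "OH"}

def clean_record(raw: dict) -> Optional[MemberRecord]:
    """Drop rows missing an ID; normalize state names and date formats."""
    member_id = (raw.get("member_id") or "").strip()
    if not member_id:
        return None  # unusable row: no stable key to join on downstream

    state = STATE_CODES.get((raw.get("state") or "").strip().lower(), "UNK")

    visit_date = None
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):  # accept a couple of source formats
        try:
            visit_date = datetime.strptime(
                raw.get("visit_date", ""), fmt).date().isoformat()
            break
        except ValueError:
            continue

    return MemberRecord(member_id, state, visit_date)

rows = [
    {"member_id": " A123 ", "state": "texas", "visit_date": "03/15/2024"},
    {"member_id": "", "state": "TX", "visit_date": "2024-01-02"},  # dropped
]
cleaned = [r for r in (clean_record(row) for row in rows) if r]
```

In a Spark pipeline the same per-record function would typically be applied partition-wise across terabytes of input, with Airflow scheduling the job.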
  • Bachelor's degree in Computer Science or 7+ years of relevant experience
  • 4+ years of experience working on projects using agile/scrum methodologies
  • 4+ years writing complex SQL queries
  • 4+ years building ETL/data pipelines
  • 2+ years developing processes in Spark
  • 1+ years of exposure to Kubernetes and Linux containers (e.g., Docker)
  • 1+ years of experience with related/complementary open-source software platforms and languages (e.g., Scala, Python, Java, Linux)
  • 2+ years of exposure to Airflow, Hive/HBase/Presto, Jenkins/Travis, Kafka, and cloud technologies (Amazon AWS or Microsoft Azure)
  • 2+ years of analytical and problem-solving experience applied to big-data environments and distributed processing
  • 1+ years of experience with relational (RDBMS) and non-relational databases
  • Healthcare / Pharmacy experience
  • Exposure to DevOps methodology
  • Knowledge on data warehousing principles, architecture and its implementation in large environments
  • Proven communication and interpersonal skills
  • Proven ability to work autonomously
  • Proven negotiation and influencing skills
  • Comprehensive benefits package
  • Incentive and recognition programs
  • Equity stock purchase
  • 401(k) contribution