Senior Software Engineer - Remote

UnitedHealth Group - Eden Prairie, MN

About The Position

Optum Tech is a global leader in health care innovation. Our teams develop cutting-edge solutions that help people live healthier lives and help make the health system work better for everyone. From advanced data analytics and AI to cybersecurity, we use innovative approaches to solve some of health care’s most complex challenges. Your contributions here have the potential to change lives. Ready to build the next breakthrough? Join us to start Caring. Connecting. Growing together. You’ll enjoy the flexibility to work remotely from anywhere within the U.S. as you take on some tough challenges. For all hires in the Minneapolis or Washington, D.C. area, you will be required to work in the office a minimum of four days per week.

Requirements

  • 4+ years of software engineering experience
  • 3+ years of experience in Databricks and Apache Spark - Solid hands-on experience with Spark SQL, DataFrames, and Spark Streaming for large-scale data processing
  • 3+ years of experience in Python and SQL
  • 2+ years of experience with AWS, Azure, or GCP for deploying and managing Databricks environments
  • 2+ years of experience in ETL and Data Pipeline Development - Ability to design, build, and optimize batch and streaming pipelines using Databricks workflows and orchestration tools
  • 1+ years of experience with CI/CD and automation - DevOps practices, Databricks CLI, REST APIs, and Infrastructure as Code tools (Terraform)
  • 1+ years of experience with Delta Lake features such as ACID transactions, schema enforcement, and time travel (see the sketch after this list)
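
For context, the sketch below illustrates the Spark DataFrame, Spark SQL, and Delta Lake features named above (ACID appends, schema enforcement, time travel). It is a minimal example that assumes a Databricks runtime or a local delta-spark setup; the path and column names are illustrative only.

  # Minimal PySpark/Delta sketch; assumes Delta Lake is available in the session.
  from pyspark.sql import SparkSession, functions as F

  spark = SparkSession.builder.getOrCreate()  # predefined in Databricks notebooks
  events_path = "/tmp/delta/events"           # hypothetical storage location

  # DataFrame API: write a small batch as a Delta table (atomic, ACID append).
  batch = spark.createDataFrame([(1, "login"), (2, "purchase")], ["user_id", "event_type"])
  batch.write.format("delta").mode("append").save(events_path)

  # Schema enforcement: a mismatched append fails unless schema evolution is enabled.
  evolved = batch.withColumn("ingested_at", F.current_timestamp())
  evolved.write.format("delta").mode("append").option("mergeSchema", "true").save(events_path)

  # Spark SQL over the same Delta data.
  spark.read.format("delta").load(events_path).createOrReplaceTempView("events")
  spark.sql("SELECT event_type, COUNT(*) AS n FROM events GROUP BY event_type").show()

  # Time travel: read the table as it existed at an earlier version.
  v0 = spark.read.format("delta").option("versionAsOf", 0).load(events_path)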

Nice To Haves

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Technology, or a related field
  • Data Governance and Security - Familiarity with Unity Catalog, RBAC, and compliance best practices for secure data management (a grants sketch follows this list)
  • Foundational skills in ETL, Spark SQL, and Databricks workflows
  • Familiarity with Scala
  • Advanced proficiency in pipeline optimization, governance, and large-scale data architecture
  • Proven analytical and problem-solving skills; ability to troubleshoot performance issues and optimize resource utilization
  • Proven communication and collaboration; strong ability to work with cross-functional teams and explain technical concepts to non-technical stakeholders
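
As a point of reference for the Unity Catalog and RBAC item above, the following is a hedged sketch of table-level grants issued from a Databricks notebook (where spark is predefined); the catalog, schema, table, and group names are purely illustrative.

  # Illustrative Unity Catalog grants; runs only in a Unity Catalog-enabled workspace.
  spark.sql("GRANT USE CATALOG ON CATALOG main TO `data-analysts`")
  spark.sql("GRANT USE SCHEMA ON SCHEMA main.claims TO `data-analysts`")
  spark.sql("GRANT SELECT ON TABLE main.claims.member_events TO `data-analysts`")

  # Review what has been granted on the table.
  spark.sql("SHOW GRANTS ON TABLE main.claims.member_events").show()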

Responsibilities

  • Design and Develop Scalable Data Solutions - Architect and implement large-scale data pipelines and platforms using Databricks, Spark (PySpark/Scala), and Delta Lake for batch and streaming data processing
  • Optimize Data Workflows - Continuously improve ETL/ELT pipelines for performance, reliability, and cost efficiency, leveraging Databricks features such as autoscaling and Delta Live Tables
  • Use Databricks Unity Catalog across multiple Databricks or Snowflake instances, applying access controls, asset bundles, and governance protocols
  • Integrate Databricks with cloud services (AWS, Azure, GCP) and optimize resource utilization for cost-effective operations
  • Ensure Data Quality and Reliability - Monitor pipeline performance, troubleshoot issues, and maintain high data integrity across ingestion, transformation, and storage layers
  • Lead Technical Direction - Drive architecture decisions, enforce coding standards, and promote best practices for Databricks usage, including notebook development and cluster management
  • Mentor and Guide Team Members - Provide technical leadership and mentorship to junior engineers, conduct code reviews, and foster a culture of continuous improvement
  • Cross-Functional Collaboration - Work closely with data scientists, analysts, and business stakeholders to align data solutions with organizational goals and analytics needs
  • Implement CI/CD and Automation - Use Infrastructure as Code (Terraform/Pulumi) and CI/CD pipelines for automated deployments and operational efficiency
  • Monitor and Optimize Costs - Track DBU consumption, cluster utilization, and storage costs; implement strategies such as auto-termination and right-sizing to reduce expenses (a cluster-configuration sketch follows this list)
  • Document data definitions, transformations, and processes to support governance and audit requirements
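
To ground the cost-optimization and REST API items above, here is a hedged sketch of creating a right-sized, auto-terminating cluster through the Databricks Clusters REST API (POST /api/2.0/clusters/create). The workspace host, token, node type, and runtime version are placeholders to adjust per cloud and workspace.

  # Hedged sketch: autoscaling and auto-termination keep idle DBU spend down.
  import os
  import requests

  host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
  token = os.environ["DATABRICKS_TOKEN"]  # personal access token or service principal token

  cluster_spec = {
      "cluster_name": "etl-nightly",               # hypothetical name
      "spark_version": "14.3.x-scala2.12",         # example runtime; pick a current LTS
      "node_type_id": "i3.xlarge",                 # AWS example; differs on Azure/GCP
      "autoscale": {"min_workers": 1, "max_workers": 4},  # right-sizing via autoscaling
      "autotermination_minutes": 30,               # shut down idle clusters automatically
  }

  resp = requests.post(
      f"{host}/api/2.0/clusters/create",
      headers={"Authorization": f"Bearer {token}"},
      json=cluster_spec,
      timeout=30,
  )
  resp.raise_for_status()
  print(resp.json()["cluster_id"])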

Benefits

  • In addition to your salary, we offer a comprehensive benefits package, incentive and recognition programs, an equity stock purchase plan, and 401(k) contributions (all benefits are subject to eligibility requirements).
  • No matter where or when you begin a career with us, you’ll find a far-reaching choice of benefits and incentives.