Software Engineer - Data Processing / Data Integration

UnitedHealth Group
Tampa, FL (Remote)

About The Position

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Build the future of cloud-native software. Join us as a Software Engineer, where you'll design and deliver scalable microservices using Java, Spring Boot, Kafka, and AWS. You'll own features end to end, collaborate with cross-functional teams, and use AI-assisted development to deliver high-quality, reliable systems faster.

You'll enjoy the flexibility to work remotely from anywhere within the U.S. as you take on some tough challenges. For all hires in the Minneapolis or Washington, D.C. area, you will be required to work in the office a minimum of four days per week.

Requirements

  • Bachelor's degree in CS or IT related field
  • 5+ years of experience with Azure data processing services such as Azure Data Factory and Azure Databricks
  • 5+ years of experience building data pipelines using Azure Data Factory (ADF)
  • 5+ years of experience writing SQL, including complex queries
  • 5+ years of experience with programming languages such as Python and PySpark
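To illustrate the kind of SQL and Python skills the list above calls for, here is a minimal sketch of a complex query (a CTE feeding an aggregate subquery) run from Python. The `claims` table and its columns are hypothetical, invented only for this example; they are not from the posting.

```python
import sqlite3

# Hypothetical claims table; names and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (member_id TEXT, service_date TEXT, amount REAL);
    INSERT INTO claims VALUES
        ('M1', '2024-01-05', 120.0),
        ('M1', '2024-02-10', 80.0),
        ('M2', '2024-01-20', 300.0),
        ('M2', '2024-03-02', 50.0),
        ('M3', '2024-02-15', 40.0);
""")

# A CTE aggregates spend per member; the outer query keeps only members
# whose total exceeds the average total across all members.
query = """
    WITH member_totals AS (
        SELECT member_id, SUM(amount) AS total
        FROM claims
        GROUP BY member_id
    )
    SELECT member_id, total
    FROM member_totals
    WHERE total > (SELECT AVG(total) FROM member_totals)
    ORDER BY total DESC
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('M2', 350.0), ('M1', 200.0)]
```

The same pattern (staging aggregates in a CTE, then filtering against a derived statistic) carries over directly to Databricks SQL and PySpark.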

Nice To Haves

  • Ability to learn and adapt to new data technologies
  • Knowledge/Experience with containerization – Docker, Kubernetes
  • Knowledge/Experience with the Big Data/Hadoop ecosystem – Spark, Hive, HBase, Sqoop, etc.
  • Build/deployment automation – Jenkins
  • Knowledge/Experience with Microsoft Visio and PowerPoint
  • Ability to collaborate with the team, architects, and product stakeholders to understand the scope and design of a deliverable
  • Willingness to participate in product support activities as needed by the team
  • Ability to understand the product architecture and the features being built, and to propose product improvement ideas and POCs
  • Ability to work as an individual contributor in Data Engineering – data pipelines, data modeling, and data warehousing
  • All Telecommuters will be required to adhere to UnitedHealth Group’s Telecommuter Policy.

Responsibilities

  • Data Integration: Integrate data from multiple sources and systems, including databases, APIs, log files, streaming platforms, and external data providers. Handle data ingestion, transformation, and consolidation to create a unified and reliable data foundation for analysis and reporting
  • Data Transformation and Processing: Develop data transformation routines to clean, normalize, and aggregate data. Apply data processing techniques to work with complex data structures, resolve missing or inconsistent data, and prepare the data for analysis, reporting, or machine learning tasks
  • Contribute to common frameworks and best practices in code development, deployment, and automation/orchestration of data pipelines
  • Implement data governance and data de-identification framework in line with company standards
  • Partner with Data Analytics and Product leaders to design best practices and standards for developing and productionizing analytic pipelines
  • Partner with Infrastructure leaders on architecture approaches to advance the data and analytics platform, including exploring new tools and techniques that leverage the cloud environment (Azure, Snowflake, others)
  • Monitoring and Support: Monitor data pipelines and data systems to detect and resolve issues promptly. Develop monitoring tools, alerts, and automated error handling mechanisms to ensure data integrity and system reliability
  • Design, develop, and deploy AI-powered solutions to address complex business challenges with emphasis on responsible use of AI
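The "Data Transformation and Processing" responsibility above can be sketched in plain Python: normalize inconsistent field values, substitute a default for missing data, and aggregate per key. The record layout and field names here are hypothetical, chosen only to make the cleaning steps concrete; in practice this logic would live in a PySpark or ADF pipeline.

```python
from collections import defaultdict

# Hypothetical raw rows, standing in for data ingested from multiple
# sources with inconsistent casing, whitespace, and missing values.
raw_records = [
    {"member": " m1 ", "plan": "GOLD", "charge": "120.50"},
    {"member": "M1", "plan": "gold", "charge": None},
    {"member": "m2", "plan": "Silver", "charge": "300"},
]

def clean(record):
    """Normalize casing/whitespace and coerce charge to float (0.0 if missing)."""
    return {
        "member": record["member"].strip().upper(),
        "plan": record["plan"].strip().lower(),
        "charge": float(record["charge"]) if record["charge"] else 0.0,
    }

def aggregate(records):
    """Sum charges per (member, plan) pair after cleaning each record."""
    totals = defaultdict(float)
    for rec in map(clean, records):
        totals[(rec["member"], rec["plan"])] += rec["charge"]
    return dict(totals)

print(aggregate(raw_records))
# {('M1', 'gold'): 120.5, ('M2', 'silver'): 300.0}
```

Note how the two `M1` rows collapse into one key once casing and whitespace are normalized; this is the consolidation step that makes downstream aggregates trustworthy.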