UnitedHealth Group • Posted 3 months ago
$110,200 - $188,800/Yr
Full-time • Mid Level
Remote • Eden Prairie, MN
5,001-10,000 employees
Insurance Carriers and Related Activities

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

The Event Data Engineer role focuses on creating and maintaining lab/production environments, building pipelines to ingest event data, and providing a secure environment for the data analysts working to understand that data. This role will work as part of the security team to build an investigative data platform. You'll enjoy the flexibility to work remotely from anywhere within the U.S. as you take on some tough challenges.

Primary Responsibilities:
  • Participate in incident investigations following a data event
  • Partner with the team to design and develop a scalable, high-performance data and reporting platform that serves our customers and stakeholders
  • Partner with cross-functional stakeholders to understand evolving data needs and define long-term technical solutions
  • Drive strategic initiatives around the use of AI solutions, data quality, observability, lineage, and governance
  • Introduce and evolve best practices in data modeling, orchestration, testing, and monitoring
  • Identify and champion investments in platform scalability, reusability, and operational efficiency
  • Collaborate with product and infrastructure teams to design data solutions that enable rapid experimentation and innovation
  • Build, maintain, and leverage parsing and analytic libraries
  • Build and maintain data ingestion pipelines and environments (see the sketch after this list)
  • Work comfortably under time-sensitive conditions while ensuring thoroughness
  • Maintain high ethical standards, objectivity, and confidentiality
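
As a rough illustration of the ingestion work described above, here is a minimal PySpark sketch of an event-ingestion job in the style of the Databricks/PySpark stack named in this posting. The source path, event schema, and target table are hypothetical placeholders, not details taken from the posting.

```python
# Minimal sketch of an event-ingestion pipeline (illustrative only).
# The landing path, schema, and target table below are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("event-ingestion-sketch").getOrCreate()

# Hypothetical schema for raw event records landing as JSON files.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("source_host", StringType()),
    StructField("event_time", TimestampType()),
    StructField("payload", StringType()),
])

# Read raw events, normalize a few fields, and write a curated table
# that analysts can query in a controlled environment.
raw = spark.read.schema(event_schema).json("/mnt/landing/events/")

curated = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_time"))
       .withColumn("source_host", F.lower(F.trim("source_host")))
)

(curated.write.mode("append")
    .partitionBy("event_date")
    .format("delta")
    .saveAsTable("security_lab.curated_events"))
```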

Qualifications:
  • Bachelor's degree or 5+ years of work experience
  • 5+ years of experience writing and deploying Python and/or Java code
  • 5+ years of experience with PySpark and Databricks
  • 5+ years of experience normalizing unstructured data
  • 4+ years of experience with DevOps and CI/CD tools such as GitHub Actions, Kubernetes, Docker, and Terraform
  • 2+ years of experience leveraging and deploying Generative AI use cases to production environments
  • Expertise in SQL and database fundamentals, with strong experience working with data lakes and warehouses (e.g., Snowflake, Databricks)
  • Experience designing and scaling ELT/ETL frameworks with orchestration tools such as Airflow or similar platforms (see the sketch after this list)
  • Cyber security certifications such as CISSP or Security+
  • 4+ years of experience with data visualization tools and libraries such as Plotly, Seaborn, and Chart.js
  • Experience with machine learning and predictive analytics
  • Familiarity with cloud platforms such as AWS, Azure, or Google Cloud
  • AI/ML experience building models for classifying and fingerprinting files
  • Experience with SIEMs such as Splunk, Elastic, or Sentinel
  • Experience delivering reports and analytics in a cyber security organization
  • Familiarity with eDiscovery platforms and working with legal counsel
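
The ELT/ETL bullet above names Airflow as an example orchestrator. The following is a minimal sketch, assuming a recent Airflow 2.x installation, of a daily two-step pipeline; the DAG id, task names, and callables are hypothetical placeholders, not part of the posting.

```python
# Minimal Airflow DAG sketch for a daily ELT-style pipeline (illustrative only).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events():
    # Placeholder: pull raw event data from a source system.
    print("extracting events")


def transform_events():
    # Placeholder: normalize the data and load curated tables.
    print("transforming events")


with DAG(
    dag_id="event_pipeline_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    transform = PythonOperator(task_id="transform_events", python_callable=transform_events)

    # Run the extract step, then the transform step, once per day.
    extract >> transform
```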

Benefits:
  • Comprehensive benefits package
  • Incentive and recognition programs
  • Equity stock purchase
  • 401(k) contribution