About The Position

At CVS Health, we’re building a world of health around every consumer and surrounding ourselves with dedicated colleagues who are passionate about transforming health care. As the nation’s leading health solutions company, we reach millions of Americans through our local presence, digital channels and more than 300,000 purpose-driven colleagues – caring for people where, when and how they choose in a way that is uniquely more connected, more convenient and more compassionate. And we do it all with heart, each and every day.

Position Summary

Participates in the design, build, and management of, and is responsible for the successful delivery of, large-scale data structures, pipelines, and efficient Extract/Load/Transform (ETL) workflows. Acts as the data engineer for large and complex projects involving multiple resources and tasks, providing individual mentoring in support of company objectives. Applies understanding of key business drivers to accomplish own work. Uses expertise, judgment, and precedents to contribute to the resolution of moderately complex problems. Leads portions of initiatives of limited scope, with guidance and direction.

The role requires deep knowledge of the architecture, frameworks, and methodologies for working with and modelling large data sets, such as HDFS, YARN, Spark, Hive, edge nodes, and NoSQL databases, along with strong SQL skills and commensurate experience on a large database platform. The engineer performs code reviews to ensure code meets acceptance criteria, is familiar with data modelling and mapping and implements them in development, and collaborates with client teams to transform data and integrate algorithms and models into automated processes. Programming skills in Scala, Python, Java, or another major language are used to build robust data pipelines and dynamic systems; the role builds data marts and data models to support clients and other internal customers, and integrates data from a variety of sources while ensuring they adhere to data quality and accessibility standards. The role follows the complete SDLC process and Agile methodology (Scrum/SAFe). Google Cloud technology knowledge is a must; healthcare analytics knowledge would be beneficial.

Requirements

  • 5+ years of progressively complex related experience.
  • Strong problem-solving skills and critical thinking ability.
  • Strong collaboration and communication skills within and across teams.
  • Ability to leverage multiple tools and programming languages to analyze and manipulate data sets from disparate data sources.
  • Ability to understand complex systems and solve challenging analytical problems.
  • Experience with Bash shell scripts, UNIX utilities, and UNIX commands.
  • Knowledge of Scala, Java, Python, Hive, MySQL, NoSQL, or similar.
  • Knowledge of Hadoop architecture and HDFS commands, and experience designing and optimizing queries against data in the HDFS environment.
  • Google Cloud technology knowledge is a must.
  • Experience building data transformation and processing solutions.
  • Expert-level coding skills in SQL/PL-SQL and UNIX scripting languages are required.
  • Experience with source code control systems (Git) and CI/CD processes (Jenkins/TeamCity/Octopus).
  • Requires significant knowledge across multiple areas and applications, including significant healthcare business knowledge, with impact on numerous applications.
  • Bachelor's degree.

Nice To Haves

  • Healthcare analytics knowledge would be beneficial.
  • AWS knowledge (S3/Redshift/Lambda/Data Pipeline).
  • Working experience in Exasol/Redshift/Snowflake is a plus.
  • Scala is a plus.

Responsibilities

  • Participates in the design, build, and management of, and is responsible for the successful delivery of, large-scale data structures, pipelines, and efficient Extract/Load/Transform (ETL) workflows.
  • Acts as the data engineer for large and complex projects involving multiple resources and tasks, providing individual mentoring in support of company objectives.
  • Applies understanding of key business drivers to accomplish own work.
  • Uses expertise, judgment, and precedents to contribute to the resolution of moderately complex problems.
  • Leads portions of initiatives of limited scope, with guidance and direction.
  • Performs code reviews to ensure the code meets acceptance criteria.
  • Applies data modelling and mapping and implements them in development.
  • Collaborates with client team to transform data and integrate algorithms and models into automated processes.
  • Builds data marts and data models to support clients and other internal customers.
  • Integrates data from a variety of sources, assuring that they adhere to data quality and accessibility standards.
  • Follows the complete SDLC process and Agile methodology (Scrum/SAFe).

Benefits

  • Affordable medical plan options, a 401(k) plan (including matching company contributions), and an employee stock purchase plan.
  • No-cost programs for all colleagues including wellness screenings, tobacco cessation and weight management programs, confidential counseling and financial coaching.
  • Benefit solutions that address the different needs and preferences of our colleagues including paid time off, flexible work schedules, family leave, dependent care resources, colleague assistance programs, tuition assistance, retiree medical access and many other benefits depending on eligibility.

What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Number of Employees: 5,001-10,000 employees
