Data Integration Engineer

Martin's Point Health Care | Portland, ME

About The Position

Join Martin's Point Health Care - an innovative, not-for-profit health care organization offering care and coverage to the people of Maine and beyond. As a joined force of "people caring for people," Martin's Point employees are on a mission to transform our health care system while creating a healthier community. Martin's Point employees enjoy an organizational culture of trust and respect, where our values - taking care of ourselves and others, continuous learning, helping each other, and having fun - are brought to life every day. Join us and find out for yourself why Martin's Point has been certified as a "Great Place to Work" since 2015.

Position Summary

The Data Engineer is responsible for building and maintaining data pipelines and infrastructure to support organizational requests. This role will work closely with the other data professionals within IT Data & Analytics (data architects, data analysts) as well as business users to ensure data is reliable and accurate. The Data Engineer will implement data pipeline integrations while adhering to security, quality, and data transformation standards.

Requirements

  • Bachelor’s degree in Computer Science, Data Engineering or related field.
  • 3+ years in data engineering or related ETL role required, 5+ years preferred.
  • Expertise in data pipeline tools required; SSIS/Talend experience preferred.
  • Knowledge of ETL pipeline industry standards and best practices.
  • Proficiency in SQL and ETL transformation/performance tuning.
  • Strong problem-solving and troubleshooting skills.
  • Ability to work on multiple projects/tasks concurrently.

Nice To Haves

  • Master’s degree in related field preferred.
  • Healthcare industry experience with claims processing and/or Clinical Data Repositories preferred.
  • REST APIs, FHIR, Java experience preferred.

Responsibilities

  • Designs and maintains ETL pipelines for ingesting and transforming data into database storage, adhering to specification files.
  • Integrates data from multiple sources to create single-source data sets.
  • Optimizes data pipelines for performance and scalability with a focus on cost savings.
  • Implements quality control processes around pipelines to ensure accuracy, reliability, and consistency, with a focus on data validation.
  • Documents all data pipelines to promote knowledge sharing across teams and with business owners.
  • Troubleshoots pipeline failures, missing data files, and data quality issues.