Data Engineer

CDC Foundation
$103,500 - $143,500 · Remote

About The Position

The Data Engineer will play a crucial role in advancing the CDC Foundation's mission by designing, building, and maintaining data infrastructure for a public health organization. This role is aligned to the Workforce Acceleration Initiative (WAI). WAI is a federally funded CDC Foundation program with the goal of helping the nation's public health agencies by providing them with the technology and data experts they need to accelerate their information system improvements.

Working within the New Hampshire Department of Health and Human Services, Division of Public Health, the Data Engineer will design, develop, and maintain Informatica ETL workflows for ingesting, transforming, and integrating environmental health data. Core duties include data modeling, validation, workflow automation, performance monitoring, and documentation. The Data Engineer will support cloud migration of legacy ETL processes and implement scalable, secure data integration practices. The Data Engineer will collaborate with system developers, analysts, and data science teams to ensure ETL outputs align with visualization and analytics needs, and work with technical stakeholders to design architecture for data generation, storage, processing, and analysis.

The Data Engineer will be hired by the CDC Foundation and assigned to the New Hampshire Department of Health and Human Services, Division of Public Health. This position is eligible for a fully remote work arrangement for U.S.-based candidates.

Requirements

  • Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
  • Minimum of 5 years of relevant professional experience.
  • Proficiency in programming languages commonly used in data engineering, such as Python, Java, Scala, or SQL. Candidates should be able to implement data automations within existing frameworks rather than writing one-off scripts.
  • Strong understanding of database systems, including relational databases (e.g., Oracle, Oracle Cloud, MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
  • Experience with engineering best practices such as source control, automated testing, continuous integration and deployment, and peer review.
  • Knowledge of data warehousing and modeling concepts and tools (e.g., data lakes, data hubs, and data marts).
  • Experience with cloud computing platforms.
  • Expertise in data modeling, ETL (extract, transform, load) processes, including Informatica, and data integration techniques.
  • Familiarity with agile development methodologies, software design patterns, and best practices.
  • Strong analytical thinking and problem-solving abilities.
  • Excellent verbal and written communication skills, including the ability to convey technical concepts to non-technical partners effectively.
  • Flexibility to adapt to evolving project requirements and priorities.
  • Outstanding interpersonal and teamwork skills, and the ability to develop productive working relationships with colleagues and partners.
  • Experience working in a virtual environment with remote partners and teams.
  • Familiarity with Salesforce system development and integration.
  • Proficiency in Microsoft Office.

Responsibilities

  • Create and manage the systems and pipelines that enable efficient and reliable flow of data, including ingestion, processing, and storage, with a focus on migrating existing ETL workflows from on‑premises or vendor‑hosted environments to a secure, State‑managed cloud infrastructure.
  • Collect data from various sources, transforming and cleaning it to ensure accuracy and consistency, and develop advanced ETL workflows to ingest, transform, and integrate environmental health data and other programmatic datasets.
  • Optimize data pipelines, infrastructure, and workflows for performance and scalability, establishing scalable, repeatable data pipelines that adapt to evolving program needs and modernization efforts.
  • Monitor data pipelines and systems for performance issues, errors, and anomalies, and implement solutions to address them, including automation that reduces manual intervention, minimizes errors, and accelerates data availability for real‑time analytics.
  • Implement security measures to protect sensitive information and enhance data quality and governance through validation, reconciliation, and metadata management tools.
  • Collaborate with data scientists, analysts, and other partners to understand their data needs and requirements, and to ensure that the data infrastructure supports the organizational goals, including enabling integration with Salesforce and other platforms to support visualization and predictive analytics.
  • Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs.
  • Implement and maintain ETL processes to ensure the accuracy, completeness, and consistency of data.
  • Design and manage data storage systems, including relational databases, NoSQL databases, and data warehouses.
  • Stay current on industry trends, best practices, and emerging technologies in data engineering, and incorporate them into the organization's data infrastructure.
  • Provide technical guidance to other staff.
  • Communicate effectively with partners at all levels of the organization to gather requirements, provide updates, and present findings.
  • Up to 10% domestic travel may be required.
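As a flavor of the validation and reconciliation work described in the responsibilities above, here is a minimal, hypothetical sketch of an ETL validation step in Python. The field names (site_id, sample_date, result_ppm) are illustrative only and do not come from the posting; production work at this level would typically live inside an Informatica workflow or a managed pipeline framework rather than a standalone script.

```python
# Hypothetical ETL validation sketch: ingest a CSV extract, validate and
# type-convert required fields, and split rows into clean records and a
# reject list for downstream reconciliation. Field names are illustrative.
import csv
import io
from datetime import datetime

REQUIRED = ("site_id", "sample_date", "result_ppm")

def validate_row(row):
    """Return (clean_row, error); error is None when the row passes."""
    for field in REQUIRED:
        if not row.get(field):
            return None, f"missing {field}"
    try:
        # Normalize types so downstream analytics see consistent data.
        row["sample_date"] = datetime.strptime(row["sample_date"], "%Y-%m-%d").date()
        row["result_ppm"] = float(row["result_ppm"])
    except ValueError as exc:
        return None, str(exc)
    return row, None

def run_pipeline(raw_csv):
    """Split a raw CSV extract into clean rows and rejects with reasons."""
    clean, rejects = [], []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        ok, err = validate_row(row)
        if err is None:
            clean.append(ok)
        else:
            rejects.append((row, err))
    return clean, rejects

sample = "site_id,sample_date,result_ppm\nNH-001,2024-05-01,0.8\nNH-002,,1.2\n"
clean, rejects = run_pipeline(sample)
```

Keeping rejected rows alongside the reason for rejection (rather than silently dropping them) is what makes later reconciliation and data-quality reporting possible.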