Data Engineer

CDC Foundation
Remote

About The Position

The Data Engineer will play a crucial role in advancing the CDC Foundation's mission by designing, building, and maintaining data infrastructure. This role is aligned to the Workforce Acceleration Initiative (WAI). WAI is a federally funded CDC Foundation program with the goal of helping the nation's public health agencies by providing them with the technology and data experts they need to accelerate their information system improvements.

Working with the Informatics and Analytics Department and the Great Plains Tribal Epidemiology Center (GPTEC), the public health authority subsidiary of the Great Plains Tribal Leaders Health Board (GPTLHB), the Data Engineer will deliver the architecture needed for data receipt, generation, storage, processing, analysis, and secure transfer to Tribal Leaders and trusted community members. The Data Engineer will collaborate with data content experts, analysts, data scientists, data modelers, warehouse architects, IT staff, and other organization staff to design and implement solutions and architectures that meet the needs of GPTEC.

GPTEC aims to advance a modern, comprehensive, user-friendly, cloud-based public health data infrastructure for its 18 tribal communities. Data obtained from multiple sources, such as tribal programs, state health departments, and the Indian Health Service, among others, is ingested, cleaned, partitioned, analyzed, and distributed to tribal public health departments to inform public health activities. Ensuring the successful construction of GPTEC's data infrastructure is the key goal of the Data Engineer.
The Data Engineer's activities support GPTEC's goal of fully exercising public health authority broadly among the State, Tribal, Local, and Territorial (STLT) ecosystem, and locally at the direction of its member tribes, for the practice of conducting public health investigations, carrying out interventions, providing surveillance and tracking services, and performing epidemiological analysis, visualizations, and reporting. The Data Engineer will be hired by the CDC Foundation, aligned to the Workforce Acceleration Initiative, and assigned to GPTEC. They will work closely with GPTEC and WAI staff to accomplish GPTEC goals. This position is eligible for a fully remote work arrangement for U.S.-based candidates.

Requirements

  • Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. Master's or PhD in a related field (e.g., MPH) preferred, but not required.
  • Minimum 5 years of relevant professional experience.
  • Proficiency in programming languages commonly used in data engineering, such as Python, Java, Scala, or SQL. Candidates should be able to implement data automations within existing frameworks rather than writing one-off scripts.
  • Experience with big data technologies and frameworks such as Hadoop, Spark, Kafka, and Flink.
  • Strong understanding of database systems, including relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
  • Experience with Microsoft Fabric preferred.
  • Experience with engineering best practices such as source control, automated testing, continuous integration and deployment, and peer review.
  • Knowledge of data warehousing concepts and tools.
  • Experience with cloud computing platforms.
  • Expertise in data modeling, ETL (Extract, Transform, Load) processes, and data integration techniques.
  • Familiarity with agile development methodologies, software design patterns, and best practices.
  • Strong analytical thinking and problem-solving abilities.
  • Excellent verbal and written communication skills, including the ability to convey technical concepts to non-technical partners effectively.
  • Flexibility to adapt to evolving project requirements and priorities.
  • Outstanding interpersonal and teamwork skills, and the ability to develop productive working relationships with colleagues and partners.
  • Experience working in a virtual environment with remote partners and teams.
  • Proficiency in Microsoft Office.

Responsibilities

  • Create and manage the systems and pipelines that enable efficient and reliable flow of data, including ingestion, processing, storage, and security.
  • Work with the Data Scientist to connect dashboards to data pipelines using the GPTEC data management system.
  • Collect or extract data from various sources, transforming and cleaning it to ensure accuracy and consistency, and load it into storage systems or data warehouses.
  • Optimize data pipelines, infrastructure, and workflows for performance and scalability.
  • Monitor data pipelines and systems for performance issues, errors, and anomalies, and implement solutions to address them.
  • Implement security measures to protect sensitive information.
  • Collaborate with data scientists, analysts, and other partners to understand their data needs and requirements, and to ensure that the data infrastructure supports the organization's goals and objectives.
  • Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs.
  • Implement and maintain ETL (Extract, Transform, Load) processes to ensure the accuracy, completeness, and consistency of data.
  • Design and manage data storage systems, including relational databases, NoSQL databases, and data warehouses.
  • Remain knowledgeable about industry trends, best practices, and emerging technologies in data engineering, and incorporate them into the organization's data infrastructure.
  • Provide technical guidance to other staff.
  • Communicate effectively with partners at all levels of the organization to gather requirements, provide updates, and present findings.
  • Up to 10% domestic travel may be required.