Data Engineer

ALTEN Technology USA, Greensboro, NC
Posted 36 days ago · $80,000 - $90,000

About The Position

We’re ALTEN Technology USA, an engineering company helping clients bring groundbreaking ideas to life, from advancing space exploration and life-saving medical devices to building autonomous electric vehicles. With 3,000+ experts across North America, we partner with leading companies in aerospace, medical devices, robotics, automotive, commercial vehicles, EVs, rail, and more. As part of the global ALTEN Group, with 57,000+ engineers in 30 countries, we deliver across the entire product development cycle, from consulting to full project outsourcing. When you join ALTEN Technology USA, you’ll collaborate on some of the world’s toughest engineering challenges, supported by mentorship, career growth opportunities, and comprehensive benefits. We take pride in fostering a culture where employees feel valued, supported, and inspired to grow.

We are seeking a Data Engineer to join a high-impact, execution-focused project supporting a pharmaceutical manufacturing site. The goal is to replace legacy tools (Statistica, Spotfire, enterprise PI) with a new Snowflake-based data pipeline that integrates MES (Werum), PI Historian, LIMS, and MDE systems. You will design, build, and validate production-ready ETL pipelines in Python, map data sources, create discrete and time-series datasets, and configure PI event frames, all under GxP-compliant processes. The end goal is production-ready data pipelines and datasets in Snowflake that support Manufacturing Science engineers with trending, CPV, and troubleshooting.

Requirements

  • 3–5 years hands-on data engineering experience.
  • Snowflake: schema design, performance tuning, loading strategies (staging, bulk/CDC patterns).
  • Python: production ETL experience (pandas, pyarrow, db connectors, scripting, scheduling).
  • GitHub: branching, PRs, CI basics, code reviews.
  • Direct experience integrating at least one of: PI Historian (OSIsoft/AVEVA PI), MES (Werum), LIMS, ideally in a regulated environment.
  • Strong SQL skills for complex queries, data transformations, and optimizations. Ability to design discrete & time-series tables.
  • Experience working under GxP / pharma compliance or similar regulated data governance.

Nice To Haves

  • Ability to run Python notebooks in cloud environments.
  • Experience with Dataiku, Power BI, or homegrown front-ends.
  • Knowledge of CPV (Continued Process Verification) dataset requirements.
  • Experience creating PI event frames or working with PI AF.
  • Familiarity with Werum MES data models (or similar MES schemas) and LIMS sample metadata.
  • Docker / basic infrastructure-as-code (helpful, not required).

Responsibilities

  • Design, develop, and deploy ETL pipelines in Snowflake using Python to integrate MES (Werum), PI Historian, LIMS, and MDE data sources.
  • Create mapping documents, perform data validation, and ensure complete and accurate ingestion of all required product datasets.
  • Build discrete data tables and time-series datasets to meet Manufacturing Science engineers’ trending and reporting needs.
  • Configure PI event frames for process event tracking and align with Snowflake architecture.
  • Maintain version-controlled code in GitHub; follow CI and code review best practices.
  • Collaborate with engineering teams for dataset validation, troubleshooting, and weekly progress updates.
  • Deliver all work in compliance with GxP standards and within the established security and infrastructure framework (VPN/VDI).

What This Job Offers

Job Type: Full-time
Career Level: Mid Level
Education Level: No Education Listed
Number of Employees: 5,001-10,000 employees
