Senior Data Engineer II

Principal Financial Group · Des Moines, IA
Hybrid

About The Position

We’re looking for a Senior Data Engineer to join our Retirement Modernization Data Enablement team. In this role, you’ll sit at the center of our retirement modernization effort, leading how critical transaction data moves from source systems to enterprise analytics and ensuring it’s trusted, well‑governed, and ready to drive insight at scale. You’ll balance hands‑on engineering with architectural design, shaping data integration and movement patterns across multiple platforms while aligning teams to shared standards and best practices.

As Principal continues to modernize its systems, this role offers an exciting opportunity to build solutions that directly impact our long‑term strategy and tech stack, all while ensuring that our products are robust, scalable, and secure! Operating at the intersection of financial services and technology, Principal builds financial tools that help our customers live better lives. We take pride in being a purpose‑led firm, motivated by our mission to make financial security accessible to all. Our mission, integrity, and customer focus have made us a trusted leader for more than 140 years.

Requirements

  • Bachelor's degree plus 8+ years of related work experience, or a Master's degree in a related field plus 4+ years of related work experience
  • Proven experience delivering large‑scale data engineering solutions, including data ingestion, transformation, modeling, and storage to support analytics, reporting, and advanced data use cases, with exposure to data mesh principles in modern data platform environments
  • Demonstrated ability to shape data architecture and influence technical strategy across teams, establishing patterns, standards, and best practices while engaging hands‑on when needed
  • Trusted technical leader with strong communication and collaboration skills who can influence across teams and business units, facilitate architectural discussions, and navigate ambiguity in modernizing data environments
  • Experience elevating the people around you – coaching, mentoring, and helping others grow through thoughtful feedback and knowledge sharing
  • Proven experience working across the full data lifecycle, from source‑aligned ingestion and foundational data preparation through to analytics‑ready datasets, with an understanding of how upstream design decisions enable downstream analytics and data products
  • Strong technical foundation working hands‑on with data and data stores, including advanced SQL and data pipeline development, with programming experience in languages such as Python and Java
  • Experience optimizing data pipelines and processing systems for high‑volume, complex datasets
  • Experience implementing data quality management across the data lifecycle, including validation, reconciliation, and mastering, with an emphasis on observability, monitoring, and operational readiness

Nice To Haves

  • Hands-on experience with Snowflake
  • Hands-on experience with Ataccama Data Quality
  • Hands-on experience with Informatica Intelligent Data Management Cloud
  • Experience with source and/or transactional data modeling, including normalized schemas (e.g., 3NF) and strong data integrity practices

Responsibilities

  • Design and influence data integration and movement patterns across multiple platforms, aligning teams to shared standards and best practices
  • Serve as a thought leader in data engineering and architecture, influencing decisions without direct authority in a highly collaborative, matrixed environment
  • Partner closely with upstream transaction teams and downstream analytics teams to ensure data is modeled, governed, and delivered in ways that support analytics, reporting, and AI use cases
  • Enable the end‑to‑end flow of retirement source and transactional data through to the Enterprise Data Platform
  • Establish and promote best practices for data pipelines, orchestration, and data governance in a modernized data ecosystem
  • Balance hands-on engineering with architectural design, stepping into pipeline development when needed while maintaining a broader solution-focused lens
  • Play a key role in transforming how data is enabled across the Retirement technology domain

Benefits

  • Flexible Time Off (FTO)
  • Pension Eligible
  • Bonus program