Data Engineering Manager

Navia Benefit Solutions | Moraine, OH
$140,000 - $160,000 | Remote

About The Position

The Data Engineering Manager is accountable for delivering scalable, production-grade data solutions that ensure trusted, high-quality, and secure data across Navia’s ecosystem. Operating with an AI-first mindset, this role designs and hardens pipelines, models, and integration patterns that enable analytics, operational reporting, and application use cases. The Data Engineering Manager provides technical leadership and people management for multiple scrum teams, spanning data engineering and reliability as well as platform integrations (including APIs), while partnering closely with Product, Engineering, Security, and Operations to uphold governance, observability, and performance standards. The role incorporates platform architecture evaluation, integration technology stewardship, and support for EDI and external API gateway capabilities, ensuring Navia’s data and integration foundations meet enterprise requirements without exposing strategic roadmaps. This is a technical leadership role: it oversees multiple scrum teams while remaining hands-on in architecture and delivery across all areas of the department.

Requirements

  • Bachelor’s degree in Computer Science, Information Systems, or related field, or equivalent experience (preferred).
  • 6-8+ years in data engineering or closely related roles, delivering and operating pipelines for large, heterogeneous datasets.
  • Demonstrated ability to lead technical teams and set engineering standards.
  • Expert-level SQL across major RDBMS (e.g., SQL Server, Oracle, MySQL); advanced query and stored procedure optimization.
  • Proficiency in Python for data engineering (e.g., pandas, PySpark); familiarity with Java/Scala is a plus.
  • Hands-on experience with modern platforms (e.g., Databricks, Snowflake) and Iceberg-compatible table formats.
  • Knowledge of orchestration and transformation tooling (e.g., Airflow, Dagster, dbt); CI/CD for data and integration assets.
  • Understanding of streaming technologies (Kafka, Spark Structured Streaming) and Change Data Capture (CDC) practices.
  • Experience with API design, external API gateway administration, and secure partner integrations; familiarity with iPaaS (e.g., Boomi) and EDI.
  • Strong grasp of data modeling, metadata, lineage, governance, and compliance; proven ability to set and enforce standards.
  • Comfort with Git-based workflows, containers, infrastructure as code (e.g., Terraform), and cloud services; focus on reliability and cost control.
  • Demonstrated AI-first approach to engineering—practical use of LLMs and AI tooling to enhance development, testing, documentation, and data quality.
  • Experience working in agile teams; effective communication with technical and executive stakeholders.

Nice To Haves

  • Industry experience in financial services is a plus.

Responsibilities

  • Lead and manage multiple scrum teams (data engineering and platform integrations), setting goals, collaborating closely with Product and QE peers, and coaching engineering excellence.
  • Architect, design, and deliver production-ready data pipelines (batch and streaming) and data models that consolidate core sources into reliable, reusable datasets.
  • Develop and optimize ETL/ELT using SQL and Python; author, review, and tune stored procedures and queries on primary SQL platforms.
  • Apply AI-assisted development to accelerate coding, generate tests, improve documentation, and detect anomalies; embed automated quality checks and observability.
  • Define integration patterns and standards; oversee API development and platform integration practices across services and partners.
  • Govern and enhance the iPaaS footprint (e.g., Boomi) and EDI flows; ensure reliability, traceability, and compliance with operational controls.
  • Establish and manage the external API gateway capability with appropriate authentication, authorization, rate limiting, logging, and lifecycle management.
  • Implement orchestration, version control, and CI/CD for data and integration assets; promote reusable templates, code conventions, and infrastructure as code.
  • Define data contracts, lineage, and schema evolution practices; enforce SLAs and SLOs for data product availability, freshness, and quality.
  • Drive performance tuning on SQL workloads and cost-efficient storage strategies; monitor reliability and capacity across environments.
  • Partner with Security and Governance to enforce access controls, encryption, privacy, and auditability; remediate findings proactively.
  • Lead root cause analysis of complex incidents; prioritize and retire technical debt, improve resiliency, and prevent recurrence through design changes.
  • Collaborate with Product and business stakeholders to translate requirements into scalable solutions, shared metrics, and well-documented interfaces.
  • Provide technical reviews and mentoring; align database design, pipelines, APIs, and platform choices with Navia’s architecture standards.
  • Other duties as assigned.