Senior Manager, Data Engineering

AIG
Bedford, NH
Onsite

About The Position

Design, develop, and implement scalable data infrastructure and pipelines to support advanced analytics and enterprise data solutions. Collaborate with cross-functional stakeholders to ensure efficient integration, transformation, and delivery of data across multiple platforms and business units.

Manage the architecture, development, and maintenance of large-scale data pipelines and data warehouses to ensure data quality, reliability, and performance. Collaborate with data scientists, analysts, product managers, and business stakeholders to gather requirements and translate business needs into technical specifications. Oversee data integration from diverse sources, including structured, semi-structured, and unstructured data, into centralized platforms using ETL/ELT tools. Centralize foundational data products into the data lake, deliver data for reports and dashboards that support actuarial pricing and rate monitoring, and implement scripts that produce reliable, high-quality code in the production environment, adhering to end-user requirements.

Promote engineering best practices, including DevOps, CI/CD, and governance standards, across the data lifecycle. Implement and monitor data quality processes, including validation, cleansing, and transformation of datasets. Manage project timelines, resource allocation, and deliverables across multiple projects within the data engineering function. Ensure adherence to software development lifecycle (SDLC) methodologies and agile best practices.

Collaborate with business and IT stakeholders to identify use cases where GenAI can enhance rate monitoring, policy management, and customer experience. Use LLMs, GenAI, and code-generation models to automate code conversion from legacy systems to modern data stacks. Stay current with advancements in AI/ML, GenAI, and MLOps, and assess and pilot new technologies to strengthen the data engineering landscape.

Requirements

  • Master's degree in Statistics, Mathematics, Business Analytics, Data Science, Information Technology, Business Administration, Operations and Technology Management, or a related field of study, plus two (2) years of experience with building metadata-driven ETL frameworks using PySpark and Informatica IICS; cloud-native data architecture with Snowflake and AWS; and application of GenAI and CodeWhisperer for data engineering automation.
  • Alternatively, a Bachelor's degree in Statistics, Mathematics, Data Science, Information Technology, Business Administration, Operations and Technology Management, or a related field of study, plus five (5) years of experience with building metadata-driven ETL frameworks using PySpark and Informatica IICS; cloud-native data architecture with Snowflake and AWS; and application of GenAI and CodeWhisperer for data engineering automation.

Nice To Haves

  • Experience building metadata-driven ETL frameworks using PySpark and Informatica IICS
  • Cloud-native data architecture with Snowflake and AWS
  • Application of GenAI and CodeWhisperer for data engineering automation
  • LLMs, GenAI, and code generation models
  • Advancements in AI/ML, GenAI, and MLOps

Responsibilities

  • Design, develop, and implement scalable data infrastructure and pipelines
  • Collaborate with cross-functional stakeholders to ensure efficient integration, transformation, and delivery of data
  • Manage the architecture, development, and maintenance of large-scale data pipelines and data warehouses
  • Collaborate with data scientists, analysts, product managers, and business stakeholders to gather requirements and translate business needs into technical specifications
  • Oversee data integration from diverse sources into centralized platforms using ETL/ELT tools
  • Centralize different foundational data products into the data lake
  • Deliver data for reports and dashboards to support actuarial pricing and rate monitoring
  • Implement scripts that produce reliable, high-quality code in the production environment
  • Promote engineering best practices, including DevOps, CI/CD, and governance standards
  • Implement and monitor data quality processes
  • Manage project timelines, resource allocation, and deliverables
  • Ensure adherence to software development lifecycle (SDLC) methodologies and agile best practices
  • Collaborate with business and IT stakeholders to identify use cases where GenAI can enhance rate monitoring, policy management, and customer experience
  • Use LLMs, GenAI, and code-generation models to automate code conversion from legacy systems to modern data stacks
  • Stay current with advancements in AI/ML, GenAI, and MLOps
  • Assess and pilot new technologies to strengthen the data engineering landscape

Benefits

  • Total Rewards Program, a comprehensive benefits package
  • Benefits focused on your health, wellbeing, and financial security
  • Professional development
  • Volunteer Time Off and Matching Grants Programs