VP, Data Engineering (Insurance Experience Required)

Coaction Global
Morristown, NJ
$197,000 - $230,000

About The Position

At Coaction, we’re a unique mix of leaders, achievers, thinkers, and team players with a high-performance mindset and a diverse skillset. We bring our industry expertise together to continually push the boundaries of what insurance can be for our clients. We are seeking a hands-on Data Engineering leader to modernize our enterprise data platform and accelerate migration from legacy datamarts to a scalable AWS-based analytics ecosystem. This role will lead CI/CD enablement for data pipelines, implement DataOps best practices, and ensure reliable, high-quality, governed data delivery across underwriting, claims, actuarial, finance, and operations domains. The ideal candidate combines deep technical expertise in cloud data engineering with strong operational discipline, enabling fast, safe releases while improving reliability, scalability, and auditability.

Requirements

  • Experience in insurance is a must.
  • Understanding of data domains and data products is a must.
  • 10+ years of experience in Data Engineering with production-scale pipelines
  • Strong expertise in SQL and Python.
  • Proven experience implementing CI/CD for data pipelines (GitHub Actions, Azure DevOps, Jenkins, or similar).
  • Strong AWS experience including Amazon Redshift, S3, IAM, and orchestration tools.
  • Experience migrating legacy data warehouse/datamart systems to modern cloud platforms.
  • Strong understanding of data modeling and warehouse optimization.

Nice To Haves

  • Experience with dbt or similar transformation frameworks in AWS.
  • Knowledge of data quality frameworks (Great Expectations, Deequ, etc.).
  • Infrastructure-as-Code experience
  • Experience implementing data observability and monitoring platforms.
  • Exposure to regulatory and audit requirements in insurance environments.

Responsibilities

Data Platform Modernization (Legacy → AWS/Redshift)
  • Lead migration of legacy datamarts and transformation logic into Amazon Redshift.
  • Refactor legacy stored procedures and scripts into modular, version-controlled ELT pipelines.
  • Define dimensional modeling and data mart standards aligned to insurance business domains.
  • Establish structured cutover and decommission strategy for legacy environments.
CI/CD & DataOps Engineering Excellence
  • Design and implement CI/CD pipelines for data engineering workflows.
  • Establish code review standards, automated testing, and controlled deployment processes.
  • Implement automated validation gates including schema validation, reconciliation checks, and business rule tests.
  • Drive release discipline and rollback strategies for production stability.
Pipeline Engineering & Reliability
  • Design scalable batch and incremental ETL/ELT pipelines.
  • Implement orchestration, monitoring, alerting, retry logic, and operational runbooks.
  • Optimize Redshift performance, workload management, and cost controls.
  • Ensure production-grade resilience and disaster recovery readiness.
Data Quality & Observability
  • Implement a data quality engineering framework (completeness, accuracy, timeliness, consistency).
  • Establish observability metrics such as pipeline success rate, SLA adherence, and incident resolution time.
  • Partner with governance stakeholders to ensure lineage, metadata, and audit compliance.
Business & Domain Partnership
  • Partner with Underwriting, Claims, Actuarial, and Finance teams to ensure trusted data products.
  • Enable domain-aligned data models and consistent KPI definitions.
  • Support analytics and advanced modeling readiness through stable data foundations.

Benefits

  • Employees are eligible for a standard benefits package including paid time off, medical, dental, and retirement.