Everest Global Services · Posted 4 months ago
$152,000 - $199,100/Yr
Full-time • Senior
Hybrid • Warren, NJ
Professional, Scientific, and Technical Services

As a Lead Data Engineering Platform Architect at Everest, you'll take ownership of designing, building, and managing a modern data and analytics platform that powers our global insurance business. You'll work with cutting-edge technologies like Databricks, Azure Data services, Airflow, and Kafka, and collaborate closely with Data Domain Delivery, Cloud Engineering, DevOps, and Information Security teams to deliver a platform that's secure, scalable, and built for performance. You'll drive automation, observability, and governance across the platform, while mentoring teams and shaping Everest's data architecture standards. You'll also partner with data engineering and analytics teams to deliver reusable frameworks, MLOps pipelines, and semantic layers that support enterprise-wide insights.

Responsibilities:

  • Architect and manage Everest's Azure-based data platform using Databricks and cloud-native tools.
  • Define and enforce data architecture standards, blueprints, and best practices.
  • Lead adoption of emerging technologies through PoCs and pilot implementations.
  • Translate business and technical requirements into architectural blueprints, roadmaps, and solution designs that adhere to enterprise standards.
  • Provide technology leadership to delivery teams to build and operate data products effectively.
  • Design and develop reusable components including data ingestion for structured and unstructured data, data transformation, logging, and observability.
  • Drive MLOps and self-service analytics frameworks for model lifecycle management.
  • Design end-to-end observability of all cloud resources and components, including performance, optimization, and FinOps alignment.
  • Automate infrastructure and security using IaC, DevOps, RBAC/ABAC.
  • Partner with cross-functional teams to deliver scalable, resilient data solutions.
  • Mentor engineering teams and elevate technical capabilities across the organization.
  • Implement enterprise-wide governance, privacy, and FinOps-aligned performance monitoring.
  • Stay updated with the latest trends, tools, and technologies in the field of data engineering.
  • Proactively identify opportunities to improve data engineering practices and contribute to the evolution of data infrastructure.
Qualifications:

  • Bachelor's degree (required) or master's degree (preferred) in Computer Science, Information Systems, or a related field.
  • 10+ years in data engineering and architecture, with deep experience in Azure and Databricks.
  • In-depth experience with Medallion/Delta Lake architecture, ML/AI, and Azure data services such as ADLS, ADF, Fabric, SQL, Synapse, and Power BI.
  • Strong experience implementing IaC automation, DevOps, RBAC/ABAC, and MLOps.
  • Strong experience with orchestration tools such as Airflow and event-driven data integration tools such as Kafka.
  • Strong experience with programming languages and tools such as Python, PySpark, and SQL.
  • Proven success building scalable platforms in complex, global environments.
  • Experience deploying pipelines via Azure DevOps with code review and branching strategies.
  • Insurance industry knowledge (especially P&C) is a strong plus.
  • Strong communication, leadership, and stakeholder engagement skills.
  • Certifications in Azure and Databricks preferred.
Benefits:

  • Health insurance coverage
  • Employee wellness program
  • Life and disability insurance
  • 401(k) match
  • Retirement savings plan
  • Paid holidays
  • Paid time off (PTO)