Senior Data Engineer

Mission US
Remote

About The Position

The Senior Data Engineer will play a key role in designing, implementing, and optimizing Mission’s data infrastructure as part of our modern data platform initiative. This hands-on engineering role will focus on building scalable data pipelines, enabling a centralized enterprise data warehouse, and supporting business reporting needs. The ideal candidate will collaborate across technology, operations, product, and analytics teams to create high-quality, governed, and reusable data assets, while supporting a long-term architecture aligned with Mission’s growth. The role is highly technical and execution-focused, and is ideal for a data engineer who thrives in fast-paced environments and is passionate about data quality, performance, and scalability.

Requirements

  • 5+ years of experience in data engineering, data warehousing, or a related field, with at least 1-2 years leading implementation projects.
  • 3+ years of experience in the commercial insurance industry.
  • Hands-on experience with Azure data services.
  • Strong SQL and performance tuning skills in Microsoft SQL Server environments.
  • Experience designing data warehouses or data marts with a focus on dimensional modeling and analytics-ready schemas.
  • Fluent in a range of data ingestion patterns: RESTful APIs, JSON/XML payloads, flat files, message queues, and change data capture (CDC); see the ingestion sketch after this list.
  • Understanding of data governance, metadata management, and access control.
  • Familiarity with version control (GitLab) and CI/CD pipelines for data workflows.
  • Experience working with BI tools like Power BI, Looker, or Tableau, including building dashboards or semantic layers.
  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or equivalent experience.
  • Demonstrated ability to translate business needs into scalable data systems and design.
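For illustration only (not part of the posting’s requirements), the sketch below shows one of the ingestion patterns named above: a minimal incremental pull from a hypothetical RESTful vendor API. The endpoint URL, query parameters, and pagination field are assumptions; a production pipeline would typically land the output in cloud storage and track the watermark in a control table rather than hard-coding it.

```python
"""Illustrative only: minimal incremental JSON ingestion from a hypothetical vendor API."""
import json
from datetime import datetime, timezone

import requests

API_URL = "https://vendor.example.com/api/v1/policies"   # hypothetical endpoint
LANDING_FILE = "policies_landing.jsonl"                   # newline-delimited JSON landing file


def extract_incremental(updated_since: str) -> int:
    """Page through records changed after `updated_since` and append them to the landing file."""
    rows_written = 0
    params = {"updated_since": updated_since, "page_size": 500}  # assumed parameter names
    with open(LANDING_FILE, "a", encoding="utf-8") as out:
        while True:
            resp = requests.get(API_URL, params=params, timeout=30)
            resp.raise_for_status()
            payload = resp.json()
            for record in payload.get("results", []):
                # Stamp each record with extraction time to support downstream auditing.
                record["_extracted_at"] = datetime.now(timezone.utc).isoformat()
                out.write(json.dumps(record) + "\n")
                rows_written += 1
            next_page = payload.get("next_page_token")  # assumed pagination field
            if not next_page:
                break
            params["page_token"] = next_page
    return rows_written


if __name__ == "__main__":
    # The watermark would normally come from a control table; hard-coded here for the sketch.
    print(f"wrote {extract_incremental('2024-01-01T00:00:00Z')} rows")
```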

Nice To Haves

  • Experience with data cleanup, fuzzy matching, or AI-driven normalization.
  • Familiarity with Python, dbt, or Azure Functions is a plus.
  • Strong communication skills and ability to work cross-functionally with product, engineering, and analytics teams.
  • Ability to travel up to 10% of the year.

Responsibilities

  • Design and implement scalable data pipelines to ingest, transform, and store data from third-party vendors and internal systems using APIs, files, and databases.
  • Build and maintain a cloud-based data warehouse solution in partnership with architects, ensuring clean, normalized, and performant data models.
  • Establish and monitor reliable data ingestion processes across systems with varying grain and cadence, ensuring data quality and completeness.
  • Collaborate with API and integration teams to develop secure, robust data exchange processes with external vendors and internal services.
  • Set up and manage data connections from the warehouse to BI tools (e.g., Power BI, Looker, Tableau) to enable self-service analytics and dashboarding.
  • Document data flows, schemas, and definitions, and help drive data governance and standardization efforts across the organization.
  • Implement data validation, cleansing, and enrichment processes to ensure high-quality data for financial reporting and analytics (see the validation sketch after this list).
  • Ensure compliance with data standards, regulatory requirements (e.g., NAIC, SOX), and data security best practices.
  • Provide technical leadership in data engineering best practices, including version control, CI/CD for data pipelines, testing, and observability.
  • Build initial dashboards in BI tools.
  • Provide direction to data engineers to ensure that projects are completed on time and to a high standard.
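As a hedged illustration of the validation and cleansing responsibility above, the sketch below separates clean rows from rejected rows in a flat file and attaches reject reasons. The column names and rules are assumptions for the example; in practice, checks like these are often expressed in a framework such as dbt tests or a dedicated data-quality tool.

```python
"""Illustrative only: simple row-level validation and cleansing before load."""
import csv
from decimal import Decimal, InvalidOperation

REQUIRED_COLUMNS = ("policy_id", "effective_date", "written_premium")  # assumed schema


def validate_row(row: dict) -> list[str]:
    """Return a list of human-readable issues for a single record (empty list = clean)."""
    issues = []
    for col in REQUIRED_COLUMNS:
        if not (row.get(col) or "").strip():
            issues.append(f"missing {col}")
    premium = (row.get("written_premium") or "").strip()
    try:
        if Decimal(premium) < 0:
            issues.append("negative written_premium")
    except InvalidOperation:
        issues.append("written_premium is not numeric")
    return issues


def split_clean_and_rejected(path: str) -> tuple[list[dict], list[dict]]:
    """Read a flat file and separate clean rows from rejected rows, with reasons attached."""
    clean, rejected = [], []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            problems = validate_row(row)
            if problems:
                rejected.append({**row, "_reject_reasons": "; ".join(problems)})
            else:
                clean.append(row)
    return clean, rejected
```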