Senior Data Engineer, Investments Technology

Liberty Mutual Insurance, Boston, MA
Hybrid

About The Position

At Liberty Mutual Investments Technology, we are building a modern, cloud-based data platform to power investment insights and decision-making across our portfolio. We are seeking a Senior Data Engineer to join our Core Data team, focused on designing and delivering scalable data warehouse solutions in Snowflake on AWS. In this role, you will develop high-quality data pipelines and solutions that enable trusted, accessible data for our investment teams.

You’ll have the opportunity to work hands-on with emerging AI capabilities, including Snowflake’s latest Cortex AI features, while leveraging and helping shape the rollout of AI-powered developer tools to accelerate innovation and modernize how we build data solutions. You’ll work in a collaborative, agile environment alongside engineers, analysts, and business partners, helping to shape the evolution of our data platform while contributing to a culture of continuous learning and engineering excellence.

Note: This role has a hybrid work arrangement in our Boston office (2x per week).

Requirements

  • Bachelor’s degree in a technical or business discipline, or equivalent experience.
  • 5+ years of experience in data engineering, with a focus on building and maintaining scalable data pipelines and data platforms.
  • Strong proficiency in SQL and solid understanding of data modeling and data warehousing concepts.
  • Experience with Snowflake and AWS services (e.g., S3, Lambda) in a cloud-based data environment.
  • Experience using Python for data transformation, integration, or automation.
  • Familiarity with ELT/ETL pipeline development and modern data architecture patterns.
  • Experience with CI/CD tools and practices (e.g., GitHub Actions, Bamboo, Jenkins, or similar).
  • Experience working with or exposure to workflow orchestration tools.

Nice To Haves

  • Experience in investments, asset management, or financial services.
  • Familiarity with API-based integrations (e.g., REST services).
  • Exposure to data governance, security, and access control practices.
  • Experience with modern data tooling (e.g., data observability, cataloging, or transformation frameworks).

Responsibilities

  • Design, build, and maintain scalable data pipelines and data solutions that support analytics and investment insights.
  • Develop data provisioning workflows, transformations, and integrations using Snowflake’s cloud-native capabilities in an AWS environment.
  • Apply data warehousing best practices, including data modeling, performance optimization, and scalable architecture design.
  • Build and enhance ELT pipelines using Snowflake and Python to ensure reliable and efficient data processing.
  • Partner with data engineers, analysts, and business stakeholders to understand requirements and deliver high-quality data solutions.
  • Support and enhance data quality, governance, and observability practices across the platform.
  • Contribute to CI/CD pipelines and deployment processes, improving automation and reliability of data workflows.
  • Work with enterprise orchestration tools (e.g., ActiveBatch, transitioning to Stonebranch) to manage and schedule data workflows.
  • Participate in code reviews, knowledge sharing, and mentoring, helping to elevate engineering practices across the team.
  • Stay current with evolving data engineering tools and practices, and recommend improvements where appropriate.