We deliver our customers peace of mind every day by helping them protect what they value most. Our passion for placing the customer at the center of everything we do is driving a transformational shift at Liberty Mutual. Operating as a tech startup within a Fortune 100 company, we are leading a digital disruption that will redefine how people experience insurance.

Job Introduction

The US Retail Markets (USRM) Data and Analytics Engineering team is actively searching for a highly productive Data Engineer to serve as a technical expert on a distributed, dynamic agile team, designing, developing, analyzing, and testing innovative data warehouse solutions. You will join an energetic, engaged Business Data Solutions Engineering team focused on leveraging our customer data to deliver exceptional value to our business partners. Working collaboratively in an agile squad, you will design and build data pipelines and workflows; ingest, curate, and provision data in a cloud-based environment; and take ownership of thorough end-to-end testing.

This is a fast-paced environment with rapid delivery for our business partners. You will work in a highly collaborative setting that values speed and quality, with a strong desire to drive change and foster a positive work environment, and you will have the opportunity to help grow this culture, mindset, and capability.

We encourage you to apply if this interests you:
- Work as ONE team committed to excellence
- Model and promote a Data First attitude
- Help advance Data Engineering operations into the future
- Work with a modern data tech stack

About the Job
- Collaborate in a fast-moving Agile environment with Scrum Masters, Product Owners, and cross-functional teammates to design and deliver data-driven solutions that advance both business and technical goals.
- Build, maintain, and optimize ingestion pipelines and data integrations to manage the full information lifecycle, from acquisition and curation to provisioning for analytics and reporting.
- Design and implement ETL pipelines, API-based microservices, physical data models, and transformation logic to make data reliable, accessible, and performant.
- Work on data architectures and applications (warehouses, data vault structures) that enable reporting, analytics, machine learning, and improved data governance and quality.
- Participate in peer code reviews, sprint ceremonies, CI/CD best practices, and automated testing to ensure robust, production-ready releases.
- Accelerate speed to market while building long-term, scalable solutions using modern technologies such as AWS, Snowflake, Kafka, Data Vault methodology, and other cloud-native tools.
- Create tools and programs to automate ingestion, curation, and provisioning of complex enterprise data to support analytics, reporting, and data science use cases.
- Collaborate with senior team members and your manager to improve your skills, code quality, and engineering practices.
- Troubleshoot and resolve technical issues, and proactively recommend improvements that increase reliability, reduce latency, and improve data quality.
- Maintain a continuous learning mindset: evaluate modern technologies, share findings with the team, and adopt approaches that improve delivery and business outcomes.
Job Type
Full-time
Career Level
Entry Level