Senior Data Modeler

Liberty Mutual Insurance
Boston, MA (Hybrid)

About The Position

We deliver our customers peace of mind every day by helping them protect what they value most. Our passion for placing the customer at the center of everything we do is driving a transformational shift at Liberty Mutual. Operating as a tech startup within a Fortune 100 company, we are leading a digital disruption that will redefine how people experience insurance.

This role has a hybrid work schedule (2 days onsite), and we are considering candidates based in Portsmouth, NH; Boston, MA; Plano, TX; Columbus, OH; and Indianapolis, IN.

Job Introduction: Byte Club is a key player within Global Finance Technology's Business Data Solutions Engineering organization. We leverage data modeling, governance, and modern AWS platforms to build innovative solutions and optimize financial insights. We are a dynamic Agile group dedicated to transforming complex business requirements into flexible, performant database designs, creating a consolidated and modernized single source of global financial data. We strive to make data accessible and insightful for analytics and reporting for decision-makers across the organization. If you’re eager to dive deep into a variety of business units and understand the critical data they provide to finance, consider joining Byte Club as we harness the power of data to drive financial excellence!

The Byte Club team is looking to add a dynamic Senior Data Engineer / Data Modeler. The successful candidate will help build the future of financial data at Liberty Mutual. Key skill sets include data modeling using Erwin, data governance using the Informatica suite, SQL and ETL, cross-team collaboration, translating business requirements into appropriate data elements and definitions, and deploying data models within AWS. Responsibilities will include analysis of data and data elements in collaboration with data governance teams, implementing new elements and changes in existing data models, and integration with the GitHub deployment pipeline.
We are seeking a team member who is self-motivated, willing to learn and upskill as needed, and open to both accepting peer review and mentoring others. The day-to-day duties of the role are detailed in the Responsibilities section below.

Requirements

  • Bachelor's or Master's degree in a technical or business discipline, or equivalent experience; technical degree preferred.
  • 5+ years of data engineering / data modeling experience.
  • Extensive data modeling experience using industry-standard methodologies and tools (Erwin, UML).
  • Extensive experience with both relational (Postgres) and dimensional data modeling (Snowflake, SSAS).
  • Extensive experience developing DDL for cloud platforms (Amazon Redshift, Amazon Aurora, Snowflake) preferred.
  • Extensive experience utilizing data governance tools (Informatica IDMC).
  • Experience working with AWS preferred.
  • Experience deploying DDL within Amazon AWS (Redshift, Aurora, Snowflake) preferred.
  • Experience building ETL pipelines preferred.
  • Experience building automation into data modeling and deployments.
  • Experience working with financial data.
  • Experience with insurance industry preferred.
  • Must be proactive and self-driven, demonstrate initiative, and think logically.
  • Strong leadership, communication, and collaboration skills with a track record of taking solution ownership.

Responsibilities

  • Designs and develops programs and tools to support ingestion, curation and provisioning of complex enterprise data to achieve analytics, reporting, and data science.
  • Identifies process improvements that address complex technology gaps within a single business process.
  • Builds strong knowledge of technology enablers.
  • Analyzes and prepares complex technology-enabled recommendations to address gaps within a single business process.
  • Provides successful deployment and provisioning of data solutions to production or other required environments.
  • Designs and builds data provisioning workflows/pipelines, physical data schemas, extracts, data transformations, and data integrations and/or designs and builds data architecture and applications that enable reporting, analytics, data science, and data management and improve accessibility, efficiency, governance, processing, and quality of data.
  • Makes recommendations for how to improve data.
  • Applies machine learning concepts to work as applicable.