Data Modeler - Life Insurance Domain - ONSITE

NTT DATA, Addison, TX
Onsite

About The Position

Design high-quality domain (canonical) and consumption-layer dimensional data models for an enterprise data platform hosted in Databricks on Azure, ensuring compliance with data architecture and governance standards. Work under the guidance of Core Data Architects and collaborate with business and technical stakeholders to define, document, and govern how data flows across the organization, supporting analytics, reporting, operational use cases, and advanced data science initiatives.

Requirements

  • 5+ years in data architecture, data modeling, or enterprise analytics.
  • 3+ years of experience with life insurance, annuities, policy administration, claims, actuarial, or related financial services data is strongly preferred.
  • 5 years' experience working with data modeling tools (e.g., Erwin, ER/Studio, or similar).
  • Strong experience with canonical / domain modeling, dimensional modeling, and semantic modeling
  • Proficiency in documenting data lineage and metadata (e.g., Purview, Collibra, Alation, or native Databricks capabilities).
  • Proficiency in writing SQL queries is a must.
  • Experience working with Azure Databricks (Delta Lake, Unity Catalog, DBFS, SQL endpoints) is an added advantage.
  • Familiarity with Data Mesh, Kimball, Inmon, Data Vault, and Lakehouse modeling patterns.
  • Excellent communication and ability to translate complex concepts for non-technical audiences.
  • Ability to lead architecture discussions and influence stakeholders.
  • Comfortable working in agile delivery environments.
  • Strong documentation habits and detail orientation.

Nice To Haves

  • Understanding of ACORD data standards, product hierarchies, distribution channels, customer/party models, and regulatory reporting will be an added advantage.
  • Conceptual familiarity with industry data models such as IBM IIW, Teradata ILDM, or Oracle OIDF will be helpful.
  • Experience with data contracts, API modeling, or event-driven architecture will be helpful
  • Experience working with Alteryx and/or Tableau will be an added advantage
  • Experience with SSIS will be helpful
  • Knowledge of Python will be helpful

Responsibilities

  • Design high-quality domain (canonical) data models, consumption-layer dimensional models, ensuring compliance with data architecture and governance standards.
  • Collaborate with business and technical stakeholders to define, document, and govern how data flows across the organization - supporting analytics, reporting, operational use cases, and advanced data science initiatives.
  • Work under the guidance of Core Data Architects to create data models for the different layers (Domain and Consumption) of our enterprise data platform hosted in Databricks on Azure.
  • Collaborate with Application Data SMEs to understand the complete structure and business definition of the source data.
  • Collaborate with business analysts and business analytics SMEs to understand the BUS Matrix and associated requirements definition for the data needs.
  • Create and maintain data models using the ER/Studio tool, complete with DDL generation.
  • Define end-to-end data lineage (source-to-target, S2T) and document it on Confluence pages per the given standards.
  • Generate SQL query snippets for explaining data transformation logic.
  • Profile and analyze source data to ensure data quality and recommend data refinement and cleansing methods.
  • Perform necessary reviews and obtain sign-offs prior to delivery to the DEV and QA teams.
  • Perform handoff walkthroughs of model and S2T to the DEV team and participate in design review sessions for any refinements as necessary.
  • Perform handoff walkthroughs of model and S2T to the QA team and participate in defect triage sessions as necessary.
  • Build SQL queries for end consumption business views in alignment with the business requirements.
  • Participate with business UAT teams to clarify data model questions and business view questions.
  • Provide any other necessary support through the engineering build, QA and UAT phases.
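To make the SQL-facing duties above concrete, here is a minimal, self-contained sketch of the kind of consumption-layer dimensional model and business view the role describes: a policy dimension joined to a claims fact table, exposed as a view for reporting. All table, column, and view names (dim_policy, fact_claim, vw_claims_by_product) are illustrative assumptions, not part of the actual platform; SQLite stands in for the Databricks SQL endpoint.

```python
import sqlite3

# In-memory database standing in for the consumption layer.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
-- Hypothetical policy dimension (names are illustrative).
CREATE TABLE dim_policy (
    policy_key   INTEGER PRIMARY KEY,
    policy_no    TEXT,
    product_line TEXT            -- e.g. 'Term Life', 'Annuity'
);

-- Hypothetical claims fact table keyed to the dimension.
CREATE TABLE fact_claim (
    claim_id    INTEGER PRIMARY KEY,
    policy_key  INTEGER REFERENCES dim_policy(policy_key),
    paid_amount REAL
);

-- Business view for end consumption: total claims paid per product line.
CREATE VIEW vw_claims_by_product AS
SELECT p.product_line,
       SUM(f.paid_amount) AS total_paid
FROM fact_claim f
JOIN dim_policy p ON p.policy_key = f.policy_key
GROUP BY p.product_line;
""")

# Sample rows to exercise the view.
cur.execute("INSERT INTO dim_policy VALUES (1, 'P-1001', 'Term Life'), (2, 'P-1002', 'Annuity')")
cur.execute("INSERT INTO fact_claim VALUES (10, 1, 5000.0), (11, 1, 2500.0), (12, 2, 1000.0)")

rows = cur.execute(
    "SELECT product_line, total_paid FROM vw_claims_by_product ORDER BY product_line"
).fetchall()
print(rows)  # [('Annuity', 1000.0), ('Term Life', 7500.0)]
```

In practice the same view definition would be delivered as Databricks SQL against Delta tables, with the S2T document recording which source columns feed `paid_amount` and `product_line`.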


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 5,001-10,000 employees
