Data Architect

Willis Re · New York, NY
Hybrid

About The Position

Willis Re is a technology‑led reinsurance broker built on a cloud‑native, modular and data‑driven platform. As our Data Architect, you will own the enterprise data architecture that powers broking, placement, analytics and operations, ensuring the estate scales globally with speed, reliability and security. You will translate business needs into data capabilities that lift pricing accuracy, placement velocity, client reporting and regulatory compliance. Working in a hybrid delivery model, you will set standards, make critical architectural decisions and embed security, resilience and data protection from inception. Success will be measured through clear KPIs for data quality, availability, performance, compliance and business value realised.

About You

You are a pragmatic, business‑minded architect who turns strategy into executable data designs that deliver measurable client and commercial outcomes. You balance vision with hands‑on depth, moving comfortably from target‑state blueprints to detailed models and reference implementations. You default to simplicity, automation and standardisation, and you are relentless about data quality, security and operational resilience. You thrive in fast‑moving environments, collaborate across disciplines and regions, and communicate clearly with both executives and engineers. In a start‑up context, you are comfortable operating beyond your usual remit to get things done, setting guardrails, making trade‑offs explicit and holding clear ownership for results. Above all, you believe data is a product, and you design platforms and processes that make trustworthy, usable data available at scale.

Requirements

  • Proven track record (8+ years) in data architecture for complex, cloud‑native environments; experience in insurance/reinsurance or capital markets strongly preferred.
  • Expertise in conceptual/logical/physical data modelling and standardised design for domains such as client, policy/contract, placement, risk, pricing and claims.
  • Hands‑on design of modern data platforms (e.g., lakehouse/warehouse architectures) on major clouds, with strong command of storage formats, ELT/ETL, streaming and orchestration.
  • Strong knowledge of data governance, MDM, metadata management, lineage, cataloguing and data product thinking; experience establishing data ownership and stewardship models.
  • Security and resilience depth: encryption, IAM/zero trust, network controls, data masking, tokenisation, backup/restore, DR patterns and compliance across multiple jurisdictions.
  • Proficiency with data integration patterns and APIs, including event streaming, CDC, contract‑first design and schema evolution management.
  • Implementation of data quality and observability tooling, with defined SLOs/SLIs and incident management playbooks.
  • Demonstrated ability to set global standards while enabling local extensions; experience operating in multi‑region regulated environments.
  • Vendor and partner management within a hybrid delivery model, including RFPs, selection, and governance against KPIs and cost/performance objectives.

Responsibilities

  • Define and maintain the enterprise data architecture blueprint spanning data models, integration patterns, metadata strategy, lineage, and master and reference data across the broking value chain.
  • Establish global data standards, semantic models and APIs that enable modular, cloud‑native services while allowing regional extensions for regulatory and market nuances.
  • Design scalable, automated data platforms and pipelines (batch and streaming) that support analytics, reporting, pricing, market submissions and client servicing with high reliability and low latency.
  • Embed “secure and resilient by design” principles across data storage, movement and access, including encryption, key management, segregation, backup/restore, disaster recovery and zero‑trust access patterns.
  • Create and enforce data governance frameworks, including data ownership, stewardship, quality rules, SLAs, issue management and lineage to support auditability and regulatory readiness.
  • Partner with business and product leaders to prioritise data initiatives by measurable outcomes; define KPIs for data quality, availability, usage and value realisation, and report progress transparently.
  • Lead data modelling for core entities (e.g., client, contract, placement, risk, pricing, claims) and ensure consistency across domains through standardised models and contracts.
  • Select and govern data technologies and tooling (cloud data platforms, lakehouse/warehouse, catalogue, observability, integration, MDM), balancing cost, performance, security and vendor lock‑in.
  • Oversee data lifecycle management, retention and archival in line with legal, regulatory and client requirements across the UK, US, Bermuda and other jurisdictions.
  • Implement data observability and reliability practices, including monitoring, alerting, SLOs/SLIs, incident response and root‑cause analysis for data issues.

Benefits

  • Health and Welfare Benefits: Medical, Dental, Vision, Health Savings Account, Commuter Benefits, Health Care and Dependent Care Flexible Spending Accounts, Accident Insurance, Critical Illness Insurance, Life Insurance, AD&D, Financial Wellbeing Support, Wellbeing Program and Work/Life Resources (including Employee Assistance Program)
  • Leave Benefits: Paid Holidays, Annual Paid Time Off (includes state/local paid leave where required), Short-Term Disability, Long-Term Disability, Other Leaves (e.g., Bereavement, FMLA, ADA, Jury Duty, Military Leave, and Parental and Adoption Leave), Paid Time Off (Washington State only)
  • Retirement Benefits: Savings Plan (401k)