About The Position

We are seeking a (Senior) BI & Data Engineer – Power BI & Azure Databricks to support a global reinsurance client in transforming their claims reporting landscape. You will play a key role in building and operating a centralized reporting platform, contributing to the development of a single source of truth for claims data on Azure Databricks. This role combines strong Power BI expertise with hands-on data engineering skills, enabling you to design scalable data models, deliver high-quality dashboards, and ensure robust, performant, and business-aligned reporting solutions. You will work closely with stakeholders across business and technology to translate complex reporting needs into impactful analytics products.

Requirements

  • Strong hands-on experience with Power BI (data modelling, DAX, performance optimisation)
  • Proven experience building enterprise-grade dashboards and semantic models
  • Practical experience with Azure Databricks (PySpark and/or Spark SQL)
  • Solid understanding of Delta Lake and medallion architecture
  • Strong grounding in data warehousing and BI architecture principles
  • Experience working with large, complex datasets in a reporting environment
  • Ability to work independently in a multi-stakeholder, international setting

Nice To Haves

  • Experience in insurance or reinsurance reporting
  • Familiarity with claims data (losses, reserves, payments, ultimates, movements)
  • Knowledge of data governance, data products, or data mesh concepts
  • Experience with financial or actuarial reporting
  • Exposure to DevOps / CI-CD for Power BI or Databricks
  • Understanding of regulatory or management reporting in insurance

Responsibilities

  • Design, develop, and maintain Power BI datasets, semantic models, and reports for global claims reporting
  • Implement best-practice data modelling (star schema, claim-centric fact tables, conformed dimensions)
  • Develop complex DAX measures for claims KPIs (e.g. incurred, paid, reserves, ultimates, movements)
  • Optimise report performance (model size, DAX efficiency, incremental refresh)
  • Ensure consistency of KPIs and definitions through a centralized data model
  • Work with Azure Databricks (Delta Lake) as the primary reporting data source
  • Contribute to the design and implementation of curated data layers (e.g. silver/gold)
  • Develop and maintain SQL / Spark-based transformations aligned with reporting requirements
  • Support data quality, reconciliation, and traceability from source systems to reporting outputs
  • Collaborate with data product teams and follow platform standards and governance
  • Work closely with Product Owners, stakeholders, and cross-functional teams
  • Translate business reporting requirements into scalable technical solutions
  • Contribute to agile delivery (backlog refinement, sprint execution, documentation)
  • Promote knowledge sharing and best practices within the team


What This Job Offers

  • Job Type: Full-time
  • Career Level: Mid Level
  • Education Level: No Education Listed
  • Number of Employees: 1,001-5,000 employees
