A modern datahub that helps an insurance organization prepare for rapidly changing demographics.


  1. Strategy

  2. Engineering

Our role

How do you prepare your organization for rapidly changing demographics without knowing exactly how they will affect your business?

In this case, by helping the customer understand the fundamental role a dynamic, centralized data platform can play in the organization, and by advising on how to implement the chosen strategy in such a way that the data platform stays maintainable, scalable and future-proof.

From modernizing and standardizing the fragmented data processes across different data teams to implementing a reliable datahub that offers business analysts, data scientists and business end users self-service analytics, meeting their growing demand for data and day-to-day reporting.

The challenge

Most people like what they use, and data engineers and business analysts are no exception. As a result, data processes are usually fragmented and reinvented throughout the organization, even though one uniform, centralized way of working is preferred. The transition amounts to a migration of the existing technology stack, which in essence comes down to organizational change management combined with a new way of implementing data processes.

In this case, apart from the organizational changes, the technical challenge is building a centralized data platform that acts as the organization's datahub. The datahub ingests data from over 60 different operational systems, giving the teams responsible for reporting and data insights a solid foundation and allowing them to work more effectively.

The solution

As part of our strategic advice, we guided the migration towards the desired datahub while keeping the modularity, maintainability and scalability of the platform in mind. This approach ensures the solution is future-proof and adaptable to rapidly changing business demands. The developed solution unburdens business analysts and data scientists so they can focus on what they do best: delivering valuable insights.

The solution can be separated into three main parts:

  • A lakehouse microservice architecture that supports standardized, easy-to-configure data pipelines defined as code, built on standard Azure components.
  • Maintainable and scalable infrastructure and services as code, enabling quick recovery and deployments. The core technologies used are Terraform and Terragrunt.
  • A unified release process based on Semantic Release combined with Artifactory, enabling continuous integration and continuous deployment.
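The infrastructure-as-code setup above can be sketched with a minimal Terragrunt configuration. The module path, input names and source-system identifiers below are hypothetical illustrations, not the actual customer setup; they only show how each environment reuses one shared Terraform module while inheriting common settings:

```hcl
# terragrunt.hcl — hypothetical configuration for one environment of the datahub.

include "root" {
  # Inherit shared remote-state and provider settings defined once at the repo root.
  path = find_in_parent_folders()
}

terraform {
  # A reusable Terraform module describing one standardized ingestion workspace
  # (module name is an assumption for illustration).
  source = "../../modules//lakehouse-workspace"
}

inputs = {
  environment    = "prod"
  location       = "westeurope"
  # Each operational system gets a standardized, configuration-driven pipeline;
  # the system names here are placeholders.
  source_systems = ["policy-admin", "claims", "crm"]
}
```

Keeping the per-environment file this small is the point of the pattern: the shared module encodes the standardized pipeline once, and adding a source system becomes a one-line configuration change rather than new infrastructure code.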

The outcome

We implemented a data platform that acts as a datahub for the organization, providing daily ingestion of and access to over 60 data sources. The datahub uses highly standardized ingestion and release processes to unburden four data platform teams (>15 engineers) and two data product teams (>10 analysts). As a result, we successfully migrated the organization's data landscape to a centralized datahub, allowing it to phase out on-premise IT systems and associated Microsoft licenses worth around €200k annually.

  • Migration towards the cloud and phase out IT licenses

     ± €200 k annual savings

  • Unburden analysts with a proper data foundation

     > 24 analysts across multiple teams

  • Domain-driven lakehouse ingesting sources daily

     > 50 sources