
Belfius accelerates innovation with new data platform through partnership with LACO

A new data platform based on Microsoft Azure is helping Belfius boost operational efficiency, while also enabling faster decision-making and greater innovation. LACO supported the initiative by guiding the process analysis, coaching employees, and leveraging its in-depth knowledge of the Belgian financial sector to facilitate the migration as a local partner.

The problem: a data platform nearing end of life

Belfius is a Belgian banking and insurance group serving approximately 3.8 million customers across Belgian society and its economy. Over the years, the group has built a long history with on-premises technology. However, evolving business and technological demands prompted Belfius to reassess its existing environment.

The platform was gradually reaching its limits and could no longer support Belfius’s future data ambitions.

This came at a time when the Belfius group was already embarking on a broader transition of its technological infrastructure. The IT architecture team was working on a strategy with Microsoft Azure cloud as one of the central platforms. “It made sense to include the data platform in that exercise,” says Tom Bisschop, Program Manager ICT Data & Reporting at Belfius Insurance. “After all, the ultimate goal was to develop a future-proof data platform for both the bank and the insurance division.”

The solution: move to the cloud

After reviewing its options, Belfius concluded that an on-premises platform was no longer the best fit for its needs. “Partly because we are currently seeing a rapid evolution in data usage,” continues Tom, “it quickly became clear that, given our plans, Microsoft Azure would offer us many more possibilities.”

This decision also repositioned LACO within Belfius. While LACO had long been responsible for implementing and supporting the on-premise data platform, the company has also built strong expertise in Azure in recent years. “LACO made it clear that we were far from the only company considering the move to Azure,” says Tom. “That gave us confidence that we could continue to rely on LACO for this new chapter in our technological journey.”

The first challenge was to establish a solid foundation in Azure for both the banking and insurance businesses. “We began by drawing up a comprehensive business case and defining a clear Cloud Adoption Strategy,” explains Tom. From there, LACO and Belfius developed a phased roadmap for migrating the on-premise data platform to Azure. This included designing a robust architecture and setting up a scalable data platform framework, supported by a thorough security assessment and an accompanying approach.

Next, Belfius and LACO launched a pilot with a small selection of applications. “By migrating those applications to Azure Synapse, we were able to gain valuable hands-on experience.” And through hands-on coaching and training, LACO familiarized Belfius with Spark and Python so that they could continue building on the new platform themselves.

The pilot took some of the pressure off the overall project. “We didn’t have to rush things,” says Tom. Once the pilot proved successful, Belfius accelerated the migration of the entire environment. “LACO’s specific experience with our insurance business was a major added value. We rewrote outdated components for the new Azure environment, which relies on Spark and Python. LACO played an important role in the associated analysis.”

The result: a high-performance, future-proof framework

LACO closely monitored the new framework’s performance. This proved to be a key consideration, and a major departure from the old on-premises setup. When working on-premises, the amount of computing power is inherently limited by the available hardware. In the cloud, computing power is virtually unlimited, but it comes at a price. LACO succeeded in striking the right balance between performance and cost. The solution on Azure Synapse is now faster than the previous on-premises environment, illustrating how LACO elevated Belfius’s cloud maturity while keeping costs under control.

Employee coaching also proved critical to the project’s success. Migrating to a completely new platform brings significant change. “It’s never easy to let go of a familiar environment,” says Tom. “LACO addressed employees’ questions with targeted coaching and acted as a bridge between departments. That approach made it clear that ultimately everyone benefits from the shift to Azure.”

With the migration, LACO played an important role in a key objective for Belfius: a future-proof data platform. “Self-service capabilities will increase significantly,” Tom notes. “The business now has more flexibility without having to call in the IT department at every turn.” This eliminates a significant amount of grunt work for the IT department.

In addition, insurance and banking now share a single data platform. “This means greater efficiency and allows us to roll out new data capabilities more easily to all data users, while also complying with all legal and regulatory requirements.” With the new data platform, Belfius is now in a strong position to fully – and rapidly – dedicate itself to innovation, including AI.

Ready to become a data-driven powerhouse?


How analytics engineering enables your business

Data and AI initiatives are scaling fast, but many organisations still find themselves stuck in the same place: waiting on the central data team. Every new dashboard, insight or metric request ends up in the same queue, handled by the same overstretched experts. Self-service BI and decentralised models sound great in theory, but in practice they often raise a new question: how do you empower more people to work with data without losing control over quality, security and consistency?

Analytics engineering is the missing piece. It bridges the gap between raw data and reliable insights, turning a well-intentioned mess into a governed, scalable and business-ready foundation.

The challenge

As organisations scale their data environments, traditional centralised models start to crack. The data team becomes a bottleneck, handling every extract, dashboard update and metric debate. In the rush to deliver, quick fixes pile up, leading to inconsistent logic, duplicated effort and KPIs that don’t quite match across teams.

Meanwhile, the pressure to enable self-service keeps growing. Business units want to move faster, but opening up access to raw or poorly modelled data only creates new risks: errors, misinterpretation, and dashboards that tell five versions of the truth. Add to that the growing complexity of data platforms, from warehouses to lakes to lakehouses on Microsoft Azure and Microsoft Fabric, and suddenly the data landscape feels more like a maze than a launchpad.

The real issue isn’t technology. It’s the lack of a scalable operating model that balances flexibility with control.

The solution

Analytics engineering provides that model. It’s the discipline that sits between data engineering and business analytics, focused on designing clean, reusable and trusted data products. Instead of building dashboards or pipelines in isolation, analytics engineers take ownership of the semantic layer — the structured, business-aligned view of the data that everyone can build on.

At LACO, analytics engineers create this layer with governance and quality by design. They embed validation rules, document logic, and make sure key definitions are consistent across domains. Rather than duplicating effort in every report, teams work with shared, curated datasets, which speeds up delivery and reduces rework.
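As a minimal sketch of what “governance and quality by design” can look like in practice, the snippet below keeps a metric’s business definition, owner and validation rules in one place; every metric, rule and field name here is an invented illustration, not LACO’s actual semantic model:

```python
# Sketch of a governed semantic layer: the registry documents the business
# definition and owner of each metric, while validate_row() enforces quality
# rules and derives the metric in one place. All names are illustrative.

SEMANTIC_LAYER = {
    "net_revenue": {
        "definition": "gross_revenue - refunds",
        "owner": "finance",
    },
}

def validate_row(row):
    """Apply embedded quality rules, then derive governed metrics in place."""
    errors = []
    if row["gross_revenue"] < 0:
        errors.append("gross_revenue must be non-negative")
    if row["refunds"] > row["gross_revenue"]:
        errors.append("refunds cannot exceed gross_revenue")
    # The derivation mirrors SEMANTIC_LAYER["net_revenue"]["definition"],
    # so every report consumes the same figure instead of re-typing the formula.
    row["net_revenue"] = row["gross_revenue"] - row["refunds"]
    return errors
```

The design point is that the `net_revenue` formula lives in one governed definition rather than being re-implemented per dashboard, which is exactly what keeps KPIs from drifting apart across teams.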

This approach is deeply integrated with Microsoft Azure and Microsoft Fabric. LACO combines warehousing and lakehouse expertise to design scalable, high-performance architectures that business users can actually understand. Starting from business problems, analytics engineers work backwards to define what data is needed, how it should be modelled, and how it can be safely exposed.

The central team shifts from reacting to every ticket to enabling self-service through strong foundations and clear guardrails.

The results

By introducing analytics engineering as a core capability, organisations remove the bottleneck without losing control. Business teams gain faster access to insights, working directly with trusted data products instead of relying on ad-hoc extracts and one-off reports.

The central data team gets breathing room to focus on long-term value instead of short-term fixes. Governance improves, definitions align, and KPI discussions move from “which number is right?” to “what should we do next?”.

Self-service becomes a scalable capability and not a source of chaos. Domain teams explore and innovate within a clear, governed framework. And with a strong Microsoft Azure and Microsoft Fabric foundation, the organisation is better prepared for future analytics and AI growth without having to rebuild the basics each time.

In short: analytics engineering turns your data team from a bottleneck into a strategic enabler.

Ready to remove your analytics bottleneck?

Want to see how analytics engineering can unlock governed self‑service in your organisation? LACO’s analytics engineers help you design and build robust data products on Azure and Microsoft Fabric, so your teams can move faster without losing control.


Marketing mix modelling

Marketing leaders today face a growing challenge: deliver measurable results in a landscape that’s more complex, fragmented and regulated than ever. Budgets are under pressure, while customer journeys span more channels and more blind spots than before.

With traditional tracking becoming less reliable, it’s harder to understand what’s truly driving performance. That’s where marketing mix modelling comes in. By connecting the dots between campaigns, spend and business outcomes, it brings clarity back to decision-making and replaces assumptions with insight.

The challenge

Today’s marketing landscape is a paradox: more channels, more data, yet less visibility. Tracking customer behaviour has become harder thanks to GDPR, strict consent rules and the looming end of cookies. At the same time, marketing spend is spread across a growing mix of online and offline touchpoints, making it difficult to see what’s really working.

Traditional tracking methods fall short. ROI and performance are often judged by what’s easiest to measure — last-click metrics, web analytics or internal assumptions — rather than by a complete, objective view. The result: fragmented insight, unclear attribution, and growing pressure on marketing leaders to justify budgets without solid evidence.

The solution

LACO helps organisations cut through this complexity with an AI-powered marketing mix modelling (MMM) approach, built on a robust Microsoft Azure and Microsoft Fabric foundation. Rather than relying on user-level tracking, MMM uses advanced statistical and machine learning techniques to connect consolidated marketing inputs with business outcomes like sales, leads or conversions.

  • It starts with the data. We build a secure, scalable platform on Azure to integrate all relevant marketing inputs — media spend, CRM, web and app analytics, and external data like seasonality or macro-economic indicators. The result: a centralised, governed environment that’s consistent, traceable and ready for modelling.
  • From there, LACO defines a unified marketing data model that brings together online, offline and contextual factors. Search, social, display, TV, radio, events. All channels are harmonised into one view, enabling consistent comparisons and meaningful insights.
  • We then apply AI-powered MMM models that estimate how each driver — from media spend to timing to external influences — contributes to business results. These models are deployed directly into the Microsoft Azure or Microsoft Fabric environment and made accessible via tools like Power BI, so marketing and business teams can explore insights independently. Clear visuals and practical explanations show how spend, saturation and timing affect performance.
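The modelling step above can be illustrated with a deliberately tiny MMM: ordinary least squares linking aggregated weekly channel spend to sales. The channels, coefficients and noise-free synthetic data are fabricated for this sketch; a production MMM would add adstock, saturation curves and external factors.

```python
import random

# Toy marketing mix model: estimate each channel's contribution to sales
# from aggregated weekly data by solving the OLS normal equations.
# Channel names and coefficients are invented for illustration only.

random.seed(42)
BASE, TV_COEF, SEARCH_COEF = 200.0, 0.5, 1.2

weeks = []
for _ in range(52):
    tv = random.uniform(0, 100)       # weekly TV spend (k EUR)
    search = random.uniform(0, 100)   # weekly search spend (k EUR)
    sales = BASE + TV_COEF * tv + SEARCH_COEF * search  # noise-free toy data
    weeks.append((tv, search, sales))

def fit_ols(rows):
    """Solve (X^T X) beta = X^T y for [intercept, tv, search]."""
    X = [[1.0, tv, se] for tv, se, _ in rows]
    y = [s for _, _, s in rows]
    n = 3
    A = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(n)]
         for i in range(n)]
    b = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(n)]
    for col in range(n):                      # forward elimination with pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * c for a, c in zip(A[r], A[col])]
            b[r] -= f * b[col]
    beta = [0.0] * n
    for i in reversed(range(n)):              # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, n))) / A[i][i]
    return beta

intercept, tv_coef, search_coef = fit_ols(weeks)
```

On this clean synthetic data the fit recovers the true baseline and per-channel coefficients, which is the decomposition that the Power BI visuals described above would present as channel contributions.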

A key feature?
Scenario planning. Decision-makers can run what‑if analyses through a conversational interface that acts as an AI agent for the marketing organisation, testing budget shifts, channel reallocations or campaign timing. What happens if we boost spend on social and cut TV? Launch a promo earlier? Shift regional targeting? These simulations support smarter, evidence-based decisions before money is spent.
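A what-if simulation of this kind can be sketched in a few lines, assuming a logarithmic saturation response; the channel parameters below are invented for illustration and are not LACO’s actual model:

```python
import math

# What-if budget simulation with diminishing returns. The log-saturation
# response and the per-channel parameters are invented assumptions used to
# illustrate scenario planning.

CHANNELS = {
    # channel: (effectiveness, saturation point) -- fabricated values
    "tv":     (120.0, 50.0),
    "social": (150.0, 30.0),
}

def predicted_lift(budget):
    """Expected incremental sales for a budget allocation {channel: spend}."""
    return sum(
        eff * math.log1p(budget.get(channel, 0.0) / sat)
        for channel, (eff, sat) in CHANNELS.items()
    )

current = {"tv": 100.0, "social": 20.0}
shifted = {"tv": 80.0, "social": 40.0}   # move 20 units from TV to social
```

With these toy parameters, moving spend toward the less-saturated channel raises the predicted lift at the same total budget; the same comparison, run at scale on fitted models, is what the scenario interface surfaces before any money is committed.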

Crucially, the solution is transparent and privacy-friendly. Instead of black-box algorithms built on personal data, LACO’s MMM approach relies on aggregated, governed inputs. Assumptions, sources and model logic are documented, building trust and ensuring compliance, even as regulations evolve.

Finally, MMM models feed into a predictive ROI engine that forecasts the impact of future marketing investments. Budget planning becomes proactive, grounded in data rather than gut feeling. Finance and marketing teams get forward-looking guidance on where to invest, at what level, and with what expected return.

The results

With AI-driven MMM on a strong Microsoft Azure and Microsoft Fabric data platform, organisations move from reactive reporting to confident, evidence-based marketing decisions. ROI attribution becomes transparent across all channels, enabling smarter budget allocation and better performance, often without increasing spend.

Forecasting improves, planning becomes more strategic, and marketing finally earns its place as a measurable growth driver. Because the entire approach is built on governed, aggregated data, organisations stay agile and compliant even as privacy rules continue to change. The end result? Clarity. Control. And a marketing function that moves from guessing to knowing.

Ready to bring clarity to your marketing mix?

LACO helps you build an AI-powered marketing mix modelling framework on Microsoft Azure and Microsoft Fabric, turning fragmented marketing data into transparent ROI insights and forward-looking scenarios for smarter budget decisions.


AI readiness: are you solving the right problems?

Everyone’s talking about AI. From the boardroom to the breakroom, it’s being pitched as the next big thing. But between the buzzwords and bold promises, one question often gets lost: are we solving real business problems, or just playing with shiny tools?

For many organisations, the excitement around AI has led to rushed pilots, unclear goals and underwhelming results. Without a strong link to decisions, data and adoption, even the most promising AI use cases fall flat. It’s time to get practical.

The challenge

AI has officially made it to the boardroom. It’s no longer just a pet project for innovation labs or data scientists with too much time on their hands. But as the hype grows, so does the risk of missing the point. Too many organisations still approach AI as a technology exercise, not a business change.

Ambitious projects take off — generative AI, copilots, predictive models — but without a clear link to decisions, processes or outcomes. Budgets get eaten, time disappears, and valuable expert capacity is spent on pilots that never leave the lab.

The most common pitfalls? Unclear business relevance, poor data quality, and solutions that simply don’t fit how people actually work. Somewhere along the way, teams discover that the required data doesn’t exist or isn’t reliable, that the Azure or Fabric setup can’t support what’s needed, or that users don’t trust — let alone understand — the AI output.

The result? A trail of disconnected pilots, sceptical stakeholders and the growing sense that “AI is expensive and doesn’t deliver”.

The solution

Before you build, pause. LACO uses a practical, structured framework built around three deceptively simple questions:

  1. Does it truly matter for your business?
    Start with problems worth solving. Together with business and IT stakeholders, we identify the decisions and processes where AI can actually make a difference. Is the goal to reduce manual work, improve forecasting, detect risks earlier or personalise customer interactions? By defining what success looks like – fewer errors, shorter lead times, higher conversion, lower cost – we make sure the initiative is anchored in business priorities, not technology curiosity.
  2. Do you have the data and platform to make it work?
    Next, we assess the current data landscape and platform readiness. Are the required data sources available, reliable and governed? Can they be integrated into your Azure or Fabric environment? Are performance, security and cost manageable? This step covers everything from data models and pipelines to monitoring and lifecycle management. The goal: ensure every use case is grounded in a solid, scalable foundation.
  3. Will people actually use it?
    AI that nobody uses is just an expensive demo. That’s why we consider adoption from day one. Who will use the solution? How will it impact their daily work? What’s needed in terms of transparency, controls and training? We think through user journeys, interfaces (Power BI, Fabric, apps…) and guardrails, so AI becomes part of the process and not something bolted on as an afterthought.

These three questions address business value (viability), data and platform readiness (feasibility) and user adoption (desirability). LACO applies this framework through focused workshops and assessments, always grounded in your existing Microsoft Azure and Microsoft Fabric setup.

The outcome? A shortlist of AI use cases that are technically achievable, strategically relevant and supported by the data and governance to actually succeed.
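A toy version of that screening, with the three lenses reduced to 1–5 scores and the weakest lens acting as the gate; both the scoring scale and the example use cases are invented for illustration:

```python
# Screen candidate AI use cases against the three lenses discussed above
# (viability, feasibility, desirability). A use case only survives if it
# clears the threshold on ALL lenses -- one weak dimension sinks it.

def shortlist(use_cases, threshold=4):
    """Keep use cases scoring >= threshold on every lens (1-5 scale)."""
    return [
        uc["name"]
        for uc in use_cases
        if min(uc["viability"], uc["feasibility"], uc["desirability"]) >= threshold
    ]

candidates = [
    {"name": "invoice matching copilot", "viability": 5, "feasibility": 4, "desirability": 4},
    {"name": "churn prediction",         "viability": 4, "feasibility": 2, "desirability": 5},
    {"name": "chatbot for everything",   "viability": 2, "feasibility": 3, "desirability": 2},
]
```

Gating on the minimum score, rather than the average, encodes the lesson from the pitfalls above: a use case with great business value but unreliable data (or no adoption path) still belongs in the “wait” pile.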

The results

Organisations that apply this framework move beyond experimentation. Instead of spreading resources across disconnected pilots, they focus on a small number of use cases with real business impact. Each initiative is backed by the right data, a scalable platform and clear outcomes — turning AI from a theoretical exercise into a strategic tool.

By building on existing Microsoft Azure and Microsoft Fabric components, time to value is shortened and adoption becomes easier. Business users are involved from the start, ensuring trust, usability and relevance. The result? A portfolio of AI solutions that deliver measurable value and actually support day-to-day decisions.

Ready to assess your AI readiness?

LACO helps you separate hype from real opportunity by mapping your business priorities, data readiness and Azure / Microsoft Fabric platform capabilities. Together, we identify where AI can create tangible value today — and where it should wait.


Moving your SAS platform to the cloud: business lessons learned

Once you’ve moved your data platform to the cloud, your work as an IT professional tends to get a lot easier. But to be honest, getting that platform there in the first place can be quite a daunting task. How to tackle that?

Now, when it comes to SAS migration in general, LACO has always been something of a pioneer on the Belgian market. As a matter of fact, LACO was the very first SAS partner to successfully deliver a SAS cloudification project in Belgium. Here are three important lessons we’ve learned from our experience with SAS cloudification so far. You might use them to your advantage!

Lesson 1: Take on the legal and regulatory hurdles from the start

The days when IT professionals seriously worried about cloud security are long past. Most of us have come to trust the high level of security built into the major cloud platforms, which protect data by design and by default when used correctly. Unfortunately, however, that hasn’t stopped some of our business colleagues from still worrying about these issues. Their fears need to be acknowledged too, of course. But then it is up to us to address those concerns with the clear, hard facts we have at our disposal today. Even more so, as those persistent fears could turn out to be a real showstopper. If your CFO is not comfortable with moving their data to the cloud, for instance, then your project risks never taking off in the first place.

And talking about showstoppers: in certain sectors, such as the insurance industry, there is a set of mandatory legal rules and government restrictions that you absolutely have to take into account, before you can even think of moving your data platform to the cloud. These compliance demands are not insurmountable, but they will require you to obtain several official approvals, sometimes even undergoing a risk assessment. And that usually takes time, as there are no shortcuts or detours for it. Which is why, in these specific sectors, we always start a cloudification project by tackling the legal and regulatory hurdles. If these cannot be overcome, the project simply cannot move ahead.

Lesson 2: Be clear on the business case

A popular misconception that we often come across, even though people should really know better by now, is the idea that running your IT infrastructure in the cloud is by definition cheaper than running it in your own data centre – or having a hosting partner run it for you. In our experience, however, simply migrating your servers to the cloud rarely brings any real value to your business, not even from a purely financial perspective. On the contrary: it often turns out to be more expensive.

Make no mistake, however: this only applies if you use the cloud the way you would use your former on-premise data centre, leaving all your servers running 24/7 all year round. The great thing about the cloud is that it allows you to run only those servers you require, switching systems on and off at a moment’s notice. It basically lets you add or remove hardware resources in line with your actual computing needs. So if, say, you have a reporting environment that is only used intensively by your business colleagues during working hours, you can decrease the server capacity for that environment before and after those hours. Another typical example for cost savings is a testing environment. Instead of keeping it running all the time, even during weekends, you could limit yourself to using that part of your infrastructure only when you actually need to do some testing.
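A back-of-the-envelope sketch of the reporting-environment example makes the cost mechanics concrete; the hourly rate and schedule below are assumptions, and real Azure pricing depends on service, tier, region and reservation model:

```python
# Compare the annual compute cost of running a reporting environment 24/7
# versus only during working hours. The rate is an assumed flat figure for
# illustration, not an actual Azure price.

HOURLY_RATE = 2.40                                # EUR/hour, assumed compute tier
WEEKS_PER_YEAR = 52

always_on_hours = 24 * 7 * WEEKS_PER_YEAR         # every hour of the year
business_hours = 12 * 5 * WEEKS_PER_YEAR          # 07:00-19:00, weekdays only

always_on_cost = always_on_hours * HOURLY_RATE
scheduled_cost = business_hours * HOURLY_RATE
savings = 1 - scheduled_cost / always_on_cost     # fraction of compute spend avoided
```

Running the same tier only twelve hours on weekdays covers roughly 36% of the hours in a week, so compute spend for that environment drops by about 64% before any other optimisation is even attempted.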

So by using that flexibility, which is typical of the cloud, you can effectively optimise and strengthen your financial business case. Nevertheless, if you ask us, the real key to cloud success lies beyond the financials: approaching SAS cloudification not as a migration project but as an optimisation project. Instead of regarding the cloud merely as an alternative to your own data centre or that of your hosting provider, in a way continuing what you’ve always been doing, you should treat it as a springboard to a new world with possibilities you could only dream of before.

If you look at the cloud for new capabilities and extra functionality, you might just discover that there are applications and functionality within your reach, such as advanced disaster recovery features, that you would never have been able to deploy with just your own data centre.

Lesson 3: Match your licensing models

Moving your SAS data platform to the cloud also requires matching the different licensing models. This is especially challenging when you’re dealing with an older licensing model for your data platforms, since these older models – and not just those used by SAS – are still very much bound to physical hardware such as CPU cores. That is not necessarily the case, of course, with virtualised and cloud environments, where usage- and client-based licensing models continue to grow in popularity.

Matching licensing models is somewhat less challenging, as you can probably imagine, for those customers who have already moved on to SAS’ latest data platform: SAS Viya. Running on a scalable, cloud-native architecture, SAS Viya is an open and cloud-ready platform. Consequently, SAS Viya customers benefit fully from an easier cloud migration when it comes to licence fees.

The exercise of matching your software vendor’s licensing agreement with your needs in terms of scalability and elasticity has to be done right from the start, as it might be another showstopper. Therefore, we invariably advise our customers to reach out and establish a satisfying agreement with their vendor and, in some cases, even their cloud provider. Not only do they usually have a number of licensing programmes to choose from, sometimes with discounts that customers can profit from; they can also help to establish a smooth transition period. After all, you don’t move to the cloud overnight, do you?

So much for the business lessons learned from our SAS cloudification projects. Feel like diving a little deeper into the actual technology? Head over to our blog post with the technical lessons learned. But first: check out our SAS cloudification page!


ESG reporting with solid data governance

ESG reporting has entered a new era. With CSRD and ESRS, organisations are now required to treat sustainability information with the same level of rigour, traceability and reliability as financial data. This shift demands more than templates or new reporting tools. It requires a strong data foundation, clear ownership and a governance model that unites people and processes across the organisation.

For many organisations, this exposes long-standing weaknesses. ESG data is often fragmented, inconsistent and managed through spreadsheets or informal processes. Metrics do not align, definitions vary by team and nobody fully owns the quality or the outcome.

ESG reporting only becomes credible when the data behind it is governed, repeatable and trusted. That is where LACO makes the difference.

The challenge

ESG reporting is no longer optional. Under CSRD and ESRS, organisations must report on more than eighty indicators that cover environmental, social and governance themes. Each of these indicators must be reliable, audited and traceable back to its source.

However, most organisations are not ready for this level of scrutiny. ESG data is scattered across HR, finance, procurement, operations and sustainability teams. Each department works with different definitions, different formats and different processes. Key metrics live in silos, ownership is unclear and reporting relies heavily on manual, error-prone Excel files.

Without proper governance, ESG becomes chaotic. Indicators conflict, data lineage is missing, validation does not happen and engagement stays low because teams see ESG as administrative work rather than an essential part of business strategy.

The challenge is not the regulation itself. It is the lack of a stable, governed data foundation that can support consistent and meaningful ESG reporting.

The solution

LACO helps organisations build ESG reporting that is credible, consistent and sustainable by focusing first on governance, ownership and structure. Technology only becomes relevant once these foundations exist.

  • We begin by clarifying scope and responsibility. Together with internal teams, we map the CSRD obligations and define clear ownership for every ESG data domain. This gives structure to input, validation and review.
  • Next, we design a practical ESG data framework based on LACO’s expertise in data strategy. This model brings together existing systems, manual sources and business rules into one logical structure that aligns reporting requirements with organisational goals.
  • Once governance and definitions are in place, we standardise and automate data flows within a modern data architecture. Microsoft Fabric, Microsoft Azure or Databricks can be used as part of this foundation, but it is never the starting point. The structure follows the governance principles, not the other way around.
  • Throughout the process, we work on adoption. Sustainable ESG reporting depends on people who trust the data and understand their role. LACO’s change management practice supports communication, collaboration and the training of data stewards so that the process becomes part of daily operations.
  • Finally, ESG frameworks are refined over time through continuous improvement sessions, ensuring that the organisation evolves with new indicators, regulatory changes and internal expectations.
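The ownership and traceability checks from the first steps above can be sketched as a simple audit pass; the indicator codes loosely follow ESRS numbering, but the records themselves are invented examples:

```python
# Minimal sketch of the governance gate described above: every ESG indicator
# needs a named owner and a documented source (lineage) before it can enter
# the report. Codes, owners and sources are illustrative, not real data.

INDICATORS = [
    {"code": "E1-6", "name": "gross Scope 1 GHG emissions", "owner": "operations", "source": "energy_ledger"},
    {"code": "S1-9", "name": "work-related injuries",       "owner": "hr",         "source": "incident_register"},
    {"code": "G1-4", "name": "confirmed corruption cases",  "owner": None,         "source": None},
]

def audit_readiness(indicators):
    """Split indicators into report-ready and blocked (missing owner or lineage)."""
    ready, blocked = [], []
    for ind in indicators:
        if ind["owner"] and ind["source"]:
            ready.append(ind["code"])
        else:
            blocked.append(ind["code"])
    return ready, blocked
```

Running a check like this per reporting cycle turns “ownership is unclear” from a vague complaint into a concrete worklist for the data stewards.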

Results

With governance, ownership and adoption in place, ESG reporting becomes a reliable and repeatable process. The organisation moves from fragmented and manual work to a structured system where data quality is high and reporting is audit ready.

Teams work with aligned definitions and shared responsibility instead of isolated spreadsheets. Manual effort decreases as data flows become standardised and automated.

Most importantly, ESG transforms from a compliance obligation into a strategic capability. Trusted indicators support decision making, non-financial insights gain credibility and leadership can act on clear, consistent and validated information.

The organisation becomes ready for both current and future reporting requirements, with a flexible ESG data system that grows as regulations evolve.

Ready to make ESG reporting a business strength?

LACO helps you build ESG frameworks that are credible today and flexible enough for tomorrow. From first steps to full automation, we make ESG reporting practical, trusted and manageable.


Building a data governance framework for Federale Verzekering-Assurance

To harness the power of data and fortify its position in the market, Federale Verzekering-Assurance called on the data intelligence experts at LACO. “In order to gain the most value from our data assets and manage them efficiently, we needed a structured approach to data management,” says Amandine Rouvroy, Chief Data Officer at Federale Verzekering-Assurance. Using its battle-tested data governance framework, LACO tailored a data governance strategy that provided the insurer with the strong foundation it required for ongoing data governance.

Setting the stage:

Becoming data-driven

Established in 1911, Federale Verzekering-Assurance is a mid-sized group that employs more than 600 people and had a consolidated balance sheet of €4.1 billion in 2022. The long-standing insurer is well accustomed to adapting to changing market needs. And in today’s business climate, this means transforming into a data-driven organization.

In September 2021, Federale Verzekering-Assurance launched Shape25, an ambitious and strategic program that includes a large-scale digitalization effort. The insurer’s corporate strategy rests on three pillars: “Eliminate complexity,” “From transaction to relation,” and “Build mutual trust.” In other words, by the end of 2025, it aims to simplify its customer relationships, offer more contact options, and create more mutual trust.

The problem: Data quality issues and increasing regulatory pressure

For the insurer’s customer-oriented and data-driven corporate strategy to succeed, reliable data is essential. But ensuring this reliability hasn’t always been easy for the sizeable company: “If you’re working in a silo and only using data from one department, it’s not that hard to ensure data quality,” explains Amandine. “But in a large company, we work with transversal data across multiple departments, such as when preparing marketing campaigns. This makes maintaining data quality more difficult. For example, we’ve had an issue with customer email addresses. They’re used by almost all our services but stored in separate databases. And those databases don’t always communicate seamlessly with one another.”

Along with data reliability, the insurer must also contend with increasing regulatory pressure. “Not only does the National Bank of Belgium impose data quality requirements, but the EU has also introduced new regulations such as DORA, as well as mandatory sustainability reporting in the form of ESG disclosure,” continues Amandine. “Without proper data management and governance, it would be very difficult to meet these requirements.”

The result: A tailored data governance framework

To design a tailored data governance framework, LACO first creates some basic building blocks, such as strategy, organization, directives, measurement, technology, communication, and change management. These are the stepping stones that lead to success, and it’s important to put as many of them in place as possible within the first few months.

At Federale Verzekering-Assurance, this approach resulted in concrete deliverables for each building block, including:

  • a data governance charter (strategy) with an associated operating model (organization);

  • an example of a data quality policy (directives);

  • a measurement dashboard metrics definition (measurement);

  • a business glossary and data dictionary (technology);

  • a communication plan and training plan (communication);

  • a change plan (change management).
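As a toy illustration of the business glossary and data dictionary deliverable (and of the email-address problem described earlier), the sketch below separates the business definition of a term from the technical fields that implement it; all systems, tables and columns are invented:

```python
# Business glossary: what a term MEANS, in business language.
# Data dictionary: WHERE that term physically lives across systems.
# Keeping the two linked is what lets transversal teams find and reconcile
# the same data held in separate databases. Entries are invented examples.

GLOSSARY = {
    "customer email": "the address a customer has designated for official communication",
}

DATA_DICTIONARY = {
    "customer email": [
        {"system": "CRM",     "table": "contacts", "column": "email_primary"},
        {"system": "billing", "table": "accounts", "column": "contact_mail"},
    ],
}

def locations(term):
    """List every physical field that implements a business term."""
    return [f'{f["system"]}.{f["table"]}.{f["column"]}' for f in DATA_DICTIONARY.get(term, [])]
```

In the customer-email scenario from the case, a lookup like this immediately shows every database holding the address, which is the starting point for reconciling them.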

The data governance working groups at the company were also able to immediately put many of these deliverables into practice through real use cases.

With these foundational data governance capabilities well and truly established, Federale Verzekering-Assurance has hit the ground running. “LACO gave us a solid foundation for data governance and a customized data governance strategy,” concludes Amandine. “With this, we’ve been able to further develop our capability catalog and create a multi-year roadmap for its delivery—an achievement that wouldn’t have been possible without the foundational work performed by LACO.”

Building a tailored data governance framework?
