
Power BI | From data to insight

This training, exclusively organized for your company, provides a structured and practical introduction to Microsoft Power BI, focusing on how to transform raw data into clear, reliable, and actionable insights.

Participants will learn not only how to build reports, but also why certain modelling and design choices matter in a professional BI environment.
The course combines conceptual foundations with hands-on exercises, ensuring that knowledge can be applied immediately in real-world scenarios.
We tailor this training fully to your needs and your environment.

What you’ll learn

  • Understand the Power BI ecosystem (Desktop, Service, data refresh, sharing)
  • Import and transform data using Power Query
  • Build a solid data model using best practices (relationships, star schema)
  • Create DAX measures for calculations and KPIs
  • Design clear, user-friendly, and performant reports
  • Apply basic performance and usability principles
  • Publish and share reports securely within an organisation

Who should attend

Data analysts and reporting professionals

Business users who want to create their own reports

Consultants and BI developers new to Power BI

Anyone working with Excel, SQL, or other reporting tools who wants to move to Power BI

Programme

Introduction & Architecture: Power BI overview, components and typical BI workflows

Data Preparation (Power Query): Data connections, cleaning, shaping, combining and reusable transformations

Data Modelling: Fact/dimension tables, relationships, cardinality and common pitfalls

DAX Fundamentals: Measures vs calculated columns, core functions (CALCULATE, FILTER, time intelligence) and row vs filter context

Report Design & Visualisation: Choosing the right visuals, layout, storytelling and user interaction (filters, slicers, drill-through)

Publishing & Sharing: Power BI Service, workspaces, access rights and refresh strategies

Content is tailored to participants’ level and preferences.

Location

This training can be held at the LACO office or at your training facilities.

FAQ

What is the required level of prior knowledge or experience for this training?

No specific prior experience is required. A basic familiarity with general data concepts and terminology (such as data analysis, modelling, or reporting) is helpful, but the bootcamp is designed for both beginners and those with some hands-on experience.

Is lunch, coffee, or catering included in the price?

When the training is organized at LACO training facilities, lunch and beverages are provided.

Will the training language always be English (or Dutch/French)?

Yes, the training is delivered in English. On demand, and for specific groups, a Dutch or French session may be arranged. Please let us know your preference upon registration.

What is the duration of the training?

The duration of the training depends on the content that is tailored for you.

Ready to level up your data skills?

Start the conversation.

You know your team, we know our training. Tell us who should join, when it suits you, and where you’d like the session – at LACO or on-site. We’ll take it from there.
More information about how we handle your data can be found in our privacy policy.


Improved customer service based on embedded Power BI

All the benefits of Power BI’s flexibility and user-friendliness for data visualisation, but without the investment in thousands of software licences. That’s the key question – maybe not literally, but still quite close – that LACO answered for Connecting-Expertise (CE). How did we do it? We embedded Power BI into CE’s customer web portal.

Power BI is the way to go for data visualisation. Not just for internal use, as we wrote in this blog in our Power BI series, but also to share reports beyond the borders of the organisation, with customers and partners. That’s exactly what Connecting-Expertise’s customer was after. CE builds software solutions that help companies optimise and facilitate sourcing, contracting and managing their contingent workforce. Operational since 2007, CE is a pioneer in the Belgian market with a leading solution for contingent labor supply contracts.

Until recently, CE provided data to some of its clients, allowing them to build reports on their own. However, as each data file was client-specific, the preparation of these files took quite some time. To further extend its service, CE was looking for a more efficient way to offer clients a set of standard reports and dashboards, based on client-specific data. LACO suggested embedding Power BI as a reporting environment in CE’s web-based software platform, to unlock the full potential of the data and provide new insights to the clients.

Sturdy, cost-efficient architecture

We designed the data architecture and set up an Azure environment for the Power BI embedded service. We chose to set up the sturdiest possible technical architecture, by hosting a VM with MySQL, avoiding the need to migrate local MySQL data to an Azure data warehouse. In doing so, we kept a strong focus on the usage of Azure resources. Even today, we are still able to host all reports using the lowest – and cheapest – tier for the Power BI embedded service on Azure.

To build the right reports, we assisted CE with defining the KPIs that would offer the most added value to the clients. Extra attention was given to security: it is of the utmost importance that clients using the dashboards and reports only have access to their own data. To get that right, we implemented the complex Row Level Security model CE offers to its clients. Row Level Security grants access at various levels – personal or division level – and for various types of data. As an interesting side effect, we used Row Level Security as an easy solution for offering and maintaining multi-language reports. How? We unravel that in this blog for you!
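To make the idea concrete, here is a minimal sketch of row-level filtering at the client and division level. This is an illustration of the concept only – the data, user model and function names are assumptions, not CE's actual implementation (in Power BI itself, Row Level Security is defined as DAX filters on security roles).

```python
# Conceptual sketch of row-level security: each user sees only the rows
# matching their permissions. All names and levels are illustrative.

rows = [
    {"client": "acme", "division": "north", "spend": 120},
    {"client": "acme", "division": "south", "spend": 80},
    {"client": "globex", "division": "north", "spend": 200},
]

def visible_rows(rows, user):
    """Keep only rows the user may see: client-level access,
    optionally narrowed to a single division."""
    out = [r for r in rows if r["client"] == user["client"]]
    if user.get("division"):  # division-level permission
        out = [r for r in out if r["division"] == user["division"]]
    return out

# A client-level user sees both divisions; a division-level user sees one.
print(len(visible_rows(rows, {"client": "acme"})))                       # 2
print(len(visible_rows(rows, {"client": "acme", "division": "north"})))  # 1
```

The same filter-by-permission mechanism is what makes the multi-language trick possible: a "language" attribute on the user can drive which set of label rows they see.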

Embedded reports and dashboards

As a final step in the project, we assisted CE with embedding the reports in their customer portal. Since this was the first implementation of this type in Belgium, Microsoft lent us a helping hand as well. As a result, Power BI reports and dashboards are now available as an embedded service on the CE customer portal. Only clients with a CE portal login have access to the reports and dashboards, which are solely based on their own data. Furthermore, data is filtered on row level, based on the client’s permissions. For CE, the availability of the Power BI reports and dashboards strengthens the status of its leading solution for contingent labor supply contracts.


Embedding Power BI in your company portal

Static, pre-defined reporting is so 2010! Self-service BI is today’s standard. But in an environment with literally thousands of users, even small licence fees add up to high costs. Embedding Power BI offers an interesting – and cost-effective – way around this.

Unlimited number of users

Part of a company’s digital strategy is convenience for the customer. A user-friendly app or webpage makes the life of the customer easy. Depending on the company’s business, the app may show all sorts of information, from purchased books to energy usage or data consumption. The information – based on data visualisation – reflects the customer’s behaviour, which may be of great value for that customer and underlines the company’s unique selling proposition.

Power BI enables this type of visualisation at a very low cost and makes it available for an unlimited number of users. For this type of scenario, we at LACO choose to embed Power BI in the company’s web portal. Thanks to role-based access, users can only work with the data they’re allowed to see. But more importantly: users have access to the navigation functionalities of the tool, to make selections, drill down, and more. And what’s more, we even found a clever way to make multi-language availability of the reports easy.


Example of Power BI dashboard embedded in a company portal. Source: Microsoft.com.

Keeping licence costs under control

But unfortunately, there’s no such thing as a free lunch. Although licence costs per user may be low, when there are thousands – or even tens of thousands – of users, the total cost quickly spins out of control. To avoid that, the trick is to define a small number of power users and provide them with full self-service functionalities and capabilities. The reporting they come up with – at a low total licence cost – can then be shared with the large audience of ‘visual-only’ customers through the company’s portal. By embedding data visualisation in the portal, a lot of the user functionalities with regard to visualisation are still usable – such as navigating, selecting, and filtering data – without the need to pay licence fees for every one of those end users.
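The arithmetic behind that trick can be sketched in a few lines. All figures below are hypothetical round numbers for illustration – they are not Microsoft list prices, and actual licence and capacity costs vary by tier and region.

```python
# Illustrative cost comparison: a per-user licence for every portal user
# versus a handful of power-user licences plus one embedded capacity fee.
# All prices are assumed round numbers, not actual Microsoft pricing.

def per_user_cost(users, licence_per_user):
    """Monthly cost if every end user needs an individual licence."""
    return users * licence_per_user

def embedded_cost(power_users, licence_per_user, capacity_fee):
    """Monthly cost with a few licensed power users plus embedded capacity."""
    return power_users * licence_per_user + capacity_fee

users = 10_000
monthly_per_user = per_user_cost(users, licence_per_user=10)
monthly_embedded = embedded_cost(power_users=5, licence_per_user=10,
                                 capacity_fee=750)
print(monthly_per_user, monthly_embedded)  # 100000 vs 755
```

Even with generous assumptions for the capacity fee, the embedded model stays orders of magnitude cheaper once the audience runs into the thousands.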

The only thing users can’t do is create new reports. To make that happen, every user would indeed need a Power BI licence. Another way to keep costs under control is to set up Power BI correctly on Azure. We share more about that – and some of the other technicalities, including coping with multi-language reporting – in one of the other posts in our Power BI blog series.


Enabling the digital strategy

For the Chief Digital Officer, Power BI is an extra tool that helps enable the company’s digital strategy. Making reports available for customers and partners, based only on the data they are allowed to see, used to be quite tricky. With Power BI, that’s no longer the case. But what’s the catch, you might ask? Well, your data is at the core of every report or visual that Power BI produces. Did you get your data platform sorted? Then you can start leveraging the possibilities of Power BI.

Using Power BI in the way we described in this blog post opens up the standard visualization capability of this powerful technology. No need for coding! No hard prioritisation of scarce IT time! No weeks of waiting time for a new report and no complexity leading to outrageous costs… By means of embedding Power BI, standard internal functionality is tunnelled through the Internet to the company’s customers, without the risks of the past, such as complex programming and cumbersome data preparation. Who would have thought, back in 2010?


Compliance and operational reporting: from fragmented data to trusted insight

Compliance and operational reporting are becoming more demanding as regulators, auditors and boards expect timely, consistent and explainable numbers, supported by strong risk data aggregation and governance. Organisations must show not only what they report, but also how figures are derived, aggregated and controlled across systems – reflecting principles found in BCBS 239 and broader RDARR guidelines.

At the same time, many reporting landscapes are still built on a mix of legacy platforms, local extracts and spreadsheets, making it difficult to guarantee data quality, lineage and governance end to end when supervisors or internal audit start asking detailed questions.

The data challenge

Behind every compliance report sits a data problem:

  • Critical metrics (financial, risk, ESG, operational KPIs, customer or product metrics) are sourced from different systems, with overlapping or conflicting definitions, leading to inconsistencies between regulatory, risk and management reports.

  • Data moves through multiple steps – ingestion, transformation, aggregation – without consistent documentation or automated controls, so it is hard to trace how a figure in a report links back to the original transaction, which BCBS 239‑style principles explicitly expect.

  • Reporting teams depend on manual reconciliations and ad hoc SQL or Excel logic that only a handful of people fully understand, increasing key person risk and making it harder to evidence robust risk data aggregation.

As reporting requirements grow in volume and granularity under RDARR‑inspired expectations, these data issues become more visible. Organisations need reporting that is faster and more flexible, but also demonstrably governed: complete, accurate, consistent and explainable to internal and external stakeholders.

The solution: a governed data and reporting layer

LACO helps organisations redesign their operational and compliance reporting around a governed data foundation, using modern cloud technologies such as Microsoft Azure, Microsoft Fabric, Power BI and Azure Databricks.

The goal is to create a single, reliable layer where critical data is integrated, modelled and controlled, and from which both day‑to‑day operational reports and BCBS 239 / RDARR‑aligned compliance reports can be served.

Concretely, this means:

  • Data integration: ingesting source data from core systems into a central, secure data platform (for example using Azure Data Lake, Azure Data Factory or Synapse pipelines), with clear ownership and access controls that support regulatory expectations on data governance.

  • Semantic and modelling layer: building governed data models that standardise key definitions – such as exposures, limits, revenue, cost, ESG indicators or operational risk metrics – so the same trusted data feeds BCBS 239 reports, RDARR‑driven risk dashboards and management reporting.

  • Reporting and visualisation with Power BI: exposing governed datasets to business and compliance users via Power BI, with role‑based access, row‑level security and reusable report templates for recurring regulatory and internal reporting cycles.

  • Built‑in data quality, reconciliation and lineage: embedding checks, reconciliations and metadata so teams can trace any reported figure back to its sources and transformation logic, and can demonstrate that data is complete, accurate and consistent – core BCBS 239‑style requirements.
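The reconciliation control in the last bullet can be sketched very simply. This is a minimal illustration of the pattern – tie an aggregated, reported figure back to the sum of its source transactions within a tolerance – with assumed field names, not a fragment of any specific LACO implementation.

```python
# Sketch of an automated reconciliation control: the aggregated figure
# in a report must tie back to the sum of its source transactions.
# Field names ("amount") and the tolerance are illustrative assumptions.

def reconcile(source_rows, reported_total, tolerance=0.01):
    """Compare the detail sum to a reported total.
    Returns (ok, difference) so a failing check can be logged and routed."""
    detail_total = sum(r["amount"] for r in source_rows)
    diff = detail_total - reported_total
    return abs(diff) <= tolerance, diff

transactions = [{"amount": 100.0}, {"amount": 250.5}, {"amount": 49.5}]

ok, diff = reconcile(transactions, reported_total=400.0)
assert ok  # detail sum of 400.0 ties back to the reported figure

ok, diff = reconcile(transactions, reported_total=395.0)
assert not ok  # a 5.0 gap is flagged for investigation instead of silently passing
```

Embedding checks like this in the pipeline – rather than in someone's spreadsheet – is what turns "we reconciled it once" into evidence that every reporting run was reconciled.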

By placing this governed layer at the centre, work shifts from rebuilding logic in each reporting tool to modelling and governing data once and reusing it many times – for regulatory risk reporting, RDARR‑aligned aggregation, internal risk dashboards and operational steering.

Result: explainable, BCBS 239 / RDARR‑ready reporting

Compliance and operational reporting become more repeatable, explainable and resilient, and better aligned with BCBS 239‑ and RDARR‑style expectations.

Reporting teams work with a single set of validated data and definitions, reducing inconsistencies between reports and limiting discussions about which number is the “right” one, both internally and with supervisors.

Business, risk and compliance users gain access to controlled data through modern tools, without bypassing the underlying governance, quality checks or lineage.

Organisations can adapt more easily to new reporting requirements or additional disclosures, because the underlying data architecture and technology stack are already designed for scalability, governance and reuse – creating a reporting landscape that not only supports today’s BCBS 239 / RDARR‑inspired demands, but is also ready for further digitalisation, stricter data rules and new forms of analytics and AI.

The reporting transformation becomes an engine for agility and trust, ready to support future regulatory change.

Ready to strengthen your compliance reporting?

LACO helps you move from fragmented data and manual reconciliations to a governed, Azure‑based reporting platform with clear lineage, consistent definitions and BCBS 239 / RDARR‑ready insight for your stakeholders.


Integrating SAS with Microsoft Azure

Many organisations rely on SAS as a trusted engine for analytics, reporting and modelling. At the same time, business users increasingly expect the modern flexibility of Microsoft Fabric and Azure Databricks. They want interactive dashboards, faster access to insights and a unified view across teams. This creates a gap between what the organisation already depends on and what the business now requires.

By integrating SAS with Microsoft’s cloud ecosystem, organisations gain the best of both worlds: a governed analytics engine on one side and a streamlined approach that minimises migration investment and accelerates change adoption on the other.

The challenge

SAS remains a powerful platform for processing and modelling, yet it was not built for today’s expectations around real time insights, cloud scalability and self service analytics. As a result, organisations end up switching between a central data warehouse (SAS DI) and end-user compute (SAS EG), manually exporting data and recreating reports. This leads to inconsistent versions, slow refresh cycles and a clear divide between technical teams and business users.

The challenge is not choosing one platform over the other. It is creating a landscape where they reinforce each other.

The solution

LACO helps organisations build a seamless bridge between SAS, Databricks and Microsoft Fabric.

  • The journey begins with a thorough scan of the existing SAS environment to understand dependencies, data sources and reporting processes.
  • Once there is clarity, we design a hybrid architecture where SAS outputs land securely and automatically in Databricks and Microsoft Fabric as certified datasets. These datasets follow shared governance and metadata principles so that access rules, terminology and lineage remain consistent across platforms.
  • We then automate the data flows to ensure that business users always work with up to date information. Manual exports disappear and data refreshes run on predictable schedules. Throughout this process, analysts and business users receive practical training so they can explore SAS outputs in Power BI with confidence.
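The "SAS output lands as a certified dataset" step can be sketched as a small landing job: pick up an exported file, stamp it with lineage metadata, and write both to a governed location. The paths, metadata fields and in-memory "lake" below are assumptions for illustration – a real implementation would target Azure Data Lake or Fabric storage and a metadata catalogue.

```python
# Minimal sketch of landing a SAS export as a certified dataset with
# lineage metadata. Paths and metadata fields are illustrative assumptions.

import csv
import io
import json
from datetime import datetime, timezone

def land_dataset(name, csv_text, lake):
    """Parse an exported CSV and store its rows plus lineage metadata
    side by side, so every dataset carries its own provenance."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    lake[f"certified/{name}/data"] = rows
    lake[f"certified/{name}/meta"] = json.dumps({
        "source": "SAS DI export",          # where the data came from
        "loaded_at": datetime.now(timezone.utc).isoformat(),
        "row_count": len(rows),             # basic completeness evidence
    })
    return len(rows)

lake = {}  # stand-in for Azure Data Lake / Fabric storage
n = land_dataset("exposures", "id,amount\n1,100\n2,250\n", lake)
print(n)  # 2 rows landed, with lineage metadata stored alongside
```

Running a job like this on a schedule is what replaces the manual exports: business users always see the latest certified copy, and the metadata answers "where did this come from and when" without anyone asking the SAS team.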

The (gradual) transition becomes smooth, governed and supported by clear communication.

Results

The organisation gains one connected data landscape instead of two separate tools.

SAS continues to provide the analytical strength and validated outputs that teams rely on, while Power BI and Synapse deliver the flexibility and speed business users expect. Duplicate work disappears because data is prepared once and reused across the entire ecosystem. Reports refresh faster, users adopt the new environment more easily and IT teams spend far less time supporting manual tasks.

Most importantly, insights become both governed and accessible. Business users explore information in real time without recreating models or manipulating data manually, and leadership gains a trusted, consistent and audit ready view of the organisation. By connecting SAS with Microsoft’s cloud platform organisations modernise without replacing what still works and create a future ready foundation for analytics, AI and decision making.

Want to connect SAS and Microsoft Azure in a single data landscape?

We help you build the bridge — safely, efficiently and at your own pace.


Keep the strength of SAS. Add the flexibility of Microsoft Azure. And bring everyone onto the same page.


EuroChem Antwerp lays the foundation for a data-driven organization

The Antwerp branch of fertilizer manufacturer EuroChem knew it was sitting on a goldmine of data. It decided to call on the services of data intelligence specialist LACO to mine this resource. The result is operational excellence that’s now also easily measurable and reportable. Now EuroChem is well on its way to becoming a more data-driven organization.

Setting the scene:
a key production plant

EuroChem Antwerp is a manufacturing facility with strong logistics infrastructure within the EuroChem Group, which is headquartered in Switzerland. “Our end product is fertilizer granules for use in agriculture and horticulture,” explains Bernard De Vriese, the company’s IT Manager. “We produce about 2.2 million tons of these mineral fertilizers here every year.”

EuroChem Antwerp is also an important logistics hub for the group, thanks to its strategic location in the Port of Antwerp. From there, EuroChem serves the international market. The facility employs 400 people and has an annual turnover of €1 billion.

The problem:

getting the data right

“We are purely a manufacturing company,” emphasizes Continuous Improvement Specialist Pieter Callens. This is evidenced by how the company employs data. Like any industrial manufacturing company, EuroChem generates a huge amount of data. In addition to the usual ERP data from finance and HR, there’s also lots of supply chain and warehouse data and even data about energy consumption — crucial for a chemical company seeking to optimize its cost structure.

And then of course there is all the data generated by the production facility itself, where IoT technology allows data from production processes to be automatically recorded, from standard process parameters to minor disruptions and major interventions. This production data is used intensively, although mainly for operational purposes. “Our production line simply wouldn’t run without all that information,” says De Vriese. Not only was this data used tactically rather than strategically, but all too often it was processed in a manual, non-automated, and time-consuming manner. In addition, there was a significant risk of human error and unreliable, compromised data.

Having one version of the truth is a very attractive prospect for management. But if everyone is working in their own Excel file, discrepancies can arise, creating the danger that before long, everyone is talking about different things.

Extra challenge:
sharing the data

What didn’t help, of course, was that the data was spread across different islands or silos within the company, stored using different technologies, from Access databases to Excel and XML files. “Our way of working has changed so much since the takeover of the Antwerp factory by EuroChem,” says Callens. Until April 2012, EuroChem Antwerp was part of the chemical giant BASF, and the two companies still share a site and a number of central services and logistics activities.

However, the change meant that EuroChem Antwerp had to establish most of the support services that BASF used to provide, from finance to HR. “We didn’t always have the necessary experience in-house,” Callens admits. This has made reporting difficult. Not only was the data not integrated, it was also not automatically shared between the different data silos. The reporting that did take place was quite static and required a lot of repetitive manual work. An additional driver for the data project was the planned migration from Oracle to SAP, another consequence of the carve-out of EuroChem from BASF. Because some data was in danger of being lost during the migration, the company wanted to first secure all its data on a separate platform, which would be connected to the new ERP environment via loose coupling.

The solution:

a data platform based on Microsoft Power BI

“To get rid of all those islands, we needed a central data platform with a central reporting mechanism,” says Callens. In his additional role of Data Management Specialist, Callens is also the main point of contact for the creation and use of reports at EuroChem Antwerp. “Concepts such as BI and data warehousing could provide a solution to the problems we’d been struggling with for some time. One of them was to arrive at a consensus on how to calculate important Key Performance Indicators based on the now centralized and controlled data. That process is now fully underway, thanks to LACO.”

Data intelligence specialist LACO advised and supported EuroChem Antwerp as they implemented an integrated platform that groups all their data and unlocks it for reporting and analytics. “I had previously worked successfully with LACO,” says De Vriese, explaining why he chose the local supplier. To achieve the necessary internal buy-in for the ambitious project, they decided to move forward in steps. “We first did a Data Strategy study with LACO. After we had done a thorough analysis with the stakeholders from the various departments, we took our assessment to the local Board of Directors to ask for the green light. And we got that quickly.”

And finally: data-driven business success through operational efficiency

In addition to an architectural blueprint, a concrete implementation roadmap, aimed at rapid value creation, was delivered as part of the Data Strategy study. EuroChem and LACO drew up a priority list to identify the most important benefits. “We started by creating a value map,” Callens remembers. “One of the priority decisions that resulted from this is to allow the business to work more with reports.”

At LACO’s suggestion, EuroChem opted for a highly iterative approach to the implementation of the new Microsoft data platform. This was carried out by a fixed team, involving the end users from the outset. Callens has nothing but praise for LACO’s functional analyst: “He was very professional and customer-oriented. And that was never a given, because he had to dig into our context again and again and then respond very flexibly.”

Soon the results were clear. Staff now have to do less manual work in Excel. This increases operational efficiency and reduces human error. And because EuroChem now has a source of high-quality reference data, the much sought-after single version of the truth is gradually becoming a reality, and the delivery of the strategic KPIs is on its way.

To make this possible, a number of standard reports were also developed. “A striking example is the report for the various capital expenditure initiatives within our branch.” Reporting on capital expenditure turned out to be not only very complex but also something that quite a few departments struggled with. This included the finance department and general management, who must be able to identify any overspend. But it also included engineering departments with project managers, up to and including asset managers, who are responsible for the maintenance of the installations. “In the past, all those colleagues had to use complex Excel files that we feared would have only a limited lifespan. Today, most of those reports have been replaced by a BI report that’s simple and fast to create, as well as flexible to use and always up to date.”

“The most important work has been done,” concludes De Vriese. “The foundation is there.” This foundation provides EuroChem with a solid launch pad for the future as it grows into a true data-driven organization.

Ready to become a data-driven powerhouse?
