
Power BI | From data to insight

This training, exclusively organized for your company, provides a structured and practical introduction to Microsoft Power BI, focusing on how to transform raw data into clear, reliable, and actionable insights.

Participants will learn not only how to build reports, but also why certain modelling and design choices matter in a professional BI environment.
The course combines conceptual foundations with hands-on exercises, ensuring that knowledge can be applied immediately in real-world scenarios.
We tailor this training completely to your needs and your environment.

What you’ll learn

  • Understand the Power BI ecosystem (Desktop, Service, data refresh, sharing)
  • Import and transform data using Power Query
  • Build a solid data model using best practices (relationships, star schema)
  • Create DAX measures for calculations and KPIs
  • Design clear, user-friendly, and performant reports
  • Apply basic performance and usability principles
  • Publish and share reports securely within an organisation

Who should attend

Data analysts and reporting professionals

Business users who want to create their own reports

Consultants and BI developers new to Power BI

Anyone working with Excel, SQL, or other reporting tools who wants to move to Power BI

Programme

Introduction & Architecture: Power BI overview, components and typical BI workflows

Data Preparation (Power Query): Data connections, cleaning, shaping, combining and reusable transformations

Data Modelling: Fact/dimension tables, relationships, cardinality and common pitfalls

DAX Fundamentals: Measures vs calculated columns, core functions (CALCULATE, FILTER, time intelligence) and row vs filter context

Report Design & Visualisation: Choosing the right visuals, layout, storytelling and user interaction (filters, slicers, drill-through)

Publishing & Sharing: Power BI Service, workspaces, access rights and refresh strategies

Content is tailored to participants’ level and preferences.

Location

This training can be held at the LACO office or at your training facilities.

FAQ

What is the required level of prior knowledge or experience for this training?

No specific prior experience is required. A basic familiarity with general data concepts and terminology (such as data analysis, modelling, or reporting) is helpful, but the training is designed for both beginners and those with some hands-on experience.

Is lunch, coffee, or catering included in the price?

When the training is organized at LACO training facilities, lunch and beverages are provided.

Will the training language always be English (or Dutch/French)?

Yes, the training is delivered in English. On demand, and for specific groups, a Dutch or French session may be arranged. Please let us know your preference upon registration.

What is the duration of the training?

The duration of the training depends on the content that is tailored for you.

Ready to level up your data skills?

Start the conversation.

You know your team, we know our training. Tell us who should join, when it suits you, and where you’d like the session – at LACO or on-site. We’ll take it from there.
More information about how we handle your data can be found in our privacy policy.


How analytics engineering enables your business

Data and AI initiatives are scaling fast but many organisations still find themselves stuck in the same place: waiting on the central data team. Every new dashboard, insight or metric request ends up in the same queue, handled by the same overstretched experts. Self-service BI and decentralised models sound great in theory, but in practice they often raise a new question: how do you empower more people to work with data without losing control over quality, security and consistency?

Analytics engineering is the missing piece. It bridges the gap between raw data and reliable insights, turning a well-intentioned mess into a governed, scalable and business-ready foundation.

The challenge

As organisations scale their data environments, traditional centralised models start to crack. The data team becomes a bottleneck, handling every extract, dashboard update and metric debate. In the rush to deliver, quick fixes pile up, leading to inconsistent logic, duplicated effort and KPIs that don’t quite match across teams.

Meanwhile, the pressure to enable self-service keeps growing. Business units want to move faster, but opening up access to raw or poorly modelled data only creates new risks: errors, misinterpretation, and dashboards that tell five versions of the truth. Add to that the growing complexity of data platforms, from warehouses to lakes to lakehouses on Microsoft Azure and Microsoft Fabric, and suddenly the data landscape feels more like a maze than a launchpad.

The real issue isn’t technology. It’s the lack of a scalable operating model that balances flexibility with control.

The solution

Analytics engineering provides that model. It’s the discipline that sits between data engineering and business analytics, focused on designing clean, reusable and trusted data products. Instead of building dashboards or pipelines in isolation, analytics engineers take ownership of the semantic layer — the structured, business-aligned view of the data that everyone can build on.

At LACO, analytics engineers create this layer with governance and quality by design. They embed validation rules, document logic, and make sure key definitions are consistent across domains. Rather than duplicating effort in every report, teams work with shared, curated datasets which speeds up delivery and reduces rework.

This approach is deeply integrated with Microsoft Azure and Microsoft Fabric. LACO combines warehousing and lakehouse expertise to design scalable, high-performance architectures that business users can actually understand. Starting from business problems, analytics engineers work backwards to define what data is needed, how it should be modelled, and how it can be safely exposed.

The central team shifts from reacting to every ticket to enabling self-service through strong foundations and clear guardrails.

The results

By introducing analytics engineering as a core capability, organisations remove the bottleneck without losing control. Business teams gain faster access to insights, working directly with trusted data products instead of relying on ad-hoc extracts and one-off reports.

The central data team gets breathing room to focus on long-term value instead of short-term fixes. Governance improves, definitions align, and KPI discussions move from “which number is right?” to “what should we do next?”.

Self-service becomes a scalable capability and not a source of chaos. Domain teams explore and innovate within a clear, governed framework. And with a strong Microsoft Azure and Microsoft Fabric foundation, the organisation is better prepared for future analytics and AI growth without having to rebuild the basics each time.

In short: analytics engineering turns your data team from a bottleneck into a strategic enabler.

Ready to remove your analytics bottleneck?

Want to see how analytics engineering can unlock governed self‑service in your organisation? LACO’s analytics engineers help you design and build robust data products on Azure and Microsoft Fabric, so your teams can move faster without losing control.


Conversational BI: the AI‑powered future of data intelligence

Dashboards used to be the answer, until business users started asking better questions. In fast-paced environments where decisions can’t wait for the next reporting cycle, static views just don’t cut it anymore. Users expect to engage with data as naturally as they would with a colleague: by asking a question and getting a clear, useful answer.

Conversational BI makes that possible. By combining large language models (LLMs) with governed data platforms like Microsoft Fabric and Microsoft Azure, it turns raw data into on-demand intelligence. Less digging, more deciding.

The challenge

Traditional data intelligence environments were built to answer known questions through pre-defined dashboards and reports. That worked until business needs changed faster than reports could be updated. As data volumes grow and decision cycles shorten, teams no longer want to wait days for a new dashboard. They want to ask ad-hoc questions and get answers, instantly.

Meanwhile, data teams are buried in repeat requests: small tweaks, new views, slightly different filters. Time that could be used for value-added analytics is lost to backlog and maintenance. Despite all the tech in place, the experience often feels rigid and slow.

The solution

Conversational BI redefines how people interact with data, using AI to deliver fast, contextual answers in natural language. Instead of clicking through a forest of dashboards, users can simply ask, “How did revenue evolve last quarter by region?”, “Which products are driving margin decline?” or “What changed in churn after the price update?”.

Here’s how it works in a modern Microsoft Azure and Microsoft Fabric environment:

  • Governed data as the foundation: It all starts with a strong, governed data model. Microsoft Fabric, Azure and lakehouse structures expose clean, curated datasets with business-friendly entities like customers, products or orders. Centralised rules around quality, lineage and security ensure that answers are trustworthy.
  • LLM-powered intelligence: Large language models interpret the user’s question, map it to the correct metrics and dimensions, and generate the necessary queries. They summarise insights, highlight trends, and suggest follow-up questions — even visualising results in tools like Power BI. The outcome is not just data, but narrative: a story users can act on.
  • Built-in governance and control: Conversational BI doesn’t bypass governance, it builds on it. Role-based access, row-level security and built-in guardrails ensure users only see what they’re allowed to see. AI responses are explainable and traceable, with previews and validation tools that help data teams monitor, review and improve the system over time.

Together, these elements allow organisations to embed AI-driven experiences directly into the tools their people already use. From embedded chat to smart search, Conversational BI makes data as accessible as a conversation while keeping full control behind the scenes.
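
To make the grounding step concrete, here is a deliberately tiny, rule-based stand-in for the LLM: it maps a natural-language question onto a governed semantic model and emits a templated query. All metric, dimension and table names are hypothetical, and a real implementation would use an LLM plus the platform's semantic layer rather than keyword matching.

```python
# Minimal sketch: ground a question in a governed semantic model, then emit
# a templated SQL query. Names below are illustrative assumptions.
SEMANTIC_MODEL = {
    "metrics": {"revenue": "SUM(amount)", "churn": "AVG(churned)"},
    "dimensions": {"region": "region", "product": "product_name"},
}

def question_to_sql(question: str) -> str:
    q = question.lower()
    # Only governed metrics and dimensions can appear in the generated query
    metric = next((expr for name, expr in SEMANTIC_MODEL["metrics"].items() if name in q), None)
    dim = next((col for name, col in SEMANTIC_MODEL["dimensions"].items() if name in q), None)
    if metric is None:
        raise ValueError("question does not mention a governed metric")
    if dim is not None:
        return f"SELECT {dim}, {metric} FROM sales GROUP BY {dim}"
    return f"SELECT {metric} FROM sales"

sql = question_to_sql("How did revenue evolve last quarter by region?")
```

The key design point survives even in this toy version: the model never free-writes SQL against raw tables; it can only combine vetted building blocks, which is what keeps the answers trustworthy.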

The results

For business users, conversational BI feels like having an analyst on standby. Questions that used to take days now take minutes. The data becomes more accessible, decisions become faster, and insights are easier to trust because they’re delivered in plain language.

For data teams, the shift is equally powerful. Instead of acting as dashboard factories, they focus on governance, modelling and quality, the building blocks of a trusted data environment. The result: fewer ad-hoc requests, less firefighting, and a more scalable approach to analytics.

At organisation level, decision-making becomes more democratic and consistent. When more people can safely ask better questions and actually understand the answers, data intelligence evolves from a reporting system into a strategic conversation partner.

The future of BI isn’t just visual. It’s conversational.

Ready to explore Conversational BI?

LACO helps you design and implement AI-powered conversational BI, combining LLM capabilities with governed data so your teams can ask questions in plain language and get trustworthy answers when they need them.


Marketing mix modelling

Marketing leaders today face a growing challenge: deliver measurable results in a landscape that’s more complex, fragmented and regulated than ever. Budgets are under pressure, while customer journeys span more channels and more blind spots than before.

With traditional tracking becoming less reliable, it’s harder to understand what’s truly driving performance. That’s where marketing mix modelling comes in. By connecting the dots between campaigns, spend and business outcomes, it brings clarity back to decision-making and replaces assumptions with insight.

The challenge

Today’s marketing landscape is a paradox: more channels, more data, yet less visibility. Tracking customer behaviour has become harder thanks to GDPR, strict consent rules and the looming end of cookies. At the same time, marketing spend is spread across a growing mix of online and offline touchpoints, making it difficult to see what’s really working.

Traditional tracking methods fall short. ROI and performance are often judged by what’s easiest to measure — last-click metrics, web analytics or internal assumptions — rather than by a complete, objective view. The result: fragmented insight, unclear attribution, and growing pressure on marketing leaders to justify budgets without solid evidence.

The solution

LACO helps organisations cut through this complexity with an AI-powered marketing mix modelling (MMM) approach, built on a robust Microsoft Azure and Microsoft Fabric foundation. Rather than relying on user-level tracking, MMM uses advanced statistical and machine learning techniques to connect consolidated marketing inputs with business outcomes like sales, leads or conversions.

  • It starts with the data. We build a secure, scalable platform on Azure to integrate all relevant marketing inputs — media spend, CRM, web and app analytics, and external data like seasonality or macro-economic indicators. The result: a centralised, governed environment that’s consistent, traceable and ready for modelling.
  • From there, LACO defines a unified marketing data model that brings together online, offline and contextual factors. Search, social, display, TV, radio, events. All channels are harmonised into one view, enabling consistent comparisons and meaningful insights.
  • We then apply AI-powered MMM models that estimate how each driver — from media spend to timing to external influences — contributes to business results. These models are deployed directly into the Microsoft Azure or Microsoft Fabric environment and made accessible via tools like Power BI, so marketing and business teams can explore insights independently. Clear visuals and practical explanations show how spend, saturation and timing affect performance.

A key feature?
Scenario planning. Decision-makers can run what‑if analyses via a conversational interface that acts as an AI agent for the marketing organisation, testing budget shifts, channel reallocations or campaign timing. What happens if we boost spend on social and cut TV? Launch a promo earlier? Shift regional targeting? These simulations support smarter, evidence-based decisions before money is spent.

Crucially, the solution is transparent and privacy-friendly. Instead of black-box algorithms built on personal data, LACO’s MMM approach relies on aggregated, governed inputs. Assumptions, sources and model logic are documented, building trust and ensuring compliance, even as regulations evolve.

Finally, MMM models feed into a predictive ROI engine that forecasts the impact of future marketing investments. Budget planning becomes proactive, grounded in data rather than gut feeling. Finance and marketing teams get forward-looking guidance on where to invest, at what level, and with what expected return.
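
A what-if scenario is then just an evaluation of the fitted response curves under a different budget split. The toy comparison below uses an assumed saturating curve with illustrative betas, half-saturation points and baseline, standing in for the values a fitted MMM would supply.

```python
# Toy what-if: does moving budget from a saturated channel (TV) to a less
# saturated one (social) raise predicted sales? Parameters are illustrative.
def channel_response(spend, beta, half_sat):
    return beta * spend / (spend + half_sat)

def total_sales(tv, social):
    baseline = 200
    return (baseline
            + channel_response(tv, beta=80, half_sat=40)
            + channel_response(social, beta=50, half_sat=20))

current = total_sales(tv=60, social=10)
shifted = total_sales(tv=40, social=30)  # move 20 budget units from TV to social
uplift = shifted - current               # positive here: the shift pays off
```

Because the response curves are concave, shifting spend away from a heavily saturated channel towards an under-invested one typically raises the predicted total, which is the intuition behind budget-reallocation recommendations.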

The results

With AI-driven MMM on a strong Microsoft Azure and Microsoft Fabric data platform, organisations move from reactive reporting to confident, evidence-based marketing decisions. ROI attribution becomes transparent across all channels, enabling smarter budget allocation and better performance, often without increasing spend.

Forecasting improves, planning becomes more strategic, and marketing finally earns its place as a measurable growth driver. Because the entire approach is built on governed, aggregated data, organisations stay agile and compliant even as privacy rules continue to change. The end result? Clarity. Control. And a marketing function that moves from guessing to knowing.

Ready to bring clarity to your marketing mix?

LACO helps you build an AI-powered marketing mix modelling framework on Microsoft Azure and Microsoft Fabric, turning fragmented marketing data into transparent ROI insights and forward-looking scenarios for smarter budget decisions.


Improved customer service based on embedded Power BI

All the benefits of Power BI’s flexibility and user-friendliness for data visualisation, but without the investment in thousands of software licences. That’s the request – maybe not literally, but still quite close – that LACO answered for Connecting-Expertise (CE). How did we do it? We embedded Power BI into CE’s customer web portal.

Power BI is the way to go for data visualisation. Not just for internal use, as we wrote in this blog in our Power BI series, but also to share reports beyond the borders of the organisation, with customers and partners. That’s exactly what Connecting-Expertise’s customer was after. CE builds software solutions that help companies optimise and facilitate sourcing, contracting and managing their contingent workforce. Operational since 2007, CE is a pioneer in the Belgian market with a leading solution for contingent labour supply contracts.

Until recently, CE provided data to some of its clients, allowing them to build reports on their own. However, as each data file was client-specific, the preparation of these files took quite some time. To further extend its service, CE was looking for a more efficient way to offer clients a set of standard reports and dashboards, based on client-specific data. LACO suggested embedding Power BI as a reporting environment in CE’s web-based software platform, to unlock the full potential of the data and provide new insights to the clients.

Sturdy, cost-efficient architecture

We designed the data architecture and set up an Azure environment for the Power BI embedded service. We chose to set up the sturdiest possible technical architecture by hosting a VM with MySQL, avoiding the need to migrate local MySQL data to an Azure data warehouse. In doing so, we kept a strong focus on the usage of Azure resources. Even today, we are still able to host all reports using the lowest – and cheapest – tier for the Power BI embedded service on Azure.

To build the right reports, we assisted CE with defining the KPIs that would offer the most added value to its clients. Extra attention was given to security: it is of utmost importance that clients using the dashboards and reports only have access to their own data. To get that right, we implemented the sophisticated Row Level Security setup CE offers its clients. Row Level Security grants access at various levels – personal or division level – and for various types of data. As an interesting side effect, we used Row Level Security as an easy solution for offering and maintaining multi-language reports. How? We unravel that in this blog for you!
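
In an embedded scenario like this, Row Level Security is enforced when the host application requests an embed token and passes along an effective identity. The sketch below shows the shape of the request body for Power BI’s “Generate Token” REST call; the username, role and dataset values are hypothetical examples, not CE’s actual configuration.

```python
# Hedged sketch: build the body for Power BI's GenerateToken REST call,
# scoping an embedded report to one client's rows via Row Level Security.
# Role and dataset names below are illustrative assumptions.
def embed_token_request(username: str, dataset_id: str, roles: list) -> dict:
    return {
        "accessLevel": "View",
        "identities": [
            {
                "username": username,   # value the RLS rules can filter on
                "roles": roles,         # RLS roles defined in the dataset
                "datasets": [dataset_id],
            }
        ],
    }

body = embed_token_request("client42@example.com", "hypothetical-dataset-id", ["ClientViewer"])
```

The token returned for this body is only valid for that identity and those roles, which is why users can never navigate outside their own data, no matter how they filter or drill.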

Embedded reports and dashboards

As a final step in the project, we assisted CE with embedding the reports in their customer portal. Since this was the first implementation of this type in Belgium, Microsoft lent us a helping hand as well. As a result, Power BI reports and dashboards are now available as an embedded service on the CE customer portal. Only clients with a CE portal login have access to the reports and dashboards, which are solely based on their own data. Furthermore, data is filtered on row level, based on the client’s permissions. For CE, the availability of the Power BI reports and dashboards strengthens the status of its leading solution for contingent labour supply contracts.


How to create geographical reports in SAS VA using custom polygons: a three-step approach

Many businesses operate within a certain geography or have a specific geographic relevance. For these businesses, visualising their business data on enhanced maps is of material importance in gaining valuable insights. And SAS Visual Analytics (VA) lets them do just that, even though it does not always offer the necessary geographic variables as a standard feature. That’s where custom polygons come in, allowing businesses to customise every map to their specific business needs.

In general, visualisation already works better than showing tabular data. And visualising your business data on top of a geographical map is yet another important step in rendering that data into valuable information from which to gain actionable business insights.

As we explained in another post, though, in order to obtain those precious insights, it is sometimes necessary to customise a map. And one way of doing that is by creating your very own custom polygons.

SAS offers specific functions to help you create those custom polygons, based on groups of existing polygons such as provinces, municipalities and other geographic variables that are readily available as standard features in Visual Analytics.

Let’s take a closer, more detailed look now at how you can use custom polygons to easily produce your own tailor-made reports in SAS VA.

Step 1: Creating polygon definitions

First, of course, the custom polygons need to be created. There are always shape files you can find, retrieve or buy which contain standard polygon information about a nation’s geography, such as regions, provinces, municipalities and communes. Based on those polygons, you can now start to create your own custom polygons by grouping some of the aforementioned shape files together. In our example, we will use Belgium, our home country, as a nation. Some names of regions will be typically Belgian. A similar logic can be applied to other countries’ regions, though.

SAS has some specific geographical procedures that can be used for this. To start, we need to import the available shape files of the municipalities by using the PROC MAPIMPORT procedure. As a second step, we need to join these imported municipalities with the sectors we have defined ourselves based on grouping some municipalities together in one sector.

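
As a rough sketch, the two steps described above could look as follows in SAS (file paths, dataset and column names are illustrative assumptions, not the actual project code):

```sas
/* Step 1a: import the municipality shape file into a SAS map dataset */
proc mapimport out=work.municipalities
    datafile="/data/shapes/belgium_municipalities.shp";
run;

/* Step 1b: attach our self-defined sectors to each municipality */
proc sql;
    create table work.muni_sectors as
    select m.*, s.sector
    from work.municipalities as m
    inner join work.sector_mapping as s
        on m.municipality = s.municipality;
quit;
```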

As a result, we now have the polygon information of each municipality in a sector linked to that sector itself. But to be able to use this properly, we need to redefine the outline of the polygon that groups all those municipalities together. This is achieved by using the PROC GREMOVE procedure of SAS.

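
A hedged sketch of that step, continuing with the same illustrative dataset and variable names:

```sas
/* Sort by sector so GREMOVE can process one sector per BY group */
proc sort data=work.muni_sectors;
    by sector;
run;

/* Remove the internal municipal borders, leaving one outline per sector */
proc gremove data=work.muni_sectors out=work.sector_polygons;
    by sector;
    id municipality;
run;
```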

The only thing remaining for us is now to join the information to the correct location. There are mainly two tables that need to be adapted to be able to use the polygon definitions in our VA reports. Both tables can be found in the VALIB folder in the SAS config folder:

  • ATTRLOOKUP: contains the information about the custom created polygons themselves, both for the groups of all polygons and for each created polygon separately. Here you define an ID, a label, a unique prefix (2 letters), a name, an ISO code and an ISO name.
  • CENTLOOKUP: this table contains the coordinates that need to be connected for each polygon. So, here you define the map name, the ID and the X and Y coordinates for each polygon out of the dataset you created using the PROC GREMOVE procedure.

Step 2: Uploading polygon info to SAS VA

To be able to use our custom polygons in SAS Visual Analytics reports, we need to make sure now that the two previously created tables (ATTRLOOKUP and CENTLOOKUP) are stored on the SAS VA server in the correct location. Then that server needs to be restarted to make sure that the polygons and their definitions are loaded properly into memory, so they are ready for use in the SAS VA reports.

When you have defined formats on the polygon IDs to show names instead of meaningless IDs, you also need to make sure that those formats are set in the table and that the formats catalog is also loaded to the VA platform. User-defined formats are not automatically loaded to SAS Visual Analytics. You need to put the catalog with the formats in the defined location on the SAS configuration of your VA platform. More details can be found here.

Step 3: Creating your own reports in SAS Visual Analytics with custom polygons

To use the custom polygons for your reports, you start by creating a new report and selecting a dataset that contains figures together with the IDs for the sectors you’ve defined.

When viewing the available columns of the selected dataset, you need to right-click on the sector ID and select geographical -> Custom polygon. Then you can select your created custom sector name from the list. When you now add a “Geo Region Map” to the report and drag your ID on it, together with a metric, it will show the polygons.

You can tweak your report by changing the colouring, transparency, contrast, etc. of the polygons based on the selected metric or standard.


This is an example of custom regions created from lower-level existing regions.

Conclusion

As you can see, it is not all that difficult to create your own professional SAS Visual Analytics report, using custom polygons with defined regions or sections. All in all, there are just three small steps to take:

  1. Create the custom polygons with SAS code
  2. Upload the custom polygon information to the VA platform
  3. Use the custom polygons to create your own tailor-made geographical reports

Simply follow these steps and in no time you’ll be creating reports that are better adapted to the specific needs of your company and/or your clients.


Customising your geographical reports in SAS VA for superior insights

In visual analytics, too, one size does not fit all. That is why SAS allows you to customise, among others, your geographical reports. And one way of doing that is by creating your very own custom polygons. A custom polygon, in layman’s terms, is one of the types of geographic variable supported by SAS Visual Analytics (SAS VA), alongside custom coordinates and a number of predefined geographic variables.

Using predefined geographic variables, as listed here, you can easily visualise your business data to create, for instance, an insightful map of the countries, regions, provinces, etc. you are operating in. But what if your business is not organised according to these predefined variables? What if (part of) your sales organisation is specifically geared towards, say, South-West Flanders, the Kempen or the Rhine area? Then those custom polygons sure come in handy!

SAS Visual Analytics: objects and maps

When creating maps in SAS VA, there are multiple object types to choose from:

  • Geo bubbles allow you to place a bubble with a size and a color value in specific places on the map.
  • Geo coordinates allow you to place dots on the map indicating the places of interest.
  • Geo regions is the one we will use for our polygon images.

Each of these object types requires a geographic variable, which is a variable with extra information attached to it. Sometimes longitude and latitude values serve as such, in other cases a polygon does. (You can find out more about geographic variables in this SAS blog about geo maps.)

Custom polygons: what’s in a name?

When we talk about custom polygons, we are referring to regions, sectors or other geographic variables that are not available as a standard feature or function in SAS Visual Analytics. Lots of businesses and industries in fact have their own specific map divisions, such as the regions in which their stores or agents operate, to give but one example. The polygons for these are not available for download. They are, however, very easy to build yourself. You really don’t have to be a techie at all to do so successfully.

How to create your custom polygons

First things first: to create your own custom polygons, you need a good point to start off from. Fortunately, there are always shape files you can find, retrieve or buy which contain standard polygon information about a nation’s geography, such as regions, provinces, municipalities and communes. Based on those polygons, you can now start to create your own custom polygons by grouping some of the aforementioned shape files together.

SAS has provided several functions you can use for this:

  • MAPIMPORT imports the available shape files.
  • GREMOVE redefines the outline of the grouped polygon by removing the internal borders shared by its members.

When finished, the new polygons need to be loaded into the system.

Custom polygons in SAS VA: use case

Suppose you have organised your activities based on a number of regions in Belgium that are specific to your business. The map on the left below presents you with a standard overview of your business activities in all Belgian municipalities. It probably won’t take you long to realise that it will be fairly hard, if not downright impossible, to gain actionable insights from the way those activities are represented here.

Now take a look at the map on the right. It contains the same information about your business activities from the same Belgian municipalities. Only now they are grouped by region: those regions, to be precise, that are specific to your business. A colour range, indicating high and low values, now clearly shows you – in the blink of an eye, so to speak – how your different regions are performing. Since the polygons used to achieve this are nowhere available, they had to be custom-made.


As this example of custom regions created from lower-level existing regions also shows, polygons can be used in hierarchies, allowing you to go from a lower level (e.g. Municipality) to a higher level (e.g. Region) – and vice versa. Very often custom polygons fit in some middle layer, where they will open up to the lower structures from which they were created.

In this particular use case we stayed within one country. Another benefit of deploying custom polygons is that it is easily possible to create regions while not looking at country borders.

In conclusion

Did we spark your interest in custom polygons? Great! In deploying them whenever required, your reports are guaranteed to be more adapted to the specific (business) needs of your company or client.

To summarize:
  1. Custom polygons are structures that are not readily available for download, whether paid or free.
  2. The creation of custom polygons requires some technical steps.
  3. SAS provides functionalities to help with the creation of custom polygons.
  4. Visualisations can now fully adapt to business needs.


Embedding Power BI in your company portal

Static, pre-defined reporting is so 2010! Self-service BI is today’s standard. But in an environment with literally thousands of users, even small licence fees add up to high costs. Embedding Power BI offers an interesting – and cost-effective – way around this.

Unlimited number of users

Part of a company’s digital strategy is convenience for the customer. A user-friendly app or webpage makes the life of the customer easy. Depending on the company’s business, the app may show all sorts of information, from purchased books to energy usage or data consumption. The information – based on data visualisation – reflects the customer’s behaviour, which may be of great value for that customer and underlines the company’s unique selling proposition.

Power BI enables this type of visualisation at a very low cost and makes it available to an unlimited number of users. For this type of scenario, we at LACO choose to embed Power BI in the company’s web portal. Thanks to role-based access, users can only work with the data they are allowed to see. More importantly, they keep access to the navigation functionalities of the tool: making selections, drilling down, and more. And what’s more, we even found a clever way to make the reports easily available in multiple languages.


Example of Power BI dashboard embedded in a company portal. Source: Microsoft.com.

No such thing as a free lunch

But unfortunately, there’s no such thing as a free lunch. Although the licence cost per user may be low, with thousands – or even tens of thousands – of users the total cost quickly spins out of control. To avoid that, the trick is to define a small number of power users and give them full self-service functionalities and capabilities. The reporting they come up with – at a low total licence cost – can then be shared with the large audience of ‘visual-only’ customers through the company’s portal. By embedding data visualisation in the portal, most of the visualisation functionalities – navigating, selecting and filtering data, and more – remain usable, without the need to pay licence fees for every one of those end users.

The only thing users can’t do is create new reports. To make that happen, every user would indeed need a Power BI licence. Another way to keep costs under control is to set up Power BI correctly on Azure. We share more about that – and some of the other technicalities, including coping with multi-language reporting – in one of the other posts in our Power BI blog series.
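To make the role-based access concrete: when an application requests an embed token through the Power BI REST API, it can pass an effective identity that pins the token to a specific user and their row-level-security roles, so each portal visitor only sees their own data. The sketch below builds such a request body following the public REST API shape; the user, role and dataset values are placeholders, not a specific customer configuration.

```python
import json

def build_embed_token_request(username, roles, dataset_id):
    """Build the body for a Power BI 'Generate Token' REST call.

    The effective identity restricts the embed token to the given
    user's row-level-security (RLS) roles, so embedded viewers can
    navigate and filter, but only within the data they may see.
    """
    return {
        "accessLevel": "View",            # viewers navigate, they don't edit
        "identities": [
            {
                "username": username,      # identity evaluated by the RLS rules
                "roles": roles,            # RLS roles defined in the dataset
                "datasets": [dataset_id],  # dataset(s) the identity applies to
            }
        ],
    }

# Placeholder values for illustration only.
body = build_embed_token_request("customer-42@example.com", ["Customer"], "dataset-id-placeholder")
print(json.dumps(body, indent=2))
```

In practice this body is POSTed to the report’s GenerateToken endpoint (e.g. `.../groups/{groupId}/reports/{reportId}/GenerateToken`), and the returned token is handed to the embedding JavaScript client in the portal.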


Enabling the digital strategy

For the Chief Digital Officer, Power BI is an extra tool that helps enable the company’s digital strategy. Making reports available for customers and partners, based only on the data they are allowed to see, used to be quite tricky. With Power BI, that’s no longer the case. But what’s the catch, you might ask? Well, your data is at the core of every report or visual that Power BI produces. Did you get your data platform sorted? Then you can start leveraging the possibilities of Power BI.

Using Power BI in the way we described in this blog post opens up the standard visualisation capabilities of this powerful technology. No need for coding! No hard prioritisation of scarce IT time! No weeks of waiting for a new report, and no complexity leading to outrageous costs… By embedding Power BI, standard internal functionality is tunnelled through the internet to the company’s customers, without the risks of the past, such as complex programming and cumbersome data preparation. Who would have thought, back in 2010?

Embedding Power BI in your company portal · 2026-02-16

Obtain insights using data visualisation: 4 steps to take

A picture is worth a thousand words. Well, if it can’t be misinterpreted, that is. Even in today’s world, with its enormous amounts of data and the technology to visualise it in real-time, effective and user-friendly data visualisation remains an art.

“Use a picture. It’s worth a thousand words.” That’s how Tess Flanders was quoted in The Post-Standard, in a debate about journalism and publicity organized by the Syracuse Advertising Men’s Club in 1911. More than 100 years later, the saying still holds. Well-thought-out and well-executed visualisations create insights.


The very first bar chart (1786) – William Playfair

Since the late 18th century, many types of charts have been invented: from bar and pie charts to radar, spiral, bubble, area and flow charts. In more recent years, technology has made it ever easier to visualise data: from PC to smartphone, from spreadsheet and PowerPoint to visualisation app.

In just a few decades, we evolved from a spoken and written culture to a primarily visual culture. We prefer watching a short clip on YouTube to reading a lengthy manual. But as technology makes things easier and cheaper to produce and more attractive to consume, the amount of possibilities and options make it harder to do things right.

In transactional environments such as payments or reservations, user experience is becoming an art, mastered only by true specialists. The same applies to data visualisation in informational environments. If you don’t want your picture – the one that is worth a thousand words – to be misinterpreted, you need a data visualisation specialist to step in.

Strong data visualisation in 4 steps

Step 1: Collect high-quality data

The quality of the collected data determines just about everything that follows. So check the data sources, pursue data completeness and remove duplicate data.

Make sure the data is tailored to the end consumer’s needs:

  • omit attributes that are irrelevant to the end consumer’s domain of interest
  • summarise detail records into time series and statistical views
  • combine multiple data streams into potentially correlated sets

An example of omitting irrelevant attributes:


The multi-coloured graph is harder to read because the colour use is disruptive. The Gestalt law of similarity in the first row of all-grey graphs removes the extra cognitive overload, as does labelling the bars on the axis rather than with a colour-coded key. Deliberate colour use, however, can make specific data stand out through the law of focal point.

Step 2: Align data visualisation with the end user

Comparing visualisation with art makes sense, given the creativity and skill required. But artistic freedom is somewhat limited by functional goals. And who better than the end user to judge the quality of the end product against their intentions?

To align data visualisation with the end user:

  • have a clear view of the target user profile and the purpose of the visualisation
  • characterise the target user profile (e.g. management, students, controller-like functions)
  • define the project’s ultimate goal (e.g. part of a regular process or regulatory publication, one-time shot)
  • gather several points of view from various stakeholders
  • define possible follow-up actions (e.g. data exploration, predictive analysis, regulatory reporting)
  • offer various options to help select the right format

An example of various options to help select the right format. You see the original design and then three alternative visualisations.

Step 3: Get the intention of data visualisation right

The visualisation’s purpose can be anything, really. There’s just one prerequisite: make sure it’s crystal clear.

Data visualisation can serve many purposes, for example:

  • data exploration
  • predictive analysis
  • regulatory reporting
  • recurring reports that are part of a regular process, or a one-time publication

© Reuters
The importance of selecting the right data visualisation format is shown by the ‘Gun deaths in Florida’ graph above. The graph seems to indicate that the number of gun deaths decreased after the ‘Stand Your Ground’ law came into place, while exactly the opposite was true. Simply orienting a graph the wrong way – here, an inverted vertical axis – distorts your data.

Step 4: Select the right data visualisation method

To use the appropriate visual for the appropriate case, you need to consider various elements:

  • choose the right template: maps, traffic lights, tables, pie charts, radar charts
  • define the layout: colour palette, base lines, legend, scale, overlays, axes, backgrounds
  • decide on usability and the level of interactivity (e.g. the possibility to drill down)
  • make and show drafts to adapt and fine-tune the chosen format
  • iterate with input from the end user to reach the perfect result

Choosing the right data visualisation method. © Andrew Abela’s Chart Chooser
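The chart-chooser idea can be sketched as a simple decision rule. The categories below loosely follow Abela’s chooser (comparison, composition, distribution, relationship), but the mapping itself is a heavily simplified illustration, not the full chooser:

```python
def suggest_chart(goal, n_variables=1, over_time=False):
    """Very simplified chart suggestion, loosely inspired by Andrew
    Abela's Chart Chooser. Illustrative only, not exhaustive."""
    if goal == "comparison":
        return "line chart" if over_time else "bar chart"
    if goal == "composition":
        return "stacked area chart" if over_time else "pie chart"
    if goal == "distribution":
        return "histogram" if n_variables == 1 else "scatter plot"
    if goal == "relationship":
        return "scatter plot" if n_variables <= 2 else "bubble chart"
    raise ValueError(f"unknown goal: {goal}")

print(suggest_chart("comparison", over_time=True))   # line chart
print(suggest_chart("distribution", n_variables=2))  # scatter plot
```

The real value of such a rule is the conversation it forces: you must name the goal (step 3) and the audience (step 2) before a chart type can even be suggested.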

In summary: the data visualisation journey

Data visualisation is a mature discipline, but it needs special care. Only when you make the right choices will you achieve the goal you are aiming for. To do so, you need to master your data, the purpose of the visualisation and the right technology.

To get to the data visualisation you have in mind, you need to take things step by step:
  1. collect and cleanse the data
  2. align visualisation with the end user
  3. get the intention of the visualisation right
  4. select the right visualisation method
Making the right choices is the key to success. Follow the four steps we discussed, and they will help you turn data into information and maximise insight. And as you probably noticed: we didn’t talk much about technology in this blog post. Because, as always, technology is an enabler, not a goal in itself.

Obtain insights using data visualisation: 4 steps to take · 2026-02-16

Integrating SAS with Microsoft Azure

Many organisations rely on SAS as a trusted engine for analytics, reporting and modelling. At the same time, business users increasingly expect the modern flexibility of Microsoft Fabric and Azure Databricks. They want interactive dashboards, faster access to insights and a unified view across teams. This creates a gap between what the organisation already depends on and what the business now requires.

By integrating SAS with Microsoft’s cloud ecosystem, organisations gain the best of both worlds: a governed analytics engine on one side and a streamlined approach that minimises migration investment and accelerates change adoption on the other.

The challenge

SAS remains a powerful platform for processing and modelling, yet it was not built for today’s expectations around real-time insights, cloud scalability and self-service analytics. As a result, organisations end up switching between a central data warehouse (SAS DI) and end-user compute (SAS EG), manually exporting data and recreating reports. This leads to inconsistent versions, slow refresh cycles and a clear divide between technical teams and business users.

The challenge is not choosing one platform over the other. It is creating a landscape where they reinforce each other.

The solution

LACO helps organisations build a seamless bridge between SAS, Databricks and Microsoft Fabric.

  • The journey begins with a thorough scan of the existing SAS environment to understand dependencies, data sources and reporting processes.
  • Once there is clarity, we design a hybrid architecture where SAS outputs land securely and automatically in Databricks and Microsoft Fabric as certified datasets. These datasets follow shared governance and metadata principles, so that access rules, terminology and lineage remain consistent across platforms.
  • We then automate the data flows to ensure that business users always work with up-to-date information. Manual exports disappear and data refreshes run on predictable schedules. Throughout this process, analysts and business users receive practical training so they can explore SAS outputs in Power BI with confidence.
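As a tool-agnostic sketch of the “certified dataset with shared metadata” idea: each automated landing writes the data together with a small metadata sidecar (source, row count, refresh time, certification flag), so downstream platforms see consistent lineage and governance information. The file layout and field names below are illustrative assumptions, not a LACO or Microsoft specification.

```python
import csv
import io
import json
from datetime import datetime, timezone

def land_certified_dataset(rows, header, name, source):
    """Prepare a dataset plus a metadata sidecar describing its lineage.

    Returns (csv_text, metadata). A real pipeline would write both to
    cloud storage (e.g. a lake folder read by Databricks or Fabric)
    instead of returning strings.
    """
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerow(header)
    writer.writerows(rows)

    metadata = {
        "dataset": name,
        "source": source,                               # lineage: producing job
        "rows": len(rows),
        "refreshed_at": datetime.now(timezone.utc).isoformat(),
        "certified": True,                              # governance flag read downstream
    }
    return buffer.getvalue(), metadata

# Illustrative call with hypothetical content.
csv_text, meta = land_certified_dataset(
    [("2024-01", 35.0)], ("month", "amount"),
    name="sales_monthly", source="SAS DI job: load_sales",
)
print(json.dumps(meta, indent=2))
```

Because the sidecar travels with the data, every consumer – Power BI, Databricks or Fabric – can check the same refresh timestamp and certification status instead of trusting a manual export.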

The (gradual) transition becomes smooth, governed and supported by clear communication.

Results

The organisation gains one connected data landscape instead of two separate tools.

SAS continues to provide the analytical strength and validated outputs that teams rely on, while Power BI and Synapse deliver the flexibility and speed business users expect. Duplicate work disappears because data is prepared once and reused across the entire ecosystem. Reports refresh faster, users adopt the new environment more easily and IT teams spend far less time supporting manual tasks.

Most importantly, insights become both governed and accessible. Business users explore information in real time without recreating models or manipulating data manually, and leadership gains a trusted, consistent and audit ready view of the organisation. By connecting SAS with Microsoft’s cloud platform organisations modernise without replacing what still works and create a future ready foundation for analytics, AI and decision making.

Want to connect SAS and Microsoft Azure in a single data landscape?

We help you build the bridge — safely, efficiently and at your own pace.


Keep the strength of SAS. Add the flexibility of Microsoft Azure. And bring everyone onto the same page.

Integrating SAS with Microsoft Azure · 2026-01-08