| LACO

About thijs

So far thijs has created 51 blog entries.

Senior Fabric Data Engineer

Do you thrive on owning complex data solutions from A to Z? Are you ready to lead Microsoft Fabric projects that go beyond infrastructure and deliver real business value? Then we’ve got something worth talking about.

As our Senior Fabric Data Engineer, you own the end-to-end delivery of data and analytics solutions within the Microsoft Fabric platform, working closely with cross-functional teams to design, build, and deploy scalable, high-performance data architectures that drive business value.

Your job

  • Design and build scalable data solutions in Microsoft Azure, with a strong focus on Microsoft Fabric
  • Lead end-to-end data projects focused on the Microsoft Fabric environment, from requirements analysis and architecture design through to implementation and delivery
  • Ensure all solutions meet enterprise-level standards for security, governance and compliance
  • Collaborate closely with clients and cross-functional teams, acting as a trusted technical partner

  • Contribute to innovative data initiatives in sectors like financial services, life sciences and energy, as part of the LACO team
  • Deepen your expertise in Fabric, Azure and AI, and actively share insights to strengthen the team

Your skills

  • You have 5+ years of experience as a data engineer with growing ownership
  • You bring strong expertise in the Microsoft Azure ecosystem
  • You’ve delivered end-to-end Microsoft Fabric implementations
  • You’re confident in data modelling, from concept to execution
  • You’re an expert in SQL and Python
  • You are fluent in Dutch and/or French and English, both spoken and written
  • Experience with dbt is a plus


Our offer

  • A senior role with a package that reflects your expertise and ambition

  • Access to cutting-edge Microsoft data & AI projects that challenge and grow your skills
  • Coaching, peer support and a strong team to help you thrive
  • Plenty of autonomy to lead, innovate and make your mark
  • A full-time 40-hour workweek with 20 legal holidays and 12 additional compensation days to keep your work-life balance in check

  • A warm, vibrant and supportive culture where teamwork, fun and regular events are part of the DNA


Apply in 4 steps

1

Let’s meet!

Time to get acquainted. We’ll dive deeper into your specific skills and expertise, and you’ll get the opportunity to ask us anything about how we work and what life is like at LACO.

2

Our second date

You’ll meet one of our team leads or a subject-matter expert. Together, we explore your technical knowledge, your strengths and the topics you feel most confident in. Sometimes we include a small, practical exercise to understand how you think and how you approach data challenges.

3

The final date

We’re almost there. Time to meet our HR and Operations Director in our office in Diegem. During a friendly talk, we’ll get to know you and you can get a feel of the atmosphere and culture at LACO.

4

An offer you can’t refuse

Next up: getting down to the nitty-gritty. We’ll make you an offer that’s tailored to your needs and expectations, along with a personal onboarding plan. Welcome to the family!


Ready to lead Fabric projects that make a difference? Get in touch!


Hi, I’m Anne.
I look forward to getting to know you and exploring together whether LACO could be the right place for you to grow.

Last updated: 2026-02-02

Medior Fabric Data Engineer

Are you eager to work hands-on with Microsoft Fabric and help shape the next generation of cloud data solutions? Do you enjoy turning technical complexity into scalable, usable architecture? Then LACO has a challenge made for you.

As our Medior Fabric Data Engineer, you support the design, development and deployment of data solutions within the Microsoft Fabric ecosystem on our clients’ Azure platforms. You work closely with other data engineers, analytics specialists, and developers to build scalable, reliable, and high-performance data pipelines and architectures.

Your job

  • Design and build scalable data solutions in Microsoft Azure, with a focus on Microsoft Fabric
  • Contribute to data projects within the Microsoft Fabric environment, supporting tasks from requirements analysis and architecture design to implementation

  • Ensure all data solutions follow enterprise standards for security, governance and compliance
  • Collaborate with clients and cross-functional teams, providing technical input and supporting project delivery
  • Take part in innovative data initiatives across sectors such as financial services, life sciences, and energy, as part of the LACO team

  • Grow your knowledge in Azure, Fabric and AI, and share insights to support team development

Your skills

  • You have 2+ years of experience as a data engineer
  • You bring strong hands-on experience with the Microsoft Azure ecosystem
  • You’ve worked within Microsoft Fabric and understand its architecture and components
  • You have a solid grasp of data modelling principles
  • You’re skilled in SQL and Python
  • You are fluent in Dutch and/or French and English
  • Familiarity with dbt is a plus


Our offer

  • A strong career opportunity with conditions that match your skills, experience and ambition
  • Continuous learning and access to cutting-edge Microsoft data & AI projects
  • Coaching and support from experienced LACO colleagues
  • Plenty of freedom to take initiative and make your mark

  • A full-time 40-hour workweek with 20 legal holidays and 12 additional compensation days to keep your work-life balance in check

  • A warm, vibrant and supportive culture where teamwork, fun and regular events are part of the DNA


Apply in 4 steps

1

Let’s meet!

Time to get acquainted. We’ll dive deeper into your specific skills and expertise, and you’ll get the opportunity to ask us anything about how we work and what life is like at LACO.

2

Our second date

You’ll meet one of our team leads or a subject-matter expert. Together, we explore your technical knowledge, your strengths and the topics you feel most confident in. Sometimes we include a small, practical exercise to understand how you think and how you approach data challenges.

3

The final date

We’re almost there. Time to meet our HR and Operations Director in our office in Diegem. During a friendly talk, we’ll get to know you and you can get a feel of the atmosphere and culture at LACO.

4

An offer you can’t refuse

Next up: getting down to the nitty-gritty. We’ll make you an offer that’s tailored to your needs and expectations, along with a personal onboarding plan. Welcome to the family!


Ready to shape the future of data with Microsoft Fabric? Get in touch!



Last updated: 2026-02-02

Senior Databricks Data Engineer

Do you enjoy taking full ownership of complex data solutions? Are you the kind of engineer who not only builds scalable pipelines but also sets the bar for best practices in Databricks and Azure? Then this one’s for you.

As our Senior Databricks Data Engineer, you take ownership of the end-to-end delivery of scalable, high-performance data solutions on Microsoft Azure. You work closely with cross-functional teams to architect, develop, and optimise data pipelines and Databricks environments that enable reliable, enterprise-grade analytics and drive measurable business value.

Your job

  • Lead the design, development and optimisation of scalable data solutions on Azure, with a strong focus on Databricks and Spark
  • Architect and implement end-to-end data pipelines (ETL/ELT) for batch and streaming workloads, ensuring performance, reliability, and maintainability
  • Oversee and optimise Databricks environments and Azure data platforms, including CI/CD automation and cloud infrastructure best practices

  • Ensure all solutions meet enterprise standards for security, governance and compliance
  • Collaborate with stakeholders and cross-functional teams to deliver data solutions with measurable business value
  • Mentor medior and junior engineers, sharing best practices in Databricks, Spark and cloud architecture

  • Stay at the forefront of Azure, Databricks and big data technologies, and promote innovative approaches within the team

Your skills

  • You have 5+ years of experience as a data engineer, with strong project ownership
  • You have deep expertise in the Microsoft Azure ecosystem
  • You are an expert in Databricks and Spark (PySpark, Spark SQL)
  • You’re an expert in SQL and Python, and fluent in building complex ETL/ELT pipelines
  • You know your way around data modelling techniques and best practices
  • You understand cloud architecture and DevOps workflows, including CI/CD for data engineering
  • You communicate fluently in Dutch and/or French and English, written and spoken


Our offer

  • A strong career opportunity with a package that reflects your expertise, leadership and impact
  • Access to cutting-edge Microsoft data & AI projects where your skills make a difference
  • Coaching and collaboration with experienced LACO colleagues who support your growth
  • Plenty of freedom to take initiative and make your mark

  • A full-time 40-hour workweek with 20 legal holidays and 12 additional compensation days to keep your work-life balance in check

  • A warm, vibrant and supportive culture where teamwork, fun and regular events are part of the DNA


Apply in 4 steps

1

Let’s meet!

Time to get acquainted. We’ll dive deeper into your specific skills and expertise, and you’ll get the opportunity to ask us anything about how we work and what life is like at LACO.

2

Our second date

You’ll meet one of our team leads or a subject-matter expert. Together, we explore your technical knowledge, your strengths and the topics you feel most confident in. Sometimes we include a small, practical exercise to understand how you think and how you approach data challenges.

3

The final date

We’re almost there. Time to meet our HR and Operations Director in our office in Diegem. During a friendly talk, we’ll get to know you and you can get a feel of the atmosphere and culture at LACO.

4

An offer you can’t refuse

Next up: getting down to the nitty-gritty. We’ll make you an offer that’s tailored to your needs and expectations, along with a personal onboarding plan. Welcome to the family!


Ready to lead as a Senior Databricks Engineer? Get in touch!



Last updated: 2026-02-02

Medior Databricks Data Engineer

Do you enjoy working hands-on with Databricks and Spark to turn raw data into scalable pipelines? Ready to help shape the future of cloud-native data platforms? Then LACO has a role you’ll want to dig into.

As our Medior Databricks Data Engineer, you support the design, development and deployment of data solutions within the Databricks ecosystem on our clients’ Azure platforms. You work closely with other data engineers, analytics specialists, and developers to build scalable, reliable, and high-performance data pipelines and architectures.

Your job

  • Contribute to the design, development and optimisation of scalable data solutions in Microsoft Azure, with a strong focus on Databricks and Spark

  • Implement data pipelines (ETL/ELT) for batch and streaming workloads, ensuring performance, reliability, and maintainability

  • Support the setup and optimisation of Databricks and Azure environments, including CI/CD automation and cloud infrastructure practices

  • Help ensure solutions meet enterprise standards for security, governance and compliance

  • Collaborate with stakeholders and cross-functional teams to deliver data solutions that drive business impact

  • Share knowledge and best practices with junior engineers, while continuing to grow your own expertise in Databricks, Spark, ETL and cloud architecture

Your skills

  • You have at least 2 years of experience as a data engineer
  • You have a solid understanding of the Microsoft Azure ecosystem
  • You have hands-on experience with Databricks, including Spark (PySpark, Spark SQL)
  • You have strong SQL and Python skills
  • You have a solid understanding of data modelling concepts
  • You have experience with building robust ETL/ELT pipelines
  • You’re familiar with cloud architecture and DevOps practices, including CI/CD for data engineering
  • You communicate fluently in Dutch and/or French and English, written and spoken


Our offer

  • An exciting career opportunity in a dynamic environment, offering a competitive salary that rewards your skills, drive and potential, fully aligned with sector standards

  • Access to cutting-edge Microsoft data & AI projects that keep your knowledge sharp
  • Continuous learning and coaching from experienced LACO colleagues
  • Plenty of freedom to take initiative and make your mark

  • A full-time 40-hour workweek with 20 legal holidays and 12 additional compensation days to keep your work-life balance in check

  • A warm, vibrant and supportive culture where teamwork, fun and regular events are part of the DNA


Apply in 4 steps

1

Let’s meet!

Time to get acquainted. We’ll dive deeper into your specific skills and expertise, and you’ll get the opportunity to ask us anything about how we work and what life is like at LACO.

2

Our second date

You’ll meet one of our team leads or a subject-matter expert. Together, we explore your technical knowledge, your strengths and the topics you feel most confident in. Sometimes we include a small, practical exercise to understand how you think and how you approach data challenges.

3

The final date

We’re almost there. Time to meet our HR and Operations Director in our office in Diegem. During a friendly talk, we’ll get to know you and you can get a feel of the atmosphere and culture at LACO.

4

An offer you can’t refuse

Next up: getting down to the nitty-gritty. We’ll make you an offer that’s tailored to your needs and expectations, along with a personal onboarding plan. Welcome to the family!


Ready to build better data solutions with Databricks? Get in touch!



Last updated: 2026-02-02

Analytics Engineer

Do you enjoy transforming raw data into well-structured, reliable and accessible data products that empower business teams and decision-makers? Then you’ll feel at home at LACO.

As our Analytics Engineer, you’ll sit right at the intersection of data engineering, analytics and business insight. You’ll translate business requirements into functional data pipelines, models and visualizations, focusing on analytics deliverables rather than platform administration.

Your job

  • Understand business needs and translate them into technical solutions by gathering requirements from stakeholders, understanding their KPIs and information needs, and designing data products that meet those requirements

  • Data modeling: design and maintain analytical data models (such as dimensional models and data marts) that reflect business logic and support reporting/analysis
  • Data transformation and pipelines: build, maintain and optimise ETL/ELT pipelines using SQL, Python, and cloud-based data platforms to extract, clean, transform and load data

  • Data visualisation and reporting: build dashboards and reports in BI tools like Power BI, Tableau or an equivalent, enabling business users to explore and understand metrics and trends
  • Collaborate with business and IT stakeholders to ensure data solutions match real business needs, acting as a bridge between technical and non-technical teams

  • Data governance & quality: ensure data models and pipelines comply with governance, documentation, and quality standards; implement data validation, testing, lineage, and full documentation of data flows

  • Focus on data products: deliver usable, stable and well-documented data products (models, tables, dashboards) that business users can rely on, going beyond just building infrastructure

Your skills

  • You have strong SQL skills, able to write complex queries, optimise performance, and build robust data pipelines and data models
  • You have solid experience in data modelling, including dimensional modelling, fact/dimension tables and star or snowflake schemas (or equivalents)
  • Working knowledge of Python is a plus
  • You can use BI and data visualisation tools (Power BI, Tableau) to create dashboards and reports that truly support business decisions
  • You are familiar with cloud data platforms, data warehouses and data lakehouses
  • You have strong business and data understanding: you can work with stakeholders, translate business needs into data requirements, understand domain logic and deliver useful data products
  • You have excellent communication skills, able to explain technical data concepts to non-technical stakeholders and clearly document data models, transformations, lineage and business logic


Our offer

  • An exciting career opportunity in a dynamic environment, offering a competitive salary that rewards your skills, drive and potential, fully aligned with sector standards

  • Continuous learning and access to cutting-edge Microsoft data & AI projects that keep you ahead of the curve

  • Learning and growth through hands-on coaching and support from experienced colleagues
  • Plenty of freedom to take initiative and make your mark

  • A full-time 40-hour workweek with 20 legal holidays and 12 additional compensation days to keep your work-life balance in check

  • A warm, vibrant and supportive culture where teamwork, fun and regular events are part of the DNA


Apply in 4 steps

1

Let’s meet!

Time to get acquainted. We’ll dive deeper into your specific skills and expertise, and you’ll get the opportunity to ask us anything about how we work and what life is like at LACO.

2

Our second date

You’ll meet one of our team leads or a subject-matter expert. Together, we explore your technical knowledge, your strengths and the topics you feel most confident in. Sometimes we include a small, practical exercise to understand how you think and how you approach data challenges.

3

The final date

We’re almost there. Time to meet our HR and Operations Director in our office in Diegem. During a friendly talk, we’ll get to know you and you can get a feel of the atmosphere and culture at LACO.

4

An offer you can’t refuse

Next up: getting down to the nitty-gritty. We’ll make you an offer that’s tailored to your needs and expectations, along with a personal onboarding plan. Welcome to the family!


Ready to join our team as an analytics engineer? Get in touch!


Hi, I’m Anne.
I look forward to getting to know you and exploring together whether LACO could be the right place for you to grow.

Analytics Engineer | 2026-02-02T12:55:33+00:00

Senior AI Engineer

Do you speak fluent Python? Does automating MLOps pipelines on Microsoft Azure and fine-tuning GPTs sound like your idea of a good time? Great, then LACO has a challenge with your name on it.

As our Senior AI Engineer, you own the end-to-end delivery of AI solutions on Microsoft Azure. You won’t just be building models; you’ll be delivering real-world AI. Working closely with cross-functional teams, you’ll architect, develop, and deploy high-performance, scalable AI models that drive real business value. From fine-tuning large language models to shaping MLOps pipelines, you’ll lead projects with impact and help others grow along the way.

Your job

  • Lead the design, development and deployment of machine learning models using Azure Machine Learning
  • Lead the design, development and deployment of Generative AI applications and AI copilots using Azure AI Foundry

  • Architect and implement end-to-end data and MLOps pipelines for training, deployment, monitoring and CI/CD automation

  • Oversee and optimise AI solutions in production to ensure scalability, reliability and performance
  • Champion responsible AI principles, ensuring fairness, explainability and privacy across all projects

  • Collaborate with stakeholders and cross-functional teams to deliver AI solutions that generate measurable business value

  • Mentor and guide junior engineers in best practices for ML development, MLOps and Azure-based AI solutions

Your skills

  • You have at least 5 years of experience as a senior AI or ML engineer, taking end-to-end ownership of AI projects (design, development and deployment)

  • You are proficient in Python and experienced with ML frameworks such as scikit-learn, TensorFlow and PyTorch, as well as open-source models from Hugging Face

  • You bring extensive hands-on experience with the Microsoft Azure ecosystem, including Azure Machine Learning, Azure Data Lake, Azure Data Factory, Azure DevOps, AKS and AI Foundry

  • You are familiar with OpenAI’s GPT-4 Turbo with Vision, Falcon, Stable Diffusion, LLaMA 2, Mistral and Gemma, and understand hybrid and semantic search for retrieval-augmented generation (RAG) applications

  • You apply best practices in MLOps, including version control, experiment tracking and automated pipeline deployment

  • You are skilled in CI/CD, Docker and infrastructure as code (ARM, Bicep or Terraform) to support robust, scalable AI workflows

  • You are experienced in analysing and optimising model performance, and implementing improvements for production-grade AI systems

  • You combine strong analytical and problem-solving skills with the ability to guide teams and communicate clearly with both technical and non-technical stakeholders

Our offer

  • An exciting career opportunity in a dynamic environment, offering a competitive salary that rewards your skills, drive and potential, fully aligned with sector standards

  • Continuous learning and access to cutting-edge Microsoft data & AI projects that keep you ahead of the curve
  • Coaching and guidance from experienced LACO colleagues to help you thrive

  • Plenty of freedom to take initiative and make your mark
  • A full-time 40-hour workweek with 20 legal holidays and 12 additional compensation days to keep your work-life balance in check

  • A warm, vibrant and supportive culture where teamwork, fun and regular events are part of the DNA


Apply in 4 steps

1

Let’s meet!

Time to get acquainted. We’ll dive deeper into your specific skills and expertise, and you’ll get the opportunity to ask us anything about how we work and what life is like at LACO.

2

Our second date

You’ll meet one of our team leads or a subject-matter expert. Together, we explore your technical knowledge, your strengths and the topics you feel most confident in. Sometimes we include a small, practical exercise to understand how you think and how you approach data challenges.

3

The final date

We’re almost there. Time to meet our HR and Operations Director in our office in Diegem. During a friendly talk, we’ll get to know you and you can get a feel of the atmosphere and culture at LACO.

4

An offer you can’t refuse

Next up: getting down to the nitty-gritty. We’ll make you an offer that’s tailored to your needs and expectations, along with a personal onboarding plan. Welcome to the family!


Ready to share your experience as senior AI engineer? Get in touch!


Hi, I’m Anne.
I look forward to getting to know you and exploring together whether LACO could be the right place for you to grow.

Senior AI Engineer | 2026-02-02T12:56:03+00:00

Medior AI Engineer

Love working in Microsoft Azure and bringing AI concepts to life? LACO has the perfect role for you.

As our Medior AI Engineer, you will go beyond supporting projects. You will actively shape real-world AI in close collaboration with data scientists, data engineers and software developers. From designing and deploying scalable, reliable AI models and building pipelines to experimenting with Generative AI, you’ll turn complex challenges into scalable solutions while growing your expertise every step of the way.

Your job

  • Support the development and implementation of AI solutions on our client’s Azure platform

  • Deploy machine learning models using Azure Machine Learning Studio

  • Develop Generative AI applications, including AI copilots, using Azure AI Foundry

  • Build data pipelines for data ingestion, training and deployment (CI/CD) of models
  • Monitor, adapt and maintain AI solutions within a production environment
  • Apply responsible AI principles such as fairness, explainability and privacy in every step
  • Collaborate with stakeholders and teams to implement AI applications that deliver value


Your skills

  • You have at least 2 years of experience as an AI or ML engineer, with hands-on project delivery

  • You are proficient in Python and experienced with ML frameworks such as scikit-learn, TensorFlow and PyTorch, as well as open-source models from Hugging Face

  • You have worked with OpenAI’s GPT-4 Turbo with Vision, Falcon, Stable Diffusion and Llama 2, using hybrid and semantic search to power retrieval-augmented generation (RAG) applications. Familiarity with other LLMs, including Mistral, Llama and Gemma, is highly valued

  • You are comfortable working within the Microsoft Azure ecosystem, including Azure Machine Learning, Azure Data Lake, Azure Data Factory, Azure DevOps, AKS and AI Foundry

  • You apply MLOps practices, including version control, experiment tracking and pipeline automation

  • You are familiar with CI/CD, Docker and infrastructure as code (ARM, Bicep or Terraform)

  • You analyse model performance and implement improvements

  • You combine strong analytical thinking with a hands-on problem-solving mindset

Our offer

  • An exciting career opportunity in a dynamic environment, offering a competitive salary that rewards your skills, drive and potential, fully aligned with sector standards

  • Access to innovative, cutting-edge Microsoft data & AI projects to challenge and grow your expertise
  • Continuous learning and guidance from experienced LACO coaches and colleagues
  • Plenty of freedom to take initiative and make your mark

  • A full-time 40-hour workweek with 20 legal holidays and 12 additional compensation days to keep your work-life balance in check
  • A warm, vibrant and supportive culture where teamwork, fun and regular events are part of the DNA


Apply in 4 steps

1

Let’s meet!

Time to get acquainted. We’ll dive deeper into your specific skills and expertise, and you’ll get the opportunity to ask us anything about how we work and what life is like at LACO.

2

Our second date

You’ll meet one of our team leads or a subject-matter expert. Together, we explore your technical knowledge, your strengths and the topics you feel most confident in. Sometimes we include a small, practical exercise to understand how you think and how you approach data challenges.

3

The final date

We’re almost there. Time to meet our HR and Operations Director in our office in Diegem. During a friendly talk, we’ll get to know you and you can get a feel of the atmosphere and culture at LACO.

4

An offer you can’t refuse

Next up: getting down to the nitty-gritty. We’ll make you an offer that’s tailored to your needs and expectations, along with a personal onboarding plan. Welcome to the family!


Ready to grow as a Medior AI Engineer?


Hi, I’m Anne.
I look forward to getting to know you and exploring together whether LACO could be the right place for you to grow.

Medior AI Engineer | 2026-02-02T12:56:26+00:00

Belfius accelerates innovation by migrating its data platform to Microsoft Azure

A new data platform based on Microsoft Azure is helping Belfius boost operational efficiency, while also enabling faster decision-making and greater innovation. LACO supported the initiative by guiding the process analysis, coaching employees, and leveraging its in-depth knowledge of the Belgian financial sector to facilitate the migration as a local partner.

The problem: a data platform nearing end of life

Belfius is a Belgian banking and insurance group serving approximately 3.8 million customers across Belgian society and its economy. Over the years, the group has built a long history with on-premises technology from SAS. However, a dual challenge forced Belfius to review its existing SAS environment.

On the one hand, the platform was gradually reaching its limits and could no longer provide sufficient support for future data challenges – including the use of AI. On the other hand, SAS advised Belfius to migrate from its on-premises platform to SAS Viya, a cloud-based solution, by 2027.

This challenge arose at a time when the Belfius group was already embarking on a broader transition of its technological infrastructure. The IT architecture team was working on a strategy with the Microsoft Azure cloud as one of the central platforms. “It made sense to include the data platform in that exercise,” says Tom Bisschop, Program Manager ICT Data & Reporting at Belfius Insurance. “After all, the ultimate goal was to develop a future-proof data platform for both the bank and the insurance division.”

The solution: move to the cloud

After reviewing its options, Belfius concluded that SAS was no longer the best solution for its needs. “Partly because we are currently seeing a rapid evolution in data usage,” continues Tom, “it quickly became clear that, given our plans, Microsoft Azure would offer us many more possibilities.”

This decision also repositioned LACO within Belfius. While LACO had long been responsible for implementation and support for SAS, the company has also built strong expertise in Azure in recent years. “LACO made it clear that we were far from the only company considering the move to Azure,” says Tom. “That gave us confidence that we could continue to rely on LACO for this new chapter in our technological journey.”

The first challenge was to establish a solid foundation in Azure for both the banking and insurance businesses. “We began by drawing up a comprehensive business case and defining a clear Cloud Adoption Strategy,” explains Tom. From there, LACO and Belfius developed a phased roadmap for migrating the on-premises data platform to Azure. This included designing a robust architecture and setting up a scalable data platform framework, supported by a thorough security assessment and an accompanying approach.

Next, Belfius and LACO launched a pilot with a small selection of SAS applications. “By migrating those applications to Azure Synapse, we were able to gain valuable hands-on experience.” And through hands-on coaching and training, LACO familiarized Belfius with Spark and Python so that they could continue building on the new platform themselves.

The pilot took some of the pressure off the overall project. “We didn’t have to rush things,” says Tom. Once the pilot proved successful, Belfius accelerated the migration of the entire environment. “LACO’s specific experience with our insurance business was a major added value. We rewrote outdated components for the new Azure environment, which relies on Spark and Python. LACO played an important role in the associated analysis.”

The result: a high-performance, future-proof framework

LACO closely monitored the new framework’s performance. This proved to be a key consideration, and it was also a major departure from the old on-premises setup. When working on-premises, the amount of computing power is inherently limited by the available hardware. In the cloud, computing power is unlimited, but of course it comes at a price. LACO succeeded in striking the right balance between performance and cost. The solution on Azure Synapse is now faster than the previous on-premises environment, illustrating how LACO succeeded in elevating Belfius’s cloud maturity while keeping costs under control.

Employee coaching also proved critical to the project’s success. Migrating to a completely new platform brings significant change. “It’s never easy to let go of a familiar environment,” says Tom. “LACO addressed employees’ questions with targeted coaching and acted as a bridge between departments. That approach made it clear that ultimately everyone benefits from the shift to Azure.”

With the migration, LACO played an important role in a key objective for Belfius: a future-proof data platform. “Self-service capabilities will increase significantly,” Tom notes. “The business now has more flexibility without having to call in the IT department at every turn.” This eliminates a significant amount of grunt work for the IT department.

In addition, insurance and banking now share a single data platform. “This means greater efficiency and allows us to roll out new data capabilities more easily to all data users, while also complying with all legal and regulatory requirements.” With the new data platform, Belfius is now in a strong position to fully – and rapidly – dedicate itself to innovation, including AI.

Ready to become a data-driven powerhouse?

Belfius accelerates innovation by migrating its data platform to Microsoft Azure | 2026-02-03T11:15:56+00:00

Power BI | From data to insight

This training, exclusively organized for your company, provides a structured and practical introduction to Microsoft Power BI, focusing on how to transform raw data into clear, reliable, and actionable insights.

Participants will learn not only how to build reports, but also why certain modelling and design choices matter in a professional BI environment.
The course combines conceptual foundations with hands-on exercises, ensuring that knowledge can be applied immediately in real-world scenarios.
We tailor this training completely to your needs and your environment.

What you’ll learn

  • Understand the Power BI ecosystem (Desktop, Service, data refresh, sharing)
  • Import and transform data using Power Query
  • Build a solid data model using best practices (relationships, star schema)
  • Create DAX measures for calculations and KPIs
  • Design clear, user-friendly, and performant reports
  • Apply basic performance and usability principles
  • Publish and share reports securely within an organisation

Who should attend

Data analysts and reporting professionals

Business users who want to create their own reports

Consultants and BI developers new to Power BI

Anyone working with Excel, SQL, or other reporting tools who wants to move to Power BI

Programme

Introduction & Architecture: Power BI overview, components and typical BI workflows

Data Preparation (Power Query): data connections, cleaning, shaping, combining and reusable transformations

Data Modelling: fact/dimension tables, relationships, cardinality and common pitfalls

DAX Fundamentals: measures vs calculated columns, core functions (CALCULATE, FILTER, time intelligence) and row vs filter context

Report Design & Visualisation: choosing the right visuals, layout, storytelling and user interaction (filters, slicers, drill-through)

Publishing & Sharing: Power BI Service, workspaces, access rights and refresh strategies

Content is tailored to participants’ level and preferences.

Location

This training can be held at the LACO office or at your training facilities.

FAQ

What is the required level of prior knowledge or experience for this training?

No specific prior experience is required. A basic familiarity with general data concepts and terminology (such as data analysis, modelling, or reporting) is helpful, but the bootcamp is designed for both beginners and those with some hands-on experience.

Is lunch, coffee, or catering included in the price?

When the training is organized at LACO training facilities, lunch and beverages are provided.

Will the training language always be English (or Dutch/French)?

Yes, the training is delivered in English. On demand, and for specific groups, a Dutch or French session may be arranged. Please let us know your preference upon registration.

What is the duration of the training?

The duration of the training depends on the content that is tailored for you.

Ready to level up your data skills?

Start the conversation.

You know your team, we know our training. Tell us who should join, when it suits you, and where you’d like the session – at LACO or on-site. We’ll take it from there.

Power BI | From data to insight | 2026-01-23T11:20:58+00:00

The evolution of ESG Reporting: a new era of transparency

In today’s increasingly sustainability-focused world, businesses are more than mere profit-making entities. They play a vital role in shaping our planet’s future. As such, Corporate Sustainability Reporting – the act of publicly sharing a company’s environmental, social, and governance (ESG) goals and their progress towards them – has become an essential aspect of modern business practice. This level of transparency not only showcases a company’s commitment to sustainable practices but also builds trust with stakeholders – such as investors, customers, employees, and the wider community.

In Europe, this move towards transparency has been significantly propelled by the Non-Financial Reporting Directive (NFRD). This key piece of legislation mandates large companies to disclose certain non-financial data related to sustainability, creating a culture of accountability.

European Corporate Sustainability Reporting has evolved from the implementation of the NFRD, through the rise of the European Green Deal, to the birth of the Corporate Sustainability Reporting Directive (CSRD) and the European Sustainability Reporting Standards (ESRS). What do these distinct directives and standards signify? How have they developed? And what is their impact on business?

Introducing the Non-Financial Reporting Directive (NFRD)

The Non-Financial Reporting Directive (NFRD) is a significant piece of legislation adopted by the European Union in 2014 (see figure 1). It requires certain companies to provide non-financial information, often in the form of ‘sustainability reports’, along with their annual reports. This directive applies to large public-interest entities, such as listed companies, banks, and insurance companies, with more than 500 employees, constituting approximately 11,700 companies and groups within the EU.

The NFRD aims to evaluate the non-financial performance of these companies and encourages them to develop a responsible approach to business. The directive requires public disclosure documents to include topics such as

  • environmental protection,
  • social responsibility,
  • treatment of employees,
  • respect for human rights, and
  • anti-corruption and bribery issues.

The main purposes of the NFRD are to make non-financial information available to stakeholders and investors, and to increase business transparency and accountability. While the Commission’s accompanying reporting guidelines are non-binding, they have set a clear course towards greater corporate sustainability reporting.

The interplay between NFRD and ESG

Environmental, Social, and Governance (ESG) refers to three central factors used in measuring the sustainability and societal impact of a company or business.

  • The Environmental aspect focuses on how a company’s operations affect the natural environment, considering elements such as waste management, energy efficiency, and carbon footprint.
  • The Social component examines how a company manages relationships with its employees, suppliers, customers, and communities where it operates. This includes aspects like labor practices, diversity and inclusion, and human rights.
  • Governance pertains to a company’s leadership, executive pay, audits, internal controls, and shareholder rights. It reflects how a company is governed and the standards it upholds in its business practices.

The European Union’s Non-Financial Reporting Directive (NFRD) is important for ESG reporting. It encourages businesses to take ESG factors into account and report on them, which promotes sustainable business practices throughout Europe.

The emergence of the European Green Deal

The European Green Deal is a set of policy initiatives by the European Commission aimed at making Europe climate neutral by 2050. Unveiled in December 2019 (see Figure 1), it represents a roadmap towards a sustainable economy and involves significant investment in green technologies, sustainable solutions, and innovative businesses. It outlines specific policy initiatives across various sectors, from significantly cutting greenhouse gas emissions and investing in research and innovation to rolling out cleaner, cheaper, and healthier forms of private and public transport.

The NFRD, on the other hand, is a key tool that aligns with and supports these goals. By requiring large companies to report on their environmental and social impacts, and governance practices, the NFRD helps ensure that the corporate sector is contributing to – rather than hindering – the objectives of the European Green Deal. For instance, a company reporting under NFRD would need to disclose its carbon emissions and plans for reduction, which directly ties into the Green Deal’s goal of reducing greenhouse gas emissions.

In essence, while the European Green Deal sets the broader vision and targets for a sustainable European economy, the NFRD provides a framework for businesses to contribute to this vision through increased transparency and accountability. This reporting not only increases transparency but also encourages companies to develop more sustainable business models and practices.


Figure 1: Timeline of the evolution of the European Corporate Sustainability Reporting Directive

The evolution from NFRD to the Corporate Sustainability Reporting Directive (CSRD)

The NFRD has been a significant step towards enhancing corporate transparency in Europe. However, recognising the need for more comprehensive and standardised reporting, the European Union has recently introduced a new directive, the Corporate Sustainability Reporting Directive (CSRD), which builds upon and expands the scope of the NFRD.

1. Extending the company scope

One of the key changes brought about by the CSRD is the increase in the number of companies required to provide non-financial information (see Figure 2). While the NFRD applied to large public-interest entities with over 500 employees, the CSRD extends this requirement to all large companies and all companies listed on regulated markets (except micro-enterprises). This expanded scope covers approximately 49,000 entities, a significant increase from the previous 11,600.

2. Introduction of mandatory EU reporting standards

The CSRD also provides more detailed reporting requirements. It introduces mandatory EU sustainability reporting standards, aiming to ensure that reports across different companies and sectors are comparable. This is a significant shift from the NFRD, which provided only general guidelines for reporting.

3. Obligatory audit

Furthermore, the CSRD requires an audit (assurance) of reported information, similar to the audits required for financial information. This marks a major step towards ensuring the reliability and accuracy of non-financial reports.

The introduction of the CSRD represents a substantial advancement in the EU’s commitment to sustainable finance and corporate transparency. As companies begin to adapt to these new regulations, they will play a critical role in Europe’s broader ambition to achieve a sustainable, net-zero economy.


Figure 2: Extended coverage from NFRD to CSRD

The implications of the European Sustainability Reporting Standards (ESRS)

“The CSRD determines which companies must report, on what topics, where and when. The ESRS provides the ‘how’.”

The CSRD is an overarching directive that mandates companies to disclose specific non-financial information, including their impacts on the environment and society. The European Sustainability Reporting Standards (ESRS), on the other hand, are the detailed standards that companies subject to the CSRD must use when preparing their sustainability reports. These standards, drafted by the European Financial Reporting Advisory Group (EFRAG), specify what companies must report on, providing detailed guidelines on topics such as climate change, water and biodiversity, and employee-related matters.

In essence, while the CSRD determines which companies must report, on what topics, where and when, the ESRS provides the ‘how’ – the specific principles and requirements that companies must follow when reporting on these topics.

Example of difference between CSRD and ESRS

An example to illustrate this relationship could be seen in the area of climate change reporting. The CSRD might require a company to report on its climate impacts, risks, and opportunities, whereas the ESRS would provide detailed instructions on how the company should measure and report its greenhouse gas emissions, how it should assess and explain its climate-related risks, and how it should disclose its strategies and targets for climate mitigation and adaptation.

Therefore, the ESRS and the CSRD work hand-in-hand to ensure that companies across the EU provide comprehensive, consistent, and comparable sustainability disclosures, supporting the EU’s broader goals of sustainability and transparency in the corporate sector.

Conclusion: the role of data in Corporate Sustainability Reporting

Data plays an integral role in the context of CSRD and ESRS. In the context of corporate sustainability, data can be thought of as concrete evidence that demonstrates a company’s environmental, social, and governance performance. For example, a company might record data on its carbon emissions, its waste management procedures, or its diversity policies.

From the CSRD’s perspective, accurate and reliable data is critical, as it forms the basis for these disclosures and ensures that sustainability reports across different companies and sectors are comparable. Therefore, maintaining high-quality data is not just a compliance requirement, but a vital element in driving the sustainability agenda forward. Data governance plays a critical role here, acting as the backbone that ensures the integrity, accuracy, and reliability of reported information. In essence, data governance provides the framework that enables companies to deliver accurate and trustworthy sustainability information, thereby leveraging their data as a strategic asset.
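To make this concrete, the kind of data-quality rule a governance framework might enforce over emissions data can be sketched in a few lines of Python. The record fields, field names, and thresholds below are purely hypothetical illustrations, not requirements drawn from the CSRD or the ESRS:

```python
# Illustrative sketch: a minimal data-quality check such as a governance
# process might run on ESG records before they feed a sustainability report.
# All field names and thresholds are hypothetical, not taken from the ESRS.

def validate_emissions_record(record):
    """Return a list of data-quality issues found in one emissions record."""
    issues = []

    # Completeness: every disclosure needs these core fields.
    for field in ("entity", "year", "scope1_tco2e"):
        if record.get(field) is None:
            issues.append(f"missing field: {field}")

    # Validity: reported emissions cannot be negative.
    value = record.get("scope1_tco2e")
    if value is not None and value < 0:
        issues.append("scope1_tco2e must be non-negative")

    # Plausibility: the reporting year should fall in a sensible window.
    year = record.get("year")
    if year is not None and not (2014 <= year <= 2030):
        issues.append(f"implausible reporting year: {year}")

    return issues


records = [
    {"entity": "PlantA", "year": 2023, "scope1_tco2e": 1250.0},  # clean
    {"entity": "PlantB", "year": 2023, "scope1_tco2e": -5.0},    # invalid value
]
for r in records:
    print(r["entity"], validate_emissions_record(r))
```

In practice such checks would live in a dedicated data-quality or governance tool rather than ad-hoc scripts, but the principle is the same: codified, repeatable rules that catch errors before figures reach an audited report.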

For business executives, the European Green Deal presents both challenges and opportunities, as companies will need to adapt to new regulations and standards, but also stand to benefit from increased investment in the green economy. And in the world of CSRD, data is not just numbers, but a powerful tool for driving positive change. With data governance at its core, companies can ensure that their sustainability disclosures are reliable and trustworthy, positioning them to seize the opportunities presented by this new era of responsible business.

The evolution of ESG Reporting: a new era of transparency (published 2025-12-30)