
Moving your SAS platform to the cloud: business lessons learned

Once you’ve moved your data platform to the cloud, your work as an IT professional tends to get a lot easier. But to be honest, getting that platform there in the first place can be quite a daunting task. So how do you tackle that?

Now, when it comes to SAS migration in general, LACO has always been somewhat of a pioneer on the Belgian market. As a matter of fact, LACO was the very first SAS partner to successfully deliver a SAS cloudification project in Belgium. Here are three important lessons we’ve learned from our experience with SAS cloudification so far. You might use them to your advantage!

Lesson 1: Take on the legal and regulatory hurdles from the start

The days when IT professionals sincerely worried about cloud security are long past. Most of us have come to trust the high level of security built into the major cloud platforms, which protect data by design and by default when used correctly. Unfortunately, that hasn’t stopped some of our business colleagues from still worrying about these issues. Their fears need to be acknowledged too, of course, but it is then up to us to address those concerns with the clear, hard facts we have at our disposal today. All the more so because those persistent fears could turn out to be a real showstopper. If your CFO is not comfortable with moving their data to the cloud, for instance, then your project risks never taking off in the first place.

And speaking of showstoppers: in certain sectors, such as the insurance industry, there is a set of mandatory legal rules and government restrictions that you absolutely have to take into account before you can even think of moving your data platform to the cloud. These compliance demands are not insurmountable, but they will require you to obtain several official approvals and sometimes even to undergo a risk assessment. That usually takes time, as there are no shortcuts or detours. This is why, in these specific sectors, we always start a cloudification project by tackling the legal and regulatory hurdles: if they cannot be overcome, the project simply cannot move ahead.

Lesson 2: Be clear on the business case

A popular misconception that we often come across, even though people should really know better by now, is the idea that running your IT infrastructure in the cloud is by definition cheaper than running it in your own data centre – or having a hosting partner run it for you. In our experience, however, simply migrating your servers to the cloud rarely brings any real value to your business, not even from a purely financial perspective. On the contrary: it often turns out to be more expensive.

Don’t misunderstand us, though: this only applies if you use the cloud the way you would use your former on-premises data centre, leaving all your servers running 24/7 all year round. The great thing about the cloud is that it allows you to run only the servers you actually require, switching systems on and off at a moment’s notice. It basically lets you add or remove hardware resources according to your actual computing needs. So if, say, you have a reporting environment that your business colleagues only use intensively during working hours, you can decrease the server capacity for that environment before and after those hours. Another typical example of cost savings is a testing environment: instead of keeping it running all the time, even during weekends, you could limit yourself to using that part of your infrastructure only when you actually need to do some testing.
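To make that concrete, here is a minimal sketch of such a working-hours schedule, assuming the reporting servers run as AWS EC2 instances managed with boto3. The instance IDs and hours are hypothetical placeholders, and the same idea applies to any other cloud provider or scheduling mechanism.

```python
# Minimal sketch: keep a reporting server running during business hours only.
# Assumes AWS EC2 and boto3; instance IDs and hours are placeholders.
import datetime

import boto3

REPORTING_INSTANCES = ["i-0123456789abcdef0"]  # hypothetical instance IDs
BUSINESS_HOURS = range(7, 19)                  # 07:00-18:59 local time

def align_capacity_with_working_hours():
    ec2 = boto3.client("ec2")
    now = datetime.datetime.now()
    in_business_hours = now.weekday() < 5 and now.hour in BUSINESS_HOURS
    if in_business_hours:
        ec2.start_instances(InstanceIds=REPORTING_INSTANCES)
    else:
        ec2.stop_instances(InstanceIds=REPORTING_INSTANCES)

if __name__ == "__main__":
    # Run this from a scheduler (cron, a serverless function, ...) every hour.
    align_capacity_with_working_hours()
```

Scheduled like this, a few lines of automation are often enough to cut the running hours of a reporting or test environment by more than half.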

So by using that flexibility, which is typical of the cloud, you can effectively optimise and strengthen your financial business case. Nevertheless, if you ask us, the real key to cloud success lies in going beyond the financials and approaching SAS cloudification not as a migration project but as an optimisation project. Instead of regarding the cloud merely as an alternative to your own data centre, or that of your hosting provider, and in a way continuing what you’ve always been doing, you should treat it as a springboard to a new world with possibilities you could only dream of before.

If you look to the cloud for new capabilities and extra functionality, you might just discover that there are applications and features within your reach, such as advanced disaster recovery, that you would never have been able to deploy with just your own data centre.

Lesson 3: Match your licensing models

Moving your SAS data platform to the cloud also requires matching the different licensing models. This is especially challenging when you’re dealing with an older licensing model for your data platforms, since these older models – and not just those used by SAS – are still very much bound to physical hardware such as CPU cores. That is not necessarily the case, of course, with virtualised and cloud environments, where usage- and client-based licensing models continue to grow in popularity.

Matching licensing models is somewhat less challenging, as you can probably imagine, for those customers who have already moved on to SAS’ latest data platform: SAS Viya. Running on a scalable, cloud-native architecture, SAS Viya is an open and cloud-ready platform. Consequently, SAS Viya customers benefit fully from an easier cloud migration when it comes to licence fees.

The exercise of matching your software vendor’s licensing agreement with your needs in terms of scalability and elasticity has to be done right from the start, as it might be another showstopper. That is why we invariably advise our customers to reach out and establish a satisfactory agreement with their vendor and, in some cases, even their cloud provider. Not only do they usually have a number of licensing programmes to choose from, sometimes with discounts that customers can benefit from, they can also help to establish a smooth transition period. After all, you don’t move to the cloud overnight, do you?

So much for the business lessons learned from our SAS cloudification projects. Feel like diving a little deeper into the actual technology? Head over to our blog post with technical lessons learned. But first: check out our SAS cloudification page!


Getting everyone on board for your data platform cloudification project: 4 major hurdles to take

Your data platform has to move to the cloud: you yourself are fully and absolutely convinced of the ever more urgent need to make that happen. More importantly, you see the many potential benefits of such a move.

That is the easy part, however, because you still have to convince all the other internal stakeholders involved if you want your cloudification project to actually succeed. This means selling your project internally to your colleagues, or at least to those colleagues who have the authority and the necessary leverage to get your entire organisation on board.

All in all, there are four major hurdles to clear before you can really kick off such an ambitious, strategic project. In what follows, we will explain how you can best prepare your organisation to take those hurdles. In other words: which checks you need to perform, and which tasks you need to do, to gain better insight into your cloudification initiative and turn all the stakeholders in your organisation into co-promoters.

Technical hurdles: some basics to check with your CIO

Check your bandwidth

You will need to check and test the effective bandwidth and quality of the network that connects your data centre with your cloud provider, enabling the up- and downloading of data between the two.

Check your data transfer volumes

In your network analysis, you also need to take into account your current way of working with your data platform and the resulting data volumes that will be transferred between your local and cloud environments. If that turns out to be a potential bottleneck, you should re-evaluate it and seriously consider adopting new, more appropriate techniques and potential design changes. Otherwise this could turn out to be a complete showstopper!
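As a back-of-the-envelope illustration, and certainly not a substitute for a proper network assessment, the sketch below estimates how long a given volume takes to transfer over your link to the cloud. The volume, bandwidth and efficiency figures are made-up assumptions; plug in your own numbers to gauge the order of magnitude.

```python
# Rough estimate of transfer time for a given data volume and link bandwidth.
# The 'efficiency' factor is an assumption covering protocol and other overhead.
def transfer_time_hours(volume_gb: float, bandwidth_mbps: float,
                        efficiency: float = 0.7) -> float:
    volume_megabits = volume_gb * 8 * 1000              # GB -> megabits (decimal)
    seconds = volume_megabits / (bandwidth_mbps * efficiency)
    return seconds / 3600

# Example: a 2 TB initial load over a 500 Mbps line at 70% effective throughput
print(f"{transfer_time_hours(2000, 500):.1f} hours")    # roughly 12.7 hours
```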

Check your load strategy

Talking about showstoppers, do not forget to implement the basics of data management in your ingestion strategy. Check for instance where a full load strategy or a delta load strategy, using a change data capture (CDC) solution, would be appropriate or even required.
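By way of illustration, here is a minimal sketch of a delta load driven by a high-water mark, one simple alternative to reloading a full table on every run. The table and column names are hypothetical, the connections are assumed to follow the Python DB-API, and a real CDC solution would read the source system’s change log rather than query a timestamp column.

```python
# Minimal delta-load sketch using a high-water mark instead of a full reload.
# 'sales' and 'updated_at' are hypothetical names; DB-API style connections assumed.
from datetime import datetime, timezone

def load_delta(source_conn, target_conn, last_loaded_at: datetime) -> datetime:
    """Copy only rows changed since the previous run and return the new mark."""
    cursor = source_conn.execute(
        "SELECT id, customer, amount, updated_at FROM sales WHERE updated_at > ?",
        (last_loaded_at,),
    )
    for row in cursor.fetchall():
        # Upsert into the target; the exact syntax varies per target database.
        target_conn.execute(
            "INSERT OR REPLACE INTO sales (id, customer, amount, updated_at) "
            "VALUES (?, ?, ?, ?)",
            row,
        )
    target_conn.commit()
    return datetime.now(timezone.utc)  # persist this as the next high-water mark
```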

Legal hurdles: check the regulatory constraints with your CISO, DPO or CDO

This ought to be a no-brainer, really, but do not forget to check the data security and privacy regulations with your Legal department and your Chief Information Security Officer (CISO), Data Protection Officer (DPO) and/or Chief Data Officer (CDO). Make sure you have these colleagues on board from the outset, so you can avoid unpleasant surprises or even showstoppers along the way. Getting upfront security clearance will allow you to proceed without interruption with the execution of the cloud roadmap for your data platform.

This is also where you decide which data to move to the cloud and which to keep on-premises. As a rule, non-sensitive data tend to be moved to the cloud more easily, whereas personal or financial data, for example, tend to remain on-premises.

You can read more about the legal and regulatory hurdles in our blog about the business lessons learned when moving to the cloud.

Financial hurdles: keeping costs under control

(or is there truly no limit on your credit card? ;-)

When taking on a strategic project of this kind, you most certainly want to avoid financial surprises just as much as technical ones. That’s why you’d better engage with your CFO and/or your finance department.

Check your business case

First and foremost, consider what you could do more or better in the cloud. Migrating your data platform to the cloud could present itself, for instance, as an opportunity to implement advanced analytics, data science or self-service on a broader scale. You can read more about the importance of finding the right business drivers for your cloud migration project here.

Discuss your data governance strategy

Don’t hesitate to use your cloud initiative to implement your data governance strategy properly. Also, do not forget or avoid discussing and clearing up some important governance issues with your data user community. Here are just two examples:

  • Align with your data scientists on the freedom they can obtain to ingest and process data.
  • Check with your reporting user community how far they want to go in self-servicing for the data ingestion/connection part of your platform.

Calculate your TCO

As with any architecture exercise, do not forget to do a TCO calculation at the start of your assessment. You can make convenient use of the built-in cloud cost calculators and advisors of the different cloud platforms.
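Purely as an illustration of the arithmetic behind such a calculation, the sketch below compares the same virtual machine running around the clock with one that only runs during business hours. The hourly rate is a made-up placeholder; use the providers’ own calculators for real figures.

```python
# Illustrative comparison: the same VM running 24/7 versus business hours only.
# The hourly rate is a placeholder, not an actual cloud price.
HOURLY_RATE = 0.95                     # EUR per hour for a hypothetical VM size
HOURS_ALWAYS_ON = 24 * 365             # 8,760 hours per year
HOURS_BUSINESS = 12 * 5 * 52           # 12 h/day, 5 days/week -> 3,120 hours

always_on = HOURLY_RATE * HOURS_ALWAYS_ON
scheduled = HOURLY_RATE * HOURS_BUSINESS
print(f"Always on: {always_on:,.0f} EUR/year, scheduled: {scheduled:,.0f} EUR/year")
# Always on: 8,322 EUR/year, scheduled: 2,964 EUR/year
```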

Monitor continuously

Even more importantly: do not forget to do some financial monitoring during the execution of your cloud migration project. Experience has shown that this needs to be a continuous exercise in order to avoid unpleasant events, such as the sudden explosion of your monthly cloud invoice.

You also need to continuously track your background processes as well as the usage of your reporting/analytics environment, so that appropriate actions can be taken to counter highly intensive, and therefore costly, use of your data platform.
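A simple guardrail can already go a long way here. The sketch below flags a day whose spend is well above the recent average; the spend figures would come from your cloud provider’s billing or cost export, and the numbers shown are placeholders.

```python
# Simple daily cost guardrail: flag a spend spike against the recent average.
# Daily spend figures would come from your provider's billing/cost export.
def flag_cost_spike(daily_spend: list[float], factor: float = 1.5) -> bool:
    """Return True when the latest day exceeds `factor` times the prior average."""
    *history, latest = daily_spend
    baseline = sum(history) / len(history)
    return latest > factor * baseline

print(flag_cost_spike([120, 115, 130, 125, 240]))  # True -> investigate the spike
```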

‘Human’ hurdles: technology without people won’t work

Communication and training – two cornerstones of the change management process – are also key to get your colleagues and, more importantly, your end users and managers on board. Remember above all to keep it real, and find the balance between your long- and short-term goals.

Stay realistic: go for quick wins but also work on longer-term benefits

One of the worst things that can go wrong in any cloud project is that you have managed to oversell the cloud to your colleagues in and outside IT. Nothing undermines and endangers the successful execution of a cloud roadmap as much as the mere idea or suggestion that the cloud will solve all issues and problems.

The cloud is not some kind of technological Valhalla, so remain pragmatic when using it. Instead, try to find some quick wins to help you get buy-in from your user community. Here’s just a handful of quick examples:

  • Set up a proof of concept (POC) to showcase how easy it is to scale your IT resources up and down in the cloud. This demonstrates flexibility not only in performance and speed-to-delivery, but also from a cost perspective.
  • Demonstrate the strong disaster recovery functionality the cloud has to offer.
  • Optimise cost and flexibility in your DEV & QA environments.
  • Find the use cases where a cloud solution explicitly shows its strengths in flexibility and scalability. Take a case, for instance, where you have a combination of structured and unstructured data, the majority of which is already in the cloud, and where some users are already experimenting with cloud solutions.

Develop the specific cloud competencies you require

As with any change in existing technology or adoption of new technologies, you need to invest in a training plan or programme to develop the specific cloud competencies that can assure the successful execution of your cloud roadmap. In other words: be prepared to make the necessary investments to define and execute a detailed training plan for your internal staff, or find an experienced and competent partner to help you get started. Outsourcing the management of the new environment is of course also a valid option.

In any case, do not underestimate the mental and cultural changes that are required from your organisation in general and some of the users of your data platform in particular, whether they are working in an IT or business environment.

And last but not least, should you be confronted with an acute lack of in-house expertise, don’t hesitate to use some plug-and-play PaaS components to make your life, or rather your work, somewhat easier. Finally, always keep in mind that Rome wasn’t built in a day either!


Moving your data platform to the cloud? Check your data gravity!

The sheer volume of your datasets can be a serious hurdle when you consider moving them to the cloud. In fact, as datasets grow larger, they simply become harder to move. That’s when dealing with data gravity becomes your next challenge.

Data gravity: what is it?

Data gravity is a metaphor introduced into the IT lexicon by a software engineer named Dave McCrory in a 2010 blog post. The general idea is that data and applications are attracted to each other, similar to the attraction between objects that is explained by the law of gravity. In the current enterprise data analytics context, as datasets grow larger and larger, they become harder and harder to move. So the data stays put, and it is everything that is attracted to the data, such as applications and processing power, that moves to where the data resides.

Data gravity: a determining driver

One of the basic questions to ask yourself, before you can even think of moving your current on-premises data platform to the cloud, is where your data gravity currently lies, and where it could or should lie in the future. In other words: where does the majority of the data that you are or will be ingesting into your data platform reside? And where does the majority of the ‘consumption’ of your data take place, now as well as in the future?

To clarify this matter, here are some further questions for analysis:

  • Are the data sources that need to be ingested into your data platform still mostly on-premises, or already mostly in the cloud?
  • Is a big transformation or migration track underway that will move a majority of these data sources to the cloud?
  • Does the majority of your ‘data consumption’ still involve heavy local data processing? Or are most data consumers already used to optimising all the available resources, including centralised and cloud computing, instead of using their local PCs, drives, etc. to the maximum?
  • Are you expecting to service more and more ‘external’ data consumers, such as clients, partners and suppliers?

In short, if your data gravity is already shifting from your own on-premises data centre to the cloud, then you should probably consider moving your entire data platform to that new centre of gravity. And this advice is even more pertinent if the use of that platform is increasingly being extended to external data consumers.

Cloudification hurdles: what’s actually stopping you?

Before effectively ‘cloudifying’ your data warehouse, data lake or reporting and analytics platform, you may need to tackle a number of technical, legal, financial and human hurdles upfront.

  • Technical hurdles: Is your IT environment ready?
  • Legal hurdles: What is allowed by law?
  • Financial hurdles: Is there a solid business case?
  • ‘Human’ hurdles: What is the human resistance to the cloudification idea within your organisation?

You can read more about these cloudification hurdles in this blog post.
