This is the transcript of a German-language podcast.

It’s so easy: open Excel and type in the newly agreed product and article overview. That is far quicker than agreeing a solution with IT and then waiting for it. Every day, all over the country, critical business data ends up in mini-applications that specialist departments have built themselves – and the shadow IT grows.

However helpful these mini-applications may sometimes be for a department’s individual needs, they pose a business risk for companies in terms of data integrity, data security and availability. If the specialist departments are given greater sovereignty within the company’s own IT, both needs can be reconciled.

A conversation with project manager Thomas Kneist about the background of shadow IT and possible ways out.

Daniel Rasch: Today we want to talk a little about shadow IT and how you can bring shadow IT into the sun. Maybe right at the start: shadow IT – is that a curse or a blessing for you? And why?

Thomas Kneist: It is both. It depends on the perspective you come from. From the perspective of a specialist department that wants to solve a problem very efficiently and very quickly, shadow IT makes perfect sense. From the perspective of the IT department, the company as a whole, or indeed from the perspective of responsibility, it is a curse. Quite clearly. Because shadow IT simply bypasses many standards, and that is dangerous.

We will certainly come back to that in a moment. But first a basic question: what is shadow IT anyway, and what variants do you encounter in your day-to-day project work?

Shadow IT is everything that bypasses a company’s IT standards. It starts with end devices: I have a mobile device and keep my work e-mails or data on it. It ranges from the use of social media to applications in which data is managed. And from our point of view, that last part is the critical one. We at mgm are a software house; we build business applications. So our concern is business application data, and whenever that data is managed in shadow IT, it becomes difficult.

Biggest part: Excel – reason: pragmatism

In other words, my beloved Excel list in which I calculate my margins, for example, and a cloud solution that a specialist department orders via quick, informal channels are both shadow IT solutions?

That’s right. The absolute majority are actually Excel spreadsheets or, in the broadest sense, Office applications – Office files that end up managing business-critical data. These are price lists, calculations, customer data, concepts, whatever.

How are these solutions created and why do they exist at all?

That is a question of pragmatism. A specialist department is given the task of managing something, and the first thing that comes to mind is Excel. That is very often the case. And Excel gets me quite far: I can manage my data, I can do some calculations, I can build in macros, I can visualise it. It helps very quickly. And then, at some point, the problems start.

What problems, for example?

There are many. The first, of course, is a versioning problem. I can’t work on an Excel spreadsheet collaboratively. I have one open copy that I work on, that I save and, if need be, pass on. Then it gets edited. If two copies are worked on in parallel, I again have the problem that I can’t synchronise them. You can see the result in the file names: calculation 2020, underscore, date, underscore, time, editor, last status, final-something. The longer these file names become, the greater the risk obviously is.

A risk of data loss, or how would you describe it? What else can happen?

In all directions. Data loss is one issue: when I work with business-critical data, I simply want to be sure that I can still access it after a failure – a hardware failure, whatever. Backup and recovery concepts are one thing. Data protection and data security are perhaps even more critical. On the one hand there is personal data and the GDPR rules, which all have to be complied with. But protection for business-critical, confidential data simply does not exist either. At the end of the day, data is passed around uncontrolled, because in case of doubt the spreadsheets and files are sent around by e-mail, and I can no longer control who has access to them.

Probably everyone knows the examples with the long file names and the versions. But why don’t the business departments then go to the IT department and say: we have such-and-such requirements, please solve this for us?

Because that involves more effort. An Office programme is usually already available – I have it, I can use it. And as I said, in many cases it is genuinely useful and leads to reasonable results. There are limits, thresholds, and if I exceed them it becomes critical. On the other hand, an IT department is often very busy. It has its own processes and its own schedules, even for requests like these. And then I don’t get my solution as quickly or as easily as I would like.

Does this affect all kinds of companies? So large corporations as well as medium-sized businesses? And maybe even administrations, public sector?

My experience says yes.

And in any particular forms?

Well, my experience is of course not exhaustive, but in the end shadow IT is a problem that runs through all areas and affects everyone in some form. I have seen price lists being sent around as Excel files – really business-critical price lists that form the foundation of a business model. It’s hard to understand why anyone would do that.

 

Focused island solutions – but better: low-code platforms

Let us return briefly to the beginning. Wikipedia of course also has an entry on shadow IT. It compiles a few sources and essays, and it even mentions three opportunities. For example, it says that the positive thing is that business departments engage with IT at all and see IT as a solution. It also mentions that self-developed solutions are often much closer to the departments’ real needs. Isn’t that enough to say: well, in the end such applications will be used anyway, so this is the best way?

Well, that has both advantages and disadvantages. Of course, a department has a very precise idea of the business logic it is managing and is very well able to map it somehow – always provided it has the means to map it exactly as it is. If those means exist, the result is very ergonomic, very close to the domain, and very focused on the use case at hand. But it is also very limited to exactly what I am doing.

And it comes with all the risks. For example, I certainly don’t have a sophisticated authorisation concept behind it, none of the user and access management that a company has as standard in Active Directory anyway. And there is no strategy of any kind for securing the data: how to ensure audit-proof archiving, perhaps deletion, and such things. But for mapping the business logic itself, this approach is of course very relevant.

How do we get rid of such shadow IT, for example? Let’s assume an organisation has come to the realisation that it really is becoming more prevalent. What needs to be done?

We have already identified the two parties involved. One is the IT department, which is responsible for the IT standards: the authorisation systems, the user management systems, data management. They know which IT guidelines matter in the overall context, including ergonomics, accessibility and similar topics. And we have the specialist department, which of course knows its business logic and the use case very well. If you bring the two together, if you give each side its core competence, then you have a chance to get out of it.

Ultimately, this means that a specialist department must be given the opportunity to contribute its expertise to a project in some form. And this must then be implemented technically in such a way that the IT department can enforce its standards on the one hand, but is not overloaded on the other.

That sounds like a major project again. Exactly what the specialist departments want to avoid.

Of course, this can be done in different ways. I personally speak of a low-code platform – ideally one specialised in business applications. With it, a specialist department can model its domain knowledge: the data, the data models, structures and relationships, but also the value-added processes behind them, together with the associated interfaces, interaction mechanisms and so on. These models, which represent the domain knowledge, are then applied and interpreted technically. The platform provides exactly this technology: it brings the corresponding artefacts with it, and its interpreters bring the models to life.
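To make this a little more concrete, here is a minimal editorial sketch of what ‘interpreting a model’ can mean in practice – it is not mgm’s platform, and names such as PRICE_LIST_MODEL and validate_record are invented for illustration. The specialist department maintains a declarative description of its data; a generic engine reads that description and enforces it:

    # A minimal sketch: a declarative model maintained by the business side,
    # interpreted by a generic engine. All names are illustrative assumptions.
    from datetime import date

    # Declarative model: what a business analyst, not a developer, would maintain.
    PRICE_LIST_MODEL = {
        "entity": "PriceListItem",
        "fields": {
            "article_number": {"type": str,   "required": True},
            "description":    {"type": str,   "required": False},
            "net_price":      {"type": float, "required": True, "min": 0.0},
            "valid_from":     {"type": date,  "required": True},
        },
    }

    def validate_record(model: dict, record: dict) -> list[str]:
        """Generic interpreter: check a record against whatever model it is given."""
        errors = []
        for name, spec in model["fields"].items():
            value = record.get(name)
            if value is None:
                if spec["required"]:
                    errors.append(f"{name}: missing required field")
            elif not isinstance(value, spec["type"]):
                errors.append(f"{name}: expected {spec['type'].__name__}")
            elif "min" in spec and value < spec["min"]:
                errors.append(f"{name}: below minimum {spec['min']}")
        return errors

    # The same engine works unchanged for any entity the department models.
    item = {"article_number": "A-1001", "net_price": 19.90, "valid_from": date(2021, 1, 1)}
    print(validate_record(PRICE_LIST_MODEL, item))  # -> []

Adding a field or tightening a rule means changing the model, not the engine – which is the separation of domain knowledge and technology described here.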

Does this only concern the introduction of such a low-code solution, so to speak, or also the continuous operation over n years?

It is particularly interesting for continuous operation. In software development there is the concept of total cost of ownership. The question is: how much does a piece of software actually cost over its entire life cycle? There are three major cost factors. The first is the initial development. That is usually very transparent; it can be calculated, and perhaps the estimate is even kept to.

But two other cost factors are much greater than the initial implementation. The first of them is further development – the functional, domain-driven development. A sensible business application with a reasonably complex domain will evolve over time. It will grow, it will change. New regulations, new processes and new business partners are just a few of the many possible drivers of change.

And the third major cost block is the technological maintenance and support – in other words, everything I have to do in the background to keep the whole thing technologically up to date and secure. It is precisely these two blocks that are often underestimated, and they are addressed by this approach: transferring the domain expertise into models and thus keeping it out of the technology – separating it from the technology – so that the technology itself remains independent.

With Low Code quickly to the executable application

In my example of a low-code platform with modelled domain logic, the idea is that ideally a colleague from the business side changes the models and this automatically carries through to the application.

But does that mean – if we stick to the example of price lists – that a salesperson who has been using Excel up to now will have to model, develop, code or do something else entirely in the long term?

No. He or she will have to model the business logic once, either alone or with the support of a service provider, and that model is then turned into an application. That is a very simplified description; of course there is a bit more magic behind it. But with the appropriate low-code platform and the corresponding services around it, you have your application.

The nice thing is that the application is then not just a prototype but actually executable. This means that you very quickly have a visible result that you can really work with, or at least test, and which can then be developed further iteratively.

Let me give you an example that everyone has experienced at some point in a classic software development process: the initial implementation costs a million and takes a year. Now the system has gone live, and the specialist department has three small wishes. Another field is to be added, there is some new dependency, some new view. And that now costs half a million and takes half a year. That is a ratio nobody ever understands, but it is what actually happens when I do the whole thing the classical way.

In my example of a low-code platform with modelled domain logic, ideally a colleague from the business side changes the models and this automatically carries through to the application. That, too, is putting it very simply – perhaps I will still need synchronisation or migration mechanisms after all. But the difference is clear. It changes the entire ratio, the entire cost structure, the entire value creation significantly. In other words, with a view to total cost of ownership, it is above all the long running time of an application that becomes cheaper in the approach I describe.

Does this mean that the domain experts are then also responsible for further maintenance? If, for example, a requirement comes up after a year – we now need a box in the top right-hand corner – is it no longer handed to a developer, but done within the specialist department with the models?

Correct. So let’s come back to the different people involved. On one side we have the specialist department – we call them business analysts – and on the other the technology side, the software developers. Up to now, in a classic project, a specialist department was mainly occupied with writing concepts: documenting its business logic somehow and passing it on to the development team in some written, perhaps partly visualised form, which the developers in turn had to read, interpret and understand in order to implement it. Now the specialist department can actually take over parts, or even large parts, of the value creation itself by doing the modelling itself.

This happens with the help of a so-called domain-specific language, which can be learned. This language describes the business logic in a very formal, very standardised way, and these models are then, as I said, simply applied. IT, in turn, concentrates on its core competence and on the demanding tasks: individually developed solutions for particularly complex use cases, and the entire integration, including the whole standard part – the connection of user management systems, of data management systems, of other surrounding systems and interfaces, everything that plays a role.
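As an editorial illustration of what such a formal, learnable description can look like, here is a toy workflow notation together with a generic interpreter. The mini-syntax and the function names are invented for this sketch and do not stand for any particular product’s DSL:

    # A toy domain-specific notation: the business analyst writes the value-added
    # process as formal text; a generic engine turns it into executable rules.
    PROCESS_DSL = """
    state draft     -> submitted : submit  by editor
    state submitted -> approved  : approve by price_manager
    state submitted -> draft     : reject  by price_manager
    state approved  -> published : publish by it_admin
    """

    def parse_transitions(dsl: str) -> dict:
        """Parse 'state A -> B : action by role' lines into a lookup table."""
        transitions = {}
        for line in dsl.strip().splitlines():
            head, role = line.split(" by ")
            states, action = head.split(" : ")
            src, dst = states.replace("state", "").split("->")
            transitions[(src.strip(), action.strip())] = (dst.strip(), role.strip())
        return transitions

    def apply_action(transitions: dict, state: str, action: str, role: str) -> str:
        """Generic engine: advance the workflow only if action and role are allowed."""
        dst, allowed_role = transitions.get((state, action), (None, None))
        if dst is None or role != allowed_role:
            raise PermissionError(f"'{action}' by '{role}' not allowed in state '{state}'")
        return dst

    rules = parse_transitions(PROCESS_DSL)
    print(apply_action(rules, "draft", "submit", "editor"))              # -> submitted
    print(apply_action(rules, "submitted", "approve", "price_manager"))  # -> approved

The business analysts own the rules in the model; IT owns the engine and its integration with user management and the surrounding systems.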

New value-added process without limits

And what types of business applications can be mapped with it? You just mentioned a few examples at the beginning, from Excel to the more complicated things. Are there no limits then?

In principle there are no limits. In other words, any business application that manages data in any form, whether it’s customers, prices, users, whatever, and guides this data through a value-added process, can be mapped with it.

Since this is a newer approach than classical software development, it sounds like a change process. Who should drive it? Does it come from the business department or from the IT department? Otherwise we are back to the problem of the IT department coming in from above and wanting to prescribe something.

That varies. In my experience, the initiative can come from a business department just as well as from an IT department. It depends on who is currently dealing with the problem, or who is quicker to recognise the opportunities and wants to implement them.

Are there success factors – is one of the two routes better, say for acceptance?

No.

Good. Let’s slowly come to a conclusion. Let’s assume the application has now been implemented in this model-based way – what does everyday life look like? What has changed, for both parties involved, the business department and the IT department?

First of all, the risks are gone. I now have a web application that answers all of these questions, that maps one specialist area as completely as possible – and that also addresses things like ergonomics and user experience, which is an important point too. So I have mapped my business logic. I have ensured data protection and data security: there is an authorisation concept and control over who does what with the data and when, and there is an audit-proof archiving concept behind it. Another key point is digital sovereignty.
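What ‘control over who does what with the data and when’ can look like under the hood is sketched below – again an editorial illustration with assumed names, not a description of a specific archiving product. Each change is appended to a hash-chained log, so later entries cannot be altered unnoticed:

    # Minimal sketch of an audit trail: every change records who, what and when,
    # and each entry is chained to its predecessor via a hash.
    import hashlib, json
    from datetime import datetime, timezone

    audit_log: list[dict] = []   # a real system would use tamper-proof storage

    def record_change(user: str, entity: str, change: dict) -> None:
        """Append an audit entry linked to the previous one."""
        entry = {
            "user": user,
            "entity": entity,
            "change": change,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "prev_hash": audit_log[-1]["hash"] if audit_log else "",
        }
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        audit_log.append(entry)

    record_change("m.mueller", "PriceListItem A-1001", {"net_price": [19.90, 21.50]})
    print(audit_log[-1]["user"], audit_log[-1]["timestamp"])

In an Excel file sent around by e-mail, none of this information exists.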

And what do the users gain from this? Apart from the fact that they no longer have to append ‘version 83’ to file names?

Exactly that – they no longer have to do that. They can be sure that they can get into the application, that the data is consistent and up to date, that they always work with the latest version and that, ideally, they are part of a larger process. A further disadvantage of shadow IT is that these applications exist in isolation: they can neither use data from upstream systems nor pass data on to other systems. Now I am simply integrated into a larger value-added process and can make my contribution.

Last question: are companies where there is no shadow IT at all – where everything is out in the sun – even conceivable and possible?

Certainly in theory, but not in practice.

All right. Thomas, thank you very much for the interview. I’m sure we’ll be talking again about a similar topic and about Excel and Co. Thank you.

You’re very welcome.