White paper

Self-healing business – fact or fantasy?

Dan Burrows asks whether the latest advances in analytics and business insight, and the sharing of information and data across technology platforms, mean we’re teetering on the brink of being able to see into the business future.

One goal, two hats

For a number of years now, business intelligence and analytics, and latterly the concept of enterprise-wide performance management, have been growing in popularity with businesses. From the smallest retailer to the largest utility company, the need to monitor performance using easily understood Key Performance Indicators (KPIs) has been recognised and is rapidly becoming ubiquitous.

As a business consultant, I spend a lot of time talking to organisations about the ways in which they can build a better business; conversations which invariably make their way round to measuring business activity and monitoring performance, often using a balanced scorecard as the vehicle for identifying the right measures. Unfortunately, whilst it is absolutely the case that benchmarking against realistic targets is a powerful ally in an organisational war on inefficiency and poor performance, KPIs are not in themselves the silver bullet so many perceive them to be.

The secret to better business is understanding every aspect of what is happening within it, identifying where problems lie or improvements can be made, and closely monitoring the effect of day-to-day actions. Only through insight can the real value of information be unlocked; all too often I come across a list of KPIs as long as the proverbial arm, without any questions being asked as to why they’re being measured.

With another hat on, as a technology consultant specialising in the exploitation of business information to develop insights into an organisation’s undertakings, I spend a lot of time talking to businesses about the technology, systems and data which can support them in achieving their goals. These are conversations which, although similar to those described above, revolve around the more tangible aspects of delivering real value from the information sources a business has at its disposal. These discussions focus on turning a conceptual scorecard or list of measures derived from the business strategy into real-world results, real-life business awareness and value-adding performance management insights; here we spend time working out how to actually deliver the knowledge needed to realise the better-business secret I mentioned earlier.

Beyond the thinking lies the doing

These two disciplines co-exist happily alongside each other, with the business consultant helping to define the strategy, measures and targets; and the technology consultant providing assistance in identifying data sources, defining outputs, developing data management strategies and selecting the technology platform to deliver the desired results. There’s a natural and logical progression to these two pieces of work, and I consider myself fortunate to be able to contribute to both as it gives me a wider perspective on both the business and technology aspects of business analytics and insight. Once I’ve done my bit(s!) I confidently hand over to a talented team of developers and technology specialists who make the concept a reality for the organisation with whom I’m working.

So then, it’s straightforward enough to sort out management and control of your business using KPIs; job done, and back home in time for tea and medals? Unfortunately not; it is almost never as easy as all that, and even if it were, actually reaching the stage where one can easily monitor performance using one’s shiny new analytics platform is only the beginning. Acting on the information revealed, and using it to your business’ advantage, is the next challenge to be faced.

If you want something doing, don’t do it yourself

On a tour around a massive fuel storage and distribution terminal, I got to thinking about how this challenge can be faced and how its magnitude can be minimised. I was struck by the sheer number of untapped data sources: systems and equipment producing useful information which was not being exploited; information which, if used correctly, could not only assist in monitoring and managing performance, but could autonomously improve efficiency and quality of service.

The thought occurred to me that, as a technology society, we are rapidly moving towards the possibility of a genuinely self-healing business: one able to identify, resolve and report on problems as they occur; empowered by technology to enable autonomous continuous improvement; and minimising human involvement in some of the riskier or more dangerous operations and decisions made every day in hazardous environments.

Modern industrial environments already rely heavily on automation. Take as an example any power station, electricity distribution network, sewerage system, gas storage and supply system, factory or assembly line. Each of these environments, though performing very different tasks, relies on machines giving information to humans so that decisions can be made:

  • If power demand is increasing, should an additional generator be brought online?
  • If a power line trips out of service, can a breaker reset bring it back to life?
  • If incoming flow from surface water is too great, should flow be diverted?
  • If pressure drops, should supply be cut off due to the possibility of a leak?
  • If a bearing runs hot, should the production line be slowed to allow it to cool?

The mechanisms by which the information is produced, and subsequently acted upon, may well differ from one setting to another, but the basic premise is the same: numbers come out of a ‘black box’ such as a PLC (programmable logic controller) and someone reacts to those numbers, but only when they fall outside a normal range or demonstrate that something unexpected is going on.
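By way of illustration, the sketch below (in Python, with entirely hypothetical tag names, thresholds and actions, none of them drawn from any real control system) shows just how little logic the ‘react only when something is out of range’ premise actually requires:

    # A minimal sketch of 'react only when a reading falls outside its normal range'.
    # Tag names, limits and suggested actions are hypothetical, for illustration only.

    NORMAL_RANGES = {
        "grid_demand_mw":    (300.0, 950.0),   # above the upper bound: bring another generator online
        "line_pressure_bar": (4.5, 6.0),       # below the lower bound: possible leak, consider cutting supply
        "bearing_temp_c":    (20.0, 85.0),     # above the upper bound: slow the line and let it cool
    }

    def check_reading(tag, value):
        """Return a suggested action only when the value is outside its normal range."""
        low, high = NORMAL_RANGES[tag]
        if value < low:
            return f"{tag} low ({value}): investigate, possibly cut supply"
        if value > high:
            return f"{tag} high ({value}): bring capacity online or slow the line"
        return None  # within range: nobody, human or machine, needs to act

    for tag, value in [("grid_demand_mw", 990.0), ("bearing_temp_c", 60.0)]:
        action = check_reading(tag, value)
        if action:
            print(action)

The point is not the code itself but how mechanical the decision becomes once the normal range has been agreed; everything difficult lives in choosing those limits and trusting the resulting action.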

Surely, then, it would be simple to automate some of these decisions? Obviously there are many automated fail-safe systems already in place: witness the fact that a wind turbine will feather its blades when wind speed rises too high to guarantee safety, or that a nuclear reactor control system will automatically drop its control rods (a SCRAM) if conditions approach a potential meltdown. However, it should be possible to go much further than this.

Current thinking, particularly from German manufacturing giants such as Siemens, puts a name on this kind of autonomy: ‘Industrie 4.0’, the fourth industrial revolution. Smart factories, which will allow mass-manufacturing capabilities to be applied to individualised, bespoke and heavily customised products, are already being built to showcase the capabilities of this new world, satisfying consumers’ inexorably rising expectation that products will be delivered to their specification and on their terms.

Potential, already realised?

Modern PLCs used in manufacturing and process control and monitoring are designed to operate on the same network as the servers, desktop PCs and printers that the human workers use. There are, or of course should be, restrictions on what interactions they can have (one does not want a keen staff member ‘accidentally’ stumbling across the configuration interface for the factory control system), but the foundation for communication is in place, and a fairly straightforward network separation exercise will prevent ‘cross-contamination’ between the data shared by the PLCs and that of the desktop PCs.

A cutting-edge manufacturing environment will have a wide range of sensors, control systems and interfaces which, in many cases, will employ ‘M2M’ (machine-to-machine) communication to allow interaction between them. For example, a robotised spray booth in an automotive factory will constantly monitor booth temperature, humidity, paint feed flow rate, position, line speed and many other variables in order to automate its own management and reduce the human interaction involved in keeping it running efficiently.

In the multi-billion-pound automotive industry, the spray booth will also communicate with the body preparation line, welding robots, galvanising dippers and similar systems, so that a failure at one point can initiate an action or series of actions at others. These events, actions and reactions could allow the whole production or assembly operation largely to manage itself, taking account of what is happening in order to maximise its own effectiveness at all times, all without a human needing to be involved; although moving to a fully automated line will take time and commitment before it becomes commonplace.
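To make the idea concrete, here is a deliberately simplified sketch of that kind of machine-to-machine coordination. In a real plant the messages would travel over an industrial protocol such as OPC UA or MQTT; this version uses a toy in-process message bus, and the station names, topics and rules are invented purely for illustration:

    # Toy machine-to-machine coordination: a spray booth publishes a fault and
    # upstream stations react. Station names and rules are purely illustrative.

    from collections import defaultdict

    class MessageBus:
        """A minimal in-process stand-in for an industrial M2M protocol."""
        def __init__(self):
            self.subscribers = defaultdict(list)

        def subscribe(self, topic, handler):
            self.subscribers[topic].append(handler)

        def publish(self, topic, payload):
            for handler in self.subscribers[topic]:
                handler(payload)

    bus = MessageBus()

    # The body preparation line slows down when the paint shop reports a fault...
    bus.subscribe("paint_booth/fault", lambda msg: print("body prep line: slowing,", msg))
    # ...and the welding cell pauses new work rather than feeding a stalled booth.
    bus.subscribe("paint_booth/fault", lambda msg: print("welding cell: holding new bodies,", msg))

    # The spray booth monitors its own sensors and broadcasts when something drifts out of range.
    paint_flow_rate = 0.4   # litres/min (hypothetical reading)
    if paint_flow_rate < 0.8:
        bus.publish("paint_booth/fault", {"reason": "paint feed flow too low", "value": paint_flow_rate})

The useful property is that the spray booth does not need to know which other stations care about its fault; each station subscribes to the events it needs and decides its own response.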

The question is: with all of this potential, and with examples such as those set by the global motor industry to follow, why isn’t every business doing away with its humans? The simplest answer is cost; implementing a fully integrated environment in which everything is monitored and automatically controlled is hugely expensive, especially where a business has evolved from humbler beginnings. Few manufacturers are fortunate enough to be able to build a factory from scratch with all-new machinery and the latest electronic control systems, and upgrading older machinery to modern control systems can be not only expensive but technically challenging. That is not the biggest barrier to progress, however; ultimately, the ability of us humans to accept and adapt to change represents the greatest challenge to be overcome.

As a result of these factors, it is extremely common to find a hybrid mixture of modern computerised control systems co-habiting with ‘traditional’ monitoring and control technology: hard-wired systems that rely on a control panel covered in lights to indicate what is happening at each sensor location, and buttons to make something happen when a light of the wrong colour comes on; buttons which, in most cases, are still ultimately pressed by a human finger.

Fortunately, whilst these traditional systems may start to sound archaic when compared with the future factories of Industrie 4.0, many machine manufacturers can offer ‘halfway-house’ upgrades which, whilst not necessarily making them ‘clever’, can make the data they generate accessible. Where a light comes on and goes off, a hardware interface can generate a binary 1 or 0 to indicate its state; where a button connects two wires to turn something on and off, the same interface can convert a mouse click into a physical switching action. Human interaction can thus be phased out gradually, addressing the problems of change at an incremental level.
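The software side of such a bridge can be very modest. The sketch below simulates that kind of interface in Python; the lamp and button names, and the idea of exposing them as single readable and writable bits, are hypothetical stand-ins for whatever the real retrofit hardware (Modbus coils, digital I/O modules and the like) would actually expose:

    # Sketch of a 'halfway-house' upgrade: the panel lamp becomes a readable bit,
    # and the physical button becomes a writable bit. Names and addresses are made up.

    class LegacyPanelInterface:
        """Pretend hardware bridge exposing lamp states and button contacts as bits."""
        def __init__(self):
            self._inputs = {"pump_fault_lamp": 1}     # 1 = lamp lit
            self._outputs = {"pump_reset_button": 0}

        def read_bit(self, name):
            return self._inputs[name]

        def write_bit(self, name, value):
            self._outputs[name] = value
            print(f"driving output {name} -> {value}")

    panel = LegacyPanelInterface()

    # What was once a person seeing a red lamp and pressing a button...
    if panel.read_bit("pump_fault_lamp") == 1:
        panel.write_bit("pump_reset_button", 1)   # ...becomes a logged, repeatable software action.

The habit of a person watching for a red lamp and pressing a button becomes a logged, repeatable software action, without the underlying machinery needing to change at all.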

Gradually, more and more control systems are being integrated with computer systems; users in the control rooms of oil refineries, on drilling rigs and in power stations increasingly do their work from a console of computer screens rather than a room full of buttons, levers and warning lights. The same things may be happening behind the scenes, but the means of making them happen is less tactile, less physical, and, although often not yet exploited, provides a platform for greater intelligence and automation than we have ever had before. Moving into the future, surely it will not be long before responsibility for making operational decisions is devolved to this new generation of machines, with their infinitely greater ‘understanding’ of how they are performing?

From shop-floor to top-floor

From the perspective of a technical analyst, these new data sources introduce a whole new set of possibilities. Using simple workflow technology, automating actions based on the data gathered becomes straightforward. In the examples above, monitoring data against nominal values allows action to be taken without human decision-makers being involved: flicking a switch to engage another generator, resetting a substation breaker, opening or closing a valve, all made routine by the application of technology. Shouldn’t it also be possible, then, to use the lessons learned in industry to improve life in the boardroom? The principle of technology-assisted decision-making based on established decision pathways is nothing new.
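One way to picture such a decision pathway is as a small, ordered set of rules, tried in turn until one applies, with a human involved only when automation runs out of options. The thresholds, actions and escalation policy below are hypothetical, sketched only to show the shape of the idea:

    # A decision pathway expressed as data: each rule pairs a condition with an action.
    # Limits, actions and the escalation policy are hypothetical, for illustration only.

    def engage_standby_generator(ctx): print("engaging standby generator")
    def reset_breaker(ctx):            print(f"resetting breaker {ctx['breaker_id']}")
    def notify_duty_manager(ctx):      print("automation exhausted: escalating to duty manager")

    DECISION_PATHWAY = [
        (lambda ctx: ctx["demand_mw"] > ctx["capacity_mw"] * 0.9, engage_standby_generator),
        (lambda ctx: ctx["breaker_tripped"] and ctx["reset_attempts"] < 3, reset_breaker),
        (lambda ctx: True, notify_duty_manager),   # fall through to a human only as a last resort
    ]

    def run_pathway(ctx):
        """Apply the first rule whose condition holds for the current readings."""
        for condition, action in DECISION_PATHWAY:
            if condition(ctx):
                action(ctx)
                break

    run_pathway({"demand_mw": 940, "capacity_mw": 1000, "breaker_tripped": False,
                 "reset_attempts": 0, "breaker_id": "B-17"})

Expressing the pathway as data rather than as logic buried in a system also makes it auditable: the business can see, and agree in advance, exactly which decisions the technology is allowed to take on its behalf.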

How long, then, before business decisions follow operational decisions? Are we on the brink of the truly self-healing business, where boardrooms are provided not with decisions to make, but the results of decisions already made?

In this modern world of predictive analytics, we can work out what will happen next week, next month, or next year with a frightening degree of accuracy, so is it not logical that we’re not far away from allowing our businesses to decide their own operational plans, or even their own business strategy?
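Even the forecasting piece need not be exotic to be useful. As a deliberately naive illustration (the figures are invented, and real predictive analytics would use far richer models and data), fitting a straight-line trend to a few months of history is enough to produce a next-period estimate:

    # A deliberately simple illustration of 'predicting next month': fit a straight-line
    # trend to past monthly sales (made-up figures) and extrapolate one period ahead.

    monthly_sales = [102, 108, 111, 119, 123, 131]   # hypothetical units sold, months 1-6

    n = len(monthly_sales)
    xs = range(1, n + 1)
    mean_x = sum(xs) / n
    mean_y = sum(monthly_sales) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, monthly_sales))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x

    next_month = slope * (n + 1) + intercept
    print(f"trend: {slope:.1f} units/month, forecast for month {n + 1}: {next_month:.0f}")

The leap this article is questioning is not that arithmetic, but handing the resulting forecast straight to an automated decision rather than to a person.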

Ultimately, I believe not; or certainly, not yet. Whilst decisions are already made more easily and more quickly when supported by clear, consistent information (and the facts represented therein), the human element, and the empathy with and understanding of people and their needs, will continue to play a part in decisions. The intangible elements of what we do in business and how we do it, and the way the people involved actually feel, cannot be reduced to numbers on a scorecard. Until that changes, which it probably never will, human beings will always be a part of business.

*Opinions and statements are correct at time of writing, but if anything’s changed before publication, I for one welcome the reign of our new robot masters.

