Dr John Bates, CEO of Eggplant, explores the impact Robotic Process Automation is having across the industrial landscape.
In a world dominated by high-volume production, machine-led production-line processes are commonplace in every large-scale operation. Factories around the world have grown exponentially by handing repetitive tasks on the production floor over to robots.
It’s a cycle that has become so sophisticated it is now reaching into back-office operations through a mechanism commonly referred to as Robotic Process Automation (RPA): hardware and/or software that augments or replaces human workers in repetitive, mundane processes. It typically takes over fairly rudimentary tasks such as data entry, processing the data more quickly, more efficiently, and with fewer errors. This is why RPA is so compelling.
RPA in action
Renault, for example, is deploying RPA to support everything from design to the physical manufacturing process, while the same technology is also being used to automate and test systems like product lifecycle management (PLM). This requires the creation of robotic users, which automate standard calibration tasks and then measure whether the PLM system is performing correctly.
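The idea of a robotic user can be sketched in a few lines. The snippet below is a minimal illustration, not Renault's implementation: `run_calibration` is a hypothetical stand-in for driving a PLM system's calibration task, and the tolerance and timings are invented for the example. A real robotic user would drive the actual PLM interface or API and log results for trend analysis.

```python
import time

def run_calibration(nominal: float) -> float:
    """Hypothetical stand-in for a PLM calibration task.

    Returns the value the system reports after calibration;
    here we simulate a small 0.1% drift.
    """
    return nominal * 1.001

def robotic_user_check(nominal: float, tolerance: float) -> dict:
    """Run a standard calibration task and verify the PLM result."""
    start = time.perf_counter()
    measured = run_calibration(nominal)
    elapsed = time.perf_counter() - start
    drift = abs(measured - nominal) / nominal
    return {
        "measured": measured,
        "within_tolerance": drift <= tolerance,
        "elapsed_s": elapsed,  # response time also signals system health
    }

result = robotic_user_check(nominal=100.0, tolerance=0.01)
print(result["within_tolerance"])  # True: 0.1% drift is within 1% tolerance
```

Recording elapsed time alongside the pass/fail result is what lets this kind of robotic user anticipate slow-downs, not just outright failures.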
This ensures smooth running, delivers cost and time savings, and helps large manufacturers anticipate expensive slow-downs or production issues.
Another area is automated in-vehicle systems such as satnav or even elements of self-driving. The same class of control system appears wherever heavy industry tests an aircraft, a tank or a drone, as in defence. These systems are mission-critical, so automating and testing these processes is a highly complex (and repetitive) business.
But getting RPA right extends way beyond the realm of classified military operations. Underlying software systems in scenarios like a tube train have dedicated time windows for processing certain tasks – a so-called “hard real-time system” – with numerous fail-safes attached, which guarantee they have enough bandwidth to handle any task. For instance, a train won’t fail to stop at a station because it is ‘too busy’ running the air conditioning. That can never happen, because the system has been statistically analysed to make sure it has enough cycles to handle the peak of whatever tasks it must do at any given time.
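The analysis behind that guarantee can be sketched with a classic schedulability check: under rate-monotonic scheduling, a set of periodic tasks is guaranteed to meet all deadlines if total CPU utilisation stays under the Liu–Layland bound. The task list below (brake control, doors, air conditioning, with made-up millisecond budgets) is purely illustrative, not taken from any real train system.

```python
# Worst-case utilisation check for periodic hard real-time tasks,
# using the Liu-Layland bound for rate-monotonic scheduling.
# Each task is a (worst_case_execution_ms, period_ms) pair.

def is_schedulable(tasks):
    n = len(tasks)
    utilisation = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)  # tends to ln 2 ~ 0.693 as n grows
    return utilisation <= bound

tasks = [
    (5, 50),    # brake control: 5 ms of work every 50 ms
    (10, 100),  # door logic
    (20, 200),  # air conditioning
]

print(is_schedulable(tasks))  # True: utilisation 0.30, bound ~ 0.78
```

This is why the braking task can never be starved by the air conditioning: the system is sized so that even the worst-case peak of every task fits inside the bound, with margin to spare.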
RPA becoming life critical
So, when automating with RPA, businesses need to assess how “life-critical” a particular process is and, if necessary, anticipate every circumstance so that it won’t run out of bandwidth at a critical moment. In a factory, an out-of-control bot on a production line could lead to tragedy. Without belittling that in any way, an RPA failure in that setting might still have a limited impact, confined to the people in its vicinity. Now consider a bot running a line of code or an algorithm linked to drug production or military control systems, where a failure could harm society at scale. For any life-critical system, businesses need to apply the lessons of hard real-time systems and scale it for maximum peak bandwidth.
On another level entirely, if you take the ultimate in automated tasks, such as self-driving vehicles, then all the stakeholders involved need to consider the ethics of automating critical decisions. This is taking it to the extreme, of course: no longer just the automation of manual tasks, but the ability to adapt those tasks with some intelligent thinking. That brings in a whole new level of ethics around how a machine decides in these circumstances, and raises a number of questions. Who will regulate RPA for heavy industry? When can the industry use AI, and who will be liable?