Robotics and autonomous systems (RAS) safety: a barrier or an opportunity for growth?
Is ensuring the safety of robotics and autonomous systems (RAS) a barrier to growth, or an opportunity for it? Is technology racing ahead faster than regulation can keep pace?
These were among the questions tackled by four speakers at the Hazards Forum event in June. This was held in collaboration with the new National Committee on RAS Regulation, Standards and Ethics. The committee has a vital role in supporting the development of a modern regulatory framework for RAS, advancing shared understanding of these technologies and promoting best practice.
The speakers were Anneka Wilson, Health and Safety Executive policy advisor, Michael Sinclair-Williams, group HSQE director for Renewable Energy Systems (RES), Cuebong Wong, senior robotic systems technologist in the National Nuclear Laboratory’s Remote Engineering, Design and Robotics team, and Rav Chunilal, head of robotics and AI at Sellafield.
It’s a balancing act. On one hand, RAS offers reduced plant downtime, cost savings, commercial opportunities, improved safety and enhanced efficiency. On the other, it presents challenges in terms of data security, new safety risks, more complex lines of accountability and the potential for workforce distrust. The risks and benefits of RAS therefore need to be addressed in a cross-sector, interdisciplinary and outward-looking way.
Anneka Wilson stressed the importance of the human factor and of ensuring that technical benefits do not increase risk. Simply put, a robot can work at height more safely than a human, but it can also fall from height and hit someone on the ground. Knowledge gaps were also recognised: a highly skilled RAS programmer, for example, may be unaware of the relevant legislation and compliance requirements.
Nuclear decommissioning at Sellafield represents a significant potential opportunity for RAS. It’s an environment where safety issues are paramount, and progress in the past has been painfully slow. Rav Chunilal explained that RAS could speed up the nuclear decommissioning process, and that open conversations with regulators in a ‘safe’ space were essential to making that happen.
‘What’s under the hood?’ was a common theme that Cuebong Wong addressed. The fast-growing RAS industry is more likely to protect than share intellectual property. How can regulators and workforces trust an AI solution when they do not know how it works?
Michael Sinclair-Williams explained how using predictive analytics creates value. For example, you can produce three times more energy from solar panels by using AI to optimise their performance.
Despite the raft of security, technical, economic and regulatory challenges, a simple consensus emerged. Speakers agreed that strong communication and transparency between those who create and use RAS, the regulators and the workforce were essential. They need to start talking early, to create a shared understanding of the technology and its associated risks and opportunities; keep talking throughout the process; and include all stakeholders. Further, they must keep a ‘whole system’ perspective, thinking of the human workforce set to work alongside the RAS. Finally, engagement of the supply chain is critical to gaining confidence in the technology and promoting scalable application of good practice.
Edited extracts from the Q&A
If it’s all about safety, how can regulators enable growth and innovation in the development and deployment of RAS in the UK?
Panel responses:
‘We’ve created a more collaborative approach with our regulators. We get them involved early, at the stages of development. … The regulators that we’re involved with now do have an open approach to innovation. We’ve asked: do we need to go back to the rule book? Are we now being regulated against an old set of rules which were against a different mission? We’ve had these open, transparent conversations, which have really helped us bring forward and be ambitious about how we want to be deploying robotics.’
‘The human factor is overlooked in the development of robotic equipment and AI, which often falls short in keeping the human in the loop. What if the machine breaks down? What’s the impact on the workforce? It’s about making sure that at all times, from the conception of the product to the end of delivering the equipment, the human factors are considered.’
‘You need an educated regulator and an educated industry.’
‘If you want to update the Health and Safety at Work Act, that’s going to take time to go through Parliament. By the time that’s implemented, RAS and AI might have a completely new form… The current legislation … is still fit for purpose, because it is non-prescriptive, so it can adapt to whatever is coming.’
Are we confident that our safety tools in the form of regulations are ready to tackle such a major challenge?
Panel responses:
‘A lot of that is around piloting tools and techniques that already exist … developing the understanding again, rather than necessarily saying that these tools aren’t suitable.’
‘This is a challenge that I put back to our regulators in a very safe space. Why are we being regulated like this? Is it because we’re nuclear? How are other sectors being regulated – who are going to be using very similar technologies, if not the same… Is it a fair playing field? The regulation needs to be proportional to the environment that it’s going to be deployed in.’
Going back to the title of this event, ‘Robotics and autonomous systems safety: a barrier or an opportunity for growth?’, I’m worried to hear that it’s more of a barrier, because there is no appetite for bringing in new regulations, since that takes a long time. How can we help organisations navigate the landscape of regulation and all the standards that we have?
Panel response:
‘I think it’s about having an informed population on all sides. I think it is about having clear guidance. But equally, if we pick up the transport bit, we’ve been through this before… Take 1994, when we privatised the railway industry. We had the railway inspector. They had a very, very clear interpretation of the Acts. We then found out that wasn’t working. As a country, we changed to the Office of Rail Regulator and gave them a broader mandate, but a more educated and more informed mandate. So, I see we’re probably in the same space, but just a number of years ahead. But, personally, I think this is going to be much more complex.’
How can we make sure that a system based on AI is capable and safe in every regard?
Panel responses:
‘It’s not about being able to demonstrate that it’s robust from a single measure. It’s looking at the whole picture.’
‘Are we expecting a computer to be 100% correct before we say we’ve put some trust into it? What is the environment, what could go wrong? And what could go right as well with it?’
This needs to be good for individuals and for wider society. How do you bring the workforce on board to buy into this, because it could be very transformative for society?
Panel response:
‘It’s about making the job better and safer. We’ve developed technologies around how we remove human hands from gloveboxes, for example. We had an incident that has driven us towards developing a set of technologies that mean we can remove those humans from that hazardous environment. We’re piloting that R&D to see if it works – with the people who are working in that environment. And the feedback from those operators is predominantly around how it could then improve other jobs. Their feedback…has been implemented into the end product. Bringing everyone along that journey with you is the crucial part of this… it’s about building that confidence and trust in the technology.’
While the airport I work for is very forward-thinking in its safety, most of the accidents we have involve third parties, like handlers. Is it reasonable for them to invest millions in RAS to do the right thing if it’s not commercially viable? How do we start the conversation?
Panel response:
‘As part of an optioneering study, you are looking at a manual process against a fully automated process. You might come out of that study agreeing that an initial study would be worthwhile, which would need to engage that operator workforce and look at the benefits to them and what concerns they might have. Do you need to put in place a training regimen to upskill your workforce as well? And look at transferring them to new roles rather than just assuming they will take up the same role that they have now. And that would be a low-cost way of assessing early whether there are implications that you haven’t thought of during that initial optioneering study. You look at both a manual and automated process in parallel up to a certain point and continue to review.’
Do you think regulators are behind the curve? I feel that industry is galloping ahead and we’re two steps behind. We talked about the need to develop the guidance, but to do that, we need to understand what our own stance is on various issues.
Panel responses:
‘I don’t think we are with our regulator [in the nuclear industry]. … The sandboxing approach brought together different sectors to be able to see what’s going on in other areas, in different environments. Although we operate in a hazardous environment, other sectors do as well. It’s just that the hazard is ever so slightly different.’
‘For once, I don’t think nuclear is [behind the curve]. I think we are trying to keep up with the game and take a leap forward where we’re trying to stay ahead of the game as well. That’s ultimately where we want to get to. I think we’re in a good place with our regulator, but I think we need to maintain that as well because it’s very easy to then slip behind. So it’s something that we need to maintain the momentum on.’
The questioner also asked the panel to consider human factors, referring to the work of Lisanne Bainbridge (Ironies of Automation, Automatica, Volume 19, Issue 6, November 1983, Pages 775-779). [Her paper discusses the ways in which automation of industrial processes may expand rather than eliminate problems with the human operator.]
‘The things that they can’t automate, they leave behind for the human to do. Those can often be boring tasks. We have this problem in the rail industry: we introduced in-cab signalling and drivers are left with a reduced task… The pride in being a driver, all that is lost. So how should we deal with that?’
Panel response:
‘We consider everything from a systems perspective and human factors are a very important part of that. Going back to our workshop with our regulator, we did have our lead from human factors there as well, which plays a very important role around undertaking mundane tasks. And is it right for the operator to be, using your words, just left with the things that are too difficult for the machine? If it is, then the whole system needs to be considered rather than just the individual component parts. Today, we’re still at the proof-of-concept stage – we’re looking at where this can drive value and make a difference. It’s then about coupling that system or bits of that system together and looking at what that means collectively rather than just in isolation.’
I’d like your opinion about the actual robotics industry and its capability. We need to think about the whole cycle of early deployment of RAS.
Panel responses:
‘The nuclear industry offers a lot of problems that can kickstart a lot of supplier developments. We’ve seen that technology becomes transferable to other sectors. One example has been working with a big-name robotics manufacturer. We had a very specific capability looking at a mobile robot base with a manipulator robot arm. Because of some of our requirements, there’s been a strong emphasis on how we integrate that system to ensure it is safe for use. This has become a product that the supplier can use in other industries.’
‘The industry is very fragmented across the world. There aren’t many massive players controlling the space from an industry perspective. Every single company we’ve worked with protects their intellectual property. That doesn’t help with standardisation and transparency. The rail industry set up the Railway Safety Standards Board and the wind industry has Safety On, which tries to bring the wind industry’s standards up. There’s going to have to be a similar approach if we are going to get the right balance between the regulators and the whole industry.’
‘That was exactly why the UK Task Force was initiated. We brought all the sectors together to work on ideas, to look at what is needed and the common challenges, and to understand how future technologies could help grow the robotics industry in the UK.’