Horizon CDT Research Highlights


Robotics and the Law: Towards responsible and sustainable adoption of industrial collaborative embodied autonomous systems in the case of digital manufacturing

  Natalie Leesakul (2018 cohort)   www.linkedin.com/in/natalieleesakul


  • Dr Richard Hyde, School of Law
  • Dr Max L. Wilson, School of Computer Science
  • Dr Derek McAuley, School of Computer Science
  • Dr Lachlan Urquhart (external supervisor), University of Edinburgh

In Industry 4.0, smart machines are digitally connected with one another, sharing information and making autonomous decisions. This new wave of industrial revolution is expected to enable organizations to strengthen their competitive advantage by fundamentally reconfiguring business operations to improve productivity, reduce risks, and increase product quality [1]. Industry 4.0 has therefore become a popular research topic in both academic and business arenas. However, it is a broad concept with multiple layers incorporated into the movement. Given the interests of my industry partner, DigiTOP, I am motivated to examine the use of industrial collaborative embodied autonomous systems in digital manufacturing.

Robots have been used in manufacturing since 1961 [2]. Traditional industrial robots, however, are kept behind barriers under strict safety protocols: sensors and systems prevent people from coming into close proximity with a robot while it is operating and ensure that the robot stops automatically when a person is within a certain range. Collaborative robots ("cobots"), by contrast, are designed to work alongside humans, which raises concerns beyond current industrial robot safety standards. For example, isolating a robot in a cage with multiple sensors is no longer an applicable safety measure. The safety challenges of cobots cannot be overcome simply by installing standard movement detection; safety standards are also needed for the decision-making criteria programmed into the robots themselves. Because cobots learn from human workers, the system must enable a robot to distinguish desirable behaviours from harmful ones, so that it replicates only appropriate human gestures. Moreover, the interactions between cobots and humans normalise continuous collection and processing of personal data, which can challenge the principle of privacy. The purpose of this socio-legal PhD research is therefore to examine the legal, ethical and social challenges posed by cobots within the context of digital manufacturing. The goal is to design a digital toolkit to help businesses adopt industrial collaborative embodied autonomous systems in a responsible manner.

Since end users, technologists, and law enforcers must collaborate in order to tackle these challenges effectively [4], my research focuses on understanding the pressing concerns of different stakeholders around the implementation of cobots in the workplace. I have interviewed 15 experts in the areas of law, policy making, consultancy, manufacturing, and robotics. Based on the findings of this study, I drafted a framework showing the relationships among the different underlying challenges posed by cobots. This framework will be used as the basis for determining the relevant legal frameworks for further evaluation of the current law. The toolkit will draw on the different concepts explored in my PhD.

[1] Md. Abdul Moktadir and others, 'Assessing challenges for implementing Industry 4.0: Implications for process safety and environmental protection' (2018) 117 Process Safety and Environmental Protection 730

[2] Johanna Wallén, 'The History of the Industrial Robot' (2008) Linköping University Electronic Press 18, 13

[3] F Patrick Hubbard, '"Sophisticated Robots": Balancing Liability, Regulation, and Innovation' (2014) 66(5) Florida Law Review 1803

[4] Lachlan Urquhart and Tom Rodden, 'New directions in information technology law: learning from human–computer interaction' (2017) 31(2) International Review of Law, Computers & Technology 150

This author is supported by the Horizon Centre for Doctoral Training at the University of Nottingham (RCUK Grant No. EP/L015463/1) and DigiTOP.