National Centre for Nuclear Robotics

Established: 2018
Research type: Research
Budget: £42 million (2018)
Fields of research:
Artificial intelligence
Robotics
Environmental science
Photon science
Data science
Computational science
Director: Rustam Stolkin
Location: Birmingham
Affiliations:
University of Birmingham
University of Edinburgh
Lancaster University
University of Essex
Queen Mary University of London
University of Lincoln
University of Bristol
University of the West of England, Bristol
Operating agency: University of Birmingham
Website: www.ncnr.org.uk

The National Centre for Nuclear Robotics (NCNR) is a science and engineering research consortium of eight universities in the United Kingdom, led by the University of Birmingham, that aims to develop technologies to address the problem of nuclear waste in the UK.[1][2][3] As part of the initiative, NCNR is developing technologies such as machine vision, artificial intelligence and advanced robotics to decommission the 4.9 million tonnes of nuclear waste generated by the country's nuclear industry since the early 1950s.[4]

Overview

NCNR was launched in 2018 after the University of Birmingham obtained £42 million in funding, co-funded by the Engineering and Physical Sciences Research Council, to establish it.[3][5] The consortium consists of eight universities: the University of Birmingham, University of Bristol, University of Edinburgh, Lancaster University, University of Essex, Queen Mary University of London, University of Lincoln and the University of the West of England. As of 2018, Rustam Stolkin is the Director of NCNR.[6][3]

A research team at the University of Birmingham studied why robots need to understand the motive behind a task, as humans do, in order to work safely alongside people. The team consisted of Valerio Ortenzi, Marco Controzzi, Francesca Cini, Juxi Leitner, Matteo Bianchi, Maximo A. Roa and Peter Corke, and the research was published in the journal Nature Machine Intelligence.[7][8][9]

Lancaster University aims to develop computing software that makes robots semi-autonomous, simplifying remote human control, which is necessary because of the highly radioactive environments surrounding the nuclear waste in which the robots operate. Researchers at Lancaster developed a mobile robotic system with two manipulator arms, fitted with a Microsoft Kinect camera and imaging software, making it easier to identify, grasp and cut objects such as metal pipes, which are commonly found at nuclear decommissioning sites. The research team consisted of Manuel Bandala, Craig West, Stephen Monk, Allahyar Montazeri and James Taylor, and the research was published in the MDPI journal Robotics.[10][11][12]
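
The kind of vision-assisted grasping described above can be illustrated with a brief, hypothetical sketch (it is not the Lancaster team's actual software): given a depth frame such as a Kinect produces, the nearest surface inside the manipulator's working range is segmented and its centroid is back-projected to a 3D target point using nominal camera intrinsics.

    import numpy as np

    # Nominal Kinect-style intrinsics; assumed values for illustration only.
    FX, FY = 525.0, 525.0      # focal lengths in pixels
    CX, CY = 319.5, 239.5      # principal point

    def grasp_target_from_depth(depth_m, near=0.5, far=1.5):
        """Return a crude 3D grasp target: the centroid of the nearest
        surface within the working range, back-projected into camera
        coordinates with the pinhole model."""
        mask = (depth_m > near) & (depth_m < far)
        if not mask.any():
            return None
        vs, us = np.nonzero(mask)        # pixel rows/columns of candidate points
        d = depth_m[vs, us].mean()
        u, v = us.mean(), vs.mean()
        x = (u - CX) * d / FX            # back-project pixel (u, v) at depth d
        y = (v - CY) * d / FY
        return np.array([x, y, d])       # metres, camera frame

    if __name__ == "__main__":
        # Synthetic 480x640 depth frame: background at 3 m, a pipe-like object at 1 m.
        depth = np.full((480, 640), 3.0)
        depth[220:260, 100:540] = 1.0
        print(grasp_target_from_depth(depth))

In an assisted tele-operation setting the operator would then confirm or adjust such a proposed target rather than steering the arm entirely by hand.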

As part of the consortium, the University of Essex researches the effects of radiation on the electronics of robotic systems and develops new methodologies to make the hardware and software of those electronic systems more resilient and robust against radiation damage. The work is an active collaboration between the consortium partners and the NASA Jet Propulsion Laboratory.[13] Researchers at Essex developed a methodology named SoCodeCNN, which bridges the gap between natural language processing and computer vision by converting program source code into images that convolutional neural networks can classify, with the aim of increasing the resilience of electronic hardware and software against radiation damage. The team behind SoCodeCNN was led by Klaus McDonald-Maier and consisted of Somdip Dey, Amit Kumar Singh and Dilip Kumar Prasad. SoCodeCNN was highlighted as the most popular paper in the journal IEEE Access after publication.[14][15][16]
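
The core idea behind SoCodeCNN, treating program source code as image-like data that a convolutional network can classify, can be sketched as follows. This is a simplified, hypothetical illustration rather than the published implementation; the byte-to-image mapping, the image size and the network architecture are assumptions made for demonstration.

    import numpy as np
    import torch
    import torch.nn as nn

    def code_to_image(source: str, size: int = 64) -> torch.Tensor:
        """Map raw source-code bytes onto a fixed-size greyscale 'image'
        (a simplified stand-in for SoCodeCNN's conversion pipeline)."""
        data = np.frombuffer(source.encode("utf-8"), dtype=np.uint8)
        data = np.resize(data, size * size).astype(np.float32) / 255.0  # pad/crop by repetition
        return torch.from_numpy(data).reshape(1, 1, size, size)

    class TinyCodeCNN(nn.Module):
        """Minimal CNN classifier over code images (illustrative architecture only)."""
        def __init__(self, n_classes: int = 3):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(32 * 16 * 16, n_classes)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    if __name__ == "__main__":
        img = code_to_image("int main() { return 0; }")
        logits = TinyCodeCNN()(img)       # untrained network, shapes only
        print(logits.shape)               # torch.Size([1, 3])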

The University of Lincoln is developing vision-guided, self-learning mobile robots. The research team at Lincoln aims to build systems that use machine learning to adapt to the unique conditions of nuclear sites contaminated by radiation. The team is led by Gerhard Neumann.[17]

As part of the consortium, the University of Bristol developed specially equipped drones that use remote-sensing lidar to map the most radioactive places in Chernobyl's Red Forest.[18][19]
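
As a rough illustration of the mapping step, the sketch below bins georeferenced dose-rate samples, such as those gathered during a drone survey, into a regular grid and reports the hottest cell. The data, field names and grid resolution are invented for the example; it is not the Bristol team's processing pipeline.

    import numpy as np

    def hotspot_map(easting, northing, dose_uSv_h, cell=10.0):
        """Bin georeferenced dose-rate samples into a regular grid and
        return the mean dose rate per cell."""
        e, n, d = map(np.asarray, (easting, northing, dose_uSv_h))
        ix = ((e - e.min()) // cell).astype(int)
        iy = ((n - n.min()) // cell).astype(int)
        total = np.zeros((iy.max() + 1, ix.max() + 1))
        count = np.zeros_like(total)
        np.add.at(total, (iy, ix), d)      # accumulate dose readings per cell
        np.add.at(count, (iy, ix), 1)      # number of samples per cell
        return np.divide(total, count, out=np.zeros_like(total), where=count > 0)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        e = rng.uniform(0, 200, 1000)      # synthetic flight track, metres
        n = rng.uniform(0, 200, 1000)
        dose = 0.5 + 40.0 * np.exp(-((e - 150)**2 + (n - 60)**2) / 500.0)  # one hotspot
        grid = hotspot_map(e, n, dose)
        iy, ix = np.unravel_index(np.argmax(grid), grid.shape)
        print("hottest cell (row, col):", iy, ix, "mean dose:", round(grid[iy, ix], 1), "uSv/h")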

References

  1. Andrew Wade (2019-05-08). "UK to tackle nuclear waste with robots and AI" (in en-US). https://www.theengineer.co.uk/nuclear-waste-ncnr-robots/. 
  2. Jeevan, Nivash (2021-10-27). "How AI And Robotics Is Used In Nuclear Research" (in en-US). https://analyticsindiamag.com/ai-robotics-in-nuclear-research/. 
  3. "University of Birmingham secures funding for National Centre for Nuclear Robotics" (in en). https://sciencebusiness.net/network-news/university-birmingham-secures-funding-national-centre-nuclear-robotics. 
  4. "Cleaning up nuclear waste is an obvious task for robots". The Economist. 2019-06-20. ISSN 0013-0613. https://www.economist.com/science-and-technology/2019/06/20/cleaning-up-nuclear-waste-is-an-obvious-task-for-robots. 
  5. "Grants on the web by EPSRC" (in en). 2022-03-30. https://gow.epsrc.ukri.org/NGBOViewGrant.aspx?GrantRef=EP/R02572X/1. 
  6. National Centre for Nuclear Robotics (2021-12-13). "Home" (in en). https://www.ncnr.org.uk/. 
  7. "Why robots need to understand motive like humans do" (in en-US). 2019-08-18. https://www.therobotreport.com/robots-understand-motive-like-humans/. 
  8. Ortenzi, V.; Controzzi, M.; Cini, F.; Leitner, J.; Bianchi, M.; Roa, M. A.; Corke, P. (2019-08-09). "Robotic manipulation and the role of the task in the metric of success" (in en). Nature Machine Intelligence 1 (8): 340–346. doi:10.1038/s42256-019-0078-4. ISSN 2522-5839. https://www.nature.com/articles/s42256-019-0078-4. 
  9. "Robots need a new philosophy to get a grip" (in en). https://www.sciencedaily.com/releases/2019/08/190812094446.htm. 
  10. "Developing semi-automatic nuclear decommissioning robots" (in en). https://www.lancaster.ac.uk/energy-lancaster/about-us/news/developing-semi-automatic-nuclear-decommissioning-robots-1. 
  11. Bandala, Manuel; West, Craig; Monk, Stephen; Montazeri, Allahyar; Taylor, C. James (2019-06-04). "Vision-Based Assisted Tele-Operation of a Dual-Arm Hydraulically Actuated Robot for Pipe Cutting and Grasping in Nuclear Environments" (in en). Robotics 8 (2): 42. doi:10.3390/robotics8020042. 
  12. "UK researchers develop semi-automatic robots : Waste & Recycling - World Nuclear News". https://www.world-nuclear-news.org/Articles/UK-researchers-develop-semi-automatic-robots. 
  13. "Essex University role in £42m nuclear robotics venture | Business Weekly | Technology News | Business news | Cambridge and the East of England". https://www.businessweekly.co.uk/news/academia-research/essex-university-role-%C2%A342m-nuclear-robotics-venture. 
  14. Dey, Somdip; Singh, Amit Kumar; Prasad, Dilip Kumar; Mcdonald-Maier, Klaus Dieter (2019). "SoCodeCNN: Program Source Code for Visual CNN Classification Using Computer Vision Methodology". IEEE Access 7: 157158–157172. doi:10.1109/ACCESS.2019.2949483. ISSN 2169-3536. https://ieeexplore.ieee.org/document/8882216. 
  15. "Research could make robots more resilient than ever before | University of Essex". https://www.essex.ac.uk/news/2020/06/11/research-could-make-robots-more-resilient-than-ever-before. 
  16. Ahmed, Shahjahan (2020-06-11). "Research could make robots more resilient than ever before" (in en-US). http://roboticsnews.co.uk/2020/06/11/research-could-make-robots-more-resilient-than-ever-before/. 
  17. "Robots get smarter to help with decommissioning - World Nuclear News". https://www.world-nuclear-news.org/Articles/Robots-get-smarter-to-help-with-decommissioning. 
  18. "First Drone Survey Finds New Radiation Hotspots in Chernobyl's 'Red Forest'" (in en-US). 2019-05-10. https://www.ntd.com/first-drone-survey-finds-new-radiation-hotspots-in-chernobyls-red-forest_327735.html. 
  19. "A Chernobyl sono stati rilevati livelli di radiazioni finora non noti" (in it). https://tech.everyeye.it/notizie/chernobyl-rilevati-livelli-radiazioni-fin-ora-non-noti-383958.html. 
