National Centre for Nuclear Robotics (NCNR)

Principal Investigator: Kaspar Althoefer
Co-Investigators: Lorenzo Jamone, Ildar Farkhatdinov, Andrea Cavallaro, Stefan Poslad, Miles Hansard

Nuclear facilities require a wide variety of robotics capabilities, engendering a range of extreme RAI challenges. NCNR brings together a diverse consortium of experts in robotics, AI, sensors, radiation and resilient embedded systems, to address these complex problems. In high gamma environments, human entries are not possible at all. In alpha-contaminated environments, air-fed suited human entries are possible, but engender significant secondary waste (contaminated suits) and reduced worker capability. We have a duty to eliminate the need for humans to enter such hazardous environments wherever technologically possible. Hence, nuclear robots will typically be remote from human controllers, creating significant opportunities for advanced telepresence. However, limited bandwidth and situational awareness demand increased intelligence and autonomous control capabilities on the robot, especially for performing complex manipulations. Shared control, where both human and AI collaboratively control the robot, will be critical because i) safety-critical environments demand a human in the loop, while ii) complex remote actions are too difficult for a human alone to perform reliably and efficiently.

This page summarises key contributions of the QMUL team to the project. For more information, visit the main webpage of the NCNR project.

Tentacle-like eversion robots for penetration into inaccessible, harsh environments

Taqi Abrar, Fabrizio Putzu, Ataka Rizqi, Hareesh Godaba, Kaspar Althoefer

  • Access to otherwise inaccessible spaces through narrow openings:
    • Advancement of the robot over distances of up to 20 metres demonstrated in realistic environments
    • Increased manoeuvrability using integrated actuation pouches achieved
    • Delivery and remote operation of vision sensors achieved
    • Initial robot models and control strategies developed to assist human tele-operator
  • Performance is currently state-of-the-art in terms of accessibility and manoeuvrability.
  • User-friendly and intuitive tele-operation software is being developed to enable a human operator to steer and control the remote robot.
  • An interface to the mobile robot platform (Lincoln) is currently being developed.
  • The hardware and software developments are at TRL 3+.

[paper] [paper2] [video] [poster]

Inflatable fabric-based robot grasping devices

Ahmed Hassan, Faisal Aljaber, Hareesh Godaba, Ivan Vitanov, Kaspar Althoefer

  • Fabric-based robot fingers that can be inflated and combined to form resilient grippers:
    • Fingers are made of fabric-based airtight structures.
    • Applying air pressure makes the fingers bend, because one side of the finger structure is pleated.
    • Maximum payload achieved: 10 kg.
  • Grippers have been fabricated and used to demonstrate the capability to grasp a range of unknown, differently shaped objects.
  • The approach requires little control or navigation strategies to achieve reliable grasps.
  • Interfaces to attach the grippers to Franka and UR5 robots are currently being developed.
  • Tactile sensing is currently being integrated.
  • TRL 3 – demonstrated in a lab environment.
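
As a minimal illustration of the pneumatic actuation principle described above, the sketch below assumes a hypothetical linear calibration between supply pressure and finger bend angle; the gain and saturation values are invented for illustration, not measured on the real fabric fingers.

```python
# Hypothetical linear pressure-to-bend model for one pleated fabric finger.
# The gain (deg/kPa) and saturation angle are illustrative assumptions.

def bend_angle_deg(pressure_kpa, gain=0.9, max_angle=180.0):
    """Estimate finger curl for a given supply pressure, with saturation."""
    return min(gain * pressure_kpa, max_angle)

def pressure_for_angle(target_deg, gain=0.9):
    """Invert the linear model to find the supply pressure for a desired curl."""
    return target_deg / gain
```

In practice such a mapping would be fitted from calibration data for each finger, since fabric structures exhibit hysteresis and load-dependent behaviour.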

[paper] [video] [poster]

Surface fracture (crack) detection and exploration

Francesca Palermo, Bukeikhan Omarali, Changjae Oh, Stefan Poslad, Kaspar Althoefer, Ildar Farkhatdinov

  • A crack detection algorithm based on multi-modal visual and tactile feedback, together with a motion planning algorithm for optimal fracture exploration using a fibre-optic sensor attached as the end-effector of a robotic manipulator
  • An important task often performed in remote hazardous environments is the detection of mechanical fractures in objects such as containers, tanks, pipes and other technical systems used for storing chemical and radioactive waste. A cracked surface is mechanically weakened and may no longer perform its function; undetected fractures can lead to larger, catastrophic macro-scale failures.
  • The sensor is based on fibre optics, which are resistant to gamma radiation.
  • The sensor is independent of the robot used.
  • The algorithm is domain-agnostic and can be applied beyond the nuclear sector.
  • The estimated TRL is 3-4; experiments have been performed in a laboratory environment.
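
A minimal sketch of the late-fusion idea behind the multi-modal bullet above: each modality produces a crack probability and the two are combined with a weighted average. The weights and threshold here are assumptions for illustration; the published algorithm is more sophisticated.

```python
# Illustrative late fusion of visual and tactile crack evidence.
# The fusion weight and decision threshold are assumed placeholders.

def fuse_crack_scores(p_visual, p_tactile, w_visual=0.5):
    """Weighted average of per-modality crack probabilities in [0, 1]."""
    return w_visual * p_visual + (1.0 - w_visual) * p_tactile

def is_crack(p_visual, p_tactile, threshold=0.5):
    """Declare a crack when the fused evidence exceeds the threshold."""
    return fuse_crack_scores(p_visual, p_tactile) > threshold
```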

[paper1] [paper2] [paper3] [video1] [video2] [poster]

GRIP: Grasping Robot Integration and Prototyping

Brice Denoun, Rodrigo Zenha, Beatriz Leon, Miles Hansard, Lorenzo Jamone

  • GRIP is a ROS-based open-source software framework to quickly and easily design robot manipulation tasks with integrated software and hardware components.
  • GRIP is robot agnostic (any robot can be used as long as it is ROS-compatible) and it can be used to design robotic tasks in the nuclear environment.
  • A tactile slip detection module was developed using the uSkin sensor (XELA Robotics), a sensor that can measure both normal and shear forces on multiple contact points. Detection is done with a classifier that was trained with data collected “in the wild” during robot grasping operations in a semi-structured environment.
  • TRL 4-5. In a user study with more than 20 participants, researchers used GRIP to design robotic manipulation tasks in a mock-up industrial environment.
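
The slip detection module above uses a trained classifier on uSkin readings; as a simplified stand-in, the sketch below applies a classic friction-cone test to a single contact's shear and normal forces. The friction coefficient is an assumed placeholder, and `fx, fy, fz` are hypothetical taxel force components.

```python
import math

# Simplified slip heuristic on one contact's force readings. The real GRIP
# module uses a classifier trained on uSkin data; this friction-cone test
# only illustrates the underlying physics.

def slip_risk(fx, fy, fz, mu=0.6):
    """Tangential-to-normal force ratio relative to the friction cone.

    Values >= 1.0 mean the contact is at or beyond the cone boundary,
    i.e. the grasped object is likely slipping.
    """
    if fz <= 0.0:
        return float("inf")  # no normal load: contact already lost
    return math.hypot(fx, fy) / (mu * fz)

def is_slipping(fx, fy, fz, mu=0.6):
    return slip_risk(fx, fy, fz, mu) >= 1.0
```

A learned classifier improves on this heuristic because the effective friction coefficient of an unknown object is not observable in advance.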

[paper] [paper2] [video] [poster] [documentation]

Additional videos
[video1] [video2] [video3] [video4]

Soft gripper with variable stiffness hinges and integrated electro-adhesive pads

C. Liu, Hareesh Godaba, A. Sajad, N. Patel, Kaspar Althoefer, Ketao Zhang

  • A light-weight and low-cost two-fingered soft gripper capable of multiple modes for grasping a wide range of objects and adapting to unstructured environments
  • Developed a novel design approach and model for a variable stiffness flexure hinge based on shape morphing
  • The combination of pneumatic actuation and a tendon-driven system requires minimal electronics, making the gripper suitable for operation in extreme environments, including nuclear and oil/gas sites
  • A ROS-compatible control system makes the gripper a modular unit that can be readily integrated with different types of robot arms

[paper] [video] [poster]

Feature learning for a vision-based mobile agent to explore an unseen environment

Changjae Oh, Andrea Cavallaro

  • A feature learning algorithm for a vision-based mobile agent to explore a previously unseen environment
  • The method improves the training convergence rate of reinforcement learning-based mapless navigation, especially when the reward is sparse (i.e. the goal is far from the initial location)
  • The method enables mobile robots to explore unseen nuclear environments for data collection and navigation (the collected data may be of a different nature than visual; video is used for navigation)
  • The estimated TRL is 2: the technology concept has been formulated and demonstrated in robot simulation
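
The sparse-reward issue noted above is commonly mitigated by potential-based reward shaping, shown below as a generic illustration; this is a standard technique, not the feature-learning method this project developed. Using the negative Euclidean distance to the goal as the potential densifies the reward without changing the optimal policy.

```python
# Potential-based reward shaping: r' = r + gamma * phi(s') - phi(s).
# Goal and positions are hypothetical 2D coordinates for illustration.

def potential(position, goal):
    """Negative Euclidean distance to the goal acts as the potential phi."""
    return -sum((p - g) ** 2 for p, g in zip(position, goal)) ** 0.5

def shaped_reward(sparse_r, state, next_state, goal, gamma=0.99):
    """Add the shaping term to the (possibly zero) sparse reward."""
    return sparse_r + gamma * potential(next_state, goal) - potential(state, goal)
```

With this shaping, a step towards the goal yields positive reward even before the goal is reached, which is exactly the regime where sparse-reward training converges slowly.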

[paper] [paper2] [video] [poster]

Virtual reality based robot teleoperation

Bukeikhan Omarali, Kaspar Althoefer, Ildar Farkhatdinov

  • Virtual reality is used to control a remotely located robot manipulator
  • The human operator uses a head-mounted display and hand-held joysticks to command a virtual model of the robot
  • A dedicated VR interface and hand gestures are used for the commands
  • A set of static and dynamic 3D cameras are used to visualise the remote scene
  • The system was tested with multiple users in the laboratory environment (TRL 3-4)

[paper1] [paper2] [video]

Soft haptic interface for teleoperation

Joshua Brown, Ildar Farkhatdinov

  • We combine particle jamming effects with vibrotactile haptic actuators
  • The stiffness and hardness of the particle jamming medium are controlled with air pressure
  • Vibrations from a coil-type linear resonant actuator propagate differently as the jamming medium changes its density
  • Combining vibrotactile actuation with jamming effects allows haptic feedback that renders different textures and degrees of softness
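
A minimal sketch of how the two effects above could be combined in a rendering loop: vacuum pressure sets the perceived softness of the jammed medium, and the LRA drive amplitude is scaled for texture, with a compensation term because a softer (unjammed) medium transmits vibration less effectively. All constants and mappings are assumptions for illustration, not the implemented controller.

```python
# Hypothetical softness/texture rendering mappings for a jamming + LRA
# haptic interface. Pressure range and amplitude gains are assumed values.

def jamming_pressure_kpa(softness, p_vac_max=60.0):
    """Map desired softness in [0, 1] to vacuum pressure:
    0 -> fully jammed (rigid, maximum vacuum), 1 -> fully soft (no vacuum)."""
    s = min(max(softness, 0.0), 1.0)
    return (1.0 - s) * p_vac_max

def lra_drive(roughness, softness, base_amp=0.4):
    """Scale LRA drive amplitude for texture roughness, boosting the drive
    when the medium is soft to compensate for vibration damping."""
    r = min(max(roughness, 0.0), 1.0)
    s = min(max(softness, 0.0), 1.0)
    return min(base_amp * r * (1.0 + s), 1.0)
```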

[paper1] [paper2] [video]