Active Research Projects

National Centre for Nuclear Robotics (EPSRC)

  • QMUL PI: Kaspar Althoefer
  • QMUL Co-Is: Lorenzo Jamone, Ildar Farkhatdinov, Andrea Cavallaro, Stefan Poslad, Miles Hansard
  • QMUL budget: £1,020,239

Nuclear facilities require a wide variety of robotics capabilities, giving rise to an equally wide range of extreme robotics and AI (RAI) challenges. NCNR brings together a diverse consortium of experts in robotics, AI, sensors, radiation and resilient embedded systems to address these complex problems. In high-gamma environments, human entry is not possible at all. In alpha-contaminated environments, air-fed suited human entry is possible, but it generates significant secondary waste (contaminated suits) and reduces worker capability. We have a duty to eliminate the need for humans to enter such hazardous environments wherever technologically possible. Nuclear robots will therefore typically be remote from their human controllers, creating significant opportunities for advanced telepresence. However, limited bandwidth and situational awareness demand increased intelligence and autonomous control capabilities on the robot, especially for performing complex manipulations. Shared control, in which human and AI collaboratively control the robot, will be critical because i) safety-critical environments demand a human in the loop, yet ii) complex remote actions are too difficult for a human to perform reliably and efficiently alone.
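
To make the shared-control idea concrete, the sketch below blends an operator's command with an autonomous controller's suggestion using a single authority weight. This is a minimal illustration with assumed names and values, not the NCNR control architecture.

```python
import numpy as np

def blend_commands(human_cmd, auto_cmd, authority):
    """Blend 6-DoF velocity commands; authority in [0, 1].

    authority = 1.0 -> pure teleoperation (human fully in the loop);
    authority = 0.0 -> fully autonomous execution.
    """
    a = float(np.clip(authority, 0.0, 1.0))
    return a * np.asarray(human_cmd, float) + (1.0 - a) * np.asarray(auto_cmd, float)

# Hypothetical example: over a low-bandwidth link the robot leans on its
# own grasp-approach controller while the operator keeps coarse guidance.
human = [0.10, 0.00, 0.02, 0.0, 0.0, 0.1]   # operator joystick (m/s, rad/s)
auto = [0.08, 0.01, 0.00, 0.0, 0.0, 0.0]    # autonomous suggestion
print(blend_commands(human, auto, authority=0.3))
```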

MAN^3: Human-inspired robotic manipulation for advanced manufacturing (EPSRC)

  • PI: Lorenzo Jamone
  • Budget: £310,597

The aim of this project is to develop a system for natural human demonstration of robotic manipulation tasks, combining immersive Virtual Reality technologies and smart wearable devices (to interface the human with the robot) with robot sensorimotor learning techniques and multimodal artificial perception (inspired by the human sensorimotor system). The robotic system will include a set of sensors that allow it to reconstruct the real world, in particular by integrating 3D vision with tactile information about contacts; the human user will access this artificial reconstruction through an immersive Virtual Reality interface that combines visual and haptic feedback. In other words, the user will see through the eyes of the robot and feel through the hands of the robot. Users will also be able to move the robot simply by moving their own limbs. This will allow human users to easily teach complex manipulation tasks to robots, and robots to learn efficient control strategies from the human demonstrations, so that they can then repeat the task autonomously in the future. Human demonstration of simple robotic tasks has already found its way into industry (e.g. robotic painting, simple pick-and-place of rigid objects), but it still cannot be applied to the dexterous handling of generic objects (e.g. soft and delicate objects), which would open up much wider applicability (e.g. food handling). Therefore, the expected results of this project will boost productivity in a large number of industrial processes (economic impact) and improve the working conditions and quality of life of human workers in terms of safety and engagement (social impact).
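
As a toy illustration of learning from demonstration, the sketch below fits a linear policy to recorded state-action pairs via ridge regression (behavioural cloning in its simplest form). All data, dimensions and names are assumptions for illustration, not the project's actual learning method.

```python
import numpy as np

# Hypothetical demonstration data: perceived states (e.g. object pose plus
# tactile contact features) paired with the actions the human commanded
# through the VR interface.
rng = np.random.default_rng(0)
states = rng.normal(size=(500, 12))                       # 12-D state (assumed)
true_map = rng.normal(size=(12, 7))                       # unknown demo policy
actions = states @ true_map + 0.01 * rng.normal(size=(500, 7))  # 7-DoF commands

# Behavioural cloning as ridge regression:
# minimise ||states @ W - actions||^2 + lam * ||W||^2.
lam = 1e-3
W = np.linalg.solve(states.T @ states + lam * np.eye(12), states.T @ actions)

def policy(state):
    """Replay the learned mapping from perceived state to arm command."""
    return state @ W

print(policy(states[0]))  # predicted action for the first demonstrated state
```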

Get a Move on (EPSRC)

  • PI: Ildar Farkhatdinov
  • Co-Is: Stuart Miller, Dylan Morrissey, Kaspar Althoefer
  • Budget: £50,000

There are around 1.2 million wheelchair users in the UK, and many of them have limited access to social care and healthcare due to restricted mobility and increasing pressure on the NHS. One way to motivate wheelchair users to keep fit is to introduce suitable mobility-tracking technologies; yet, in contrast to the many walking trackers available, there are almost no feasible solutions for wheelchair users. We propose to develop a mobility tracker for wheelchair users. The system will be user-centred, inexpensive and adaptable to passive and active wheelchairs. The tracker will be based on combining smartphone navigation data with wheelchair kinematics. It will estimate the user's mobility during wheelchair propulsion (the difference between wheelchair and user movements) as well as the overall navigation path and user effort (total movement). The wheelchair movement will be tracked with the help of a rotary sensor and a microcontroller wirelessly connected to the user's smartphone. Optional auditory and visual feedback will inform the user about the tracking. Collected data will be automatically analysed, and an estimate of the user's effort will be available to the user and to relevant social-care/healthcare professionals. With the help of our system, we shall investigate how different wheelchair propulsion patterns affect users' fitness. The outcomes of the project will have a direct impact on the fitness of wheelchair users (including the elderly population) and on their motivation to increase their activity.
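
A minimal sketch of the wheel-odometry side of such a tracker is shown below: distance rolled is recovered from encoder ticks and compared with the smartphone's displacement estimate. The encoder resolution, wheel radius and GPS figure are all assumed values for illustration.

```python
import math

TICKS_PER_REV = 1024      # rotary-sensor resolution (assumed)
WHEEL_RADIUS_M = 0.30     # manual-wheelchair rear-wheel radius (assumed)

def distance_from_ticks(tick_count: int) -> float:
    """Distance rolled by the instrumented wheel, in metres."""
    revolutions = tick_count / TICKS_PER_REV
    return revolutions * 2.0 * math.pi * WHEEL_RADIUS_M

# Effort proxy: wheel distance vs. straight-line displacement reported by
# the smartphone's navigation data (hypothetical numbers).
wheel_distance = distance_from_ticks(15_000)
gps_displacement = 25.0  # metres (assumed smartphone estimate)
print(f"wheel distance: {wheel_distance:.1f} m, "
      f"manoeuvring overhead: {wheel_distance - gps_displacement:.1f} m")
```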

Automatic Posture and Balance Support for Supernumerary Robotic Limbs (EPSRC)

  • PI: Ildar Farkhatdinov
  • Budget: £466,000

Supernumerary (additional) robotic limbs augment human bodies with extra mobility and manipulation capabilities. Such systems can be efficient in collaborative manufacturing, logistics, mobility support and exploration tasks. It is well known that material-handling tasks in industry can often cause harmful working postures, potentially leading to musculoskeletal disorders and occupational injuries. This project aims to develop novel techniques to address the ergonomics and safety of supernumerary robotic limbs. We will develop a novel wearable robotic system for posture and balance support and integrate its control with the supernumerary robotic limbs for material handling. We consider a healthy human user wearing a pair of supernumerary robotic arms at hip level. The human body is mechanically coupled with the robot, which makes the safety and ergonomics aspects crucial for a successful application. The back-support system will be based on conventional servo-actuation and novel soft-material structures for efficient and dexterous spine-shape correction. The balance-assistive system will be composed of a robotic link whose movement counterbalances the load-carrying task through active compensation control.
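
As a back-of-the-envelope illustration of such compensation, the sketch below solves the static hip-moment balance for the angle of a rear counterbalance link. All masses and lengths are assumed values, and the real controller would of course be dynamic rather than static.

```python
import math

M_LOAD = 5.0   # kg, load carried in front of the body (assumed)
D_LOAD = 0.30  # m, horizontal lever arm of the load about the hip (assumed)
M_CW = 4.0     # kg, counterweight at the tip of the assistive link (assumed)
L_CW = 0.60    # m, length of the counterbalance link (assumed)

def counterbalance_angle(m_load, d_load, m_cw, l_cw):
    """Link angle from vertical (rad) that cancels the load's hip moment.

    Static balance: m_cw * g * l_cw * sin(theta) = m_load * g * d_load
    (g cancels). Returns None when even a horizontal link cannot compensate.
    """
    ratio = (m_load * d_load) / (m_cw * l_cw)
    if ratio > 1.0:
        return None
    return math.asin(ratio)

theta = counterbalance_angle(M_LOAD, D_LOAD, M_CW, L_CW)
print(f"commanded link angle: {math.degrees(theta):.1f} deg from vertical")
```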

WormBot (Innovate UK)

  • QMUL PI: Kaspar Althoefer

The project is led by Q-Bot. Q-Bot specialises in robotic services in the built environment that allow easier, cheaper, safer and more effective repair, maintenance and upgrade of buildings and infrastructure. Q-Bot will be the end user of the WormBot system, providing robot-enabled services, initially applying underfloor insulation to buildings (at a fraction of the current cost, and with none of the disadvantages of traditional methods) using a robot to work in an environment that is currently inaccessible to human operatives without prohibitive disruption and expense. The service is already being commercialised (using much more cumbersome hardware) with a number of clients, including Local Authorities and Housing Associations; over 100 sites have been successfully insulated and over 300 more are committed to by our clients. This project builds on ground-breaking robotics innovation in the area of soft and flexible robotic manipulators by the Centre for Advanced Robotics @ Queen Mary (ARQ), Queen Mary University of London (QMUL), initially developed for surgical applications. It will develop the technology further, with a view to utilising it in the extreme and challenging environments of inaccessible areas of buildings (initially), infrastructure networks (including sewers) and nuclear site inspection. The project will deliver a proof-of-concept prototype that will be validated in demanding environments, as well as further developing the service-robotics business model (and validating it in various industrial segments using Lean Start-up principles).

Learning collaboration affordances for intuitive human-robot interaction (The Alan Turing Institute)

  • PI: Kaspar Althoefer
  • Co-Is: Patrick Healey, Lorenzo Jamone, Ildar Farkhatdinov, Julian Hough
  • Budget: £150,000

Intelligently manipulating objects, especially in collaboration with humans, is still a widely researched topic in robotics. This project aims to create intelligent methods for natural and intuitive human-robot interaction (HRI), with the goal of equipping robots with the intelligence required to understand a given task in such a way that they can actively support a human in completing it. The focus will be on natural, intuitive tool handover in a work environment such as an operating theatre, a factory floor or the extreme environment of a nuclear power plant. While humans conduct handover tasks amongst each other with great ease, there are still many shortcomings when attempting to carry out this HRI task with existing robotic systems. The project aims to create new AI-based approaches that instil robots with capabilities to interactively negotiate object 'affordances' between humans and robots and to dynamically adapt to shifting viewpoints during the handover, for reliable HRI.