In July 2019, I was lucky enough to meet Dr Alessandro Di Nuovo from Sheffield Hallam University, an expert in the field of Artificial Intelligence and Robotics. Dr Di Nuovo leads the Smart Interactive Technologies (SIT) Research Group and conducts internationally renowned research. As part of this research he is exploring robotics for healthcare and well-being, specialising in human-robot interaction.
My interview with Dr Di Nuovo took place at his research lab at the university, where I was able to meet and interact with several robots, including Pepper, manufactured by SoftBank Robotics. Read on to find out more about robots and what Dr Di Nuovo has to say about them and their place in the future…
Humanoid, because of the interactivity you get. The robot acts like an eye: it absorbs facial features and gestures and then interacts.
At school, I enjoyed computer programming and Artificial Intelligence. Studying these in depth, I realised that they are fundamentally connected to Robotics. Indeed, we can program the AI to use the robotic body to improve interaction while providing assistance and comfort to the elderly and to children.
The work we are doing at Sheffield Children’s Hospital is to help children with anxiety. The robots are there to interact with the children and help them understand the procedures or activities they are about to undergo.
One example is a robot having a plaster put on it – the child then knows what is going to happen. Robots are also good at mimicking, so they can perform certain movements and have the children imitate them. This also has uses in physiotherapy.
Another use of robots is interaction with the elderly. Social isolation amongst elderly people means that many do not see, let alone speak to, other people during the day. Radio and television are very passive, in that the elderly just listen. Robots can be used as social interactors – through talking and physical movement they stimulate interaction. I have done several projects looking at the impact robots have on social isolation among the elderly.
Robots are also being used to pre-screen for dementia in the elderly. In the absence of doctors or practitioners, robots can screen for signs of decreased neurological capacity. In these cases the tests are unbiased and gather data that can then be sifted to give a decision: either no further action is required, or the person is referred to a doctor for further investigation. There are further medical diagnostic uses for robots – robotic imaging and detection is already widely used in medicine.
At this stage in the interview we took a break so that I could try the dementia screening test. I found that the level of interaction was non-threatening and could see the advantage of using this diagnostic tool.
One concern is that robots are only as good as the human programmers who program them. There is scope for false negatives, i.e. cases where an issue that needed further investigation is missed. False positives can be rationalised just as when humans do the screening – and further investigation would then give a clear diagnosis.
Yes, but only because they are programmed to have feelings and to show emotions. These emotions and feelings can be conveyed through the connection the person has with the robot. The robot can also show feelings and emotions through facial expressions, voice and gestures.
Do you agree with Dr Di Nuovo? Do you want to see more (or fewer) robots used in education and medicine? What do you think are the impacts of using robots in these fields?
Let us know your thoughts by leaving a comment below.
This article could not have been written without the enthusiasm and generous amount of time given by Dr Di Nuovo.