University of Hertfordshire’s robot you can have a conversation with
ACCORDING to the Terminator film series, robots (or cyborgs at least) are due to take over the world in 2029 – and Hatfield boffins seem to have uncovered the bots’ latest step towards planet domination!
University of Hertfordshire researchers Dr Caroline Lyon, Professor Chrystopher Nehaniv and Dr Joe Saunders have discovered that their iCub robot, named DeeChee, can learn basic language skills through interaction with us.
Word forms, such as the names of simple shapes and colours, were produced by the robot after it had a ‘conversation’ with a person.
During an experiment to show how language learning emerges, the robot could at first only babble, and perceived speech as a string of sounds rather than as divided words.
However, after a few minutes of conversation in which participants were instructed to speak to the robot as if it were a child, it began to produce the syllables it had heard most frequently.
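The frequency-based learning described above can be sketched as a toy model: count how often each syllable occurs in the speech stream and adopt the most frequent ones. This is a hypothetical illustration only, not the iTalk project's actual code, and it assumes syllables are already segmented for simplicity.

```python
from collections import Counter

def learn_frequent_syllables(utterances, threshold=3):
    """Toy sketch: adopt syllables heard at least `threshold` times.

    `utterances` is a list of strings; each string stands in for a
    stream of syllables (pre-segmented here purely for the sketch).
    """
    counts = Counter()
    for utterance in utterances:
        counts.update(utterance.split())
    # Keep only the syllables that cross the frequency threshold
    return [syllable for syllable, n in counts.items() if n >= threshold]

# Example: "red" dominates the carer's speech, so the robot adopts it
speech = ["red square red", "the red circle", "red red square"]
print(learn_frequent_syllables(speech))  # → ['red']
```

In the real experiments the robot worked from raw sound rather than neatly split words, but the principle is the same: statistical frequency, not grammar, drives which word forms emerge first.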
The tests were carried out as part of the iTalk project – which teaches robots to speak using methods similar to those used to teach children.
It also showed that although the iCub robot is learning word forms, it does not yet know their meanings; learning meanings is another part of the iTalk project’s research.
So maybe our planet is safe for the time being!
Dr Lyon explained the science behind DeeChee: “It is known that infants are sensitive to the frequency of sounds in speech, and these experiments show how this sensitivity can be modelled and contribute to the learning of word forms by a robot.”
It is thought these scientific and technological advances could have a significant impact on the future generation of interactive robotic systems.