Robotic Insect: Guided by the Same Stimuli
It's nowhere near as small as your average insect, but a small robot about six inches long, created by German researchers, matches one in another, more important way: its sensorimotor network is modelled as closely as possible on that of a beetle. This enables it to react to stimuli, and to learn to navigate, much as a small insect would.
The prototype robot was created by a joint German research team: a group from the Freie Universität Berlin, part of the Bernstein Fokus Neuronal Basis of Learning, and a second group from the Bernstein Center Berlin, both based in the same city.
The basic concept was to create a physically embodied AI capable of learning to navigate its environment with as little direct programming as possible. Instead, the rules governing how the sensorimotor network processes incoming data are based on the insect model, and the robot has to work out how to behave from there. It is essentially a needs-based AI system, given a physical embodiment and modelled after some of the simplest neural-network-possessing creatures in nature.
As the leader of the joint research team, Professor Martin Paul Nawrot of Freie Universität Berlin, put it: "Much like honeybees learn to associate certain flower colours with tasty nectar, the robot learns to approach certain coloured objects and to avoid others."
Obviously the stimuli are highly artificial; the robot gains sustenance not from its environment but from an on-board power pack. However, many of the dangers of nature are relatively easy to simulate by means of a second program, not directly linked to the first, whose task is far simpler: to recreate pleasure and pain responses (or rather their equivalents), outside the control of the more complex AI nominally in charge of the robot, but still in response to stimuli. In this way a red patch on the ground, for example, becomes harmful and a yellow patch even more so, while blue may be safe. If the robot finds a 'food' colour, say green, the secondary program starts filling a 'health' meter, and so on.
The objective, in other words, is to use the simpler program to replicate physical 'bodily needs' and the effect of fulfilling them. The AI thus behaves like any other insect, driven to meet the needs simulated for its body.
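The secondary "body" program described above can be sketched in a few lines. This is a minimal illustration only: the colour-to-harm mapping, the health values, and the class names are assumptions for the sake of the example, not the researchers' actual implementation.

```python
# A toy sketch of the secondary 'body' program: a health meter that
# responds to coloured patches, outside the control of the main AI.
# All colour effects and numbers here are illustrative assumptions.

# Assumed mapping from ground-patch colour to health change per step
PATCH_EFFECTS = {
    "red": -1.0,     # harmful
    "yellow": -2.0,  # even more harmful
    "blue": 0.0,     # safe
    "green": +0.5,   # 'food': refills the health meter
}

class BodySimulator:
    """Tracks a 'health' meter and emits a pleasure/pain signal."""

    def __init__(self, max_health=100.0):
        self.max_health = max_health
        self.health = max_health

    def step(self, patch_colour):
        """Apply the effect of the patch currently under the robot.

        Returns the health change, which the main AI can treat as a
        reward (positive) or punishment (negative) signal.
        """
        delta = PATCH_EFFECTS.get(patch_colour, 0.0)
        self.health = max(0.0, min(self.max_health, self.health + delta))
        return delta

body = BodySimulator()
print(body.step("yellow"), body.health)  # pain signal, health drops
print(body.step("green"), body.health)   # 'food' signal, health recovers
```

The key design point is the separation: the main AI never reads `PATCH_EFFECTS` directly; it only sees the reward/punishment signal, just as an insect feels pain without knowing its cause.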
"Within just seconds, the robot accomplishes the task to find an object in the desired colour and to approach it," explained Nawrot. "Only a single learning trial is needed, similar to experimental observations in honeybees."
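The single-trial conditioning Nawrot describes can be illustrated with a one-shot associative rule: one pairing of a colour with a reward or punishment is enough to fix the behaviour. This is a deliberately simplified stand-in, not the spiking-network learning rule the robot actually uses.

```python
# A toy sketch of single-trial colour conditioning, loosely inspired
# by the honeybee analogy. One pairing is enough to set behaviour;
# the class and method names are illustrative assumptions.

class ColourLearner:
    def __init__(self):
        self.valence = {}  # colour -> learned value

    def learn(self, colour, outcome):
        """A single colour/outcome pairing fixes the association."""
        self.valence[colour] = outcome

    def action(self, colour):
        """Approach learned-good colours, avoid learned-bad ones."""
        v = self.valence.get(colour, 0.0)
        if v > 0:
            return "approach"
        if v < 0:
            return "avoid"
        return "explore"

bot = ColourLearner()
bot.learn("green", +1.0)  # one trial: green paired with 'nectar'
bot.learn("red", -1.0)    # one trial: red paired with 'pain'
print(bot.action("green"))  # approach
print(bot.action("red"))    # avoid
print(bot.action("blue"))   # explore (never seen before)
```

In the real robot the association is stored in synaptic weights of a spiking network rather than a lookup table, but the behavioural outcome, approach after one rewarded trial, is the same.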
The current robot prototypes are, of course, extremely simple, but they offer insight in both directions: into robotics and into the insect nervous systems they imitate.
Conditioned behavior in a robot controlled by a spiking neural network (paper, paywalled)