Houseplants as Interface Devices
Author Information
Article by Virtual Worldlets Network
Copyright 12/08/2012

Disney Research has come up with a truly novel type of interface device: a way to turn any house plant into a programmable haptic controller, by taking advantage of the capacitive properties of plants.

Place a single electrode into the soil of the pot holding the house plant, and it connects to the root system of the plant. The electrode injects a weak electrical signal into the plant, sweeping a continuous range of frequencies from 1 kHz to 3 MHz. The signal is weak enough not to damage the plant or harm the human operator, but strong enough that distortions in the completed circuit can be picked up.
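
To make the sensing step concrete, here is a minimal Python sketch of the swept-frequency idea. The read_response() function and the simulated touch behaviour are stand-ins for the real electrode hardware, not Disney's actual code; only the sweep range comes from the description above.

import numpy as np

# Sweep range described above: roughly 1 kHz to 3 MHz.
FREQUENCIES = np.logspace(np.log10(1e3), np.log10(3e6), num=200)

def read_response(freq_hz, touched=False):
    """Hypothetical stand-in for the sensing hardware: returns the measured
    amplitude of the return signal at one excitation frequency. Here it is
    simulated; a touch attenuates a band of frequencies, mimicking the
    distortion the real electrode picks up."""
    amplitude = 1.0
    if touched:
        # Simulated dip centred around 100 kHz when the plant is touched.
        amplitude -= 0.4 * np.exp(-((np.log10(freq_hz) - 5.0) ** 2) / 0.1)
    return amplitude + np.random.normal(0, 0.01)

def sweep(touched=False):
    """Measure the plant's full frequency-response profile in one pass."""
    return np.array([read_response(f, touched) for f in FREQUENCIES])

baseline_profile = sweep(touched=False)
touch_profile = sweep(touched=True)

# A touch shows up as a deviation from the untouched baseline.
distortion = baseline_profile - touch_profile
print("Largest distortion near", int(FREQUENCIES[distortion.argmax()]), "Hz")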

When a part of the plant is touched, the human's hand (or an animal's nose) alters the circuit, diverting some of the current through their own body. The software is able to work out which part of the plant was touched by noting which specific frequencies are disturbed.
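
A minimal sketch of how that lookup might work, assuming the driver holds one calibrated frequency-response template per known touch location; the location names and the nearest-neighbour matching are illustrative assumptions, not the actual algorithm Disney used.

import numpy as np

rng = np.random.default_rng(0)
NUM_BINS = 200  # frequency bins across the 1 kHz to 3 MHz sweep

# Hypothetical calibration data: one response profile recorded while each
# known part of the plant was touched (labels are illustrative only).
templates = {
    "lower stem": rng.normal(1.0, 0.05, NUM_BINS),
    "upper stem": rng.normal(1.0, 0.05, NUM_BINS),
    "large leaf": rng.normal(1.0, 0.05, NUM_BINS),
}

def locate_touch(profile, templates):
    """Return the calibrated location whose stored profile sits closest to
    the live measurement (simple nearest-neighbour matching)."""
    return min(templates, key=lambda name: np.linalg.norm(profile - templates[name]))

# A live reading near the 'large leaf' template is classified as such.
live = templates["large leaf"] + rng.normal(0, 0.01, NUM_BINS)
print(locate_touch(live, templates))  # -> large leaf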

This works consistently across different plants of the same species: whatever an individual plant actually looks like, touching a given part of it will always produce roughly the same frequency signature. However, every plant species has a different internal circuit, and needs to be treated differently.

So the driver software has to be told what species a plant actually is before it can interact with it. Otherwise it will consistently misjudge which part of the plant you are touching.
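
A sketch of that constraint, assuming the driver keeps one calibration profile per supported species; the species names, file names and PlantDriver class are purely illustrative, not part of Disney's software.

# Hypothetical per-species calibration profiles; contents are illustrative.
SPECIES_PROFILES = {
    "philodendron": {"sweep_hz": (1e3, 3e6), "templates_file": "philodendron.npz"},
    "bamboo": {"sweep_hz": (1e3, 3e6), "templates_file": "bamboo.npz"},
}

class PlantDriver:
    """Refuses to interpret touches until told which species it is wired to."""

    def __init__(self, species):
        if species not in SPECIES_PROFILES:
            raise ValueError(
                f"No calibration profile for {species!r}; the driver cannot "
                "tell which part of the plant is being touched without one."
            )
        self.profile = SPECIES_PROFILES[species]

driver = PlantDriver("philodendron")  # fine
# PlantDriver("cactus")               # raises: species not calibrated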

Because a continuous range of frequencies is used, they can be detected in parallel, which makes the plants multi-touch capable. If you stick your whole hand into a leafy plant and wriggle your fingers around, every point where one of your fingers touches a leaf or stem produces its own frequency signature. So even if you are in contact at a couple of hundred different points, they are all tracked in real time.
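
The sketch below illustrates the parallel-detection idea under a simplifying assumption: each simultaneous contact disturbs its own run of frequency bins, so one pass over the sweep yields every active touch. The grouping logic is an illustration, not the published method.

import numpy as np

def active_contacts(profile, baseline, threshold=0.1):
    """Group frequency bins whose response deviates from the untouched
    baseline into separate contact points, so several simultaneous touches
    can be reported from a single sweep. Contacts are approximated here as
    contiguous runs of disturbed bins."""
    disturbed = np.abs(profile - baseline) > threshold
    contacts, start = [], None
    for i, flag in enumerate(disturbed):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            contacts.append((start, i - 1))
            start = None
    if start is not None:
        contacts.append((start, len(disturbed) - 1))
    return contacts  # list of (first_bin, last_bin) per simultaneous touch

# Example: two separate disturbances are reported as two tracked contacts.
base = np.ones(50)
live = base.copy()
live[5:9] -= 0.3     # first finger
live[30:33] -= 0.25  # second finger
print(active_contacts(live, base))  # -> [(5, 8), (30, 32)]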

The software itself utilises a neural network, so it is capable of learning more complex interactions and recognising them when they are used again. The researchers have been using this to train a variety of complex commands into all manner of plants. Hugging a tree, for example, is recognised as a hug regardless of the height or body mass of the person doing the hugging. Additionally, when the end-user is running a copy of the driver, it never stops learning, so highly personal interactions will be recognised over time.

The tree is hugged many times in similar ways. Through this process, the software learns to recognise the intent of a hug, even when different parts of the tree are touched, so long as the same basic hand configuration is used.
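
As a rough illustration of that training process, the sketch below fits a small neural network (scikit-learn's MLPClassifier standing in for whatever network Disney actually used) to frequency-response profiles labelled "hug" or "touch". The data-generation step is entirely synthetic, invented for the example.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
NUM_BINS = 200

def fake_profile(kind):
    """Synthetic stand-in for a recorded profile: a hug disturbs a broad
    band of frequencies (many points of contact), a single touch only a
    narrow one."""
    base = np.ones(NUM_BINS)
    if kind == "hug":
        base[20:180] -= rng.uniform(0.2, 0.5)
    else:
        start = rng.integers(0, NUM_BINS - 10)
        base[start:start + 10] -= rng.uniform(0.2, 0.5)
    return base + rng.normal(0, 0.02, NUM_BINS)

X = [fake_profile(k) for k in ["hug", "touch"] * 100]
y = ["hug", "touch"] * 100

# A small network learns to separate the gestures, and could keep being
# refit as the end-user supplies new, personalised examples over time.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X, y)

print(clf.predict([fake_profile("hug")]))  # -> ['hug']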

In addition, because the circuit for a given species of plant is the same for all members of that species, that circuit can be recreated in an artificial plant, so that touching it triggers the same responses as touching a living one.

Recognised touches are handled inside the computer as programmable input codes, which can be tied to any task the programmers' minds could envisage. They displayed their prototype, codenamed Botanicus Interacticus, at SIGGRAPH 2012, using a mix of half a dozen living plants and half a dozen artificial plants, each hooked up to a display screen and audio output. Some played notes as users touched different parts of the plants; others commanded swirling patterns on big-screen monitors. One was even used as a calendar, with each day accessed by touching a different leaf.
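
That demo behaviour maps naturally onto a simple event-dispatch pattern; the sketch below is only an illustration of the idea, with event names, fields and handlers invented for the example rather than taken from the exhibit's software.

import datetime

def play_note(event):
    print(f"playing a note for a touch on the {event['location']}")

def show_pattern(event):
    print("drawing a swirling pattern on the big-screen monitor")

def open_calendar_day(event):
    # e.g. the third leaf opens the entry two days from today
    day = datetime.date.today() + datetime.timedelta(days=event["leaf_index"])
    print(f"opening calendar entry for {day}")

# Each plant's recognised touches are tied to whatever task was programmed.
HANDLERS = {
    "note_plant": play_note,
    "pattern_plant": show_pattern,
    "calendar_plant": open_calendar_day,
}

def dispatch(event):
    """Route a recognised touch event to the task it has been tied to."""
    HANDLERS[event["plant"]](event)

dispatch({"plant": "note_plant", "location": "large leaf"})
dispatch({"plant": "calendar_plant", "leaf_index": 2})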

None of these uses are very practical right now, but practicality was not really the point. Botanicus Interacticus was intended more to demonstrate that living plants can be used as haptic interface devices for VR and AR applications than to show the practical use of such devices.

In truth it is somewhat difficult to imagine a practical use for this technology in its current state. It may prove more useful when paired with other technologies to monitor plant health, or to connect a responsive companion AI to a favourite plant, deepening the connection between the two. That would be particularly useful for those who, through circumstances beyond their control, spend more time bonding with plants than with people.

As it is though, it remains a simple proof of concept and little more.
