Sol Bianca: The Legacy Take on Gaze-Directed Neural-controlled Weaponry

Sol Bianca: The Legacy is not a VR series, nor, in the main, related to realistic technology at all. It takes place on a futuristic starship, long after the fall of our civilisation, when man's greatest technological achievements are nothing but mysterious artifacts which defy current science.

Despite that, there are two elements to the series which are very relevant to actual science and to interaction with VR or AR systems. Both occur in episode three of the series; sadly, once that episode is over, we never see this weapon again.

There are six episodes in total, all released in 1999, and none of the others contain anything pertinent to our fields of interest.

The F-310 Mega-Arms is not a real weapon of course, but the ideal it embodies is a long-held dream of military minds around the world, and something that is actively being pursued: a lot of DARPA's work in eye tracking and brain-machine interfaces leads towards something very much along the same lines as this weapon. Nothing it does is theoretically impossible, and quite a bit of the episode is spent explaining how it works.

In the episode, it is needed because terrorists have claimed the pirate ship Sol Bianca for themselves, and all but one member of the crew are being held hostage. The remaining member needs something capable of taking the ship back when she is severely outmanned and outgunned, and finds it in a very reputable black market weapons merchant's office. The rapidly increased reaction speed it offers, as well as the special features, more than make up for the difference in relative numbers and firepower, especially in the hands of a trained soldier.

The first shot we see of weapons expert Janny Mann, trying on the F-310 Mega Arms for size.

It is a direct neural interface weapon, tapping into the user's thoughts and intentions. By reading the mind of the soldier wearing it - it is a big gun - the reflexes of the peripheral nervous system are effectively bypassed and the gun fires long before the finger could pull a trigger. Firing is pretty much instantaneous once the decision to fire has been made.

The in-episode demonstration goes like this. All text in bold and italics is taken directly from the English dub of the original episode; all other text is our own.

Weapons Seller: There, I guess that does it. A shoot-navi system is what the thing is called. The components all communicate.

Weapons Seller: Okay, stand... Whoa! Are you all right?


The whole outfit is the gun. Janny stumbled when she first stood up because of the size of the thing. As this image shows, the gun itself is actually attached to her hip, and its weight was pulling her over to the right, so she had to compensate with her stance. It is actually held by her hips and torso, not her arm, so her arm does not get tired from the weight.

Not visible here, there is even more of the device on her back: a backpack linking her hip unit with her shoulders. This carries as much of the weight of the device as possible, close to the wearer's centre of gravity. It also serves to reinforce the user's spine, with the two arm straps shown attached to it to keep the unit tight against the torso.

The suit itself is also part of the unit, possibly with electromyography sensors below its surface to monitor muscle contractions. Whilst not a technology featured in the episode, it is one we already have, and it could serve as a backup triggering mechanism should the neural interface fail.
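As a minimal sketch of how such an EMG backup trigger could behave, assuming a hypothetical sensor that reports rectified muscle activation as a value between 0 and 1 (nothing in the episode specifies this):

    # Hypothetical EMG backup trigger. The sensor feed and threshold values
    # are assumptions for illustration, not anything shown in the episode.
    EMG_FIRE_THRESHOLD = 0.85   # sustained contraction needed to fire
    HOLD_SAMPLES = 5            # consecutive samples that must exceed it

    def emg_backup_trigger(samples, neural_link_ok):
        """Fire from muscle input alone, but only if the neural link is down."""
        if neural_link_ok:
            return False        # primary neural trigger remains in charge
        recent = samples[-HOLD_SAMPLES:]
        return len(recent) == HOLD_SAMPLES and all(
            s >= EMG_FIRE_THRESHOLD for s in recent)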

Finally, the neck of the suit is rigid, extending up into the helmet, so the weight of the combat HUD is held by the suit's supports and not the neck bones of the wearer. This goes a long way to avoiding simulation sickness, and supports the neck when carrying the extra weight.

Weapons Seller: Lady, what you are holding is the Voltex 310. The latest in concept-arm hardware, and right now, the company wants product monitors.

Weapons Seller: Since the navi and gun are linked, they operate as a unit. Try it.

Janny Mann: There's no trigger.

Weapons Seller: You fire the gun by using the software.

Weapons Seller: Okay? Now, when you fix your sight on the target, the sensor will automatically lock in and tell you it's ready.

Machine (to Janny alone): Ready

Weapons Seller: (taps head) Up here. See, it's your brain that's the trigger, so when your mind commands it to fire, the gun will shoot.


The view through the HUD interface clearly defines what is in front of the user and what is to either side. The cross in the middle of the screen is where the user is looking; as she moves her eyes about, the cross moves with them. The gun also has a tendency to follow that cross, using internal motors. However, these motors are not strong enough to pull the user's arm with them, and they disengage when the arm in the handhold tugs another way.

They are there to help guide the arm and gun to where the user is looking, not to automatically place them in the same alignment. If the user wishes to look one way and fire another, that is perfectly possible.

The system does, however, track eye movement, much as modern HUDs do. When the user fixes on a target at a given range, the LiDAR units at the front of the gun fire a laser and take the range. The weak motors in the gun then guide the user's hand to the optimal position to hit that target, at that range, with the ammo type selected.
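As a rough illustration of that chain - gaze fix, laser rangefinding, then an aiming correction for the selected ammo - the following sketch compensates for bullet drop over the measured range. The muzzle velocities and the flat-fire approximation are assumptions made purely for the example:

    import math

    # Assumed muzzle velocities per ammo type (m/s); not from the episode.
    MUZZLE_VELOCITY = {"standard": 900.0, "safety-slug": 300.0}

    def aim_elevation(range_m, ammo="standard"):
        """Elevation (radians) to add above the gaze line so the shot
        lands on the target at the range reported by the LiDAR unit."""
        v = MUZZLE_VELOCITY[ammo]
        t = range_m / v               # approximate time of flight
        drop = 0.5 * 9.81 * t * t     # gravity drop over that time
        return math.atan2(drop, range_m)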

Janny Mann: Hmm.

Machine (to Janny alone): No Bullets. Please Reload.

Weapons Seller: All you've got to do is think about the type of shot you want. Single, quick-burst, or full auto. You can shoot safety slugs too. Oh yeah, and grenades.

No, the gun does not manufacture bullets on demand. Because the components are linked, and each knows where the others are in 3D space, when the user looks at their gun they get an overlay telling them immediately how much of each ammo type is left in it. This lets them know whether they need to reload or not.

Safety slugs (rubber bullets) and grenades exit via one muzzle; all other shots use the other. The interior mechanics automatically switch the ammo types around at a single mental command: one of the three types for the upper barrel, or one of the two for the lower. The two bands in the earlier picture (above) show what is currently loaded for each barrel. Obviously there will be a slight delay when changing between types.
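A toy model of that bookkeeping might look like the following; the magazine sizes and type names are guesses for illustration only:

    # Hypothetical ammo bookkeeping for the two-barrel layout described above.
    UPPER_MODES = ("single", "quick-burst", "full-auto")   # all feed from ball ammo
    LOWER_MODES = ("safety-slug", "grenade")

    class AmmoState:
        def __init__(self, ball=200, safety_slugs=30, grenades=6):
            self.counts = {"ball": ball, "safety-slug": safety_slugs,
                           "grenade": grenades}
            self.upper_mode = "single"
            self.lower_mode = "safety-slug"

        def select(self, mode):
            """Switch either barrel's mode on a single (mental) command."""
            if mode in UPPER_MODES:
                self.upper_mode = mode
            elif mode in LOWER_MODES:
                self.lower_mode = mode

        def hud_overlay(self):
            """What the user sees when they look at the gun."""
            return {"upper": (self.upper_mode, self.counts["ball"]),
                    "lower": (self.lower_mode, self.counts[self.lower_mode])}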

Weapons Seller: Heh. If you're packing one of these babies, you can fight a war all by yourself.

Overall, there is nothing about this weapon which is, in concept at least, beyond our ability to develop for real. Most of the processing power is in the backpack, along with the power supply and (quite possibly) the bullets themselves. The gun section is quite light, as it would have to be to be toted around all day, although the sheer mass of the whole unit does require the user to adopt a new way of standing and moving in general, so that their centre of balance is not thrown off too much.

A nice feature not covered directly is the helmet, although we do see it in action later in the episode, and it is quite easy to infer from that what is going on. If we look at the initial image again:



We can see there are multiple cameras attached to the unit. Normally, vision is carried out through the front camera, which records events in front of the user, overlays virtual data, and then relays the result to the monitors on the inside of the 'sunglasses'. Those are not actually see-through; they are simply holding frames for the display screens, whichever type those may be. The other four cameras scattered around handle peripheral vision, and their output is added to the first, so the user retains normal surround vision. Unlike natural vision, however, since it is all camera-based, if the user focuses on, say, their left-side peripheral vision, they can see it with just as much clarity as they could straight ahead, without actually turning their head - the system can compensate and move the display by tracking pupil movement or the optic muscles. We have possessed such technology for over two decades.
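Conceptually, the compositing amounts to picking whichever camera covers the direction the pupils are aimed at and bringing its feed up to full clarity. A very rough sketch, with the camera layout and gaze angles entirely invented for the example:

    # Hypothetical camera selection for the surround-vision helmet. Cameras
    # are named by the direction they cover (yaw, pitch in degrees); the gaze
    # angles would come from a pupil tracker. All values are assumptions.
    CAMERAS = {"front": (0, 0), "left": (-90, 0), "right": (90, 0),
               "top-left": (-45, 60), "top-right": (45, 60)}

    def active_camera(gaze_yaw, gaze_pitch):
        """Pick the camera best aligned with where the eyes are pointing."""
        def offset(cam):
            yaw, pitch = CAMERAS[cam]
            return abs(yaw - gaze_yaw) + abs(pitch - gaze_pitch)
        return min(CAMERAS, key=offset)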

So, if there is a threat from above, look up and the two cameras mounted on top of the helmet give a better view than your eyes naturally would. The system is tracking your eye and head movement, and the motors are working with your arm to guide the gun... a trained operator can be pointing the gun exactly at the assailant coming at them from above, know the bullet will hit, and fire off a short burst with their mind.

As to the mind-controlled aspect: that is the key part of the interface, and the obvious reason for the sheer size of the helmet. It is a non-invasive system, as demonstrated by the fact the helmet can be put on and taken off like a bike helmet. That suggests a 3D EEG system, similar to what we have available today, but a little more precise. It is unlikely to be detecting the complex thought of 'fire a bullet now'; much more likely it is looking into the hindbrain and tracking the neurons tied to aggression. When the instinct to 'kill' surfaces, the gun fires at whatever it is aimed at. It is a simple and elegant solution that would have by far the fastest response time. Since you never point a loaded weapon at anything you wish to keep, the moral aspect of the act takes care of itself.
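Reduced to its simplest form, that trigger is just a threshold detector over one decoded signal. A sketch, with a made-up 'intent to fire' score standing in for whatever the EEG decoder would actually output:

    # Hypothetical fire decision: an EEG decoder (not shown) produces an
    # intent score between 0 and 1; the gun fires only while a target lock
    # is held and the score spikes. The threshold is an assumption.
    FIRE_THRESHOLD = 0.9

    def should_fire(intent_score, target_locked, safety_on):
        return (not safety_on) and target_locked and intent_score >= FIRE_THRESHOLD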

Changing ammo types is a bit more tricky; logically, this would have to be a higher thought process, in which the gun is trained or calibrated to the particular neural pathways of a given user. This is, again, what happens with modern neural implants - the implant has to learn to recognise the thoughts of the user rather than the other way around. What this involves is the user thinking of a concept which the gun learns to read as, say, 'short burst fire', by receiving that input when it is told it will receive the command for short-burst-fire mode, and, through pattern matching and repeated exposure, learning to recognise subtle variations of that thought as the same thing.
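In outline, that calibration loop is: prompt the user to think the command, record the signal, and fit a recogniser on the labelled recordings. A sketch of the idea using a nearest-template match, with the feature vectors assumed to come from some EEG front end:

    import numpy as np

    # Hypothetical calibration sketch. Each 'thought' arrives as a feature
    # vector from the EEG front end; training stores a mean template per
    # command, and recognition picks the closest template. All assumptions.
    class CommandRecognizer:
        def __init__(self):
            self.templates = {}      # command name -> mean feature vector

        def train(self, command, recordings):
            """recordings: feature vectors captured while the user was told
            to think about this command, e.g. 'short burst fire'."""
            self.templates[command] = np.mean(np.asarray(recordings), axis=0)

        def recognize(self, features):
            """Return the trained command closest to this thought."""
            features = np.asarray(features)
            return min(self.templates,
                       key=lambda c: np.linalg.norm(features - self.templates[c]))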

When, in other fictitious uses of brain-machine interfaces, the interface has to be thought at in the correct language, this is also what is going on. Thoughts in different languages use slightly different concepts to mean similar things, and since it is the concept that is tracked, not the language, the thought has to be expressed in the right context in order to be recognised. Hence something thought in English concepts might not necessarily produce the same thought pattern when considered in Russian (as in the film Firefox).

The suit is also very likely to be transmitting continuous position data to the gun's control system. In a system such as this one there will be at least one joint sensor at the knee, elbow, hips, down the back, and so on, to monitor the relative position of the user's joints. As stressed earlier, every unit talks to the others. If the user is crouched, they can take more oomph from the gun firing than they can if they are standing normally. This would be an invaluable datastream to a smart gun, telling it when the user's body position is ideal for more power in a shot, and when it is less than ideal. It could warn them to brace more before firing one of the RPGs, for example. Additionally, it helps in calculating the gun's own height relative to the target - something essential for the gravity calculations, as the gun fires with greater or lesser force according to the range of the target from the gun itself, most likely using a linear induction accelerator to push the bullets out of the front end at greater than normal speed.
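One way to picture that stance check is a simple rule over the joint angles the suit is already streaming. The joint names, angles, and the power cap below are all invented for the sketch:

    # Hypothetical stance check driven by the suit's joint-angle stream.
    def braced_for_recoil(joints):
        """joints: dict of joint name -> flexion angle in degrees."""
        knees_bent = (joints.get("knee_left", 0) > 20 and
                      joints.get("knee_right", 0) > 20)
        leaning_in = joints.get("hip_flexion", 0) > 10
        return knees_bent and leaning_in

    def shot_power(joints, requested):
        """Cap launch power (e.g. for a grenade) when the stance is poor,
        letting the HUD warn the user to brace before full power is allowed."""
        return requested if braced_for_recoil(joints) else 0.6 * requested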

Fictitious Elements

There is one element of the weapon shown later in the episode that really is pure science fiction: a cloaking system that allegedly renders the user invisible across the entire electromagnetic spectrum for 'stealth runs'. If such a thing is possible, modern physics, even theoretical physics, has no idea how it might be achieved. So we will treat this one 'feature' as creative writing and nothing more.

Overall

Overall, it is a big, powerful, and scarily efficient gun, hooked directly into the user's brain and analysing her movements for the most effective shot. It uses a suit which would have to be adjusted for each user, but is in separate sections which communicate wirelessly via a body-area network and a unique protocol. There is some danger of an external force hacking into the system because of this, but by keeping the communication signal to a low power, a transmission range of a few inches, and a bespoke protocol, this danger can be minimised. There is nothing to stop the user from requesting that a new, dynamically generated wireless protocol be used by the gun each time it is picked up. This would all but eliminate the possibility of hacking. After all, it's not as if it is ever going to communicate with other systems - except the person wearing it.
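The 'new protocol each time it is picked up' idea amounts to rolling fresh session keys between the suit's components whenever the gun is donned, so that stale or spoofed packets are rejected. A sketch using standard library primitives, with the initial pairing step (how the components exchange the key over the short-range link) assumed away:

    import hmac, hashlib, secrets

    def new_session_key():
        """Generate a fresh 256-bit key every time the gun is picked up."""
        return secrets.token_bytes(32)

    def seal(key, payload):
        """Prefix an authentication tag so other suit components can reject
        packets that were not produced with the current session key."""
        tag = hmac.new(key, payload, hashlib.sha256).digest()
        return tag + payload

    def open_sealed(key, packet):
        """Return the payload if the tag verifies, otherwise None."""
        tag, payload = packet[:32], packet[32:]
        expected = hmac.new(key, payload, hashlib.sha256).digest()
        return payload if hmac.compare_digest(tag, expected) else None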

What is perhaps scariest about this weapon is that there is literally nothing about it that we could not construct using a mixture of commercial off-the-shelf and in-laboratory technology available right now. Sure, it might not be as fast or as accurate using modern interface technology, but we could certainly build it, and have it work. In time, perhaps as little as another decade, we will have reached the point where we can build it as pictured here, have it be just as accurate and just as deadly - and trump every 'itchy trigger finger' weapon on the market today.

Further Reading

Large Image Display: Sol Bianca: F-310 Mega-Arms Neural Interface Gun

Blue Thunder - Gaze-Directed Neural-Controlled Weaponry

The Firefox Fictitious Neural-Controlled Fighter Plane

Sol Bianca: The Legacy Take on Virtual Schooling

VWN Resource Category: Neuroprosthetics

VR Interfaces: Heads Up Display

Moving Mountains With the Brain, Not a Joystick

References

Sol Bianca: The Legacy at IMDb

Sol Bianca: The Legacy was a 1999 release by Geneon Universal Entertainment. English dub versions are available; however, the series is not currently in production. At the time of writing, Amazon.com still had six used copies.
