Volume 2, Issue 1
1st Quarter, 2007


What it Might "Feel" Like to be Connected to Devices That Will Expand or Enhance Human Function With Cyber Abilities

Lawrence J. Cauller, Ph.D

Page 6 of 8


Image 39 - Recent Replication

Then, here she is later, after learning to see backward: she is able to ride a bicycle. Now, the interesting thing is that when they take off the lenses, she has to learn all over again how to move about in the ‘right-side-up’ world, starting from scratch. You can't just sit there and learn; you have to actively participate, interact with the world, and make mistakes.

For this learning to occur, whether it is adaptation to inverting lenses or re-adaptation after they're removed, this sensory-motor activity must engage the basic developmental process of NeuroInteractivity involved in all conscious behavior.


Image 40 - Wearable wireless webcam

Techies are playing with virtual reality headgear that greatly distorts what they see, to study how well they can adapt to visual transformations designed to optimize the integrated flow of information from multiple simultaneous sources, such as cameras and the web.


Image 41 - Adaptation

We can further examine the capacity of the adult brain to handle sensory enhancements by considering the rapidly growing field of ‘sensory substitution’ technology being developed to assist adults with sensory disabilities. 

Much of this technology is designed to help blind people by substituting their ability to hear complex sound patterns, or to feel tactile patterns of touch, for the vision they have lost. These are some of the best examples.


Image 42 - Optacon

A well-known example is Braille, which substitutes the tactile sensations generated by touching raised patterns with the fingers for the reading of printed text. Many vision substitution devices are designed to detect obstacles and nearby moving objects to help the blind walk about safely. Several of these stimulate patterns of tactile sensation corresponding to images from forward-looking cameras, including this example that stimulates large patterns on the skin of the back, and others that stimulate the skin on the forehead.


Image 43 - Cross-modal

Here is one that stimulates the tongue, which reportedly works very well, apparently because of the high sensitivity and fine resolution of the tongue for this sort of dynamic input. The pattern of stimulation sites on this thin flexible sheet maps the image from the camera onto the receptive surface of the tongue, which users can interpret to find their way through obstacles, avoid people walking by, and so forth.

Now, you don't have to be blind to do this. This subject isn't blind. So you could use this as a way of enhancing your abilities by mapping in additional inputs.  
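To make that camera-to-tongue mapping concrete, here is a minimal Python sketch of the general idea, not the device's actual firmware: a grayscale camera frame is averaged down to a small grid of electro-tactile intensities, one value per stimulation site. The 12x12 grid and the 0-255 intensity range are illustrative assumptions.

    import numpy as np

    def camera_to_tongue_grid(frame, grid=(12, 12), max_level=255):
        """Average a grayscale frame (values 0..1) over a coarse grid,
        giving one stimulation intensity per electrode on the tongue array."""
        rows, cols = grid
        h, w = frame.shape
        trimmed = frame[:h - h % rows, :w - w % cols]    # crop to a multiple of the grid
        blocks = trimmed.reshape(rows, h // rows, cols, w // cols).mean(axis=(1, 3))
        return np.round(blocks * max_level).astype(int)

    frame = np.random.rand(240, 320)          # stand-in for one camera frame
    levels = camera_to_tongue_grid(frame)     # 12 x 12 array of intensities, 0..255

In a real device this conversion would run continuously, so that head and body movements immediately change the tactile pattern, which is exactly the kind of sensory-motor interaction discussed above.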


Image 44 - EyePlusPlus, Inc

Here is someone using the forehead imaging device which further signals color and movement with vibratory touch sensations.  


Image 45 - See with your ears

This is an interesting example of how sound can substitute for vision. Images from little video cameras embedded within the frames of these glasses here are transformed into sounds that can be heard through these small earphones.

I invite you to look at and listen to this on the Net; I don't have time for it here, but people with these sound input devices can very quickly learn to describe complex visual scenes, including the buildings, where the trees are, where the bus is, and so forth. When I casually listen to these signals, they don't make sense to me, but I guess if I submerged myself in this sensory-motor world, I could develop the ability to see this way NeuroInteractively.
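For a rough sense of how such an image-to-sound code might work, here is a minimal sketch assuming one common scheme: the image is scanned column by column, each pixel's height sets the pitch of a sine tone, and its brightness sets the loudness. The device pictured may well use a different or richer encoding.

    import numpy as np

    def sonify(image, duration=1.0, sample_rate=22050, f_min=200.0, f_max=4000.0):
        """Turn a 2-D grayscale image (values 0..1, row 0 at the top) into a
        mono waveform: columns are scanned left to right, higher rows map to
        higher pitches, brighter pixels to louder tones."""
        n_rows, n_cols = image.shape
        samples_per_col = int(duration * sample_rate / n_cols)
        t = np.arange(samples_per_col) / sample_rate
        freqs = np.linspace(f_max, f_min, n_rows)            # top row = highest pitch
        tones = np.sin(2 * np.pi * freqs[:, None] * t)       # one sine per row
        slices = [(image[:, col][:, None] * tones).sum(axis=0) for col in range(n_cols)]
        audio = np.concatenate(slices)
        return audio / (np.abs(audio).max() + 1e-9)          # normalize to [-1, 1]

    # A bright diagonal from bottom-left to top-right becomes a rising sweep.
    audio = sonify(np.eye(16)[::-1])

Writing the resulting waveform to a .wav file (for example with Python's standard-library wave module) lets you actually hear the sweep.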


Image 46 - K-Sonar

Another intriguing substitution device shown here uses sonar to probe one’s surroundings. I like this method because it emphasizes the continuous sensory-motor interactivity involved in scanning for salient sonar signals and probing an object's identity by ‘looking’ it over in the same way we use our eyes. Users of this technology can actually detect things in the dark, or around corners, that are beyond the limits of our natural senses.
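The core arithmetic behind such a sonar aid is simple time-of-flight ranging. The sketch below shows one plausible way to turn an echo delay into a distance and then into an audible pitch; the near/far limits and the pitch range are illustrative assumptions, and the real device preserves much more of the echo's character than this.

    SPEED_OF_SOUND = 343.0   # metres per second in air

    def echo_distance(delay_s):
        """Round-trip delay of an ultrasonic ping -> distance to the reflector."""
        return SPEED_OF_SOUND * delay_s / 2.0

    def distance_to_pitch(distance_m, near=0.2, far=5.0, f_near=2000.0, f_far=200.0):
        """Map distance onto a pitch: nearer objects sound higher."""
        d = min(max(distance_m, near), far)
        frac = (d - near) / (far - near)
        return f_near + frac * (f_far - f_near)

    d = echo_distance(0.006)                                  # a ping that returns after 6 ms
    print(f"{d:.2f} m -> {distance_to_pitch(d):.0f} Hz")      # ~1.03 m -> ~1689 Hz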


Image 47 - High Hopes

The great progress of neuroscience research into the neural basis of higher brain function has raised expectations that technology will soon enable direct neuro-cyber integration with the brain areas involved in the highest forms of cognitive function.

Following the success of cochlear implants, there have been several attempts to restore vision in particular, by mapping visual-like inputs directly onto the visual areas of cerebral cortex, where the maps of visual function are well established. Unfortunately, such efforts have been very disappointing, and they raise the likely possibility that such sensory prosthesis methods that directly stimulate the brain are fundamentally flawed.


Image 48 - Neuroprosthetic Research

The basic idea has been to map video images from cameras onto the visual cortex by electrically stimulating patterns of cortical activation. Unfortunately, the visual sensations evoked by cortical stimulation are very unnatural and difficult to interpret, like this very low-resolution image.
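As a rough illustration of why the result is so coarse, here is a hypothetical sketch of the kind of mapping involved: the camera frame is reduced to a small grid, and each grid cell simply decides whether its electrode fires, producing an on/off pattern of phosphenes. The 8x8 array and the brightness threshold are assumptions for illustration; real implants have only a modest number of electrodes, each requiring individual calibration.

    import numpy as np

    # A hypothetical 8 x 8 electrode array over visual cortex: each electrode,
    # when pulsed, evokes one phosphene at a roughly fixed spot in the visual field.
    N_ROWS, N_COLS = 8, 8

    def frame_to_pulses(frame, threshold=0.5):
        """Reduce a grayscale camera frame (values 0..1) to a binary pulse map:
        one bit per electrode, lit where the scene is bright enough."""
        h, w = frame.shape
        blocks = frame[:h - h % N_ROWS, :w - w % N_COLS]
        blocks = blocks.reshape(N_ROWS, h // N_ROWS, N_COLS, w // N_COLS).mean(axis=(1, 3))
        return blocks > threshold

    frame = np.random.rand(480, 640)
    pulses = frame_to_pulses(frame)
    # 480*640 = 307,200 camera pixels collapse into 64 on/off phosphenes,
    # which is part of why the evoked "image" is so crude.
    print(pulses.astype(int))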


Image 49 - Stimulation of visual cortex

Blind people have been implanted with such stimulators, and they describe what they see when their visual cortex is stimulated as a glowing spot, called a phosphene, which floats before them in space. Some have learned to interpret patterns of these unnatural visions to help them avoid obstacles while they walk about.
But even the most successful cases using such cortical stimulation methods have been very disappointing, largely because direct stimulation never evokes natural visual experiences, and because repetitive brain stimulation is known to trigger abnormal seizure activity, which compounds the medical complications of this invasive brain procedure.


Image 50 - Look, Ma, No Hands

Significant advances are being made in another field of cortical brain research, which has successfully externalized mental commands corresponding to directed movements or intentions, providing severely paralyzed people with a limited ability to directly control their environment.

In one widely reported study, monkeys were implanted with a device that picks up signals from their cortex, enabling them to control a robotic arm to feed themselves. A lot of people are working on this.
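The studies behind these demonstrations use far more sophisticated decoders, but the underlying idea can be sketched very simply: record cortical firing rates while the arm moves, fit a mapping from rates to movement, then use that mapping to drive the robot from brain activity alone. Everything below, including the simulated data, is a hypothetical illustration of that pipeline, assuming a plain least-squares decoder.

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated training data standing in for a recording session:
    # firing_rates[t, n] = spike count of neuron n in time bin t,
    # velocities[t]      = hand velocity (x, y) measured at the same time.
    n_bins, n_neurons = 500, 40
    tuning = rng.normal(size=(n_neurons, 2))                  # each neuron's directional tuning
    velocities = rng.normal(size=(n_bins, 2))
    firing_rates = velocities @ tuning.T + 0.5 * rng.normal(size=(n_bins, n_neurons))

    # Fit a linear decoder W so that firing_rates @ W approximates velocity.
    W, *_ = np.linalg.lstsq(firing_rates, velocities, rcond=None)

    # At run time, each new vector of firing rates becomes a velocity command
    # that could be sent to the robotic arm's controller.
    new_rates = firing_rates[:1]
    velocity_command = new_rates @ W
    print(velocity_command)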


Image 51 - Capturing Thoughts

Phil Kennedy [1] is one of the leading authorities on the use of such technology in humans. You may follow the links in these figures to learn more. But this cortical control technology is really most suitable for helping people under the most severe conditions, in particular those suffering from ‘locked-in syndrome': they have no other way of expressing themselves; they're conscious, but they can't speak and they can't move. A direct cortical interface is the only possible way to externalize their actions.

Image 52 - Stephen Hawking

Imagine if we could unlock someone like Stephen Hawking [2], and enable him to project his thoughts onto a graphics monitor.  Imagine what he could teach us. That's the goal of these cortical control methods, but the reality of today’s technology is not so promising.


Footnotes

[1] Phil Kennedy, M.D., Ph.D. – a clinical assistant professor of neurology at Emory University in Atlanta, developed and patented the “neurotrophic electrode” in the mid-1980s while working as a neural prosthetics researcher at Tech. His work capitalizes on the basic fact that the act of thinking prompts physical activity in the brain in the form of electrical impulses. Implanted into a patient’s brain, the electrode detects and captures those electrical signals, which are processed by customized microelectronics and software to move a cursor and select icons on the screen. In effect, the brain’s neural signals become a computer mouse.

http://www.ifi.unizh.ch/~andel/neurowiki/ March 16, 2007 2:21PM EST

[2] Stephen William Hawking – CH, CBE, FRS, FRSA (born January 8, 1942), is a British theoretical physicist. Hawking is the Lucasian Professor of Mathematics at the University of Cambridge, and a Fellow of Gonville and Caius College, Cambridge. He is known for his contributions in the fields of cosmology and quantum gravity, especially in the context of black holes, and his popular works in which he discusses his own theories and cosmology in general.

http://en.wikipedia.org/wiki/ March 16, 2007 2:50PM EST
