Tuesday, March 17, 2015

I See, Said The Blind Man


There was a Star Trek fan named George La Forge. A muscular dystrophy patient, George became friends with Gene Roddenberry at Trek conventions. Years later, when proposing a new Star Trek series, Roddenberry named a character in George’s memory (he had died in 1975) - Geordi La Forge.



The irony is so thick – LeVar Burton said that wearing the VISOR that gave his character sight took away 90% of his vision. He tripped over everything on the set for the first couple of years. In later movies of the franchise, he used ocular implants so that he didn’t have to wear the visor, just contacts.
The character played by LeVar Burton was articulate, funny, intelligent, and a more than capable engineer. But mostly what people remember is that he was the blind man who could see better than sighted men. No, he didn’t necessarily have insights into their souls; he could literally see things they couldn’t. The mechanism of his sight, the VISOR (Visual Instrument and Sensory Organ Replacement), consisted of – what else – a visor.

It fit over his eyes and picked up electromagnetic radiation. The signals were processed and transduced (converted) to electricity. These were sent along wires that went into his temples (where the bolts on the side of the visor were) and attached to his optic nerves. All of this to relay electric impulses based on what EM waves the visor detected.

On each optic nerve was an implant that transduced the electrical signals so that they stimulated the individual neurons in the optic nerve responsible for assembling the picture. Think of it as a TV screen in your mind. There are certain neurons responsible for every pixel. Stimulate the right ones and you can build a picture. It isn’t anywhere near that simple, but we don’t have time for a discussion of hypercolumns and visual processing.

Geordi’s visor went nature one better. It expanded his visual range beyond that of visible light. Humans detect only about 1% of the EM spectrum. Visible light is just about in the middle of the spectrum, but below it is infrared (heat) and microwaves, and above it are ultraviolet, X-rays, and gamma rays.


Goldfish sense infrared as noise in their rods and cones, not with photoreceptors specific for those wavelengths, or with a pit like snakes use. This allows them to hunt in murky waters. Only juveniles see UV, and they do so with specialized cone photoreceptors just for UV wavelengths. Why? I don’t know; UV is higher energy and damages those cones, so they are gone by adulthood.
Butterflies can see in the ultraviolet range; it shows them patterns on flowers. They use these patterns more often than colors or shapes to find nectar. Snakes can sense infrared waves, although they do it with their pit organs not their eyes. On the other hand, piranha and goldfish do sense infrared with their eyes – heck, goldfish are the only animals that can see both infrared AND ultraviolet light!

So the question is – how close are we to producing a Geordi La Forge visor? Closer than you think – close enough so you can buy one now.

Believe it or not, research into artificial sight started way back in 1792. Alessandro Volta (does the name sound like a word you’ve heard?) connected two copper wires to a bimetallic pile he had constructed. By the way, bimetallic piles are basically batteries – he invented the battery! He connected one wire to the corner of a person’s eye, and the other he touched to the roof of their mouth.

The person saw blobs of light, even in a darkened room; electricity controlled what the eye would “see.” This is exactly the technology we’re using to help the visually impaired regain their sight; we just use slightly more refined systems to stimulate the optic nerve via neural prostheses.

The Argus II (Second Sight, Sylmar, CA) is on the market now in the U.S. and Western Europe. This system uses a camera mounted on a pair of glasses. The recorded images are processed and converted to electrical impulses in a handheld unit, and these signals are broadcast to a radio receiver implanted behind the ear or under the eye. This then relays the signals by micro-wire to an implant in the patient’s retina.

Even if some of a person's retinal photoreceptor cells no longer work, as in age-related macular degeneration or retinitis pigmentosa (two of the most common causes of progressive blindness in the U.S.), the retinal ganglion cells that take the receptor information and transmit it to the visual cortex are often still intact.


Visual prosthetic implants can be epi- or subretinal, clamped on the optic nerve, fed into the lateral geniculate nucleus, or attached directly to the visual cortex of the brain. The cortex offers the easiest surgical target and the maximal amplification of the signal.
This is why a retinal implant can work in some blind people. The Argus II has sixty electrodes attached to different retinal ganglion cells with microwires. The electrical impulses sent from the camera fire the electrodes in a pattern, and this triggers nerve impulses in the proper retinal ganglion cells.
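To make the idea concrete, here is a minimal sketch in Python of how a camera frame could be boiled down to sixty stimulation levels, one per electrode. This is emphatically not Second Sight's actual software; the 6 x 10 grid layout, the simple brightness averaging, and the frame size are assumptions for illustration only.

```python
# Illustrative sketch only (not the Argus II's real processing): collapse a
# grayscale camera frame into one brightness level per electrode in an
# assumed 6 x 10 array.
import numpy as np

def frame_to_electrodes(frame, rows=6, cols=10):
    """Average a grayscale frame down to rows x cols electrode levels."""
    h, w = frame.shape
    levels = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            patch = frame[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            levels[r, c] = patch.mean()   # brightness of this patch of the scene
    return levels                         # 60 values, one per retinal electrode

# A 480 x 640 frame collapses to 60 numbers that would set the pulse strength
# on each electrode.
camera_frame = np.random.randint(0, 256, (480, 640)).astype(float)
print(frame_to_electrodes(camera_frame).shape)   # (6, 10)
```

The real system obviously does far more (contrast enhancement, pulse timing, per-patient fitting), but the basic trade-off is visible even here: sixty numbers makes for a very coarse picture.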

About 100 people have been fitted with the system, at about $145,000 each, but that doesn't mean that 100 people who couldn't see much now see perfectly. Extensive training is needed to help the patient interpret what they can now “see.” This is because, unfortunately, current systems just make some cells fire; they don't perfectly match true sight – vision is too complicated for that.

Yes, I said current systems – plural. There are more ways to do this. The Alpha IMS system (Germany) doesn’t use glasses and a camera. Its subretinal neural prosthesis both detects light (via photodiodes) and directly passes those signals to 1500 electrodes attached to the retinal ganglion cells. It’s all self-contained and powered by a wireless coil implanted under the skin behind the ear.

The testing on the Alpha IMS was reported in 2013. Over a nine-month period, test subjects had a significant improvement in object detection and field of vision. Almost half could spontaneously, or with training, read letters. The safety study was concluded in 2014 and the device is now going on the market in Europe.

Importantly, this R&D group showed that it does matter where the implant electrodes are placed. If placed over the fovea (area of most acute detection on the retina), the patients do much better. Even a 15˚ movement eccentric to the fovea severely degrades the performance.

Finally, Bionic Vision (Australia) has tested a 24-electrode subretinal implant, but they are planning on putting together a fully functional artificial eye by 2020. This bionic eye will contain the diodes, the electrodes, and the power source to replace the eye completely, perhaps even with muscular attachments for movement.


The Monash Vision Group (Australia) is developing a visual cortex implant prosthesis. The camera (a) sends images that are processed in the pocket-held device (b). This sends the impulses via an antenna (d) wirelessly to the implant in the visual cortex (e). You could also send them to a computer, so you see what they see.
So far, we’ve talked only about retinal implants, but there are other ways to get signals to the visual cortex. You can clamp the implant on the optic nerve – just like what was supposedly done with Geordi. You can also stimulate the lateral geniculate nucleus (further back on the way to the visual cortex) or the visual cortex itself. There are good and bad points to each of these.

On the up side, the further back you do the stimulation, the more types of blindness you could help. If you stimulate the retina – you have to have some working retinal cells. Many blind people have no working retinal cells, so these types of implants won’t work for them. But if you stimulate the visual cortex directly, it wouldn’t matter if the patient had defects in their eyes, optic nerves, or lateral geniculate nuclei. This gives us a clue as to the type of blindness Geordi had – his implants were on his optic nerves, so they and his lateral geniculates must have been functional.

On the down side, there is visual processing that occurs all along the path from the eye to the visual cortex. The current neural prostheses don’t restore perfect, or even good, vision. The retina does some image processing and so does the lateral geniculate. So by the time the impulses reach the visual cortex, they have been partially ordered and translated. A visual cortex implant loses all this processing. Even a retinal implant misses some processing.
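As one concrete example of what gets skipped, retinal ganglion cells perform center-surround processing, often modeled as a difference of Gaussians, which emphasizes edges and plays down uniform areas. A cortical implant that bypasses the retina would have to recreate something like this in software. The Python sketch below is only a rough illustration; the sigma values are arbitrary assumptions, not measured retinal parameters.

```python
# Rough illustration of retinal-style preprocessing that a cortical implant
# would have to reproduce in software: center-surround filtering as a
# difference of Gaussians. Sigma values are arbitrary assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

def center_surround(image, sigma_center=1.0, sigma_surround=3.0):
    """Difference of Gaussians: highlights edges, suppresses flat regions,
    loosely mimicking retinal ganglion cell receptive fields."""
    center = gaussian_filter(image, sigma_center)
    surround = gaussian_filter(image, sigma_surround)
    return center - surround

image = np.random.rand(100, 100)          # stand-in for a camera frame
print(center_surround(image).shape)       # (100, 100)
```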


For implants that use image inputs outside the eye, it would be as if you never blinked or closed your eyes. As long as the camera was working, you would be seeing things. Even in your sleep. You would be tough to sneak up on. Even though Raiders of the Lost Ark was a great film, this never happened in any of the classes I taught.
A 2012 study demonstrated this. There are about 20 different types of cells in the retina. Some for color, some for movement, some for speed, some for acuity. It takes coordinated firing of all of them to give a clear picture. Using optogenetics (controlling the activity of genetically modified neurons with light), this group is starting to figure out the retinal code. Solving this will make implants much better at showing life-like pictures – like upgrading to HD from a black and white Philco (images with the Argus II and Alpha IMS only show dark and light).

A different group, in a 2014 paper, is studying this problem as well. They recorded the patterns of firing for different retinal cell types when a moving object was detected and then reproduced them electrically; the result was more life-like vision. They’ve only done this so far in isolated retinas, not in humans. The point is, lots of work needs to be done to improve the vision that patients are being given; scientists lack the specificity and precision to mimic natural vision as of now.
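In spirit, that record-and-replay approach looks something like the toy sketch below. This is my cartoon of the idea, not the group's actual code; the cell-type names and spike times are invented for illustration.

```python
# Cartoon of the record-and-replay idea: spike times recorded from different
# retinal cell types are stored and then re-issued as stimulation pulses in
# the same order. All names and numbers below are made up.
recorded_spikes = {
    "ON_sustained": [0.010, 0.035, 0.050],        # seconds after stimulus onset
    "OFF_transient": [0.012, 0.030],
    "direction_selective": [0.020, 0.045],
}

def replay(spike_trains, pulse_width_ms=0.5):
    """Turn each recorded spike into a (time, cell_type, pulse_width) event."""
    events = []
    for cell_type, times in spike_trains.items():
        for t in times:
            events.append((t, cell_type, pulse_width_ms))
    return sorted(events)                          # stimulate in chronological order

for event in replay(recorded_spikes):
    print(event)
```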


The Final Cut was a 2004 sci-fi film where Robin Williams was an editor. He took all the visual memories recorded by an implant during your many or few years and edited the images into a shorthand story of your life. It bombed, even with Mira Sorvino and Jim Caviezel. I’m guessing Williams wished everyone could unsee it.
Ways to improve visual neural prostheses may include more electrodes for greater acuity or field of vision, better training for patients to interpret what they sense, and better electrode placement and attachment. But it may also include more advanced algorithms. Groups are working on incorporating facial recognition software so that more can be recognized with fewer data points. There are also algorithms to detect and highlight the most important objects in a field, or the distance to those objects. Some groups are using images at multiple depths of focus (confocal imaging) to identify the most pertinent objects and suppress the rest of the signal.
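A toy version of the "highlight the important object, suppress the rest" idea might look like the sketch below. It is not the published face-detection or confocal algorithm, just a simple percentile threshold (an assumption on my part) that keeps the brightest pixels and zeroes out the background before the frame is reduced to electrode levels.

```python
# Toy saliency-style preprocessing (not the published algorithms): keep only
# the brightest fraction of pixels and zero the rest, so the limited number
# of electrodes is spent on the most important object.
import numpy as np

def suppress_background(frame, keep_fraction=0.1):
    """Keep the top keep_fraction of pixel values, zero everything else."""
    threshold = np.quantile(frame, 1.0 - keep_fraction)
    return np.where(frame >= threshold, frame, 0.0)

frame = np.random.rand(480, 640)                   # stand-in camera frame
print((suppress_background(frame) > 0).mean())     # roughly 0.1 of pixels survive
```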

Or, we may go completely Geordi-like. The VP of Second Sight says there is no reason that visual prostheses couldn’t also process inputs in the infrared, ultraviolet or even X-ray ranges. It would just be a matter of including those sensors in the input apparatus. Even more bizarre, we could hook the entire system up to WiFi and broadcast/record what the person “sees.”
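If those extra sensors were added, folding them into the existing electrode map could, in principle, be as simple as the sketch below. This is pure speculation to match the quote above; the channels and weights are invented, not anything Second Sight has described.

```python
# Speculative sketch: blend visible, infrared, and ultraviolet readings into a
# single stimulation map for the same electrode grid. Weights are invented.
import numpy as np

def blend_channels(visible, infrared, ultraviolet, weights=(0.7, 0.2, 0.1)):
    """Weighted sum of three sensor channels into one electrode intensity map."""
    wv, wi, wu = weights
    return wv * visible + wi * infrared + wu * ultraviolet

vis = np.random.rand(6, 10)    # reusing the assumed 6 x 10 electrode grid
ir = np.random.rand(6, 10)
uv = np.random.rand(6, 10)
print(blend_channels(vis, ir, uv).shape)   # (6, 10)
```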

Robin Williams had a movie where his job was to edit visual memory recordings from deceased people and play them back as a self-eulogy (The Final Cut). Once again, life imitates art, which imitates life, and on and on.

Next week – we’re trying to produce a universal translator, but we don’t know every language in the universe. I see a fundamental flaw here.



Contributed by Mark E. Lasbury, MS, MSEd, PhD



Wang, J., Wu, X., Lu, Y., Wu, H., Kan, H., & Chai, X. (2014). Face recognition in simulated prosthetic vision: face detection-based image processing strategies. Journal of Neural Engineering, 11(4). DOI: 10.1088/1741-2560/11/4/046009

Jung, J., Aloni, D., Yitzhaky, Y., & Peli, E. (2014). Active confocal imaging for visual prostheses. Vision Research. DOI: 10.1016/j.visres.2014.10.023

Nirenberg, S., & Pandarinath, C. (2012). Retinal prosthetic strategy with the capacity to restore normal vision. Proceedings of the National Academy of Sciences, 109(37), 15012-15017. DOI: 10.1073/pnas.1207035109

Stingl, K., Bartz-Schmidt, K., Gekeler, F., Kusnyerik, A., Sachs, H., & Zrenner, E. (2013). Functional outcome in subretinal electronic implants depends on foveal eccentricity. Investigative Ophthalmology & Visual Science, 54(12), 7658-7665. DOI: 10.1167/iovs.13-12835

