Life in TECHnicolor



If you’re near a window, take a quick look out of it. If you’re in your bedroom, glance at your closet. If you’re in the kitchen, open up the fridge. Just take a minute to appreciate the life and vibrancy that color adds to our world. Now imagine that you can’t see any of it: an orange is just a slightly darker gray than a banana, and the leaves of a tree are the same dusty gray as the flowers that bloom on its branches. This is how Neil Harbisson experiences our brilliant world of color – in grayscale. Neil has complete achromatopsia, or total color blindness. Although this is tragic news, it is not the end of the story.

When he was 21, Neil joined a project with Adam Montandon (an expert in Digital Futures, according to his website) that would allow Neil to experience color for the first time. You might think the solution would involve some sort of operation or eye transplant, but their idea was much more innovative. They developed the Eyeborg: a sort of third eye that detects color and converts it to sound waves, which are transmitted to Neil’s inner ear via bone conduction. So, technically, that means Neil can hear colors.

Neil Harbisson’s experience of color depends on the transduction of light energy into sound energy. The key to how this works lies in the wave properties of light and sound. We have all probably talked about “light waves” and “sound waves,” but have you ever considered how much they have in common? Light is an electromagnetic wave and sound is a pressure wave traveling through matter, yet both can be described by the same two properties: wavelength and frequency.

Let’s back up, all the way to elementary school, when you learned the colors of the rainbow using the mnemonic Roy G. Biv. Those colors correspond to the range of wavelengths of light that our eyes can perceive, the same rainbow of colors that physicists and other academic types call the visible light spectrum. Here’s a handy-dandy diagram:

[Diagram: two waves side by side. Dotted lines mark off one full cycle of each; the wave on the right has the longer wavelength.]

The regions between the dotted lines represent one full cycle of each wave, and the length of one cycle is called the wavelength. The wave on the right has a longer wavelength than the one on the left, so our eyes perceive the two as different colors. The camera of Neil’s Eyeborg detects the different wavelengths of light that it “sees,” but how does that get translated into sound?
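
To make that concrete, here’s a tiny Python sketch of what “detecting a wavelength” can amount to: bucketing nanometer values into the familiar Roy G. Biv bands. The boundaries below are approximate textbook values, not the Eyeborg’s actual thresholds.

```python
# A rough sketch of bucketing a measured wavelength into the Roy G. Biv
# color bands. The nanometer boundaries are approximate textbook values
# for the visible spectrum, not the Eyeborg's actual thresholds.

# (lower edge in nm, color name) for each band, from short to long waves.
VISIBLE_BANDS_NM = [
    (380, "violet"),
    (425, "indigo"),
    (450, "blue"),
    (495, "green"),
    (570, "yellow"),
    (590, "orange"),
    (620, "red"),
]
VISIBLE_MAX_NM = 750  # beyond ~750 nm is infrared, invisible to us

def color_name(wavelength_nm: float) -> str:
    """Return the rough color name for a wavelength given in nanometers."""
    if not 380 <= wavelength_nm < VISIBLE_MAX_NM:
        return "outside the visible spectrum"
    # Walk the bands from red down to violet and take the first match.
    for lower_edge, name in reversed(VISIBLE_BANDS_NM):
        if wavelength_nm >= lower_edge:
            return name
    return "outside the visible spectrum"  # unreachable; for completeness

print(color_name(460))   # blue
print(color_name(700))   # red
print(color_name(1000))  # outside the visible spectrum
```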

To answer this question, we should think a little bit about sound waves and how our ears work. Just as the wavelength of a light wave corresponds to our perception of a specific color, the wavelength of a sound wave corresponds to our perception of a specific pitch. In brief, this is because sound waves entering our ears cause the tiny structures of the inner ear to vibrate at the frequency of the incoming wave, and that frequency is inversely related to the wave’s wavelength: the shorter the wavelength, the higher the frequency, and the higher the pitch we hear.
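
The relationship is simple: for any wave, frequency equals wave speed divided by wavelength. A quick sanity check in Python, using the approximate speed of sound in room-temperature air:

```python
# For any wave, frequency = wave speed / wavelength. A quick sanity
# check using the approximate speed of sound in room-temperature air.

SPEED_OF_SOUND_M_S = 343.0  # meters per second, in air at ~20 degrees C

def frequency_hz(wavelength_m: float) -> float:
    """Frequency of a sound wave in air with the given wavelength."""
    return SPEED_OF_SOUND_M_S / wavelength_m

# Concert A (440 Hz) has a wavelength of roughly 0.78 m in air:
print(frequency_hz(0.78))  # ~439.7 Hz
```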

So, if we put all of that together: Neil’s Eyeborg detects wavelengths of light; a computer program converts those wavelengths into corresponding frequencies of vibration; and those vibrations are passed to Neil’s inner ear via bone conduction, where each frequency is perceived as a tone of a different pitch. To me, the most amazing thing about this technology is that it uses Neil’s intact sense of hearing to compensate for his impaired sense of vision. Although it’s probably not quite the same as having a functional visual system, I think Neil would tell you it’s the next best thing.
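
Purely as an illustration, here’s what that wavelength-to-frequency conversion could look like in code. Everything in this sketch is an assumption of mine: the visible range, the 200–800 Hz audible band, and the log-scale mapping are not Harbisson’s documented scheme, which isn’t described in the sources below and may well differ.

```python
import math

# One plausible light-to-sound mapping, for illustration only: squeeze the
# visible range (380-750 nm) onto an audible band (here 200-800 Hz), with
# redder light giving lower tones. The ranges and the log-scale mapping
# are my assumptions, not the Eyeborg's documented scheme.

VISIBLE_MIN_NM, VISIBLE_MAX_NM = 380.0, 750.0  # rough visible spectrum
AUDIO_MIN_HZ, AUDIO_MAX_HZ = 200.0, 800.0      # arbitrary audible band

def light_to_tone(wavelength_nm: float) -> float:
    """Map a visible-light wavelength to a tone frequency in hertz."""
    w = min(max(wavelength_nm, VISIBLE_MIN_NM), VISIBLE_MAX_NM)
    # Normalize so red (long wavelengths) sits at 0 and violet at 1.
    t = (VISIBLE_MAX_NM - w) / (VISIBLE_MAX_NM - VISIBLE_MIN_NM)
    # Interpolate on a log scale, since we hear pitch multiplicatively:
    # doubling the frequency raises the tone by one octave.
    return AUDIO_MIN_HZ * math.exp(t * math.log(AUDIO_MAX_HZ / AUDIO_MIN_HZ))

print(round(light_to_tone(700)))  # red  -> ~241 Hz, a low tone
print(round(light_to_tone(450)))  # blue -> ~616 Hz, a higher tone
```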

To see just how Neil’s Eyeborg works and all that he can do with it, check out this TED talk by Neil himself. It’s only 10 minutes long, and I guarantee it will be worth it:

Sources and Further Reading:

Harbisson, N. (2012, July). Neil Harbisson: I listen to color [Video file]. Retrieved from http://www.ted.com/talks/neil_harbisson_i_listen_to_color.html

Montandon, A. (2010). Projects: Colourblind Eyeborg, colours to sound. Retrieved from http://www.adammontandon.com/neil-harbisson-the-cyborg/

Comments

  1. I enjoyed Neil's description of his Eyeborg! And in thinking about it, one realizes how great a role memory plays in our color sense.
    Neil memorizes the tunes that go with the colors. We may remember flavors, or odors, or textures to complement the visual sensation.
    Right now we are enjoying blue jacaranda blossoms. Amazing how much more intense the blue seems when the sun is NOT shining!

  2. I'm glad you enjoyed his story too! Indeed, his talk made me think about how easy it is to take each of our senses for granted. But you're right, the loss is even greater when you think about how our senses work with one another to enhance our perception of the world. Isn't it cool that although Neil started by attributing sounds to colors, after a while he also began attributing colors to everyday sounds?

