Our brain is amazing! It 'parses' a wide range of inputs from our eyes, ears, nose, skin and other sense organs. Yet the brain itself doesn't hear, see, or smell; rather, it interprets the data our senses send it.
Neuroscientists have been developing peripheral devices that help blind people see and deaf people hear, and the brain regions directly related to sight and hearing adapt to interpret input from an appropriate device. A blind person really does get an experience of sight: the parts of the brain related to sight that light up in a sighted person receiving input from their eyes also light up in an unsighted person receiving input from a peripheral device.
The implications for our learners with special education needs could be profound. At the same time, we would need to respect the experiences of people who have developed their own ways of seeing and hearing and don't wish to use a device (i.e. we still need to stay away from deficit thinking).
What are your thoughts about the implications? Benefits? Drawbacks?
The description from the site reads: "As humans, we can perceive less than a ten-trillionth of all light waves. 'Our experience of reality', says neuroscientist David Eagleman, 'is constrained by our biology'. He wants to change that. His research into our brain processes has led him to create new interfaces — such as a sensory vest — to take in previously unseen information about the world around us."