Common wisdom has it that if the visual cortex in the brain is deprived of visual input in early infancy, it may never properly develop its functional specialization, making sight restoration later in life almost impossible.
Scientists at the Hebrew University of Jerusalem and in France have now shown that blind people — using specialized photographic and sound equipment — can actually “see” and describe objects and even identify letters and words.
The images are converted into “soundscapes” by a predictable algorithm, allowing the user to listen to and then interpret the visual information coming from the camera. The blind participants using this device reach a level of visual acuity that technically surpasses the World Health Organization (WHO) criterion for blindness, as published in a previous study by the same group.
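The general flavour of such an image-to-sound mapping can be sketched as follows. This is a minimal illustration, not the group’s actual algorithm: the pitch range, scan duration, and row-to-frequency mapping are all assumptions made for the example.

```python
import math

def column_to_soundscape(column, duration=0.05, sample_rate=8000,
                         f_lo=200.0, f_hi=2000.0):
    """Convert one image column (brightness values in [0, 1], top pixel
    first) into audio samples: vertical position maps to pitch and
    brightness maps to loudness. All parameters are illustrative."""
    n_rows = len(column)
    n_samples = int(duration * sample_rate)
    samples = []
    for t in range(n_samples):
        s = 0.0
        for row, brightness in enumerate(column):
            # Higher rows map to higher frequencies (log spacing).
            frac = 1.0 - row / max(n_rows - 1, 1)
            freq = f_lo * (f_hi / f_lo) ** frac
            s += brightness * math.sin(2 * math.pi * freq * t / sample_rate)
        samples.append(s / n_rows)  # keep the mix within [-1, 1]
    return samples

# A bright dot near the top of an otherwise dark column comes out as a
# short high-pitched tone; scanning columns left to right would turn a
# whole camera frame into a sweeping soundscape.
tone = column_to_soundscape([1.0, 0.0, 0.0, 0.0])
```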
The resulting sight, though unconventional in that it does not involve the eyes, is no less visual in the sense that it actually activates the visual identification network in the brain.
“The adult brain is more flexible than we thought,” says Prof. Amedi. In fact, this and other recent research from various groups have demonstrated that many brain areas are specific not to their input sense (vision, audition or touch) but to the task, or computation, they perform, which can be carried out using various modalities.
“Ramesh Raskar presents femto-photography, a new type of imaging so fast it visualizes the world at one trillion frames per second, so detailed it shows light itself in motion. This technology may someday be used to build cameras that can look ‘around’ corners or see inside the body without X-rays.
Photography is about creating images by recording light. At the MIT media lab, professor Ramesh Raskar and his team members have invented a camera that can photograph light itself as it moves at, well, the speed of light.”
Awe-inspiring. Seeing the wave in light’s wave-particle duality left me speechless.
There was a time when concertgoers were perfectly happy just listening to the band play. Not anymore. Today’s music fan expects to see pyrotechnic spectacles with elaborate lighting and special effects for the price of concert admission. Even house-party hosts are downloading trippy screensavers that change in time to the music. When the time came to create the visuals for Floating — a new album by Danish electronic artist Rumpistol / Red Baron — design studio Futura Epsis 1 made something that was a bit of both.
Floating’s live show features a hypnotic 3-D display, controlled by an iPad app, that you can download for your home. The idea is that the show can be projected anywhere — from large arenas to living rooms. “We came up with the idea of a realtime 3-D sound-visualization as an installation for concerts and live-events as well as interactive installation for clubs,” says Andreas Rothaug, head of development at Futura Epsis 1.
The visuals and fluid simulations are generated in realtime, driven by the sounds coming into the computer. The multitouch iPad lets a performer manipulate the images and effects on the big screen, or switch between scenes.
Floating’s visuals are designed to be used anywhere. Though the app as released is for iOS, Rothaug says that behind the scenes the tech is radically multi-platform. For instance, the visuals can be split into a stereoscopic pair and then rendered for red/blue, shutter, or polarization-based 3-D glasses, depending on what viewing equipment the audience has.
The app can be controlled by virtually any device. “The control protocol of the visuals is OSC (like the network version of MIDI), creating the option of controlling the visuals with a variety of controllers,” he says. “For example, directly out of Ableton Live, or MIDI controllers in general.” This opens up the possibility of ditching the iPad and taking control with even more exotic interfaces like BMIs (brain-machine interfaces) or a Wii controller.
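To give a feel for why OSC is so easy to drive from arbitrary controllers, here is a minimal sketch that frames a single float-argument OSC message by hand, using only the standard library. The `/scene/blur` address and the port number are hypothetical; a real setup would typically use an OSC library rather than hand-packing bytes.

```python
import struct

def osc_message(address, *floats):
    """Build a raw OSC message carrying float32 arguments.

    OSC frames every string NUL-terminated and padded to a 4-byte
    boundary, follows the address with a type-tag string (',' plus one
    'f' per float), and encodes numbers big-endian. That is all the
    framing needed to send a parameter change over UDP."""
    def pad(s):
        b = s.encode("ascii") + b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    typetag = "," + "f" * len(floats)
    return pad(address) + pad(typetag) + b"".join(
        struct.pack(">f", f) for f in floats)

# e.g. set a hypothetical "/scene/blur" visual parameter to 0.5:
packet = osc_message("/scene/blur", 0.5)
# A real client would then do socket.sendto(packet, (host, 9000)).
```

Because the wire format is this simple, anything that can emit a UDP packet — a DAW, a game controller bridge, or a BMI driver — can stand in as a controller.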
A recent breakthrough in the development of an artificial synapse suggests that assistive devices and other prostheses won’t be limited to just missing joints and failing organs. Researchers in Japan have shown that it’s possible to mimic synaptic function with nanotechnology, a breakthrough that could result in not just artificial neural networks, but fixes for the human brain as well.
Synapses are essential to brain function: they are what allow a neuron to pass an electrical or chemical signal to another cell. Their structure is incredibly complex, with hundreds of proteins and other chemicals interacting in complicated ways. It’s because of this that cognitive scientists and artificial intelligence researchers have had great difficulty trying to simulate this exact function.
But a new study published in Advanced Functional Materials has shown that it may be possible to reproduce synaptic function using a single nanoscale electrochemical atomic switch. Japanese researchers developed a tiny device in which a gap is bridged by a copper filament under voltage-pulse stimulation. This results in a time-dependent change in conductance — a change in strength that’s nearly identical to the one found in biological synaptic systems. The inorganic synapses can thus be controlled by changes in the interval, amplitude, and width of an input voltage pulse.
What makes this exciting is that the device essentially mimics major features of human cognition, what the researchers refer to as the “emulation of synaptic plasticity”, including what goes on in short-term and long-term memory. Not only that, it responds to the presence of air and to temperature changes, which suggests it has the potential to perceive its environment much as the human brain does.
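The interval-dependent strengthening can be illustrated with a toy model: treat the switch as a conductance that jumps on each voltage pulse and decays back toward baseline between pulses. The bump size and time constant below are made-up illustrative numbers, not values from the paper.

```python
import math

def stimulate(intervals, bump=0.2, tau=5.0):
    """Toy model of the atomic-switch synapse: each pulse raises the
    conductance by `bump` (filament growth), and the conductance decays
    exponentially with time constant `tau` between pulses (filament
    dissolution). All constants are illustrative."""
    g = 0.0
    for dt in intervals:
        g *= math.exp(-dt / tau)   # decay since the previous pulse
        g += bump                  # strengthening on each pulse
    return g

# Rapid pulses accumulate before the decay can undo them, like
# long-term potentiation ...
strong = stimulate([1.0] * 10)
# ... while widely spaced pulses fade back each time, like a
# short-term trace that is never consolidated.
weak = stimulate([50.0] * 10)
```

In this crude picture, the same number of pulses produces a lasting conductance change only when they arrive close together, which is the qualitative behaviour the researchers report for their device.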
The researchers hope their newfound insight will help in the development of artificial neural networks, but it’s clear that their system, which operates at the nanoscale, could also be used to treat the human brain. The day may be coming when failing synaptic systems can be patched with a device similar to this one, offloading a biological function to a synthetic one.
Online adverts are already tailored to your search and browsing history, but now Microsoft has plans to add emotions to the mix. In a recently revealed patent application the company suggests that using its Kinect sensor to analyse your face and body language for emotions could help companies better target their ads. Emotional analysis of emails, search terms and even your online gaming performance could also influence the ads you see…
In the aftermath of the tsunami, a Japanese collective created the off-the-grid house of the future where technology and nature coexist.
Neutrinos send wireless message through the Earth
Just as neutrinos look likely to lose their faster-than-light crown, these subatomic particles have a new claim to fame as part of a wireless communication system that could potentially send messages directly through the Earth’s core.
A team at Fermilab in Batavia, Illinois, has successfully used a beam of the near-massless particles to transmit the word “neutrino” to a detector 1 km away, including a 240-metre journey through solid rock.
Neutrinos rarely interact with other forms of matter, so they pass through most objects unimpeded - including the Earth’s core. That makes them potentially useful as messengers. Previous suggestions include using these ghostly particles to send messages across the planet without wires, cables or satellites, to communicate with hidden submarines, or even to sync alien clocks. This latest experiment is the first demonstration that the principle actually works…
(via New Scientist)
Fred Fathom - Curing
Visualization based on 125,530 distinct CT and MR scans that were conducted using GE equipment during a 24-hour period. The globe reveals the geo-location of the scans, while the timelines indicate the number of scans occurring each minute.
Music: Eric Gunther - Oslo Electric
Having trouble remembering your password? Perhaps you need to use your heart instead of your head. An encryption system that uses the unique pattern of your heartbeat as a secret key could potentially be used to make a hard drive that will only decrypt in response to your touch.
Our heartbeats follow an irregular pattern that never quite repeats and that is unique to everyone. Chun-Liang Lin at the National Chung Hsing University in Taichung, Taiwan, and colleagues used an electrocardiograph (ECG) to extract the unique mathematical features underlying this pattern. They then used the information to generate a secret key that forms part of an encryption scheme based on the mathematics of chaos theory, by which small changes in initial conditions lead to very different outcomes.
As a proof of concept, Lin’s system currently takes the user’s ECG reading from each palm once, and a key based on that reading is stored and used for all later decryptions. He says the goal is to build the system into external hard drives and other devices that can be decrypted and encrypted simply by touching them.
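The general idea of seeding a chaotic cipher with a biometric number can be sketched as follows. This is an illustration of the concept only, not the paper’s actual scheme: the logistic map, the XOR cipher, and the seed value are all stand-ins for the ECG-derived key material the researchers describe.

```python
def chaos_keystream(seed, n, r=3.99):
    """Generate n pseudo-random bytes from the logistic map
    x -> r*x*(1-x), a classic chaotic system. `seed` in (0, 1) stands
    in for a number extracted from the user's ECG features."""
    x = seed
    out = bytearray()
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_crypt(data, seed):
    """Encrypt or decrypt by XOR with the chaotic keystream (XOR is
    its own inverse, so the same seed reverses the operation)."""
    ks = chaos_keystream(seed, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

secret = xor_crypt(b"hard drive contents", 0.612)
recovered = xor_crypt(secret, 0.612)   # the same "heartbeat" decrypts
```

The chaos is what makes the key unforgeable in this toy picture: a slightly different seed — a slightly different heart — diverges within a few iterations and yields garbage instead of the plaintext.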
(via New Scientist)