Contact: SINC
info@plataformasinc.es
91-425-1820
FECYT - Spanish Foundation for Science and Technology
A team of researchers from the University of Alcalá de Henares (UAH) has shown scientifically that human beings can develop echolocation, the system of acoustic signals used by dolphins and bats to explore their surroundings. Producing certain kinds of tongue clicks helps people to identify objects around them without needing to see them, something which would be especially useful for the blind.
"In certain circumstances, we humans could rival bats in our echolocation or biosonar capacity", Juan Antonio Martínez, lead author of the study and a researcher at the Superior Polytechnic School of the UAH, tells SINC. The team led by this scientist has started a series of tests, the first of their kind in the world, to make use of human beings' under-exploited echolocation skills.
In the first study, published in the journal Acta Acustica united with Acustica, the team analyses the physical properties of various sounds and proposes the most effective of these for use in echolocation. "The almost ideal sound is the 'palate click', a click made by placing the tip of the tongue on the palate, just behind the teeth, and moving it quickly backwards, although it is often done downwards, which is wrong", Martínez explains.
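The paper's analysis is acoustic rather than computational, but as a loose illustration of how the frequency content of such a sound might be examined, the sketch below reads a recording of a palate click and locates its spectral peak. The file name "palate_click.wav" and the use of a plain FFT are assumptions for illustration, not the authors' procedure.

```python
# Hedged sketch: inspecting the spectrum of a recorded tongue click.
# "palate_click.wav" is a hypothetical mono recording; this is not
# the analysis method used in the study, only an illustration.

import numpy as np
from scipy.io import wavfile

rate, samples = wavfile.read("palate_click.wav")  # hypothetical mono file
samples = samples.astype(np.float64)

# Magnitude spectrum of the click via a real FFT.
spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)

# Frequency carrying the most energy, a rough indicator of where
# the click concentrates its acoustic power.
peak_hz = freqs[np.argmax(spectrum)]
print(f"Peak frequency: {peak_hz:.0f} Hz")
```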
The researcher says that palate clicks "are very similar to the sounds made by dolphins, although on a different scale, as these animals have specially-adapted organs and can produce 200 clicks per second, while we can only produce three or four". Echolocation, "which is three-dimensional and makes it possible to 'see' through materials that are opaque to visible radiation", measures the distance to an object from the time that elapses between the emission of a sound wave and the reception of its echo reflected from that object.
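As a simple illustration of this time-of-flight principle (not code from the study), the sketch below converts an echo delay into an estimated distance, assuming sound travels at roughly 343 m/s in air:

```python
# Hedged sketch: estimating object distance from an echo delay (time of flight).
# Assumes the speed of sound in dry air at about 20 degrees C.

SPEED_OF_SOUND_M_PER_S = 343.0


def distance_from_echo(delay_seconds: float) -> float:
    """Return the estimated distance in metres to a reflecting object.

    The sound travels to the object and back, so the one-way distance
    is half the total path covered during the delay.
    """
    return SPEED_OF_SOUND_M_PER_S * delay_seconds / 2.0


# Example: an echo heard 10 milliseconds after the click
# corresponds to an object roughly 1.7 metres away.
print(f"{distance_from_echo(0.010):.2f} m")
```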
In order to learn how to emit, receive and interpret sounds, the scientists are developing a method that uses a series of protocols. The first step is for the individual to learn how to make and identify his or her own sounds (they are different for each person), and later how to use them to distinguish between objects according to their geometrical properties, "as is done by ships' sonar".
Some blind people had previously taught themselves how to use echolocation "by trial and error". The best-known cases are the Americans Daniel Kish, the only blind person to have been awarded a certificate to act as a guide for other blind people, and Ben Underwood, who was considered the world's best "echolocator" until his death at the start of 2009.
However, no special physical abilities are required to develop this skill. "Two hours per day for a couple of weeks are enough to distinguish whether you have an object in front of you, and within another two weeks you can tell the difference between trees and a pavement", Martínez tells SINC.
The scientist recommends starting with the typical "sh" sound used to ask someone to be quiet: a pen moved in front of the mouth can be detected straight away. The phenomenon is similar to what happens when travelling in a car with the windows down, which makes it possible to "hear" gaps in the verge of the road.
The next level is to learn how to master the "palate clicks". To make sure echoes from the tongue clicks are properly interpreted, the researchers are working with a laser pointer, which shows the part of an object at which the sound should be aimed.
A new way of seeing the world
Martínez has told SINC that his team is now working to help deaf-blind people use this method in the future, because echoes are not only perceived by the ear, but also through vibrations in the tongue and bones. "For these kinds of people in particular, and for all of us in general, this would be a new way of perceiving the world".
Another of the team's research areas involves establishing the biological limits of human echolocation ability, "and the first results indicate that detailed resolution using this method could even rival that of sight itself". In fact, the researchers started out by being able to tell if there was someone standing in front of them, but now can detect certain internal structures, such as bones, and even "certain objects inside a bag".
The scientists recognise that they are still at a very early stage, but the possibilities that would be opened up by the development of echolocation in humans are enormous. This technique would be very practical not only for the blind, but also for professionals such as firefighters (enabling them to find exit points through smoke) and rescue teams, or simply for people lost in fog.
A better understanding of the mental mechanisms used in echolocation could also help to design new medical imaging technologies or scanners, which make use of the great penetration capacity of clicks. Martínez stresses that these sounds "are so penetrating that, even in environments as noisy as the metro, one can sense discontinuities in the platform or tunnels".
References:
Juan Antonio Martínez Rojas, Jesús Alpuente Hermosilla, Pablo Luis López Espí and Rocío Sánchez Montero. "Physical Analysis of Several Organic Signals for Human Echolocation: Oral Vacuum Pulses". Acta Acustica united with Acustica 95(2): 325-330, 2009.
Abstract:
Active human echolocation can be an extremely useful aid for blind people. Active echolocation can be trained with both artificial and organic signals. Organic signals offer some advantages over artificial ones. Very detailed studies of organic signals in animals have been done. However, in the case of humans, the scientific literature is very scarce and not systematic. This is the first paper of a series on the properties of several suitable sounds for human echolocation. In this work, we offer a detailed analysis of these sounds, comparing their merits from a physical point of view. The results of this study have important applications to design systematic and optimized training protocols for accurate echolocation awareness.
Anyone who reads the article in Acta Acustica will realize that there is nothing there that supports the claim of humans being capable of echolocation. It is an extreme example of bad science in which the authors compute the FFT of a human sound. There is absolutely no mention of the ear impulse response function, the scattering conditions under which the echo can be detected, or any signal processing concept, not even a vague or feeble one, that links the topic with radar or sonar techniques. The first time I read it I thought it was a joke by the editors, to prove that sometimes anything can creep into a scientific journal. From such a naïve study, which a university freshman could do in a single afternoon, it is amazing how many absurd claims the authors dared to make.