Over the last few months, we have talked about several new technologies enabling incredible new ways to eavesdrop on or secretly record conversations. These have included drones equipped with cameras and audio recorders, as well as wearable technologies like Google Glass, which help people surreptitiously record their surroundings.
As we talked about back in March, users of these technologies must be careful about their use – 12 states have laws requiring that all parties in the conversation be warned when a discussion is being recorded. But in the future, we might not be as concerned about being recorded by wearables or drones listening to our conversations as we are about being recorded by the everyday objects we have around us all the time—like candy wrappers, chip bags, and decorative plants.
This may sound like out-of-control paranoia, but new research out of the Massachusetts Institute of Technology (MIT) suggests it may soon be possible. And no, it doesn't involve wiring everything in the world with a tiny microphone. This new technology could enable objects to "listen in" on conversations without listening to anything.
So the obvious first question is: how in the world is this possible?
As first author and MIT graduate student Abe Davis explained in an interview earlier this month, “When sound hits an object, it causes it to vibrate.” Based on this idea, Davis and fellow researchers from MIT, Microsoft, and Adobe, have developed an algorithm for reconstructing audio based on observing the tiny vibrations of objects near the source of the audio.
According to MIT News, the researchers conducted a variety of experiments, including one where they managed to "recover intelligible speech from the vibrations of a potato-chip bag photographed from 15 feet away through soundproof glass." Skeptical? Check out this video. Using a high-frame-rate video camera able to capture 2,000 to 6,000 frames per second, researchers can observe subtle visual signals that are imperceptible to the naked eye. The algorithm then enables researchers to use these signals to turn everyday objects like plants or chip bags into so-called "visual microphones." The team presented their findings at this year's Siggraph conference, which took place last week in Vancouver.
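The core idea can be illustrated with a toy simulation. This is not the researchers' actual algorithm (which analyzes local motion signals across the whole video), just a minimal sketch of the principle: sound displaces an object by a tiny, sub-pixel amount each frame, and tracking that displacement across high-frame-rate video recovers the driving audio signal. The 440 Hz tone, frame rate, and centroid-tracking approach here are all illustrative assumptions.

```python
import numpy as np

# Toy sketch of a "visual microphone": simulate an object whose position
# is perturbed by an audio signal, then recover that signal by measuring
# the per-frame displacement in the simulated video.

fps = 2000                           # high-speed camera frame rate (assumed)
t = np.arange(fps) / fps             # one second of frames
audio = np.sin(2 * np.pi * 440 * t)  # a 440 Hz tone driving the vibration

x = np.linspace(-3, 3, 200)          # 1-D "scanline" of pixel coordinates

def frame(shift):
    # A bright Gaussian blob on the object, displaced by the sound.
    # The shift is a tiny fraction of the blob's width (sub-pixel motion).
    return np.exp(-(x - shift) ** 2)

frames = np.stack([frame(0.01 * a) for a in audio])

# Recover the motion: the intensity-weighted centroid of each frame
# tracks the blob's displacement, which is proportional to the audio.
recovered = frames @ x / frames.sum(axis=1)
recovered -= recovered.mean()
recovered /= np.abs(recovered).max()

# The recovered waveform closely matches the original tone.
corr = np.corrcoef(audio, recovered)[0, 1]
```

In the real system, noise, camera blur, and the object's own material response complicate recovery considerably; the point of the sketch is only that imperceptibly small visual motion can carry an audible signal.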
Obviously, the technology sounds like something out of a gripping NCIS episode where Gibbs and the team can catch the criminal because they observed the vibrations of his Doritos bag through a surveillance camera. But in the same interview about the research, Davis pointed out that, beyond the obvious forensic possibilities, what the technology really offers is a “new kind of imaging.”
Davis explains that the technology’s ability to recover sounds from objects “…gives us a lot of information about the object itself, because different objects are going to respond to sound in different ways.” Davis and his fellow researchers have tried to identify the makeup of objects based on their reaction to sound.
The technology is still quite a ways from being widely available, but down the road, it could definitely pose a challenge to existing recording laws. If we apply the same logic from the Aereo ruling we discussed earlier in the summer (the "if it looks like a duck, and quacks like a duck, then it's a duck" approach), then we could see this technology being treated much the same as traditional recordings, given that the audio can be reconstructed from a visual recording, turning everyday objects into microphones. By the time we have to confront the issue, we may already have had to give up on the idea that our communication is completely private.