Photo: David Barrie via Flickr Creative Commons
We've been hearing about the idea of putting air quality sensors in cell phones for a few years now, so that scientists could use our phones to crowd-source as many location-specific, real-time air quality measurements as possible. It would help the scientists, which would in turn help us. But what if we want to use our cell phones to satisfy our own curiosity about the air around us? Researchers at the University of Southern California have come up with the Visibility app for Android smartphones, which needs just a quick snapshot of the sky to tell you whether or not the air is safe to breathe.
Gizmag points us to the interesting (and free) app from the Robotic Embedded Systems Lab at USC. A user needs only to take a photo of the sky, and they receive a message detailing the pollution levels at that location, including a warning on whether or not to head inside. The information is not only immediately useful to the cell phone owner, but also benefits the scientists who collect it for a better understanding of local air pollution. The team writes that this could be an ideal solution, combining the precise measurements possible with electronic equipment and the inexpensive, widely distributed tools we call cell phones. Cheap yet accurate data -- something for which all scientists strive.
So, how can the app tell how polluted the air is from a simple cell phone snapshot? Wouldn't many factors, right down to the quality of the cell phone camera, affect accuracy? It turns out the photos are compared against models on a central computer before results are returned to users. Gizmag explains:
Visibility sends a user's photo of the sky as a small black-and-white file to a central computer, along with data from their phone's GPS, compass, clock and accelerometer. The computer compares the luminance value of the sky in the photo to algorithmic models for the specific time and coordinates at which the phone data indicates the image was taken. If the sky in the photo isn't as bright as the model says it should be, it means that some of the sunlight wasn't making it through the atmosphere, because it was blocked by haze aerosols. Not only does the computer then send the user a report on the level of air pollution, but it also stores the information (without the user's identity) to augment pollution maps for the area.
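The comparison Gizmag describes -- measure the sky's luminance in the photo, predict what a clear sky should look like at that time and place, and treat the shortfall as haze -- can be sketched in a few lines of Python. Everything below is a toy illustration: the function names, the crude clear-sky model, and the thresholds are all hypothetical, not the USC team's actual algorithm.

```python
import math
from dataclasses import dataclass


@dataclass
class PhotoReport:
    """Data the phone would upload: sky brightness plus GPS and clock."""
    luminance: float  # mean sky luminance from the grayscale photo, 0..1
    lat: float
    lon: float
    hour: float       # local solar time, 0..24


def clear_sky_luminance(lat: float, hour: float) -> float:
    """Toy clear-sky model: brightness peaks at solar noon.
    A real model would account for sun position, viewing direction,
    season, and atmospheric scattering."""
    elevation = math.cos((hour - 12) / 12 * math.pi / 2)  # crude proxy
    return max(0.0, elevation)


def haze_index(report: PhotoReport) -> float:
    """Fraction of expected brightness that is missing:
    0 = photo is as bright as the model predicts, 1 = fully blocked."""
    expected = clear_sky_luminance(report.lat, report.hour)
    if expected <= 0:
        return 0.0  # sun below horizon; no estimate possible
    return max(0.0, 1.0 - report.luminance / expected)


def advisory(index: float) -> str:
    """Map the haze index to a coarse message, like the app's report."""
    if index < 0.3:
        return "Air looks clear."
    if index < 0.6:
        return "Moderate haze; sensitive groups take care."
    return "Heavy haze; consider heading inside."
```

For example, a noon photo whose sky is only 30% as bright as the (toy) model predicts would score a haze index of 0.7 and trigger the "heading inside" warning.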
There is some user fallibility to factor in, chiefly getting the camera oriented toward the sky correctly, which is why the app includes a built-in guide that helps the user aim the phone when taking the photo.
Right now the app is free for Android phones, and an iPhone version is on its way. The more people who use it, the more the team can improve it, and the more data flows into the database to make measurements more accurate. The team has tested the app in Los Angeles, California and Phoenix, Arizona -- two notoriously polluted cities -- and the app's readings compare favorably with the air quality data published by the EPA.
More on Air Quality Sensors
Mapping Traffic Pollution Sends MESSAGE Loud and Clear
DIY Balloons Glow to Show Air Quality
People Actually Like Clunky Environment-Sensing Wrist Bands From La Montre Verte