So-called "augmented reality" apps can be pretty helpful on the go. Point an augmented reality restaurant-finding app at a busy street, for example, and the app will show the same street, overlaid with text bubbles marking which of the buildings are restaurants, perhaps along with ratings from other users. Computer scientists Yuichiro Takeuchi and Ken Perlin, however, thought that all those arrows and bubbles were too crowded and confusing. So they designed an app that highlights buildings not with words, but with weird distortions that make the buildings grow, shrink, bounce and sway. They'll present their app, called ClayVision, on May 9 at the Association for Computing Machinery's conference on human-computer interaction in Austin, Texas, where the research won a best paper award.
Augmented reality is growing in popularity but still emerging, Takeuchi, from Sony's research lab in Tokyo, and Perlin, from New York University, wrote in their paper. So it's the perfect moment to change how people think the technology should look and work. Their approach works better than traditional text bubbles, they wrote, because the distortions reduce the clutter on-screen. To demonstrate how the building distortions would work, they created ClayVision for the Apple iPad 2.
Most augmented reality apps today only need to reproduce on-screen what the user sees on the street. ClayVision's main challenge was to create a view of the city in which every building a user sees can change shape, color or size.
The new app starts by figuring out the angle at which a user is looking at a building. The program compares video it gathers from the iPad's camera to a database of photos. If the user moves the device back and forth, ClayVision calculates how far it has been turned.
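The idea behind that step can be sketched in a few lines. The snippet below is a hypothetical illustration, not ClayVision's actual code: it assumes feature points have already been matched between a database photo and the live camera frame, and it approximates a sideways turn of the device from the average horizontal shift of those features.

```python
# Hypothetical sketch (not from the ClayVision paper): estimate how far a
# device has turned by comparing matched feature positions in the live
# camera frame against the same features in a reference photo.

def estimate_yaw_degrees(ref_points, live_points, image_width, fov_degrees):
    """Approximate the horizontal rotation (yaw) between two views.

    ref_points / live_points: matched (x, y) feature positions in pixels.
    A sideways camera rotation shifts every feature by roughly the same
    number of pixels, so the average horizontal shift, scaled by the
    camera's field of view, approximates the turn angle.
    """
    shifts = [lx - rx for (rx, _), (lx, _) in zip(ref_points, live_points)]
    mean_shift = sum(shifts) / len(shifts)
    degrees_per_pixel = fov_degrees / image_width
    return mean_shift * degrees_per_pixel
```

A real system would match hundreds of features and solve for the full camera pose, but the principle is the same: how the scene shifts on-screen reveals how the device has moved.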
After figuring out its angle and position, ClayVision overlays 3D models on the buildings it sees. It can then transform those models, making buildings grow, sway or change texture so they appear hand-drawn. The visualization runs smoothly and in real time, except on some buildings with ornate facades, Takeuchi and Perlin reported.
The researchers played with a few ways ClayVision can highlight the right building for users trying to find a specific address in a city. The program can color the correct building red, make the building appear much taller than its neighbors or even make it appear to bounce slightly, like an eager kid.
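The transforms described above amount to simple geometry applied to a building's 3D model. Here is a minimal sketch with illustrative names and parameters (the function names and numbers are assumptions, not taken from the paper):

```python
import math

# Hypothetical sketch of the kinds of highlighting transforms described
# above: stretch a building taller than its neighbors, or make it bounce
# gently over time. Names and parameters are illustrative.

def stretch_building(vertices, scale):
    """Scale the building's height (z axis) while keeping its footprint."""
    return [(x, y, z * scale) for x, y, z in vertices]

def bounce_offset(t, height=2.0, period=1.0):
    """Vertical offset at time t for a slight, eager-kid bounce."""
    return height * abs(math.sin(math.pi * t / period))
```

Rendering would apply `bounce_offset` each frame to shift the highlighted model up and down, while `stretch_building` would be applied once to exaggerate its height.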
The researchers also worked on making common navigation landmarks more visible from a distance. They created a green band extending into the sky above the fountain in Washington Square Park near NYU, so users can find the park even from places in the city where it would normally be hidden by buildings and trees. They also drew a line in the sky marking Houston Street, an important boundary between neighborhoods in New York City.
Because the ClayVision system depends on a database of photos and 3D models of buildings that Takeuchi and Perlin created by hand, it only works, for now, in a few neighborhoods of New York. In the future, if 3D modeling advances enough, ClayVision could automatically generate 3D models of buildings in any city, they wrote.
Future technologies could also support more elaborate transformations to help people navigate cities, they said. Tokyo, for example, has winding streets, so future versions of ClayVision could straighten a street, letting users see farther down it to the buildings ahead. ClayVision might also respond to taps and touches, letting people move buildings aside to see what's behind them.
Follow InnovationNewsDaily on Twitter @News_Innovation, or on Facebook.