
Time to start tagging the physical world

Tonchidot's Sekai Camera enables your iPhone to provide real-time information on physical objects

Technology trends and news by Chris Caceres
February 19, 2009 | Comments (1)
Short URL: http://vator.tv/n/6ed

There are iPhone apps out there, like Google Maps, that use location services to show you relevant places or search results on a map. Some can connect you with the people around you; others simply tell you the location of whatever type of restaurant you are looking for.

Tokyo-based Tonchidot is working on something a little more advanced: an iPhone application called “Sekai Camera” that lets you point your camera at your surroundings and, in real time, overlays information bubbles on whatever you are looking at. If a real-world object doesn’t have a tag yet, you can add one, and it is saved to the database. It sounds like fantasy, but it’s in the making and was recently demonstrated at the TechCrunch50 conference.
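To make the idea above concrete, here is a minimal sketch of how a store of user-contributed, geotagged annotations could work: tags are saved with a location, and the app queries for tags near the camera's current position to overlay as bubbles. This is purely illustrative; `TagStore`, its methods, and the sample tags are hypothetical, and Tonchidot has not published its actual data model.

```python
import math

class TagStore:
    """Hypothetical store of user-contributed 'air tags' (not Tonchidot's real design)."""

    def __init__(self):
        self.tags = []  # list of (lat, lon, label) tuples

    def add_tag(self, lat, lon, label):
        """Save a user-added tag for a physical object at (lat, lon)."""
        self.tags.append((lat, lon, label))

    def tags_near(self, lat, lon, radius_m=100):
        """Return labels of tags within radius_m metres of the viewer,
        e.g. to overlay as information bubbles on the camera view."""
        return [label for t_lat, t_lon, label in self.tags
                if self._distance_m(lat, lon, t_lat, t_lon) <= radius_m]

    @staticmethod
    def _distance_m(lat1, lon1, lat2, lon2):
        """Haversine great-circle distance between two points, in metres."""
        r = 6_371_000  # mean Earth radius in metres
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

store = TagStore()
store.add_tag(35.6595, 139.7005, "Shibuya ramen shop")      # example tag
store.add_tag(35.6586, 139.7454, "Tokyo Tower gift stand")  # ~4 km away
print(store.tags_near(35.6596, 139.7006))  # only the nearby Shibuya tag
```

A real implementation would also need the phone's compass heading to place bubbles in the right part of the camera frame, but the proximity query is the core of the "tag the world" idea.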

Tonchidot, which was founded in 2007 by Takahito Iguchi, has a technology on its hands that could open the door to endless business opportunities. First of all, I’d gladly pay up to $10 for an app that lets me see the world like a robot. Second, imagine companies like Coca-Cola, or really just about anybody, paying a fee to tag their products, so that potential customers who adopt this next level of augmented reality can see information, or even a little advertisement, about whether to buy that Coke or Pepsi bottle. I think you catch my drift.

With the endless opportunities a technology like this can offer, it’s hard not to be excited about the future of augmented reality.

By the way, “Sekai Camera” translates to “World Camera” in English.

Here’s a video to see the application in action.  


Comment

Raphael Bennett
Raphael Bennett, on February 19, 2009

That's excellent! We're working on adding similar functionality inside of our WebWalk presentations. We're really excited to see how user-based input will affect our presentations. Very cool!

