Augmented Reality
Apple iPhone Apps reports on new iPhone features, crediting an anonymous leak from inside Apple. I would like to focus on one specific feature. They report, with skepticism:
- Revolutionary combination of the camera, GPS, compass, orientation sensor, and Google Maps

The camera will work with the GPS, compass, orientation sensor, and Google Maps to identify what building or location you have taken a picture of. We at first had difficulty believing this ability. However, such a "feature" is technically possible. If the next-generation iPhone were to contain a compass, then all of the components necessary to determine the actual plane in space of an image would be on board. The GPS would be used to determine the physical location of the device, the compass to determine the direction the camera was facing, and the orientation sensor to determine the orientation of the camera relative to gravity. Additionally, the focal length and focus of the camera could even assist in determining the distance of any focused objects in the picture. In other words, not only would the device know where you are, but it could determine how you are tilting it, and hence it would know EXACTLY where in space your picture was composed. According to our source, Apple will use this information to introduce several groundbreaking features. For example, if you were to take a picture of the Staples Center in Los Angeles, you would be provided with a prompt directing you to information about the building, its address, and/or the area, drawing on sources such as Wikipedia. This seems like quite an amazing service, and a little hard to believe; however, while the complexity of such a service may seem unrealistic, it is actually feasible with the sensors onboard the next-generation iPhone.
And why “unrealistic”? Every piece of this technology already exists in the wild. This is not a great technological leap. This is merely smart convergence.
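To make the leak's geometry concrete, here is a minimal sketch. Everything in it is my own illustration, not any Apple API; the function names, coordinates, and sensor values are assumptions. Given the GPS fix, the compass bearing, the tilt from the orientation sensor, and a focus distance inferred from the lens, you can project forward along the camera's line of sight and estimate the coordinates of the photographed subject:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def locate_subject(lat, lon, heading_deg, pitch_deg, focus_distance_m):
    """Estimate the lat/lon of a photographed subject.

    lat, lon          -- device position from GPS, in degrees
    heading_deg       -- compass bearing the camera faces (0 = north)
    pitch_deg         -- camera tilt above the horizon, from the
                         orientation sensor (0 = level)
    focus_distance_m  -- distance to the focused subject, inferred
                         from the lens focus position
    """
    # Horizontal component of the distance (altitude change ignored).
    ground_dist = focus_distance_m * math.cos(math.radians(pitch_deg))

    # Standard destination-point formula on a spherical Earth.
    d = ground_dist / EARTH_RADIUS_M          # angular distance
    brg = math.radians(heading_deg)
    lat1, lon1 = math.radians(lat), math.radians(lon)

    lat2 = math.asin(math.sin(lat1) * math.cos(d) +
                     math.cos(lat1) * math.sin(d) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# Illustrative: standing in downtown LA, camera facing southwest,
# focused on a building roughly 200 m away.
print(locate_subject(34.0444, -118.2650, 225.0, 5.0, 200.0))
```

Once the device knows that point, looking it up against Google Maps or Wikipedia is just a reverse-geocoding query.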
There are already two applications on the Google Android platform that have these features. One is a proof-of-concept called Enkin, developed by Max Braun and Rafael Spring, students of Computational Visualistics from Koblenz, Germany, currently doing robotics research at Osaka University in Japan. The second, Wikitude by Mobilizy (an Austrian company founded by Philip Breuss-Schneeweis and Martin Lechner), is already in full-blown commercial release.
WIKITUDE DEMONSTRATION: [embedded video]
ENKIN, PROOF-OF-CONCEPT: [embedded video]
It is only one short step further to let users geo-tag their photos. Many social photo/map applications available for the iPhone already incorporate such a feature. Building this into the real-time viewfinder would not be a great challenge. For example, the proof-of-concept for this already exists in the form of Microsoft's Photosynth (Silverlight browser plug-in required).
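The geo-tagging half of this needs no new hardware at all; the standard EXIF GPS fields already carry the fix. As a rough sketch (assuming the third-party Python library piexif, purely for illustration; any EXIF writer would do), stamping a photo with the device's position looks like this:

```python
import piexif

def deg_to_dms_rational(deg_float):
    """Convert decimal degrees to the EXIF degrees/minutes/seconds
    rational format: three (numerator, denominator) pairs."""
    deg_float = abs(deg_float)
    deg = int(deg_float)
    minutes_float = (deg_float - deg) * 60
    minutes = int(minutes_float)
    seconds = round((minutes_float - minutes) * 60 * 100)
    return ((deg, 1), (minutes, 1), (seconds, 100))

def geotag(jpeg_path, lat, lon):
    """Embed a GPS position into a JPEG's EXIF block, in place."""
    gps_ifd = {
        piexif.GPSIFD.GPSLatitudeRef: "N" if lat >= 0 else "S",
        piexif.GPSIFD.GPSLatitude: deg_to_dms_rational(lat),
        piexif.GPSIFD.GPSLongitudeRef: "E" if lon >= 0 else "W",
        piexif.GPSIFD.GPSLongitude: deg_to_dms_rational(lon),
    }
    exif_bytes = piexif.dump({"GPS": gps_ifd})
    piexif.insert(exif_bytes, jpeg_path)

geotag("photo.jpg", 34.0430, -118.2673)  # tag a shot of the Staples Center
```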
Social media apps could tap into this utility to network members in real space. At the most basic level, Facebook and/or LinkedIn apps could overlay nearby members with their names and profile information.
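A back-of-the-envelope sketch of that overlay (again in Python; the field of view and 320-pixel screen width are my assumptions, roughly matching the current iPhone): compute the bearing from me to the member, compare it to the compass heading, and map the difference onto a viewfinder pixel.

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Compass bearing (degrees from north) from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2) -
         math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def overlay_x(device_heading, bearing_to_member, fov_deg, screen_width_px):
    """Horizontal pixel for a member's name tag, or None if off-screen."""
    # Wrap the angular offset into [-180, 180) degrees.
    offset = (bearing_to_member - device_heading + 180) % 360 - 180
    if abs(offset) > fov_deg / 2:
        return None  # member is outside the camera's field of view
    return int(screen_width_px * (0.5 + offset / fov_deg))

# Illustrative: I'm at (34.0522, -118.2437) facing due west;
# a contact stands a block or so away.
b = initial_bearing(34.0522, -118.2437, 34.0525, -118.2460)
print(overlay_x(device_heading=270.0, bearing_to_member=b,
                fov_deg=53.0, screen_width_px=320))
```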
The next logical extension of this will be to place the information directly into your field of vision.
The OOH marketing opportunities are immense. Recent campaigns for General Electric in the US and the Mini Cooper in Germany show where this is going. Suddenly the work done by Wayne Piekarski at the University of South Australia's Wearable Computer Lab is no longer so SciFi (now being commercialized as WorldViz). At January's CES, Vuzix debuted their new 920AV model of eyewear, which includes an optional stereoscopic camera attachment to combine virtual objects with your real environment. Originally scheduled for a Spring release, their ship date has now been pushed back to Fall (their main competitor, MyVu, does not yet have an augmented reality model). If the trend finally takes, expect to see more partnerships with eyewear manufacturers.
Initially through the viewfinder of your smartphone, and eventually through the lens of your eyewear, augmentation will be the point of convergence for mobile web, local search, social media, and geo-targeted marketing. Whether Apple makes the full leap in one gesture with the release of their Next-Gen iPhone, or gets there in smaller steps, depends upon both the authenticity/accuracy of this leak and the further initiative of third-party software and hardware developers to take advantage of it. Innovation and convergence will be the economic drivers that reboot our economy.
EDIT: The only capability Apple actually needs to add to the iPhone in order for this proposed augmented reality to be implemented is a magnetometer (digital compass). Google Android models already have this component. Charlie Sorrel of WIRED Magazine’s Gadget Lab has separately reported this feature through leaks of a developer screen shot, and on May 22nd Brian X. Chen, also reporting for WIRED Magazine’s Gadget Lab, put the probability of a magnetometer being included in the new iPhone at 90%. Once the iPhone has an onboard compass, augmented reality features will begin to appear, whether through Apple’s own implementation or from third party developers.
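To see why the magnetometer is the one missing piece: its raw three-axis field reading only becomes a usable compass heading once it is tilt-compensated against the gravity vector from the accelerometer the iPhone already has. A sketch of that standard calculation (assuming one common body-frame axis convention, x forward, y right, z down; a real device's axes may need remapping):

```python
import math

def tilt_compensated_heading(ax, ay, az, mx, my, mz):
    """Compass heading (degrees from magnetic north) from raw
    accelerometer (ax, ay, az) and magnetometer (mx, my, mz) readings."""
    # Tilt angles recovered from the gravity vector.
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))

    # Rotate the magnetic field vector back into the horizontal plane.
    xh = (mx * math.cos(pitch) +
          my * math.sin(pitch) * math.sin(roll) +
          mz * math.sin(pitch) * math.cos(roll))
    yh = my * math.cos(roll) - mz * math.sin(roll)

    return math.degrees(math.atan2(-yh, xh)) % 360

# Device held level (gravity straight down), field pointing north and down:
print(tilt_compensated_heading(0.0, 0.0, 1.0, 0.3, 0.0, 0.5))  # ~0.0 = north
```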
UPDATE: Since the time of this writing, the iPhone 3GS has been released, and it does indeed include a magnetometer.
Reader Comments
I'm quite fond of disruptive technology, and this seems to be a prime candidate! While for years we have learned about all sorts of projects promising augmented reality "real soon now", these were insanely expensive and the devices were cumbersome and complex to use — good for the lab, impossible to be adopted by the mainstream.
Ironically, the best "crippled" augmented reality we have right now is geo-tagging and leaving notes/annotations on Flickr/YouTube... at least the interface is simple. But... the connection to the real world, in real time, is lost.
Now Steve Jobs just puts the ultimate user-friendly augmented reality device in everybody's pockets. The mind boggles with the possibilities!
... but actually, a mark of disruptive technology is that it becomes so quickly ubiquitous that we won't even remember a "life before that technology". I mean, does anyone ask where all the queries went before Google went live? :)
The iPhone is quickly becoming a Pandora's box full of nice surprises. I thought before that Jobs wanted an affordable PSP clone to take the game-console market by storm (since he never managed that with the Macs... in spite of the good graphics), and that breaking down the mobile telecom empire was just a side show. Now he promotes the iPhone as an augmented reality device. Mmmh. All smart mobile phones changed to a touch-screen interface in two years (sometimes improving on Apple's design); a few started looking at games much more seriously; now all will become augmented reality devices?
Connect these cool shades to your iPhone, and there you go — it looks just like the technology in any cool "recent-future" SF movie. And it won't take "years of development" or "millions to develop": it's here, now.
Now can I have Second Life running natively on my iPhone too? :) Please? :)
Not sure I believe this yet.
While the iPhone is certainly very capable of this stuff, the support for developers using video is lackluster. (Apparently that's what's stopping a Wikitude for the iPhone... to do AR on the iPhone decently you still need to hack away at it.) Given that's only a software issue they need to sort out, if they were pushing the iPhone as an AR device they would have done it already.
I suspect the iPhone will only dive into the AR scene wholesale once it's been proven elsewhere.
The big issue for me is not when AR will take off massively...it will, and soon.
But rather what form. Will we be stuck with individual applications, or will some standards emerge like with html?
Will we be able to overlay our world with many separate AR channels at once? Or will we be stuck "alt+tabbing" between them?
"Connect these cool shades to your iPhone, and there you go — it looks just like the technology in any cool "recent-future" SF movie."
Recent? Caprica is set thousands of years in the past :P
Dennou Coil is still the best example of the tech. And its 202X date looks more and more pessimistic.
Great post, Chris. The progression is certainly going to start with handheld "through the looking glass" devices, then projected eyewear, then retina projected eyewear and then presumably implants or some other biologically direct connection.
In practical terms for today, we have some way to go on processing power, portable power/batteries, over-the-air network speed, and other things that make this technology mainstream and convenient, but we are certainly going to see some amazing leaps forward in the next few years with this.
I remember walking around years ago with one of the first Bluetooth headsets getting strange glances from people thinking I was a crazy person talking to myself because there were no wires or a phone pressed to my face. I'm anxious to look crazy again with some of this next generation gear :)