When I discuss Location-Aware Mobile Augmented Reality with clients or friends, they are often initially mystified by how it works without using any form of tagging or QR codes. In short, this video is a visualization of the first conversation I usually have when the subject comes up. I've created it as a simple explanation to demystify the technology for those who are just becoming familiar with it.
The visual shown is not of any specific AR application; it is only meant to be a general representation of the underlying technology.
When, in the late 80s, Apple saw a need for a suite of business software for their new Macintosh platform, they decided to subcontract the development to a software company named Microsoft. The RFP was for three applications: a word processor, a spreadsheet, and a slideshow for business presentations. Much like their IBM deal for DOS, Microsoft proposed to develop the apps but retain ownership, for a good deal less money than selling them to Apple outright. In 1989 Microsoft released Office for the Macintosh, containing Word, Excel, and PowerPoint. A year later, they introduced the same package for their own OS, Windows. While Apple made a premium product with a proprietary OS, Microsoft developed their Windows OS with similar Mac-like graphical interface features, built to run on the same x86 (PC-compatible) architecture that had made their MS-DOS the default industry standard… and now running the same suite of business software available on the Mac. This story will, of course, be common knowledge to most readers, as it is among the most famous business parables in modern corporate history.
And history is known to repeat itself.
With the iPhone, Apple today reigns supreme. Just as the introduction of the Macintosh did to the personal computer market in 1984, the iPhone has single-handedly reshaped the mobile phone market since its introduction. But things are about to get ugly.
In January 2007 Steve Jobs introduced the iPhone. Within months, before the iPhone was even available for purchase, rumors had already begun to swirl that Google was in overdrive to develop a touchscreen mobile OS of its own, but one built on an open standard that could run on many different handsets. In spite of official denials, by fall photos of prototypes were beginning to leak. Though initially slow to get traction, Android is gathering momentum: over the next several months multiple vendors will be introducing new models, and Google Android clones are about to flood the market.
Motorola, once a dominant force in mobile phones, has lost so much market share it may abandon the mobile market altogether. It is betting the farm on Android. Or as Tal Liani put it, "Motorola has one bullet left in its gun." Just a few years ago the Motorola RAZR was the top-selling phone on the planet, and pwned its competitors in design awards as well. How quickly the mighty can fall.
The future is unwritten, and betting against Apple doesn’t look like a winner’s strategy. But then neither does betting against Google. One thing is for certain. The writing is on the wall, and Apple and Google are on a collision course in the mobile market that is about to look very familiar.
The name "iPhone" is a misnomer. It is not a phone. It is a pocket-sized computer that, among other features, also happens to include a phone. The Apple iPod Touch is sometimes portrayed as a crippled cousin of the iPhone: an iPhone somehow lacking its primary function. A more accurate analogy would be that the iPod Touch is a portable pocket computer, and the iPhone is a premium version of the iPod Touch that happens to have one extra feature. This is not entirely semantic. The iPod Touch can do everything the iPhone can do, including connected functions like browsing the web via a wifi connection. Even for iPhone users, a wifi connection is preferred for internet activity beyond a basic search. Most people who have an iPod Touch have wifi at home, and wifi at the office is now pretty well standard. It is standard campus-wide at every university. In every internet cafe. Every coffee shop. Many parks. Shopping malls… Wifi connectivity is on its way to becoming ubiquitous throughout many urban areas.
But the iPod Touch doesn't have a microphone… yet. At least not for, oh, another month or so. Leaks abound that, like the iPhone 3GS, the new iPod Touch will feature video, including both a camera and a microphone. The first iPod Touch with a microphone. Forget video; this opens the door to Skype-style IP telephony: internet calls over wifi.
A reasonable long-term strategy for Apple would be the elimination of the "phone" altogether. It is conceivable that the iPhone was merely the stop-gap all along. Use the carriers to gain market share; have two models: one with a phone, one without. Then, when the phone version reaches critical mass and wifi penetration meets critical mass, who needs the carrier anymore? At least in urban areas (where Apple sells the majority of their phones anyway). Need it for the wide spaces in between or simply en route? There's a solution for that. Thank you, AT&T, for subsidizing the cost until economies of scale could bring the price down to earth (don't complain, you made a good run of it).
Some may scoff at the idea that Apple would drop the phone version entirely. True, probably not anytime soon. But don’t be surprised if an iPod Touch with a microphone quickly begins to cannibalize iPhone sales. Recall that it wasn’t long ago that most scoffed at the idea of completely ditching landlines for mobile.
Apple iPhone Apps reports on new iPhone features, crediting an anonymous leak from inside Apple. I would like to focus on one specific feature. They report, with skepticism:
- Revolutionary combination of the camera, GPS, compass, orientation sensor, and Google Maps
The camera will work with the GPS, compass, orientation sensor, and Google Maps to identify what building or location you have taken a picture of. We at first had difficulty believing this ability. However, such a "feature" is technically possible. If the next-generation iPhone were to contain a compass, then all of the components necessary to determine the actual plane in space of a captured image would be present. The GPS would be used to determine the physical location of the device. The compass would be used to determine the direction the camera was facing. And the orientation sensor would be used to determine the orientation of the camera relative to gravity. Additionally, the focal length and focus of the camera could even assist in determining the distance of any focused objects in the picture. In other words, not only would the device know where you are, but it could determine how you are tilting it, and hence it would know EXACTLY where in space your picture was composed. According to our source, Apple will use this information to introduce several groundbreaking features. For example, if you were to take a picture of the Staples Center in Los Angeles, you would be provided with a prompt directing you to information about the building, address, and/or area. This information will draw on sources such as Wikipedia. This seems like quite an amazing service, and a little hard to believe; however, while such a service may sound unrealistic, it is actually feasible with the sensors onboard the next-generation iPhone.
And why “unrealistic”? Every piece of this technology already exists in the wild. This is not a great technological leap. This is merely smart convergence.
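To make the convergence concrete, here is a minimal sketch of how the sensor readings could be combined to decide whether a known landmark sits in the camera's field of view. This is not Apple's implementation, just an illustration: the coordinates are approximate, the field of view is a guessed value, and the function names are my own.

```python
import math

# A known landmark from a hypothetical places database (coordinates approximate).
STAPLES_CENTER = ("Staples Center", 34.0430, -118.2673)

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def in_view(device_lat, device_lon, compass_heading, landmark, fov_deg=55):
    """True if the landmark falls within the camera's horizontal field of view.

    device_lat/lon  -> from the GPS
    compass_heading -> from the magnetometer, degrees clockwise from north
    fov_deg         -> assumed horizontal field of view of the camera lens
    """
    name, lat, lon = landmark
    # Signed angle between where the camera points and where the landmark is.
    offset = (bearing_to(device_lat, device_lon, lat, lon) - compass_heading + 180) % 360 - 180
    return abs(offset) <= fov_deg / 2

# Standing near L.A. Live, pointing the camera roughly east:
if in_view(34.0442, -118.2708, 100.0, STAPLES_CENTER):
    print("Prompt the user with info about", STAPLES_CENTER[0])
```

Everything else (the Wikipedia lookup, the prompt) is ordinary database and UI work once this test passes.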
There are already two applications on the Google Android platform that have these features. One is a proof-of-concept called Enkin, developed by Max Braun and Rafael Spring (students of Computational Visualistics from Koblenz, Germany, currently doing robotics research at Osaka University in Japan). The second, Wikitude by Mobilizy (an Austrian company founded by Philip Breuss-Schneeweis and Martin Lechner), is already in full-blown commercial release.
WIKITUDE DEMONSTRATION:
ENKIN, PROOF-OF-CONCEPT:
It is only one short step further to let users geo-tag their photos. Many social photo/map applications available for the iPhone already incorporate such a feature. Building this into the realtime viewfinder would not be a great challenge; a proof-of-concept for this already exists in the form of Microsoft's Photosynth (Silverlight browser plugin required).
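A rough sketch of the viewfinder half of that idea, assuming a linear approximation of the lens projection and made-up screen and field-of-view values; a real implementation would also use the orientation sensor to correct for device roll:

```python
import math

def project_to_screen(bearing_offset_deg, pitch_offset_deg,
                      screen_w=320, screen_h=480,
                      h_fov=55.0, v_fov=70.0):
    """Map angular offsets from the camera axis to viewfinder pixels.

    bearing_offset_deg -> horizontal angle between the camera heading and the
                          geo-tagged point (from GPS + compass)
    pitch_offset_deg   -> vertical angle (from the orientation sensor)
    Returns (x, y) in pixels, or None when the point is out of frame.
    """
    if abs(bearing_offset_deg) > h_fov / 2 or abs(pitch_offset_deg) > v_fov / 2:
        return None
    # Linear mapping: center of screen is the camera axis.
    x = screen_w / 2 + (bearing_offset_deg / (h_fov / 2)) * (screen_w / 2)
    y = screen_h / 2 - (pitch_offset_deg / (v_fov / 2)) * (screen_h / 2)
    return (round(x), round(y))

# A tag 10 degrees right of center and 5 degrees above the horizon:
print(project_to_screen(10.0, 5.0))   # -> (218, 206) on a 320x480 screen
```

Draw the tag's label at that point on every frame and you have a realtime geo-tagged viewfinder.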
Social media apps could tap into this utility to network members in real space. At the most basic level, Facebook and/or LinkedIn apps could overlay nearby members with their names and profile information.
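At that basic level, the networking piece is little more than a proximity query. A minimal sketch, assuming a hypothetical list of member profiles with latitude/longitude fields (the record layout here is invented for illustration, not any real Facebook or LinkedIn API):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6371000  # mean Earth radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_members(me_lat, me_lon, members, radius_m=200):
    """Return (distance, member) pairs within the radius, nearest first."""
    hits = [(haversine_m(me_lat, me_lon, m["lat"], m["lon"]), m) for m in members]
    return sorted((pair for pair in hits if pair[0] <= radius_m),
                  key=lambda pair: pair[0])

# Hypothetical profile records a social app might fetch:
members = [
    {"name": "Alice", "title": "Designer", "lat": 34.0444, "lon": -118.2700},
    {"name": "Bob",   "title": "Engineer", "lat": 34.0500, "lon": -118.2500},
]
for dist, m in nearby_members(34.0442, -118.2708, members):
    print(f'{m["name"]} ({m["title"]}) is {dist:.0f} m away')
```

Feed each hit's coordinates through the same bearing-and-projection math as above and the name tag lands over the person in the viewfinder.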
The next logical extension of this will be to place the information directly into your field of vision.
The OOH marketing opportunities are immense. Recent campaigns for General Electric in the US and the Mini Cooper in Germany show where this is going. Suddenly the work done by Wayne Piekarski at the University of South Australia's Wearable Computer Lab is no longer so SciFi (it is now being commercialized as WorldViz). At January's CES, Vuzix debuted their new 920AV model of eyewear, which includes an optional stereoscopic camera attachment to combine virtual objects with your real environment. Originally scheduled for a Spring release, their ship date has now been pushed back to Fall (their main competitor, MyVu, does not yet have an augmented reality model). If the trend finally takes hold, expect to see more partnerships with eyewear manufacturers.
Initially through the viewfinder of your smartphone, and eventually through the lens of your eyewear, augmentation will be the point of convergence for mobile web, local search, social media, and geo-targeted marketing. Whether Apple makes the full leap in one gesture with the release of their next-gen iPhone, or gets there in smaller steps, depends upon both the authenticity and accuracy of this leak and the further initiative of third-party software and hardware developers to take advantage of it. Innovation and convergence will be the economic drivers that reboot our economy.
EDIT: The only capability Apple actually needs to add to the iPhone in order for this proposed augmented reality to be implemented is a magnetometer (digital compass). Google Android models already have this component. Charlie Sorrel of WIRED Magazine's Gadget Lab has separately reported this feature based on a leaked developer screenshot, and on May 22nd Brian X. Chen, also reporting for WIRED Magazine's Gadget Lab, put the probability of a magnetometer being included in the new iPhone at 90%. Once the iPhone has an onboard compass, augmented reality features will begin to appear, whether through Apple's own implementation or from third-party developers.
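For the curious, here is roughly what that one sensor contributes, as a hedged sketch rather than anything Apple has shown: the magnetometer's horizontal field components give the heading every example above depends on. The axis convention is assumed (x right, y up the screen), and the readings are made-up values.

```python
import math

def compass_heading(mx, my):
    """Heading in degrees clockwise from magnetic north, assuming the
    device is held level; mx points right, my points up the screen.

    A production implementation would first tilt-compensate the raw
    magnetometer vector using the accelerometer, and correct magnetic
    north to true north using the local declination (which the GPS
    position makes it possible to look up).
    """
    return math.degrees(math.atan2(-mx, my)) % 360

print(compass_heading(0.0, 40.0))    # facing north -> 0.0
print(compass_heading(-40.0, 0.0))   # facing east  -> 90.0
print(compass_heading(0.0, -40.0))   # facing south -> 180.0
```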
UPDATE: Since the time of this writing, the iPhone 3GS has been released, and it does indeed include a magnetometer.