Suppose you are a foreign tourist exploring Austria. You come across a magnificent castle, but it is closed to the public and no information seems to be available nearby. You would like to return the next day and take a look, but you need more information about the castle to decide whether it is worth the effort. How can you get that information?
You hold up your smartphone and take a snap of the castle. The built-in GPS figures out your location, and the photo you took is matched against images in a database. Soon, the castle is identified and detailed information is delivered to your phone – enough to let you decide whether you would like to come back the next day and explore the castle's interiors.
This is pretty much what “Augmented Reality” is all about. An image of the physical, real-world environment is augmented with additional information, drawn from sensory inputs such as images, sound, or GPS data, making it more useful to the user.
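Conceptually, the castle lookup described above boils down to two steps: filter candidate landmarks by GPS proximity, then rank the remaining candidates by how well they match the photo. Here is a minimal Python sketch of that idea; the landmark names, coordinates, and “feature vectors” are invented placeholders, and a real AR app would use an actual computer-vision feature extractor rather than hand-written numbers.

```python
import math

# Hypothetical landmark database: name -> (lat, lon, image feature vector).
# All entries are illustrative placeholders, not real data.
LANDMARKS = {
    "Castle A": (47.7950, 13.0477, [0.9, 0.1, 0.3]),
    "Castle B": (48.1845, 16.3122, [0.2, 0.8, 0.5]),
}

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two GPS points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def cosine_similarity(a, b):
    """Similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify_landmark(photo_features, gps, radius_km=5.0):
    """Keep only landmarks near the GPS fix, then pick the best image match."""
    lat, lon = gps
    candidates = [
        (name, cosine_similarity(photo_features, feats))
        for name, (clat, clon, feats) in LANDMARKS.items()
        if haversine_km(lat, lon, clat, clon) <= radius_km
    ]
    if not candidates:
        return None
    return max(candidates, key=lambda c: c[1])[0]
```

The GPS filter is what keeps the problem tractable: instead of matching the photo against every image the service has ever indexed, the app only has to compare it against the handful of landmarks within a few kilometres of where you are standing.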
Using Augmented Reality for contextual mLearning may sound futuristic – a phenomenon still being experimented with in universities and research centers across the world. Wikipedia defines Augmented Reality (AR) as a live, direct or indirect, view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics, or GPS data. It is rather hard for a non-technical person to grasp the implications of this definition in a learning situation.
However, when you come across a report that says
- 864 million high-end cell phones could be AR-enabled in 2014,
- 103 million automobiles will have AR technology by 2020 (Ref: Semico),
you can’t ignore Augmented Reality (AR). You have to stop and understand what it all means. That’s when I came across an article by Jason Haag, who spoke at DevLearn 13 on Augmented Reality in mobile learning. Published on the Advanced Distributed Learning website, it lists some cool examples of AR in action in the form of videos. The video linked below is one of them; it truly helped me conceive the idea of AR in a learning context.
By the end of the video, I had thought of several instances where such AR apps could be put to use for contextual mobile learning. Here are some of them.
- Help service engineers in the field troubleshoot a problem in a piece of machinery.
- Give product knowledge to sales personnel or highlight specific product features that help them sell better.
- Help customers familiarize themselves with a gadget they have recently purchased and get the most out of their investment.
- Assist tourists in navigating a city on their own, or educate them about the city’s various tourist spots.
- Support new employees as part of their induction training, helping them find their way around the new office environment. Treasure hunt apps could be a great idea for this purpose.
These are some preliminary ideas that crossed my mind, thanks to the inspirational video. Can you think of other situations where AR in contextual mobile learning could be a great idea? Do share your thoughts in the comments section below.