Virtual Narrator: A Mixed Reality Tour of Auburn Avenue



Auburn Avenue AR project demo

Virtual Narrator is an augmented-reality tour mobile app prototype that provides visitors with personal guided experiences for exploring historical and cultural attractions. Most attractions don't offer guided tours for tourists, and even when they do, the tours are often insufficient or inconvenient to access. For example, Sweet Auburn in Atlanta has a well-planned Martin Luther King Jr. historic site, including the MLK birth home and Ebenezer Baptist Church, but the ranger-led tours fill up quickly and don't allow advance registration. Virtual Narrator creates an interactive and vivid AR tour to assist the visit. It is designed for a wide range of scenarios, but this demonstration focuses on Sweet Auburn as a specific example.

In the overall experience, Virtual Narrator will:

  1. Present close-by locations available for tours
  2. Navigate you to a location of your interest
  3. Notify you when a location of interest is nearby
  4. Present a virtual tour with an AR virtual tour guide

Users: Tourists of all ages visiting Atlanta's Auburn Avenue who have smartphones

Problem Space: Open-air tourism, mixed reality design

Skills used: Wireframing, On-site research, Affinity mapping, Storyboarding, User journey design and walkthrough, JavaScript.

Team and Role: We worked on this project for 3 months as a 5-member team. I was chiefly responsible for user journey design, feature design, and storyboarding, and also aided in research and programming.

experience walkthrough:

The aim of the app is to give the user the digital experience of a life-like virtual companion who shows them a guided tour of an attraction. The user should be able to interact with the companion virtually just as they would in real life. The diagram here outlines the flow through the basic functioning of the prototype we implemented. The dotted oval marks the step where we register the AR video with the real world: this is where our virtual narrator comes to life, so the user can interact with him while he gives them a tour of various points of interest.


user interactions and their design decisions:

  • On first opening the app, the user sees a map of their vicinity and a scrolling bar of images corresponding to tour 'hotspots', giving them a better idea of what they might be interested in if they have not done any prior research.
  • Places are also marked by popularity, based on how many other people have visited the area or taken the tour.
  • When a place of interest is chosen, the map routes the user.
  • When the user enters a 20-foot radius of the destination, their phone vibrates and prompts them to raise it. The camera is activated, and the user sees an image of the background they need to align their phone with, to ensure that the context is set correctly for the virtual narrator.
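The proximity trigger in the last step can be sketched with standard browser APIs. This is a minimal illustration under our own naming (`haversineMeters`, `watchDestination`, and the threshold constant are ours for this sketch, not the prototype's actual code):

```javascript
const EARTH_RADIUS_M = 6371000;
const TRIGGER_RADIUS_M = 6.1; // ~20 feet

// Great-circle distance between two lat/lon points (haversine formula).
function haversineMeters(lat1, lon1, lat2, lon2) {
  const toRad = (d) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
}

// Watch the device position; once inside the ~20-foot radius, vibrate the
// phone (where supported) and hand off to the "raise your phone" prompt.
function watchDestination(dest, onArrive) {
  return navigator.geolocation.watchPosition((pos) => {
    const d = haversineMeters(
      pos.coords.latitude, pos.coords.longitude,
      dest.latitude, dest.longitude
    );
    if (d <= TRIGGER_RADIUS_M) {
      if (navigator.vibrate) navigator.vibrate(200);
      onArrive(d);
    }
  });
}
```

The 20-foot radius keeps GPS noise from firing the prompt too early while still triggering before the user walks past the point of interest.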

content creation:

To write the script for the tour, we took inspiration from the National Historic Site tours of the MLK Jr. birth home as well as literature on both the home and Ebenezer Baptist Church. In addition to the NHS literature, we looked at online materials describing the history of the Williams and King families and of Ebenezer, and produced two scripts of roughly 3 minutes, one for each location. The scripts balanced factual details with fun facts and also gave insights into MLK's personal life, to help the audience connect better.

With the help of an actor friend and equipment borrowed from the Georgia Tech Communications Center and LMC Films, we shot two green-screen videos, which we later edited to produce the combined composite and alpha-channel image that could be used to place our “tour guide” into the Argon browser’s camera view.
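The compositing step above amounts to a standard per-pixel "over" blend: the alpha-channel half of the frame says how much of the guide's color to mix over the live camera pixel. A minimal sketch of that blend (our own illustration, not the project's actual shader code):

```javascript
// Per-pixel "over" compositing: C = a * F + (1 - a) * B.
// fg and bg are [r, g, b] arrays in 0..255; alpha is 0..1, sampled from
// the alpha-channel half of the composite video frame.
function compositeOver(fg, alpha, bg) {
  return fg.map((f, i) => Math.round(alpha * f + (1 - alpha) * bg[i]));
}
```

Alpha 1 keeps the tour guide's pixel, alpha 0 keeps the camera pixel, and intermediate values blend the guide's edges smoothly into the scene, which is why the keyed video needs a separate alpha channel rather than a hard cut-out.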





tools used:

  • Argon: AR-enabled web browser developed at the Georgia Institute of Technology, which we used to display the AR content in our app. We ran the browser on our mobile phones to create the prototype.

  • Three.js: JavaScript 3D library that simplifies WebGL.

  • Vuforia: AR SDK for mobile devices that uses computer vision to recognize and track planar images and simple 3D objects in real time. We used this in the first attempt at our prototype.

  • Cesium: web-based globe library for tracking and calculating geolocated coordinates. We used this for geolocation.

  • Premiere Pro: video editing software in which we processed and edited our green-screen video.

future directions and applications:

  • Addition of speech-recognition interactive elements for more natural communication with the guide
  • Addition of more virtual elements to support the story, like props, etc.
  • Use of actor costumes to give better context to the stories
  • Collaboration between users in a group with the same instance of the virtual narrator, linked through the app, to simulate multiple people communicating with the same tour guide.
  • Future deployment in museums, theme parks and for other educational purposes.