Marker-based Augmented Reality by Layla Gordon

Introduction

In marker-based augmented reality, a symbol, a pattern, a barcode such as a QR code, or even the three-dimensional shape of a physical object is used as a target to trigger an AR event and visualise data that the user cannot normally see. This technique can enhance or augment the shape of an object, present a three-dimensional representation, reveal a digital dashboard, or show metadata and attribution related to the real-world object.

Attribution can also be added to this AR view, edited, shared with the community over the internet and kept in sync with the real world. Again, this has many possible applications, but in this paper we look at the GIS domain to stay relevant to Ordnance Survey data and its customers.

 

Enhanced paper maps

Printed maps as we know them are a merged view of many layers of topographical data about the land. Cartography is the technique of presenting this data using different shapes, line thicknesses and colours. Map reading is the skill of isolating these layers and interpreting the information – and even the patterns within it.

Paper, thanks to its convenience, versatility, security, cost, scale and reliability, is still the favoured medium for presenting mapping information over large areas.

However, AR can be used to 'de-clutter' maps: it can show only the information the user is interested in and hide what is not required.

The other use case is where the pattern of the map, thanks to its uniqueness, is used as an AR target to reveal extra information that is not printed on the map, such as weather, traffic, or even temporal events such as concerts and group activities.

This was developed and demonstrated by the author together with the Southampton University computer vision research department in a collaborative project called MapSnapper, which was featured in New Scientist in 2006 and recently hailed by The Register in a featured article about Ordnance Survey during National Map Reading Week in October 2017.

The methodology used computer vision algorithms such as SIFT to create a geographically indexed dataset of map tiles, each with a unique identifying visual signature derived from salient features in raster OS Explorer maps. When the user takes a picture of the map, the photo is matched against this index to identify the location, and the relevant points of interest are then returned and overlaid on the image of the map.
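A minimal sketch of this tile-matching idea, not the MapSnapper implementation itself: OpenCV SIFT descriptors are computed once per georeferenced map tile, and a user photo is matched against that index with a ratio test. The file names, the tiles dictionary and the ratio threshold below are illustrative assumptions.

import cv2

sift = cv2.SIFT_create()
matcher = cv2.BFMatcher()

def signature(image_path):
    # Compute SIFT keypoints and descriptors for one raster map tile.
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    return sift.detectAndCompute(img, None)

# Hypothetical index: tile id -> (keypoints, descriptors, bounding box in national grid metres).
tiles = {
    "SU41NE": (*signature("tiles/SU41NE.png"), (445000, 115000, 450000, 120000)),
    "SU41NW": (*signature("tiles/SU41NW.png"), (440000, 115000, 445000, 120000)),
}

def locate(photo_path, ratio=0.7):
    # Return the tile whose signature best matches the user's photo of the paper map.
    _, query_desc = signature(photo_path)
    best_tile, best_score = None, 0
    for tile_id, (_kp, desc, bbox) in tiles.items():
        matches = matcher.knnMatch(query_desc, desc, k=2)
        # Lowe's ratio test keeps only distinctive correspondences.
        good = [m for m, n in matches if m.distance < ratio * n.distance]
        if len(good) > best_score:
            best_tile, best_score = (tile_id, bbox), len(good)
    # Points of interest inside the returned bounding box would then be fetched and drawn over the photo.
    return best_tile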

Another prototype of marker-based AR using cartographic features as the marker was an iOS app made by the author to work with the Mars map published by Ordnance Survey. It was very successful in recognising the map pattern and used Qualcomm's Vuforia technology. As the user points the device's camera at the paper Mars map, an AR experience is triggered and a 3D model of Mars is overlaid on top of the paper representation. The 3D model was produced by the OS cartographic team using elevation data from the MOLA instrument on the Mars Global Surveyor (MGS), supplied by NASA/JPL/GSFC at a resolution of approximately 463 metres per pixel.
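As a rough illustration of the planar image-target tracking that toolkits such as Vuforia perform under the hood (this is not Vuforia's API, and a pre-rendered still of the 3D model stands in for the live render), the printed map can be located in each camera frame by feature matching and a homography, and the augmentation warped into place. File names and thresholds are assumptions.

import cv2
import numpy as np

orb = cv2.ORB_create(2000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)

target = cv2.imread("mars_map_target.png", cv2.IMREAD_GRAYSCALE)
overlay = cv2.imread("mars_render.png")  # pre-rendered view standing in for the 3D model
target_kp, target_desc = orb.detectAndCompute(target, None)
h, w = target.shape

def augment(frame):
    # Overlay the rendered content wherever the paper map appears in the camera frame.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    kp, desc = orb.detectAndCompute(gray, None)
    matches = sorted(matcher.match(target_desc, desc), key=lambda m: m.distance)[:100]
    if len(matches) < 15:
        return frame  # target not found in this frame
    src = np.float32([target_kp[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return frame
    # Warp the augmentation into the frame and paste it over the detected map region.
    warped = cv2.warpPerspective(cv2.resize(overlay, (w, h)), H, (frame.shape[1], frame.shape[0]))
    mask = cv2.warpPerspective(np.full((h, w), 255, np.uint8), H, (frame.shape[1], frame.shape[0]))
    frame[mask > 0] = warped[mask > 0]
    return frame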

The original paper map sheet is a topographic base map, based largely on elevation data from satellite imagery, printed at a scale of 1 to 4 million (1:4 000 000) and measuring 980 by 840 mm. It represents an area of Mars of 3672 x 2721 km, which is similar in size to the United States of America (USA).
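As a rough check of the stated scale (assuming the mapped extent sits inside the sheet's printed margins rather than filling it), dividing the ground dimensions by the scale factor gives the printed extent:

3672 km / 4 000 000 ≈ 918 mm and 2721 km / 4 000 000 ≈ 680 mm,

which fits within the 980 by 840 mm sheet once margins and marginalia are allowed for.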

Figure 1: Snapshot of the Ordnance Survey topographic Mars Augmented Reality app during a demo.
Figure 2: Variation of the Mars Augmented Reality app in action at an ESRI conference stand, where the map's creator, cartographer Christopher Wesson [pictured], demos the app to visitors.

 

Walking trails unlocking geospatial data with smart symbols

Another concept the author is currently exploring with one of the OS-sponsored startups is an interactive walking-trail augmented reality app for kids and families exploring the wildlife of England and learning about the animals they reveal with the app. Unlike location-based AR, physical posters with pictures placed in woodlands are used as targets to unlock the digital content, in this case a 3D animated cartoon animal, and overlay it on top of the real-world sign. The app allows kids to take a photo and currently features four animals, with fact sheets included in the app.

Figure 3: Snapshot of the Pocket Pals Trail app running on an iPhone, showing an animated digital butterfly and a tree trunk hiding the physical sign behind them.
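A minimal sketch, not the Pocket Pals implementation, of how several poster targets could be registered and resolved to their animal content: each poster image gets a feature signature, and the best-matching poster in the camera frame unlocks its model and fact sheet. Poster and asset file names, animal names and the match threshold are assumptions.

import cv2

orb = cv2.ORB_create(1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# Hypothetical registry: one poster target per animal plus the content it unlocks.
posters = {
    "butterfly": ("posters/butterfly.png", "models/butterfly.usdz", "facts/butterfly.txt"),
    "badger":    ("posters/badger.png",    "models/badger.usdz",    "facts/badger.txt"),
}
signatures = {
    name: orb.detectAndCompute(cv2.imread(path, cv2.IMREAD_GRAYSCALE), None)[1]
    for name, (path, _, _) in posters.items()
}

def unlock(frame_gray, min_matches=40):
    # Return the (model, fact sheet) for the poster seen in the camera frame, if any.
    _, desc = orb.detectAndCompute(frame_gray, None)
    best_name, best_count = None, 0
    for name, target_desc in signatures.items():
        count = len(matcher.match(desc, target_desc))
        if count > best_count:
            best_name, best_count = name, count
    if best_count < min_matches:
        return None  # no poster recognised; a real app would also verify geometry
    _, model, facts = posters[best_name]
    return model, facts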

 


Sourced from iLRN 2019 London Workshop, Long and Short Paper, Poster, Demos, and SSRiP Proceedings from the Fifth Immersive Learning Research Network Conference (e-book).
http://diglib.tugraz.at/ilrn-2019-london-workshop-long-and-short-paper-and-poster-proceedings-from-the-fifth-immersive-2019