Location-based Augmented Reality by Layla Gordon

Introduction

In this type of augmented reality, the location of an object in the real world becomes the key that unlocks the digital data attributed to it. This data is usually invisible to the naked eye and is revealed by augmented reality technology on a smartphone or tablet, or through headwear such as HoloLens or Magic Leap.

This type of AR has many use cases and applications involving geospatial data. A historical parallel is the use of stars for navigation. It is also a good example of a rising computer science paradigm called Spatial Computing, in which information and data exist in all three dimensions rather than just on a monitor. It has been reported that this method increases data consumption by 30%, as it engages human peripheral vision, which is highly sensitive because its role is to alert us when a dangerous event is approaching but not yet in our full view.

 

Indoor navigation

During 2017, we carried out a case study on navigation at Southampton General Hospital, which revealed that 57% of visitors find it 'difficult', and 21% 'extremely difficult', to find their way around hospitals. This contributes to many missed appointments, not just at this south-coast hospital but across the country. According to an NHS report, around 6.9 million outpatient hospital appointments are missed each year in the UK, costing an average of £108 per appointment.

An augmented reality arrow overlaid on a live camera view on a phone, together with a thumbnail of where you are in the building, could help not just the millions of hospital visitors each year but also staff in emergency situations. Even among hospital staff, 60% find navigation 'difficult', which could add delay in a life-or-death situation.

Figure 1: UI prototype of an indoor navigation app powered by iBeacons for patients, visitors and staff.

It doesn't help that complex buildings such as these are harder to navigate under stressful conditions. Stress and its associated hormones are known to influence the function of the hippocampus, a brain structure critical for cognitive-map-based, allocentric spatial navigation, as shown in a 2015 study by the Department of Psychology at the University of Victoria in Canada.

We therefore propose turn-by-turn navigation for patients and staff to improve wayfinding.
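As a minimal sketch of the idea, the snippet below turns a routed sequence of corridor waypoints into the left/right cues an AR arrow could display. The waypoints, angle threshold and 2D floor-plan frame are illustrative assumptions, not Ordnance Survey's implementation:

```python
import math

def turn_instructions(waypoints):
    """Convert a routed corridor path ((x, y) waypoints in metres) into
    simple turn-by-turn cues for an AR arrow overlay."""
    instructions = []
    for i in range(1, len(waypoints) - 1):
        (x0, y0), (x1, y1), (x2, y2) = waypoints[i - 1], waypoints[i], waypoints[i + 1]
        # Signed angle between the incoming and outgoing corridor segments.
        angle = math.degrees(
            math.atan2(y2 - y1, x2 - x1) - math.atan2(y1 - y0, x1 - x0))
        angle = (angle + 180) % 360 - 180      # normalise to (-180, 180]
        if angle > 30:
            instructions.append(f"At waypoint {i}, turn left")
        elif angle < -30:
            instructions.append(f"At waypoint {i}, turn right")
        else:
            instructions.append(f"At waypoint {i}, continue straight")
    return instructions

# Example corridor: head east, then north (a left turn in this x-east/y-north frame).
print(turn_instructions([(0, 0), (10, 0), (10, 8)]))
```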

This technology, together with iBeacon-based indoor positioning, was used to produce an app for Digital Shoreditch, an event Ordnance Survey sponsored in 2017. Visitors used the app to navigate the venue, network with other attendees and receive live alerts about the talks.
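The positioning step itself can be sketched simply: the handset ranges nearby beacons, converts signal strength to an approximate distance, and averages the known beacon positions with inverse-distance weights. The beacon layout, calibration constants and log-distance path-loss model below are illustrative assumptions; production systems typically rely on calibrated models or fingerprinting:

```python
# Known beacon positions on the venue floor plan (metres), keyed by beacon ID.
BEACONS = {
    "beacon-01": (0.0, 0.0),
    "beacon-02": (10.0, 0.0),
    "beacon-03": (5.0, 8.0),
}

def estimate_position(rssi_readings, measured_power=-59, path_loss_n=2.0):
    """Weighted-centroid position estimate from iBeacon RSSI readings.

    rssi_readings: {beacon_id: rssi_dbm}. Distance is approximated with the
    log-distance path-loss model (measured_power is the calibrated RSSI at
    1 m), then beacon positions are averaged with inverse-distance weights.
    """
    total_w, x, y = 0.0, 0.0, 0.0
    for beacon_id, rssi in rssi_readings.items():
        bx, by = BEACONS[beacon_id]
        distance = 10 ** ((measured_power - rssi) / (10 * path_loss_n))
        weight = 1.0 / max(distance, 0.1)      # nearer beacons count for more
        total_w += weight
        x += weight * bx
        y += weight * by
    return (x / total_w, y / total_w)

print(estimate_position({"beacon-01": -65, "beacon-02": -75, "beacon-03": -70}))
```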

Another example of a complex building with stressed users is an airport. Although an open-plan layout makes the building less complex, users still suffer the reduction in navigation skills discussed above. Ordnance Survey has produced proof-of-concept apps using iBeacons and AR to showcase the benefits of this type of application in three different areas:

Airport utility/asset management staff

Here, fixed and movable assets can be located and visualised by airport utility staff. Items such as defibrillators, which because of their unit price may be shared between nearby buildings or even misplaced, must be locatable as quickly as possible when responding to a medical emergency. Likewise, a fire hydrant cabinet, as displayed below, can be highlighted for rapid access.
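A minimal sketch of the underlying lookup follows, assuming a hypothetical asset register keyed by asset type; a real deployment would query a live asset database rather than a hard-coded table:

```python
import math

# Hypothetical asset register: id -> (x, y) position in metres on the floor plan.
ASSETS = {
    "defib-1": (4.0, 12.0),
    "defib-2": (60.0, 3.0),
    "hydrant-cabinet-1": (22.0, 9.0),
}

def nearest_asset(position, asset_type):
    """Return the closest asset whose ID starts with the given type prefix."""
    candidates = {k: v for k, v in ASSETS.items() if k.startswith(asset_type)}
    return min(candidates.items(),
               key=lambda item: math.dist(position, item[1]))

# From the staff member's current indoor position, find the nearest defibrillator.
print(nearest_asset((10.0, 10.0), "defib"))   # ('defib-1', (4.0, 12.0))
```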

Figure 2: Concept design for a prototype location-based AR asset management app in an airport for utility managers.
Passengers

Here, as seen below, the airport's or airline's app helps passengers navigate to the gate, either via the shops or via the quickest route. The app also shows alerts from the flight information system to keep passengers informed as they make their way to the departure lounge.

The arrow is again used for simplicity of user experience and an easy-to-follow methodology. The technique has a long history: hospital corridors used to be painted with coloured lines that visitors could follow to reach a certain destination. Here the painting is done digitally, delivering a custom route for the individual using the app that takes into account their destination and even accessibility limitations, such as the need to avoid steps.
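A sketch of how that accessibility filtering might work, assuming an illustrative airport graph with a has_steps flag on each edge (the nodes, distances and flags are invented for the example):

```python
import heapq

# Edges: (from, to, metres, has_steps)
EDGES = [
    ("security", "gate_12", 300.0, True),    # shortcut via stairs
    ("security", "shops",   200.0, False),
    ("shops",    "gate_12", 250.0, False),
]

def route(start, goal, avoid_steps=False):
    """Dijkstra shortest path, optionally dropping stepped segments."""
    graph = {}
    for a, b, dist, steps in EDGES:
        if avoid_steps and steps:
            continue                          # exclude stairs for this user
        graph.setdefault(a, []).append((b, dist))
        graph.setdefault(b, []).append((a, dist))
    queue, seen = [(0.0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt, d in graph.get(node, []):
            heapq.heappush(queue, (cost + d, nxt, path + [nxt]))
    return None

print(route("security", "gate_12"))                    # via stairs, shorter
print(route("security", "gate_12", avoid_steps=True))  # via shops, step-free
```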

Figure 3: Concept design of a prototype augmented reality app for navigation in a smart airport.
Autonomous baggage handling systems

Another use case for this type of AR in the airport is for baggage-handling staff in a smart airport that has autonomous pods for loading and offloading passengers' baggage onto and from aircraft. As in any other IoT system, a cloud service feeds data to an augmented reality platform for visualisation in an app. QR codes can be placed on each pod's platform to identify the load. The picture below shows an AR-view dashboard that helps staff offload baggage from the conveyor belt.
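The data flow could look like the sketch below, which polls a hypothetical REST endpoint and reduces the response to the rows an AR dashboard would overlay next to each bag. The URL, JSON fields and flight number are invented for illustration, not a real baggage-handling API:

```python
import requests

FEED_URL = "https://example.com/baggage-api/v1/pods"   # hypothetical endpoint

def dashboard_rows(flight):
    """Fetch live pod assignments and keep only the fields the AR
    dashboard overlays on the conveyor-belt view."""
    pods = requests.get(FEED_URL, params={"flight": flight}, timeout=5).json()
    return [
        {
            "bag_tag": pod["qr_code"],          # scanned from the bag's QR label
            "pod_id": pod["pod_id"],
            "status": pod["status"],            # e.g. "loading", "offloading"
            "belt_position_m": pod["belt_position_m"],
        }
        for pod in pods
    ]

for row in dashboard_rows("BA1326"):
    print(row)
```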

Figure 4: AR dashboard for the baggage handler staff connected to an autonomous baggage handling system.

 

Points of interest discovery

When planning to explore a place we have never been to, we always look at a map, which shows us the places and information we need to plan the trip. Once we are there, we carry on using the map for the big picture of where everything is, but another way to engage with those points of interest is to be alerted about them as we walk around, and to be able to tap one that comes into our viewpoint to see more information about it.

An example of this is the AR feature already released within the OS Maps app, used by walkers to discover nature trails. The algorithm was developed in-house by OS Labs, so there is no reliance on external SDKs for the augmented reality technology.
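The in-house algorithm is not published, but the heart of any such AR view is deciding which points of interest fall inside the camera's field of view. A minimal sketch, using an equirectangular approximation that is adequate over walking distances (the coordinates, field of view and frame conventions are illustrative):

```python
import math

def poi_in_view(user_lat, user_lon, heading_deg, poi_lat, poi_lon, fov_deg=60):
    """Return (visible, bearing) for a point of interest relative to the
    device camera. Uses an equirectangular approximation, which is fine
    for the short distances involved in trail walking."""
    d_lat = poi_lat - user_lat
    d_lon = (poi_lon - user_lon) * math.cos(math.radians(user_lat))
    bearing = math.degrees(math.atan2(d_lon, d_lat)) % 360   # 0 = north
    offset = (bearing - heading_deg + 180) % 360 - 180
    return abs(offset) <= fov_deg / 2, bearing

# Facing north-east (45 degrees): does a nearby POI fall in the 60-degree cone?
print(poi_in_view(50.9097, -1.4044, 45.0, 50.9099, -1.4036))
```

A visible POI would then be drawn at a horizontal screen position proportional to its bearing offset from the device heading.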

This feature won the 2018 Yahoo Sports Technology Award for best use of AR technology, and it continues to help walkers discover points of interest on outdoor trails.

Figure 5: Snapshot of Ordnance Survey Leisure OS Maps app AR view.

There is also the scenario of city navigation and healthy routes, which has the potential to benefit from IoT sensors and connected information, with artificial intelligence used for data discovery and delivery. Imagine a person with a health condition such as asthma taking a walk through a large, polluted city centre. Sensors within the city constantly monitor pollution levels in its busiest parts, and since this is an IoT scenario, APIs and cloud services are available for accessing the live readings from the sensors, and perhaps predicted values too.

Figure 6: Concept prototype of an urban ride-sharing and health-optimised routing app for smart-city citizens.

The smart navigation app combines the user's health conditions with the real-time values to deliver a visualisation of location-based points of interest, helping to find the least polluted route for the user to take.

All of this is powered by AI and IoT in a smart city.
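As a sketch of how live readings and a health profile could be combined, the snippet below scores candidate routes by length-weighted PM2.5 exposure scaled by a sensitivity factor. The sensor values, route segments and scoring formula are all illustrative assumptions, not a deployed system:

```python
# Hypothetical live sensor readings (PM2.5, micrograms/m3) keyed by street segment.
LIVE_PM25 = {"high_street": 38.0, "park_path": 9.0, "riverside": 12.0}

# Candidate routes as lists of (segment, length_km).
ROUTES = {
    "direct":  [("high_street", 1.2)],
    "greener": [("park_path", 0.9), ("riverside", 0.7)],
}

def exposure_score(route, sensitivity):
    """Exposure proxy: sum of segment length x PM2.5, scaled by the user's
    health sensitivity (e.g. higher for an asthma sufferer)."""
    return sensitivity * sum(LIVE_PM25[seg] * km for seg, km in route)

def least_polluted(sensitivity=2.0):
    return min(ROUTES, key=lambda name: exposure_score(ROUTES[name], sensitivity))

print(least_polluted())   # 'greener' wins despite being the longer route
```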

Connected vehicles can also play a part in this scenario, where a lift is available in real time from a friend or from a ride-sharing service such as Uber or Lyft.

Most large cities currently suffer from increased pollution due to the large number of single-passenger commutes. The simplest remedy would be to use ride-sharing apps to reduce the number of cars on work journeys, since one geographical destination is shared by many users. However, one barrier to widespread use is the population's perception that these services lack flexibility. Many people's work patterns include shifts and irregularities, hence the reluctance to make fixed ride-sharing arrangements with colleagues. Another factor is that families cannot rely on fixed arrangements in emergencies, such as when a young child falls ill.

Augmented reality and real-time ride sharing can remedy this by enabling an on-demand service where people can simply 'e-hail' a ride on the spot.

 

Emergency services incident management

This is another group of users under stress, whose navigation can also be hampered by poor visibility due to smoke, helmets or low lighting. AR can make buildings smarter by highlighting fire-safety routes, electricity junctions, wiring, exits, and even who is attending which function in which room. It also helps collaboration between different fire crews by showing them where everyone else is in real time.

In future, when head-up displays can be mounted on safety helmets, this information will be added to the HUD, for example to show where a vulnerable person is on a particular floor of a high-rise as the crew approach while still outdoors, without the need to take out a map, speeding up the rescue response.

Figure 7: Concept design of a location-based AR app used indoors to reveal 'beyond the wall' features of the building that the human eye cannot normally see. Positional accuracy of 2-3 metres is achieved using over 30 iBeacons installed in the basement of Shoreditch Town Hall.

It can also be used to alert the emergency services to hazards such as chemical spillages, and to help visualise this data in relation to real-world buildings and roads.

Figure 8: Snapshot of Ordnance Survey partner Aligned Assets' Symphony AR app for local authorities.

 

Underground utility asset management

This is where augmented reality technology can be the holy grail for field workers and utility inspectors. In this scenario, AR gives utility staff 'X-ray' vision for visualising buried assets, along with a dashboard of readings from wireless IoT sensors inside the pipes and junctions.

Here AR increases operational intelligence as well as safety, adding value and saving both money and lives.

Hazards can be detected quickly and accurately with this 'X-ray' vision capability. It also remedies the difficult task of correlating asset information, usually displayed on a map at a different scale, with the real world at real-world scale.

There are already off-the-shelf apps such as Augview that deliver this type of AR specifically for utility services, but this of course relies on accurate geospatial data being available for the assets, and the device must be capable of centimetre-accurate positioning. Buried-asset data can be captured using ground-penetrating radar, while satellite positioning with surveying-grade antennas and differential GPS can provide a centimetre-accurate position of the user in x, y and z.
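A minimal sketch of the 'X-ray' overlay geometry: given an asset's offset from the device in east/north/up metres (up negative for buried depth) and the device heading, project it through a simple pinhole camera into screen pixels. The intrinsics and level-camera assumption are illustrative; a real app would take the full pose from its AR framework and the survey-grade positioning described above:

```python
import numpy as np

def project_asset(offset_enu, heading_deg, fx=1000.0, fy=1000.0,
                  cx=540.0, cy=960.0):
    """Project an asset's offset from the camera (east, north, up in metres)
    into pixel coordinates, assuming a level camera rotated heading_deg
    clockwise from north."""
    e, n, u = offset_enu
    h = np.radians(heading_deg)
    # Camera frame: x right, y down, z forward (along the heading).
    x = e * np.cos(h) - n * np.sin(h)
    z = e * np.sin(h) + n * np.cos(h)
    y = -u
    if z <= 0:
        return None           # behind the camera, nothing to draw
    return (fx * x / z + cx, fy * y / z + cy)

# A pipe junction 5 m ahead (due north), 1.5 m below ground, camera facing north:
print(project_asset((0.0, 5.0, -1.5), heading_deg=0.0))   # (540.0, 1260.0)
```

The projected point lands below the image centre, which is exactly where a buried asset should appear when the phone is held level.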

Figure 9: Concept design of a prototype app for AR visualisation of buried assets and the attribution of each individual physical item.

 


Sourced from iLRN 2019 London Workshop, Long and Short Paper, Poster, Demos, and SSRiP Proceedings from the Fifth Immersive Learning Research Network Conference (e-book).
http://diglib.tugraz.at/ilrn-2019-london-workshop-long-and-short-paper-and-poster-proceedings-from-the-fifth-immersive-2019