In this type of augmented reality, the location of an object in the real world becomes the key that unlocks the digital data attributed to it. This data is usually not visible to the naked eye, but is revealed by augmented reality technology in a smartphone, tablet, or headset such as HoloLens or Magic Leap.
This type of AR has many use cases and applications with geospatial data. A historical parallel is the use of stars for navigation. It is also a very good example of a rising computer science paradigm called Spatial Computing, where information and data exist in all three dimensions rather than just on a monitor. This method has been reported to increase data consumption by 30%, because it engages our peripheral vision, which evolved to be highly sensitive so that we are alerted when a dangerous event approaches while still outside our full view.
When planning to explore a place we have never been to, we always look at a map, which shows us the places and information we need to plan our trip. Once we are there, although we will carry on using the map to see the big picture of where everything is, another way to engage with those points of interest is to be alerted about them as we walk around and one enters our view, and to be able to tap through for more information.
An example of this is an AR feature already released within the OS Maps app, used by walkers to discover nature trails. The algorithm was developed in-house by OS Labs, so there is no reliance on external SDKs for the augmented reality technology.
This feature won the 2018 Yahoo Sports Technology Award for best use of AR technology, and it continues to help walkers discover points of interest on outdoor trails.
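The details of OS Labs' in-house algorithm are not public, but the core of any such overlay is deciding whether a point of interest falls inside the camera's current field of view. A minimal sketch, assuming latitude/longitude coordinates, a compass heading, and an illustrative 60° horizontal field of view:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the user to a point, in degrees (0 = north)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def poi_in_view(user_lat, user_lon, heading_deg, poi_lat, poi_lon, fov_deg=60):
    """True when the point of interest lies inside the camera's horizontal field of view."""
    # Signed angle between the device heading and the POI, wrapped to [-180, 180)
    delta = (bearing_deg(user_lat, user_lon, poi_lat, poi_lon) - heading_deg + 180) % 360 - 180
    return abs(delta) <= fov_deg / 2
```

An app would run this check for every nearby POI each time the device pose updates, and render markers only for those that pass.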
There is also the scenario of city navigation and health routes, which has the potential to benefit from IoT sensors and connected information, with artificial intelligence used for data discovery and delivery. Imagine a person with a health condition such as asthma taking a walk through a large, polluted city centre. Sensors within the city constantly monitor pollution levels in the busy parts of the centre. Since this is an IoT scenario, there will be APIs and cloud services available for accessing live readings from the sensors, and perhaps predicted values too.
The smart navigation app will combine the user's health conditions with the real-time values to deliver a visualisation of location-based points of interest, helping to find the least polluted route for the user to take.
All of this will be powered by AI and IoT in a smart city.
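One way to sketch this routing logic is to treat the city as a graph whose edge costs combine distance with live pollution readings, weighted by the user's sensitivity. Everything here is hypothetical: the graph, the normalised pollution levels, and the sensitivity factor stand in for what a real IoT platform and health profile would supply.

```python
import heapq

def least_polluted_route(graph, pollution, start, goal, sensitivity=1.0):
    """Dijkstra's algorithm with edge cost = distance * (1 + sensitivity * pollution).

    graph:       {node: [(neighbour, distance_m), ...]}
    pollution:   {(node, neighbour): level}  -- e.g. normalised PM2.5 from live sensors
    sensitivity: larger for users with conditions such as asthma
    Returns (total_cost, path) or None if the goal is unreachable.
    """
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, dist in graph.get(node, []):
            level = pollution.get((node, neighbour), 0.0)
            new_cost = cost + dist * (1 + sensitivity * level)
            heapq.heappush(queue, (new_cost, neighbour, path + [neighbour]))
    return None

# Toy network: A-B-D is shorter but polluted, A-C-D is longer but clean.
graph = {"A": [("B", 100), ("C", 150)], "B": [("D", 100)], "C": [("D", 150)]}
pollution = {("A", "B"): 2.0, ("B", "D"): 2.0}
```

With a high sensitivity the planner detours via C; with sensitivity zero it falls back to the shortest path, which is exactly the trade-off the smart navigation app would make for an asthmatic user.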
Connected vehicles can also play a part in this scenario, where a lift is available in real time from a friend or a ride-sharing service such as Uber or Lyft.
Most large cities currently suffer from increased pollution due to the large number of single-passenger commutes. The simplest remedy would be to use ride-sharing apps to reduce the number of cars on work journeys, since one geographical destination is shared by many users. However, one barrier to their widespread use lies in the perception that these services lack flexibility. Many people's work patterns include shifts and irregularities, hence the reluctance to make fixed ride-sharing arrangements with work colleagues. Other factors in play include families' need for reliability in emergencies, such as a young child falling ill.
Augmented reality and real-time ride sharing can remedy this by enabling an on-demand service where people can simply 'e-hail' a ride on the spot.
Fire crews are another group of users who operate under stress, and their navigation can also be hampered by poor vision due to smoke, helmets or low lighting. AR has the capability to make buildings smarter by highlighting fire safety routes, electricity junctions, wiring, exits, and even who is attending which function in which room. It also helps with collaboration between different fire crews by showing them where everyone else is in real time.
In a future where heads-up displays can be mounted on safety helmets, this information will be added to the HUD, for example to show where a vulnerable person is on a particular floor of a high rise as the crew approaches, while still outdoors and without the need to take out a map, which speeds up the rescue response time.
It can also be used to alert the emergency services about hazards such as chemical spillages, and to help visualise this data in relation to real-world buildings and roads.
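The real-time collaboration described above ultimately reduces to sharing small position updates between crews and filtering them for the current view. A hypothetical payload and lookup; the field names are illustrative, not from any real incident-management system:

```python
from dataclasses import dataclass

@dataclass
class CrewPosition:
    """One position update, as it might arrive from a crew member's device."""
    crew_id: str
    building_id: str
    floor: int
    x_m: float        # position within the floor plan, metres
    y_m: float
    timestamp: float  # Unix time of the last update

def crew_on_floor(positions, building_id, floor):
    """Everyone an AR HUD should render for a given floor of a building."""
    return [p for p in positions
            if p.building_id == building_id and p.floor == floor]

positions = [
    CrewPosition("alpha-1", "tower-3", 4, 12.0, 3.5, 1700000000.0),
    CrewPosition("alpha-2", "tower-3", 4, 15.2, 8.1, 1700000001.0),
    CrewPosition("bravo-1", "tower-3", 2, 2.4, 9.9, 1700000002.0),
]
```

A HUD approaching floor 4 would query `crew_on_floor(positions, "tower-3", 4)` and overlay the two matching crew members on its view of the building.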
This is where augmented reality technology can be the holy grail for field workers and utility inspectors: a scenario where AR gives utility staff 'X-ray' vision for visualising buried assets, along with a dashboard of readings from wireless IoT sensors inside the pipes and junctions.
In this scenario AR increases both operational intelligence and safety, thereby adding value and saving money and lives.
Hazards can be detected quickly and accurately with these 'X-ray' vision capabilities, which also remedy the difficult task of correlating asset information, usually displayed on a map at a scale far removed from the real world, with the actual scene in front of the worker.
There are already off-the-shelf apps such as Augview that deliver this type of AR technology specifically for utility services, but this of course relies on accurate geospatial data being available for the assets, and the device needs to be capable of centimetre-accurate positioning. Capturing buried asset data can be achieved using ground-penetrating radar. Satellite positioning systems, with surveying-grade antennas and differential GPS, can also provide the centimetre-accurate position of the user in x, y and z.
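To place a buried asset in an AR view, the asset's surveyed coordinates must be converted into an offset from the user's position. A minimal sketch using a flat-earth approximation, which is adequate over the few tens of metres an 'X-ray' view covers; a production system such as Augview would use a proper map projection and the survey datum of the asset record:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def asset_offset(user_lat, user_lon, device_height_m,
                 asset_lat, asset_lon, asset_depth_m):
    """East/north/down offset in metres from the device to a buried asset.

    device_height_m: height of the device above ground level
    asset_depth_m:   depth of the asset below ground level
    Flat-earth approximation -- only valid for short ranges.
    """
    east = math.radians(asset_lon - user_lon) * EARTH_RADIUS_M * math.cos(math.radians(user_lat))
    north = math.radians(asset_lat - user_lat) * EARTH_RADIUS_M
    down = device_height_m + asset_depth_m  # total vertical drop from the device
    return east, north, down
```

The resulting east/north/down vector is what an AR engine would feed to its renderer as the anchor for the buried pipe; with differential GPS supplying centimetre-accurate inputs, the overlay lines up with the real trench.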