Author: Alberto Vidal Rodríguez
Once the Google Maps application has been downloaded and installed on the mobile phone, this functionality can be used.
It should be noted that this functionality is only available for journeys made on foot.
With this system, by pointing the phone straight ahead and using the camera, we will see on the screen the streets that are right in front of us. Google Maps then superimposes the navigation directions on this street image. Thus, instead of identifying the route to follow on a two-dimensional map, we see the directions needed to reach the selected destination directly on the street we are walking along.
The procedure for using augmented reality navigation, referred to by Google as Live View, is relatively simple.
Once the application is open, the destination must be selected. This can be done by choosing the specific point on the map or by typing the address into the search box provided. Once we have chosen the place we want to go to, we must tap the Directions button at the bottom left of the screen. Then, as can be seen in the image on the right, we must select the option to make the journey on foot and tap the Live View option.
When these steps have been followed, Google Maps will display messages on the screen with information about the use of this augmented reality functionality. These messages are shown below. They give a brief description of how the feature works, the application requests permission to access the camera, and the user is instructed to be careful and attentive to the environment while using this functionality, especially at dangerous points such as zebra crossings or intersections.
Once we have read these messages, the application will try to locate our position from the images obtained by the camera and from GPS. This process may take a few moments. To help it, we must point the camera at the street we are on and at the objects that surround us, especially representative ones such as buildings, shops or signs.
In general, during the tests carried out, this process completed successfully, although on some occasions it took about 15 or 20 seconds to identify our position, so some patience is needed. If we carry out this scan in a poorly lit area, or in one without distinctive elements that the application can identify, the following message will be displayed on the screen.
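The localization step described above can be pictured as fusing a coarse GPS fix with a more precise visual-positioning estimate when the camera scan succeeds. The sketch below is purely illustrative, not Google's actual implementation; the `Fix` type, the function name and the accuracy figures are all assumptions for the example.

```python
# Illustrative sketch (not Google's real code): refine a coarse GPS fix
# with a visual-positioning estimate, falling back to GPS alone when the
# scan fails (e.g. in a poorly lit street with few recognizable landmarks).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Fix:
    lat: float
    lon: float
    accuracy_m: float  # estimated error radius in metres

def fuse_position(gps: Fix, visual: Optional[Fix]) -> Fix:
    """Prefer the visual fix when the scan succeeded and is more precise."""
    if visual is not None and visual.accuracy_m < gps.accuracy_m:
        return visual
    return gps  # fall back to GPS while scanning or after a failed scan

# Example: GPS alone is ~15 m accurate; a successful scan of nearby
# buildings might be ~1 m (hypothetical numbers).
gps_fix = Fix(40.4168, -3.7038, accuracy_m=15.0)
visual_fix = Fix(40.41682, -3.70381, accuracy_m=1.0)
print(fuse_position(gps_fix, visual_fix).accuracy_m)  # → 1.0
```

This also matches the failure case seen in testing: when no visual fix is obtained, the user is left with GPS-level accuracy and the warning message is shown.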
As soon as Google Maps has successfully scanned our position, the augmented reality directions will appear on the screen. These directions will guide us along the route we must follow to reach our destination. To do this, the application superimposes them on the camera images displayed on the screen: the names of the streets we must walk along, arrows indicating the turns we must make as we approach them, and the distance remaining to each turn.
As can be seen in the images below that show what this interface is like, these indications are accompanied by a dynamic map at the bottom of the screen that further helps to find your way around. In addition, the application indicates through voice messages the steps that we must follow.
Finally, the first time we use Live View, the application will offer us the option to enable an additional feature. With it, if we hold the phone vertically, pointing the camera at the street, the directions and map described above are shown. However, if we place the phone in a horizontal position (parallel to the ground), the camera is deactivated and we will only see the usual two-dimensional map with the route to follow. In this way, battery consumption can be reduced. This function can be disabled from the Navigation settings.
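The battery-saving behaviour just described can be sketched as a simple decision on the phone's pitch angle (0° when flat and parallel to the ground, 90° when upright). The threshold below is an assumption for illustration, not a documented Google Maps value.

```python
# Hypothetical sketch of switching between the AR camera view and the
# plain 2-D map from the phone's orientation, as described above.
def display_mode(pitch_deg: float, threshold_deg: float = 45.0) -> str:
    """Return 'ar' when the phone is held upright, 'map' when roughly flat.

    threshold_deg is an assumed value; the real app's cut-off is unknown.
    """
    return "ar" if pitch_deg >= threshold_deg else "map"

print(display_mode(80.0))  # phone upright, camera active → 'ar'
print(display_mode(10.0))  # phone flat, camera off → 'map'
```

Turning the camera and its rendering pipeline off whenever the AR view is not actually being looked at is what yields the battery saving.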
USE AND SAFETY
One of the aspects the application emphasizes most at all times is the safety of the user while following the augmented reality directions. If the user does not pay attention to the surrounding environment while using this functionality, its use could become dangerous. For example, if we are about to cross a zebra crossing and are looking at the screen instead of at the approaching cars, there is a risk of being run over. Similarly, if looking at the screen prevents us from seeing where we step, we could trip and fall. This is why this functionality should be used with caution.
Therefore, the application places special emphasis on this safety issue. First, as mentioned before, when the Live View system is used, the following message is displayed on the screen, indicating that we must pay attention to the environment and not only to the screen.
On the other hand, Google Maps includes a feature that seeks to increase the user's safety: when we are standing still, the augmented reality directions are shown on the screen, but once we start walking and have advanced a few steps, the camera view is locked and the following message is shown while we move forward. In this way, the user only needs to look at the screen when stopped or when approaching a turn or street intersection.
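The camera-locking behaviour above can be sketched as a rule on the user's walking speed, with an exception near decision points. Both the speed threshold and the `near_turn` flag are assumptions made for this illustration; they are not documented Google Maps parameters.

```python
# Hedged sketch of the safety behaviour described above: lock the AR
# camera view while the user is walking, unlock it when stopped or when
# a turn is coming up. Threshold values are assumed, not official.
def ar_view_enabled(speed_m_s: float, near_turn: bool,
                    walking_threshold: float = 0.5) -> bool:
    if near_turn:
        return True                       # show directions at decision points
    return speed_m_s < walking_threshold  # lock the camera while moving

print(ar_view_enabled(0.0, near_turn=False))  # standing still → True
print(ar_view_enabled(1.4, near_turn=False))  # walking → False (locked)
print(ar_view_enabled(1.4, near_turn=True))   # approaching a turn → True
```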
In order to analyze and evaluate the application, a series of tests have been carried out to verify its operation. The tests have been carried out on a Samsung Galaxy S8 device (Android 8.0.0) and an iPhone 6S (iOS 13.4.1).
In this section, different criteria have been applied to assess whether the application can be used regardless of the capabilities of the user.
-Cognitive accessibility: The augmented reality functionality available in Google Maps can be a good option to facilitate the use of navigation systems for people with cognitive disabilities. Given that the steps to follow are shown on the screen at all times, and that where to walk is indicated with arrows and simple symbols, it can facilitate independent movement for people in this group.
-Accessibility for people with reduced or no vision: This functionality is not specially designed for blind people, since they would not be able to make use of the main augmented reality directions shown on the screen. It is true that the images are accompanied by voice instructions, but the other map applications also offer this, and there are other applications better adapted for blind users.
On the other hand, for those with reduced vision it can be useful, since the directions displayed on the screen are generally quite large and present good contrast.
-Accessibility for people with hearing difficulties: People with hearing difficulties can make use of this application. The only potential problem is hearing the auxiliary voice instructions that complement the on-screen augmented reality guidance. However, using the on-screen directions and the map at the bottom of the screen, listening to these instructions is not essential, and this functionality can be used perfectly well.
These tests are intended to verify if the application has bugs that do not allow it to perform its functions.
During the days that the application was tested, no notable or critical failure was detected in its operation, both for the Android operating system and for iOS.
The Live View functionality of Google Maps is an innovative tool that adds an augmented reality layer to the visualization of walking routes in this application.
This can allow the user to locate and follow the directions to reach a specific destination in a simpler and more intuitive way than with other similar applications. In this sense, this augmented reality functionality can be useful for users who are, for example, in a city they do not know, or for those who do not find it easy to orient themselves with a two-dimensional map, whether on paper or on their mobile device.
Therefore, two groups that may be potential users of this application are the elderly and people with cognitive disabilities. Even so, it would be necessary to evaluate whether this functionality is too complex for older people who are not very comfortable with latest-generation mobile phones, or for users with some cognitive impairment. In this case it would be really interesting to carry out a social validation of the product, to check how well it adapts to these potential users and whether it can really be useful for them.
An aspect that has been valued very positively is the emphasis that the application places on safety while it is being used. It not only alerts the user with messages and advice on how to use this functionality safely, but also incorporates options so that the user does not have to look at the screen constantly and can focus on where they are walking.
In addition, as mentioned above, no serious problems preventing correct use of the application were observed during the tests. Perhaps one of the aspects to optimize is the time the application takes to scan the environment and establish the user's position, since this process can sometimes be somewhat long.
One last aspect to note is that while using augmented reality in Google Maps, the battery consumption of the mobile phone is very high. This is because the screen, the camera and GPS location must all be active simultaneously. Therefore, we must be careful about using this functionality when the battery is low, or when we need to save battery for other tasks on our mobile device.
- Functionality that makes it easy to follow a route on foot to a destination point thanks to augmented reality on the mobile phone.
- Very advanced technology and with a good implementation and operation.
- It can be downloaded and used completely free of charge.
- Very high battery consumption while using this functionality.
- Some aspects can still be optimized such as the scanning of the area where the user is or the position of the different arrows and symbols on the image taken by the camera.