DECEMBER 30, 2016
Welcome to the Augmented Reality App in Android : Tutorial - Part 2. I hope you liked and understood Part 1 of the tutorial. If you haven’t read it yet, please go to Part 1 first.
In Part 1, we saw the theory behind Augmented Reality. Now it’s time for the implementation.
I hope you have familiarity with Android Studio, as this tutorial requires it. If not, go ahead, learn about it, and come back later for this tutorial.
I’ve provided a starter pack for this tutorial to give you a kickstart. Download or clone it from the repository, then open the project in Android Studio by navigating to its directory.
Open MainActivity.java in your Android Studio editor; this is where we will be coding. It has the following methods:
The methods in bold are the ones which need to be completed. The others are helper methods which are already complete. Of course, there are some other methods too, which are responsible for setting up the Camera View and handling Sensor events. If you want to learn about them, head to the Android Developers site here and here.
This function will set the POI.
This function will calculate the azimuth angle (φ) for the POI. From Part 1, we know that
tan(φ) = dy/dx
So we will first calculate dx and dy, then tan(φ), and then by applying tan⁻¹ to tan(φ), we will get φ and convert it from radians to degrees. But this φ always lies between 0° and 90°, as tan⁻¹ is defined only in this range. However, the signs of dx and dy give us a clear idea about the quadrant of the POI. We will just check different conditions on dx and dy and get the correct φ.
Following code demonstrates it:
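In case you want to see the idea in isolation, here is a minimal, self-contained sketch of the quadrant correction. The class and method names here are illustrative, not necessarily the ones in the starter pack:

```java
public class AzimuthDemo {
    // Quadrant-corrected azimuth (in degrees) from the coordinate
    // differences dx and dy, as described above. tan⁻¹ alone only yields
    // 0°–90°, so the signs of dx and dy pick the correct quadrant.
    public static double calculateAzimuth(double dx, double dy) {
        double phi = Math.toDegrees(Math.atan(Math.abs(dy / dx)));
        if (dx > 0 && dy >= 0) return phi;          // 1st quadrant
        if (dx < 0 && dy >= 0) return 180 - phi;    // 2nd quadrant
        if (dx < 0 && dy < 0)  return 180 + phi;    // 3rd quadrant
        return 360 - phi;                           // 4th quadrant (dx > 0, dy < 0)
    }
}
```

(For brevity this sketch ignores the degenerate dx = 0 case; the actual implementation should handle it, e.g. by returning 90° or 270° depending on the sign of dy.)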
This function calculates the Camera View Sector of the device. It returns the minAngle and maxAngle of the Camera View Sector.
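The computation might look like the following sketch, assuming the camera's horizontal angle of view is already known (e.g. obtained from `Camera.Parameters.getHorizontalViewAngle()`); the method name is illustrative:

```java
public class SectorDemo {
    // Returns {minAngle, maxAngle} of the Camera View Sector: the device's
    // azimuth plus/minus half the camera's horizontal angle of view.
    // Angles are normalised into [0, 360), so the sector may wrap around
    // North, leaving minAngle greater than maxAngle.
    public static double[] cameraViewSector(double deviceAzimuth, double angleOfView) {
        double min = (deviceAzimuth - angleOfView / 2 + 360) % 360;
        double max = (deviceAzimuth + angleOfView / 2) % 360;
        return new double[] { min, max };
    }
}
```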
This function checks if the POI lies within the Camera View Sector. To do this, we just check if the azimuth angle is between the minAngle and maxAngle of the Camera View Sector. But this isn’t enough. Sometimes minAngle is greater than maxAngle, which happens when the sector wraps around North (0°/360°). In that case, we check whether the azimuth lies between 0 and maxAngle OR between minAngle and 360.
Following code illustrates above procedure:
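A minimal sketch of that check, with an illustrative method name, could be:

```java
public class InSectorDemo {
    // True if the POI's azimuth falls inside the sector [minAngle, maxAngle].
    // When the sector wraps past North, minAngle > maxAngle, so we check the
    // two pieces [minAngle, 360] and [0, maxAngle] separately.
    public static boolean isInSector(double azimuth, double minAngle, double maxAngle) {
        if (minAngle > maxAngle) {
            return (azimuth >= minAngle && azimuth <= 360)
                || (azimuth >= 0 && azimuth <= maxAngle);
        }
        return azimuth >= minAngle && azimuth <= maxAngle;
    }
}
```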
This function just updates the values shown in the textbox. It is a testing/debugging helper.
This function takes care of changes in the device’s location. It updates the location and recalculates the azimuth angle of the POI.
This function takes care of changes in the azimuth angle of the device. This is the method where the real thing happens, i.e. augmenting the POI. Here, we have the device’s azimuth angle and the azimuth angle of the POI. First, we add 90° to the device’s azimuth angle (Remember…), then we calculate the Camera View Sector. Then we check if the POI lies in the Camera View Sector; if yes, we show a pointer icon on the screen. In order to give the augmented feel, we place the pointer icon on the screen proportionally to where the POI is in the sector. This is done by calculating the ratio of the difference between the POI’s azimuth angle and the sector’s minimum angle to the difference between the sector’s minimum and maximum angles. Then the top margin of the pointer icon is set to the product of this ratio and the screen height.
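The ratio-to-margin step above can be sketched as follows. The method name and the choice of screen dimension are illustrative (the starter pack presumably applies the result via the pointer view’s layout margins):

```java
public class PointerDemo {
    // Maps the POI's position within the sector to a pixel offset using the
    // ratio described above: (azimuth - minAngle) / (maxAngle - minAngle),
    // scaled by the screen dimension. The modulo arithmetic keeps the
    // calculation correct when the sector wraps around North.
    public static int pointerMargin(double azimuth, double minAngle,
                                    double maxAngle, int screenSize) {
        double sectorWidth = (maxAngle - minAngle + 360) % 360;
        double offset = (azimuth - minAngle + 360) % 360;
        return (int) (offset / sectorWidth * screenSize);
    }
}
```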
That’s it. Now you can build the app, deploy it, and test it on your device. In case you need the completed project, you can get it below.
Note: If the app says “Using Game Rotation Vector. Direction may not be accurate!”, then your device doesn’t have a compass, i.e. it can’t locate North. In that case the available angle is not measured relative to North but to some arbitrary reference. More information here.