Track Trek Final Project Submission

Abstract—Taking the lead from similar GPS-tracking mobile
applications, Track Trek provides a more detailed record of a
typical activity route while incorporating elevation changes into
exercise calculations. The app features a simple user interface
that provides detailed information about a user's Trek. It
incorporates the GPS sensor in coordination with the Google
Maps API to visualize a traversed route.
Index Terms—Component, API, GPS, Interface, ViewGroup
I. INTRODUCTION
A. Aim
This paper proposes the development of a
route-tracking, location-based service targeted towards a
cyber-physical system. The service will run as an app on an
Android-based phone.
The time-driven app will periodically record the
location of the Android user through sensors in the phone as
he/she completes a track. This data is then processed to give a
variety of information showing the runner's progress over time,
including timing, pacing, average speed, and calories
burned. Furthermore, the data can be processed to create a route
visualization over a map.
B. Motivation
While there are several apps that perform similar
functions (e.g., Nike+), they only take into account the two-dimensional
position of the user: latitude and longitude.
The proposed app has the potential to be more accurate
because it takes a third dimension of the user into account: the
elevation.
The proposed app targets a wide range of
customers, from hikers to marathon runners, dog-sled
teams, skiers, and other outdoor enthusiasts.
II. BACKGROUND
The proposed app should be self-contained on a typical
Android-OS system. As such, in terms of hardware, the entire
system can be executed on a single suitable cell phone
with no additional accessories. The phones available for this
project are the Nexus S and the Galaxy S III, both of which
fulfill the requirements discussed in the subsections below.
A. Important Hardware Requirements
GPS sensor: The main sensor with which the
app will record the location of the user (latitude, longitude,
elevation).
Responsive input terminal: Required to enter information
about the user and to set parameters for the app. Achieved
through the phone's touchscreen interface.
Graphic output interface: The interface through which the user can
track his/her progress as well as view the map visualization.
B. Important Software Requirements
Android OS: The cell phone must run a recent version of the
Android OS compatible with the APIs and programs below.
Android API: Contains classes & methods to interact with the
Android OS.
Google Maps API: Contains classes & methods to interact
with Google Maps, which will be used in map visualization.
The Android application framework is vast, with a
whole host of reusable components. In this proposal we
present only two. Further research and work on the
project may reveal more components that can be utilized.
C. User Interface
A major concern in the development of a mobile
app is the layout of the user interface. Android uses an
architecture of Views grouped into ViewGroups, which the
user invokes by touching the screen. Upon launching our app,
a ViewGroup “MenuScreen” will load, displaying the options
'Past Maps', 'Create Map', and 'Settings'. Each of these options
represents a View that is owned by the ViewGroup
“MenuScreen”. From the View “PastMaps”, the user will be
able to access altitude representations of their routes, and
access GoogleMaps with their route laid over the map. This
view will also hold information about calories burned on
previous routes, and a graphical comparison of the stored
routes. The CreateMap View will allow the user to press a
button and begin tracking a route. This will trigger the GPS to
start logging information. After pressing the begin button, a
stop button will appear that the user may press to stop the route.
The last View titled “Settings” will be used to change various
settings of the app. We will include different units to choose
from (SI vs. US), and the user will be able to input personal
information to help calculate calories burned more accurately.
D. Sensors
The main sensor used by our app is the GPS. Our
main concern in using the GPS is the accuracy of the data, and
how the usage of the GPS will drain the battery. We have
found that cell towers may provide more accurate information
more quickly, but have opted not to use tower tracking
initially to simplify our design. As the semester progresses and
if time allows, we may include tower tracking. We can control
the power usage of the GPS by limiting how often the GPS
returns data, using both a minimum time interval and a
minimum distance traveled. These parameters will also be
changeable by the user depending on the pace of the user's
activity. We will have to fine-tune them as the app becomes
usable, so we can see how they affect the quality of the maps
produced.
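The time-and-distance gate described above can be sketched in plain Java. The class name, the haversine helper, and the threshold values are our own illustration, not the app's final tuning:

```java
// Sketch of the GPS sampling gate: a fix is logged only when BOTH the
// minimum time interval and the minimum distance have elapsed since the
// last logged fix. Thresholds are placeholders to be tuned in field tests.
class SampleGate {
    private final long minTimeMs;      // e.g. 5000 ms
    private final double minDistanceM; // e.g. 10 m
    private long lastTimeMs;
    private double lastLat, lastLon;
    private boolean hasFix = false;

    SampleGate(long minTimeMs, double minDistanceM) {
        this.minTimeMs = minTimeMs;
        this.minDistanceM = minDistanceM;
    }

    /** Returns true if this fix should be logged, and remembers it if so. */
    boolean shouldLog(long timeMs, double lat, double lon) {
        if (hasFix) {
            boolean timeOk = (timeMs - lastTimeMs) >= minTimeMs;
            boolean distOk = haversineMeters(lastLat, lastLon, lat, lon) >= minDistanceM;
            if (!timeOk || !distOk) return false;
        }
        hasFix = true;
        lastTimeMs = timeMs;
        lastLat = lat;
        lastLon = lon;
        return true;
    }

    // Great-circle distance between two lat/lon points, in meters.
    static double haversineMeters(double lat1, double lon1, double lat2, double lon2) {
        double r = 6371000.0; // mean Earth radius
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * r * Math.asin(Math.sqrt(a));
    }
}
```

In the app itself, these two values would simply be passed to the Android location request; this standalone version makes the gating logic easy to test off-device.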
E. Google Maps APIs
The key functionality of the application will be
driven by the use of several Android and Google APIs.
Specifically, initial investigation has highlighted the Android
Location API, Google Maps Android API, and the Google
Maps Elevation API as probable components in our
application. Google’s MapView() library for Android devices
provides developers the main UI functionality of Google Maps
such as map control and rendering, but also allows a number
of Overlay types to be drawn on top of the map. The drawing
of Overlays is an integral part of the application, as it allows
for the visualization of activity routes, both present and past.
Whether obtaining the device location through the Android
Location API or the Google Maps API, the application will store
an array of latitude and longitude coordinates for later use. To
get the elevation of a given point, the Google Maps
Elevation API will be used: coordinate pairs of latitude and
longitude are provided to the API, and the Google Maps
elevation for that point is returned. Once elevation data is
collected, an overlay of the route will be drawn on top of a
Google Map, with variation in route color based on elevation
changes.
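As a sketch of this request, the Elevation API accepts pipe-separated latitude/longitude pairs in its `locations` query parameter and returns JSON with an `elevation` field per point. The helper below only builds the request URL (the class name and placeholder key are our own; actual fetching and JSON parsing are omitted):

```java
// Builds a request URL for the Google Maps Elevation API for a list of
// latitude/longitude pairs. The API key is a placeholder.
class ElevationRequest {
    static String buildUrl(double[][] points, String apiKey) {
        StringBuilder locs = new StringBuilder();
        for (int i = 0; i < points.length; i++) {
            if (i > 0) locs.append('|');  // pairs are pipe-separated
            locs.append(points[i][0]).append(',').append(points[i][1]);
        }
        return "https://maps.googleapis.com/maps/api/elevation/json?locations="
                + locs + "&key=" + apiKey;
    }
}
```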
V. TRACKTREK OVERVIEW
• Launch App
• Choose settings
• Start Route
• Log GPS coordinates and elevation at controllable time intervals
• End Route
• Save information to Routes activity
• Select route from list in Routes activity
  – Upon selecting the route, select from the following options:
    · Google map visualization
    · Elevation visualization
    · Calories spent
VI. HIGH LEVEL SYSTEM DESIGN
The block diagram shown below illustrates an
abstract view of our system design.
III. PROPOSAL CHALLENGES
Of the challenges expected, the most outstanding
obstacle is having no previous experience with Android
application development, Java, or XML. There is a relatively
short time frame in which our group must transition from
learning about the basic Android architecture to actually
implementing a program to obtain the desired functionality.
Additionally, satellite communication will pose several issues
of its own, from signal strength to power management. Perhaps
the most significant concern here is satellite-to-phone
accuracy: without reliable accuracy, the purpose and
improvements our application is intended to provide will be
greatly undermined.
Figure 1 – System Block Diagram
IV. PROPOSAL GOALS
Despite the challenges we will encounter, we are
confident that completing this project will accomplish several
goals. Upon successful completion of the project, our group
will have a solid fundamental understanding of Android
application development. This will include experience in Java
programming and XML design, both of which are skill sets
that have an extremely broad array of applications outside of
mobile application development. Altogether, a key take-away
from this project is practice in creating a simple user
experience for a complex task.
A. Launch App
Upon launching the app, the user is presented with
four options in the MenuScreen activity depicted in Figure 1.
The options include:
• Start Route – Start GPS tracking
• Previous Routes – GPS coordinates of routes
• Settings – User preferences
• Map – Temporary demonstration of Google Maps; the
final application will integrate Google Maps
elsewhere.
Figure 2 – MenuScreen Activity Layout
Figure 3 – Settings Activity Layout
Each of these selections starts a new activity with
different functions. Upon launching, the app recalls the
settings previously chosen in the Settings activity during
earlier uses of the application.
The user will first be able to choose whether they would like
to see mile or kilometer values for distance traveled. These
choices are shown in the spinner widget described in the next
subsection. This directly affects which unit is used when
displaying a distance on the map and on the elevation depiction.
The graphical layout of this parameter is demonstrated in Figure 2.
B. Choose Settings
The user will be able to define different parameters to
change the outcome of the route tracking. These settings are
set in the Settings activity using the 'spinner' widgets
provided by the Android developers. A spinner acts as a
dropdown list of strings that displays the choice made by the
user. These choices are stored under programmer-defined keys
in a preference store called SharedPreferences, which is called
once to save the choices and again to read them across all
activities within the application. Saving is triggered by
Android callbacks such as onItemSelected(), and values are
recalled in the settings menu using onResume().
Another preference will be the sampling rate of the
GPS. This determines how often the GPS records
latitude/longitude information, based on a time factor and a
distance. This parameter may greatly improve battery
performance on long routes, and will allow increased accuracy
for those seeking greater precision. The sampling rate is
directly related to the exactness of the calories-burned estimate,
as well as to the graphical representations of the route. This
parameter has not been implemented in this demonstration and
will be added at a later date.
C. Start Route
Once the user is ready to start a run, he or she selects the
Start Route option. Based on the chosen settings, the app will
start logging the user's latitude/longitude coordinates and
elevation at fixed time intervals.
D. Log GPS Coordinates & Elevation
1) GPS Coordinates
Our Android-powered devices (S3 & Nexus S) can
receive location updates through two underlying technologies:
the internal GPS sensor or cell-network localization.
Both are wrapped in a host of high-level abstract methods for
finding a cell phone's position. This level of abstraction
typically allows the same piece of code to work on any
Android phone that contains a GPS sensor. As such, our code
works just as well on the Galaxy S3 as on the Nexus S
without requiring any modification.
For our application's purposes, a high degree of
accuracy is required. Generally, a location provider with greater
accuracy (GPS) requires a longer fix time than one with lower
accuracy (network-based). We want to record the location data
as quickly as possible and update it as more accurate data
becomes available. To do so we use both GPS and network
providers. We receive location updates from multiple location
providers that may have different timestamps and varying
levels of accuracy. Logic is incorporated to disambiguate the
location providers and discard updates that are stale and less
accurate.
The typical flow of procedures for obtaining the user
location:
1. User starts the run.
2. App starts listening for updates from desired location providers.
3. Maintain a "current best estimate" of location by filtering out new, but less accurate fixes.
4. Stop listening for location updates.
This flow is demonstrated in the figure below.
Figure 4 - A timeline representing the window in which the
application listens for location updates.
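The "current best estimate" filter can be sketched in plain Java. The `BestFix` class, the staleness window, and the accuracy tolerance below are illustrative assumptions loosely following the pattern in the Android location guides, not the app's exact logic:

```java
// Sketch of the best-estimate filter: a candidate fix replaces the current
// one if it is significantly newer, or comparably recent but more accurate.
class BestFix {
    static final long STALE_MS = 30_000; // fixes older than this are stale
    final long timeMs;
    final float accuracyM; // smaller is better

    BestFix(long timeMs, float accuracyM) {
        this.timeMs = timeMs;
        this.accuracyM = accuracyM;
    }

    /** Returns true if candidate should replace current (current may be null). */
    static boolean isBetter(BestFix candidate, BestFix current) {
        if (current == null) return true;
        long dt = candidate.timeMs - current.timeMs;
        if (dt > STALE_MS) return true;   // current estimate is stale
        if (dt < -STALE_MS) return false; // candidate itself is stale
        float dAcc = candidate.accuracyM - current.accuracyM;
        if (dAcc < 0) return true;        // strictly more accurate
        return dt > 0 && dAcc <= 200;     // newer and not much less accurate
    }
}
```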
Halfway through our project we were able to get
accurate Latitude/Longitude values. The next step is to save a
log of those values, which can then be used as the basis for
computations and analysis by the other components of our
application. Furthermore, since our application will need to
continuously receive and process location updates, resources
can be used more efficiently by implementing the logic as a
background service, so the user can multitask on the
phone.
Another factor we have considered is power
consumption. Long windows of listening for location fixes can
consume a lot of battery power, but short periods might not
allow for sufficient accuracy. So we will have to carry out
field tests to find out the best compromise.
When the user is done with his/her run, location
updates are terminated to reduce unnecessary consumption of
power and network bandwidth.
Reverse-geocoding is the process of translating
latitude/longitude coordinates into a human-readable address
[http://developer.android.com/training/basics/location/geocoding.html].
We have implemented this logic in our code to be
used as a reference in future field tests.
2) Elevation
On Android-powered devices there are three different
methods of acquiring the elevation at a location:
1. Raw altitude data straight from the GPS sensor: in addition
to latitude & longitude, the GPS sensor can also acquire
elevation. However, sources
[http://gpsinformation.net/main/altitude.htm] indicate that
these readings are unreliable and inaccurate.
2. Utilizing a web service that can return an elevation reading:
a web service can take the latitude & longitude coordinates
acquired by the GPS and return an elevation reading. The
USGS Elevation Query Web Service and Google Maps are
examples.
3. Using the barometer sensor: commercially, air-pressure
readings have been used to more accurately calculate relative
altitude. Of the three methods, this one seems to be the most
accurate (according to our research). However, only the
Galaxy S3 has a barometer sensor; the Nexus S does not.
For our application purposes, we are not interested in
the absolute altitude above sea level per se. Rather we are
interested in the changes of elevation the user will go through
as he/she goes on his/her run. As such the first two methods
could be feasible. Further field tests are required to test this
hypothesis.
E. End Route
Exiting the activity to return to the main screen
signifies the end of the route. This turns off the GPS and
elevation tracking, and stores the information under the Routes
section. This action will eventually prompt the user, asking
whether they want to view the Google map visualization, the
elevation map, or the calorie counter.
F. Select Route from List in Routes Activity
The Routes activity will show a dynamic list of the
routes tracked by the application. The routes will be listed in
reverse chronological order, with the most recent route at the
top of the list. The Routes activity will also implement a means
of deleting the information if desired.
This approach may cause some confusion at first
when the user terminates the GPS by selecting ‘End Route’.
They may expect something to immediately happen upon
pressing End Route, but will find that no immediate action
happens. To work around this shortcoming, we may be able to
open the Routes screen upon exiting the Route Tracking
activity and select the first route for the user, so that they
are immediately prompted with what information they want to
view.
G. Google Maps Visualization
As discussed in the above sections, once the user has
completed an activity, the application stops logging
positioning data and then stores the waypoints of the recently
completed route into the “Previous Routes” memory area.
Based on the methods we intend to apply to obtain
elevation data, this information will likely be stored in an
array of ordered pairs of latitude and longitude. In fact, within
the Google Maps library, a predefined class labeled
“GeoPoint” stores a pair of latitude and longitude
coordinates as integer numbers of microdegrees.
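The microdegree representation is simply the decimal degree value scaled by one million. A minimal sketch of the conversion (helper names are our own):

```java
// GeoPoint stores coordinates as integer microdegrees (degrees * 1e6).
// These helpers convert between the two representations.
class MicroDegrees {
    static int toE6(double degrees) {
        return (int) Math.round(degrees * 1e6);
    }

    static double fromE6(int e6) {
        return e6 / 1e6;
    }
}
```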
In order for the user to review their route later on, we
must incorporate the Google Maps API. To do this
on an Android device, we first import the Google Maps
external library, which contains the key class MapView; this
was done within our AndroidManifest.xml file. MapView is a
subclass of the Android class ViewGroup. Essentially,
MapView provides a container for the Google Maps API,
allowing the user to manipulate map data and control the
view. The map displayed is constructed with data from the
Google Maps Service, but the map can now be controlled with
key presses, touch gestures, and other common user interface
elements.
A key feature of the MapView class that we wish to
utilize is the ability to draw Overlay types over the generated
map that respond to user inputs. Specifically, we are interested
in the ItemizedOverlay class (an extension of the class
com.google.android.maps.Overlay) within the Google Maps
library. This base class consists of a list of OverlayItems,
which in the case of route drawing, controls the orientation,
span bounds, and marker/waypoint creation along a route. An
additional extension of the Overlay class is the
RouteSegmentOverlay, which will be explored and ideally
utilized as we move forward with our application
development. Although not yet implemented, our next steps
will involve using both of these classes, along with the
GeoPoint class, to plot out a user’s activity route.
At the time of our midterm presentation, our
application has the ability to obtain latitude/longitude
coordinates via a GPS sensor or Location Services, as outlined
in the above sections. Concerning map visualization, we are
able to center a map over a pair of latitude and longitude
coordinates input by a user.
Figure 5: Screenshot of Google Maps Implementation
Outside of location services, the key Google Map classes that
allow this are MapView, MapActivity, MapController, and
GeoPoint. A necessary step to realize this was the generation
of a Google Maps API Key (which is unique to each machine
used during an app’s development stage). This is required to
identify one’s application, as Google limits the number of
times an individual/app accesses their satellite and server data.
In order to get a map centered on the location of the device,
Location Services must be implemented. Now that the Google
Map classes allow control and viewing of a map, interfaces
such as LocationListener and LocationManager prompt the
device to detect changes in GPS status and receive location
updates. Upon further testing, the final components used to
achieve our desired functionality will be described in greater
detail at a later date.
H. Elevation Visualization
This option will make a timeline visualization of the
elevation information using the sampling rate chosen by the
user in the settings menu. The visual will resemble a line graph,
and will be shown in miles or kilometers versus time.
VII. SYSTEM SETUP
This section describes how the system described in the
previous section was implemented and set up. Unfortunately,
we were unable to implement the settings procedure (discussed
further under Future Work).
A. Hardware
The program (or app) was designed in the Eclipse
programming environment and run on the Galaxy S3, as shown
in the figure below.
Figure 7- MenuScreen Activity Layout
2) Start Route
Once the user is ready to start a run, he selects the
Start Route option. However, before the route can start, the user
must name the new route. This will be useful later when the
user wants to differentiate routes from each other.
Figure 6: Galaxy S3
The S3 serves as the hardware of our embedded
system. It hosts several physical resources of which we only
utilize a few. Specifically our system employs the use of the
S3’s GPS & Barometer Sensor along with the touchscreen
interface, phone memory and internet connection. We
specifically use the S3 because of its highly accurate barometer
sensor
[http://www.engadget.com/2012/09/06/stmicroelectronicsdetails-pressure-sensor-in-your-galaxy-s-iii/].
B. Software Design
For the most part, the system setup follows the system
design outlined in the previous section (VI). A run-through of
each of the three branches, as shown in the system block
diagram, is documented in the following subsections.
1) Launch App
Upon launching the app, the user is presented with the
home screen. The app by default will always start at the Menu
Screen.
Three options in the MenuScreen activity are
displayed, depicted in the next figure. The options include:
• Start Route: Start GPS tracking
• Previous Routes: List of saved routes
• Map: Demonstration of Google Maps (discussed and
illustrated in the previous section)
Figure 8 – Name Route
Once the user selects Ok, the app gives the user
the option of how to record the GPS location fix. At this
moment the app has not started logging data, which is why the
fields under the position labels show the placeholder “I’mBatman”.
Figure 9- Fine-Grain or Both Providers
Fine-Grain Provider
The devices (S3 or Nexus S) will receive location
updates only through the internal GPS sensor. The GPS
requires a longer fix time than the other option, and the
user must wait until a fix is obtained before starting the
run.
Both Providers
The devices (S3 & Nexus S) will receive location
updates through both underlying technologies: the
internal GPS sensor and cell-network localization. Getting a
fix is quicker, but on average the results are slightly less
accurate. Logic is incorporated to disambiguate the location
providers and discard updates that are stale and less accurate.
However, this choice is constrained to the S3 only: while the
Nexus S would still run this code, it would not be
able to record the altitude. Elevation acquisition is discussed
in more detail in the next section.
Address
As described earlier, reverse-geocoding translates
latitude/longitude coordinates into a human-readable address.
The system is able to display the corresponding
address in real time by accessing a web service over
the internet. Consequently, this field only shows viable results
when the phone's internet capabilities are on (either through
Wi-Fi or data networks).
Foreground Task
In order for this activity to continuously record GPS
& elevation, it must be kept as the foreground task. In other
words, the user may not access other aspects of the Android
device while on a route.
Figure 10 – A-GPS on a phone
[http://tech2.in.com/features/all/what-is-agps-how-does-it-work/115142]
Once the user selects one of those two options, the
system will start logging the user's latitude/longitude
coordinates and elevation at fixed time intervals. The system
then displays the latitude, longitude, address & elevation in
real time, as shown in the figure below.
End Route
At the end of the route, the user presses the return
button to return to the home screen, thus ending the route.
Ending the route turns off the GPS and elevation tracking, and
stores the information in custom-built multi-dimensional
arrays under the filename chosen by the user.
This approach may cause some confusion at first
when the user terminates the GPS by exiting 'Start Route':
they may expect something to happen immediately, but will
find that no immediate action occurs.
3) Previous Routes
Once the user returns to the home screen he can select the
“Previous Routes” option.
Figure 11- Real-time Logging of Data
Elevation
The method of logging elevation is independent
of the user's choice of providers. Using the
barometer sensor, air-pressure readings are logged
periodically and then converted to altitude. We
selected this method over the other two (discussed in the
previous section) because it proved to be the most accurate.
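The pressure-to-altitude conversion can be sketched with the international barometric formula, which is the same relation behind Android's `SensorManager.getAltitude()`. The class name and reference-pressure choice are our own illustration; using the pressure at the start of a route instead of the standard sea-level value yields relative elevation, which is what our app actually needs:

```java
// Converts a barometer reading (hPa) to altitude in meters using the
// international barometric formula. p0 is the reference pressure: the
// standard 1013.25 hPa gives absolute altitude; the pressure measured at
// the start of a route gives elevation relative to the starting point.
class Barometer {
    static final double STANDARD_P0_HPA = 1013.25;

    static double altitudeMeters(double p0Hpa, double pHpa) {
        return 44330.0 * (1.0 - Math.pow(pHpa / p0Hpa, 1.0 / 5.255));
    }
}
```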
Figure 12- List of Routes
This activity shows a dynamic list of the routes
tracked by the application. The list grows by one each time
the Start Route function is called. Each route carries the file
name chosen by the user at the start of the route. The routes
are listed in reverse chronological order, with the most
recent route at the top of the list.
Each file contains an array storing the Lat, Long pairs and the
corresponding elevation.
Once the user selects a route he is then presented with two
options, shown in the figure below.
5) Elevation Graph
Selecting the second option from Figure 13 will produce a
timeline visualization of the elevation information. The visual
resembles a line graph, shown in miles versus
time.
Figure 15 - Elevation Graph
Figure 13- Maps or Elevation Graph
4) Maps
From these graphs/results we modified the code to meet
our evaluation metrics (which are discussed in the next
section).
VIII. CHALLENGING CODE
If the user selects the Maps option shown in the
figure above, the system takes the array stored in the file and
processes it to produce an overlay route over Google Maps.
The logic for doing so was discussed in the high-level system
design.
Figure 14- Route visualization on Map
The figure illustrates a user walking around the school
campus. In this example there is a bit of a jump at the
beginning; this is because it takes time for the GPS
to get an accurate fix. The color changes according to the
change in elevation: in this example, the lighter the color (from
blue to red to light orange), the lower the altitude.
A. MAP VISUALIZATION
One of the more challenging coding aspects of our
project involved actually drawing our routes on the Google
Map tiles, while varying the color of particular segments as
desired. Our initial approach to plotting a route involved
utilizing the drawLine function within the Android Graphics
library, and then placing this path onto Google Map tiles with
the Google Maps Projection interface. The Projection
interface serves to translate between the coordinate system of
x/y pixels on a device and latitude/longitude points on the
surface of the earth. At first, we drew a route with segments of
straight line between two points. Upon creating an array list of
latitude and longitude data, we passed two points into a loop, in
which the statements “path.moveTo()” defined the first point
for a given segment and “path.lineTo()” drew a line from the
last point to the current point, thus completing a segment. With
each iteration of the loop, the moveTo statement stepped
through our route as we plotted it, ensuring the lineTo
statement progressed along the route and drew a new segment
each time. The function drawLine was then used to actually
place the route on the Google Map tiles.
During initial testing, this provided a result that was
satisfactory to our expectations at the time. However, when we
focused on varying the color of a path according to changes in
elevation data, we encountered two primary issues. In the first
instance, we could successfully draw the route, but it did not
allow for variation of the route color. After modifying our code
to allow this, the next error we encountered allowed for a
change in the color of the path, yet would not include the entire
route content, rather only the last segment of it. We reached the
conclusion that the second error was caused by overwriting
previous route overlays within our loop. Essentially, as soon as
we moved to plot the next route segment, we erased the
previous one. In hindsight, we believe that this method might
have worked had we initialized a new line segment for each
pair of points along the route. However, prompted by the
pending due date, we explored an alternative method, which
ended up being much more efficient at handling the drawing of
a path.
Using the drawPath method, we found that we could
plot individual path segments one by one, rather than as a
whole at the end of our loop. Additionally, the drawPath
method afforded us the flexibility to vary the color of a path
segment between any two given points, as each path segment
was its own independent entity. Within the loop, each path
segment was initialized as a new Path() object, thus allowing us
to control the color of that segment, while also ensuring it
wasn’t immediately overwritten in the next iteration of the
loop. The following segment of code summarizes the basic
concept of how we went about controlling the color of each
route segment without overwriting previous points.
for (int i = 2; i < latA.length - 1; i++) {
    path1 = new Path();                             // new Path per segment, so earlier
                                                    // segments are not overwritten
    path1.moveTo(p2.x, p2.y);                       // start of this segment
    path1.lineTo(p1.x, p1.y);                       // end of this segment
    AssignColor((int) AltA[i], (int) AltA[i + 1]);  // pick hot/cold color from elevation change
    canvas.drawPath(path1, Paints.get(ColorIndex)); // draw with the chosen Paint
}
CodeBlock 1 – Path Overlay
As seen above, for the length of our route, the
elevation content corresponding to each set of lat/long
coordinates is passed into the loop, assigned the proper color,
and then drawn.
We then wrote the AssignColor method so that for any given
route, the change in color along a route is relative to the
elevation content of the route itself. Rather than choosing sea
level as the reference, we thought it would be more useful to
tailor the color variation to the elevation changes of a user’s
particular route. This way, we avoid the situation in which the
color of a route is skewed towards all warm colors (high
elevation), if the activity was carried out on top of a mountain,
for instance. It is worth noting as well, that each color must be
defined as it’s own separate entity before one can use it to draw
on a map.
B. COLOR CHANGING ALGORITHM
The ability to change the route’s color posed a
complex problem. The goal of the coloration was to show the
user when they were traveling uphill in their route and when
they were traveling downhill, and to fade hotter or colder when
the highest or lowest points of the route were reached.
The overlay can be assigned an RGB color value
every time a new section of the route is created. This process
involved creating the colors to be used, determining if a hot or
cold color should be applied, and then assigning the color
based on the section’s relative elevation to the high and low
values of the route.
The draw() method in Java requires that each paint
color used be allocated to memory. This ensures that when it is
showing the colors, each color value is defined in the resources.
We first tried to use the same paint variable for each color, and
found that the final color assigned was used for the entire route.
The colors were generated based on the gradients generated
below.
Figure 16 – Cold & Hot Gradients
Each gradient contains 130 colors. The brightest red is
equivalent to the lightest blue elevation wise. These colors
signify the highest elevation value on the Google Map, while
the darkest blue and palest red signify the lowest spot on the
map. The cold or hot equivalent is chosen based on if the user
is traveling upwards or downwards in elevation. Sliding the R,
G, or B value of the color code across the color scale generates
these colors. Every color was generated and stored into an
array of Paints of length 260.
            R    G    B
Cold High   0    38   255
Cold Low    0    38   125
Hot High    255  168  130
Hot Low     255  38   0
Table 1 – Possible Color Values
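The palette construction can be sketched as follows, interpolating 130 steps between each pair of endpoints from Table 1 (the class name is our own, the step size is a simplifying assumption that lands one unit short of the stated endpoints, and colors are packed as 0xAARRGGBB ints, each of which backs one Paint in the app):

```java
// Generates the 260-entry palette: indices 0-129 are the cold gradient
// (blue channel slides 125 upward) and 130-259 are the hot gradient
// (green and blue channels slide up from 38 and 0 respectively).
class Gradients {
    static int[] buildPalette() {
        int[] colors = new int[260];
        for (int i = 0; i < 130; i++) {
            colors[i] = argb(0, 38, 125 + i);       // cold: low -> high elevation
            colors[130 + i] = argb(255, 38 + i, i); // hot: low -> high elevation
        }
        return colors;
    }

    // Packs RGB channels with a fully opaque alpha byte.
    static int argb(int r, int g, int b) {
        return (0xFF << 24) | (r << 16) | (g << 8) | b;
    }
}
```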
Once the colors were generated, an algorithm was written to
determine whether a hot or cold color should be applied, and
which of the 130 colors should be used for the section of the
route. To determine whether the section should be hot or cold,
the elevation at the current section to be drawn was compared
to that of the section directly in front of it. If it was less than
the next value, a hot color was assigned; if it was more than
the next value, a cool color was applied.
The algorithm for choosing the color can be explained
by the equation below.
!"#"$ = !"#!"#$ − !"#!"#
∗ 130
(!"#!"#! − !"#!"# )
Equation 1 – Color Algorithm
This makes a ratio of the user’s altitude at a given
instant to the altitude high and low extremes, and then
multiplies the ratio by the total number of colors. This section
of the code needed improving since we were unable to see the
route fade across the entire color spectrum.
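A minimal sketch of this selection logic, combining the ratio of Equation 1 (clamped to the palette bounds) with the hot/cold comparison against the next section; the method names are our own.

```java
public class ColorChooser {
    static final int STEPS = 130;

    // Map the current altitude onto a 0..129 index within the route's
    // altitude range (Equation 1), clamped at the extremes.
    static int colorIndex(double alt, double altLow, double altHigh) {
        if (altHigh <= altLow) return 0;   // flat route: single color
        int idx = (int) ((alt - altLow) / (altHigh - altLow) * STEPS);
        return Math.min(Math.max(idx, 0), STEPS - 1);
    }

    // A section is "hot" (warm-colored) when the next section is higher,
    // i.e. the user is climbing; otherwise it is "cold".
    static boolean isHot(double altHere, double altNext) {
        return altHere < altNext;
    }

    public static void main(String[] args) {
        // Midpoint of a 0..100 m route lands in the middle of the palette.
        System.out.println(colorIndex(50, 0, 100)); // 65
        System.out.println(isHot(10, 12));          // true
    }
}
```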
!!"#$ .!"#$%&
!"# = 1 −
!!"#
Equation 2 – Altitude Calculation
C. Data Handling and Storage
The nature of our app requires that we save large amounts of
data: a typical 10-minute walk/route produces over 36K GPS
and altitude readings. As such, saving, safely storing, and
safely retrieving this data was of the utmost importance.
Android devices have three principal data storage options:
• Saving key-value pairs of simple data types in a shared
preferences file
• Using databases managed by SQLite
• Saving arbitrary files in Android's file system
The first option is only suitable for storing small amounts of
information, a single variable or String for example. The
second option, although capable, was beyond the scope of our
app. That left us with the third option.
Initial attempts at implementing this option proved fruitless.
Researching this method for material that related to our system
was like looking for a needle in a haystack. Moreover, the
resources we did find useful only covered saving single pieces
of data at a time. We would have ended up requiring four
separate files per route, each file belonging to a certain field
(lat, long, etc.); working with these files and trying to
synchronize between them would have been wasteful and
inefficient.
We needed a way to save thousands of corresponding quartets
(lat, long, altitude, & timestamp). Our solution: a custom class
that takes those four items as fields.
public class locationData implements Serializable {
    public double latitude;
    public double longitude;
    public double pressure;   // raw barometer reading; altitude is derived from it
    public double timestamp;
}
CodeBlock 2 – Location Custom Class
Hence, a route is represented in our system by an array of
locationData objects, and saving a route means saving the
corresponding array. This allowed all the important data to be
saved in one file, in matching quartets, which in turn made
these values easier to process for our map and elevation
visualizations.
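A minimal sketch of this save/load scheme using Java object serialization: the class is renamed LocationData and given a convenience constructor for illustration, and a temporary file stands in for Android's internal storage.

```java
import java.io.*;
import java.util.ArrayList;

public class RouteStorage {
    // The custom class from CodeBlock 2; Serializable lets a whole
    // ArrayList of quartets be written to one file in a single call.
    public static class LocationData implements Serializable {
        public double latitude, longitude, pressure, timestamp;
        public LocationData(double lat, double lon, double p, double t) {
            latitude = lat; longitude = lon; pressure = p; timestamp = t;
        }
    }

    // Save a whole route: one array, one file.
    static void saveRoute(ArrayList<LocationData> route, File f) throws IOException {
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(f))) {
            out.writeObject(route);
        }
    }

    // Read the route back in matching quartets.
    @SuppressWarnings("unchecked")
    static ArrayList<LocationData> loadRoute(File f)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(f))) {
            return (ArrayList<LocationData>) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        ArrayList<LocationData> route = new ArrayList<>();
        route.add(new LocationData(35.96, -83.92, 983.2, 5.0));
        File f = File.createTempFile("route", ".dat");
        saveRoute(route, f);
        System.out.println(loadRoute(f).get(0).latitude); // 35.96
        f.delete();
    }
}
```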
IX. EVALUATION PLAN
A. Evaluation of Barometer Sensor & Accuracy
Elevation data is used throughout our application to
help visualize the user's route. The height cannot be directly
acquired; instead it is derived from atmospheric pressure data
logged by a barometer sensor. The altitude can be found using
the equation below.
Altitude = 44307.69 × (1 − (P_user / P_sea)^0.190284)
This equation requires the pressure measured at the user's
position and the pressure at sea level under the same weather
conditions. For our project we used the standard accepted
sea-level pressure of 1013.25 hPa. A more accurate sea-level
value can be acquired by checking online databases from local
airports; doing so would increase the application's elevation
accuracy. By referencing Knoxville's McGhee Tyson Airport,
we surveyed sea-level pressure readings over one month and
noted that Knoxville's sea-level pressure sits between 1021
hPa and 1027 hPa. The table below demonstrates how a change
in sea-level pressure affects the accuracy of the reported
elevation.
Sea Level (hPa)   User Pressure (hPa)   Elevation Reading (m)
1011              915                   835.58
1013.25           915                   854.02
1017              915                   884.64
Table 2 – Sea Level Pressure Testing
This table shows that using 1013.25 as the only sea-level
reading can introduce an error of up to 30.62 meters depending
on weather conditions. On the Google Map overlay, the color
chosen for the route relies only on relative altitude and is not
affected by this inaccuracy; the elevation visualization,
however, does show the calculated altitude and will suffer
from this inaccuracy.
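A sketch of the altitude computation under the standard-atmosphere constants used above (44307.69 m and exponent 0.190284, pressures in hPa), together with a simple mean over barometer samples; the method names are our own.

```java
public class AltitudeCalc {
    // Hypsometric formula: altitude in meters from the measured pressure
    // and the sea-level pressure, both in hPa.
    static double altitude(double pUser, double pSea) {
        return 44307.69 * (1.0 - Math.pow(pUser / pSea, 0.190284));
    }

    // The barometer is noisy, so pressure samples are averaged
    // before the altitude is computed.
    static double meanPressure(double[] samples) {
        double sum = 0;
        for (double s : samples) sum += s;
        return sum / samples.length;
    }

    public static void main(String[] args) {
        // Approximately reproduces the middle row of Table 2.
        System.out.printf("%.1f m%n", altitude(915, 1013.25));
    }
}
```

Note that the result tracks the Table 2 middle row to within a couple of meters; small differences in the chosen exponent account for the remainder.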
The P_user variable in the altitude calculation is read in from
the barometer. The barometer sensor is extremely sensitive to
small pressure changes, and its data cannot be inserted directly
into the program. To account for this inconsistency, 100
samples are taken over the course of each GPS logging interval
and averaged to obtain a mean reading. Below is a sample list
of 10 data points collected from the barometer with the phone
sitting still.
Time (ms)      5       10      15      20      25      30      35      40      45      50
User Alt (m)   255.83  257.26  254.74  255.90  256.34  254.65  254.87  255.02  253.93  254.06
Table 3 – Sample Barometer Data
This data shows that the raw readings taken from the sensor
are not entirely accurate. Considering this data was taken
inside with an air conditioner running, the pressure of the
building fluctuates much more rapidly than it would outdoors.
The indoor data is used in this report to show just how varied
the altitude data can look: this set of data has a variance of
3.33 meters with the phone sitting completely still. To fix this
shortcoming, 100 data points are averaged together to increase
accuracy.
B. GPS Accuracy
Our system relies heavily on the GPS sensor of the Android
device, whose readings serve as the primary data required for
our app. As such, it was important that the GPS readings were
rigorously evaluated to ensure they met the targets imposed by
our system.
1) Types of information needed.
The GPS readings return a wealth of information, including
the bearing of the phone, the speed of the device, and the
altitude. Nevertheless, we only use the GPS sensor for latitude
and longitude readings. The other information can be obtained
more accurately from sensors specifically designed to take
those readings; for example, if we wanted the speed of the
device, we would use the device's accelerometer.
2) Sources of information.
GPS sensors can use up to a maximum of 24 satellites to
enable a user to pinpoint his or her current location. Since the
Earth is a sphere, each satellite's range defines a sphere
centered on the satellite as it hovers and revolves. The
intersection of three such spheres closest to the GPS device's
position identifies the location; this technique is called 3D
trilateration. To gather the requesting device's current location
and provide an accurate response, the GPS receiver requires
two vital details: the location of at least three satellites above it
and the distance between the device and each of those
satellites [http://tech2.in.com/features/all/what-is-agps-how-does-itwork/115142].
3) Timeframe for collecting information.
After multiple tests, and after trying to find a balance between
accuracy and power efficiency, GPS sample readings are taken
twice a second (2 Hz).
4) Testing and evaluation metrics.
Identical test routes: the user takes the same route repeatedly
while we refine our algorithms until we reach our strict targets.
Our aim was to be accurate to within a path width on a Google
map.
5) Analyzing information.
The information we received from the GPS was analyzed
through route visualization on Google Maps.
Figure 17 – Initial Test Route
The first figure shows one of our initial test trials. The user
kept walking within the roads, but the map visualization
showed results that were too haphazard and did not meet our
evaluation metric. The next figure shows one of our final
trials, which does meet our target requirement.
Figure 18 – Identical Test Route
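The 3D trilateration described above can be illustrated in two dimensions: each satellite range defines a circle, and subtracting the circle equations pairwise leaves a linear system for the receiver position. The anchor coordinates and ranges below are arbitrary illustrative values.

```java
public class Trilateration2D {
    // Solve for (x, y) given three anchor points and the distances to each.
    // Subtracting the circle equations pairwise yields a 2x2 linear system
    // a*x + b*y = c, d*x + e*y = f, solved here by Cramer's rule.
    static double[] locate(double[] p1, double r1,
                           double[] p2, double r2,
                           double[] p3, double r3) {
        double a = 2 * (p2[0] - p1[0]), b = 2 * (p2[1] - p1[1]);
        double c = r1 * r1 - r2 * r2 - p1[0] * p1[0] + p2[0] * p2[0]
                 - p1[1] * p1[1] + p2[1] * p2[1];
        double d = 2 * (p3[0] - p2[0]), e = 2 * (p3[1] - p2[1]);
        double f = r2 * r2 - r3 * r3 - p2[0] * p2[0] + p3[0] * p3[0]
                 - p2[1] * p2[1] + p3[1] * p3[1];
        double det = a * e - b * d;   // nonzero when anchors are not collinear
        return new double[]{(c * e - b * f) / det, (a * f - c * d) / det};
    }

    public static void main(String[] args) {
        // Receiver at (3, 4); anchors at known positions with exact ranges.
        double[] pos = locate(new double[]{0, 0},  5.0,
                              new double[]{10, 0}, Math.sqrt(65),
                              new double[]{0, 10}, Math.sqrt(45));
        System.out.printf("(%.1f, %.1f)%n", pos[0], pos[1]); // (3.0, 4.0)
    }
}
```

Real GPS adds a fourth satellite to solve for the receiver clock bias, which this planar sketch omits.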
X. DISCUSSION
A. Team Contribution
Mark Gill (33%) – My main contributions to the code centered
on the barometer sensor. I enabled the sensor, implemented
sampling, created the elevation graph, and determined the
color-changing algorithm on the Google Map. Smaller
implementations include the alert dialog prompting the user
for the file name and the determination of the app layout.
Mohamed Saleh (33%) – I worked on the information
handling, data saving, storage & retrieval. Additionally I
covered the backbone structure of Lat/Long & GPS
programming. Finally I assisted others in route coloration and
various other tasks.
Ben Roehrs (33%) – I worked primarily with the integration
of Google Maps into our application. Once raw
latitude/longitude/altitude data was collected and stored, I
worked on taking the data, and actually drawing the route on
the map tiles. To a lesser degree, I helped optimize the
sampling rate of the GPS sensor to ensure we captured as
accurate a representation of a route as possible.
B. Experimental Solution to GPS Accuracy
While discussing the inaccuracies of the GPS sensor, our team
devised a way to potentially track a user's route based on far
fewer GPS readings, given accurate motion sensors. Given an
initial GPS lock that is confirmed to be very accurate, the
phone could then use a digital compass and an accelerometer
to determine in which direction and at what pace the user was
moving. This information could then be used to generate GPS
coordinates based on the initial reading. These generated
coordinates could then be checked against the GPS sensor
whenever a reading is determined to be very accurate.
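A sketch of the dead-reckoning step this proposal implies: given a trusted fix, a compass bearing, and a distance inferred from the motion sensors, a new coordinate can be generated with a flat-earth approximation. The coordinates and step size below are illustrative values of our own.

```java
public class DeadReckoning {
    static final double EARTH_RADIUS_M = 6_371_000.0;

    // Advance a lat/lon fix by a distance (meters) along a compass bearing
    // (degrees, 0 = north), using a flat-earth approximation that is
    // adequate for the short steps between GPS fixes.
    static double[] step(double lat, double lon, double bearingDeg, double meters) {
        double bearing = Math.toRadians(bearingDeg);
        double dLat = meters * Math.cos(bearing) / EARTH_RADIUS_M;
        double dLon = meters * Math.sin(bearing)
                    / (EARTH_RADIUS_M * Math.cos(Math.toRadians(lat)));
        return new double[]{lat + Math.toDegrees(dLat), lon + Math.toDegrees(dLon)};
    }

    public static void main(String[] args) {
        // Walk 100 m due north from a fix in Knoxville.
        double[] p = step(35.96, -83.92, 0, 100);
        System.out.printf("%.6f, %.6f%n", p[0], p[1]);
    }
}
```

Each generated point would then be compared against the next trusted GPS fix, as the experimental solution describes.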
C. Future Work
Due to time constraints and the scope of the project, we
were not able to implement every idea that was initially
proposed. Specifically, the "calorie counter" function was
not realized. Collecting elevation data serves to provide a
more accurate distance figure when calculating the number
of calories spent during an activity. Ideally, future
development would integrate this function into the app, as
many current applications on the market do not account for
elevation changes, resulting in, at best, a rough estimate of
the calories spent during an activity.
Additionally, we felt that implementing our application's
functionality as a service, and thus allowing it to run in the
background, would greatly increase the degree of user
convenience and robustness of our application, especially
when considering how multitasking becomes a more powerful
and integral feature of each new generation of smartphones.
Concerning user interface improvements, the
ability to accurately switch between measurement
systems would be a quickly realizable and welcome
feature. It would also be necessary to give the user more
control over route file management, as currently an
individual route cannot be removed without connecting to
a laptop. The last user interface improvement would
likely concern the color selection algorithm. Although it
currently functions properly, we found it difficult at times
to detect the degree of color change along a route. This is
in part due to extremely small intervals between sampled
data points, which result in small route segments. Further
optimization to make these changes more obvious would
only improve perhaps the most important feature of our
application.
XI. CONCLUSION
Using the Android platform as a fully functional GPS tracking
device has its strengths and weaknesses. The integration of
Google Maps allowed us to overlay GPS coordinates taken
with the phone onto a worldwide map. The platform is also
open source, allowing us to learn from the community and
quickly develop a full-featured app. The OS kernel is also very
detailed, allowing developers easy access to the sensors and
powerful functionality. The Android platform is also able to
draw pressure data directly from an airport database, allowing
extremely accurate elevation determination. The Android
phones, however, were unable to produce consistently accurate
GPS information, owing to the quality of the sensor installed
in the S3 device.
Throughout this project, our team learned to work with an
overabundance of resources. We also learned how to divide
the labor of a large project among team members. In relation
to our coursework, we saw how the Android operating system
deals with tasks and processes across activities. By referencing
our roadmap, we were able to incorporate the most vital
functionalities into the application.