Route Tracking Project Proposal

Abstract—This proposal takes its lead from similar GPS-tracking mobile applications and aims to provide a more detailed record of a typical activity route, incorporating elevation changes into exercise calculations. The app features a focused user interface that provides detailed information about a user's trek, and it uses the phone's GPS sensor in coordination with the Google Maps API to visualize a traversed route.
Index Terms—Component, API, GPS, Interface, ViewGroup
I. INTRODUCTION
A. Aim
In this paper we propose the development of a route-tracking, location-based service targeted towards a cyber-physical system. The service will run as an app on an Android-based phone.
The time-driven app will periodically record the location of the user through the phone's sensors as he/she completes a track. This data is then processed to produce a variety of information showing the runner's progress over time, including timing, pacing, average speed, and calories burned. Furthermore, the data can be processed to create a route visualization over a map.
B. Motivation
While there are several apps that perform similar functions, e.g. Nike+, they only take into account the two-dimensional position of the user: latitude and longitude. The proposed app has the potential to be more accurate, as it takes a third dimension of the user's position into account: elevation.
The proposed app is targeted towards a wide range of customers, from hikers to marathon runners, dog sled teams, skiers, and other outdoor enthusiasts.
II. BACKGROUND
The proposed app should be self-contained on a typical Android OS device. As such, in terms of hardware, the entire system can be executed on a single suitable cell phone with no additional accessories. The phones available for this project are the Nexus S and the Galaxy S III, both of which fulfill the requirements discussed in the subsections below.
A. Important Hardware Requirements
GPS Sensor: The main sensor with which the app will record the location of the user (latitude, longitude, elevation).
Responsive input terminal: Required to enter information about the user and to set parameters for the app; achieved through the phone's touchscreen interface.
Graphic Output Interface: The interface through which the user can track his/her progress and view the map visualization.
B. Important Software Requirements
Android OS: The cell phone has to run the latest Android OS, which is compatible with the APIs and programs below.
Android API: Contains the classes and methods used to interact with the Android OS. Android's application framework is vast, with a whole host of reusable components; in this proposal we present only two components, and further research and work on the project may reveal more components that can be utilized.
Google Maps API: Contains the classes and methods used to interact with Google Maps, which will be used for map visualization.
C. User Interface
A major concern in the development of a mobile app is the layout of the user interface. Android uses an architecture of Views grouped into ViewGroups, which the user interacts with by touching the screen. Upon launching our app, a ViewGroup "MenuScreen" will load, displaying the options 'Past Maps', 'Create Map', and 'Settings'. Each of these options represents a View owned by the ViewGroup "MenuScreen". From the "PastMaps" View, the user will be able to access altitude representations of their routes and open Google Maps with their route laid over the map. This View will also hold information about calories burned on previous routes and a graphical comparison of the stored routes. The "CreateMap" View will allow the user to press a button and begin tracking a route; this will trigger the GPS to start logging information. After pressing the begin button, a stop button will appear that the user may press to end the route. The last View, titled "Settings", will be used to change various settings of the app. We will include different units to choose from (SI vs. US), and the user will be able to input personal information to help calculate calories burned more accurately.
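As an illustration of this structure, below is a minimal sketch of how the MenuScreen could launch a separate Activity for each option; the layout and Activity names (R.layout.menu_screen, PastMapsActivity, CreateMapActivity, SettingsActivity) are placeholders rather than final design decisions.

// Sketch of the MenuScreen, assuming each option is backed by its own Activity.
import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.view.View;

public class MenuScreen extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.menu_screen); // layout with one button per option

        // Each button simply starts the Activity that owns the corresponding View.
        findViewById(R.id.btn_past_maps).setOnClickListener(new View.OnClickListener() {
            public void onClick(View v) {
                startActivity(new Intent(MenuScreen.this, PastMapsActivity.class));
            }
        });
        findViewById(R.id.btn_create_map).setOnClickListener(new View.OnClickListener() {
            public void onClick(View v) {
                startActivity(new Intent(MenuScreen.this, CreateMapActivity.class));
            }
        });
        findViewById(R.id.btn_settings).setOnClickListener(new View.OnClickListener() {
            public void onClick(View v) {
                startActivity(new Intent(MenuScreen.this, SettingsActivity.class));
            }
        });
    }
}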
D. Sensors
The main sensor used by our app is the GPS. Our main concerns in using the GPS are the accuracy of the data and how GPS usage will drain the battery. We have found that cell towers may provide a location fix more quickly, but we have opted not to use tower tracking initially in order to simplify our design. As the semester progresses, and if time allows, we may include tower tracking. We can control the power usage of the GPS by limiting how often it returns data; we can set this with a time limit and a minimum distance traveled. This variable will also be changeable by the user, depending on the pace of the user's activity. We will have to fine-tune this variable as the app becomes usable and we can see how it affects the quality of the maps produced.
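As a rough sketch of how this throttling could be expressed, the snippet below requests GPS fixes with a minimum time and minimum distance between updates; the 5-second and 10-meter values are placeholders that would be tuned in field tests and eventually tied to the user's sampling-rate setting.

// Sketch of throttling GPS fixes with a minimum time and minimum distance.
import android.content.Context;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;

public class GpsLogger implements LocationListener {
    private final LocationManager locationManager;

    public GpsLogger(Context context) {
        locationManager = (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
    }

    public void start() {
        // Ask for a fix at most every 5 seconds and only after moving at least 10 meters.
        locationManager.requestLocationUpdates(LocationManager.GPS_PROVIDER, 5000, 10, this);
    }

    public void stop() {
        locationManager.removeUpdates(this);
    }

    @Override
    public void onLocationChanged(Location location) {
        // Record latitude, longitude, and raw altitude for later processing.
        double lat = location.getLatitude();
        double lng = location.getLongitude();
        double alt = location.getAltitude();
    }

    @Override public void onStatusChanged(String provider, int status, Bundle extras) { }
    @Override public void onProviderEnabled(String provider) { }
    @Override public void onProviderDisabled(String provider) { }
}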
E. Google Maps APIs
The key functionality of the application will be driven by the use of several Android and Google APIs. Specifically, initial investigation has highlighted the Android Location API, the Google Maps Android API, and the Google Maps Elevation API as probable components of our application. Google's MapView class for Android devices provides developers with the main UI functionality of Google Maps, such as map control and rendering, and also allows a number of Overlay types to be drawn on top of the map. The drawing of Overlays is an integral part of the application, as it allows for the visualization of activity routes, both present and past. Whether obtaining the device location through the Android Location API or the Google Maps API, the application will store an array of latitude and longitude coordinates for later use. To get the elevation of a given point, the Google Maps Elevation API will be used: coordinate pairs of latitude and longitude are provided to the API, and the Google Maps elevation for that point is returned. Once elevation data is collected, an overlay of the route will be drawn on top of a Google Map, with variation in route color based on elevation changes.
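As a concrete illustration, the following is a minimal sketch of an Elevation API query for a single point, assuming the JSON output format and that the request is made off the UI thread; usage limits and key requirements follow Google's terms, and the class name ElevationClient is our own placeholder.

// Sketch of querying the Google Maps Elevation API for one point.
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Scanner;
import org.json.JSONObject;

public class ElevationClient {
    public static double getElevation(double lat, double lng) throws Exception {
        URL url = new URL("https://maps.googleapis.com/maps/api/elevation/json?locations="
                + lat + "," + lng);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        try {
            InputStream in = conn.getInputStream();
            String body = new Scanner(in).useDelimiter("\\A").next();
            JSONObject json = new JSONObject(body);
            // The first result holds the elevation (in meters) for the requested point.
            return json.getJSONArray("results").getJSONObject(0).getDouble("elevation");
        } finally {
            conn.disconnect();
        }
    }
}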
III. PROPOSAL CHALLENGES
Of the challenges expected, the most significant obstacle is our lack of previous experience with Android application development, Java, and XML. There is a relatively short time frame in which our group must transition from learning the basic Android architecture to implementing a program with the desired functionality. Additionally, satellite communication will pose several issues of its own, from signal strength to power management. Perhaps the most significant concern here is satellite-to-phone accuracy: without reliable accuracy, the purpose and improvements our application is intended to provide will be greatly undermined.
IV. PROPOSAL GOALS
Despite the challenges we will encounter, we are confident
that completing this project will accomplish several goals.
Upon successful completion of the project, our group will
have a solid fundamental understanding of Android
application development. This will include experience in Java
programming and XML design, both of which are skill sets
that have an extremely broad array of applications outside of
mobile application development. Altogether, a key take-away
from this project is practice in creating a simple user
experience for a complex task.
V. TRACKTREK OVERVIEW
- Launch App
- Choose settings
- Start Route
- Log GPS coordinates and elevation at controllable time intervals
- End Route
- Save information to the Routes activity
- Select a route from the list in the Routes activity; upon selecting the route, select from the following options:
  - Google Map visualization
  - Elevation visualization
  - Calories spent
VI. HIGH LEVEL SYSTEM DESIGN
A. Launch App
Upon launching the app, the user is presented with four options in the MenuScreen activity, depicted in Figure 1. The options include:
Start Route – Start GPS tracking
Previous Routes – GPS coordinates of routes
Settings – User preferences
Map – Temporary demonstration of Google Maps; the future application of Google Maps will be elsewhere
Figure 1 – MenuScreen Activity Layout
Each of these selections starts a new activity with different functions. Upon launching, the app will recall the settings previously set in the Settings activity during earlier uses of the application. If TrackTrek has been paused so the phone can be used for other purposes, it will also continue running the GPS in the background. Upon launching, the 'Start Route' button may instead display 'End Route', depending on the current status of the GPS at launch. The app will not default to the MenuScreen, but will open where the user last left the application.
B. Choose Settings
The user will be able to define different parameters to change the outcome of the route tracking. These settings will be determined in the Settings activity using the Spinner widgets provided by the Android framework. A Spinner acts as a dropdown list of strings, which displays the choice made by the user. These choices are stored into global parameters defined by the programmer in a collection of preferences called SharedPreferences. SharedPreferences is called once to save the choices, and again to use the choices across all activities within the application. Saving a choice is triggered by Android callback functions such as onItemSelected(), and the stored values are recalled in the settings menu using onResume().
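The sketch below illustrates this pattern for the unit preference, assuming a preference file named "tracktrek_prefs", a key "units", and layout identifiers that are placeholders rather than final names.

// Sketch of saving the unit choice from a Spinner into SharedPreferences.
import android.app.Activity;
import android.content.SharedPreferences;
import android.os.Bundle;
import android.view.View;
import android.widget.AdapterView;
import android.widget.Spinner;

public class SettingsActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.settings); // layout containing the units Spinner

        Spinner unitsSpinner = (Spinner) findViewById(R.id.units_spinner);
        unitsSpinner.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
            @Override
            public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
                // Persist the selected string ("Miles" or "Kilometers") for use in all activities.
                SharedPreferences prefs = getSharedPreferences("tracktrek_prefs", MODE_PRIVATE);
                prefs.edit().putString("units", (String) parent.getItemAtPosition(position)).commit();
            }

            @Override
            public void onNothingSelected(AdapterView<?> parent) { }
        });
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Recall the saved choice so the menu reflects the stored preference.
        String units = getSharedPreferences("tracktrek_prefs", MODE_PRIVATE)
                .getString("units", "Miles");
    }
}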
Figure 2 – Settings Activity Layout
The user will first be able to choose whether they would like to see mile or kilometer values for distance traveled. These choices are shown in the Spinner described earlier, and they directly affect which unit is used when displaying a distance on the map and on the elevation depiction. The graphical layout of this parameter is demonstrated in Figure 2.
Another preference will be the sampling rate of the GPS. This will determine how often the GPS records latitude/longitude information, based on a time factor and a distance. This parameter may greatly improve the battery's performance on long routes, or allow increased accuracy for those seeking greater precision. The sampling rate will be directly related to the accuracy of the calories-burned estimate, as well as to the graphical representations of the route. This parameter has not been implemented in this demonstration and will be added at a later date.
C. Start Route
Once the user is ready to start a run, he/she selects the Start Route option. Using the settings (covered in the previous subsection), the app will start logging the user's latitude/longitude coordinates and elevation at fixed time intervals.
D. Log GPS Coordinates & Elevation
1) GPS Coordinates
Our Android-powered devices (the Galaxy S III and the Nexus S) can receive location updates through two underlying technologies: the internal GPS sensor or cell-network localization. These technologies are wrapped in a host of high-level abstract methods for finding a cell phone's position. This level of abstraction allows the same piece of code to work on virtually any Android phone that contains a GPS sensor; as such, our code runs just as well on the Galaxy S III as on the Nexus S without any modification.
For our application purposes, a high degree of accuracy is required. Generally, a location provider with greater accuracy (GPS) requires a longer fix time than one with lower accuracy (network-based). We want to record the location data as quickly as possible and update it as more accurate data becomes available. To do so, we use both the GPS and network providers. We receive location updates from multiple location providers that may have different timestamps and varying levels of accuracy, so logic is incorporated to disambiguate the location providers and discard updates that are stale or less accurate.
The typical flow of procedures for obtaining the user location is:
1. The user starts the run.
2. The app starts listening for updates from the desired location providers.
3. The app maintains a "current best estimate" of the location by filtering out new, but less accurate, fixes.
4. The app stops listening for location updates.
This flow is demonstrated in the figure below, and a sketch of the filtering step follows Figure 3.
Figure 3 - A timeline representing the window in which the
application listens for location updates.
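The filtering step (step 3 above) could look roughly like the sketch below, which prefers a fix that is significantly newer, or comparably fresh but with a smaller reported error radius; the two-minute freshness window is a placeholder threshold to be tuned in field tests.

// Sketch of keeping a "current best estimate" of location across providers.
import android.location.Location;

public class BestLocationFilter {
    private static final long FRESHNESS_WINDOW_MS = 2 * 60 * 1000; // placeholder threshold
    private Location currentBest;

    /** Returns true if the new fix replaced the current best estimate. */
    public boolean offer(Location candidate) {
        if (currentBest == null) {
            currentBest = candidate;
            return true;
        }
        long timeDelta = candidate.getTime() - currentBest.getTime();
        if (timeDelta > FRESHNESS_WINDOW_MS) {
            currentBest = candidate;      // much newer: the user has likely moved
            return true;
        }
        if (timeDelta < -FRESHNESS_WINDOW_MS) {
            return false;                 // much older: discard the stale fix
        }
        // Comparable age: keep whichever reading claims the smaller error radius.
        if (candidate.getAccuracy() <= currentBest.getAccuracy()) {
            currentBest = candidate;
            return true;
        }
        return false;
    }

    public Location getCurrentBest() {
        return currentBest;
    }
}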
At the moment we have been able to obtain accurate latitude/longitude values. The next step is to save a log of those values, which can then be used as the basis for computations and analysis by the other components of our application. Furthermore, since our application will need to continuously receive and process location updates, resources can be used more efficiently if we implement the logic as a background service, so that the user can multitask on the phone.
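A minimal sketch of such a background service is shown below, reusing the GpsLogger helper sketched earlier; the service keeps logging while the user multitasks and releases the GPS when it is stopped at the end of the route.

// Sketch of running the location logging as a background Service.
import android.app.Service;
import android.content.Intent;
import android.os.IBinder;

public class RouteTrackingService extends Service {
    private GpsLogger logger;

    @Override
    public void onCreate() {
        super.onCreate();
        logger = new GpsLogger(this);
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        logger.start();      // begin requesting GPS fixes in the background
        return START_STICKY; // ask the system to recreate the service if it is killed
    }

    @Override
    public void onDestroy() {
        logger.stop();       // release the GPS when the route ends
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;         // started (not bound) service
    }
}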
Another factor we have considered is power consumption. Long windows of listening for location fixes can consume a lot of battery power, but short periods might not allow for sufficient accuracy, so we will have to carry out field tests to find the best compromise. When the user is done with his/her run, location updates are terminated to reduce unnecessary consumption of power and network bandwidth.
Reverse geocoding is the process of translating latitude/longitude coordinates into a human-readable address [http://developer.android.com/training/basics/location/geocoding.html]. We have implemented this logic in our code to be used as a reference in future field tests.
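A minimal sketch of this reverse-geocoding step using the platform Geocoder is shown below, assuming it is called off the UI thread, since lookups go over the network and may return no results.

// Sketch of reverse geocoding a fix with the platform Geocoder.
import java.io.IOException;
import java.util.List;
import java.util.Locale;
import android.content.Context;
import android.location.Address;
import android.location.Geocoder;

public class ReverseGeocoder {
    public static String addressFor(Context context, double lat, double lng) {
        Geocoder geocoder = new Geocoder(context, Locale.getDefault());
        try {
            List<Address> results = geocoder.getFromLocation(lat, lng, 1);
            if (results != null && !results.isEmpty()) {
                return results.get(0).getAddressLine(0); // first line of the nearest match
            }
        } catch (IOException e) {
            // Network or service failure: fall through and report no address.
        }
        return null;
    }
}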
2) Elevation
On Android-powered devices there are three different methods of acquiring the elevation at a location:
1. Raw altitude data straight from the GPS sensor: in addition to latitude and longitude, the GPS sensor can also acquire elevation. However, sources [http://gpsinformation.net/main/altitude.htm] indicate that these readings are unreliable and inaccurate.
2. Utilizing a web service that can return an elevation reading: a web service can take the latitude and longitude coordinates acquired by the GPS and return an elevation reading. The USGS Elevation Query Web Service and Google Maps are examples.
3. Using the barometer sensor: commercially, air pressure readings have been used to calculate relative altitude more accurately. Of the three methods, this one seems to be the most accurate (according to our research); however, only the Galaxy S III has a barometer sensor, and the Nexus S does not.
For our application purposes, we are not interested in the absolute altitude above sea level per se; rather, we are interested in the changes in elevation the user goes through as he/she completes the run. As such, the first two methods could be feasible, and further field tests are required to test this hypothesis; a barometer-based sketch is nonetheless included below for comparison.
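For completeness, the sketch below shows how method 3 could obtain relative altitude from the barometer via SensorManager.getAltitude(); in our hardware set it would only run on the Galaxy S III, and the class name is a placeholder.

// Sketch of deriving relative altitude from the pressure (barometer) sensor.
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class BarometricAltimeter implements SensorEventListener {
    private final SensorManager sensorManager;
    private final Sensor pressureSensor;
    private Float startAltitude;    // altitude at the beginning of the route
    private float relativeAltitude; // change in elevation since the start

    public BarometricAltimeter(Context context) {
        sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        pressureSensor = sensorManager.getDefaultSensor(Sensor.TYPE_PRESSURE);
    }

    public void start() {
        if (pressureSensor != null) { // null on devices without a barometer (e.g. Nexus S)
            sensorManager.registerListener(this, pressureSensor, SensorManager.SENSOR_DELAY_NORMAL);
        }
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float pressureHpa = event.values[0];
        float altitude = SensorManager.getAltitude(
                SensorManager.PRESSURE_STANDARD_ATMOSPHERE, pressureHpa);
        if (startAltitude == null) {
            startAltitude = altitude;
        }
        // Only the change in elevation since the start of the route matters to us.
        relativeAltitude = altitude - startAltitude;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    public float getRelativeAltitude() {
        return relativeAltitude;
    }
}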
E. End Route
Ending the route turns off the GPS and elevation tracking and stores the information under the Routes section. This action will eventually prompt the user to choose whether they want to view the Google map visualization, the elevation map, or the calorie counter.
F. Select Route from List in Routes Activity
The Routes activity will show a dynamic list of the routes tracked by the application; this list grows by one each time the End Route function is called. Routes will be given a file name chosen by the user upon completion of the route or, until we enable this feature, a generic name such as Route1, Route2, etc. The routes will be listed in date-created order, with the most recent route at the top of the list. The Routes activity will also implement a means of deleting the information if desired.
This approach may cause some confusion at first when the user terminates the GPS by selecting 'End Route': they may expect something to happen immediately upon pressing End Route, but will find that no immediate action occurs. To work around this shortcoming, we may open the Routes screen upon pressing the End Route button and select the first route in the list for the user, so that they are immediately prompted with what information they want to view.
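A minimal sketch of the Routes list is shown below, assuming it is backed by a simple ArrayAdapter and that the route names are loaded from wherever the saved routes end up being stored; the hard-coded names stand in for that loading step.

// Sketch of the Routes activity as a ListActivity backed by an ArrayAdapter.
import java.util.ArrayList;
import android.app.ListActivity;
import android.os.Bundle;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.ListView;

public class RoutesActivity extends ListActivity {
    private ArrayList<String> routeNames = new ArrayList<String>();

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Most recent route first; generic names until user-chosen names are supported.
        routeNames.add("Route2");
        routeNames.add("Route1");
        setListAdapter(new ArrayAdapter<String>(
                this, android.R.layout.simple_list_item_1, routeNames));
    }

    @Override
    protected void onListItemClick(ListView l, View v, int position, long id) {
        String selected = routeNames.get(position);
        // Here the user would choose map visualization, elevation view, or calories.
    }
}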
G. Google Maps Visualization
As discussed in the sections above, once the user has completed an activity, the application stops logging positioning data and stores the waypoints of the recently completed route in the "Previous Routes" memory area. Based on the methods we intend to apply to obtain elevation data, this information will likely be stored in an array as ordered pairs of latitude and longitude. In fact, within the Google Maps library, a predefined class labeled GeoPoint exists which stores a pair of latitude and longitude coordinates as integer numbers of microdegrees.
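A minimal sketch of accumulating a route as a list of GeoPoints, converting degrees to the integer microdegrees that GeoPoint expects, could look like the following (RouteLog is a placeholder name).

// Sketch of storing a route as GeoPoints from the Google Maps external library.
import java.util.ArrayList;
import java.util.List;
import com.google.android.maps.GeoPoint;

public class RouteLog {
    private final List<GeoPoint> waypoints = new ArrayList<GeoPoint>();

    public void addFix(double latitude, double longitude) {
        int latE6 = (int) (latitude * 1e6);   // degrees -> microdegrees
        int lngE6 = (int) (longitude * 1e6);
        waypoints.add(new GeoPoint(latE6, lngE6));
    }

    public List<GeoPoint> getWaypoints() {
        return waypoints;
    }
}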
In order for the user to review their route later on, we must incorporate the Google Maps API. To do this on an Android device, we first import the Google Maps external library, which contains the key class MapView; this was done within our AndroidManifest.xml file. MapView is a subclass of the Android class ViewGroup. Essentially, MapView provides a container for the Google Maps API, allowing the user to manipulate map data and control the view. The map displayed is constructed with data from the Google Maps service, but the map can then be controlled with keypresses, touch gestures, and other common user interface elements.
A key feature of the MapView class that we wish to utilize is the ability to draw Overlay types over the generated map that respond to user inputs. Specifically, we are interested in the ItemizedOverlay class (an extension of the class com.google.android.maps.Overlay) within the Google Maps library. This base class consists of a list of OverlayItems which, in the case of route drawing, control the orientation, span bounds, and marker/waypoint creation along a route. An additional extension of the Overlay class is the RouteSegmentOverlay, which will be explored and ideally utilized as we move forward with our application development. Although not yet implemented, our next steps will involve using both of these classes, along with the GeoPoint class, to plot out a user's activity route.
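Although the exact overlay classes are still being evaluated, a minimal sketch of a custom Overlay that connects the logged GeoPoints into a path is shown below; RouteOverlay is a placeholder name and the color-by-elevation variation is omitted.

// Sketch of a custom Overlay that draws the logged route as a connected path.
import java.util.List;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.Path;
import android.graphics.Point;
import com.google.android.maps.GeoPoint;
import com.google.android.maps.MapView;
import com.google.android.maps.Overlay;
import com.google.android.maps.Projection;

public class RouteOverlay extends Overlay {
    private final List<GeoPoint> waypoints;
    private final Paint paint = new Paint();

    public RouteOverlay(List<GeoPoint> waypoints) {
        this.waypoints = waypoints;
        paint.setColor(Color.BLUE);
        paint.setStrokeWidth(4);
        paint.setStyle(Paint.Style.STROKE);
        paint.setAntiAlias(true);
    }

    @Override
    public void draw(Canvas canvas, MapView mapView, boolean shadow) {
        if (shadow || waypoints.size() < 2) {
            return;
        }
        Projection projection = mapView.getProjection();
        Path path = new Path();
        Point screenPoint = new Point();
        projection.toPixels(waypoints.get(0), screenPoint);
        path.moveTo(screenPoint.x, screenPoint.y);
        for (int i = 1; i < waypoints.size(); i++) {
            projection.toPixels(waypoints.get(i), screenPoint); // GeoPoint -> screen pixels
            path.lineTo(screenPoint.x, screenPoint.y);
        }
        canvas.drawPath(path, paint);
    }
}

The overlay would then be attached with mapView.getOverlays().add(new RouteOverlay(waypoints)) and redrawn with mapView.invalidate().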
At the time of our midterm presentation, our application has the ability to obtain latitude/longitude coordinates via the GPS sensor or Location Services, as outlined in the sections above. Concerning map visualization, we are able to center a map over a pair of latitude and longitude coordinates entered by the user.
Figure 4: Screenshot of Google Maps Implementation
Outside of Location Services, the key Google Maps classes that allow this are MapView, MapActivity, MapController, and GeoPoint. A necessary step to realize this was the generation of a Google Maps API key (which is unique to each machine used during an app's development). The key is required to identify one's application, as Google limits the number of times an individual or app may access its satellite and server data. In order to center a map on the location of the device, Location Services must be implemented: now that the Google Maps classes allow control and viewing of a map, interfaces such as LocationListener and LocationManager prompt the device to detect changes in GPS status and receive location updates. Upon further testing, the final components used to achieve our desired functionality will be described in greater detail at a later date.
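A minimal sketch of the centering step described above is shown below, assuming the API key is supplied to the MapView in the layout file; RouteMapActivity, the layout identifiers, and the coordinate values are placeholders.

// Sketch of a MapActivity that centers and zooms the map on a latitude/longitude pair.
import android.os.Bundle;
import com.google.android.maps.GeoPoint;
import com.google.android.maps.MapActivity;
import com.google.android.maps.MapController;
import com.google.android.maps.MapView;

public class RouteMapActivity extends MapActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.map);                 // layout containing the MapView
        MapView mapView = (MapView) findViewById(R.id.mapview);
        mapView.setBuiltInZoomControls(true);

        // Center and zoom on a coordinate pair (converted to microdegrees).
        MapController controller = mapView.getController();
        controller.setZoom(16);
        controller.setCenter(new GeoPoint((int) (40.4237 * 1e6), (int) (-86.9212 * 1e6)));
    }

    @Override
    protected boolean isRouteDisplayed() {
        return false; // required MapActivity override
    }
}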
H. Elevation Visualization
This option will create a timeline visualization of the elevation information using the sampling rate chosen by the user in the settings menu. The visual will resemble a line graph, shown in miles or kilometers versus time.
I. Calories Spent
VII. MIDTERM PROGRAMMING CHALLENGES
The biggest challenge our group has faced is the breadth of concepts and syntax required to understand the Android SDK environment. While there are a lot of resources available for support and learning (being part of a worldwide, multi-billion-dollar industry tends to create that), we have found that the Android platform has a steep learning curve.
One of the primary challenges our whole group faced was the lack of experience in developing an Android application. Eclipse is a great development tool, but even that was not enough to bridge the gap in our understanding of XML and Java. Fortunately, Google provides excellent resources to help one get up to speed. However, all of us struggled at first to understand how each component is related to, or required by, another component to create the overall application function. Miscellaneous Java compiler and syntax errors were eventually solved, and we have encountered nearly exactly what we expected in our initial proposal challenges section.
VIII. MIDTERM GOALS
With reference to our project roadmap, we feel that we accomplished what we set out to do before our midterm presentation. Although we do not yet have a firm method to calculate calories burned, we made progress with the map visualizations, which was initially a later milestone. As mentioned throughout our high-level system design section, our next steps require storing the latitude/longitude pairs of a user's route and then plotting them on a Google Map. Overall, our group is confident that we can achieve our original app functionality within the stipulated project timeline.
Figure 5 – Project roadmap