Is There Research in Smartphone-Based Systems?

The Convergence of Mobile and Cloud Computing
Ramesh Govindan
ramesh@usc.edu
University of Southern California
A Seductive Question…
When 4 billion smartphones are Internet-connected, how (if at all) should the Internet architecture change?
[Figure: smartphones connected through the network fabric to the cloud]
Bridging the Capability Gap
As the Smartphone becomes the primary computing device…
[Figure: capability over time, contrasting what users will want with what the device is capable of]
Bridging the Capability Gap
Dealing with Constraints
Energy-Delay Tradeoffs in a Video Documentation System
Overcoming Constraints
Toward Cloud-Assisted Interactive Mobile Perception
Energy Constraint
Battery life is an issue for usability
Opportunity: multiple peripherals and sensors
Intelligently use peripherals and sensors to reduce energy consumption
The Urban Tomography System
[Figure: video collection 10 years ago vs. video collection today]
M. Ra, J. Paek, A. Sharma, R. Govindan, M. Krieger, M. Neely, "Energy-Delay Tradeoffs in Smartphone Applications", ACM MobiSys 2010.
Users: transportation, post-disaster, security, urban planning
[Figure: documenting post-Katrina reconstruction]
Leveraging Delay-Tolerance
Many of our users are considerably delay-tolerant, but tolerance varies.
[Figure: user categories (transportation, security, planning, research, child development) arranged by delay-tolerance]
Data-Intensive, Delay-Tolerant Applications
Data-intensive: video, audio, sensing, large databases
Delay-tolerant:
• Context prefetching (download): large databases (e.g. WPS), maps, media
• Participatory sensing (upload): monitoring personal resources, dietary habits, urban documentation
Our focus: applications that are both data-intensive and delay-tolerant
Transferring Large Volumes of Data
Leverage delay tolerance to reduce the energy cost
Trade-offs

                  EDGE/3G        WiFi
Energy (J/bit)    HIGH           LOW
Availability      HIGH           LOW
Channel quality   Time-varying   Time-varying

Strategies: delay transmission; adapt to time-varying wireless channel quality
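The J/bit contrast in the table follows from radio power draw divided by achievable throughput: a slow cellular link burns comparable power for far fewer delivered bits. A minimal sketch; the power and rate figures below are illustrative assumptions, not measurements from the system:

```python
def energy_per_bit(radio_power_watts: float, throughput_bps: float) -> float:
    """Approximate transmission cost in joules per bit: power / throughput."""
    return radio_power_watts / throughput_bps

# Hypothetical numbers chosen only to show the trend in the table above:
# a ~10 KB/s EDGE link vs. a ~200 KB/s WiFi link at comparable radio power.
edge_jpb = energy_per_bit(radio_power_watts=1.0, throughput_bps=10e3 * 8)
wifi_jpb = energy_per_bit(radio_power_watts=0.8, throughput_bps=200e3 * 8)
print(f"EDGE ~{edge_jpb:.1e} J/bit, WiFi ~{wifi_jpb:.1e} J/bit")
```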
A Motivating Example
[Figure: timeline of EDGE, 3G, and WiFi availability with time-varying rates (10-200 KB/s); video 1 and video 2 arrive at different times]
Strawman Approaches
Energy-Optimal, Min-Delay, WiFi-Only
[Figure: energy (J) and delay (sec) of the strawman strategies (MD, ME, EO)]
The energy-optimal strategy can save significant energy.
Challenge: how do we design the optimal trade-off algorithm?
SALSA
Decides whether, when, and on which interface to transmit
Delayed transmission
Tunable delay-tolerance
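The "whether/when/which" decision can be framed as trading queue backlog against per-interface energy cost. The sketch below is a simplified, Lyapunov-style illustration of that idea, not the published SALSA algorithm; the interface parameters and the knob `v` (which tunes the delay-tolerance) are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Link:
    name: str
    rate_bps: float      # current estimated throughput
    power_watts: float   # radio power draw while transmitting

def choose_link(backlog_bits: float, links: list[Link], v: float):
    """Return the link to transmit on, or None to defer.

    Larger `v` weights energy more heavily (more deferral, more delay);
    smaller `v` favors draining the queue quickly.
    """
    best, best_score = None, 0.0
    for link in links:
        energy_per_bit = link.power_watts / link.rate_bps
        # Reward for draining backlog minus the weighted energy spent doing so.
        score = link.rate_bps * (backlog_bits - v * energy_per_bit)
        if score > best_score:
            best, best_score = link, score
    return best

# With a small backlog, a costly 3G link alone is not worth using (defer),
# but WiFi is; once the backlog grows large enough, even 3G gets used.
g3 = Link("3G", rate_bps=40e3 * 8, power_watts=1.2)
wifi = Link("WiFi", rate_bps=200e3 * 8, power_watts=0.8)
print(choose_link(backlog_bits=1e6, links=[g3], v=1e12))        # None -> defer
print(choose_link(backlog_bits=1e6, links=[g3, wifi], v=1e12))  # WiFi
```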
Main Results
[Figure: comparison of Min-Delay, WiFi-only, Static-Delay, Know-WiFi, and SALSA; schemes that ignore link quality or queue backlog do worse]
Since SALSA takes all factors into account, it performs closest to the optimal.
The Bottom Line
Gain: energy savings of 2%-80% of battery capacity
Loss: additional delay of 2 minutes to 2 hours
Bridging the Capability Gap
Dealing with Constraints
Energy-Delay Tradeoffs in a Video Documentation System
Overcoming Constraints
Toward Cloud-Assisted Interactive Mobile Perception
Interactive Mobile Perception
Advent of inference technology
• Computer vision
• Language translation
• Natural language processing, etc.
Rich sensors and smart mobile devices
• Perception applications are highly relevant for mobile devices
• Smartphone technology
• High-data-rate sensors
Application characteristics:
- Interactive (low latency, 10~100 ms)
- High data rate (media data)
- Computationally intensive (ML/vision-based algorithms)
M. Ra, A. Sheth, L. Mummert, P. Pillai, D. Wetherall, "Odessa: Enabling Interactive Perception Applications on Mobile Devices", ACM MobiSys 2011.
Challenge
Constraint: many perception applications require significant compute power
Overcoming the constraint: offload computation to the "cloud"
The offloading decision is non-trivial, since wireless network bandwidth is also constrained
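A back-of-the-envelope way to see why the decision is non-trivial: shipping the input over a constrained wireless link takes time too, so offloading only wins when the remote compute savings exceed the transfer cost. A minimal sketch under that simple latency-only model (parameter names and numbers are illustrative assumptions; Odessa's actual cost model is richer):

```python
def should_offload(local_compute_s: float,
                   remote_compute_s: float,
                   input_bytes: float,
                   uplink_bps: float,
                   rtt_s: float = 0.05) -> bool:
    """Offload a stage only if the remote path is faster end to end.

    Remote path = time to ship the input + round trip + remote execution.
    """
    transfer_s = input_bytes * 8 / uplink_bps
    return transfer_s + rtt_s + remote_compute_s < local_compute_s

# Shipping ~10 KB over a 1 Mbps uplink: a 500 ms stage is worth offloading,
# a 50 ms stage is not.
print(should_offload(local_compute_s=0.5, remote_compute_s=0.05,
                     input_bytes=10e3, uplink_bps=1e6))    # True
print(should_offload(local_compute_s=0.05, remote_compute_s=0.005,
                     input_bytes=10e3, uplink_bps=1e6))    # False
```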
Odessa
A system that automatically and dynamically
offloads components of a perception data-flow to the cloud
Applications
Face Recognition
Gesture Recognition
Pose Estimation
Odessa Goals and Techniques
Goals: high throughput (frames per second) and low latency (time to process a single frame)
Techniques: offloading and parallelism
Motivation: Offloading
Variability across devices
Variability across inputs (e.g. for face recognition)
Variability in network conditions
A static decision may not work; the offloading decision should be adaptive.
Parallelism
Metrics: throughput and makespan
1. Data parallelism: multiple instances of a stage work on a frame in parallel
2. Pipeline parallelism: several frames (1, 2, 3) are in flight through the pipeline at once, improving throughput
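A toy sketch of the two forms of parallelism applied to a three-stage perception pipeline. The stage functions, worker counts, and token count are made-up placeholders for illustration; this is not Odessa's runtime:

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in stages for a perception pipeline (illustrative only).
def detect(frame):      return [f"region{frame}a", f"region{frame}b"]
def extract(region):    return f"features({region})"
def classify(features): return f"label({features})"

def process_frame(frame, stage_pool):
    regions = detect(frame)
    # Data parallelism: run the extraction stage on all regions concurrently.
    features = list(stage_pool.map(extract, regions))
    return [classify(f) for f in features]

def run(frames, tokens=3):
    # Pipeline parallelism: up to `tokens` frames are processed concurrently.
    with ThreadPoolExecutor(max_workers=tokens) as frame_pool, \
         ThreadPoolExecutor(max_workers=4) as stage_pool:
        futures = [frame_pool.submit(process_frame, f, stage_pool) for f in frames]
        return [fut.result() for fut in futures]

print(run(range(3)))
```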
Motivation: Data Parallelism
The level of data parallelism affects both accuracy (face recognition) and performance, and partitioning must respect stage precedence.
[Figure: accuracy and execution time of the face detection stage (face recognition)]
[Figure: average execution time of the SIFT feature extraction stage (object and pose recognition)]
Motivation: Pipelining
A static choice of pipeline parallelism can lead to suboptimal makespan or leave the pipeline underutilized.
- The desirable number of tokens (frames in flight) may differ across applications and conditions.
- How do we even know the right number of tokens a priori?
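One simple way to choose the token count at run time instead of a priori is a hill-climbing loop: add an in-flight frame while throughput improves, and back off when per-frame makespan degrades. This is a generic sketch of that idea, not Odessa's published controller; `measure` and the latency budget are assumptions:

```python
def tune_tokens(measure, start=1, max_tokens=8, latency_budget_s=0.1):
    """Hill-climb the number of in-flight frames (tokens).

    `measure(tokens)` is assumed to run the pipeline briefly at that setting
    and return (throughput_fps, makespan_s).
    """
    tokens = start
    best_fps, _ = measure(tokens)
    while tokens < max_tokens:
        fps, makespan = measure(tokens + 1)
        if fps <= best_fps or makespan > latency_budget_s:
            break  # no throughput gain, or the latency budget is exceeded
        tokens, best_fps = tokens + 1, fps
    return tokens
```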
Odessa Design
Lightweight application profiler: collects per-frame statistics (e.g. cycles, parallel stages) from the application pipeline, piggybacks reports to the decision engine, and suppresses duplicated information.
Decision engine: uses these statistics to adapt stage offloading, data parallelism, and pipeline parallelism.
[Figure: pipeline of stages with the profiler reporting per-frame statistics to the decision engine]
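A rough sketch of what lightweight per-frame profiling with duplicate suppression might look like; the field names are assumptions for illustration, not Odessa's actual report format:

```python
from dataclasses import dataclass, asdict

@dataclass
class StageReport:
    stage: str
    cpu_cycles: int          # execution cost of the stage on this frame
    input_bytes: int         # data crossing the edge into this stage
    parallel_instances: int  # current data-parallelism level of the stage

class Profiler:
    """Collects per-frame stage reports, piggybacking only changed fields."""
    def __init__(self):
        self._last = {}

    def report(self, r: StageReport) -> dict:
        current = asdict(r)
        previous = self._last.get(r.stage, {})
        delta = {k: v for k, v in current.items() if previous.get(k) != v}
        self._last[r.stage] = current
        return delta  # piggybacked on the frame sent to the decision engine

prof = Profiler()
print(prof.report(StageReport("detect", 5_000_000, 30_000, 1)))  # full report
print(prof.report(StageReport("detect", 5_000_000, 30_000, 1)))  # {} - nothing changed
```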
Decision Engine
Incremental algorithm:
1. Pick the bottleneck as the starting point.
2. Is it a compute stage? If not, the bottleneck is a network edge.
3. Estimate the necessary costs.
4. Choose the best action: offload, spawn, or do nothing.
Data parallelism and stage offloading are considered simultaneously.
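A greedy sketch of one pass of that incremental loop, under simplifying assumptions: per-stage and per-edge costs come from the profiler, and `est_offload` / `est_spawn` are placeholder estimators rather than Odessa's cost models:

```python
def decision_step(stage_costs, edge_costs, est_offload, est_spawn):
    """One incremental decision: try to relieve the current bottleneck.

    stage_costs / edge_costs: {name: seconds per frame}
    est_offload(name) / est_spawn(name): estimated cost after taking that action.
    """
    # 1. Pick the bottleneck: the most expensive stage or network edge.
    bottleneck, cost = max({**stage_costs, **edge_costs}.items(),
                           key=lambda kv: kv[1])

    if bottleneck in edge_costs:
        # 2. Network edge: consider moving a stage so less data crosses it.
        candidates = {"offload": est_offload(bottleneck)}
    else:
        # 3. Compute stage: offloading and spawning another instance both apply.
        candidates = {"offload": est_offload(bottleneck),
                      "spawn": est_spawn(bottleneck)}

    # 4. Pick the best action, or do nothing if no estimate beats the status quo.
    action, new_cost = min(candidates.items(), key=lambda kv: kv[1])
    return (action, bottleneck) if new_cost < cost else ("do nothing", bottleneck)

print(decision_step({"detect": 0.12, "classify": 0.03},
                    {"detect->classify": 0.05},
                    est_offload=lambda s: 0.06,
                    est_spawn=lambda s: 0.07))   # ('offload', 'detect')
```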
Evaluation
[Figures: throughput and makespan over time]
Odessa finds an optimal configuration automatically.
Comparison
FPS: Higher is better
Makespan: Lower is better
Odessa shows a 3x improvement over competing strategies.
Convergence
Interesting research opportunities at the
boundary between mobile and cloud computing