RI Training March-2012

OCAP RI Training
CableLabs Winter Conference
Philadelphia
March 15-16, 2012
Schedule
• Thu Mar 15 – 8:30am to 4:30pm
» Project Overview and History - DaveH
» Build system - Chris
» Logging and debugging - ScottD
» Software architecture overview - MarcinK
» Profiling applications with the RI’s JVM – SteveA/ScottD
[lunch]
» System Information – Prasanna/Craig
» Service Selection and DVR subsystem - Craig
» PC Platform – Marcin/SteveM/Neha
» HN subsystem – Doug/Craig/Marcin/Lori
• Fri Mar 16 – 8:30am to 2:30pm
» Graphics - SteveA
» Testing – ScottA/Nicolas
» Issue reporting and handling - ScottA
[lunch]
» Stabilization/Robustness project – Marcin/Chris
» Miscellaneous topics - TBD
Project Overview and
History
March 15-16, 2012
Project started June 2008
• Core OCAP I16+ compliant
» Several ECNs short of 1.0.0 compliance
• DVR I03 extension
» Missing ECNs 897, 994, 1017
• Front Panel I02 extension
• Approx 200 bugs carried over from legacy code
base
• Vision Workbench/Clientsim SDK
• Closed code base
Current Status
• Feb 2012 (1.2.2 Bundle Release)
» Core OCAP 1.2.2
» DVR Extension I08
» Front Panel Extension I05
» Device Settings Extension I05
» Home Networking Extension I08
» Home Networking Protocol I07
• JVM
» Based on Sun open source PhoneME advanced MR2
• PC Platform
» Windows XP initially; now Windows & Linux (several versions)
• SDK
» Rel-6 released Nov 2011 – no further versions planned
• This is an open-source project
» http://java.net/projects/ocap-ri/
2012 Schedule
• 1.2.1 Bundle released on 01/12/12
• 1.2.2 Bundle (1.2.2 Rel-A RI) released on 02/24/12
» Included VPOP as primary feature
» Also MIBObject.getValue
» Completed implementation of ServiceResolutionHandler
» Completed implementation of DTCP/IP on PC platform
» Maintenance against 1.2.1 RI
• 1.2.2 Rel-B planned for 04/05/12
– No feature development
– Maintenance against 1.2.2 RI
– Fixes will be taken off the current trunk
– 1.2.2 will not be maintained as a separate branch
• 1.2.3 Bundle planned for 05/17/12
» HN features to align with CVP-2 compliance
» Maintenance against 1.2.2 RI
» Backlog items
• Add IPv6 and https support to the PC platform
• Stack housekeeping tasks
Other
• DTCP platform port for Windows and Linux
» MPEOS API changes were included with 1.2.1 RI
release
» RI Implementation now available for internal testing at
CableLabs
• Phasing out Windows XP support in favor of Windows 7
» Some performance issues with the GStreamer plugin on Win7/newer Linux platforms are being investigated
• SDK Rel-6 released on 11/17/11
» No current plans for a future release
• Upgrade of the DirectFB library we use (low priority)
• Port the RI to the Mac (very low priority)
Project site walk through
• Main Project Page
• Bug Databases
• Forums
• Wikis
• Contribution Process
• Bug fix cutoffs
• Release notes
• Coding standards
Why an RI?
[Diagram: an OCAP Release Bundle packages the OCAP Specifications, Stubs & DTDs, the OCAP Reference Implementation, and the OCAP Tests (CTP) used to test it.]
OCAP Release Bundle
• Components of a bundle are
» Specs
» Stubs
» DTDs
» RI implementation, including integration tests
» CTP conformance tests
OCAP RI Requirements
• RI runs on a PC
» Windows initially – now Linux
• RI and PC IDE must be available on open-source terms
• RI and PC IDE must only include components with licenses compatible with the ODL dual-license plans
» Components available only under GPL are not OK
» Licenses for all third-party RI components must be reviewed by both CableLabs and the ODL legal teams
• RI works with existing CableLabs ATE/CTP tests
• RI adheres to current and future OCAP core specs
• RI adheres to current and future OCAP extensions specs
• To ensure backwards compatibility of the spec, MSO guides must run on the RI
• To ensure backwards compatibility of stack ports of the RI, any changes to the MPEOS porting layer must be approved by the RI steering committee
Licensing Models
• GPL License on java.net
» CableLabs OpenCable Project
– OCAP Stack, PC Platform, Head-end Emulator
» Sun PhoneME Project - JVM
• Commercial License
» CableLabs Commercial License
– Also free
– Stack, platform and emulator
– RAND IPR commitment
– Bug fixes in stack contributed back
» Sun or other JVM vendor
– Commercial CDC/PBP 1.1 JVM (instead of the phoneME JVM)
OCAP RI Branching Strategy
• Three principal branches
» Mainline/Development Branch
– Code implemented by internal RI Dev Team
– Code from open source contributors that are vetted by RI Tech Leads
– Other working branches get merged back to Mainline periodically
» Branded Branch (e.g., “1.1.4”)
– Fixes and enhancements that are tied to the spec and which have been
verified by the CTP
– Branded branch is maintained separately from mainline
– Changes from branded branch eventually migrate back to mainline
development
– One branded branch per spec release
» Experimental Branch
– Open source contributors have write access to this directory
– No other restrictions
– Merging to Mainline on a case-by-case basis
Bug Tracking
• Two Bug Tracking Databases
» Internal (private) JIRA db (OCORI) at CableLabs,
tied to CableLabs CTP bug db
» External (public) JIRA db on java.net (IT); hides
details of CTP-related issues
RI Build System
March 15-16, 2012
Building the RI – The Easy Way
See https://community.cablelabs.com/wiki/display/OCORI/Quick+Start for
detailed instructions.
• Setup development environment
» Cygwin + JDK + Ant for Windows
» A little more required for Linux (see Wiki)
• Get checkout_dev_env.sh (from svn)
• Use checkout_dev_env.sh to get source, create setEnv
file
• Execute ant in appropriate directory.
» Builds Platform and Stack.
• See Wiki for detailed instructions.
Build System – Environment Variables
• Easy to work in several different RI code bases at the
same time.
• OCAPROOT
» The absolute path to the OCAP-1.0 directory.
» Required for compilation/execution
» Example: E:/Cablelabs/svn/OCAPRI/trunk/ri/RI_Stack
• OCAPHOST
» Defines the host build environment
» Build system reads host environment configuration files
from ($OCAPROOT/hostconfig/$OCAPHOST)
» Required for compilation only
» Example: Win32-Cygwin
Build System – Environment Variables
• OCAPTC
» The Target Configuration for the build. Basically the port
you are working on.
» Defines a subdirectory hierarchy where:
– build configuration files are found
($OCAPROOT/target/$OCAPTC)
– binary intermediate products are built
$(OCAPROOT/gen/$OCAPTC)
– final binary products are installed and runtime configuration
files are kept ($OCAPROOT/bin/$OCAPTC)
» Suggested format is:
– <org>/<platform>/<os>/[debug|release]
– Example: CableLabs/simulator/Win32/debug
» Required for compilation/execution
Build tools
• Make
» Compiles JNI, MPE, MPEOS, and third-party
native libraries
• Ant
» Coordinates the entire build system
» Wiki contains a list of top-level build targets
• JDK (1.4 or higher)
» Used to compile stack and test application
sources
Win32 Port
• Host environment is Cygwin
» See Wiki for a full list of Cygwin libraries required to build
the RI Stack and Platform
• Cross-compile to MinGW (no Cygwin DLL)
• Lots of work (including JVM patches) to deal with POSIX
vs. Win32-style paths
» POSIX for gcc
» Win32 for javac, javah, etc.
• VERY SLOW (compared to Linux)
» JVM binaries pre-built and checked-in to save compilation
time since most won’t be modifying the JVM
• Windows XP, Vista
Linux Port
• Known working distros/versions:
» Fedora 10/12/13/15
» Ubuntu 10.04/10.10/11.04
• Much faster than Win32 on the same hardware.
• See Wiki for detailed instructions.
Logging and Debugging
March 15-16, 2012
Stack logging
• log4j APIs included in the spec for use by
applications
• Additional Logger methods avoid String
concatenation overhead in most cases
• Monitor applications configure logging through
DOMConfigurator or PropertyConfigurator
• Groups
» Multiple loggers can share a common group name, which can be used during configuration
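A minimal usage sketch (ordinary log4j API; the logger name shown is an illustrative assumption, not an RI convention):

import java.util.Properties;
import org.apache.log4j.Logger;
import org.apache.log4j.PropertyConfigurator;

public class StackLoggingSketch
{
    private static final Logger log = Logger.getLogger("ExampleGuide.Tuning");

    // A monitor app could equally load an XML file via DOMConfigurator.
    public static void configure()
    {
        Properties p = new Properties();
        p.setProperty("log4j.rootLogger", "INFO, stdout");
        p.setProperty("log4j.appender.stdout", "org.apache.log4j.ConsoleAppender");
        p.setProperty("log4j.appender.stdout.layout", "org.apache.log4j.PatternLayout");
        p.setProperty("log4j.appender.stdout.layout.ConversionPattern", "%d %-5p %c - %m%n");
        PropertyConfigurator.configure(p);
    }

    public static void tune(int channel)
    {
        // Guarding with isDebugEnabled() avoids the String concatenation cost
        // when debug logging is disabled.
        if (log.isDebugEnabled())
        {
            log.debug("tuning to channel " + channel);
        }
        log.info("tune requested");
    }
}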
Stack logging continued
• New appenders
» MIB Appender
» AsyncAppender uses an additional thread and a queue to offload writing to the target appender from the caller thread
• New configuration capabilities
» Configure at the ‘group’ level or the logger level
» Filter support, including ExpressionFilter (ability to use regular expressions for fine-grained control over logging verbosity)
Stack logging continued
• Additional information available from the Wiki
• https://community.cablelabs.com/wiki/display/OCORI/Configuring+Java+stack+logging
Platform logging
• Platform code uses log4c to manage logging
• Configuration found in
$PLATFORMROOT/log4crc
• Additional information available from the Wiki
• https://community.cablelabs.com/wiki/display/OCORI/RI+PC+Platform+Logging
Logging and IssueTracker
When attaching a log to IssueTracker
• Ensure the log contains timestamps
• Helpful if Java debug logging is enabled
Chainsaw screenshot
Java Stack debugging
• Possible to step through breakpoints in Java
stack code, generate stack traces and thread
dumps
• Stack trace, thread dumps available via jdb
(included with the Sun JDK)
• To enable Java debugging, un-comment VMOPT 19 & 20 in mpeenv.ini and start the debugger or jdb
• Re-comment VMOPT 19 & 20 when done.
Platform debugging
• gdb can be used to generate a trace if the
Platform terminates unexpectedly
• ./runRI.sh -gdb
RI Software Architecture
March 15-16, 2012
Platform Support Libraries
[Diagram: the RI Platform Support Libraries – SNMP, GLib, PThreads, GStreamer, XML2, Int’l, Gst-Plugins-Base, Log4c, ZLib, Gst-Plugins-Good, FFMPEG and wxWidgets – sit on the host operating system (Windows and Linux APIs).]
Platform Implementation
[Diagram: the RI Platform Implementation (Tuner Control, GStreamer Plugins, User Interface) is layered on top of the support libraries.]
Platform API
[Diagram: the RI Platform API is exposed on top of the RI Platform Implementation.]
Platform Summary
• Full software emulation of STB media decoding
and presentation hardware.
• Majority of the code is 3rd party support libraries.
• Leverages existing frameworks:
» GLib – utility library.
» GStreamer / HDHomeRun / VLC – tuner control.
» GStreamer – media decoding and presentation.
» wxWidgets – user interface.
» Net-SNMP – Master Agent.
• No OS abstraction APIs.
OCAP Porting API
[Diagram: the MPEOS API comprises OS APIs (DLL, Debug, Event, File, Memory, Socket, Storage, Sync, Thread, Time, Util), Presentation APIs (Display, FP, Graphics, UI Events) and Decoding APIs (Closed Captioning, DVR, Filter, Media, POD, Sound, VBI); the MPEOS implementation sits on the RI Platform API, with FT2 (FreeType) and DirectFB alongside.]
OCAP Native Library
[Diagram: the MPE Library (File System Management, DSM-CC, SI Database & Parsing) is layered on the MPEOS API.]
OCAP JVM
[Diagram: the phoneME JVM (a PBP 1.1 VM) is layered on the MPE Library.]
OCAP Java Implementation
[Diagram: the OCAP Java Managers (App, Auth, EAS, Host, Record, Service, SNMP, TSB, ...) together with the Core Implementation, NanoXML, Log4J and JUnit run on the PBP 1.1 phoneME JVM.]
OCAP API
[Diagram: the OCAP API is exposed by the OCAP Java implementation to applications.]
OCAP Summary
• Ported to Platform APIs for STB/video-related
functionality.
• The OS-functionality portion of the MPEOS port is maintained together with the stack code.
• MPE contains platform-independent / portable C
code.
• Integrates Advanced phoneME JVM.
• OCAP Functionality implemented via pluggable
Java Manager implementations.
Profiling applications with
the RI’s JVM
March 15-16, 2012
phoneME JVM
• RI uses the open source (GPL2) phoneME Advanced JVM from
Sun/Oracle
» Closely based on J2SE 1.4
» Adding JSSE (secure sockets) support in an upcoming release
• OCAP requires a JVM compliant with the latest maintenance
releases of:
» Personal Basis Profile 1.1
» Foundation Profile 1.1
» Connected Device Configuration 1.1
• Last update: October 24, 2009
• Patch common source to fix bugs and build problems
» Win32/Cygwin/Make filesystem issues
» JDWP (VM debugging) thread sync issues
» PNG bug
» GIF bug?
JVM Build
• All JVM-related files located in
$OCAPROOT/jvm
• Build disabled by default for RI Win32 – pre-built
binaries checked in to SVN
» Enable/Disable building the VM with “build.jvm.exclude”
entry in
$OCAPROOT/target/$OCAPTC/buildrules.properties
ocap_vmlib
• Interfaces and classes to assist in integrating a
VM with the RI Stack
• Includes a full AWT implementation ported to
MPE graphics APIs (DirectFB)
• Documentation
» $OCAPROOT/docs/JVMPortingGuide.doc
Profiling and analysis tools
• There are a number of tools available for investigating issues in the Java stack code
• NetBeans Profiler
• CVM Inspector
• HPROF
• jdb
Profiling with NetBeans 6.8
• JVMTI-based
• Supports profiling of CPU, Memory and high-level JVM stats (GC, thread activity)
• Used to identify CPU and memory hot spots
• Does not support creation or comparison of heap dumps
• https://community.cablelabs.com/wiki/display/OCORI/Profiling+the+RI%27s+JVM+using+NetBeans+6.8
CVM Inspector
• Set of utility functions and a shell script which can assist in inspecting the state of the JVM
• Available by setting a flag in the JVM build
• Used to generate and compare heap dumps
• Runs in either client-server mode (standard JVM client connects via a socket to the RI) or standalone mode (GDB)
• https://community.cablelabs.com/wiki/display/OCORI/Generating+heap+dumps+on+the+RI%27s+JVM+using+CVM+Inspector
HPROF
• A command-line profiling tool
• Used to generate detailed monitor statistics
• https://community.cablelabs.com/wiki/display/OCORI/Generating+thread+monitor+stats+on+the+RI%27s+JVM+using+hprof
JDB
• The Java debugger command line interface
• Used to generate lock and thread stack information
• https://community.cablelabs.com/wiki/display/OCORI/Generating+Java+thread+dumps+and+monitor+information
Service Information (SI)
March 15-16, 2012
OCAP SI
• OCAP SI access API
» Provides information about available services in an
interactive broadcast environment
• OCAP uses SCTE-65 SI model (standard for SI
delivered out-of-band for Cable networks)
• SI tables are acquired from out-of-band channel
(ex: legacy OOB/DAVIC or DSG broadcast tunnel)
• Incorporates JavaTV SI API
• When a CableCARD is present, all applications (service bound/unbound) have access to SI delivered on the OOB channel.
SI profiles
• SCTE-65 defines 6 profiles for OOB SI
• RI stack supports Profiles 1-3
• Includes tables: NIT-CDS, NIT-MMS, SVCT-DCM,
SVCT-VCM, NTT-SNS and STT (optional)
SI Profile | Tables
1 – Baseline | SVCT, NIT, NTT
2 – Revision Detection | Versioning enabled for NIT, NTT, SVCT
3 – Parental Advisory | Profile 2 plus RRT
4 – Standard EPG Data | Profile 3 plus AEIT, AETT
5 – Combination | LVCT, MGT; backward compatible (1-4)
6 – PSIP only | LVCT, AEIT, optional AETT
Stack Components
OCAP SI
• Incorporates following Java TV SI packages
» javax.tv.locator
» javax.tv.service.transport.Transport
» javax.tv.service.transport.Network
» javax.tv.service.transport.TransportStream
» javax.tv.service.transport.ServiceDetails
OCAP PSI
• Incorporates following packages
» javax.tv.service.navigation.ServiceComponent
» org.ocap.si.ProgramAssociationTable
» org.ocap.si.ProgramMapTable
• Supports Object Carousel
» Carousel ID resolution
» Deferred association tag resolution
» NSAP resolution
Java SIDatabase/SICache
• Access to native SIDB via Java SIDatabase
• SI data/events processed and dispatched to
registered listeners
• Most recent SI is cached in the SICache to avoid
trips across JNI layer to MPE SI Database
• SICache is flushed periodically (flush interval
configurable)
• Discrete SI request types to access various SI
objects
• SI requests are satisfied asynchronously or time out (timeout value configurable)
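A hedged sketch of an asynchronous SI request using the standard JavaTV API (the sourceID value is hypothetical):

import javax.tv.locator.Locator;
import javax.tv.service.SIManager;
import javax.tv.service.SIRequestFailureType;
import javax.tv.service.SIRequestor;
import javax.tv.service.SIRetrievable;
import org.ocap.net.OcapLocator;

public class SiRequestSketch implements SIRequestor
{
    public void requestDetails() throws Exception
    {
        SIManager si = SIManager.createInstance();
        Locator loc = new OcapLocator(0x45a);      // hypothetical sourceID
        si.retrieveServiceDetails(loc, this);      // satisfied asynchronously from the SICache/SIDatabase
    }

    public void notifySuccess(SIRetrievable[] result)
    {
        // ServiceDetails delivered here once the request is satisfied
    }

    public void notifyFailure(SIRequestFailureType reason)
    {
        // e.g. the configurable request timeout elapsed or data is unavailable
    }
}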
MPE SI
• When does it start?
» Gated by POD_READY indication
» Event-driven acquisition and parsing of OOB
table sections via MPEOS section filtering API
» Dedicated thread for OOB SI
» At start-up sets filters for all five table types: NIT-CDS, NIT-MMS, SVCT-DCM, SVCT-VCM, NTT-SNS
» Individual table timeout values configured via
mpeenv.ini
» SIDB populated in MPE layer
SITP SI State diagram
[State diagram: states include start, OOB SI Acquiring, SI Not Available Yet, SI Fully Acquired, SI Updates, SI DISABLED and SI NOT AVAILABLE; transitions are driven by NIT/SVCT/NTT reception, timeouts, and receipt of all OOB tables.]
MPE SI (contd.)
• Profile-1 SI and CRC processing
» Section count unknown. MPE SI engine employs
a heuristic based on unique section CRC values
and CRC match count for deterministic...
» Section CRC match counts are configurable for
individual tables via mpeenv.ini
» Table acquisition timeout values also aid in faster
acquisition
» SVCT DCM scoreboard used to accelerate VCM
acquisition
» On slower networks with an infrequent section repeat cycle, SI acquisition can be problematic
MPE SI (contd.)
• Profile-2 and above rely on the Revision
Detection Descriptor (RDD) for table section
count
MPE SI Cache
• Enable/Disable by mpeenv.ini configuration
(Disabled in RI by default)
• Speeds up stack start-up on slower networks by
using cached SI
• SI data is written to persistent memory
• SI data read from file at start-up if cache is
enabled
• Normal OOB SI acquisition also continues
• Updates to SI are reflected in the cache
• For testing only (Not intended for deployment)
MPE PSI Subsystem
• PSI: PAT & PMT acquisition using the MPE Section
Filter subsystem
• Manages PAT/PMT acquisition and parsing from OOB
(DAVIC/DSG Broadcast), DSG application tunnels,
tuners, and HN streams
MPE PSI Subsystem
• SITP uses 6 different filter classes on each transport stream:
• Initial PAT
• Initial selected PMT
• Initial secondary PMT(s)
• Revision PAT
• Revision selected PMT
• Revision secondary PMT(s)
• Fixed resources (local tuners & legacy out-of-band) are assigned
filter groups at startup according to mode
• Dynamic resources (DSG app tunnels, HN streams, remote tuners)
are assigned filter groups when the session is started
• SITP acquisition logic then works in terms of classes and filter
priorities without concern for class-to-group associations
MPE PSI
• State machine:
[State diagram: the recoverable states are IDLE, WaitInitialPAT, WaitInitialPrimaryPMT, WaitInitialSecondaryPMT and WaitRevision. Transitions are labelled TUNE_SYNC (A1), TUNE_UNSYNC (A2), PAT matched (A2+A3), primary PMT matched (A4+A5), all secondary PMTs matched (A4+A6), PMT matched (A4+A7), and timeout (A8 or A9).
Action table:
A1: Set PAT positive filter (with timeout)
A2: Cancel all filters
A3: Parse PAT; set PAT negative filter; set primary PMT filter (with timeout); set secondary PMT positive filters
A4: Parse PMT
A5: Set primary PMT revision filter
A6: Set secondary PMT positive filter
A7: Set secondary PMT negative filter
A8: Set PAT positive filter (no timeout)
A9: Set PMT positive filter (no timeout)]
MPE PSI Subsystem
• Has 6 defined acquisition modes for tuner-based
section acquisition, to tailor section filter resource usage
to the platform’s filtering capabilities.
• Mode 1: Legacy single-filter sharing
• Mode 2: Dedicated filter per tuner
• Mode 3: Dedicated 2 filters per tuner without secondary
acquisition
• Mode 4: Dedicated filter per tuner for PAT and selected PMT, with a “wandering” PSI pre-fetch filter
• Mode 5: Mode 3 + 1 filter that picks up all secondary PMTs across all tuners
• Mode 6: No filter sharing (every section request uses its own
filter)
SIDB
• MPE SIDB contains following (SI/PSI) objects
» Transports
» Networks
» Transport Streams
» Services
» Programs
» Elementary Streams
• Provides API to access the various SI objects
SIDB API
• Lock/unlock SI DB for read/write access
• Access provided using opaque SI handles
• Service resolution methods
» mpe_siGetServiceHandleBySourceId()
» mpe_siGetServiceHandleByFPQ()
» mpe_siGetServiceHandleByServiceName()
» Etc.
• SI enumeration methods
» mpe_siGetAllNetworks()
» mpe_siGetAllServicesForTransportStream()
» mpe_siGetServiceComponentsForServiceHandle()
» Etc.
SIDB API (contd.)
• Object Carousel support methods
» Look-up PID given CarouselID/Component Tag
» Find Program Number given Deferred Association
Tag
• CA support methods
» Enumerate CA descriptors
» Lookup ECM PID
System Time
• RI parses out-of-band System Time Table (STT)
to extract system time in UTC
• STT section filtering and parsing done on a
dedicated thread in MPE SITP layer
• Can be disabled for platforms which directly
process STT or use alternate mechanism
• Maintains a difference between system time and
network time
• Employs a smoothing algorithm to avoid jumps
Service Selection
& DVR Subsystem
March 15-16, 2012
ServiceContext
• JavaTV ServiceContext is the primary entity for
Service presentation/selection
• AV Services can be:
» “Broadcast” Service. Service object retrieved from
SIManager or OCAPLocator representing a
frequency/program/modulation.
» RecordedService. A recording accessed via a
RecordingRequest.
» RemoteService. A Service representing an HN-streamable ContentItem retrieved from the ContentServerNetModule
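A hedged sketch of selecting a “broadcast” Service resolved from an OcapLocator (standard JavaTV/OCAP calls; the sourceID is hypothetical):

import javax.tv.service.SIManager;
import javax.tv.service.Service;
import javax.tv.service.selection.ServiceContext;
import javax.tv.service.selection.ServiceContextFactory;
import org.ocap.net.OcapLocator;

public class SelectSketch
{
    public void selectBroadcast() throws Exception
    {
        ServiceContext sc = ServiceContextFactory.getInstance().createServiceContext();
        Service s = SIManager.createInstance().getService(new OcapLocator(0x45a)); // hypothetical sourceID
        sc.select(s);   // selection completes asynchronously; add a ServiceContextListener for events
    }
}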
ServiceContext
• The RI’s ServiceContext implementation utilizes delegates
for each type of Service that can be presented:
» BroadcastServiceContextDelegate: Used for presenting
Broadcast Services when the DVR extension is not present
» RecordedServiceContextDelegate: Used for presenting
RecordedService (DVR-only)
» RemoteServiceContextDelegate: Used for presenting
RemoteServices (HN-only)
» DVRBroadcastServiceContextDelegate: Used for presenting
Broadcast Services when the DVR extension is present. Supports
the TimeShiftProperties interface which allows the app to
enable/disable timeshift and specify timeshift duration.
» AbstractServiceContextDelegate: Used for selecting
AbstractServices – Services which represent unbound xlets
ServiceContext
• Class relationship:
ServiceContext
• OCAP requires that the ServiceContext provide a JMF Player to the
application for media control
• On the RI, the Player and the Player-associated Controls provide
rendering control, but actual rendering is done by the platform.
• The ServiceContextDelegate acquires tuner/TSB resources and the
RI’s Player infrastructure manages everything else, including:
» MediaAccessAuthorization (parental control, etc)
» Conditional Access
» PSI/ServiceComponent retrieval
» NetworkInterface/component validation
» CCI enforcement
» TSB/Segmented Time-shift attachment (for
DVRBroadcastServiceContextDelegate)
» SegmentedRecordedService navigation (for
RecordedServiceContextDelegate)
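A hedged sketch of obtaining the JMF Player that the ServiceContext exposes and using it for trick play (standard JavaTV/JMF calls; on a DVR build the rate change plays from the time-shift buffer):

import javax.media.Player;
import javax.tv.service.selection.ServiceContentHandler;
import javax.tv.service.selection.ServiceContext;

public class PlayerSketch
{
    public Player findPlayer(ServiceContext sc)
    {
        ServiceContentHandler[] handlers = sc.getServiceContentHandlers();
        for (int i = 0; i < handlers.length; i++)
        {
            if (handlers[i] instanceof Player)
            {
                return (Player) handlers[i];   // the media-control Player for the presented service
            }
        }
        return null;
    }

    public void rewind(ServiceContext sc)
    {
        Player p = findPlayer(sc);
        if (p != null)
        {
            p.setRate(-2.0f);   // trick play; requires an attached TSB on DVR builds
        }
    }
}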
ServiceContext
• Resources required for presentation:

Resource | Abstract service | Broadcast service | Recorded service | Remote service
AppsDatabase | Yes | Yes | No | No
Video device | No | Yes | Yes | Yes
NetworkInterface | No | Yes* | Yes** | No
MediaAccessHandler | No | Yes | Yes** | No
RecordingManager | No | No | Yes | No
MediaStorageVolume | No | No | Yes | No
UPnP/DLNA framework | No | No | No | Yes

* The non-DVR ServiceContextDelegate manages a NetworkInterface directly; the DVR broadcast ServiceContextDelegate uses a NetworkInterface indirectly (via TimeShiftWindowClient)
** If presenting an ongoing recording and switching to the 'live point'
ServiceContext Basics
• Delegate/Player relationship:
ServiceContext
• Basic Broadcast Service “live” playback example:
[Sequence diagram: DVBServiceContext.select() called for service s1 triggers mpe_mediaTune(freq1, mod1); the MPE SI Manager initiates PAT/PMT section filtering via mpe_filterSetFilter() and receives MPE_SF_EVENT_SECTION_FOUND when the PAT/PMT are found. Once the tune completes (MPE_TUNE_SYNC), the JMF Broadcast Player is started and mpe_mediaDecode() begins a decode session using the Locator-specified components/PIDs from the PMT; the session ends with mpe_mediaStop() when the JMF player is stopped.]
ServiceContext
• Basic Broadcast Service time-shift playback example:
[Sequence diagram: DVBServiceContext.select() for service s1 triggers mpe_mediaTune(freq1, mod1) and PAT/PMT section filtering; once the tune completes (MPE_TUNE_SYNC), mpe_dvrTsbBufferingStart() begins time-shift buffering of s1 and the JMF TSB Player starts in live mode via mpe_mediaDecode(). A user-initiated rewind via Player.setRate() stops the live decode (mpe_mediaStop()) and starts a time-shift playback session with mpe_dvrTsbPlayStart(), switching the JMF TSB Player to time-shift playback. A user-initiated jump back to live via Player.setMediaTime(POSITIVE_INFINITY) stops the TSB playback (mpe_dvrPlayBackStop()) and resumes live decode (mpe_mediaDecode()); TSB wrapping eventually starts on this platform.]
ServiceContext
SC State machine: [state diagram not reproduced]
DVR TimeShiftManager
TimeShiftManager Basics
• Internal Manager only (not exposed to OCAP Applications)
• Provides both tuning and TSB management for multiple parts of the
DVR-enabled RI stack
• ServiceContext implementation uses TSM to:
• Tune
• Enable Service buffering (via TimeShiftProperties)
• Discover already-tuned/buffering Services (and NetworkInterfaces)
• Acquire/enumerate TimeShiftBuffers (TSBs) for playback
• RecordingManager uses TSM to:
• Tune
• Enable Service Buffering (via BufferingRequests)
• Discover already-tuned/buffering/buffered Services (and NetworkInterfaces)
• Convert previously-buffered and currently-buffering TSBs into RecordedServices
TimeShiftManager Basics
• HN TSBChannelStream uses TSM to:
• Tune (ChannelContentItem Tuning Locator)
• Discover already-tuned/buffering Services
• Initiate buffering
• Acquire/enumerate TSBs for streaming
TimeShiftManager
[Diagram: a ServiceContext presenting service S with the DVR extension enabled, a Recording of service S, a BufferingRequest for service S, and HN streaming initiated for the ChannelContentItem associated with service S all go through TimeShiftManager: each can initiate buffering, discover existing buffering, and then present, convert to a recording, or stream from the shared time-shift resources / time-shifted content for service S.]
TimeShiftManager Responsibilities
• Base Service acquisition mechanism for DVR-enabled RI stack.
• Manages the pool of MPE time-shift buffers (the number of TSBs
and their size)
• Embodies the knowledge of how Network Interfaces are shared and
creates SharedResourceUsages
• Represents the use of the NI in ResourceContention and manages
the coordinated shutdown of NI-based operations when/if the NI is
lost or necessary components are lost, including:
• Transport stream synchronization
• Service component availability
• Conditional Access (CA)
• Switched digital video transitions
• Manages the TimeBases for all TSBs to allow for:
• Proper navigation of TSBs by JMF
• Discovery of TSB/TSBs for retroactive recording (conversion of
already-buffered content into RecordedService(s))
TimeShiftManager Internals
[Class diagram: TimeShiftManagerImpl implements the TimeShiftManager interface and is registered via ManagerManager; TimeShiftWindowClientImpl implements the TimeShiftWindowClient interface. A TimeShiftWindow aggregates n TimeShiftWindowClientImpl objects, n TimeShiftBuffers and a NetworkInterfaceController, and is managed by TimeShiftManagerImpl.]
TimeShiftManager Internals
Static/Sequence
Diagrams
DVR RecordingManager
RecordingManager Basics
• Defined as part of OCAP DVR I08 and DVB GEM DVR
• org.ocap.shared.dvr.RecordingManager is defined by DVB GEM – meaning its interface is used by specifications other than OCAP
• org.ocap.dvr.OcapRecordingManager extends the GEM RecordingManager – adding OCAP-defined functionality
• Has to work with/enable the HomeNetworking extension when present
RecordingManager Responsibilities
• RecordingManager-provided functionality:
• Is responsible for starting and stopping
RecordingRequests at particular times and managing
conditions which can interrupt/resume recordings (so
RecordingManager is always running)
• Manages and attempts to honor BufferingRequests
• Manages RecordedServices (“Recordings”)
• Persistently saves RecordingRequests and associated
application-supplied metadata (so RRs survive reboots)
• Manages the space allocation/prioritization for
RecordingRequests
• Provides warnings when NetworkInterfaces are about to be utilized for RecordingRequests (RecordingAlertListeners)
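A hedged sketch of scheduling a recording through the GEM/OCAP RecordingManager. The locator, times, and the seven-argument OcapRecordingProperties constructor shown here are assumptions for illustration; check the DVR I08 javadoc for the exact argument order and units:

import java.util.Date;
import org.ocap.dvr.OcapRecordingProperties;
import org.ocap.dvr.OcapRecordingRequest;
import org.ocap.net.OcapLocator;
import org.ocap.shared.dvr.LocatorRecordingSpec;
import org.ocap.shared.dvr.RecordingManager;

public class RecordSketch
{
    public void schedule() throws Exception
    {
        OcapLocator channel = new OcapLocator(0x45a);            // hypothetical sourceID
        OcapRecordingProperties props = new OcapRecordingProperties(
                OcapRecordingProperties.HIGH_BIT_RATE,           // bit-rate hint
                60 * 60 * 24 * 7,                                // expiration period (units per javadoc)
                OcapRecordingProperties.DELETE_AT_EXPIRATION,    // retention priority
                OcapRecordingProperties.RECORD_WITH_CONFLICTS,   // priority flag
                null,                                            // default file access permissions
                null,                                            // organization
                null);                                           // default MediaStorageVolume
        LocatorRecordingSpec spec = new LocatorRecordingSpec(
                new OcapLocator[] { channel },
                new Date(System.currentTimeMillis() + 60000L),   // start in one minute
                60L * 60 * 1000,                                 // duration in milliseconds
                props);
        OcapRecordingRequest rr =
                (OcapRecordingRequest) RecordingManager.getInstance().record(spec);
    }
}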
RecordingManager API
RecordingManager API
Diagrams
RecordingManager Implementation
• RecordingManager class diagram
[Class diagram: RecordingRequest (with OcapRecordingRequest and ParentRecordingRequest) is implemented by RecordingRequestImpl, RecordingImpl and ParentNodeImpl; RecordedService and SegmentedRecordedService are implemented by RecordedServiceImpl and SegmentedRecordedServiceImpl; recording state is persisted via RecordingInfo2, RecordedSegmentInfo and RecordingInfoTree.]
RecordingInfo Metadata
[Diagram: persisted metadata types – PersistentData, RecordingInfoNode, RecordingInfoTree, RecordingInfo, RecordingInfo2 (old and current), SegmentedLeaf (old and current), RecordedSegmentInfo, TimeTable, TimeBasedDetailsInfo and RecordedServiceComponentInfo; each of these forms represents a discrete persistent file type.]
RecordingRequest States
The “blue sky” path for a (Leaf) RecordingRequest takes it through these states:
• PENDING_NO_CONFLICT: Recording is ready to record and there is no contention for tuners with any other RecordingRequest
• IN_PROGRESS: Recording is ongoing (tuner is tuned and data is being written to disk)
• COMPLETE: Recording has completed and content is present in its entirety
• DELETED: Recording has been deleted (RecordedService.delete() has been called)
RecordingImpl states
• Many things may not go as planned during the recording process, which puts the RR into IN_PROGRESS_WITH_ERROR:
• The RecordingRequest may not acquire an NI (tuner)
at the time it’s supposed to start
• The NI may be lost after the RR has started
• Sync may be lost on the NI
• PSI (PAT/PMT) may not be acquired or lost
• Conditional Access may be denied
• RemovableStorageVolume is disconnected
• MediaStorageVolume (disk) becomes full
• Service may be re-mapped via SPI (SDV)
Any of these conditions may remedy themselves
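A hedged sketch of watching for those transitions with a RecordingChangedListener (registration and event accessors follow the shared DVR API; treat exact names as assumptions):

import org.ocap.shared.dvr.LeafRecordingRequest;
import org.ocap.shared.dvr.RecordingChangedEvent;
import org.ocap.shared.dvr.RecordingChangedListener;
import org.ocap.shared.dvr.RecordingManager;

public class RecordingWatcherSketch implements RecordingChangedListener
{
    public void start()
    {
        RecordingManager.getInstance().addRecordingChangedListener(this);
    }

    public void recordingChanged(RecordingChangedEvent e)
    {
        if (e.getState() == LeafRecordingRequest.IN_PROGRESS_WITH_ERROR_STATE)
        {
            // a (possibly recoverable) error -- lost NI, lost sync, CA denial, full disk, ...
        }
    }
}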
RecordingRequest States
[State diagram (legend distinguishes parent states from leaf states): parent states – Unresolved, Partially Resolved, Completely Resolved, Cancelled; leaf states – Pending Without Conflict, Pending With Conflict, In Progress, In Progress With Error, In Progress Insufficient Space, Incomplete, Complete, Failed, Cancelled; Record() moves a pending request into the in-progress path.]
RecordingImpl States
• Each RecordingImpl has an external state and
an internal state (IState)
[Diagram: the internal RecordingImpl.IState implementations include IStateInit, IStatePending, IStateWaitTuneSuccess, IStateStarted, IStateSuspended (base), IStateSuspendedTuneFailed, IStateSuspendedTunerUnavailable, IStateSuspendedTunerNotReady, IStateSuspendedBufferingDisabled, IStateSuspendedMSVUnavailable, IStateSuspendedCaDenied, IStateSuspendedServiceUnavailable, IStateSuspendedServiceRemap, IStateSuspendedTSWBufShutdown, IStateSuspendedCopyProtected, IStateSuspendedInsufficientSpace and IStateEnded.]
Recording Process 1
OCAP ODL Stack Scheduled Recording Scenario 1
[Sequence diagram: OcapRecordingManager.record(OcapRecordingRequest orr1) is called for service s1 with startTime 7:00 and duration 60m. RecordingManager initiates a TimeShiftWindow attach, causing a tune and time-shift buffering into TSB1 (mpe_dvrTsbBufferingStart()). At the start time RecordingManager initiates time-shift-to-recording conversion with mpe_dvrTsbConvertStart(); already time-shifted s1 content in TSB1 is earmarked/copied into RecordedService 1. Soon after 8:00 the conversion is stopped (mpe_dvrTsbConvertStop()), TimeShiftManager stops buffering into TSB1 (mpe_dvrTsbBufferingStop()) and releases the associated tuner.]
More RM Detailed Diagrams
More detailed recording
scenarios
RI PC Platform
March 15-16, 2012
Platform Components
• Support Libraries
» 3rd party libraries supplying foundation for providing
TV/Media functionality in the Platform.
» Meet the RI dual licensing requirements.
• Implementation
» Leverages features provided by the support libraries.
» Adds extensions and customizations.
» Glues everything together into a coherent implementation.
• API
» Hides the underlying technology-specific terminology.
» Uses STB concepts to simplify command and control of
the Platform features.
Support Libraries Overview
[Diagram: external library code used by the platform –
• gstreamer-0.10.22, gst-plugins-base-0.10.22, gst-plugins-good-0.10.10, liboil-0.3.15
• ffmpeg-0.5
• glib-2.18.4, gettext-0.17, libiconv-1.12
• libxml2-2.6.32, zlib-1.2.3
• pthreads-2-8-0
• liblog4c-1.2.1, Net-SNMP-5.6.1
• wxWidgets-2.6.4, gtk+-2.10.14
On Win32 most of these are compiled from source; on Linux several are pre-installed or part of libc (e.g. pthreads, libiconv), and gtk+ is unused on Win32 but compiled from source on Linux.]
Support Libraries Overview
• iconv & gettext provide internationalization support and are required
by glib
• xml2 library requires zlib and is used by both gstreamer and clinkc
• pthreads implementation wraps Win32 thread support in POSIX
APIs and is required by clinkc
• gstreamer requires glib
• gst-plugins-base needs gstreamer and oil – an optimized low-level routine library (e.g. fast memcpy using MMX or SSE extensions)
• gst-plugins-good requires gst-plugins-base
• ffmpeg supplies A/V decoding support; at present, only MPEG-1/2
video decoding capability is used
• wxWidgets library provides multi-platform GUI framework and
requires gtk+ library on Linux
• log4c is a native implementation of the Apache logger
Implementation Overview
[Diagram: CableLabs-developed code (the RI Platform Interface, CableLabs GStreamer plugins, Launcher, RI Platform Implementation, Logging, SNMP Master Agent, Tuner Control, UI and Config modules) sits on top of the same external library stack listed on the previous slide (gstreamer, gst-plugins-base/-good, ffmpeg, glib, log4c, Net-SNMP, pthreads, libxml2, zlib, gettext, libiconv, liboil, wxWidgets, gtk+).]
Recent updates
• Trying to stay up-to-date on 3rd party library revisions as much as possible.
• Glib
» Updated from 2.22.4 to 2.28.7
» Should be available in the 1.2.2 Rel-B and future releases.
• GStreamer
» Currently in progress.
» gstreamer: 0.10.22 to 0.10.35
» gst-plugins-base: 0.10.23 to 0.10.35
» gst-plugins-good: 0.10.10 to 0.10.30
Supporting Infrastructure
• Logging – uses log4c, available to all platform modules.
• Config – human-readable name-value keystore:
» # This is a comment
» RI.Launch.App.0 = ocap_stack
» RI.Launch.App.0.cwd = $(OCAPROOT)/bin/$(OCAPTC)/env
• Launcher - .EXE that:
» Starts the platform followed by all platform client DLLs.
» Enters the main loop and waits for a reset/shutdown signal.
GStreamer Overview
[Diagram: the in-band pipelines tee the transport stream from a UDP source (network) or file source (disk read from the HDD) into a section-filtering branch (section assembler, section filter, section sink calling up to CableLabs code), a TSB branch (PID filter, file sink/file source for time-shift write and read), and a decode branch (PID filter, ES assembler, MPEG decoder backed by the FFMPEG library, colorspace converter and OpenGL rendering to the display). The out-of-band pipeline feeds FDC data from an app source through a queue to a section filter and section sink; the VPOP pipeline feeds an app source toward the UI.]
GStreamer Summary
• Framework and building blocks for the MPEG-2
transport stream processing.
• Supplies in-band and OOB section filtering.
• Supplies MPEG-1 & 2 video decoding capability.
• Provides Time-Shift Buffer recording/playback.
• Supplies video and graphics plane display
capability (in conjunction with UI module).
• Supplies VPOP output stream packet source
User Interface Overview
[Diagram: a multi-plane image source feeds the GStreamer display through scaling/compositing built on common OpenGL infrastructure; rendering, windowing and eventing are provided either natively (Win32 or Linux/X11) or through wxWidgets builds (the wxWidgets Linux build uses GTK+).]
User Interface Summary
• Emulates the final TV display output – provides a
physical view of the rendered frame buffer.
• Relies on OpenGL technology for graphics-intensive operations (scaling, compositing, rendering).
• Abstracts out window-specific operations into an
API to permit the usage of any UI/Widget
Toolkit/Framework.
• Currently supports wxWidgets framework as well
as native OS targets (Win32/Linux).
Platform API
• Object-based.
• Attempts to be functionally equivalent with an STB driver
API.
• Supports multiple instantiation of each object/API.
• Possible to supply multiple implementations of each API
(like Java interfaces).
• Removes dependencies on any technology-specific
terminology (abstraction layer).
• Some objects support multiple clients (e.g. display
module for iDCR).
Home Networking
Subsystem
March 15-16, 2012
HN Topics
• Overview
• Public Class Hierarchy
• Streaming
• Mapping to Platform
• GStreamer pipelines for serving and playing back
» Porting
• DTCP-IP
• Current Status
» Limitations
HN Overview
The Home Networking Extension provides support
for an ever increasing set of use cases
surrounding the discovery and delivery of
content over the customer’s home network.
• Use cases have evolved from an adjunct option for
content delivery to a primary one.
» Multi-room DVR (1.1.3 – 1.1.5)
» 3rd Party DLNA Device compatibility (1.2 – 1.2.2)
» CVP-2 and Cloud based scenarios (1.2.3 – later)
HN Overview
• OCAP 1.1.3 : Multi-room DVR
» Initial Release of HN Extension
» Focus on CTP test conformance
» Limited streaming capability
• OCAP 1.1.4 – 1.1.5 : Multi-room DVR cont.
» Redesigned Content Directory Service
» Continued CTP test conformance
» Improved streaming capability
HN Overview
• OCAP 1.2 - 1.2.2 : 3rd Party Device Compatibility
» Lower level UPnP Diagnostics API
» Improved UPnP/DLNA compatibility
» Live channels, Hidden items, Video Primary Output
Port, Alternate URI
• OCAP 1.2.3 : CVP-2 Support
» RUI Discovery and Authentication
HN Overview
• Java abstraction of:
» Network device discovery
» Content publishing
– Recordings, Live streaming, personal content
» Content discovery
» Content playback/display
» Remotely scheduling recordings
• Based upon, but abstracted from UPnP
• 2 specs
» HNEXT I08 Spec (Java APIs)
» HNP2.0 I07 Spec (Mapping to UPnP)
Baseline HN Technology
• UPnP
» Device Architecture 1.0
» Media Server 2.0
» Content Directory Service 3.0
» Connection Manager Service 2.0
» Scheduled Recording Service 1.0
• DLNA
» DLNA Guidelines, August 2009
HN Stack Components
HN Stack Architecture
[Diagram: in the Java stack, org.ocap.dvr and org.ocap.hn sit on top of the UPnP media server and the UPnP Diagnostics API, which are built on CyberLink for Java; below the Java stack, mpe and mpeos call into the Platform's HN GStreamer pipeline.]
Design Choices
• UPnP Diagnostics API implemented using a 3rd
party UPnP stack
• Almost all code at Java level
• Content served/consumed at platform layer
» Rather than pass data up/down from Java
» Sockets passed between layers
UPnP Diagnostics Classes/Interfaces
• Singleton UPnPControlPoint
» Discovers UPnP devices on the network
» Register low level message handlers
• UPnPClientDevice
» Maps to each UPnP device on a specific network
• UPnPClientService
» Maps to each UPnP service on a device on a specific
network
» Invoke UPnP actions on a network service
• UPnPActionInvocation
Basic Public Classes/Interfaces
• Singleton NetManager
» Get list of discovered Devices on the home network
» Get aggregated list of NetModules
– NetModule is java abstraction of UPnP Service
• Device
» Straightforward mapping of UPnP device
• NetModules
» Loosely based on UPnP Services
» ContentServerNetModule
» RecordingNetModule
• Event driven listeners for discovery and activity
» DeviceEventListener / NetModuleEventListener / ContentServerListener
Basic Client Operation
• NetManager.addNetModuleEventListener(…)
• NetManager.getNetModuleList(…)
• ContentServerNetModule.requestRootContainer
• ContentServerNetModule.requestBrowseEntries
• ContentServerNetModule.requestSearchEntries
» Get back a ContentList of Items/Containers
» Render Item(s) (see the sketch below)
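A hedged sketch of that client flow (org.ocap.hn types; the null NetModule filter and the NetList iteration details are assumptions):

import org.ocap.hn.ContentServerNetModule;
import org.ocap.hn.NetActionEvent;
import org.ocap.hn.NetActionHandler;
import org.ocap.hn.NetList;
import org.ocap.hn.NetManager;
import org.ocap.hn.NetModule;

public class BrowseSketch implements NetActionHandler
{
    public void browse()
    {
        NetList modules = NetManager.getInstance().getNetModuleList(null);  // all modules (assumed)
        for (int i = 0; i < modules.size(); i++)
        {
            NetModule m = (NetModule) modules.getElement(i);
            if (m instanceof ContentServerNetModule)
            {
                ((ContentServerNetModule) m).requestRootContainer(this);    // asynchronous CDS browse
                break;
            }
        }
    }

    public void notify(NetActionEvent event)
    {
        Object response = event.getResponse();   // ContentContainer on success
        // continue with requestBrowseEntries/requestSearchEntries and render the items
    }
}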
Content Public Classes/Interfaces
Content package
• Natural 1:1 mapping to UPnP ContentDirectory
» ContentContainer = <container>
» ContentItem, ChannelContentItem = <item>
» ContentResource = <res>
» MetadataNode ≈ <property attribute=blah>
» requestBrowseEntries = Browse action
» requestSearchEntries = Search action
Recording Content Interfaces
Client Recording Content
• Not so simple UPnP mapping
• RecordingContentItem ≈ <item> in CDS
• NetRecordingEntry ≈ <item> in CDS
• But also:
» RecordingContentItem ≈ recordTask in SRS
» NetRecordingEntry ≈ recordSchedule in SRS
• Refer to HNP2.0 Appendix I
» Singleton NetManager gives access to RecordingNetModules
» RecordingNetModule represents a DVR device
» requestSchedule ≈ CreateRecordSchedule SRS
» Also requestReschedule, etc.
Recording Scheduling Interfaces
[Diagram legend: some interfaces are implemented by stack classes, others by application classes.]
Server Recording Content
• Server stack per se does not initiate recording
• Application (EPG) gets RecordingNetModule and registers
request handler
» RecordingNetModule is a NetRecordingRequestManager
» UPnP SRS actions result in handler invocation
» Handler responsible for calling DVR APIs and publishing
resulting recordings
» OcapRecordingRequests are RecordingContentItems
• Note: RI HN extension currently requires DVR extension
» In future, no DVR APIs will be referenced in HN
– Allowing HN-Only client or server
HN Streaming
• Once content has been published on the network by the DMS, and
clients (DMCs) have discovered ContentItems, streaming can be
initiated.
• Remote client (DMC) initiates HTTP-based streaming from the DMS
using either the “res” URI from the ContentDirectoryService (CDS)
or the “alternateURI” registered by the guide application on the
ContentItem
• RI uses CyberGarage to handle connection initiation to the HTTP port
• Streaming initiation requires authorization by the
NetAuthorizationHandler, if registered.
• For streaming requests, the NAH is passed the requested URI
and source IP
• The NAH2 is additionally passed the entire request line and
request headers
HN Streaming
• Incoming streaming requests are delegated to one of 5 interceptors:
» IconInterceptor: Provides UPnPManagedDeviceIcon support
(URI path /ocaphn/icon)
» RecordedServiceInterceptor: Provides RecordedService
streaming support (URI path /ocaphn/recording)
» ChannelRequestInterceptor: Provides ChannelContentItem
streaming support (URI path /ocaphn/service)
» VPOPInterceptor: Provides VPOP streaming support (URI path
/ocaphn/vpop)
» PersonalContentInterceptor: Intended to provide content from local files (currently unsupported)
HN Streaming: Recorded Service
• Multiple HTTP range requests for a
RecordingContentItem
[Sequence diagram: a socket connection is established to the HTTP port (via CyberGarage); the HTTP GET and headers are parsed, the URL is matched to a ContentItem, and processing is delegated to the RecordedServiceInterceptor, which calls NetSecurityManager to authorize the playback activity. The socket is handed off to the platform for streaming via mpe_hnStreamOpen() (an HNSPTSStreamSession / mpe_HnStreamSession with mpe_HnStreamParamsMediaServerHttp and an mpe_HnStreamContentLocation of LOCAL_MSV_CONTENT), and RecordingStream initiates range playback of RecordedService RS1 with mpe_hnPlaybackStart() (an HNPlaybackSession / mpe_HnPlaybackSession using mpe_HnPlaybackParamsMediaServerHttp). Each range request ends with MPE_HN_EVT_END_OF_CONTENT; the next authorized GET is parsed and delegated the same way, opening a new stream and playback session, and sessions are torn down with mpe_hnStreamClose().]
HN Streaming: Recorded Service
• SegmentedRecordedService playback (2 segments,
1x playback)
[Sequence diagram: as in the previous example, the socket connection arrives via CyberGarage, the GET is parsed and delegated to the RecordedServiceInterceptor, which calls NetSecurityManager to authorize the playback activity, and a single HNSPTSStreamSession is opened with mpe_hnStreamOpen() (mpe_HnStreamContentLocation of LOCAL_MSV_CONTENT). RecordingStream initiates 1x playback of the first RecordedService in the SegmentedRecordedService with mpe_hnPlaybackStart(RS1); when MPE_HN_EVT_END_OF_CONTENT is received, a second HNPlaybackSession is started with mpe_hnPlaybackStart(RS2) for the second (last) RecordedService, and the stream is closed with mpe_hnStreamClose() after its end-of-content event.]
HN Streaming: Live Streaming
• Streaming requests for ChannelContentItems also are subject to
additional processing steps:
• The “Channel Locator” is subject to resolution via the
ServiceResolutionHandler to establish the “Tuning locator”
• NetworkInterface has to be tuned to the program specified in the
Channel Locator (if the program is not already tuned)
• In attempting to acquire a tuner for streaming, the
ResourceContentionHandler may be invoked – with a
NetResourceUsage representing the network-initiated tuner
acquisition
• When the DVR extension is present, time-shift buffering also
needs to be started (if the program is not already buffering)
HN Streaming: Live Streaming
• ChannelContentItem playback
[Sequence diagram: the socket connection is established to the HTTP port (via CyberGarage); the GET and headers are parsed, the URL is matched to the ChannelContentItem, and processing is delegated to the ChannelRequestInterceptor, which calls NetSecurityManager to authorize the playback activity. TSBChannelStream acquires a TimeShiftWindow via TimeShiftManager, which tunes (mpe_mediaTune(freq1, mod1), PAT/PMT section filtering, MPE_TUNE_COMPLETE) and starts time-shift buffering of s1 into TSB1 (mpe_dvrTsbBufferingStart()). An HNSPTSStreamSession is opened with mpe_hnStreamOpen(), and TSBChannelStream initiates TSB1 playback with mpe_hnPlaybackStart() using mpe_HnPlaybackParamsMediaServerHttp with an mpe_HnStreamContentLocation of MPE_HN_CONTENT_LOCATION_LOCAL_TSB. When the remote endpoint terminates playback and closes the socket connection, the playback session is stopped (MPE_HN_EVT_PLAYBACK_STOPPED), the stream is closed with mpe_hnStreamClose(), and buffering stops after a timeout (mpe_dvrTsbBufferingStop()).]
HN Streaming: VPOP
• Virtual Primary Output Port: Allows
for streaming of whatever content is
playing back on the DMS
[Sequence diagram: an HTTP request with the VPOP URI is received and handled by the VPOPInterceptor; the NetAuthorizationHandler and StreamingActivityListener are invoked. HNServerSessionManager initializes the server-side connection using mpe_HnStreamParamsMediaServerHttp, passing the connection to the platform via mpe_hnStreamOpen(), and initiates VPOP streaming by passing mpe_HnPlaybackParamsMediaServerHttp with an mpe_HnStreamContentLocation of LOCAL_DISPLAY and the mpe_DispDevice handle of the primary display to mpe_hnPlaybackStart(..., LOCAL_DISPLAY, disp1). The MPEOS port/platform renders the A/V content for disp1 to the socket provided in the mpe_HnPlaybackSession. The LOCAL_DISPLAY streaming/playback sessions persist across broadcast decode sessions, TSB playbacks and recording playbacks (channel changes and any trick modes performed during recording playback are rendered as live to the socket; preferably, PIDs are remapped to consistent values); the VPOP connection appears paused/stalled when no content is being displayed on the DMS, and the sessions terminate when the DMC disconnects (MPE_HN_EVT_SESSION_CLOSED).]
HN Streaming: RemoteService playback
• OCAP RI DMC can playback a MPEG single-program
transport stream (modeled on the DMC as a
RemoteService)
[Sequence diagram: DVBServiceContext.select() is called for RemoteService s1 (which could represent a remote RecordingContentItem or ChannelContentItem). The stack opens an HNSPTSStreamSession with mpe_hnStreamOpen() and sets a section filter with source MPE_FILTER_SOURCE_HN_STREAM so that the MPE SI Manager can acquire the PAT/PMT from the stream (MPE_SF_EVENT_SECTION_FOUND); once the RemoteServiceDetails.getComponents() request is satisfied, mpe_hnPlaybackStart() starts the HNPlaybackSession and the JMF RemoteService Player is started, with playback initiated using the default or application-selected components/PIDs. Stopping the JMF Player stops playback with mpe_hnPlaybackStop().]
HN Platform Functionality
• Two levels of RI HN Platform Code
• MPEOS C code
• GStreamer pipeline C code
• MPEOS Level – Player
• Performs HTTP request & response handling
• Receives content via socket
• Transfers to HN player pipeline for decoding
• MPEOS Level – Server
• Sends content retrieved from platform via socket
• NOTE: HTTP handling is performed in RI Stack
• GStreamer HN Pipelines
• Server pipeline supplies content
• Player pipeline decodes content
MPEOS HN Changes
• Defines the porting interface for HN - mpeos_hn.h
• Methods & data structures used in platform for both
Server & Player roles
• Summary of recent changes
• Non-platform specific Server side processing of HTTP
requests moved to RI Stack level
• Clearer separation of methods b/w player & server
• Enhanced “javadoc” type documentation updates
• Added DTCP support
• Need to be sensitive to changes
• Impacts ALL platforms not just RI PC Platform
• All changes are posted to forum
DTCP/IP
• RI Stack support complete with 1.2.1.
» DTCP_ profile ID signalling.
» application/x-dtcp1 mime types.
» Support for Range.dtcp.com in 1.2.2-B.
• RI Platform integration complete with 1.2.1.
» Integrated Intel SIK v4.02.
» Library presence dynamically detected and loaded at
RI start-up.
» Fallback to No-OP version.
» MPE configuration options:
– MPEOS.HN.DTCTPIP.DLL=/f/encrypted/dtcpip.dll
– MPEOS.HN.DTCPIP.STORAGE=/f/encrypted/keys/
DTCP/IP (cont’d)
DTCP/IP Testing
• Successful sink/source interoperability.
• Verified support/interoperability with both facsimile
and production keys.
• Successful streaming to a production DMP device.
• Currently working through issues with DLNA LPTT.
Limitations
• Network interface mapping
» MoCA support
» Supporting multiple interfaces
– Bridging/loop mitigation
• Playback formats
» No Audio playback support
» Only standard MPEG transport stream
– DLNA “ISO” ts formats, not tts (zero or otherwise)
– No MPEG PS support (adding in July)
Graphics
March 15-16, 2012
Setting up Graphics Resolution
• Platform window size from display.window.width / display.window.height (platform.cfg)
• TV screen size from DISP.DEFAULT.CONFIG (mpeenv.ini)
• Video scaled according to incoming video size and applicable DFC rule
Coherent Configurations
DISP.DEFAULT.CONFIG | Configuration
0 | 640x480 1:1 graphics, 720x480 8:9 video, 640x480 1:1 background
1 | 640x480 1:1 graphics, 720x480 8:9 video, 720x480 8:9 background
2 | 960x540 3:4 graphics, 720x480 8:9 video, 640x480 1:1 background
3 | 960x540 3:4 graphics, 720x480 8:9 video, 720x480 8:9 background
4 | 640x480 4:3 graphics, 1920x1080 1:1 video, 1920x1080 1:1 background (with I-frame support)
5 | 960x540 1:1 graphics, 1920x1080 1:1 video, 1920x1080 1:1 background (with I-frame support)
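A hedged sketch of inspecting and changing the graphics configuration from an application using standard HAVi calls (the 960x540 preference is illustrative; changing the configuration normally also requires reserving the device):

import java.awt.Dimension;
import org.havi.ui.HGraphicsConfigTemplate;
import org.havi.ui.HGraphicsConfiguration;
import org.havi.ui.HGraphicsDevice;
import org.havi.ui.HScreen;

public class GraphicsConfigSketch
{
    public void inspectAndChange()
    {
        HGraphicsDevice dev = HScreen.getDefaultHScreen().getDefaultHGraphicsDevice();
        Dimension current = dev.getCurrentConfiguration().getPixelResolution();  // e.g. 640x480

        HGraphicsConfigTemplate tmpl = new HGraphicsConfigTemplate();
        tmpl.setPreference(HGraphicsConfigTemplate.PIXEL_RESOLUTION,
                new Dimension(960, 540), HGraphicsConfigTemplate.PREFERRED);
        HGraphicsConfiguration best = dev.getBestConfiguration(tmpl);
        if (best != null)
        {
            try
            {
                dev.setGraphicsConfiguration(best);   // may throw if the device is not reserved
            }
            catch (Exception e)
            {
                // configuration change refused
            }
        }
    }
}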
Stack Graphics Overview
MPE Java Classes (partial list)
• MPEGraphics
• MPEGraphicsConfiguration
• MPEGraphicsDevice
• MPEToolkit
• MPESurface
• MPEImage
MPEOS Graphics Object: mpeos_GfxScreen

typedef struct mpeos_GfxScreen
{
    int32_t              x;
    int32_t              y;
    int32_t              width;
    int32_t              height;
    int32_t              widthbytes;
    mpe_GfxColorFormat   colorFormat;
    mpe_GfxBitDepth      bitdepth;
    mpeos_GfxSurface     *surf;
    os_GfxScreen         osScr;
} mpeos_GfxScreen;
MPEOS Graphics Object: mpeos_GfxSurface

typedef struct mpeos_GfxSurface
{
    os_Mutex             mutex;        /**< surface is thread safe */
    int32_t              width;        /**< width of surface in pixels */
    int32_t              height;       /**< height of surface in pixels */
    int32_t              bpl;          /**< bytes per line */
    mpe_GfxBitDepth      bpp;          /**< bit depth (bits per pixel) */
    mpe_GfxColorFormat   colorFormat;  /**< color format */
    void*                pixel_data;   /**< pixel data */
    mpe_Bool             primary;      /**< true if on-screen surface */
    mpe_GfxPalette       clut;         /**< color palette used (if colorFormat == MPE_GFX_CLUT8) */
    os_GfxSurface        os_data;      /**< os-specific surface info */
} mpeos_GfxSurface;
MPEOS Graphics Object:
mpeos_GfxContext
typedef struct mpeos_GfxContext
{
    mpe_GfxColor         color;
    mpe_GfxFont          font;
    mpe_GfxPoint         orig;
    mpe_GfxRectangle     cliprect;
    mpe_GfxPaintMode     paintmode;
    uint32_t             modedata;
    mpeos_GfxSurface     *surf;
    os_GfxContext        os_ctx;
} mpeos_GfxContext;
Third-party packages
• DirectFB -- Used for alpha blending onto
graphics buffer
• FreeType – Font support
Relationship with Platform
• Graphics Buffer
» Allocated by Platform
» Pointer passed to Stack via get_graphics_buffer
(display.c)
» Drawn upon request via draw_graphics_buffer
(display.c)
• Graphics resolution
» Changed by a call from Stack to Platform:
update_configuration (display.c)
Use Case: FillRect() in Paint()
1) MPEToolkit creates MPEGraphics and wraps it in
DVBGraphicsImpl2 object, which is passed into
paint.
2) FillRect called on DVBGraphicsImpl2 passes to
inner MPEGraphics object.
3) Call passes to gfxRectangle in mpeos_draw.c, with
native graphics context passed in.
4) gfxRectangle calls FillRectangle on
IDirectFBSurface in native graphics context
5) DirectFB paints to graphics buffer, then calls
Platform to redraw
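For orientation, a minimal app-side sketch of steps 1 and 2 (the component class is illustrative; the Graphics passed to paint() is the DVBGraphicsImpl2 wrapper described above):

import java.awt.Color;
import java.awt.Graphics;
import org.havi.ui.HComponent;

public class FillRectExample extends HComponent
{
    public void paint(Graphics g)
    {
        g.setColor(Color.blue);
        // Delegates to the inner MPEGraphics object, then to native
        // gfxRectangle() in mpeos_draw.c and DirectFB FillRectangle().
        g.fillRect(0, 0, getWidth(), getHeight());
    }
}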
RI Graphics UML - 1
UML Diagrams
• https://community.cablelabs.com/svn/OCAPRI/trunk/ri/RI_Stack/docs/design/Graphics/OCAP_graphics_PhoneME.vsd
• Free Microsoft Visio 2003 Viewer
• enableTv Contribution (Russell Greenlee)
• 9 Structure Diagrams
» RI Launcher & Display
» MPE/MPEOS
» DirectFB
» Java
» HAVi
» AWT
• 23 Sequence Diagrams
» RI Emulator Startup
» MPE/MPEOS Initialization
» DirectFB Initialization
» OCAP Main/AWT Toolkit Initialization
» MPE Graphics Initialization
» HAVi Initialization
» AWT Rendering
RI Graphics UML - 2
Graphics vs VideoOutputPorts
• VideoOutputPorts: physical “spigots” on back of
OCAP box. Each HScreen has a main
VideoOutputPort.
• VideoOutputPorts AND CoherentConfigs control
video plane resolution
CoherentConfig vs VideoOutputPort
• Coherent Config and Video Output Config
BOTH control video resolution.
» On RI startup, the persisted Video Output Port Config is read and supersedes the CoherentConfig
• When an OCAP app changes the coherent config, the Video Output Config is changed
• When an OCAP app changes the Video Output Config, the CoherentConfig is changed
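For reference, the physical ports are visible to applications through org.ocap.hardware.Host; a minimal sketch (the printed fields are only for illustration):

import java.util.Enumeration;
import org.ocap.hardware.Host;
import org.ocap.hardware.VideoOutputPort;

public class ListOutputPorts
{
    public static void listPorts()
    {
        // Enumerate the physical "spigots" on the back of the box.
        Enumeration ports = Host.getInstance().getVideoOutputPorts();
        while (ports.hasMoreElements())
        {
            VideoOutputPort port = (VideoOutputPort) ports.nextElement();
            System.out.println("type=" + port.getType() + " enabled=" + port.status());
        }
    }
}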
Testing
March 15-16, 2012
Different Tests Play Different Roles
• Smoke Tests
• Standards Compliance
• Integration Demonstrations
• See: https://community.cablelabs.com/wiki/display/OCORI/Testing+the+RI
Smoke Tests - Overview
• Philosophy of daily smoke tests
• Current smoke tests
• For more information
• Recommendations for smoke tests
Smoke Tests - Philosophy
• Discover unexpected consequences ASAP
» Documented procedures for how and when to run the
tests
» Published reports from running guide tests
» Routine, periodic execution
» Executed by all engineers, verified by the build.
• Maximum coverage for short period of time
» Focus on troublesome areas of the code
» Manual execution < 30 minutes effort for MSO guides
» Manual execution < a few minutes effort for CTP tests
Smoke Tests – Current Procedures
• Manual tests, not fully automated
• For every code change
» CTP test suite
» TuneTest
» Other tests, as appropriate
• Daily, at least once
» Aspen Guide, dual tuner configuration
» ODN, will begin soon
Smoke Tests - Guides
Smoke Tests – For more information
• Wiki page:
» https://devzone.cablelabs.com/web/oc/9//wiki/OCAP%20RI/Smoke+Testing
Smoke Tests - Recommendations
• Suggestions for additional tests?
• Recommendations for alternative procedures?
Smoke Tests - Summary
• Smoke tests are manual, quickly-executed,
formal tests to gauge the health of a build
• All engineers on the RI project are running
smoke tests
• Please contribute your thoughts about smoke tests on the forum.
CTP Testing against RI
• The full suite of unattended CTP tests (approx. 5500) is run against the tip of the RI trunk every week.
• CTP tests are run using a "real ATE" – Automated Test Environment:
» Dedicated machine that executes ATE software, generates streams in real time and sends them via RF channel to the RI running on Windows and Linux.
• Priority given to testing the full configuration of the RI.
• The other three configurations are tested monthly.
CTP Testing against RI
• Failure Results:
» RI QA does initial analysis.
» Bugs are filed in the RI JIRA db at CableLabs, which is tied to the CableLabs CTP bug db.
» If the failure is determined to be due to a test bug,
an issue is entered into the CL CTP bugs
database and the 2 issues are linked.
• Weekly results and corresponding Jira issues
are captured in a spreadsheet for easy
comparison of RI status from week to week.
Attended CTP Test Automation
• Existing Scenario
• Problems faced
• Tool used for Analysis/Report Generation.
Attended CTP - Existing Scenarios
• 2 groups of attended tests:
» Interactive Tests
» Visual Inspection Tests
• Automation is done only for the second group.
• This is to increase the efficiency of the tester.
» Reduces the wait time spent running tests.
• Minimal changes were made to the ATE emulator
» Script changes to avoid the pop-up of a question.
» Take a screenshot of the RI screen.
Tools Used for Test Run/ Test Analysis
• Screenshot tool: (Test Run)
» Tool written in Java to enable screenshot of RI
screen.
• Spreadsheet creation tool:
» After the test run, collects all the images and questions and organizes them into a spreadsheet.
• Test Result Update tool (TestResultMux):
» Views the images and questions through an interface and creates a test result.
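This is not the actual tool, but a screenshot capture along these lines can be done in a few lines of standard Java (assuming the RI window is on the local desktop):

import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.io.File;
import javax.imageio.ImageIO;

public class ScreenshotSketch
{
    public static void main(String[] args) throws Exception
    {
        // Capture the full desktop (or the RI window bounds) and save as PNG.
        Rectangle area = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
        ImageIO.write(new Robot().createScreenCapture(area), "png",
                new File("ri-screen.png"));
    }
}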
Advantages of using the tool
• Usability:
» More user-friendly, so a new user takes less time to learn the tool.
• Efficiency:
» Reduces the time taken for test result analysis and creating a test report.
• Robustness:
» Avoids human errors while executing/updating test analysis reports by showing all the mandatory parameters on the screen.
Integration Tests
• $OCAPROOT/apps/qa/org/cablelabs/xlets
• Each test application should have a readme.txt
in its directory.
• Ongoing clean-up and rework
» Latest status is found in
$OCAPROOT/apps/qa/docs/IntegrationTestInfo.doc
• New test applications will be added going
forward for integration testing of new features.
Integration Test Automation AutoXlet
• Automated xlet examples
($OCAPROOT/apps/qa/org/cablelabs/xlet/):
» TuneTest
» PermissionTest
» FileAccessPermissionTest
» PropertiesTest
» DvrTest
Home Networking Test Automation
• Home Networking tests are challenging
» Video comparison
» Network issues
• Rx Project
• RiScriptlet
• RiExerciser
• Pragmatic automation of HN Integration tests
• Overnight smoke tests
MSO Guide Testing - Approach
• We want to run the same tests against the RI
that are run against any STB
» RI/QA is always looking for more tests
• “Canary” boxes
• All non-blocked suites are run before a release,
or as needed.
• TestLink manages the test procedures,
execution and results.
MSO Guide Testing - Suites
• Tuning, live streaming, VPOP
• DVR and Trick Mode viewing
• Locks and PINs
• Various ways to find content – Guides and HN content searches
• All buttons on the simulator remote control
• All “feature menus”
MSO Guide Testing – Tests NOT Run
• The environment at CableLabs determines the tests to be run. Some tests are not run due to:
» Specialized headend support at CableLabs
(Upsell, CallerID, etc.)
» Headend triggered features
» VOD and CA support
» Platform features – audio, analog channels
• What tests do you think should be run, and how
frequently?
TestLink – Manage MSO Guide Tests
• Web Based Test Management
• http://www.teamst.org/
• GNU GPL, version 2
• Architecture – Apache web server, MySQL, PHP
• http://www.apachefriends.org/en/xampp.html
TestLink – Test Cases
TestLink - Reports
MSO Guide Testing - TestLink
• We want to run the same tests against the RI
that are run against any STB
» RI/QA is always looking for more tests
» RI/QA is always looking for more canary boxes
• Not all tests can be run here – however, if YOU
can run them at your site…..
• TestLink is our Web Based Test Management
tool
Unit Tests – Design and Expectation
• Historical JUnit Tests
» $OCAPROOT/java/test
» Out of date when received by RI Project
» Contain useful coding examples
• Design of new tests for fixes and contributions
» Fail against old code, pass on new code (see the sketch below)
• Expectation of new tests
» Coverage for all lines of code for contributions
• Your experience with JUnit tests?
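A hypothetical JUnit 3-style sketch of the "fail against old code, pass on new code" expectation (class and helper names are invented for illustration):

import junit.framework.TestCase;

public class ExampleFixTest extends TestCase
{
    public void testNullServiceNameIsRejected()
    {
        try
        {
            validateServiceName(null);   // hypothetical code under test
            fail("null service name should have been rejected");
        }
        catch (IllegalArgumentException expected)
        {
            // passes only once the fix makes the code throw here
        }
    }

    // Stand-in for the fixed code, included only so the sketch is self-contained.
    private void validateServiceName(String name)
    {
        if (name == null) throw new IllegalArgumentException("name");
    }
}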
Rx - Ri eXerciser
March 15-16, 2012
Goals and Benefits of RX
Rx Project Overview: The RxProject is an effort to take the current DvrExerciser OCAP Xlet and refactor it into a framework which provides the following:
• A library of functionality, called OcapAppDriver, that aggregates OCAP API calls into functional building blocks provided by the RI Stack & PC Platform.
• A scripting interface, the RiScriptlet Xlet, that allows automation of testing of these functional building blocks.
• A "Guide"-type OCAP Xlet, referred to as the RiExerciser, that supports functional/application-level testing and development through a GUI.
Another testing framework… Really?
• This framework allows for automation without much
additional effort.
• RxProject will be part of the open source, so outside contributors will be able to write BeanShell scripts to illustrate issues and also demonstrate fixes
• There are currently multiple Xlets in the QA directory, with much redundant code. The Rx framework allows Xlets to utilize common utilities already found in OcapAppDriver or add to the common functionality
• For HN, we need a framework where tests can be run to
verify robustness in a network environment with many
devices. HN CTP tests assume an isolated network.
OcapAppDriver
• A library of commonly used functions (e.g. tune, record,
etc) that can be used by xlets or scripts. The idea of this
library is to be easier to use than the raw OCAP calls so
that a script-writer in particular does not need detailed
OCAP API knowledge to write a script.
• OcapAppDriver lib calls need to clearly assert whether
they are synchronous or asynchronous. In general,
synchronous calls will be favored over asynchronous.
• OcapAppDriver lib calls will pass/return only simple Java
primitives (String, int, etc) in keeping with the philosophy
that the API should require minimal OCAP API knowledge
to use.
Asynchronous vs Synchronous
example
• As previously mentioned, OCAP API calls will in general
be synchronous. However, there are asynchronous OCAP
API calls such as tuning and recording
• For any asynchronous API calls, a synchronous
counterpart will exist. For example, waitForTuningState() is
a synchronous counterpart to the asynchronous
serviceSelectByIndex() method.
• In general, synchronous counterparts should include
“waitFor” in the method name
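A hedged script fragment combining the two calls named above; only serviceSelectByIndex() and waitForTuningState() come from this slide, while the parameters, return type, and the rxDrvr/rxLog bindings follow the example script later in this section:

// Start an asynchronous tune, then block on its synchronous counterpart.
rxLog.info("Tuning to first service...");
rxDrvr.serviceSelectByIndex(0);                    // index chosen for illustration
if (!rxDrvr.waitForTuningState(30 /* assumed timeout, seconds */))
{
    rxReturn = false;
    rxReturnString = "tune did not complete";
}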
Rx Architecture
RiScriptlet
• RiScriptlet is an xlet that executes scripts
• Scripts
» Written in "Java" and executed by BeanShell
interpreter
» Specified in hostapp.properties or via telnet
» Can have subroutines for code organization
• Results file written when script is complete
Example Script
// Delete All Recordings
rxLog.info("Deleting all recordings...");
rxReturn = true;
if (!rxDrvr.deleteAllRecordings())
{
    rxReturn = false;
    rxReturnString = "deleteAllRecordings failed";
}
rxLog.info("Done deleting all recordings...");
Synchronizing Scripts
• Can run multiple scripts in parallel
• Can sync scripts via sync points on one or more
RI instances
• One RiScriptlet acts as sync server to coordinate
sync points
• Syncing is TCP-based
Synchronization Architecture
[Diagram: Script #1 and Script #2 each connect over TCP (register, sync, unregister) to the RiScriptlet acting as Sync Master]
Sync Example Scripts
// Script #0
rxSyncClient.register ((byte)0 /* clientId */, (byte)2 /* expected num clients */, 3000 /* timeout */);
Thread.currentThread().sleep(5000);
rxSyncClient.sync ((byte)0 /* syncId */, 10000 /* timeout */);
rxSyncClient.unregister(3000 /* timeout */);
// Script #1
rxSyncClient.register ((byte)1 /* clientId */, (byte)2 /* expected num clients */, 3000 /* timeout */);
Thread.currentThread().sleep(1000);
rxSyncClient.sync ((byte)0 /* syncId */, 10000 /* timeout */);
rxSyncClient.unregister(3000 /* timeout */);
Home Networking Integration Testing
• Benefits
• Designing tests
• Building and running HN tests
• Handy tools
• Inventory of current tests
• Continuous integration scripts
• Where to find more information
Benefits - HN Integration Testing
• Testing of HN features without needing a
headend or guide
• RI can be server – third party devices
• RI can be a player for your STB port
• Advantages of PC tools for network debugging
• API code examples for new ECs
Design – HN Integration Tests
• Use Cases from specifications
• Narrative from the beginning of specifications
• Common sense application of a feature
• Only happy path tests
• Limited scope – not exhaustive combinations
• Mind Maps
Manual HN Integration Tests
Inventory – HN Integration Tests
• Streaming, recording and trick modes
• VPOP
• Hidden content
• Resource contention handling
• New test suites for new ECs
Tools – HN Integration tests
• RiExerciser
• Intel’s UPnP Device Spy
• Third party devices – PS3, Samsung SmartTv
HN – Continuous Integration
• Regression – all tests expected to pass
• Results summarized and archived
• Optimized for our HN test lab
• …/OCAPRI/trunk/ri/hnTestScripts/buildRx.xml
More information – HN Tests
• How we test the RI
• https://community.cablelabs.com/wiki/display/OCORI/Testing+the+RI
• Location of integration spreadsheets
• OCAPRI/trunk/ri/RI_Stack/apps/qa/hn/integration
Contributions – HN Integration Tests
• Designs for integration tests
• Additions to OcapAppDriver
• Contribute Scriptlets
Summary – HN Integration Tests
• Demonstrations of functionality – end-to-end
• Do not need:
» Guides
» Headends
• Learn about new APIs
• Designed from specs and common sense
• Manual test procedures in spreadsheets
• Automated with scriptlets
Issue Reporting and
Handling
March 15-16, 2012
Issues – information to report
• Basic journalism
• In what environment (RI Rev level, OS, etc.)?
• What did you do (sample code is best)?
• What happened (observations, logs, screen
shots)?
• Priority – what should it be?
• Lastly, why do you believe this is an issue?
• Include sample code to demonstrate the
behavior
http://java.net/jira/browse/OCAP_RI
Issues – Life Cycle
• Filed
• JIRA counterpart
• Comments, comments, comments
• Resolution
• Closed with the filer's assent
Issues – Status
• Unresolved
» NEW – Initial state
» STARTED – Assigned and work begun
» REOPENED – Once resolved, but more work needed
• Resolved
» FIXED – A fix is checked in
» INVALID – The problem described is not an issue
» WONTFIX – The issue will never be fixed
» DUPLICATE – Duplicates an existing issue
» WORKSFORME – Attempts to reproduce were futile
Issues – Guidelines for Priority
• Blocker – Most important
» Catastrophic consequences
» Potentially affects many applications
» No workaround by changes to the app, etc.
• Critical
» More limited scope or consequences
• Major
» Workaround available, limited scope
• Minor and Trivial – Least important
Issues - Process
• Every release has a published cutoff date for considering issues to be resolved for that release, three weeks prior to the release.
• Higher priority, well documented issues are
addressed sooner
• All correspondence should be through
comments in the issue
Issues - Summary
• All correspondence and information about issues
must be archived in the Issue Tracker system
• Report the required information for the most
efficient response
• Recommendations and observations?
Issues - http://xkcd.com/627
Stabilization/Robustness
Project
March 15-16, 2012
Stabilization (aka Brownian Motion)
• Rationale
» Some tests are not predictably repeatable – sometimes they pass,
sometimes they fail
» This is true for a wide range of test approaches – CTP, TDK, etc.
• Approach
» Need to bound the population of Brownian tests
– Approximately 114 of 5600+ CTP tests (all – not just HN)
– Approximately 5 of 400+ TDK tests (all “good” tests)
» HN is a focus area since there are many potential factors that can affect
test results (eg, # of interfaces, fluctuating network conditions)
• Resolution
» Determine if the cause of the uncertainty is the network, the RI or the
test itself
• Eliminate the test, fix the test, or fix the RI to accommodate a fluctuating
network environment
• Will rely on RiScriptlet for test automation
Robustness
• Rationale
» We have done very little stress testing
on the RI
» We would like to begin addressing
robustness issues now
• Approach
» Identify potential areas to research
» Focus on a few of those initially
Possible Approaches - 1
• Brownian Motion CTP investigation
• FindBugs (a static analysis tool)
• Drill down into code/walk-throughs
• Identify functional areas
» Socket timeouts
» Run in context
» Monitor/locks
» Lack of synchronization in CyberGarage
• Design review of CyberGarage to gain better understanding
• Address individual OCORIs which are categorized
• Stop new feature development and focus on fixing issues
• Set up "Typical/Real world" Home Network testing environment
• Perform STB type testing
• Gather stability metrics
• Expand Rx Framework testing
Possible Approaches - 2
• Run CTP tests (either all or a subset) without consecutive reboots
• More Guide-type testing by developers
• Identify scenarios with guides to be run
• Start fixing known issues rather than looking for more issues
• Perform deployment-type testing
• Run CTP tests on a busy network
• Investigate isolation of JVM/platform/stack to do targeted testing
• Run surfer tests
• Prioritize guide issues
• Investigate TDK Brownian motion
• Form a focus team to address discovery issues in a simplified environment (local loopback)
• Re-architect problem areas
Miscellaneous Topics
March 15-16, 2012
MPEOS Review
• Low priority task to review and correct MPEOS
header file comments, remove unused items, etc
• Comments in Doxygen format
» Will expose this info as HTML pages
» Will not update MPEOS Porting Guide
• Initial file to be refactored was mpeos_dvr.h
» Was integrated into the 1.2.1 Release
• Next up are mpeos_media.h and mpeos_hn.h
» Targeting 1.2.2 Rel-B
RI Conditional Access
RI provides support for Conditional Access.
– Portable implementation
– Compliant with CCIF 2.0 CA Support (Section 9.7)
– MPE and Java-level decrypt session constructs
– RI MPE CA management can be disabled via ini configuration (when disabled, the CA management is done in the MPEOS porting layer)
MPE POD Manager Responsibilities
• Methods to initiate/terminate decrypt sessions
• CAS session opened when POD_READY indication is
received
• All CA APDUs processed by the MPE POD Manager
• CA_PMT constructed for the selected Service when MPE
decrypt session is started
• Explicit LTSID assignment. MPE POD Manager assigns a
randomly generated LTSID when initiating decrypt session
• LTSID is included in the CA session event and passed
around to all layers to maintain tuner to LTSID association
• All CA sessions on a tuner share the same LTSID
• CA sessions between clients are shared when possible (e.g.
decode and buffering of the same program on a tuner)
MPE POD Manager Responsibilities
• CableCARD decrypt resource capabilities queried at start-up
• POD Manager internally keeps track of resources being used
• High priority decrypt requests will pre-empt lower priority
requests (e.g. ServiceContext select is higher priority than
DSMCC ServiceDomain attach)
• CP session is opened when CA session is successfully
started
• Terminate CP session when CA session is stopped
• CCI support (can be signaled via java/MPE decrypt session)
RI Conditional Access
CA session supported for
– JMF BroadcastSession (initiate decrypt session
prior to calling mpe_mediaDecode())
– DVR (initiate decrypt session prior to
mpe_dvrTsbBufferingStart())
– Object Carousel (initiate decrypt session before
attaching a DSMCC service domain)
– Section Filter (initiate decrypt session before setting
native section filter)
RI Conditional Access
OCAP ODL Stack Basic Decrypt/Decode (sequence diagram summary):
• DVBServiceContext.select() is called for Service s1; the MPE SI Manager initiates section filtering for PAT/PMT (mpe_mediaTune(freq1,mod1), mpe_filterSetFilter(), MPE_SF_EVENT_SECTION_FOUND).
• When the tune completes (MPE_TUNE_SYNC), a decrypt session is started with mpe_podStartDecrypt(): a random LTSID is generated, the program index table is updated, and a CA_PMT is generated using the LTSID extracted from the decrypt session.
• The decode session follows: mpe_dvrTsbConvertStart() and mpe_mediaDecode() start the JMF Broadcast Player, with decode initiated using the Locator-specified components/PIDs from the PMT.
• When the JMF player is stopped: mpe_mediaStop() and mpe_stopDecrypt() are called; a CA_PMT (not selected) is generated and the program index table is updated.