
eHealth Exchange and EHR │ HIE Interoperability Workgroup
Joint Testing Task Group
Test Process Guide
A single-source set of guidance for conducting testing for onboarding to the eHealth Exchange or EHR│HIE Interoperability Workgroup.
CHANGE HISTORY

DATE      VERSION   DESCRIPTION                                           CHANGED BY
8/24/12   v001      Initial draft                                         ONC Test Team
8/28/12   v002      General edit; bring questions and discussion items    ONC Test Team
                    inline; add section on Community Profiles; add
                    Functional Areas and Associated Test Cases
Table of Contents

1. Introduction
   1.1 Purpose
   1.2 Intended Audience
   1.3 Relationship to Other Documents
2. Test Process
   2.1 Test Process Summary
   2.2 Workstream I: Exchange-Ready System
       2.2.1 Requirements
       2.2.2 Roles and Responsibilities
       2.2.3 Activities
   2.3 Workstream II: Event-Based Interoperability Testing
       2.3.1 Requirements
       2.3.2 Roles and Responsibilities
       2.3.3 Activities
3. Test Approach
   3.1 Coverage of Spec Functionality
   3.2 Top Down v. Bottom Up Approach
   3.3 Modular Approach
   3.4 Harmonization of Additional Specifications
   3.5 System Actors
4. Test Tools
5. Reporting Test Results
   5.1 Message Conformance
   5.2 Evidence of Message Exchange
       5.2.1 Server Logs
       5.2.2 Extracting a Message from a Log
   5.3 Testing Summary Report
6. Appendix A: Testing Summary Report – Exchange-Ready System
7. Appendix B: Testing Summary Report – Event-Based Interoperability Testing
8. Appendix C: Community Profiles for Event-Based Interop Testing
   8.1 Introduction
   8.2 Creating a Community Profile
       8.2.1 Exchange Community
       8.2.2 Building a Community Profile
   8.3 Selecting a Community Profile
   8.4 Example Community Profile
   8.5 Functional Areas and Associated Test Cases
1. Introduction
Throughout the summer of 2012, the process for demonstrating conformance to
NwHIN specifications and cross-gateway interoperability is evolving from a highly
supported process to an independent self-guided process.
The eHealth Exchange has collaborated with the EHR│HIE Interoperability
Workgroup to develop a single set of testing artifacts and a unified testing process
for the certification of compliant products, and for applicants to join either exchange
network. This will align testing requirements for both networks, allow for shared
resources, and minimize market fragmentation.
1.1 Purpose
This Test Process Guide provides information for system developers that are testing
to certify as Exchange-Ready Systems, and for implementers of those systems to
conduct Event-Based Interoperability testing. The Test Process Guide covers the
testing processes and requirements, the available tools and resources, and how to
execute testing. It is recommended that this artifact be reviewed before reading
the Test Execution Guides.
NOTE: This published version of the Test Process Guide includes open questions
and discussion items that will be addressed through pilot project work and future
work of the Certified Testing Body. Open questions and discussion items are
highlighted for easy reference.
Example: How should this issue be addressed?
1.2 Intended Audience
This document is intended for test engineers, technical/testing project managers,
and executive personnel to understand the testing process, roles and
responsibilities, and activities for the workstreams described.
1.3 Relationship to Other Documents
The reader should use this Test Process Guide as an introduction to testing. This
document references other artifacts that provide more detailed requirements for
executing test cases; test packages; and test execution tools. All of these artifacts are
indexed in Artifact 1: Test Guide Index.
2. Test Process

2.1 Test Process Summary
This Test Process Guide defines a self-guided course for testing systems that wish to
be certified as Exchange-Ready Systems, and for testing implementations of certified
systems for onboarding to either the eHealth Exchange or the EHR│HIE
Interoperability Workgroup healthcare exchange networks.
The self-guided process contains information for the following Workstreams:
• I: Exchange-Ready System. A product vendor or system developer executes self-guided testing for certification as an Exchange-Ready System (ERS) by the Compliance Testing Body (CTB).
• II: Event-Based Interoperability Testing. An implementation of an Exchange-Ready System (ERS) demonstrates interoperability with production Participants from either the eHealth Exchange or the EHR│HIE Interoperability Workgroup in order to join these exchange networks.
Discussion Item: “Exchange-Ready System” and “Event-based Interoperability
Testing” have been defined here as independent workstreams. To date, there has
been no explicit direction from the program about linkage between the two
workstreams: i.e., that Event-based Interoperability testing must be executed with
an Exchange-Ready System. The assumption is that this is the intention; should this
be made explicit here? Additionally, both workstreams utilize a single suite of
testing artifacts, from which specific test cases are identified for each workstream.
Separating the activities into Workstream I and Workstream II aligns with this
approach.
Open Question: Will the CTB publish a list of Exchange-Ready Systems? Where?
The following testing workflow diagram represents a high-level view of the test
process applied to the eHealth Exchange onboarding process.
Figure 1: Test Process Overview
2.2 Workstream I: Exchange-Ready System
In Workstream I, a product vendor or system developer executes testing of a
production-ready system to attain certification of an Exchange-Ready System (ERS)
by the Compliance Testing Body (CTB).
2.2.1 Requirements
The system developer selects one or both roles for one or more of the defined web
services adopted by the two exchange networks, and executes required testing for
that role/service together with the required message transport and security testing,
as applicable.
Links to the NwHIN specifications for these web services are available at:
http://www.healthit.hhs.gov/portal/server.pt/community/healthit_hhs_gov__nhin_resources/1194
Open Questions: What will the CTB’s requirements be for entering this
workstream? What are the specific requirements that define a production-ready
system? (This has been an issue for Onboarding to date.)
Following is a selection of the specifications, roles and the transport / security specs
that apply to those web services.
Specification                      Role
Patient Discovery (PD)             Initiator / Responder
Query for Documents (QD)           Initiator / Responder
Retrieve Documents (RD)            Initiator / Responder
Messaging Platform (MP)            Initiator / Responder
Authorization Framework (AF)       Initiator / Responder
PLUS Transport / Security

Figure 2: Web Services + Transport and Security
The requirements for this Workstream consist of testing for basic spec functionality
for PD/QD/RD, modularly combined with messaging transport and security
protocols, as applicable.
Discussion Item: We have proposed drawing the line for basic testing at basic
send/receive plus processing a SOAP fault (includes full message verification). This
bar may shift based on pilot feedback.
The required test cases for basic spec functionality are identified in the individual
Test Execution Guide and test package for each web service. All other test cases in
the test packages represent advanced functionality and are utilized in Workstream
II: Event-Based Interoperability testing.
Note: Refer to the Messaging Platform & Authorization Framework Test Execution
Guide for information about testing requirements for MP & AF.
2.2.2 Roles and Responsibilities
The following roles and responsibilities are identified for this Workstream.
Product Vendor / System Developer: Executes self-guided basic testing of the
selected specifications and roles for a production-ready system, and submits results
to the CTB.
CTB: Provides subject matter expertise pertaining to specifications and test artifacts
during testing. Evaluates the results of testing and certifies the product as an
Exchange-Ready System.
2.2.3 Activities
To enter Workstream I: Exchange-Ready System, a product developer / vendor
provides the following information to the CTB:
Open Question: What information is documented/provided at this step?
2.2.3.1 Self-Guided Testing
The CTB directs the product vendor / system developer to test artifacts and
documentation.
Open Question: Where do these artifacts live?
The vendor / developer retrieves the appropriate test packages and conducts self-guided testing of the required test cases for Basic Functions of the selected web services/specifications. Basic Functions include basic send and receive and error processing. Detailed information about required test cases is provided below and in the companion Test Execution Guides for Patient Discovery, Query for Documents, Retrieve Documents, and Messaging Platform & Authorization Framework.
Throughout testing, the CTB provides subject matter expertise to the developer for
questions relating to the specifications or test artifacts.
Open Question: How is this service level quantified?
2.2.3.2 Submit Evidence of Testing
Upon completion of testing, the product vendor / system developer submits test
results and messaging artifacts to the CTB as described below in Reporting Test
Results. The CTB evaluates the test results and either certifies the product as an
Exchange-Ready System or directs the developer to re-test.
Discussion Item: What does formal CTB results evaluation look like? Example?
2.2.3.3 Certification
Open Question: Are there formal certification steps that need to be detailed either
here or in an attachment? Other CTB processes?
2.3 Workstream II: Event-Based Interoperability Testing
In Workstream II, an implementation of a certified Exchange-Ready System
demonstrates interoperability with other production participants in the eHealth
Exchange and/or EHR│HIE Interoperability Workgroup, according to that exchange
network’s requirements.
2.3.1 Requirements
The implementation selects one or more exchange communities that it will
interoperate with and conducts interoperability testing with one or more
production participants from that exchange community in a live, production-like
environment. Testing requirements are defined in Community Profiles published by
each exchange community. Community Profiles and exchange communities are
described in further detail below in Appendix C: Community Profiles for Event-Based
Interoperability Testing
Discussion Item: The IHE Connectathon requires testing with 3 partners. Is this
realistic for these applicants? Can the eHealth Exchange and the IWG override this
requirement? Is this Connectathon requirement driving the eHealth Exchange’s
requirement that applicants must test with 3 partners? If so, what about other
interoperability testing venues that don’t have this requirement? If not, why does
Exchange require testing with 3 partners? Testing with 3 different types of
production participants (i.e., representing 3 different community profiles and 3
different sets of testing requirements) may not be realistic, particularly for an
implementation that is supporting only a small subset of functionality. For example,
the eHealth Exchange allows participants to onboard with as little as one spec
supported, as one actor – i.e., Document Retrieve as Responder only.
Discussion Item: The IHE Connectathon has its own requirements for “passing” its
testing event, including its own test packages. How will this dovetail with our testing
requirements?
Discussion Item: What are the requirements for an interop testing venue? Could a
Federal production participant conceivably execute testing with an applicant (i.e.,
partner testing) that covers all of our testing requirements and call that Event-based
Interop testing? If so – and we think this is a very plausible scenario – this could be
very attractive to the Federal partners. Think about publishing requirements for an
interop testing venue: environment, tools, oversight, etc.
Discussion Item: What happens if there isn’t a production Participant that has
implemented the Community Profile that an applicant wants to test? For example, a new
spec? Or advanced functionality that no other Participants currently use?
Interoperability testing encompasses the functional areas of the specifications, beyond
the basic functionality already demonstrated by Exchange-Ready Systems, that are
supported by the selected exchange community and documented in its Community
Profile. Testing may also include content requirements
adopted by the exchange community.
2.3.2 Roles and Responsibilities
The following roles and responsibilities are identified for this Workstream.
Exchange Community: An Exchange Community consisting of one or more
production participants from the eHealth Exchange or EHR│HIE Interoperability
Workgroup publishes a Community Profile based on the messaging functionality
that it requires its exchange partners to support. Detailed information about
Community Profiles can be found below in Appendix C: Community Profiles for Event-Based Interoperability Testing.
Implementer: Implements an Exchange-Ready System. Selects a Community Profile
and executes interoperability testing of advanced functions with a production
participant from that exchange community during a live testing event. Submits
evidence of testing to the CTB.
CTB: Coordinates live, production-like interoperability testing events. Provides
subject matter expertise pertaining to specifications and test artifacts during testing.
Evaluates the results of testing and makes a recommendation to the Governing Body
of the exchange network(s) for acceptance of the implementation as a participant.
2.3.3 Activities
To enter Workstream II, an organization implements an Exchange-Ready System.
Open Question: What documentation needs to be supplied to the CTB to enter this
workstream?
2.3.3.1 Select a Community Profile
A Community Profile is a bundle of messaging functions that are adopted by a
community of exchange partners on an exchange network. These messaging
functions are described in the core specifications, which levy requirements on each
functional area. The test artifacts include test cases that map to functional areas of
the specifications; a functional area may map to one or several test cases. See the Test
Execution Guides for more detail on the bundling of messaging functions into spec
functional areas.
Detailed information about Community Profiles can be found below in Appendix C:
Community Profiles for Event-Based Interoperability Testing.
2.3.3.2 Conduct Event-Based Interoperability Testing
The implementation executes testing of the bundle of spec functions, per the
selected Community Profile, during live interoperability testing events. The
interoperability testing event provides a production-like environment where
multiple exchange partners conduct real-time messaging and demonstrate
interoperability.
The CTB coordinates testing events and provides oversight and assistance during
testing, including issues related to the specifications, test artifacts and tools.
Exchange communities can also use these interop testing events to test for other
requirements such as edge protocols, content specifications, or other community-specific implementation requirements.
Discussion Item: Based on real-world experience, it is this last area of testing that
is often the key blocker to interoperability. Not the adoption of spec functions, but
rather implementation choices where spec ambiguity exists. It should be noted that
an implementation could pass all of the test cases against an RI and still be
unsuccessful in live interop testing with a production participant because of this
issue. While the governing bodies (i.e., CC) may not mandate full interoperability
testing of these edge protocols, the interop testing events are a great opportunity to
work out the kinks and demonstrate true interoperability with production
participants. Note the optional section for defining these implementation-specific
requirements in the example Community Profile, below.
Discussion Item: How will the exchange networks define/address re-testing
requirements?
2.3.3.3 Submit Evidence of Testing
Upon completion of testing, the implementer submits test results and messaging
artifacts to the CTB as described below, Reporting Test Results. The CTB then
evaluates the test results and makes a recommendation for participation to the
governing body of the exchange network (eHealth Exchange or EHR│ HIE
Interoperability Workgroup).
Once approved for participation, the governing body (or its designee) issues a
production digital certificate and updates the services registry with new participant
information.
3. Test Approach
In creating a unified test process for the eHealth Exchange and the EHR│HIE
Interoperability Workgroup, the existing testing artifacts from both exchange
networks were analyzed and the following issues were addressed.
• Coverage of specification functionality
• Top down v. bottom up approach
• Modular approach
• Black box v. white box
The existing test artifacts were harmonized into a single test package for each
specification with these considerations.
3.1 Coverage of Spec Functionality
The eHealth Exchange test artifacts provide comprehensive coverage of all of the
technical functions and message syntax requirements described in the
specifications. Test cases or verification tools verify conformance to all of the
requirements for behavior and message formats that are required by the
specification. This enables an Exchange participant to demonstrate conformance to
an entire spec.
The EHR│ HIE Interoperability Workgroup test artifacts provide coverage of the
basic messaging functions described in the specifications, and list some verification
checks for messages exchanged. This enables an EHR│ HIE Interoperability
Workgroup member to demonstrate its ability to exchange general messaging as
required by a business use case.
The harmonized test packages retain all of the test cases and conformance checks,
organized into functional areas of the specs. This allows an exchange community to
define its business use case and then select the relevant areas of the specifications
that need to be tested to support that use case. For added traceability, test cases that
map to the test scenarios described in the EHR│ HIE IWG Test Spec reference the
original ID for those scenarios.
3.2 Top Down v. Bottom Up Approach
The eHealth Exchange test artifacts take a “bottom up” approach to testing. That is,
test cases and verification checks are provided to enable a system to verify
conformance with every requirement in the spec.
The EHR│ HIE Interoperability Workgroup test artifacts take a “top down” approach
to testing. That is, test cases and verification checks are presented in a sequence of
steps as required by a particular business use case.
The unified test process allows an exchange community to define a business use
case that utilizes distinct functional areas of the specifications (top down), while
assuring rigorous testing of those areas (bottom up).
3.3 Modular Approach
The eHealth Exchange test artifacts are presented modularly for each of the core
specifications (PD, QD, RD) as individual, stand-alone packets. Although these
services are often used in combination, this approach allows implementers to test
any one of the services individually and also to test only one side (role) of the
transaction (i.e., as Initiator or as Responder).
The Messaging Platform (MP) and Authorization Framework (AF) test package is
also presented as an individual stand-alone packet so that a tester can select any of
the core specs (PD, QD, RD) and then test those in combination with MP and AF
functions.
Many of the functional tests for Messaging Platform & Authorization Framework
functions cannot be executed as is, but must have some sort of message layered on
top of the function under test in order to execute the test. This may be a PD, QD, RD,
or other type of message.
The EHR│HIE Interoperability Workgroup test artifacts describe test scenarios for
Patient Query, Document Query, Record Retrieval, and other specs, in which the
tester executes a test scenario that comprises an end-to-end transaction. These test
scenarios include: End User Authentication, Messaging Platform, Authorization
Framework, the core specification under test, and Message Rendering.
The unified test process maintains a modular approach by keeping all of the test
packages separate and enabling a tester to select which functions of which spec will
be tested. The End User Authentication and Message Rendering requirements for
EHR│HIE IWG (which are not tested for eHealth Exchange), are also presented
separately so they can be tested modularly with any of the core specs.
The following graphic illustrates how an implementation of an Exchange-Ready
System selects test cases for functional areas required by a Community Profile,
whether as a core specification or as an edge protocol.
Figure 3: Modular approach
3.4 Harmonization of Additional Specifications
A delta analysis of the eHealth Exchange test artifacts and the EHR│HIE test artifacts
revealed different levels of spec coverage, as described above (Coverage of Spec
Functionality).
For Patient Query (or Patient Discovery), the EHR│HIE IWG adopted three IHE
profiles: XCPD, PIX, and PDQ. The eHealth Exchange only adopted XCPD. To create a
harmonized test package for Patient Query (PD), the PIX and PDQ testing described
in the EHR│HIE Test Spec was transcribed into the “test steps” format used for the
XCPD testing in the eHealth Exchange test packages. These new test cases
were added to the PD test package, so that all three IHE profiles are represented in a
single artifact. A column was added to enable a tester to sort for test cases that are
relevant to the Patient Query method selected.
3.5 System Actors
The NwHIN specifications levy requirements on messaging between gateways and
do not dictate functions for systems/actors that sit behind the gateway (MPI,
regional HIEs that feed data to the gateway, other actors). The eHealth Exchange
test artifacts were created to reflect this “black box” platform-neutral approach and
test cases deliberately do not identify actors, only a System Under Test (“System”)
and Test Tool. Test cases are written to examine messaging only between gateways,
and not between other actors behind the gateway.
Although the EHR│ HIE Test Spec adopts the NwHIN specifications, the IWG test
approach looks at message transactions between a variety of actors including EHR
(Electronic Health Record), HIE (Health Information Exchange), SMPI (Statewide
Master Patient Index), Tester, and RLS (Record Locator Service). However, the Test
Spec often refers to the “System” performing a test action without identifying the
actors of that System.
To create a unified test process, the harmonized test artifacts bring the EHR│ HIE
test approach in line with the eHealth Exchange test approach by applying a
consistent use of the term System to refer to any actor that comprises the System
Under Test, which is taking the action in the test case, without identifying distinct
actors.
The test environment is intentionally simplified, to accommodate different
deployment models. We begin by abstracting two actors: an Initiator and a
Responder.
Figure 4: Actors for testing – Initiator and Responder
The Initiator aggregates all behavior expected of the initiating system in a black-box
fashion, regardless of how that behavior is distributed among components. For
example, for a PD request, the Initiator is responsible for the following:
• Locating the local patient record
• Locating and verifying the certificate of the responder
• Signing the request message and transmitting over TLS
• Verifying a patient match after a Response is received
Note that Initiator and Responder were chosen rather than the common “client” and
“service” terms. This is because the services support more complex transactions, for
example: in a deferred PD request, the Initiator is a web service client for the initial
request, but a web service for the deferred response.
Likewise, the Responder aggregates all behavior expected of the responding system.
For example, for a PD response, the Responder is responsible for the following:
• Locating and verifying the certificate of the Initiator
• Receiving and processing the request message from the Initiator
• Verifying the signatures of the message
• Running patient matching algorithms
Internal behavior/messaging within the Initiator and Responder actors (for
example, between a Master Patient Index and a gateway) is not tested.
Where the EHR-HIE Test Spec does test internal actors, adding testing requirements
outside the scope of NwHIN specs, this testing is distinct from the spec-mandated
test cases and incorporated as “edge protocols”, as described above.
4. Test Tools
The unified test approach utilizes a common set of testing and verification tools,
such as automated SOAP UI test scripts, NIST validator tools, and other utilities,
which are collated in a Reference Toolkit. A long-term goal for the eHealth Exchange
and EHR│HIE IWG community is to continue to advance these tools, which will also
be leveraged by the CTB in its certification process.
The Reference Toolkit is fully described in Artifact 3: Testing Toolkit User’s Guide.
This artifact provides detailed information about:
• Setting up a test environment
• Automated test execution methods
• Reference test data
• Message verification tools
• More ...
5. Reporting Test Results
This section describes the process and artifacts for recording and submitting test
results. The artifacts that serve as evidence of testing are similar for Workstream I
and Workstream II, as both workstreams draw from the same set of test cases. Use
the Testing Summary Report formats that follow to submit evidence of message
conformance and message exchange for each test case.
5.1 Message Conformance
For each test case executed, the system developer (Workstream I) or implementer (Workstream II) will
document and submit verification of message conformance for messages that were
sent or received, as directed by the test steps. Message conformance MUST be
verified using the manual checklists included with each test package. In addition,
automated tools such as the NIST validators may be used. A complete description of
message validation tools can be found in Artifact 3: Testing Toolkit User’s Guide.
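As an illustration only, and not a substitute for the NIST validators or the manual checklists, the sketch below shows what a basic automated structural check of an extracted message could look like. The file and schema paths are hypothetical assumptions; a real check would use the schemas published with the specification under test.

    from lxml import etree

    # Hypothetical paths: an extracted message and a schema shipped with the
    # specification under test.
    MESSAGE_PATH = "extracted_messages/PD_request_1.xml"
    SCHEMA_PATH = "schemas/PRPA_IN201305UV02.xsd"

    def validate_against_schema(message_path: str, schema_path: str) -> bool:
        """Report whether the message is structurally valid against the XSD."""
        schema = etree.XMLSchema(etree.parse(schema_path))
        message = etree.parse(message_path)
        if schema.validate(message):
            print("Message is schema-valid.")
            return True
        for error in schema.error_log:
            print(f"line {error.line}: {error.message}")
        return False

    if __name__ == "__main__":
        validate_against_schema(MESSAGE_PATH, SCHEMA_PATH)

A check like this only covers structural conformance; the manual checklists remain the source of the element-by-element requirements.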
Discussion Item: While the HIE community routinely turns to the NIST validators to verify message conformance, we have not seen an analysis of the coverage of those validators. It is known that there are some areas that the NIST tools do not verify, as well as known errors in the tools. The manual checklists provide line-by-line checks of every element in the message, providing, hopefully, complete coverage of spec requirements for every element.
• Will the CTB require messages to use NIST validators?
• Is the applicant required to pass the NIST validators cleanly?
• Is use of the manual checklists required?
• How will a candidate present evidence that messages “pass” the manual checklist validation? Self-attestation?
• How do you know when the developer/implementer doesn’t agree with the requirements and marked it as a pass anyhow? (The Onboarding Team has lots of good examples of this.)
• Should the CTB perform an analysis of NIST validator coverage in order to explicitly identify gaps that can be covered by the manual checklists?
• How will a candidate use other, as yet unidentified, message validation tools? Does a new tool need to be vetted by the CTB?
5.2 Evidence of Message Exchange
The developer (Workstream I) or implementer (Workstream II) will submit evidence of testing of a
selection of system functions. The following artifacts must be submitted for each
test case executed.
1. Application server logs from both initiator and responder (e.g., GlassFish).
2. Cross-gateway SOAP messages extracted from the server logs.
3. A mapping of messageIds to the specific test case they reflect (an illustrative format is sketched below).
4. Notes or other observations of system behavior.
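The guide does not prescribe a format for the messageId-to-test-case mapping in item 3. A minimal sketch of one possible format follows; the file name and column names are illustrative, and the example messageId is the one used in the Appendix A example row.

    import csv

    # One row per test case / message pair submitted with the test results.
    mapping = [
        ("PD1", "58a90a66:1319050bb5e:-7f41"),
    ]

    with open("message_id_mapping.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["test_case_id", "message_id"])
        writer.writerows(mapping)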
5.2.1 Server Logs
In each web service call, both sides of the exchange (i.e. the initiator and the
responder) should log the SOAP message that is passed. The way to configure this is
specific to each implementation, but here is an example using a Metro/Glassfish
server: https://blogs.oracle.com/arungupta/entry/totd_1_soap_messaging_logging.
Once you have server logs, you will need to extract specific SOAP messages for
validation. The focus for the analysis is on the Test Tool logs if possible. By
extracting the message from the Test Tool side, we are assured that the extracted
message (whether it is a request, a response, or an acknowledgement) is the
message that was actually sent by the System Under Test. If you extract messages
from the System Under Test logs, you run the risk of extracting an intermediate
message created along the way to building the message that actually was exchanged.
However, if the Test Tool log has been cut off or truncated, the analysis or
verification can be done on the System Under Test logs if necessary. A message has
been cut off or truncated if you're unable to find the entire message from the SOAP
element's start tag (for example, <soap:Envelope>) to its end tag (for example,
</soap:Envelope>) for the message you're trying to find.
5.2.2 Extracting a Message from a Log
First search the log for the WS-Addressing action of the message you want to
extract. Some common actions are shown below:
• Patient Discovery request: urn:hl7-org:v3:PRPA_IN201305UV02:CrossGatewayPatientDiscovery
• Patient Discovery response: urn:hl7-org:v3:PRPA_IN201306UV02:CrossGatewayPatientDiscovery
  NOTE: The difference between the request string and the response string is just the '5' or '6' in 'PRPA_IN20130...UV02'.
• Query for Documents request: urn:ihe:iti:2007:CrossGatewayQuery
• Query for Documents response: urn:ihe:iti:2007:CrossGatewayQueryResponse
• Retrieve Documents request: urn:ihe:iti:2007:CrossGatewayRetrieve
• Retrieve Documents response: urn:ihe:iti:2007:CrossGatewayRetrieveResponse
For example, to locate a Patient Discovery request, search the log for "urn:hl7-org:v3:PRPA_IN201305UV02:CrossGatewayPatientDiscovery".
Once you find the action you are looking for, select everything within the SOAP
envelope element <S:Envelope> (everything between <S:Envelope> and
</S:Envelope>) and copy it to the clipboard.
Then, paste the content into a text editor. Repeat this process until you have
extracted and saved all XML messages needed for analysis.
NOTE: Depending on the way namespaces are used by the system, the SOAP XML elements in the logs may use a different namespace prefix, for example <soap:Envelope> or <soapenv:Envelope>.
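The manual copy-and-paste procedure described above can also be scripted. The following is a minimal sketch, not part of the Reference Toolkit, that pulls every complete SOAP envelope containing a given WS-Addressing action out of a log file, whatever namespace prefix the envelope uses. The log path, output directory, and file names are illustrative assumptions.

    import re
    from pathlib import Path

    # Illustrative locations; point LOG_FILE at the Test Tool (or, if truncated,
    # the System Under Test) log.
    LOG_FILE = "server.log"
    OUTPUT_DIR = Path("extracted_messages")

    # WS-Addressing action of the message to extract (Patient Discovery request here).
    ACTION = "urn:hl7-org:v3:PRPA_IN201305UV02:CrossGatewayPatientDiscovery"

    # Match a complete SOAP envelope regardless of the namespace prefix in use
    # (e.g., <S:Envelope>, <soap:Envelope>, <soapenv:Envelope>).
    ENVELOPE = re.compile(r"<(\w+):Envelope\b.*?</\1:Envelope>", re.DOTALL)

    def extract_messages(log_text: str, action: str) -> list:
        """Return every complete SOAP envelope whose content contains the action."""
        return [m.group(0) for m in ENVELOPE.finditer(log_text) if action in m.group(0)]

    if __name__ == "__main__":
        OUTPUT_DIR.mkdir(exist_ok=True)
        log_text = Path(LOG_FILE).read_text(errors="replace")
        messages = extract_messages(log_text, ACTION)
        for i, envelope in enumerate(messages, start=1):
            out_file = OUTPUT_DIR / f"PD_request_{i}.xml"
            out_file.write_text(envelope)
            print(f"wrote {out_file}")
        if not messages and ACTION in log_text:
            # The action appears but no complete envelope was found: the log is
            # probably truncated, so fall back to the other side's logs as
            # described above.
            print("Envelope appears truncated; try the System Under Test logs.")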
5.3 Testing Summary Report
A Testing Summary Report for Workstream I is attached as Appendix A, and for
Workstream II as Appendix B.
6. Appendix A: Testing Summary Report – Exchange-Ready System
eHealth Exchange and EHR│HIE Interoperability Workgroup
Appendix A
Testing Summary Report: Exchange-Ready Systems
Section I: System Developer / Vendor Information

Organization Name:
Address:
Phone:
Project Contact (Name, Phone, e-mail):
Testing Contact (Name, Phone, e-mail):
System Name:
Description:
Version:
Date of Testing Results Submission:
Section II: Specifications Tested

Select one or more Specifications tested and indicate one or both Roles for each.
Indicate the system's Role for Messaging Transport & Security testing (required).

Specification                      Role
Patient Discovery (PD)             Initiator / Responder
Query for Documents (QD)           Initiator / Responder
Retrieve Documents (RD)            Initiator / Responder

Messaging Transport & Security     Role
X   Messaging Platform (MP)        Initiator / Responder
X   Authorization Framework (AF)   Initiator / Responder
Section III: Test Case Execution

Section IIIa: Test Cases

List the test cases for Basic Functions that were executed for each specification, using the following matrix to identify each test case and its attached evidence of conformance and messaging.

For every test case, attach:
(1) Evidence of message conformance – NIST validator, manual checklist validation.
(2) Message logs and the extracted message that was sent during testing.

Spec                 Role        Test Case ID   Message ID                   Test Result Status*
                                                                             (Pass, Pass with comment, Fail)
Example:
Patient Discovery    Initiator   PD1            58a90a66:1319050bb5e:-7f41   Pass*

*Insert notes and observations on system behavior under Test Case Notes and Observations, referencing the Test Case ID and corresponding Message ID.
Section IIIb: Test Case Notes and Observations
Note system behavior and other observations from testing in this section. Include
the Test Case ID and corresponding Message ID for each test case discussed.
Example:
PD1: 58a90a66:1319050bb5e:-7f41
System successfully executed this test; message passed the NIST validator with
some errors, per the attached report. All of the errors are known false negatives.
The message passed the manual checklist successfully.
Section IV: Attachments – Logs, Messages, and Validation
For each test case, attach:
(1) Message logs
(2) Messages exchanged
(3) Evidence of message validation
Highlight the messageID for each test message that maps to the test cases listed in
Section III.
Section V: Self-attestation of Message Validity
This statement serves as a self-attestation of the validity of the messages exchanged
during testing. The messages have been validated line-by-line against the manual
Checklists included in each test package.
Signature of Test Engineer:

Name:
Organization:
Phone:
e-mail:
Discussion Item: What evidence is attached for manual checklist validation? Does
the CTB need a statement of self-attestation?
Discussion Item: Is there going to be a list of acceptable ‘false negatives’ that are
known from the NIST validators? Is the CTB going to keep that list updated? Are
there going to be errors that are considered passable because no gateway can fix them
to conform to the spec?
Discussion Item: We could (a) list out the actual Test Case IDs for all the required
test cases, or (b) ask the applicant to list the Test Case IDs. We have shied away from
identifying individual test cases throughout this documentation to accommodate
future edits and changes.
Open Question: Does this work for Direct? We may need to have a more complex
questionnaire to elicit information about the specific deployment and which test
cases apply.
7. Appendix B: Testing Summary Report – Event-Based Interoperability Testing
eHealth Exchange and EHR│HIE Interoperability Workgroup
Appendix B
Testing Summary Report: Event-Based Interoperability Testing
[same as previous with additional sections for information about the interop testing
event, production participants tested with, other interop testing notes/feedback.
Note that the matrix for recording test cases executed will be much larger.]
8. Appendix C: Community Profiles for Event-Based Interop Testing
8.1 Introduction
The NwHIN specifications provide for an array of messaging functions associated
with each spec. Some of these functions are required by the specification to be
implemented/supported, and some are optional. The testing process for
implementations of Exchange-Ready Systems to join either the eHealth Exchange or
the EHR│HIE Interoperability Workgroup exchange network uses Community
Profiles to identify which messaging functions have been implemented by
production participants in a selected exchange community.
This allows production participants within an exchange community to specify which
functions of which specifications they require their exchange partners to support,
and accordingly, which functions applicants must test during Event-Based
Interoperability Testing.
The test packages for each of the NwHIN specifications include test cases and
message verification checks that map to each of the functional areas of the specs.
Test cases are flagged as Basic or Advanced. Basic functions are tested by Exchange-Ready Systems; all other functions are Advanced, and are tested during Event-Based
Interop testing as required by Community Profiles.
Discussion Item: We introduce the term Community Profile here. How do we
distinguish between a Spec Factory-type profile like esMD and what we’re trying to
express here? Should we use a term other than Community Profile?
8.2 Creating a Community Profile
8.2.1 Exchange Community
To create and publish a Community Profile, we begin by defining an exchange
community. An exchange community is one or many participants in the eHealth
Exchange or the EHR│HIE Interoperability Workgroup exchange network that have
implemented a common set, or bundle, of spec functions to enable full
interoperability among their exchange partners.
An exchange community can be defined at any level of granularity: a single
participant that has implemented a unique bundle of functions that are rarely used
by other participants; a group of adjacent HIEs; a set of HIEs, IDNs, and Federal
partners all exchanging data for a single use case; or even an entire exchange
network that requires its participants to support particular spec functionality.
Discussion Item: While there is good utility to allowing any level of granularity for
defining exchange communities, this is a real discussion item. In theory, the
Exchange and the IWG “require” all of their participants to “fully” support the
NwHIN specs. In practice, this isn’t the case. Defining community profiles is a step
towards enabling real interoperability among exchange partners who truly need to
achieve it; it’s a step away from “everyone interoperates with everyone”.
8.2.2 Building a Community Profile
The test packages for each spec contain many columns that can be used to sort the
test cases. An exchange community uses the following columns/designations to
identify which test cases it will require its exchange partners to execute.
8.2.2.1 Functional Area
The exchange community identifies which specific functions of the NwHIN
specifications it requires its exchange partners to implement in order to support its
use case. Using the “Functional Areas” designated in each test package, the exchange
community lists these in its Community Profile. Note that a Functional Area in the
test package may map to one or many test cases.
A complete list of Functional Areas and associated test cases is included below.
8.2.2.2 Basic / Advanced
The test cases are designated as Basic or Advanced. Basic test cases have already
been executed by Exchange-Ready Systems. The exchange community focuses on
the Advanced test cases to identify which functional areas its partners will test.
8.2.2.3 R/C/P
Every test case is marked as Required, Conditional, or Provisional. These
designations denote the classification of the function under test, as per the NwHIN
specifications (not as per an exchange community). The exchange community can
use this designation to understand which test cases will be required for the
functions it adds to its Community Profile.
• Required: This function must be supported by all implementations. Every Community Profile must support this function, and therefore all implementations must execute this test.

• Provisional: All implementations should execute this test case. Provisional test cases have test steps or outcomes that have not been defined by the specification. However, since they test important functionality, it is suggested that the Tester note test outcomes or System performance for Provisional test cases for further assessment.

• Conditional: All implementations that support the function under test must execute this test case. The Additional Info/Comments column provides more specifics about the condition that must be met for Systems to execute this test case. The majority of the test cases are Conditional.
The Conditional test cases are where a Community Profile can select which optional
functions its partners will test. The Conditional test cases then become Required for
implementations that wish to conform to this Community Profile.
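As a sketch of how an exchange community might apply these designations, the following assumes a test package exported to CSV with columns named after the designations described above ("Functional Area", "Basic/Advanced", "R/C/P", "Role", "Test Case ID"); the actual test package layout may differ, and the functional areas listed are only placeholders.

    import pandas as pd

    # Assumed CSV export of a test package; real column names may differ.
    cases = pd.read_csv("pd_test_package.csv")

    # Advanced functional areas this (hypothetical) exchange community requires.
    required_areas = ["Send SSN", "Process No Match Returned"]

    # Keep the Advanced test cases for those areas; Required and Conditional
    # cases become mandatory for partners conforming to this Community Profile.
    profile_cases = cases[
        (cases["Basic/Advanced"] == "Advanced")
        & (cases["Functional Area"].isin(required_areas))
        & (cases["R/C/P"].isin(["Required", "Conditional"]))
    ]

    print(profile_cases[["Test Case ID", "Functional Area", "Role", "R/C/P"]])

The same filter run per specification yields the list of test cases that an exchange community would publish in its Community Profile.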
8.2.2.4 Additional Considerations
Interoperability testing events create a great opportunity for exchange communities
to share additional, implementation-specific requirements that are defined beyond
the specifications. For example, time formats, restrictions on required value sets, or
test outcomes that were left undefined by the specifications. It is often these kinds of
variables, which are protected by the Local Autonomy Principle, that are the key to
achieving interoperability. An exchange community may choose to publish these
requirements up front in their Community Profile, or share this information during
testing.
Discussion Item: Do the governing bodies want to “publish” a list of Community
Profiles?
8.3 Selecting a Community Profile
An HIO that has implemented an Exchange-Ready System selects one or more
Community Profiles that it will test to. These are the Community Profiles of the
exchange communities with which the applicant intends to exchange data in
production.
8.4 Example Community Profile
Following is an example of a Community Profile for HIE “X”. This eHealth Exchange
production participant requires its exchange partners to demonstrate support for
the following Advanced functions.
An implementation that wishes to exchange data with HIE “X” must execute all of
the test cases that map to these functional areas in the test packages for each of
these specifications.
EXAMPLE COMMUNITY PROFILE
HIE “X”
Description:
HIE “X” is a community of exchange partners that are exchanging data in support of
the “Y” use case. In addition to the required Basic Functions of each NwHIN
specification supported by HIE “X”, this Community Profile requires
implementations to support the following Advanced Functional Areas of the NwHIN
specifications.
Specification: Patient Discovery (XCPD)
NwHIN Spec Version: 2.0

Functional Area                                  Role        Test Case ID
Send SSN                                         Initiator   PD-4
Send Address and Phone Number                    Initiator   PD-2
Send Multiple Addresses and Phone Numbers        Initiator   PD-3
Send Middle Name                                 Initiator   PD-5
Send Multiple Names                              Initiator   PD-6
Process No Match Returned                        Initiator   PD-13
Process SSN                                      Responder   PD-17
Process Address and Phone Number                 Responder   PD-18
Return Address and Phone Number                  Responder   PD-28
Process Multiple Addresses and Phone Numbers     Responder   PD-19
Process Middle Name                              Responder   PD-21
Process Multiple Names                           Responder   PD-23
Return Multiple Addresses and Phone Numbers      Responder   PD-29
Return Multiple Names                            Responder   PD-30
Specification: Query for Documents
NwHIN Spec Version: 3.0

Functional Area          Role        Test Case ID
Find Documents           Initiator   QD-3003, QD-3011, QD-3140, QD-3024
Find Documents: error    Responder   QD-3163, QD-3041
Specification: Retrieve Documents
NwHIN Spec Version: 3.0

Functional Area                Role        Test Case ID
Retrieve Multiple Documents    Initiator   RD-102, RD-113
Retrieve Multiple Documents    Responder   RD-215, RD-209, RD-213, RD-210
Specification: Messaging Platform & Authorization Framework
NwHIN Spec Version: 3.0

Functional Area                Role        Test Case ID
(no additional testing required)
Additional Interoperability Considerations (Optional)
The following requirements are based on implementation choices that HIE “X” has made, which affect interoperability with our exchange partners (an illustrative check is sketched after this list).
• Zip code must be in +4 format. Example: 85001-5000
• Must use port 443
• For QD, DocumentEntryServiceStartTime and StopTime are ignored
• purposeOfUse code and displayName values are to be set to TREATMENT
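As an illustration only, the sketch below checks an extracted message against two of the HIE “X” conventions listed above. The element and attribute names it looks for are assumptions made for the example; a production check would parse the address elements and the SAML assertion properly rather than scanning text.

    import re
    from pathlib import Path

    def check_hie_x_conventions(message_xml: str) -> list:
        """Return findings for two of the HIE "X" conventions listed above."""
        findings = []
        # Zip codes must be in +4 format (e.g., 85001-5000). The <postalCode>
        # element name is an assumption about the address representation.
        for match in re.finditer(r"<postalCode>([^<]+)</postalCode>", message_xml):
            if not re.fullmatch(r"\d{5}-\d{4}", match.group(1).strip()):
                findings.append(f"postalCode not in +4 format: {match.group(1)}")
        # purposeOfUse code and displayName must both be TREATMENT. This is a
        # crude text check; the SAML assertion layout is assumed.
        if "PurposeOfUse" in message_xml and message_xml.count('"TREATMENT"') < 2:
            findings.append("purposeOfUse code/displayName do not both appear as TREATMENT")
        return findings

    if __name__ == "__main__":
        xml = Path("extracted_messages/PD_request_1.xml").read_text()
        for finding in check_hie_x_conventions(xml):
            print(finding)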
Note: May not want to include Test Case IDs in Community Profile. Might be better if
this info lives in a worksheet attached to testing results submission.
Note: This approach puts the initial burden of identifying required test cases on the
exchange community.
8.5 Functional Areas and Associated Test Cases
The following matrix identifies all of the Functional Areas found in the test case
packages for PD, QD, and RD. This matrix is current as of August 2012, and will be
updated as future versions of the test packages are released.
The grayed-out text represents Basic Functions that are already tested by Exchange-Ready Systems.
Patient Discovery (XCPD): Initiator (Functional Area – Test Case IDs)
  1. Basic send query – PD1
  2. Process SOAP fault – PD14
  3. Deferred: Basic deferred send query – PD36
  4. Send all optional query parameters – PD50
  5. Send address and phone number – PD2
  6. Send multiple LivingSubjectIds – PD52
  7. Send SSN – PD4
  8. Process multiple matches from diverse Assigning Authorities – PD9
  9. Process no match returned – PD13
 10. Handle missing SAML assertions – PD42
 11. Send multiple addresses and phone numbers – PD3
 12. Send middle name – PD5
 13. Send multiple names – PD6
 14. Send multiple PrincipalCareProviderIDs – PD7
 15. Process request for additional query parameters – PD8
 16. Process multiple phone numbers and addresses returned in match – PD10
 17. Process special error condition – PD11, PD12
 18. Deferred: Process empty RelatesTo – PD37
 19. Deferred: Process invalid acknowledgement – PD38
 20. Deferred: Process no acknowledgement – PD39
 21. Deferred: Handle duplicate response – PD40
 22. Deferred: Handle unrelated response – PD41
 23. Deferred: Process invalid response – PD51
Patient Discovery (XCPD): Responder (Functional Area – Test Case IDs)
  1. Basic receive query – PD15
  2. Process other optional query parameters – PD22
  3. Return multiple matches from diverse Assigning Authorities – PD27
  4. Return no match – PD33
  5. Return SOAP fault – PD34
  6. Handle phonetic issues – PD35
  7. Deferred: Basic receive deferred query – PD43
  8. Process SSN received in query – PD17
  9. Process address and phone number – PD18
 10. Return multiple matches from diverse Assigning Authorities – PD26
 11. Return address and phone number – PD28
 12. Process invalid query – PD16
 13. Process multiple addresses and phone numbers – PD19
 14. Process middle name – PD21
 15. Process multiple names – PD23
 16. Process multiple PrincipalCareProviderIDs – PD24
 17. Request additional query parameters – PD25
 18. Return multiple addresses and phone numbers – PD29
 19. Return multiple names – PD30
 20. Return special error condition – PD31, PD32
 21. Deferred: Error processing – PD44, PD45, PD46, PD47, PD48, PD49
Patient Discovery (PIX): Initiator (Functional Area – Test Case IDs)
  1. Basic send query – PD52
  2. Send query with universal id – PD53
  3. Send non matching patient id and universal patient id – PD54
  4. Process matches from multiple Assigning Authorities – PD55
  5. Query for known domain – PD61
  6. Query for unknown domain – PD62, PD56, PD63

Patient Discovery (PIX): Responder (Functional Area – Test Case IDs)
  1. Process PIX query – PD57
  2. Process PIX query with universal id – PD58
  3. Process PIX query with patient id and universal id – PD59
  4. Process PIX query with non-matching patient id and universal id – PD60

Patient Discovery (PIXv3): Initiator (Functional Area – Test Case IDs)
  1. Basic Success – PD64

Patient Discovery (PIXv3): Responder (Functional Area – Test Case IDs)
  1. Basic Success – PD65

Patient Discovery (PDQ): Initiator (Functional Area – Test Case IDs)
  1. Basic Success – PD66
  2. Alternate success – PD67, PD69, PD72
  3. Error – PD68, PD70, PD71

Patient Discovery (PDQ): Responder (Functional Area – Test Case IDs)
  1. Basic success – PD73
  2. Alternate success – PD74, PD76, PD78, PD79
  3. Error – PD75, PD77

Patient Discovery (PDQv3): Initiator (Functional Area – Test Case IDs)
  1. Basic success – PD80

Patient Discovery (PDQv3): Responder (Functional Area – Test Case IDs)
  1. Basic success – PD81
Query for Documents: Initiator (Functional Area – Test Case IDs)
  1. Find documents – QD3000
  2. Find documents (advanced) – QD3003, QD3011, QD3121, QD3001, QD3135, QD3123, QD3156
  3. version 2.0 deferred creation documents find documents – QD3093
  4. version 3.0 - find documents on-demand documents – QD3061
  5. Find submission sets – QD3092, QD3094
  6. Error – QD3010, QD3012, QD3013, QD3014
  7. Find folders – QD3096, QD3097
  8. Get all – QD3098, QD3099, QD3225
  9. Get documents – QD3100, QD3101
 10. Get folders – QD3102, QD3103
 11. Get associations – QD3104, QD3197
 12. Get documents and associations – QD3105, QD3106
 13. Get submission sets – QD3107, QD3108
 14. Get submission sets and contents – QD3109, QD3110
 15. Get folder and contents – QD3111, QD3112
 16. Get folder for contents – QD3113, QD3114
 17. Get related documents – QD3115, QD3116
Query for Documents: Responder (Functional Area – Test Case IDs)
  1. Find documents – QD3019
  2. Find documents (advanced) – QD3140, QD3024, QD3158, QD3027, QD3158, QD3028, QD3029, QD3030, QD3203, QD3157, QD3145, QD3146, QD3147, QD3002, QD3122, QD3136, QD3005, QD3031, QD3023, QD3066, QD3222, QD3006
  3. find documents with service times – QD3025, QD3202, QD3124, QD3125, QD3126
  4. find documents with creation times – QD3026, QD3127, QD3128
  5. find documents - v3.0 on-demand documents – QD3139, QD3141
  6. find documents - v2.0 deferred creation – QD3032
  7. Find documents: error – QD3144, QD3163, QD3164, QD3196, QD3033, QD3036, QD3037, QD3041
  8. Get documents – QD3077, QD3045, QD3211, QD3171, QD3216, QD3155, QD3020, QD3192, QD3182, QD3183, QD3150
  9. General error – QD3034
 10. Find submission sets – QD3042, QD3056, QD3057, QD3223, QD3058, QD3059, QD3137, QD3138, QD3159, QD3221, QD3195
 11. Find folders – QD3043, QD3071, QD3072, QD3073, QD3198, QD3004, QD3074, QD3165, QD3166, QD3218, QD3194
 12. Get all – QD3044, QD3075, QD3200, QD3076, QD3205, QD3167, QD3168, QD3193
 13. Get folders – QD3079, QD3046, QD3210, QD3172, QD3215, QD3129, QD3191
 14. Get associations – QD3062, QD3170, QD3130, QD3190
 15. Get documents and associations – QD3047, QD3080, QD3209, QD3180, QD3214, QD3154, QD3189, QD3151
 16. Get submission sets – QD3048, QD3121, QD3081, QD3123, QD3188, QD3174
 17. Get submission sets and contents – QD3049, QD3082, QD3083, QD3199, QD3084, QD3022, QD3212, QD3085, QD3067, QD3131, QD3187, QD3175
 18. Get folder and contents – QD3050, QD3086, QD3087, QD3201, QD3207, QD3088, QD3208, QD3089, QD3206, QD3217, QD3069, QD3186
 19. Get folder for documents – QD3063, QD3090, QD3177, QD3178, QD3219, QD3132, QD3185
 20. Get related documents – QD3064, QD3091, QD3133, QD3134, QD3070, QD3068, QD3220, QD3153, QD3184
Retrieve Documents: Initiator (Functional Area – Test Case IDs)
  1. Retrieve documents (basic) – RD101
  2. Retrieve multiple documents – RD102, RD103, RD109, RD118, RD107, RD110
  3. Retrieve documents (advanced) – RD120
  4. Workflow: Data changes using Stable Documents – RD220, RD223
  5. On-demand documents – RD224, RD226, RD229, RD231
Retrieve Documents: Responder (Functional Area – Test Case IDs)
  1. Retrieve documents (basic) – RD201
  2. Retrieve multiple documents – RD215, RD209, RD213, RD210
  3. Error – RD104, RD108
  4. Retrieve documents (advanced) – RD202, RD203, RD204, RD205, RD206
  5. Workflow: Data changes using Stable Documents – RD221, RD222
  6. On-demand documents – RD225, RD227, RD228, RD230