(TBD) Site
SECURITY TEST PLAN AND EVALUATION REPORT
IN SUPPORT OF A/AN
CERTIFICATION AND ACCREDITATION EFFORT TYPE
for
STRATEGIC SITUATIONAL AWARENESS SYSTEM
SSAW, Version 1.0
01 MAY 2013
Include the month and year only on the cover page when completing this Artifact.
For Certification and Accreditation (C&A) visits, the above date will be the first day of
baseline and mitigation, respectively. The final version date (i.e., for inclusion within
C&A packages) will be the date that all updates are completed.
UNCLASSIFIED
TABLE OF CONTENTS

1 INTRODUCTION .......................................................... 1
  1.1 Scope .............................................................. 1
    1.1.1 Security Test Plan ............................................. 1
    1.1.2 Evaluation Report .............................................. 1
  1.2 Responsibility Matrix .............................................. 1
  1.3 Site/PO Name Status ................................................ 2
2 SECURITY TEST APPROACH ................................................. 3
  2.1 Retina, AppDetectivePro, and WebInspect Assessment Policy .......... 4
  2.2 Defense Information Systems Agency Gold Disk, Security Readiness Review, and Manual Checklists Assessment Policy ... 6
    2.2.1 Printer Test ................................................... 6
  2.3 Assessment Tools Overview .......................................... 6
    2.3.1 Retina Overview ................................................ 6
    2.3.2 AppDetectivePro Overview ....................................... 6
    2.3.3 WebInspect Overview ............................................ 7
    2.3.4 Defense Information Systems Agency Gold Disks Overview ......... 7
    2.3.5 Defense Information Systems Agency Security Readiness Review (DISA SRR) Overview ... 8
    2.3.6 Accreditation Effort Workstation Assessment Policy ............. 9
    2.3.7 Servers and other Devices ...................................... 10
  2.4 Security Test Resources ............................................ 11
3 TESTING SCOPE .......................................................... 12
  3.1 Availability ....................................................... 12
    3.1.1 Security Testing Personnel ..................................... 13
  3.2 Limitations ........................................................ 13
  3.3 Assumptions ........................................................ 13
  3.4 Security Test Conduct .............................................. 14
  3.5 Certification and Accreditation Boundary ........................... 14
    3.5.1 Ports, Protocols, and Services Testing ......................... 15
      3.5.1.1 Red Escalation Procedures .................................. 15
  3.6 Changes between the Baseline and Mitigation Visits ................. 15
  3.7 Deliverable Requirements ........................................... 16
    3.7.1.1 False Positive Review Process ................................ 16
    3.7.2 Engineering/Trend Analysis Database Update Process ............. 16
4 SECURITY TEST DELIVERABLES ............................................. 17
  4.1 Air Force Medical Service Information Assurance Vulnerability Matrix ... 17
    4.1.1 DIACAP Severity Category ....................................... 17
      4.1.1.1 CATEGORY I Weaknesses ...................................... 17
      4.1.1.2 CATEGORY II Weaknesses ..................................... 17
      4.1.1.3 CATEGORY III Weaknesses .................................... 18
    4.1.2 Vulnerability Impact Code Determination ........................ 18
5 EVALUATION REPORT ...................................................... 19
6 EVALUATION REPORT SUMMARY .............................................. 20
7 PREVIOUS AND CURRENT RESIDUAL RISK ..................................... 22
8 SECURITY ASSESSMENT .................................................... 23
  8.1 Assessment Summary ................................................. 23
  8.2 Ports, Protocols, and Services Results ............................. 24
  8.3 Privacy Act ........................................................ 25
  8.4 Protected Health Information ....................................... 25
  8.5 Conclusions and Recommendations .................................... 25
APPENDIX A REFERENCES .................................................... 26
APPENDIX B SSAWS ......................................................... 30
APPENDIX C EXECUTIVE VM REMEDIATION REPORT ............................... 34

List of Tables

Table 1-1: SSAW C&A Effort Type Responsibility Matrix .................... 2
Table 2-1: Accreditation Effort Workstation Assessment Summary .......... 10
Table 2-2: Device Assessment Summary .................................... 11
Table 2-4: Security Testing Resources ................................... 11
Table 3-1: SSAW Security Testing Schedule ............................... 12
Table 7-1: Previous/Current Residual Risk ............................... 22
Table 8-1: Ports, Protocols, and Services ............................... 25

List of Figures

Figure 8-1: SSAW IS Vulnerability Summary SAND Chart .................... 23
Figure 8-2: SSAW IS Assessment Summary .................................. 24
1 INTRODUCTION
EXAMPLE TEXT: This Information System (IS) Security Test Plan and Evaluation
Report (STP&ER) is developed under the authority of the Air Force Medical Service
(AFMS) Information Assurance (IA) Certifying Authority (CA) specifically for the Strategic
Situational Awareness (SSAW) system, as a means of implementing Department of Defense
(DoD) Instruction 8510.01, “DoD Information Assurance Certification and Accreditation
Process (DIACAP),” dated 28 November 2007, for this Certification and Accreditation
(C&A) Effort Type effort.
As part of the DoD Information Assurance Certification and Accreditation Process
(DIACAP), it is required that SSAW undergo security testing to determine its compliance
with the DoD Information Assurance (IA) Controls. In order to achieve an Authorization
to Operate (ATO) through the process of a Risk Assessment, or in order to maintain an
ATO through an Annual Review (select one as appropriate), SSAW must meet the
criteria documented within DoD Directive 8500.01E, “Information Assurance (IA),”
dated 24 October 2002.
This document defines the security testing approach, objectives, and procedures that will
be utilized during the baseline and mitigation/validation phases and documents test
results of the DIACAP for SSAW IS.
1.1 Scope
1.1.1 Security Test Plan
The security testing will be conducted against SSAW, located at Address of the IS, and
will exercise the security of the system afforded by SSAW in its current deployment
configuration. Hardware, operating systems, and database software configurations will
be examined during the risk assessment process, utilizing the methods depicted in
Sections 1-4.
1.1.2 Evaluation Report
The Evaluation Report found in Sections 5-8 will document the residual risk and allow
for measurement of the overall progress associated with the mitigation and the validation
of vulnerabilities on the SSAW’s IS. The AFMS IA Team will provide the Evaluation
Report and Vulnerability Matrix (VM) to the CA.
1.2 Responsibility Matrix
STANDARD LANGUAGE (DO NOT MODIFY): Table 1-1, SSAW Certification and
Accreditation (C&A) Effort Type Responsibility Matrix, lists key personnel of the
AFMS DIACAP effort and their responsibilities:
Name | Title | Responsibility

SSAW Point of Contact (POC) | IS/Program Office (PO) Name SSAW Program Manager | Individual with the responsibility for and authority to accomplish program or IS/application objectives for development, production, and sustainment to meet the user’s operational needs.

Information Assurance Manager (IAM) | SSAW Information Assurance Manager (IAM) | The individual responsible for the IA program of a/an SSAW IS or organization.

Information Assurance Officer (IAO) | Site/PO Name Abbreviation Information Assurance Officer (IAO) | Individual responsible to the IAM for ensuring the appropriate operational IA posture is maintained for a system, program, or enclave.

SSAW User Representative | Site/PO Abbreviation User Representative | The individual or organization that represents the user community for a particular IS/application and assists in the Certification and Accreditation (C&A) process.

CA | AFMS IA CA | Official having the authority and responsibility for the certification of AFMS ISs/applications.

Designated Accrediting Authority (DAA) | AFMS IA DAA | Official with the authority to formally assume responsibility for operating a system at an acceptable level of risk.

AFMS IA Team Member Names (list all) | AFMS IA Security Analyst; AFMS IA Engineer(s) | The AFMS IA Team is the lead for the security testing and is responsible for performing activities identified in this report, such as: Security Testing; Risk Analysis/Review; Requirements Analysis; Policy and Procedure Review.

Table 1-1: SSAW C&A Effort Type Responsibility Matrix
1.3 Site/PO Name Status
EXAMPLE TEXT: SSAW is currently scheduled to undergo a DIACAP C&A effort or an Annual
Review as part of DIACAP. SSAW and the AFMS PMO must sign a Letter of
Agreement (LOA) prior to beginning any testing. The signing of the LOA will be
coordinated by the AFMS IA Analyst, and the LOA will be placed in the Letters of Agreement
section of SSAW’s DIACAP C&A Package. SSAW management submitted the completed “SSAW
C&A Boundary Device Matrix,” depicted in Section 3.5, Certification and Accreditation
Boundary, to the AFMS IA Team. This section identifies the devices that are in the C&A
boundary.
2 SECURITY TEST APPROACH
EXAMPLE TEXT: The AFMS IA Team will utilize pre-determined methodologies
and tools to measure the compliance of IS Name (SSAW) with the DoD IA Controls.
The specific tests will focus on all devices within the accreditation boundary, to include,
but not limited to, servers, workstations, printers, and filtering devices, as mentioned in
the C&A Matrix found in Section 3.5, Certification and Accreditation Boundary. The
tests will be discussed throughout the following sections.
STANDARD LANGUAGE (DO NOT MODIFY): The process described in this
section will be adhered to while performing the assessment scans of SSAW’s IS. The
AFMS IA Team will test all devices within the C&A boundary and will run the following
tools against all applicable systems within the boundary:
•	Automated vulnerability assessment tools
•	Defense Information Systems Agency (DISA) Security Readiness Review (SRR) script testing for UNIX and Windows
•	DISA Database SRR
•	DISA Web SRR
•	DISA Field Security Operations (FSO) Gold Disk (hereafter referred to as “DISA Gold Disk”) using the Platinum Policy
•	DISA FSO Security Technical Implementation Guides (STIGs) and Checklists
•	Manual DISA Checklists for Domain Name System (DNS), Enclave, Network Infrastructure, Logical Partition (LPAR), and non-standard operating systems (OSs) (e.g., Prime, etc.) and Database Management System (DBMS) (e.g., Informix, etc.) checklists
•	Policies and procedures review: All security policies and procedures that are provided will be reviewed for completeness based on DoD Instruction 8510.01, “DoD Information Assurance Certification and Accreditation Process (DIACAP),” dated 28 November 2007; DoD Directive 8500.01E; and DoD Instruction 8500.2, “Information Assurance (IA) Implementation,” dated 6 February 2003.
All applicable tests will be executed during the risk assessment.
Based on information provided by SSAW, if the technology relative to a specific AFMS
IA testing tool is not deployed within SSAW’s C&A boundary, then that testing tool will
not be used during testing. However, once on site, if the AFMS IA Team discovers that
the technology for which the tool is required is within the SSAW C&A boundary, the use
of that specific AFMS IA testing tool will be necessary.
During the actual baseline and mitigation assessment scans, the SSAW IS will remain
frozen. The freeze is only in place during the actual testing periods. Changes can be
made between baseline testing and mitigation testing, as long as the changes are
coordinated and approved by the AFMS IA Program Office (PO) in advance. If no more
than 30 business days have passed between the baseline and mitigation assessment scans,
the same testing tools and policies will be used, with the exception of the current
Information Assurance Vulnerability Management (IAVM) notices to include Alerts
(IAVAs), Bulletins (IAVBs), and Technical Advisories (IAVTs) as well as any
vulnerability identified that would pose a significant risk to the DoD information.
2.1 Retina, AppDetectivePro, and WebInspect Assessment Policy
STANDARD LANGUAGE (DO NOT MODIFY): The AFMS IA Team while onsite
will need to verify active devices within the workstation Internet Protocol (IP) address
range provided in the C&A Boundary Device Matrix. The team will use Retina to ping
devices within the list of IP ranges. Devices that respond to the ping will be included in
the workstation scan. If the AFMS IA Team is unable to ping devices, the team will
request a copy of the Dynamic Host Configuration Protocol (DHCP) server log from the
site or other information in order to obtain a current view of the active workstations.
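The ping sweep described above can be pictured with a short sketch. This is a minimal illustration only, assuming Python 3 on the scanning laptop and using the operating system's ping command; it is not part of the Retina toolset, and the CIDR ranges shown are hypothetical placeholders for the ranges in the C&A Boundary Device Matrix.

```python
# Minimal sketch (not part of Retina): sweep the workstation IP ranges from the
# C&A Boundary Device Matrix and keep the hosts that answer a single ping.
import ipaddress
import platform
import subprocess

# Hypothetical placeholder ranges; use the ranges from the Boundary Device Matrix.
WORKSTATION_RANGES = ["192.0.2.0/28", "198.51.100.0/28"]

def responds_to_ping(ip: str) -> bool:
    """Return True if the host answers one ICMP echo request."""
    count_flag = "-n" if platform.system() == "Windows" else "-c"
    try:
        result = subprocess.run(
            ["ping", count_flag, "1", ip],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
            timeout=5,
        )
    except subprocess.TimeoutExpired:
        return False
    return result.returncode == 0

def active_hosts(ranges):
    """Yield every address in the given ranges that responds to ping."""
    for cidr in ranges:
        for ip in ipaddress.ip_network(cidr).hosts():
            if responds_to_ping(str(ip)):
                yield str(ip)

if __name__ == "__main__":
    for host in active_hosts(WORKSTATION_RANGES):
        print(host)  # candidate workstation to include in the scan
```

Hosts that do not respond would then be cross-checked against the DHCP server log or other information described above before being excluded from the workstation scan.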
Retina, AppDetectivePro, and WebInspect have the following requirements:
•	At least two static IP addresses
•	The ability to scan with two machines simultaneously
•	IP addresses for all devices being scanned
•	Unfiltered network access to devices being scanned that are within the C&A boundary (i.e., no Layer 2 or Layer 3 technical controls should block network traffic originating from the scanning laptops).
•	If performing a database assessment of Lotus Notes Domino, the site is expected to install and configure the client software on the scanning laptop. It may be uninstalled after the assessment is completed.
•	The Uniform Resource Locators (URLs) of the web servers and applications being assessed.
•	When using Retina, the AFMS IA engineers will need a domain account with administrator privileges for full interrogation of the targeted devices.
•	The Retina “All Audits” policy will be used to perform the vulnerability assessment against Site/PO Name Abbreviation to:
	o Perform the initial vulnerability assessment scans using Retina on all Windows workstations and servers, and UNIX servers.
	o Perform the initial vulnerability assessment scans using Retina on all routers, switches, and firewalls (and printers if applicable) within the accreditation boundary.
•	For AppDetectivePro, the AFMS IA engineers will need local administrator rights for the server hosting the database instance, and database owner privileges for each database.
•	When using WebInspect, if the Web Application being assessed requires authentication, then WebInspect must be configured to use the appropriate
user credentials. These credentials may include public key infrastructure
(PKI), domain, and application level authentication.
o When scanning with WebInspect, the AFMS IA Engineer will:
	▪ Phase One - A crawl will be initiated to identify Web Applications, Web Environment, and components. The AFMS IA Engineer will review the results from that crawl with the appropriate subject matter expert or application developer to identify which URLs are to be excluded from the scan.
	▪ Phase Two - Scan sessions will be segmented and customized to ensure minimal impact to the Web Application environment and to ensure successful completion of the scan.
	▪ Phase Three - Initiate the scan.
SSAW can always be confident that scan reports will reflect the results from the most
recent known checks (as of the date of the AFMS IA C&A Policies and Templates
release) against identified industry-wide vulnerabilities. Web-based updates are
performed on a frequent basis to ensure that the AFMS IA Team is performing
vulnerability assessments using the latest security policy checks.
If for any reason the scanning activities need to be suspended, the AFMS IA Team will
stop the scans until further notice. Identified below are the procedures for stopping the
automated scans. To stop:
AppDetectivePro:
•	Click the Stop button on the AppDetectivePro Scanner Status window.

Retina:
•	Click the Abort button on the Retina window.

DISA Gold Disk:
•	Click the Cancel button on the Gold Disk window.
•	Select “End Now” when prompted by the “Program Not Responding” window.

WebInspect:
•	To stop or interrupt an ongoing scan, click the PAUSE button on the Scan toolbar. Be aware that stopping the scan may take a few minutes until all processing threads have terminated. If desired, you can continue a paused scan by clicking the Start/Resume button.
2.2 Defense Information Systems Agency Gold Disk, Security Readiness Review, and Manual Checklists Assessment Policy
STANDARD LANGUAGE (DO NOT MODIFY): The AFMS IA Team uses a
comprehensive assessment approach that utilizes DISA and third party automated tools,
as well as Manual Checklists. The general approach is to assess devices based on their
platform type using a formula. The formula ensures that AFMS IA engineers review an
accurate representation of devices within the C&A boundary.
2.2.1 Printer Test
STANDARD LANGUAGE (DO NOT MODIFY): The AFMS IA Team will conduct
a test to see if the printer IP address ranges contained within the C&A boundary can be
reached from outside of the C&A boundary. This test will be conducted from a part of
the SSAW’s network that is outside of the C&A boundary. If the AFMS IA engineer can
ping the IP addresses within the ranges for printers from outside the boundary, the
printers will be included in the assessment scans during the baseline and mitigation
assessments. If the printers are not reachable from outside the C&A boundary, the
printers will be excluded from the assessment scans.
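As an illustration of the include/exclude decision above, the hedged sketch below reuses a simple ping check from a host outside the C&A boundary. It assumes Python 3, and the printer IP range shown is a hypothetical placeholder; it is not an AFMS testing tool.

```python
# Minimal sketch: decide whether printers are in scope for the assessment scans.
# Run from a network segment OUTSIDE the C&A boundary. Ranges are placeholders.
import ipaddress
import platform
import subprocess

PRINTER_RANGES = ["203.0.113.32/28"]  # hypothetical printer range from the matrix

def reachable(ip: str) -> bool:
    """One ping from outside the boundary; True if the printer answers."""
    count_flag = "-n" if platform.system() == "Windows" else "-c"
    return subprocess.run(
        ["ping", count_flag, "1", ip],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    ).returncode == 0

in_scope = [
    str(ip)
    for cidr in PRINTER_RANGES
    for ip in ipaddress.ip_network(cidr).hosts()
    if reachable(str(ip))
]

# Per the policy above: reachable printers are included in the baseline and
# mitigation assessment scans; unreachable printers are excluded.
print("Printers to include in assessment scans:", in_scope or "none")
```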
2.3 Assessment Tools Overview
2.3.1 Retina Overview
STANDARD LANGUAGE (DO NOT MODIFY): Retina discovers networked
devices using wired and wireless connections to identify which OSs, applications,
databases, and wireless access points are present. Any unauthorized applications, such as
peer-to-peer (P2P), malware, or spyware, will be detected and identified. Retina is
capable of scanning all ports on every networked device to provide the basis for
remediation. It will scan security threats on every machine on a network, identifying all
types of OSs and networked devices.
2.3.2 AppDetectivePro Overview
STANDARD LANGUAGE (DO NOT MODIFY): A network-based, vulnerability
assessment scanner, AppDetectivePro discovers database applications within the
infrastructure and assesses their security strength. In contrast to other solutions,
AppDetectivePro modules allow enterprise assessment of all three primary application
tiers through a single interface: Web front-end, application/middleware, and back-end
database.
AppDetectivePro will be used to conduct vulnerability assessments on the following
elements within the accreditation boundary for security compliance: IBM DB2 Universal
Database (UDB), IBM DB2 on OS/390 mainframes, Sybase Adaptive Server Enterprise
(ASE), MySQL, and Lotus Notes/Domino databases.
The databases are accessed across the network and reviewed for a wide range of
database-specific vulnerabilities. These scans use a series of security checks that assess
multiple security risks associated with the following audit categories:
•	Access Control
•	Application Integrity
•	Identification/Password Control
•	OS Integrity

2.3.3 WebInspect Overview
STANDARD LANGUAGE (DO NOT MODIFY): The WebInspect application
security assessment tool will be used for vulnerability assessments, web application
security, and the security of critical information by identifying known and unknown
vulnerabilities within the Web application layer. WebInspect performs security
assessments on web applications and web services. Web server security will be assessed
by including checks that validate the web server is configured properly.
Prior to conducting any WebInspect assessment, the AFMS IA Team will collaborate
with the web application owner to identify all sections of the web application that will
need to be excluded from the assessment in order to prevent any unwanted changes.
When performing a WebInspect assessment, the AFMS IA engineer should use an
account with privileges that reflect that of a normal user of the web application.
NOTE: Any functionality provided by the web application to an end user will be
utilized by WebInspect in the same manner.
2.3.4 Defense Information Systems Agency Gold Disks Overview
STANDARD LANGUAGE (DO NOT MODIFY): The DISA Gold Disk is an
automated script created to verify National Security Agency (NSA) and DISA security
policies. DISA Gold Disk (Platinum Policy) will be used on Windows 2000/2003 servers
(domain controllers/member servers) and Windows 2000/XP Professional workstations.
In order to run the DISA Gold Disk on Windows OSs, it must run locally on the target
system via a CD-ROM with specific administrative privileges using the Platinum Policy.
Once executed, the DISA Gold Disk collects security-related information from the OS
scanned. It then reports how many vulnerabilities were checked, how many errors occurred, and
how many manual checks remain. This information is then presented in a graphical user interface
(GUI).
The reviewer must examine the results via this GUI by expanding all possible menus next
to each security check. Each check will have an icon next to it, indicating whether it was
a finding, not a finding, or must be checked manually. Once a finding has been fixed, the
status of the finding can be altered through the GUI. Finally, the information from
running the DISA Gold Disk will be exported to a file on a portable medium. This file,
along with all files generated from DISA Gold Disk run on similar systems, will then be
imported into an application that can produce various reports regarding the current
security posture of these systems.
The AFMS IA PO hereby informs SSAW that the following feature of the DISA Gold
Disk IS NOT TO BE USED:
After running the DISA Gold Disk on a device, there will be an option to
“Remediate” vulnerabilities. It is believed that if this option is used, it may cause
unknown changes to the device or system. The AFMS IA PO highly recommends
that SSAW NOT USE this feature on their IS. The AFMS IA PO hereby
removes itself from responsibility for, and any liability and consequences
resulting from, the use of this feature by SSAW on their IS. Any use of this
functionality on the SSAW IS against the AFMS IA Program Office’s
recommendation, stated herein, will be at the sole risk and responsibility of SSAW.
STANDARD LANGUAGE (DO NOT MODIFY): For detailed instructions and
procedures for executing and running the DISA Gold Disk, the AFMS IA Team will
provide the SSAW system administrator with a copy of the DISA Gold Disk Users
Guide. Although documented in the manual, the following specific requirements and materials are
needed to ensure successful use of the DISA Gold Disk on the devices to be tested:
•	Microsoft Internet Explorer 6.0 or higher
•	DISA Gold Disk Users Guide and Gold Disk CDs
•	The user account from which DISA Gold Disk is run must have administration privileges and have the User Right: Manage Auditing and Security Log
The following list contains the recommended steps in running DISA Gold Disk in the
production/test environment:
•	Perform DISA Gold Disk using the Platinum policy standard.
	o Perform an initial DISA Gold Disk test on all Windows servers.
	o Perform an initial DISA Gold Disk test on a sample set of Windows workstations.
	o The interview portion of the DISA Gold Disk process is performed independently of the technical review discussed in the checklist manual.
2.3.5 Defense Information Systems Agency Security Readiness Review (DISA SRR) Overview
STANDARD LANGUAGE (DO NOT MODIFY): For detailed instructions and
procedures for executing and running the DISA SRRs, the AFMS IA Team will provide
the SSAW administrator with copies of the applicable SRR scripts.
The AFMS IA engineer will verify that the SSAW system administrator executes the
applicable SRR and that the pertinent results and associated text files are secured.
The following list contains the recommended steps in running DISA SRR in the
production/test environment:
•	Perform a DISA manual SRR checklist on all applicable platforms and OSs that do not currently have a DISA batch script compatible with the specific OS.
•	Perform a DISA manual SRR Database checklist on all applicable databases.
•	To perform the manual SRR, refer to Section 5 of the “Manual System Check Procedures,” which will allow the reviewer to analyze the system for security vulnerabilities. Each procedure maps to a potential discrepancy list tabulated in Section 2 of the “Manual System Check Procedures.”
•	Follow the procedures specified in Section 4 of the DISA Windows 2000 checklists manual for instructions on how to perform an SRR using the automated scripts and to interpret the script output for vulnerabilities. Each procedure maps to a potential discrepancy item (PDI) tabulated in the checklist manual.
•	Conduct an administrative interview with the system administrator or the IAO. Procedures for this interview and the associated questions are outlined in Section 3 of the Manual Security Checklists.
•	The interview portion of the DISA SRR process is performed independently of the technical review discussed in the checklist manual.
NOTE: While running these scripts does not modify the OS resources, the scripts can
impair the system performance of older machines. If this is an issue, it is recommended
that unnecessary applications be closed, or, that the review be performed when machine
usage is at a minimum.
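As a rough illustration of securing the SRR output described above, the sketch below runs a reviewer-supplied script and gathers the text files it produces into a collection folder. It assumes Python 3; the script path and output locations are hypothetical placeholders, since the actual SRR scripts and their result files vary by platform.

```python
# Minimal sketch: run a (hypothetical) SRR script and collect its text output
# files so the results can be secured, as described above. Paths are placeholders.
import shutil
import subprocess
from pathlib import Path

SRR_SCRIPT = Path("/opt/srr/unix_srr.sh")   # hypothetical script location
OUTPUT_DIR = Path("/opt/srr/results")       # hypothetical script output directory
COLLECTED = Path("./srr_collected")

def run_and_collect():
    """Execute the SRR script, then copy any *.txt results for safekeeping."""
    subprocess.run(["sh", str(SRR_SCRIPT)], check=True)
    COLLECTED.mkdir(exist_ok=True)
    copied = []
    for result in OUTPUT_DIR.glob("*.txt"):
        copied.append(shutil.copy2(result, COLLECTED))
    return copied

if __name__ == "__main__":
    for path in run_and_collect():
        print("secured:", path)
```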
2.3.6 Accreditation Effort Workstation Assessment Policy
STANDARD LANGUAGE (DO NOT MODIFY): The AFMS IA Program Office’s
workstation assessment policy requires an automated assessment on 20 percent of each
workstation platform type based on Organization Unit (OU) or equivalent directory
structure, and 20 percent of each mobile device platform type within the C&A boundary.
The platform type is analogous to the operating system of the device such as Windows
2000, Windows XP, Red Hat Professional, Solaris, etc. Along with the 20 percent
automated workstation assessment, the AFMS IA Team will perform a manual
assessment of those workstations that completed the automated assessment. The number
of manual assessments is based on the system size/complexity and the number of
automated DISA Gold Disk completed. Table 2-1, Accreditation Effort Workstation
Assessment Summary, and Table 2-2, Device Assessment Summary, explain the
percentage of manual workstation assessments required based on the total number of
workstations and laptops in the C&A boundary. In addition to the 20 percent of the
workstations and laptops selected as described above, the AFMS IA Team’s assessment
includes 100 percent of system administrator workstations and laptops.
Accreditation Effort Workstation Gold Disk Assessment Policy

System Size/Complexity | Workstation Quantity | Automated Gold Disk | Manual Gold Disk (based on the # of Automated Gold Disk performed)
Small | 1 to 500 | 20% | 100%
Medium | 501 to 1,000 | 20% | 25%
Large | 1,001 to 3,000 | 20% | 20%
Extra Large | Tier 1 – 3,001 to 5,999 | 20% | 12%
Extra Large | Tier 2 – 6,000 to 7,999 | 20% | 9%
Extra Large | Tier 3 – 8,000 to 9,999 | 20% | 8%
Extra Large | Tier 4 – 10,000 or greater | 20% | 5%

Table 2-1: Accreditation Effort Workstation Assessment Summary
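The sampling rules in Table 2-1 can be expressed as a small calculation. The sketch below is illustrative only, assuming Python 3; the thresholds and percentages come from the table above, while the rounding behavior (rounding up) is an assumption of the sketch, since the policy text does not state it.

```python
# Minimal sketch of the Table 2-1 sampling rules: 20% automated Gold Disk per
# workstation platform type, plus a manual Gold Disk percentage that depends on
# the total workstation quantity. Rounding up is an assumption of this sketch.
import math

def manual_rate(total_workstations: int) -> float:
    """Manual Gold Disk rate applied to the number of automated assessments."""
    if total_workstations <= 500:        # Small
        return 1.00
    if total_workstations <= 1_000:      # Medium
        return 0.25
    if total_workstations <= 3_000:      # Large
        return 0.20
    if total_workstations <= 5_999:      # Extra Large, Tier 1
        return 0.12
    if total_workstations <= 7_999:      # Extra Large, Tier 2
        return 0.09
    if total_workstations <= 9_999:      # Extra Large, Tier 3
        return 0.08
    return 0.05                          # Extra Large, Tier 4

def workstation_assessments(total_workstations: int):
    """Return (automated, manual) assessment counts for one platform type."""
    automated = math.ceil(0.20 * total_workstations)
    manual = math.ceil(manual_rate(total_workstations) * automated)
    return automated, manual

# Example: 1,200 workstations of one platform type -> 240 automated, 48 manual.
print(workstation_assessments(1_200))
```

In addition to the sampled workstations and laptops, 100 percent of system administrator workstations and laptops are assessed, as stated above.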
2.3.7 Servers and other Devices
STANDARD LANGUAGE (DO NOT MODIFY): The AFMS IA Program Office’s
assessment policy for servers states that the AFMS IA Team will select and conduct
assessments on 100 percent of servers for each device platform type, if there are 100 servers or
fewer. However, if there are more than 100 servers, the AFMS IA Team will select and
conduct the assessments on the first 100 servers for each device platform type plus an
additional 25 percent of the difference above one hundred. The table below summarizes
the number of DISA Gold Disk, SRR scripts, and manual assessments needed for each
device platform (excluding workstations).
The table below will be followed for each applicable checklist and/or SRR for each
server platform type. For example, if there exist 200 database server instances consisting
of 100 Microsoft SQL and 100 Oracle servers, the AFMS IA Team will perform
a total of 200 database device assessments, not 125, because the table is applied for each
device platform type (e.g., Microsoft SQL and Oracle), not for each device type (databases).
Any operating system or database variance will be treated separately. Refer to Table 2-2,
Device Assessment Summary, for an example.
Additionally, when selecting devices, priority should be given to those devices that
perform mission-critical services, contain DoD information data, or are publicly
accessible. Critical services include, but are not limited to, authentication (domain
controllers), DNS, database services, and front-end Hypertext Transfer Protocol (HTTP)
services.
Device Assessment Summary

Number of Devices | Platform Type | Number of Device Assessments | Total % Reviewed
100 | 100 SQL | 100 | 100%
200 | 100 SQL, 100 Oracle | 200 | 100%
200 | 200 SQL | 125 | 62.5%
200 | 100 SQL, 50 Oracle, 50 DB2 | 200 | 100%

Table 2-2: Device Assessment Summary
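The server selection rule described above (100 percent of servers up to 100, plus 25 percent of any excess, applied per platform type) can be sketched as follows. This is an illustrative Python 3 sketch, not an AFMS tool; rounding up on the 25 percent portion is an assumption.

```python
# Minimal sketch of the server assessment rule from Section 2.3.7: assess all
# servers of a platform type up to 100, plus 25% of any servers beyond 100.
import math

def server_assessments(count: int) -> int:
    """Number of device assessments for a single server platform type."""
    if count <= 100:
        return count
    return 100 + math.ceil(0.25 * (count - 100))

def total_assessments(platform_counts: dict) -> int:
    """Apply the rule per platform type, as Table 2-2 illustrates."""
    return sum(server_assessments(n) for n in platform_counts.values())

# Examples matching Table 2-2 (platform names are placeholders):
print(total_assessments({"SQL": 200}))                           # 125
print(total_assessments({"SQL": 100, "Oracle": 100}))            # 200
print(total_assessments({"SQL": 100, "Oracle": 50, "DB2": 50}))  # 200
```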
2.4 Security Test Resources
EXAMPLE TEXT: For successful completion of the risk assessment against SSAW,
both SSAW and the AFMS IA Team will require the testing resources listed in Table 2-4,
Security Testing Resources:
Security Testing Resources

Required Element | Provided by | Due Date/Status
Hardware and Software Configurations (see the IS Core, Section 2) | SSAW |
Documentation Regarding Security Testing (see the Security Test Plan and Evaluation Report) | AFMS IA Team |
Documentation Regarding Systems (see the IS Core and associated artifacts) | SSAW |
Personnel | SSAW and AFMS IA Team |
Testing Tool Policies (Retina, AppDetectivePro, and WebInspect) (see the Security Test Plan) | AFMS IA Team |
Manual Checklists and SRR Scripts (UNIX, Database, Web, and z/OS, etc.) (see the Security Test Plan) | AFMS IA Team |
CA-Examine Mainframe Testing Tool (if applicable) (see the Security Test Plan and Evaluation Report) | AFMS IA Team |

Table 2-4: Security Testing Resources
3 TESTING SCOPE
3.1 Availability
EXAMPLE TEXT: The AFMS IA Team Analyst has coordinated with the IS Name
(SSAW) point of contact (POC) to schedule the baseline and mitigation scans. Refer to
Table 3-1, SSAW Security Testing Schedule, for a high-level listing of activities that will
be performed during baseline and mitigation. A detailed agenda will be provided prior to
the onsite visit. If extra time is needed, both parties must agree to extend the visit or
arrangements will need to be made to work longer days or during the weekend until the
assessment is complete.
The AFMS IA Team Analyst is responsible for developing and maintaining the security
test schedule and will ensure that the necessary measures are taken to meet all milestones
and completion dates, with minimal disruption to SSAW production operations. The
AFMS IA Team will perform testing after hours or at otherwise mutually agreed upon
times. Efforts will be taken to avoid any negative impact to the performance of the
network.
Security Testing Schedule

Baseline Visit
Location | Activity | Dates/Start Time | Assigned

Mitigation Visit
Location | Activity | Dates/Start Time | Assigned

Table 3-1: SSAW Security Testing Schedule
3.1.1 Security Testing Personnel
EXAMPLE TEXT: At a minimum, the personnel involved in the security testing can be
classified in the following functional categories:
•	AFMS IA Team: Conducts and monitors security testing and records results.
•	SSAW POC: Monitors security testing execution and verifies results.
•	SSAW Engineer/System Administrator: Installs, configures, and troubleshoots equipment, and provides specific equipment expertise.
•	SSAW Security Tester(s): Performs any required hands-on security testing.
•	SSAW Documentation Specialist: Assists the AFMS IA Security Analyst with questions pertaining to the DIACAP documentation.
Not all security testing personnel need to be present for the onsite testing at all times.
However, depending on the testing parameters, the senior systems programmer, database
administrator and/or network administrator may need to be available to provide support
in their area of responsibility. It is anticipated that all required SSAW resources will be
available as needed. The SSAW support engineers should be onsite at the start of the
tests to troubleshoot any initial configuration problems. Afterwards, the support
engineers should be available for consultation in person or via telephone.
3.2 Limitations
General limitations under which the security test will be conducted are as follows:
•	Security testing will be limited to the specific elements described throughout this document.
•	The AFMS IA Team will not disrupt mission operations on the SSAW’s production network.
•	The AFMS IA Team will perform testing after hours or at otherwise mutually agreed upon times.
•	A minimum of five days is typically required for performing security testing. This period does not account for any hardware, software, data, or test procedure problems encountered during the security testing. Requirements to extend the security testing duration must be approved by each participant’s governing official.

3.3 Assumptions
EXAMPLE TEXT: The following assumptions are made during the security testing:
•	Test user accounts will be created in advance and used during security testing.
•	Domain, root level administrator, or other accounts with appropriate permissions must be provided in support of the automated scans.
•	The hardware and software configuration will remain unchanged between the baseline and mitigation testing, unless configuration changes are coordinated and approved in advance by the AFMS IA Program Office.
•	Prior to the mitigation visit, the SSAW will address all vulnerabilities identified during the baseline scan.
•	The AFMS IA Team will have access to documented security procedures or operating instructions.
•	The CA and SSAW PO will determine the proposed solutions, schedule, security actions, and milestones.
•	The DAA will determine the maximum length of time for the validity of an accreditation.
•	SSAW IA Department personnel will comply with all applicable, established DoD security policies, standards, and guidelines throughout the IS life cycle.
•	SSAW will operate in a secured environment in accordance with site operational and environmental procedures to ensure that risk to confidentiality, integrity, and availability of the information and network remains acceptable.
•	The AFMS IA Team receives validation that an agreement exists, which documents the IA Controls specified as inherited in the DIACAP Implementation Plan.

3.4 Security Test Conduct
EXAMPLE TEXT: The following activities will be conducted during the security test:
•	Perform an entrance briefing. The AFMS IA Team will provide a short entrance briefing to any participating site personnel of the SSAW IA Department.
•	Conduct the assessment, collect the data, and upload the data into DAS.
•	Annotate any test discrepancy. Explain any conditions that seemingly caused an error or test discrepancy to occur. Any test discrepancies should be annotated in the security test procedures, with an indication of whether or not they are certification-relevant. Any anomaly found to be certification-relevant must be included in the test report.
•	Review/finalize false positives, duplications, and mitigations.
•	Perform an exit briefing. The AFMS IA Team will provide an exit briefing to the participating site personnel. The briefing will include:
	o Raw results of the security testing, to include physical security findings
	o Answers to any questions posed by the site personnel
	o Conclusions and recommendations based on the test results
	o An update of the discrepancies noted during the documentation review
3.5 Certification and Accreditation Boundary
STANDARD LANGUAGE (DO NOT MODIFY): The systems to be tested and the
boundary are depicted in Artifact 1h, C&A Boundary Diagram.
The C&A Boundary Device Matrix for hardware and software configurations is provided
in Artifact 1f, C&A Boundary Device Matrix. It has been completed with site/target
specific information for all devices (including printers) included in the C&A boundary.
3.5.1 Ports, Protocols, and Services Testing
STANDARD LANGUAGE (DO NOT MODIFY): C&A and DoD System Ports,
Protocols, and Services Testing are provided in Artifact 5, Ports, Protocols, and Services
and Registry Spreadsheet.
3.5.1.1 Red Escalation Procedures
STANDARD LANGUAGE (DO NOT MODIFY): If the identified port is categorized
as a Code Red, the lead AFMS IA engineer will inform the AFMS IA Analyst that there
is a Code Red PPS configuration in the C&A boundary for the site. The AFMS IA Team
Lead will notify the SSAW point of contact (POC) that a Code Red PPS configuration
exists within the C&A boundary. SSAW must first determine if the Red port has been
registered in the Ports and Protocols database. If registered, SSAW must provide a Plan
of Action and Milestones (POA&M) within 180 days that will eliminate red PPS boundary crossings within the two-year phase-out period.
If the Red port has not been registered, SSAW must register the port and will be allowed
to continue in operation while the PPS undergoes a vulnerability assessment, is assigned
an assurance category, and is approved by the Defense Information Systems Network
(DISN) DAAs. After a reasonable grace period determined by the DISN DAAs, to allow
for compliance, unregistered PPS visible to DoD-managed network components -- that
are not in compliance with the most current PPS Assurance Category Assignments List
and not approved by the DISN DAAs -- shall be blocked at appropriate DoD enclave
boundaries.
After the site has performed the mitigation of the Code Red vulnerability, the AFMS IA
Team will conduct another port scan on the network infrastructure to verify that the Code
Red has been mitigated.
3.5.1.1.1 Yellow Escalation Procedures
STANDARD LANGUAGE (DO NOT MODIFY): PPS designated as Yellow have a
medium level of assurance. These PPS expose DoD networks to an acceptable level of
risk for routine use only when implemented with the required mitigation and approved by
the DAAs for the SSAW IS.
3.6 Changes between the Baseline and Mitigation Visits
Indicate in this section any changes to the C&A boundary identified between baseline
and mitigation visits. Ensure that the changes to the C&A boundary are referenced in
this section and identified in the embedded C&A Boundary Device Matrix, Current
Device Quantities, and the C&A boundary diagrams, if applicable.
EXAMPLE TEXT: The current SSAW device quantities found in the C&A Boundary
Device Matrix shown in Section 3.5, Certification and Accreditation Boundary, of this
document have been updated to reflect the addition/subtraction/replacement of the
following device(s):
•	Workstations
•	Servers
•	Firewalls
This change has been made due to the replacement of the Linux DNS server with the
Windows DNS server.
The Site/PO is to provide a statement as to what changed in the boundary and why the
change was made.
3.7 Deliverable Requirements
3.7.1.1 False Positive Review Process
The DIACAP Automation Tool (DAS) will mark findings as previously identified false
positives if the findings were validated as false positives during the previous effort. The
Site/PO is responsible for reviewing these false positives after they are identified to
ensure they are still false positives. The AFMS IA Team will also validate these false
positives while they are on site.
3.7.2 Engineering/Trend Analysis Database Update Process
After the baseline scan data has been uploaded into DAS, it is the Site/PO’s responsibility
to address each finding in DAS via the VM Workflow. The Site/PO should review each
finding and notate whether it is a ‘False Positive’, ‘Site Will Fix’, or ‘Accept Risk’.
This must be accomplished in accordance with the timeframe allowed in the existing
approved timeline. The AFMS IA Team will validate the Site/PO’s response for each
finding. This is to be performed prior to and during the mitigation visit.
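As a simple illustration of this review step, the sketch below checks that every finding exported from DAS carries one of the allowed dispositions before the mitigation visit. It assumes Python 3 and a hypothetical CSV export with 'finding_id' and 'disposition' columns; it is not part of DAS.

```python
# Minimal sketch: verify each baseline finding has an allowed disposition
# ('False Positive', 'Site Will Fix', or 'Accept Risk') before mitigation.
# The CSV layout is a hypothetical export, not an actual DAS format.
import csv

ALLOWED = {"False Positive", "Site Will Fix", "Accept Risk"}

def unresolved_findings(csv_path: str):
    """Return finding IDs whose disposition is missing or not in ALLOWED."""
    flagged = []
    with open(csv_path, newline="", encoding="utf-8") as handle:
        for row in csv.DictReader(handle):
            if row.get("disposition", "").strip() not in ALLOWED:
                flagged.append(row.get("finding_id", "<unknown>"))
    return flagged

if __name__ == "__main__":
    missing = unresolved_findings("baseline_findings.csv")
    print(f"{len(missing)} finding(s) still need a disposition:", missing)
```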
4 SECURITY TEST DELIVERABLES
4.1 Air Force Medical Service Information Assurance Vulnerability Matrix
STANDARD LANGUAGE (DO NOT MODIFY): The AFMS IA Team will generate
and provide the VM to the CA. This VM will allow for measurement of the overall
progress associated with the baseline and mitigation/validation of vulnerabilities on the
IS Name (SSAW) IS.
The VM is an AFMS IA PO document used to inform the CA of identified vulnerabilities
within a site’s IS, the impact code of the vulnerability, and the recommendations for
vulnerability resolution.
This customized, unique vulnerability executive summary report will be generated. The
report will be sent to AFMS senior management within five days of both the baseline
assessment and mitigation scans, for action and final approval prior to sending to SSAW.
4.1.1 DIACAP Severity Category
STANDARD LANGUAGE (DO NOT MODIFY): Upon completion of all security
testing, the AFMS IA Team will identify vulnerabilities including Category I (CAT I),
Category II (CAT II), and Category III (CAT III) weaknesses as defined in Department of
Defense (DoD) Instruction 8510.01.
4.1.1.1 CATEGORY I Weaknesses
STANDARD LANGUAGE (DO NOT MODIFY):
•	Shall be corrected or satisfactorily mitigated before an ATO is granted.
•	If a CAT I weakness is discovered within an IS and cannot be mitigated within 30 business days, the IS must revert to an Interim Authorization to Operate (IATO) or Denial of Authorization to Operate (DATO) in accordance with the DIACAP requirement. Additionally, if an IATO is granted, the AFMS CIO must report and provide a signed copy of the authorization memorandum with supporting rationale to the DoD Senior Information Assurance Officer (SIAO).
4.1.1.2 CATEGORY II Weaknesses
STANDARD LANGUAGE (DO NOT MODIFY):
•	Shall be corrected or satisfactorily mitigated before an ATO is granted.
•	If a CAT II weakness is discovered on an IS operating with a current ATO and cannot be corrected or satisfactorily mitigated by the completion of the mitigation testing, the IS must revert to an IATO or DATO in accordance with the DIACAP requirement. Additionally, if an IATO is granted, the AFMS DAA must report and provide a signed copy of the authorization memorandum with supporting rationale to the AFMS CIO.
4.1.1.3 CATEGORY III Weaknesses
STANDARD LANGUAGE (DO NOT MODIFY):
•	Shall be corrected or satisfactorily mitigated by the completion of the mitigation testing to support the continuation of an accreditation/current effort. (Select one as appropriate)
4.1.2 Vulnerability Impact Code Determination
STANDARD LANGUAGE (DO NOT MODIFY): When all security testing activities
have been completed, the AFMS IA Team will identify vulnerabilities, which will be
categorized by IA controls and sorted by impact codes. An impact code indicates the DoD
assessment of the likelihood that a failed IA Control will result in system-wide IA
consequences. It is also an indicator of the impact associated with non-compliance or
exploitation of the IA Control. The impact code may also indicate the
urgency with which corrective action should be taken. Impact codes are expressed as
High, Medium, and Low.
•	High Impact Code: Must be fixed/mitigated within the specified time period mandated by the CA
•	Medium Impact Code: Must be fixed/mitigated within the specified time period mandated by the CA
•	Low Impact Code: Must be fixed/mitigated within the specified time period mandated by the CA
If technical or programmatic constraints prohibit vulnerability resolution, the DAA
responsible for the SSAW IS may elect to accept the risk posed by a vulnerability. This
risk acceptance must be documented using the POA&M and provided to the CA for
approval.
Impact code designation of vulnerabilities requires careful analysis and depends on the
following factors:
•	Nature of the security vulnerability
•	Relationship of the vulnerability to the overall business function
•	Role of the vulnerability within the system’s baseline infrastructure
•	Effect of the vulnerability on the system security posture
•	Operational environment
•	Risk factor
5 EVALUATION REPORT
The Evaluation Report describes the residual results of the security testing by the AFMS
IA Team. It also contains technical evidence that the site has implemented the
appropriate safeguards that allow the system to process DoD information with an
acceptable level of risk as required by DoD Directive 8500.01E. The Evaluation Report
will also support the CA’s recommendation to the DAA to grant an accreditation
decision.
The AFMS IA Team will prepare the Evaluation Report and will include in it, at a
minimum:
•	An overview of the C&A boundary
•	A residual report of the test results
•	Remaining vulnerabilities, rated in accordance with Section 4.1.2, Vulnerability Impact Code Determination
•	AFMS IA Team recommendations for vulnerability resolution
•	AFMS IA Team overall risk assessment determinations
•	Next-step activities within the accreditation process
•	POA&M
6 EVALUATION REPORT SUMMARY
NOTE: This section and the following sections of this document will be populated
upon completion of the mitigation security testing.
Be sure to include details about the level of effort involved in the assessment, status of the
site’s personnel background checks, etc.
Read this document in its entirety. This report reflects a recommended accreditation by
the AFMS IA Team. However, if this is not the case, please revise accordingly.
EXAMPLE TEXT: C&A Effort Type of IS Name (SSAW), version v*.*.*.*, is being
requested to accommodate an update of operating system and database. The SSAW
currently has an accreditation that was granted on DD Month YYYY as version v*.*.*.*.
SSAW is designated as Mission Assurance Category (MAC) I, II, III, Classified,
Sensitive, Public IS. This report describes the safeguards provided by SSAW to comply
with Federal, DoD, and AFMS C&A security requirements throughout its life cycle.
The IS was assessed between Month YYYY and Month YYYY (Insert month/year of the
kickoff visit and DAA signature date from the approved timeline) and testing was
completed on DD Month YYYY. (Insert final scan date from the mitigation visit.)
The AFMS IA Team evaluated SSAW against the DoD IA Controls for submission to the
DAA. The final DIACAP Scorecard reflects the IA Controls reviewed: # controls Not
Applicable (N/A); # controls Non-Compliant (N/C); # controls Compliant (C), and #
controls Inherited.
A single finding may impact several IA Controls and multiple findings may impact one
IA Control. The variance is different for each system evaluated and is dependent on such
factors as configuration, functionality, number and types of components, ports, protocols
or services used by the system.
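The scorecard tallies mentioned above can be illustrated with a short sketch that counts IA Controls by their assessed status. The control identifiers and statuses below are hypothetical placeholders (Python 3), not SSAW results.

```python
# Minimal sketch: tally IA Controls by status for a DIACAP Scorecard-style
# summary (Compliant, Non-Compliant, Not Applicable, Inherited).
# Control names and statuses are hypothetical placeholders.
from collections import Counter

control_status = {
    "ECSC-1": "Compliant",
    "IAIA-1": "Non-Compliant",
    "COBR-1": "Not Applicable",
    "PECF-1": "Inherited",
}

tally = Counter(control_status.values())
for status in ("Compliant", "Non-Compliant", "Not Applicable", "Inherited"):
    print(f"{tally.get(status, 0)} controls {status}")
```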
The AFMS IA Team and the CA recommend a Type or Site C&A Effort Type be granted
to SSAW based on the following terms and conditions:
•	In conjunction with the approval signature for this C&A Effort Type, the SSAW Program Manager included a POA&M to mitigate the vulnerabilities identified in the DIACAP Scorecard, as directed by the DAA.
•	The AFMS IA Team Analyst will immediately notify the CA and DAA of any slippage or failure to meet stated POA&M milestones.
The assessment was conducted using a combination of the following tools: automated
vulnerability assessment tools, manual testing, interviews, and physical security
assessments.
For Site/PO:
EXAMPLE TEXT: The scope of this assessment for the SSAW IS, which is
considered a Size/Complexity IS as defined by the AFMS IA Program Office, includes: #
servers, # workstations, # laptops, # printers, # routers, # switches, # firewalls, and #
locations. The assessment did not include SSAW’s corporate IS, as there are technical,
physical, and administrative controls in place to separate its processing from the network
that handles DoD information.
EXAMPLE TEXT: During the evaluation, IS/PO SSAW showed evidence of
appropriate security safeguards and controls being in place to prevent: unauthorized
entry into the data processing facilities; unauthorized access to data and data files;
unauthorized software modifications; and divulgence of sensitive processing procedures,
techniques or related information.
IS/PO SSAW was also found to be in compliance with the current IAVMs and personnel
security requirements.
If Applicable: An analysis was performed on the PPS in the IS/PO Name SSAW
environment against the requirements outlined in DoD Instruction 8551.1, “Ports,
Protocols, and Services Management (PPSM),” 13 August 2004. Refer to Table 8-1,
Ports, Protocols, and Services.
The AFMS IA Team concludes that the IS/PO SSAW has satisfied the security
requirements to support a/an C&A Effort Type, and that the overall security risk for
IS/PO SSAW IS is CAT I/CAT II/CAT III.
7 PREVIOUS AND CURRENT RESIDUAL RISK
EXAMPLE TEXT: During the Month Year C&A Effort Type, # of vulnerabilities
vulnerabilities that were mitigated to an acceptable level of risk remained for IS/PO
Name SSAW. Most of the vulnerabilities were related to describe. The affected describe
devices are still within the IS/PO Name SSAW C&A boundary and were assessed and
documented as part of this C&A Effort Type effort. Of the total # of remaining
vulnerabilities, # of previously cited vulnerabilities from the previous C&A effort are
carried over and included in this C&A Effort Type Review Report. Of the total # of
remaining vulnerabilities, # have an original Severity Category of CAT III. Refer to
Table 7-1, Previous/Current Residual Risk.
Assessment | CAT I | CAT II | CAT III | Total Residual Findings
Previous C&A Effort Type, Month Year | # | # | # | #
Current C&A Effort Type, Month Year | # | # | # | #

Table 7-1: Previous/Current Residual Risk
Please see the Executive VM Remediation Report in Appendix C for a detailed
description of each finding. Please note, the VM remediation report may also include
CAT IV vulnerabilities which are considered informational and are not considered when
assessing the residual risk for the system.
8 SECURITY ASSESSMENT
8.1 Assessment Summary
EXAMPLE TEXT: A total of NUMBER OF RISK ITEMS THAT ARE NOT MET OR
PARTIALLY MET risk items are documented in this report for IS/PO Name (SSAW) IS.
The following is a summary of the vulnerabilities identified during the assessment:
o Continuity – # of CAT I, CAT II, and/or CAT III vulnerabilities remain at the
completion of this C&A Effort Type.
o Security Design and Configuration – # of CAT I, CAT II, and/or CAT III
vulnerabilities remain at the completion of this C&A Effort Type.
o Enclave Boundary Defense – # of CAT I, CAT II, and/or CAT III vulnerabilities
remain at the completion of this C&A Effort Type.
o Enclave and Computing Environment – # of CAT I, CAT II, and/or CAT III
vulnerabilities remain at the completion of this C&A Effort Type.
o Identification and Authentication (I&A) – # of CAT I, CAT II, and/or CAT III
vulnerabilities remain at the completion of this C&A Effort Type.
o Physical and Environmental – # of CAT I, CAT II, and/or CAT III
vulnerabilities remain at the completion of this C&A Effort Type.
o Personnel – # of CAT I, CAT II, and/or CAT III vulnerabilities remain at the
completion of this C&A Effort Type.
o Vulnerability and Incident Management – # of CAT I, CAT II, and/or CAT III
vulnerabilities remain at the completion of this C&A Effort Type.
[SITE NAME] Vulnerability Summary SAND Chart (example data):

Severity Category | [Month YYYY] | [Month YYYY] | [Month YYYY]
CAT III | 300 | 150 | 75
CAT II | 200 | 100 | 50
CAT I | 100 | 50 | 25

In the chart above, the dates should be the testing dates. Mark “N/A” for columns that do not apply if only one or two tests were performed.
Figure 8-1: SSAW IS Vulnerability Summary SAND Chart
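The SAND chart is a stacked area chart of open CAT I/II/III counts across the testing dates. A minimal plotting sketch is shown below, assuming matplotlib is available; the dates and counts are the placeholder example values from the table above, not real assessment data.

# Minimal sketch for generating the vulnerability summary SAND (stacked area)
# chart in Figure 8-1 with matplotlib. Dates and counts are placeholders.
import matplotlib.pyplot as plt

dates = ["Month YYYY", "Month YYYY", "Month YYYY"]  # testing dates
cat_i = [100, 50, 25]
cat_ii = [200, 100, 50]
cat_iii = [300, 150, 75]

fig, ax = plt.subplots()
ax.stackplot(range(len(dates)), cat_i, cat_ii, cat_iii,
             labels=["CAT I", "CAT II", "CAT III"])
ax.set_xticks(range(len(dates)))
ax.set_xticklabels(dates)
ax.set_ylabel("Open vulnerabilities")
ax.set_title("[SITE NAME] Vulnerability Summary SAND Chart")
ax.legend(loc="upper right")
fig.savefig("sand_chart.png", dpi=150)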
[Bar chart: # of Items by finding disposition (False Positive, Fixed, Accept Risk)]
In the chart above, indicate the # of false positives, the # of fixed vulnerabilities, and the # of vulnerabilities for which the DAA will be requested to accept the risk.
Figure 8-2: SSAW IS Assessment Summary
8.2 Ports, Protocols, and Services Results
NOTE: This section will be populated upon completion of the mitigation security
testing.
If the site does not have a direct connection to DoD, then a statement must be written to
that effect and the table must be deleted.
EXAMPLE TEXT: SSAW must adhere to Department of Defense (DoD) Instruction
8551.1, “Ports, Protocols, and Services Management (PPSM),” dated 13 August 2004,
and the PPS must be registered. Table 8-1, Ports, Protocols, and Services, lists the
SSAW PPS that were evaluated against the DISA Assurance Category Assignment List
(CAL) Registry Database, which expires Month YYYY. The table lists all IS PPS,
including internal services, if applicable.
Key to Ports, Protocols, and Services Matrix: Red = Banned, Yellow = Acceptable, Green = Best Practice

Service | Protocol | Ports Assigned in System | Ports Assigned in Vulnerability Assessment | Vulnerability Assessment Assurance Color | Comments
List the services that the corresponding ports are utilizing | List the protocols that the corresponding ports are utilizing | Include ports and protocols that have been opened for the system | Include ports and protocols that have been opened | Enter the color (Red, Yellow, Green) and color the cell to match the corresponding color | Used to upload SSAW data to the data repository. SSAW is a web-based system accessible from a .com and a .mil. Users access the site to load HC data and run queries against the database. Include whether traffic is inbound or outbound.
HTTPS | TCP | 443 | | Yellow | Used by the SSAW users to retrieve Test Results from SSAW.
HTTPS | TCP | 443 | | Yellow | SSAW interface with the e.g., TOL/IAS for CAC enablement.
Table 8-1: Ports, Protocols, and Services
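One way to support the PPS evaluation is to cross-check the ports and protocols observed during the vulnerability assessment against the PPS registered for the system in Table 8-1. The sketch below is illustrative only; the registered and observed entries are placeholders, and in practice they would come from the PPSM registration and the scan results rather than being hard-coded.

# Illustrative sketch: cross-check observed ports/protocols against the PPS
# registered for the system (Table 8-1). All entries below are placeholders.
registered = {            # (protocol, port) -> assurance color from the CAL review
    ("TCP", 443): "Yellow",
}
observed = [("TCP", 443), ("TCP", 80)]  # e.g., parsed from scan output

for proto, port in observed:
    color = registered.get((proto, port))
    if color is None:
        print(f"{proto}/{port}: not registered - investigate or register the PPS")
    elif color == "Red":
        print(f"{proto}/{port}: banned per the CAL - must be closed or mitigated")
    else:
        print(f"{proto}/{port}: registered ({color})")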
8.3 Privacy Act
EXAMPLE TEXT: Any system that contains Privacy Act data is required to comply with DoD Directive 5400.11, “DoD Privacy Program,” dated 8 May 2007. The IS maintains/does not maintain a system of records that contains Privacy Act data and is/is not therefore subject to the Privacy Act Personnel Controls.
8.4 Protected Health Information
ISs that store, maintain, transmit, or process Protected Health Information (PHI) are required by the DoD Privacy Act Program, the Health Insurance Portability and Accountability Act (HIPAA) of 1996, and Section 1102, Title 10, United States Code, entitled Confidentiality of Medical Quality Assurance Records: Qualified Immunity for Participants (10 U.S.C. 1102), to be safeguarded against unauthorized use. Per the criteria established by these regulations and the Computer Security Act of 1987, SSAW does/does not maintain PHI and is/is not subject to additional technical, procedural, physical, and operational safeguards to satisfy HIPAA standards and requirements for protecting PHI.
8.5 Conclusions and Recommendations
EXAMPLE TEXT: Based on the assessment results and the documentation included in
the DIACAP Package dated Month Year, the AFMS IA Team recommends that the
SSAW IS be granted a favorable C&A Effort Type. This assessment concluded that the
overall risk exposure to the SSAW IS is CAT I/CAT II/CAT III.
APPENDIX A
REFERENCES
References used specifically in this artifact are to be listed in this appendix.
STANDARD REFERENCES:
• Office of Management and Budget (OMB) Circular A-130, “Management of Federal Information Resources.” 24 December 1985. Revised, Transmittal Memorandum No. 4, Appendix III, “Security of Federal Automated Information Resources.” 28 November 2000.
• OMB Memorandum M-06-15, “Safeguarding Personally Identifiable Information.” 22 May 2006.
• OMB Memorandum M-06-16, “Protection of Sensitive Agency Information.” 23 June 2006.
• OMB Memorandum M-06-19, “Reporting Incidents Involving Personally Identifiable Information and Incorporating the Cost for Security in Agency Information Technology Investments.” 12 July 2006.
• OMB Memorandum M-07-16, “Safeguarding Against and Responding to the Breach of Personally Identifiable Information.” 22 May 2007.
• “Privacy Act of 1974.” 5 U.S.C. § 552a, 1999 ed. P.L. 93-579.
• “E-Government Act of 2002.” 17 December 2002.
• “Federal Information Security Management Act of 2002.”
• “Freedom of Information Act of 1967.” 5 U.S.C. 552. As amended in 2002.
• “Computer Security Act of 1987.” P.L. 100-235. 8 January 1988.
• “Health Insurance Portability and Accountability Act (HIPAA).” 1996.
• Chairman of the Joint Chiefs of Staff Manual (CJCSM) 6510.01A, “Information Assurance (IA) and Computer Network Defense (CND).” Volume I (Incident Handling Program). 24 June 2009.
• National Security Telecommunications and Information Systems Security Instruction (NSTISSI) 4009, “National Information Systems Security (INFOSEC) Glossary.” April 2005.
• Department of Defense (DoD) Assistant Secretary of Defense (ASD) Networks and Information Integration (NII) Memorandum, “Department of Defense (DoD) Guidance on Protecting Personally Identifiable Information (PII).” 18 August 2006.
• DoD Public Key Infrastructure (PKI) Program Management Office (PMO), “X.509 Certificate Policy for the United States Department of Defense.” Version 9. 9 February 2005.
• DoD ASD Health Affairs (HA) Memorandum, “Interim Policy Memorandum on Electronic Records and Electronic Signatures for Clinical Documentation.” 4 August 2005.
• DoD Directive 5000.01, “The Defense Acquisition System.” 12 May 2003. Certified current 20 November 2007.
• DoD Directive 5136.12, “TRICARE Management Activity.” 31 May 2001. Certified current 21 November 2003.
• DoD Directive 5200.2, “DoD Personnel Security Program.” 9 April 1999.
• DoD Directive 5220.22, “National Industrial Security Program.” 27 September 2004. Certified current 1 December 2006.
• DoD Directive 5400.7, “DoD Freedom of Information Act (FOIA) Program.” 2 January 2008.
• DoD Directive 5400.11, “DoD Privacy Program.” 8 May 2007.
• DoD Directive 8500.01E, “Information Assurance.” 24 October 2002. Certified current 23 April 2007.
• DoD Directive O-8530.1, “Computer Network Defense.” 8 January 2001.
• DoD Directive 8570.01, “Information Assurance Training, Certification, and Workforce Management.” 15 August 2004. Certified current 23 April 2007.
• DoD Instruction 5200.01, “DoD Information Security Program and Protection of Sensitive Compartmented Information.” 9 October 2008.
• DoD Instruction 8100.3, “Department of Defense (DoD) Voice Networks.” 16 January 2004.
• DoD Instruction 8500.2, “Information Assurance (IA) Implementation.” 6 February 2003.
• DoD Instruction 8510.01, “DoD Information Assurance Certification and Accreditation Process (DIACAP).” 28 November 2007.
• DoD Instruction 8520.2, “Public Key Infrastructure (PKI) and Public Key (PK) Enabling.” 1 April 2004.
• DoD Instruction O-8530.2, “Support to Computer Network Defense (CND).” 9 March 2001.
• DoD Instruction 8551.1, “Ports, Protocols, and Services Management (PPSM).” 13 August 2004.
• DoD Publication 5200.1-R, “Information Security Program.” 14 January 1997.
• DoD Publication 5200.2-R, “Personnel Security Program.” January 1987. Change 3. 23 February 1996.
• DoD Publication 5200.08-R, “Physical Security Program.” 9 April 2007.
• DoD Publication 5400.7-R, “DoD Freedom of Information Act Program.” September 1998. Change 1. 11 April 2006.
• DoD Publication 5400.11-R, “Department of Defense Privacy Program.” 14 May 2007.
• DoD Publication 6025.18-R, “DoD Health Information Privacy Regulation.” 24 January 2003.
• DoD Manual 5220.22-M, “National Industrial Security Program Operating Manual.” 28 February 2006.
• DoD Manual 8910.1-M, “DoD Procedures for Management of Information Requirements.” 30 June 1998.
• NIST FIPS Publication 140-2, “Security Requirements for Cryptographic Modules.” 25 May 2001. Change Notice 4. 03 December 2002.
• NIST FIPS Publication 180-3, “Secure Hash Standard (SHS).” Incorporated Change Order. 17 October 2008.
• NIST FIPS Publication 201-1, “Personal Identity Verification (PIV) of Federal Employees and Contractors.” Change Notice 1. March 2006.
• NIST Special Publication 800-18, “Guide for Developing Security Plans for Federal Information Systems.” Revision 1. February 2006.
• NIST Special Publication 800-27, “Engineering Principles for Information Technology Security (A Baseline for Achieving Security).” Revision A. June 2004.
• NIST Special Publication 800-34, “Contingency Planning Guide for Federal Information Systems.” June 2002. Revision 1. May 2010.
• NIST Special Publication 800-37, “Guide for Applying the Risk Management Framework to Federal Information Systems: A Security Life Cycle Approach.” Revision 1. February 2010.
• NIST Special Publication 800-50, “Building an Information Technology Security Awareness and Training Program.” October 2003.
• NIST Special Publication 800-53, “Recommended Security Controls for Federal Information Systems and Organizations.” Revision 3. August 2009.
• NIST Special Publication 800-61, “Computer Security Incident Handling Guide.” Revision 1. March 2008.
• NIST Special Publication 800-64, “Security Considerations in the System Development Life Cycle.” Revision 2. October 2008.
Insert here the DISA Security Readiness Review (SRR) Procedures, Security Manual
Checklists, and Security Technical Implementation Guides (STIGs) used during this C&A
effort from the “Index File” provided in the appropriate monthly AFMS IA C&A
Policies and Templates release. Be sure to bulletize the lists.
APPENDIX B
ACRONYMS
Acronyms used specifically in this artifact are to be listed in this appendix.
ACRONYMS

ACRONYM | TERM
ACF2 | Access Control Facility 2
ACP | Access Control Protocol
ADP | Automatic Data Processing
ATO | Authorization to Operate
C&A | Certification and Accreditation
CA | Certifying Authority
CAL | Category Assignment List
CD | Compact Disc
DAA | Designated Accrediting Authority
DATO | Denial of Authorization to Operate
DB2 | Database 2
DHCP | Dynamic Host Configuration Protocol
DIACAP | DoD Information Assurance Certification and Accreditation Process
DiD | Defense-in-Depth
DISA | Defense Information Systems Agency
DISN | Defense Information System Network
DNS | Domain Name System
DoD | Department of Defense
FIPS | Federal Information Processing Standard
FOIA | Freedom of Information Act
FSO | Field Security Operations
GUI | Graphical User Interface
HIPAA | Health Insurance Portability and Accountability Act
I&A | Identification and Authentication
IA | Information Assurance
IAM | Information Assurance Manager
IAO | Information Assurance Officer
IATO | Interim Authorization To Operate
IAVA | Information Assurance Vulnerability Alert
IAVB | Information Assurance Vulnerability Bulletin
IAVM | Information Assurance Vulnerability Management
IAVT | Information Assurance Vulnerability Technical Advisory
IP | Internet Protocol
IS | Information System
IT | Information Technology
LPAR | Logical Partition
MAC | Mission Assurance Category
MHS | Military Health System
MVS | Multiple Virtual Storage
NII | Network and Information Integration
NIST | National Institute of Standards and Technology
NSA | National Security Agency
NSTISSI | National Security Telecommunications and Information Systems Security Instruction
NSTISSI 4009 | National Security Telecommunications and Information Systems Security (INFOSEC) Glossary
OMB | Office of Management and Budget
OS | Operating System
P2P | Peer-to-Peer
PDI | Packet Data Interface
PDS | Partitioned Data Set
PHI | Protected Health Information
PKI | Public Key Infrastructure
PO | Program Office
POA&M | Plan of Action and Milestone
POC | Point of Contact
PPS | Ports, Protocols, and Services
PSA | Physical Security Assessment
P.L. | Public Law
RACF | Resource Access Control Facility
SIAO | Senior Information Assurance Officer
SOP | Standard Operating Procedure
SQL | Structured Query Language
SRR | Security Readiness Review / System Requirements Review
STIG | Security Technical Implementation Guide
STP&ER | Security Test Plan and Evaluation Report
TMA | TRICARE Management Activity
TSM | TRICARE Systems Manual
UDB | Universal Database
URL | Uniform Resource Locator
VM | Vulnerability Matrix
APPENDIX C
EXECUTIVE VM REMEDIATION REPORT
Details for all open findings are documented within this appendix.
Embed the Executive VM Remediation Report below. Be sure to print the report and include it as part of this document to be submitted with the Executive Package.