GISFI TR SP.1xx V1.0.0 (2012-xx)
Technical Report
Global ICT Standardisation Forum for India;
Technical Working Group Security and Privacy;
Security Testing Methods for ICT products;
(Release 1)
The present document has been developed within GISFI and may be further elaborated for the purposes of GISFI.
GISFI
GISFI office address
Suite 303, 3rd Floor, Tirupati Plaza, Plot No. 4,
Sector 11, Dwarka, New Delhi-110075, India
Tel.: +91-11-47581800 Fax: +91-11-47581801
Internet
http://www.gisfi.org
E-mail: info@gisfi.org
Copyright Notification
No part may be reproduced except as authorized by written permission.
The copyright and the foregoing restriction extend to reproduction in all media.
© 2011, GISFI
All rights reserved.
Contents
Foreword
Introduction
1	Scope
2	References
3	Definitions, symbols and abbreviations
3.1	Definitions
3.2	Abbreviations
4	Requirements for security testing of ICT products and systems
5	ICT Security Test Methods
5.1	Common Criteria (CC)
5.2	The NIST Technical Guide to Information Security Testing and Assessment (SP800-115)
5.3	The ETSI MTS-IPT IPv6 Security Test Specifications
5.4	The OWASP Testing Guide (Version 3)
5.5	The IETF – Network Endpoint Assessment (NEA)
5.6	The ICT Security Testing Standards from the ISO 27000 family
6	Proposal for Network Element Testing
6.1	Network Element Testing: Flow-Diagram
6.2	Network Element Testing: Brief on the Testing steps
7	Proposal for Network Testing
7.1	Network Testing: Flow-Diagram
7.2	Network Testing: Brief on the Testing steps
8	Gap analysis by GISFI, based on the DoT requirements [10] [11] [12] [13]
8.1	Technical Gaps
8.2	Policy Gaps
9	Requirement for Common Criteria adoption - Means for development of Protection Profiles
10	Conclusions
Annex A (informative): Review Techniques, Security Assessment Execution phase, according to the NIST SP800-115 [4]
Annex B: Change history
Foreword
This Technical Report has been produced by GISFI.
The contents of the present document are subject to continuing work within the Technical Working Group (TWG) and
may change following formal TWG approval. Should the TWG modify the contents of the present document, it will be
re-released by the TWG with an identifying change of release date and an increase in version number as follows:
Version x.y.z
where:
x the first digit shows the release to which the document belongs
y the second digit is incremented for all changes of substance, i.e. technical enhancements, corrections,
updates, etc.
z the third digit is incremented when editorial only changes have been incorporated in the document.
Introduction
This report presents ‘Network Element Testing’, i.e. performing security tests on network elements and certifying them as ‘approved/tested/etc. for security’ before they are integrated into the mobile network. It also presents ‘Network Testing’, i.e. performing security tests on entire mobile networks deployed by cellular operators and certifying them as ‘approved/tested/etc. for security’.
This report also captures information about security test methods already being employed by product/system certification bodies and gives several proposals addressing the Department of Telecommunications (DoT) requirements on security testing of networks.
1	Scope
The present document, a study report on Security Testing Methods for ICT products, is a deliverable of the Security and Privacy working group. The scope of this technical report is to study the various testing methods that are currently employed in different regions of the world. Items within the scope of this study are:
1. Review the security testing methods for ICT products and systems.
2. Develop testing steps for ‘Network Element Testing’ and ‘Network Testing’.
3. Present a gap analysis between the testing methods asked for by the DoT (vide Circular “10-15/2011-AS.III/(21)”, dated 31/05/2011 [10]) and the current testing methods being used in the country, and provide recommendations.
2	References
[1] Common Criteria for Information Technology Security Evaluation:
http://www.niap-ccevs.org/Documents_and_Guidance/cc_docs.cfm.
[2] Common Criteria, “Common Criteria for Information Technology Security Evaluation: Part 3: Security assurance
components"; Ver. 3.1, Rev. 3, Final; July 2009; Ch. 8, Pg. 32 to 44.
[3] Common Criteria, “Common Criteria for Information Technology Security Evaluation: Evaluation Methodology",
by the Common Criteria portal; Ver. 3.1, Rev. 3, Final; July 2009; Ch. 8, pg 25 – 29.
[4] NIST SP800-115, “Technical Guide to Information Security Testing and Assessment”; NIST Computer Security
Division (CSD) (Karen Scarfone, Murugiah Souppaya, Amanda Cody, Angela Orebaugh); September 2008; Pgs.:
Chapter 2, pg. 2-1; Chapters 3 through 5, pg 3-1 to 5-7.
[5] ETSI: http://www.etsi.org/WebSite/NewsandEvents/2012_SECURITYWORKSHOP.aspx.
The ETSI MTS-IPT website: http://www.etsi.org/WebSite/Technologies/IPTesting.aspx.
[6] OWASP: https://www.owasp.org/index.php/Main_Page.
https://www.owasp.org/index.php/OWASP_Testing_Project.
OWASP Testing Guide (Version 3) document source:
https://www.owasp.org/index.php/Projects/OWASP_Testing_Project/Releases/Testing_Guide_V_3.0.
[7] IETF: www.ietf.org.
Network Endpoint Assessment: RFC 5209: https://datatracker.ietf.org/doc/rfc5209/.
[8] 3GPP 33-series (Security-related) Technical Specifications: http://www.3gpp.org/ftp/Specs/html-info/33-series.htm.
[9] 3GPP2 Specifications page: http://www.3gpp2.org/Public_html/specs/alltsgscfm.cfm.
[10] DoT Circular “10-15/2011-AS.III/ (21)”, 31 May 2011: www.dot.gov.in/AS-III/2011/as-iii.pdf.
[11] GISFI_SP_201203176, “Proposals for activity on network security requirements of India”, NEC, March 2012.
[12] GISFI_SP_201203178, “Approach Note on proposed GISFI’s activity on Network Security”, Krishna Sirohi,
March 2012.
[13] GISFI_SP_201206244, “Overview and System Security to Security Testing”, NEC, June 2012.
3	Definitions, symbols and abbreviations
3.1	Definitions
This document defines the following items.
3.2	Abbreviations
3GPP	Third Generation Partnership Project
3GPP2	Third Generation Partnership Project 2
CC	Common Criteria
CCRA	Common Criteria Recognition Arrangement
DoT	Department of Telecommunications
ETSI	European Telecommunications Standards Institute
ICT	Information and Communication Technologies
IETF	Internet Engineering Task Force
IP (v6)	Internet Protocol (version 6)
MTS-IPT	Methods for Testing and Specification - IP Testing
NIST	National Institute of Standards and Technology
OWASP	Open Web Application Security Project
PP	Protection Profile
ST	Security Target
4	Requirements for security testing of ICT products and systems
To be prepared, based on the DoT requirements (vide Circular “10-15/2011-AS.III/(21)”, dated 31/05/2011 [10]).
5	ICT Security Test Methods
5.1	Common Criteria (CC)
The Common Criteria (CC), more formally the Common Criteria for Information Technology Security Evaluation, was adopted and published by ISO/IEC, following earlier attempts by various regional SDOs to integrate information technology and computer security criteria [1].
The Common Criteria is composed of three parts [1]:
a) ISO/IEC 15408-1:2009: (Part 1) The Introduction and General Model: is the introduction to the CC. It defines
the general concepts and principles of IT security evaluation and presents a general model of evaluation.
b) ISO/IEC 15408-2:2008: (Part 2) The Security Functional Requirements: establishes a set of functional
components that serve as standard templates upon which to base functional requirements for Target Of
Evaluation(s) (TOEs). CC Part 2 catalogues the set of functional components and organizes them in families
and classes.
c) ISO/IEC 15408-3:2008: (Part 3) The Security Assurance Requirements: establishes a set of assurance
components that serve as standard templates upon which to base assurance requirements for TOEs. CC Part 3
catalogues the set of assurance components and organizes them into families and classes. CC Part 3 also
defines evaluation criteria for Protection Profile(s) (PPs) and Security Target(s) (STs) and presents seven predefined assurance packages which are called the Evaluation Assurance Levels (EALs).
These are accompanied by:
- ISO/IEC 18045:2008: Evaluation Methodology: While Part 3 specifies the actions that must be performed to gain assurance, it does not specify how those actions are to be conducted. The Common Evaluation Methodology (CEM) provides the methodology for IT security evaluation using the CC as a basis.
These documents are used by the certifying body of a CC scheme and the evaluation facilities.
The CC is a standard for evaluating ICT security products against two types of requirements:
- Security functional requirements
- Security assurance requirements.
A product or service that is to be evaluated under the Common Criteria guidelines is referred to as a TOE and it is the
developer's responsibility to provide evidence that the security provisions for a TOE have been designed and
implemented to meet the requirements of ISO/IEC 15408.
The Common Criteria defines two kinds of security requirement documents:
a) Protection Profile (PP): a description of the security needs of a generic type of security product (e.g., a firewall).
b) Security Target (ST): a description of the security properties of a specific product, i.e. of a particular TOE.
The Evaluation Assurance Level (EAL1 through EAL7) of an IT product or system is a numerical grade assigned
following the completion of a Common Criteria security evaluation. The increasing assurance levels reflect added
assurance requirements that must be met to achieve Common Criteria certification.
The various Assurance Levels can be listed as follows [2]:
a) EAL1: Functionally Tested
b) EAL2: Structurally Tested
c) EAL3: Methodically Tested and Checked
d) EAL4: Methodically Designed, Tested, and Reviewed
e) EAL5: Semi-formally Designed and Tested
f) EAL6: Semi-formally Verified Design and Tested
g) EAL7: Formally Verified Design and Tested
The CEM [3] provides an overview of the Evaluation process with four tasks that an evaluator needs to perform. They
are as follows:
a) The input task [3]: The objective of this task is to ensure that the evaluator has available the correct version of
the evaluation evidence (any resource required from the sponsor or developer by the evaluator or evaluation
authority to perform one or more evaluation or evaluation oversight activities) necessary for the evaluation and
that it is adequately protected. Otherwise, the technical accuracy of the evaluation cannot be assured, nor can it
be assured that the evaluation is being conducted in a way to provide repeatable and reproducible results.
b) The evaluation sub-activities [3]: the actual evaluation activities.
c) The output task [3]: consists of the consistent reporting of evaluation results, which facilitates the achievement of the universal principle of repeatability and reproducibility of results. The consistency covers the type and the amount of information reported in the Evaluation Technical Report (ETR) and the Observation Report (OR).
d) The demonstration of technical competence to the evaluation authority task [3]: may be fulfilled by the
evaluation authority analysis of the output tasks results, or may include the demonstration by the evaluators of
their understanding of the inputs for the evaluation sub-activities.
5.2	The NIST Technical Guide to Information Security Testing and Assessment (SP800-115)
The NIST has published the “SP800-115: Technical Guide to Information Security Testing and Assessment” [4] that
addresses technical testing and examination techniques that can be used to identify, validate, and assess technical
vulnerabilities and assist organizations in understanding and improving the security posture of their systems and
networks.
The NIST guide groups security testing and examination techniques into the following three categories:
a) Review Techniques [4]: These are examination techniques used to evaluate systems, applications, networks,
policies, and procedures to discover vulnerabilities, and are generally conducted manually. They include:
i. Documentation Review: Documentation review determines if the technical aspects of policies and
procedures are current and comprehensive. It evaluates policies and procedures for technical accuracy
and completeness.
ii. Log Review: Log review determines if security controls are logging the proper information, and if the
organization is adhering to its log management policies. It could reveal potential problems and policy
deviations.
iii. Rule set Review: A rule set is a collection of rules or signatures that network traffic or system activity is
compared against to determine what action to take. Rule set review reveals holes in rule set-based
security controls.
iv. System configuration review: System configuration review is the process of identifying weaknesses in
security configuration controls, such as systems not being hardened or configured according to
security policies.
v. Network sniffing: Network sniffing is a passive technique that monitors network communication, decodes
protocols, and examines headers and payloads to flag information of interest.
vi. File integrity checking: File integrity checkers provide a way to identify that system files have been changed, by computing and storing a checksum for every guarded file and establishing a file checksum database. Stored checksums are later recomputed to compare their current value with the stored value, which identifies file modifications (a small illustrative sketch is given at the end of this clause).
b) Target Identification and Analysis [4]: These testing techniques can identify systems, ports, services, and
potential vulnerabilities, and may be performed manually but are generally performed using automated tools.
They include:
i. Network Discovery: This technique discovers active devices on a network. It identifies communication
paths and facilitates determination of network architectures. Network discovery may also detect
unauthorized or rogue devices operating on a network.
ii. Network port and Service Identification: Network port and service identification involves using a port
scanner to identify network ports and services operating on active hosts—such as File Transfer
Protocol (FTP) and Hypertext Transfer Protocol (HTTP)—and the application that is running each
identified service, such as Microsoft Internet Information Server (IIS) or Apache for the HTTP
service. It discovers open ports and associated services/ applications.
iii. Vulnerability Scanning: identifies hosts and host attributes (e.g., operating systems, applications, open
ports), but it also attempts to identify vulnerabilities rather than relying on human interpretation of the
scanning results. Vulnerability scanning can help identify outdated software versions, missing
patches, and mis-configurations, and validate compliance with or deviations from an organization’s
security policy. This is done by identifying the operating systems and major software applications
running on the hosts and matching them with information on known vulnerabilities stored in the
scanners’ vulnerability databases.
iv. Wireless Scanning: identifies unauthorized wireless devices within the range of the scanners, discovers
wireless signals outside of an organization’s perimeter and detects potential backdoors and other
security violations. Wireless scans can help organizations determine corrective actions to mitigate
risks posed by wireless-enabled technologies (Wi-Fi, Bluetooth, etc.). It can be conducted as either
Passive wireless scanning (using tools that transmit no data, nor do they affect the operation of
deployed wireless devices. For example, Wireless Intrusion Detection and Prevention Systems
(WIDPS)) or Active wireless scanning that builds on the information collected during passive scans,
and attempts to attach to discovered devices and conduct penetration or vulnerability-related testing.
c) Target Vulnerability Validation [4]: These testing techniques corroborate the existence of vulnerabilities, and
may be performed manually or by using automatic tools, depending on the specific technique used and the skill
of the test team. They include:
i. Password cracking: identifies weak passwords and password policies. Password cracking is the process of recovering passwords from password hashes stored in a computer system or transmitted over networks. It is usually performed during assessments to identify accounts with weak passwords.
Password cracking is performed, using various methods (Dictionary attack, Hybrid attack, Brute
Force, etc.), on hashes that are either intercepted by a network sniffer while being transmitted across a
network, or retrieved from the target system, which generally requires administrative-level access on,
or physical access to, the target system.
ii. Penetration Testing: tests security using the same methodologies and tools that attackers employ. It also
demonstrates how vulnerabilities can be exploited iteratively to gain greater access. It is a four-phased
process that consists of the planning phase, the discovery phase, the attack (execution) phase, and the
reporting phase.
iii. Social Engineering: allows testing of both procedures and the human element (user awareness). Social
engineering is an attempt to trick someone into revealing information (e.g., a password) that can be
used to attack systems or networks. It is used to test the human element and user awareness of
security, and can reveal weaknesses in user behavior—such as failing to follow standard procedures.
Social engineering can be performed through many means, including analog (e.g., conversations
conducted in person or over the telephone) and digital (e.g., e-mail, instant messaging).
According to the NIST guide, since no one technique can provide a complete picture of the security of a system or
network, organizations should combine appropriate techniques to ensure robust security assessments. For example,
penetration testing usually relies on performing both network port/service identification and vulnerability scanning to
identify hosts and services that may be targets for future penetration.
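The file integrity checking technique listed above can be illustrated with a minimal sketch. The guarded file paths, baseline file name and use of SHA-256 below are arbitrary choices made for this example, not requirements of SP800-115:

# Minimal sketch of file integrity checking: build a SHA-256 baseline for a set of
# guarded files, then recompute and compare to detect modifications.
# The guarded paths and baseline file name are illustrative placeholders.
import hashlib
import json
import os

GUARDED_FILES = ["/etc/passwd", "/etc/ssh/sshd_config"]  # example paths
BASELINE_FILE = "integrity_baseline.json"

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline():
    baseline = {p: sha256_of(p) for p in GUARDED_FILES if os.path.isfile(p)}
    with open(BASELINE_FILE, "w") as f:
        json.dump(baseline, f, indent=2)

def verify_baseline():
    with open(BASELINE_FILE) as f:
        baseline = json.load(f)
    for path, stored in baseline.items():
        current = sha256_of(path) if os.path.isfile(path) else None
        print(("OK: " if current == stored else "MODIFIED or MISSING: ") + path)

if __name__ == "__main__":
    if not os.path.isfile(BASELINE_FILE):
        build_baseline()
    else:
        verify_baseline()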
5.3	The ETSI MTS-IPT IPv6 Security Test Specifications
ETSI technical committee MTS (Methods for Testing and Specification) has set up a working group, MTS-IPT (IP
Testing) [5] to focus on methodology and the production of test specifications for IP-related protocols. Co-funded by
ETSI and the EC/EFTA (European Commission/European Free Trade Association), MTS-IPT is an open group at
which participation is welcome.
This project will provide a publicly available test development framework and interoperability test packages for four
key areas of IPv6, of which one is ‘Security’.
The Test Specifications regarding IPv6 Security Testing have been published as [5]:
- ETSI TS 102 558 IPv6 Security: Requirements Catalogue
- ETSI TS 102 593 IPv6 Security: Conformance TSS & TP
- ETSI TS 102 594 IPv6 Security: Conformance Test Suite
- ETSI TS 102 597 IPv6 Security: Interoperability Test Suite
5.4	The OWASP Testing Guide (Version 3)
The OWASP Testing Guide (Version 3) [6] provides an overview of various testing techniques that can be employed
when building a testing program that covers the following:
a) Manual Inspection and Reviews [6]: Manual inspections are human-driven reviews that typically test the security
implications of the people, policies, and processes, but can include inspection of technology decisions such as
architectural designs. They are usually conducted by analyzing documentation or performing interviews with
the designers or system owners.
b) Threat Modeling [6]: helps system designers think about the security threats that their systems/applications might face. Threat modeling can therefore be seen as risk assessment for applications.
c) Code Review [6]: Source code review is the process of manually checking a web application's source code for
security issues. Many serious security vulnerabilities cannot be detected with any other form of analysis or
testing. Examples of issues that are particularly conducive to being found through source code reviews include
concurrency problems, flawed business logic, access control problems, and cryptographic weaknesses as well
as backdoors, Trojans, Easter eggs, time bombs, logic bombs, and other forms of malicious code.
d) Penetration Testing [6]: Penetration testing (for web applications) is essentially the “art” of testing a running
application remotely, without knowing the inner workings of the application itself, to find security
vulnerabilities. Typically, the penetration test team would have access to an application as if they were users.
The tester acts like an attacker and attempts to find and exploit vulnerabilities.
The OWASP testing framework consists of the following activities that should take place:
a) Phase 1: Before Development Begins [6]:
i. Phase 1A: Review Policies and Standards
ii. Phase 1B: Develop Measurement and Metrics Criteria (ensure traceability)
b) Phase 2: During Definition and Design [6]:
i. Phase 2A: Review Security Requirements
ii. Phase 2B: Review Design and Architecture
iii. Phase 2C: Create and Review UML Models
iv. Phase 2D: Create and Review Threat Models
c) Phase 3: During Development [6]:
i. Phase 3A: Code Walkthroughs
ii. Phase 3B: Code Reviews
d) Phase 4: During Deployment [6]:
i. Phase 4A: Application Penetration Testing
ii. Phase 4B: Configuration Management Testing
e) Phase 5: Maintenance and Operations [6]:
i. Phase 5A: Conduct Operational Management Reviews
ii. Phase 5B: Conduct Periodic Health Checks
iii. Phase 5C: Ensure Change Verification
Apart from the activities required to be performed before and after the testing phase, the OWASP Testing Guide details the Web Application Penetration Testing methodology. This methodology (Active mode) is split into 9 sub-categories, for a total of 66 controls (an illustrative example follows the list):
- Configuration Management Testing
- Business Logic Testing
- Authentication Testing
- Authorization Testing
- Session Management Testing
- Data Validation Testing
- Denial of Service Testing
- Web Services Testing
- Ajax Testing
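As a small, purely illustrative example of the kind of automated check used under ‘Configuration Management Testing’ (this snippet is not part of the OWASP guide; the target URL and the list of headers are placeholders chosen for the example):

# Illustrative check of HTTP response headers, as might be used during
# configuration management testing of a web application.
# The target URL and header list are example placeholders.
import urllib.request

TARGET_URL = "https://example.com/"
EXPECTED_HEADERS = ["Strict-Transport-Security", "X-Frame-Options", "Content-Security-Policy"]

def check_security_headers(url):
    with urllib.request.urlopen(url, timeout=10) as response:
        headers = response.headers
        server = headers.get("Server", "")
        if server:
            # A detailed Server banner may disclose software versions.
            print("Server banner disclosed: " + server)
        for name in EXPECTED_HEADERS:
            status = "present" if headers.get(name) else "MISSING"
            print(name + ": " + status)

if __name__ == "__main__":
    check_security_headers(TARGET_URL)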
5.5	The IETF – Network Endpoint Assessment (NEA)
Network Endpoint Assessment (NEA) [7] architectures according to IETF RFC 5209 have been implemented in the industry to assess the "posture" of endpoint devices, for the purposes of monitoring compliance with an organization's posture policy and optionally restricting access until the endpoint has been updated to satisfy the posture requirements.
Posture refers to the hardware or software configuration of an endpoint as it pertains to an organization's security policy. Posture may include knowledge that software installed to protect the machine (e.g. patch management software, anti-virus software, host firewall software, host intrusion protection software or any custom software) is enabled and up-to-date. An endpoint supporting NEA protocols can be queried for posture information. [7]
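As an illustrative sketch only, an endpoint-side collector might gather posture attributes such as those below for reporting to a posture validator. This is not the NEA protocol itself (RFC 5209 defines the architecture; separate specifications define the protocols), and the attribute names and values are invented for the example:

# Illustrative endpoint posture collection: gathers a few attributes that a
# NEA-style posture collector might report (this is not the NEA protocol itself).
import json
import platform
from datetime import date

def collect_posture():
    return {
        "os": platform.system(),
        "os_release": platform.release(),
        "hostname": platform.node(),
        "collected_on": date.today().isoformat(),
        # In a real collector these would be queried from the host, e.g. from the
        # patch manager, anti-virus product and host firewall; hard-coded here.
        "antivirus_enabled": True,
        "antivirus_signatures_date": "2012-11-30",
        "host_firewall_enabled": True,
    }

def evaluate_against_policy(posture):
    # Example policy check: anti-virus and host firewall must be enabled.
    return posture["antivirus_enabled"] and posture["host_firewall_enabled"]

if __name__ == "__main__":
    p = collect_posture()
    print(json.dumps(p, indent=2))
    print("Policy compliant:", evaluate_against_policy(p))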
NEA appears to be work in progress, rather than a concrete evaluation criterion or standard.
5.6	The ICT Security Testing Standards from the ISO 27000 family
To be prepared
6	Proposal for Network Element Testing
6.1	Network Element Testing: Flow-Diagram
The testing steps can be iterative: depending on the outcome of a particular evaluation step, severe/critical bugs/issues reported by the Test Lab may require fixing by the equipment vendor. The testing steps/procedures pertaining to ‘CC Testing’, explained in this document, have been framed with reference to the Common Criteria – Common Evaluation Methodology (Version 3.1) [1].
[Flow-diagram: Generic Requirement (GR) document sign-off → equipment chosen, based on criticality as defined in the baseline document, and classified as a Radio Access Network, Core Network or Internet Core element → where a relevant 3GPP/3GPP2/wireless security standard exists, test the network element with the test-case suite for security compliance; on failure, issue an intermediate test report with the failed test items and return to the vendor for corrective action and re-submission; on pass, issue an intermediate test report with waivers/explanations → PP identification / ST definition for the equipment/element → CC testing at EAL ‘X’ according to the Common Evaluation Methodology (CEM) (ISO/IEC 18045:2008); on failure, issue an intermediate report (OR/ETR) and return to the vendor for corrective action and re-submission; on pass, if CC testing at the next higher EAL is required and EAL <= 7, add the security functional and assurance components for the next higher EAL and repeat; otherwise issue the Final CC Validation Report → Final Approval Certificate issuance by the Security Test Lab.]
6.2	Network Element Testing: Brief on the Testing steps
STEP 1. Generic Requirement (GR) Document Sign-off:
This step involves filling in details of supported/unsupported configuration parameters against the requirements contained in the relevant communication standards. The GR document also contains product technical information as provided by the network (device/equipment) product manufacturers. This step also involves the submission of the duly filled-in documents by the network product manufacturers to the Security Test Lab.
STEP 2. Telecommunication Product Classification by Lab:
This step involves comparison of the network element under consideration against a ‘Baseline document’ that defines
the criticality of the same.
This step also involves, in parallel, classification of the telecommunication equipment into one of three broad categories:
a) Radio Access Network Element (such as Base Station Transceivers (BTSs), NodeBs (HNBs), Radio Network
Controllers (RNCs), etc.)
b) Core Network Element (such as Home Subscriber Server (HSS), Operation, Administration, Maintenance &
Provisioning (OAM&P) System, Serving GPRS Support Node (SGSN), etc.)
c) Internet Core Element (Breakout Gateway Control Function (BGCF), Media Gateway Control Function
(MGCF), etc.)
STEP 3. 3GPP Security Standards Testing:
If the product/equipment has relevant 3GPP/3GPP2 security standards, test against the same (test cases for every scenario need to be developed).
Reference Standards:
3GPP [8]: 33-series Technical Specifications.
3GPP2 [9]: C.S0024-400-C v2.0, C.S0102-0 v1.0, S.R0006-804-A v1.0, S.R0082-0 v1.0, S.R0083-0 v1.0, S.R0086-A
v1.0, S.R0138-0 v1.0, S.S0078-B v1.0, S.S0083-A v1.0, S.S0086-B v2.0, S.S0110-0 v1.0, S.S0114-A v1.0, S.S0127-0
v1.0, S.S0132-0 v1.0, X.S0027-002-0 v1.0, etc.
STEP 4. 3GPP Security Standards Test Result:
If the network equipment passes the 3GPP Security Standards testing, then the Security Test Lab must produce an intermediate test report, with details of waivers and relevant explanations (provided by the vendor, in negotiation with the tester(s)). The equipment vendor must then proceed to prepare for CC testing of the equipment, as per the following steps.
If the network equipment fails the 3GPP Security Standards testing, then the Security Test Lab must produce an intermediate test report, with details of the ‘Failed Test items’. In response, the vendor/manufacturer must fix the relevant software/firmware (at times even hardware must be modified) and re-submit the equipment for testing (starting at Step 3).
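As a hypothetical illustration of how a Security Test Lab might record the outcome of Steps 3 and 4, the sketch below runs a security compliance test-case suite and derives the intermediate test report; the test-case names, verdicts and report fields are invented for the example and are not taken from any 3GPP/3GPP2 specification:

# Hypothetical sketch of Steps 3-4: run a security compliance test-case suite and
# derive an intermediate test report listing failed items or waivers.
# Test-case names, verdicts and the report layout are invented for illustration.

def run_suite(test_cases, equipment):
    # Each test case is a callable returning "pass", "fail" or "waived".
    return {name: case(equipment) for name, case in test_cases.items()}

def intermediate_report(results):
    failed = [name for name, verdict in results.items() if verdict == "fail"]
    waived = [name for name, verdict in results.items() if verdict == "waived"]
    return {
        "overall": "fail" if failed else "pass",
        "failed_test_items": failed,        # returned to the vendor for corrective action
        "waivers_explanations": waived,     # agreed between vendor and tester(s)
    }

if __name__ == "__main__":
    # Example placeholder suite for a core network element.
    suite = {
        "integrity_protection_enabled": lambda eq: "pass",
        "default_accounts_disabled": lambda eq: "fail",
        "legacy_algorithms_rejected": lambda eq: "waived",
    }
    results = run_suite(suite, equipment="example-SGSN")
    print(intermediate_report(results))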
STEP 5. Common Criteria (CC) Testing, According to the Common Evaluation Methodology (CEM) (ISO/IEC 18045:2008) [1]:
Prior to CC testing, Equipment Manufacturers prepare the following pre-requisite documents:
a) Product Hardware/Firmware/Software information (Release document).
b) Protection Profile (generic category security specification) definition (equipment manufacturers must receive the PPs from the Security Test Lab and must submit them as a pre-requisite, along with the other pre-requisite documents)
c) Security Target (specific product security specification) definition
The CC testing can start from a lower Evaluation Assurance Level (EAL) and can then be increased to the next higher
EAL level.
After the product submission procedure (along with the three documents listed above), evaluations are conducted in the order of the classes defined in the Common Evaluation Methodology (CEM version 3.1), in steps as follows:
a) Evaluation Input task: related to the management of evaluation evidence
b) Evaluation sub-activities: apply to PPs and STs
c) Evaluation Output task: related to report generation
d) Demonstration of Technical Competence task
Point (b) represents guidance on the actual test steps for PPs and STs and is organized into “Classes”.
Elaboration of (b) Evaluation sub-activities: each Class further breaks down into activities that can further be subdivided into various sub-activities.
The evaluation sub-activities vary depending whether it is a PP or a TOE evaluation. Moreover, in the case of a TOE
evaluation, the sub-activities depend upon the selected assurance requirements.
The classes are:
a) Class APE: Protection Profile evaluation
b) Class ASE: Security Target evaluation
c) Class ADV: Development
d) Class AGD: Guidance documents
e) Class ALC: Life-cycle support
f) Class ATE: Tests
g) Class AVA: Vulnerability assessment
h) Class ACO: Composition
STEP 6. CC Testing, Intermediate Report:
Elaboration of point (c) Evaluation Output task:
The evaluator performs the two following sub-tasks in order to achieve the CEM requirements for the information
content of reports:
a) Write OR sub-task (mandatory: for a fail verdict; optional: if needed in the context of the evaluation for
clarification);
b) Write ETR sub-task (for PP or ST) to present technical justification of the verdicts.
If the network equipment fails the CC testing at this particular EAL level, then, in response to the OR, the vendor/manufacturer must fix the relevant software/firmware (at times even hardware must be modified) and re-submit the equipment for testing (starting again from Step 3).
STEP 7. Incrementing the EAL Level and Testing, if required:
If the network equipment is required to be tested for more confidence, then the equipment can be re-tested against more
Security Functional and Assurance Components for the next higher EAL level, starting from Step 5.
If this equipment re-testing (with higher EAL level) is not required, then the Security Test Lab shall issue the Final CC
Validation Report to the equipment vendor/ manufacturer.
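The iteration described in Steps 5 to 7 can be summarised by the following control-flow sketch. It is a simplification for illustration only; evaluate_at_eal() and higher_eal_required() are placeholders standing in for the full CEM evaluation and for the decision on whether testing at a higher EAL is required:

# Simplified sketch of the Step 5-7 iteration: evaluate at the current EAL,
# return the equipment to the vendor on failure, and raise the EAL (up to 7)
# while higher assurance is still required.

MAX_EAL = 7

def cc_testing_loop(toe, start_eal, higher_eal_required, evaluate_at_eal):
    eal = start_eal
    while True:
        passed, intermediate_report = evaluate_at_eal(toe, eal)  # OR/ETR produced here
        if not passed:
            return {"status": "returned_to_vendor", "eal": eal, "report": intermediate_report}
        if eal < MAX_EAL and higher_eal_required(toe, eal):
            eal += 1  # add the components required for the next higher EAL
            continue
        return {"status": "final_cc_validation_report", "achieved_eal": eal}

if __name__ == "__main__":
    result = cc_testing_loop(
        toe="example-HSS",
        start_eal=2,
        higher_eal_required=lambda toe, eal: eal < 4,          # e.g. a target of EAL 4
        evaluate_at_eal=lambda toe, eal: (True, "ETR for " + toe + " at EAL" + str(eal)),
    )
    print(result)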
STEP 8. Final Approval Certificate Issuance by the Security Test Lab
7	Proposal for Network Testing
7.1	Network Testing: Flow-Diagram
The testing steps can be iterative: depending on the outcome of a particular evaluation step, severe/critical bugs/issues may require fixing by equipment vendors, or implementation bugs/gaps may require fixing by network operators, as reported by the Test Lab. The testing steps/procedures pertaining to the ‘NIST SP800-115’ part, explained in this document, have been framed with reference to the NIST Technical Guide to Information Security Testing and Assessment [4].
The entire network can, at first, be sub-divided into three broad sub-systems: Radio Access Network, Core Network and Internet Core, and each of these sub-systems can be tested separately. Once these three sub-systems have been tested successfully, the entire network can be tested for security.
[Flow-diagram: Generic Requirement (GR) document sign-off → selection of the network sub-system to test (Radio Access Network, Core Network or Internet Core sub-system) → where a relevant 3GPP/3GPP2/wireless security standard exists for the sub-system/system, test it with the test-case suite for security compliance; on failure, issue an intermediate test report with the failed test items and return to the operator/vendor for corrective action and re-submission; on pass, issue an intermediate test report with waivers/explanations → NIST SP800-115 assessment: Security Assessment Planning; Security Assessment Execution (Review Techniques such as documentation review and log review; Target Identification and Analysis such as network discovery and vulnerability scanning; Target Vulnerability Validation such as password cracking and penetration testing); post-testing activities (mitigation recommendations, reporting, remediation); on failure, the operator/vendor takes corrective action and re-submits → test the entire network with its interfaces; on failure, the operator/vendor takes corrective action and re-submits → Final Approval Certificate issuance by the Security Test Lab.]
7.2	Network Testing: Brief on the Testing steps
STEP 1. Generic Requirement (GR) Document Sign-off:
This step involves filling in details of supported/unsupported network configuration parameters against the requirements contained in the relevant communication standards. It also involves:
a) Sharing of the Security Test Lab-approved GR documents, by network equipment vendor companies, with operators.
b) Sharing of network-related documentation (such as the technical architecture, additional technologies encompassed (for example, the Wi-Fi offloading technique in Long Term Evolution (LTE) radio), etc.) with the Security Test Lab.
All of these documents must be submitted by the network operators to the Security Test Lab.
STEP 2. Telecommunication Network Sub-system Classification by Lab:
This step involves the classification of the Telecommunication Network Sub-system into one of the three broad
categories as:
a) Radio Access Network sub-system (consisting of Base Station Transceivers (BTSs), NodeBs (HNBs), Radio
Network Controllers (RNCs), etc.)
b) Core Network sub-system (consisting of Home Subscriber Server (HSS), Operation, Administration,
Maintenance & Provisioning (OAM&P) System, Serving GPRS Support Node (SGSN), etc.)
c) Internet Core sub-system (consisting of Breakout Gateway Control Function (BGCF), Media Gateway Control
Function (MGCF), etc.)
STEP 3. 3GPP Security Standards Testing:
If the network sub-system/system has relevant 3GPP/3GPP2 security standards, test against the same (test cases for every scenario need to be developed).
Reference Standards:
3GPP [8]: 33-series
3GPP2 [9]: C.S0024-400-C v2.0, C.S0102-0 v1.0, S.R0006-804-A v1.0, S.R0082-0 v1.0, S.R0083-0 v1.0, S.R0086-A
v1.0, S.R0138-0 v1.0, S.S0078-B v1.0, S.S0083-A v1.0, S.S0086-B v2.0, S.S0110-0 v1.0, S.S0114-A v1.0, S.S0127-0
v1.0, S.S0132-0 v1.0, X.S0027-002-0 v1.0, etc.
STEP 4. 3GPP Security Standards Test Result: If the network sub-system/system passes the 3GPP Security Standards testing, then the Security Test Lab must produce an intermediate test report, with details of waivers (if any) and relevant explanations (provided by the vendors/network operators, after negotiation with the tester(s)). The network operator must then proceed to prepare for NIST SP800-115-based testing of the network sub-system/system, as per the following steps.
If the network sub-system/system fails the 3GPP Security Standards testing, then the Security Test Lab must produce an intermediate test report, with details of the ‘Failed Test items’. In response, the equipment vendor/network operator must fix the relevant software/firmware/network implementation (at times even hardware must be modified) and re-schedule the network sub-system/system testing (starting at Step 3).
STEP 5. Testing, According to NIST SP 800-115 Security Testing and Assessment Guide:
A. Security Assessment Planning [4]:
Security assessments can be simplified and associated risks reduced through an established, repeatable planning
process. The core activities involved in planning for an assessment include:
a) Developing a security assessment policy
b) Prioritizing and scheduling assessments
c) Selecting and customizing technical testing and examination techniques
d) Determining the logistics of the assessment
e) Developing the assessment plan
f) Addressing any legal considerations
STEP 6. Testing, According to NIST SP 800-115 Security Testing and Assessment Guide:
B. Security Assessment Execution (Actual Testing Phase) [4]:
This step consists of:
a) Coordination: through a document referred to as ‘Rules of Engagement’ (ROE)
b) Assessing: Testing
c) Analysis: Comparing Test results, negotiation between concerned parties on issue status, agreement on either
issue resolution or waiver.
d) Data Handling: Data Collection, Data Storage, Data Transmission, Data Destruction
Under the Security Assessment Execution, the NIST guide groups security testing techniques into the following three categories:
a) Review Techniques: These are examination techniques used to evaluate systems, applications, networks,
policies, and procedures to discover vulnerabilities, and are generally conducted manually.
b) Target Identification and Analysis: These testing techniques can identify systems, ports, services, and potential
vulnerabilities, and may be performed manually but are generally performed using automated tools.
c) Target Vulnerability Validation: These testing techniques corroborate the existence of vulnerabilities, and may
be performed manually or by using automatic tools, depending on the specific technique used and the skill of
the test team.
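For illustration, the ‘Target Identification and Analysis’ category can be exemplified by a very small TCP connect scan; the target address and port list below are placeholders, and a real assessment would use dedicated, authorised scanning tools within the agreed Rules of Engagement:

# Minimal TCP connect scan sketch illustrating network port and service
# identification. Target host and ports are placeholders; real assessments use
# dedicated tools and must stay within the agreed Rules of Engagement (ROE).
import socket

TARGET_HOST = "192.0.2.10"          # documentation/example address
COMMON_PORTS = {21: "FTP", 22: "SSH", 23: "Telnet", 80: "HTTP", 443: "HTTPS"}

def scan(host, ports, timeout=1.0):
    open_ports = []
    for port, service in ports.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:   # 0 means the TCP connection succeeded
                open_ports.append((port, service))
    return open_ports

if __name__ == "__main__":
    for port, service in scan(TARGET_HOST, COMMON_PORTS):
        print("Open: " + str(port) + "/tcp (" + service + ")")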
STEP 7. Testing, According to NIST SP 800-115 Security Testing and Assessment Guide:
C. Post-Testing Activities [4]:
Following the execution phase, whose findings are expressed in terms of vulnerabilities, the organization will take steps to address the vulnerabilities that have been identified. It must develop ways to translate the findings into actions that will improve security.
a) Mitigation Recommendations: Mitigation recommendations, including the outcome of the root cause analysis,
will be developed for each finding. There may be both technical recommendations (e.g., applying a particular
patch) and nontechnical recommendations that address the organization’s processes (e.g., updating the patch
management process). Examples of mitigation actions include policy, process, and procedure modifications;
security architecture changes; deployment of new security technologies; and deployment of OS and
application patches. NIST SP 800-53 suggests mitigation recommendations for each security control.
Organizations will compare potential mitigation actions against operational requirements to determine the
actions that best balance functionality and security.
b) Reporting: Upon completion of analysis, a report will be generated that identifies system, network, and organizational vulnerabilities and their recommended mitigation actions. NIST SP800-115 suggests that a document referred to as a ‘Plan of Action and Milestones’ (POA&M) be used and maintained to ensure that individual vulnerabilities are addressed with specific, measurable, attainable, realistic, and tangible actions.
c) Remediation/Mitigation: The POA&M provides the program management office with the details and required
actions needed to appropriately and acceptably mitigate risk. Organizations should follow at least the four
steps outlined below during their remediation implementation process—these will provide consistency and
structure for security personnel and program managers:
i. Testing the remediation recommendation.
ii. The POA&M should be coordinated through an organization’s configuration control or configuration
management board because the POA&M likely proposes changes to existing systems, networks,
policy, or processes.
iii. Mitigation actions are implemented and verified to ensure their appropriate and accurate implementation.
iv. Continuously update POA&Ms to identify activities that have been accomplished, partially accomplished,
or are pending action by another individual or system.
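As an illustrative sketch of how POA&M entries could be tracked (the field names and status values below are examples chosen here and are not prescribed by SP800-115), each finding can be carried as a record whose status is updated as remediation progresses:

# Illustrative Plan of Action and Milestones (POA&M) record keeping: one entry per
# finding, updated as mitigation actions are tested, approved, implemented and
# verified. Field names and status values are examples, not prescribed by SP800-115.
from dataclasses import dataclass, field

@dataclass
class PoamEntry:
    finding: str
    recommendation: str
    owner: str
    milestone_date: str
    status: str = "open"            # open -> in_progress -> completed
    history: list = field(default_factory=list)

    def update(self, new_status, note):
        self.history.append((self.status, note))
        self.status = new_status

if __name__ == "__main__":
    entry = PoamEntry(
        finding="Outdated software version on OAM&P server",
        recommendation="Apply vendor patch and re-scan",
        owner="Network operations",
        milestone_date="2013-01-31",
    )
    entry.update("in_progress", "Patch tested in lab (remediation recommendation tested)")
    entry.update("completed", "Patch deployed and verified; change approved by configuration board")
    print(entry)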
STEP 8. NIST SP800-115 Security Test Result:
If the network sub-system/system passes the NIST SP800-115-guided security testing, then the Security Test Lab must produce an intermediate test report, with details of waivers (if any) and relevant explanations (provided by the vendors/network operators, after negotiation with the tester(s)). The network operator must then proceed to prepare for NIST SP800-115-based testing of the entire network system, starting from Step 3.
If the network sub-system/system fails the NIST SP800-115-guided security testing, then the Security Test Lab must produce an intermediate test report, with details of the ‘Failed Test items’. In response, the equipment vendor/network operator must fix the relevant software/firmware/network implementation (at times even hardware must be modified) and re-schedule the network sub-system/system testing (starting at Step 3).
STEP 9. Entire Network Testing, According to the NIST SP 800-115 Security Testing and Assessment Guide:
In the event of successful ‘network sub-system testing’ for all three sub-systems of a particular operator (that is, up to Step 8), the network operator must then proceed to prepare for NIST SP800-115-based testing of the entire network system, starting from Step 3.
STEP 10. For the Entire Network Test: NIST SP800-115 Security Test Result:
If the network system passes the NIST SP800-115-guided security testing, then the Security Test Lab must produce an intermediate test report, with details of waivers (if any) and relevant explanations (provided by the vendors/network operators, after negotiation with the tester(s)).
If the network system fails the NIST SP800-115-guided security testing, then the Security Test Lab must produce an intermediate test report, with details of the ‘Failed Test items’. In response, the equipment vendor/network operator must fix the relevant software/firmware/network implementation (at times even hardware must be modified) and re-schedule the network sub-system/system testing (starting at Step 3).
STEP 11. Final Approval Certificate Issuance by the Security Test Lab
8	Gap analysis by GISFI, based on the DoT requirements [10] [11] [12] [13]
The National Guideline on Network Security [10] is not adequately detailed for implementation, and practical steps towards achieving the stated objectives therein need to be worked upon. GISFI’s analysis of these gaps is as follows:
8.1	Technical Gaps
The technical gaps identified by GISFI are as follows:
a) The Network Element and Network Security testing requirements need to be discussed, agreed upon and
concluded in collaboration with Telecom Service Providers.
b) On Indian Security Testing Lab [11] [13]: The location of the testing lab(s), requirements and process for
accreditation of the lab(s) need to be defined. Also the process of certification of equipment and network
needs to be defined.
c) Testing against Wireless Standards (3GPP/3GPP2/etc.) [11]: The wireless standards against which the network equipment and networks shall be tested need to be defined. Within each standard, decisions need to be taken on what is mandatory to implement and what is optional, for both implementation and usage.
d) On CC testing [13]: The level of CC testing that is accepted as fulfilling both market and government requirements, and the process for the same, needs to be defined.
e) On Protection Profiles (PP) and Security Targets (ST) [13]: The process of developing PPs and STs, and their contents, needs to be defined.
f) Other technical gaps can be enumerated as:
i. Duration of testing [13]: An optimum duration of testing needs to be defined. A longer wait will impact business, whereas a shorter, hastened testing duration must not allow any compromise on the security aspect.
ii. Periodicity of testing [13]: Based on software and/or firmware updates to the network equipment, the periodicity of re-testing the elements and the network needs to be defined.
iii. Timing of testing network equipment [13]: Whether the network equipment should be tested before or after purchase by operators needs to be defined. Testing before purchase will mean an impact on vendors, while testing after purchase could mean issues for operators/service providers.
iv. Volume of testing [13]: The extent or depth of testing, number of test items, etc needs to be defined.
v. Cost of testing [13]: The cost of testing will have an impact on the market, and hence a commonly accepted cost needs to be defined.
vi. Human resource requirements [13]: This shall be determined by the decision about the types of test items,
extent of testing, and number of test cases.
vii. Responsibility for accidents [13]: Which party (vendors, operators, etc.) will be responsible for paying for accidents that occur despite the use of certified products needs to be defined.
8.2	Policy Gaps
The policy gaps identified by GISFI are as follows:
a) Relationship between Telecom Service Providers and the concerned Government departments [12]: Such a
working relationship between the two needs to be defined and established. The proposal of the Telecom
Security Council of India (TSCI) still needs to be worked upon.
b) Relationship with the CCRA, 3GPP, etc [13]: A working relationship with other global SDOs (for example,
3GPP, 3GPP2, etc) and Certification bodies (for example, CCRA) needs to be established.
9	Requirement for Common Criteria adoption - Means for development of Protection Profiles
10	Conclusions
Annex A (informative): Review Techniques, Security Assessment Execution phase, according to the NIST SP800-115 [4]
A.1. Review Techniques:
These are examination techniques used to evaluate systems, applications, networks, policies, and procedures to discover
vulnerabilities, and are generally conducted manually. They include:
A.1.1. Documentation Review: Documentation review determines if the technical aspects of policies and
procedures are current and comprehensive. It evaluates policies and procedures for technical accuracy
and completeness.
A.1.2. Log Review: Log review determines if security controls are logging the proper information, and if the
organization is adhering to its log management policies. It could reveal potential problems and policy
deviations.
A.1.3. Rule set Review: A rule set is a collection of rules or signatures that network traffic or system activity is
compared against to determine what action to take. Rule set review reveals holes in rule set-based
security controls.
A.1.4. System configuration review: System configuration review is the process of identifying weaknesses in
security configuration controls, such as systems not being hardened or configured according to
security policies.
A.1.5. Network sniffing: Network sniffing is a passive technique that monitors network communication, decodes
protocols, and examines headers and payloads to flag information of interest.
A.1.6. File integrity checking: File integrity checkers provide a way to identify that system files have been changed, by computing and storing a checksum for every guarded file and establishing a file checksum database. Stored checksums are later recomputed to compare their current value with the stored value, which identifies file modifications.
A.2. Target Identification and Analysis:
These testing techniques can identify systems, ports, services, and potential vulnerabilities, and may be performed
manually but are generally performed using automated tools. They include:
A.2.1. Network Discovery: This technique discovers active devices on a network. It identifies communication
paths and facilitates determination of network architectures. Network discovery may also detect
unauthorized or rogue devices operating on a network.
A.2.2. Network port and Service Identification: Network port and service identification involves using a port
scanner to identify network ports and services operating on active hosts—such as File Transfer
Protocol (FTP) and Hypertext Transfer Protocol (HTTP)—and the application that is running each
identified service, such as Microsoft Internet Information Server (IIS) or Apache for the HTTP
service. It discovers open ports and associated services/ applications.
A.2.3. Vulnerability Scanning: identifies hosts and host attributes (e.g., operating systems, applications, open
ports), but it also attempts to identify vulnerabilities rather than relying on human interpretation of the
scanning results. Vulnerability scanning can help identify outdated software versions, missing patches,
and misconfigurations, and validate compliance with or deviations from an organization’s security
policy. This is done by identifying the operating systems and major software applications running on
the hosts and matching them with information on known vulnerabilities stored in the scanners’
vulnerability databases.
A.2.4. Wireless Scanning: identifies unauthorized wireless devices within the range of the scanners, discovers
wireless signals outside of an organization’s perimeter and detects potential backdoors and other
security violations. Wireless scans can help organizations determine corrective actions to mitigate
risks posed by wireless-enabled technologies (Wi-Fi, Bluetooth, etc.). It can be conducted as either
Passive wireless scanning (using tools that transmit no data, nor do they affect the operation of
deployed wireless devices. For example, Wireless Intrusion Detection and Prevention Systems
(WIDPS)) or Active wireless scanning that builds on the information collected during passive scans,
and attempts to attach to discovered devices and conduct penetration or vulnerability-related testing.
A.3. Target Vulnerability Validation:
These testing techniques corroborate the existence of vulnerabilities, and may be performed manually or by using
automatic tools, depending on the specific technique used and the skill of the test team. They include:
A.3.1. Password cracking: identifies weak passwords and password policies. Password cracking is the process of recovering passwords from password hashes stored in a computer system or transmitted over networks. It is usually performed during assessments to identify accounts with weak passwords.
Password cracking is performed, using various methods (Dictionary attack, Hybrid attack, Brute
Force, etc.), on hashes that are either intercepted by a network sniffer while being transmitted across a
network, or retrieved from the target system, which generally requires administrative-level access on,
or physical access to, the target system.
A.3.2. Penetration Testing: tests security using the same methodologies and tools that attackers employ. It also
demonstrates how vulnerabilities can be exploited iteratively to gain greater access. It is a four-phased
process that consists of the planning phase, the discovery phase, the attack (execution) phase, and the
reporting phase.
A.3.3. Social Engineering: allows testing of both procedures and the human element (user awareness). Social
engineering is an attempt to trick someone into revealing information (e.g., a password) that can be
used to attack systems or networks. It is used to test the human element and user awareness of
security, and can reveal weaknesses in user behavior—such as failing to follow standard procedures.
Social engineering can be performed through many means, including analog (e.g., conversations
conducted in person or over the telephone) and digital (e.g., e-mail, instant messaging).
Annex B: Change history

Change history
Date      TSG #   TSG Doc.   CR   Rev   Subject/Comment                                      Old   New
2012-12                                 Updated the document with review comments