GISFI TR SP.1xxV1.0.0(2012-xx)

Technical Report

Global ICT Standardisation Forum for India;

Technical Working Group Security and Privacy;

Security Testing Methods for ICT products;

(Release 1)

The present document has been developed within GISFI and may be further elaborated for the purposes of GISFI.


GISFI office address

Suite 303, 3rd Floor, Tirupati Plaza, Plot No. 4, Sector 11, Dwarka, New Delhi-110075, India

Tel.: +91-11-47581800 Fax: +91-11-47581801

Internet

E-mail:

Copyright Notification

No part may be reproduced except as authorized by written permission.
The copyright and the foregoing restriction extend to reproduction in all media.

© 2011, GISFI

All rights reserved.

Contents

Foreword

Introduction

1 Scope

2 References

3 Definitions, symbols and abbreviations

3.1 Definitions

3.2 Abbreviations

4 Requirements for security testing of ICT products and systems

5 ICT Security Test Methods

5.1 Common Criteria (CC)

5.2 The NIST Technical Guide to Information Security Testing and Assessment (SP800-115)

5.3 The ETSI MTS-IPT IPv6 Security Test Specifications

5.4 The OWASP Testing Guide (Version 3)

5.5 The IETF – Network Endpoint Assessment (NEA)

5.6 The ICT Security Testing Standards from the ISO 27000 family

6 Proposal for Network Element Testing

6.1 Network Element Testing: Flow-Diagram

6.2 Network Element Testing: Brief on the Testing steps

7 Proposal for Network Testing

7.1 Network Testing: Flow-Diagram

7.2 Network Testing: Brief on the Testing steps

8 Gap analysis by GISFI, based on the DoT requirements [10] [11] [12] [13]

8.1 Technical Gaps

8.2 Policy Gaps

9 Requirement for Common Criteria adoption - Means for development of Protection Profiles

10 Conclusions

Annex A (informative): Review Techniques, Security Assessment Execution phase, according to the NIST SP800-115 [4]

Annex B: Change history

Foreword

This Technical Report has been produced by GISFI.

The contents of the present document are subject to continuing work within the Technical Working Group (TWG) and may change following formal TWG approval. Should the TWG modify the contents of the present document, it will be re-released by the TWG with an identifying change of release date and an increase in version number as follows:

Version x.y.z

where:

x	the first digit shows the release to which the document belongs;

y	the second digit is incremented for all changes of substance, i.e. technical enhancements, corrections, updates, etc.;

z	the third digit is incremented when editorial-only changes have been incorporated in the document.

Introduction

This report presents ‘Network Element Testing’, which concerns performing security tests on network elements and certifying them as ‘approved/tested/etc. for security’ before they are integrated into the mobile network. It also presents ‘Network Testing’, which concerns performing security tests on entire mobile networks deployed by cellular operators and certifying them as ‘approved/tested/etc. for security’.

This report also captures information about security test methods already employed by product/system certification bodies and gives several proposals addressing the Department of Telecommunications (DoT) requirements on network security testing.

1 Scope

This document, a study report on Security Testing Methods for ICT products, is a deliverable of the Security and Privacy working group. The scope of this technical report is to study the various testing methods currently employed in different regions of the world. Items within the scope of this study are:

1. Review the security testing methods for ICT products and systems.

2. Develop testing steps for ‘Network Element Testing’ and ‘Network Testing’.

3. Present a gap analysis between the testing methods required by the DoT (vide Circular “10-15/2011-AS.III/(21)”, dated 31/05/2011 [10]) and the testing methods currently used in the country, and provide recommendations.

2 References

[1] Common Criteria for Information Technology Security Evaluation:

[2] Common Criteria, “Common Criteria for Information Technology Security Evaluation: Part 3: Security assurance components”; Ver. 3.1, Rev. 3, Final; July 2009; Ch. 8, pg. 32 to 44.

[3] Common Criteria, “Common Criteria for Information Technology Security Evaluation: Evaluation Methodology”, by the Common Criteria portal; Ver. 3.1, Rev. 3, Final; July 2009; Ch. 8, pg. 25 to 29.

[4] NIST SP800-115, “Technical Guide to Information Security Testing and Assessment”; NIST Computer Security Division (CSD) (Karen Scarfone, Murugiah Souppaya, Amanda Cody, Angela Orebaugh); September 2008; Chapter 2, pg. 2-1; Chapters 3 through 5, pg. 3-1 to 5-7.

[5] ETSI:

The ETSI MTS-IPT website:

[6] OWASP:

OWASP Testing Guide (Version 3) document source:

[7] IETF:

Network Endpoint Assessment: RFC 5209:

[8] 3GPP 33-series (Security-related) Technical Specifications:

[9] 3GPP2 Specifications page:

[10] DoT Circular “10-15/2011-AS.III/ (21)”, 31 May 2011:

[11] GISFI_SP_201203176, “Proposals for activity on network security requirements of India”, NEC, March 2012.

[12] GISFI_SP_201203178, “Approach Note on proposed GISFI’s activity on Network Security”, Krishna Sirohi, March 2012.

[13] GISFI_SP_201206244, “Overview and System Security to Security Testing”, NEC, June 2012.

3 Definitions, symbols and abbreviations

3.1 Definitions

This document defines the following items.

3.2 Abbreviations

3GPP	Third Generation Partnership Project

3GPP2	Third Generation Partnership Project 2

CC	Common Criteria

CCRA	Common Criteria Recognition Agreement

DoT	Department of Telecommunications

ETSI	European Telecommunications Standards Institute

ICT	Information and Communication Technologies

IETF	Internet Engineering Task Force

IP(v6)	Internet Protocol (version 6)

MTS-IPT	Methods for Testing and Specification - IP Testing

NIST	National Institute of Standards and Technology

OWASP	Open Web Application Security Project

PP	Protection Profile

ST	Security Target

4 Requirements for security testing of ICT products and systems

To be prepared (based on the DoT Circular “10-15/2011-AS.III/(21)”, dated 31/05/2011 [10]).

5 ICT Security Test Methods

5.1 Common Criteria (CC)

The Common Criteria (CC), more formally known as the Common Criteria for Information Technology Security Evaluation, was adopted and published by ISO/IEC, following earlier attempts by various regional SDOs to integrate information technology and computer security criteria [1].

The Common Criteria is composed of three parts [1]:

a) ISO/IEC 15408-1:2009 (Part 1), Introduction and General Model: the introduction to the CC. It defines the general concepts and principles of IT security evaluation and presents a general model of evaluation.

b) ISO/IEC 15408-2:2008 (Part 2), Security Functional Requirements: establishes a set of functional components that serve as standard templates upon which to base functional requirements for a Target of Evaluation (TOE). CC Part 2 catalogues the set of functional components and organizes them into families and classes.

c) ISO/IEC 15408-3:2008 (Part 3), Security Assurance Requirements: establishes a set of assurance components that serve as standard templates upon which to base assurance requirements for TOEs. CC Part 3 catalogues the set of assurance components and organizes them into families and classes. It also defines evaluation criteria for Protection Profiles (PPs) and Security Targets (STs) and presents seven pre-defined assurance packages, called the Evaluation Assurance Levels (EALs).

These are accompanied by:

  • ISO/IEC 18045:2008: Evaluation Methodology: While Part 3 specifies the actions that must be performed to gain assurance, it does not specify how those actions are to be conducted. The Common Evaluation Methodology (CEM) provides the methodology for IT security evaluation using the CC as a basis.

These documents are used by the certifying body of a CC scheme and the evaluation facilities.

The CC is a standard for evaluating ICT security products against two types of requirements:

  • Security functional requirements
  • Security assurance requirements.

A product or service that is to be evaluated under the Common Criteria guidelines is referred to as a TOE, and it is the developer's responsibility to provide evidence that the security provisions for the TOE have been designed and implemented to meet the requirements of ISO/IEC 15408.

The Common Criteria defines two types of documents that specify security requirements:

a) Protection Profile (PP): an implementation-independent statement of security requirements for a generic type of security product (e.g., a firewall).

b) Security Target (ST): a statement of security requirements and specifications for a specific product, which serves as the basis for its evaluation.

The Evaluation Assurance Level (EAL1 through EAL7) of an IT product or system is a numerical grade assigned following the completion of a Common Criteria security evaluation. The increasing assurance levels reflect added assurance requirements that must be met to achieve Common Criteria certification.

The various Assurance Levels can be listed as follows [2]:

a) EAL1: Functionally Tested

b) EAL2: Structurally Tested

c) EAL3: Methodically Tested and Checked

d) EAL4: Methodically Designed, Tested, and Reviewed

e) EAL5: Semiformally Designed and Tested

f) EAL6: Semiformally Verified Design and Tested

g) EAL7: Formally Verified Design and Tested

The CEM [3] provides an overview of the Evaluation process with four tasks that an evaluator needs to perform. They are as follows:

a) The input task [3]: ensures that the evaluator has available the correct version of the evaluation evidence (any resource required from the sponsor or developer by the evaluator or evaluation authority to perform one or more evaluation or evaluation oversight activities) necessary for the evaluation, and that it is adequately protected. Otherwise, neither the technical accuracy of the evaluation nor the repeatability and reproducibility of its results can be assured.

b) The evaluation sub-activities [3]: the actual evaluation work.

c) The output task [3]: the consistent reporting of evaluation results, which facilitates the universal principle of repeatability and reproducibility of results. The consistency covers the type and amount of information reported in the Evaluation Technical Report (ETR) and the Observation Report (OR).

d) The demonstration of technical competence to the evaluation authority task [3]: may be fulfilled by the evaluation authority's analysis of the output task results, or may include the evaluators demonstrating their understanding of the inputs for the evaluation sub-activities.

5.2 The NIST Technical Guide to Information Security Testing and Assessment (SP800-115)

The NIST has published the “SP800-115: Technical Guide to Information Security Testing and Assessment” [4] that addresses technical testing and examination techniques that can be used to identify, validate, and assess technical vulnerabilities and assist organizations in understanding and improving the security posture of their systems and networks.

The NIST guide groups security testing techniques into the following three categories:

a) Review Techniques [4]: These are examination techniques used to evaluate systems, applications, networks, policies, and procedures to discover vulnerabilities, and are generally conducted manually. They include:

  1. Documentation Review: Documentation review determines if the technical aspects of policies and procedures are current and comprehensive. It evaluates policies and procedures for technical accuracy and completeness.
  2. Log Review: Log review determines if security controls are logging the proper information, and if the organization is adhering to its log management policies. It could reveal potential problems and policy deviations.
  3. Rule set Review: A rule set is a collection of rules or signatures that network traffic or system activity is compared against to determine what action to take. Rule set review reveals holes in rule set-based security controls.
  4. System configuration review: System configuration review is the process of identifying weaknesses in security configuration controls, such as systems not being hardened or configured according to security policies.
  5. Network sniffing: Network sniffing is a passive technique that monitors network communication, decodes protocols, and examines headers and payloads to flag information of interest.
  6. File integrity checking: File integrity checkers provide a way to identify that system files have been changed, by computing and storing a checksum for every guarded file and establishing a file checksum database. Stored checksums are later recomputed and compared with the stored values, which identifies file modifications.
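The file integrity checking technique described in item 6 can be sketched in a few lines of Python. This is an illustrative toy, not part of the NIST guide; the database filename is our own choice:

```python
import hashlib
import json
import os

def file_checksum(path: str) -> str:
    """Compute a SHA-256 checksum of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(paths, db_file="checksums.json"):
    """Establish the file checksum database for the guarded files."""
    baseline = {p: file_checksum(p) for p in paths}
    with open(db_file, "w") as f:
        json.dump(baseline, f, indent=2)

def verify(db_file="checksums.json"):
    """Recompute checksums; return files that were modified or removed."""
    with open(db_file) as f:
        baseline = json.load(f)
    return [p for p, digest in baseline.items()
            if not os.path.exists(p) or file_checksum(p) != digest]
```

In use, the baseline would be built on a known-good system and `verify()` re-run periodically, mirroring the recompute-and-compare step described above.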

b) Target Identification and Analysis [4]: These testing techniques can identify systems, ports, services, and potential vulnerabilities; they may be performed manually but are generally performed using automated tools. They include:

  1. Network Discovery: This technique discovers active devices on a network. It identifies communication paths and facilitates determination of network architectures. Network discovery may also detect unauthorized or rogue devices operating on a network.
  2. Network port and Service Identification: Network port and service identification involves using a port scanner to identify network ports and services operating on active hosts (such as File Transfer Protocol (FTP) and Hypertext Transfer Protocol (HTTP)) and the application that is running each identified service, such as Microsoft Internet Information Server (IIS) or Apache for the HTTP service. It discovers open ports and associated services/applications.
  3. Vulnerability Scanning: identifies hosts and host attributes (e.g., operating systems, applications, open ports), but it also attempts to identify vulnerabilities rather than relying on human interpretation of the scanning results. Vulnerability scanning can help identify outdated software versions, missing patches, and mis-configurations, and validate compliance with or deviations from an organization’s security policy. This is done by identifying the operating systems and major software applications running on the hosts and matching them with information on known vulnerabilities stored in the scanners’ vulnerability databases.
  4. Wireless Scanning: identifies unauthorized wireless devices within the range of the scanners, discovers wireless signals outside of an organization’s perimeter and detects potential backdoors and other security violations. Wireless scans can help organizations determine corrective actions to mitigate risks posed by wireless-enabled technologies (Wi-Fi, Bluetooth, etc.). It can be conducted as either Passive wireless scanning (using tools that transmit no data, nor do they affect the operation of deployed wireless devices. For example, Wireless Intrusion Detection and Prevention Systems (WIDPS)) or Active wireless scanning that builds on the information collected during passive scans, and attempts to attach to discovered devices and conduct penetration or vulnerability-related testing.
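The port and service identification technique of item 2 reduces, in its simplest form, to a TCP connect scan. The following Python sketch is a toy for illustration only (the port-to-service table is a small assumed sample, and such scans must only be run against hosts one is authorized to test):

```python
import socket

# A few well-known ports and the services conventionally bound to them.
COMMON_PORTS = {21: "FTP", 22: "SSH", 25: "SMTP", 80: "HTTP", 443: "HTTPS"}

def scan_host(host: str, ports=COMMON_PORTS, timeout: float = 1.0):
    """Attempt a TCP connection to each port; return the open ones."""
    open_ports = {}
    for port, service in ports.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the TCP handshake succeeds,
            # i.e. something is listening on that port.
            if s.connect_ex((host, port)) == 0:
                open_ports[port] = service
    return open_ports
```

Production scanners additionally fingerprint the listening application (banner grabbing, protocol probes) rather than inferring the service from the port number alone.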

c) Target Vulnerability Validation [4]: These testing techniques corroborate the existence of vulnerabilities, and may be performed manually or with automated tools, depending on the specific technique used and the skill of the test team. They include:

  1. Password cracking: identifies weak passwords and password policies. Password cracking is the process of recovering passwords from password hashes stored in a computer system or transmitted over networks. It is usually performed during assessments to identify accounts with weak passwords. Cracking is applied, using various methods (dictionary attack, hybrid attack, brute force, etc.), to hashes that are either intercepted by a network sniffer while being transmitted across a network, or retrieved from the target system, which generally requires administrative-level access on, or physical access to, the target system.
  2. Penetration Testing: tests security using the same methodologies and tools that attackers employ. It also demonstrates how vulnerabilities can be exploited iteratively to gain greater access. It is a four-phased process that consists of the planning phase, the discovery phase, the attack (execution) phase, and the reporting phase.
  3. Social Engineering: allows testing of both procedures and the human element (user awareness). Social engineering is an attempt to trick someone into revealing information (e.g., a password) that can be used to attack systems or networks. It is used to test the human element and user awareness of security, and can reveal weaknesses in user behavior—such as failing to follow standard procedures. Social engineering can be performed through many means, including analog (e.g., conversations conducted in person or over the telephone) and digital (e.g., e-mail, instant messaging).
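The dictionary attack mentioned under password cracking (item 1) can be illustrated with a minimal Python sketch against unsalted SHA-256 hashes. The wordlist and "intercepted" hash are fabricated for the example; real systems should use salted, deliberately slow hashes (e.g. bcrypt), which is precisely what makes this attack harder:

```python
import hashlib

def crack(hash_hex: str, wordlist):
    """Hash each candidate password; return the match, or None."""
    for candidate in wordlist:
        if hashlib.sha256(candidate.encode()).hexdigest() == hash_hex:
            return candidate
    return None

# Hypothetical hash of the weak password "letmein", as a sniffer
# or a compromised password file might yield it.
intercepted = hashlib.sha256(b"letmein").hexdigest()
wordlist = ["password", "123456", "letmein", "qwerty"]
```

An assessor running this over an organization's password hashes would report every account whose password appears in a common wordlist.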

According to the NIST guide, since no one technique can provide a complete picture of the security of a system or network, organizations should combine appropriate techniques to ensure robust security assessments. For example, penetration testing usually relies on performing both network port/service identification and vulnerability scanning to identify hosts and services that may be targets for future penetration.

5.3 The ETSI MTS-IPT IPv6 Security Test Specifications

ETSI technical committee MTS (Methods for Testing and Specification) has set up a working group, MTS-IPT (IP Testing) [5], to focus on methodology and the production of test specifications for IP-related protocols. Co-funded by ETSI and the EC/EFTA (European Commission/European Free Trade Association), MTS-IPT is an open group in which participation is welcome.

This project will provide a publicly available test development framework and interoperability test packages for four key areas of IPv6, of which one is ‘Security’.

The Test Specifications regarding IPv6 Security Testing have been published as [5]:

  • ETSI TS 102 558 IPv6 Security: Requirements Catalogue
  • ETSI TS 102 593 IPv6 Security: Conformance TSS & TP
  • ETSI TS 102 594 IPv6 Security: Conformance Test Suite
  • ETSI TS 102 597 IPv6 Security: Interoperability Test Suite

5.4 The OWASP Testing Guide (Version 3)

The OWASP Testing Guide (Version 3) [6] provides an overview of various testing techniques that can be employed when building a testing program that covers the following:

a) Manual Inspection and Reviews [6]: Manual inspections are human-driven reviews that typically test the security implications of the people, policies, and processes, but can include inspection of technology decisions such as architectural designs. They are usually conducted by analyzing documentation or performing interviews with the designers or system owners.

b) Threat Modeling [6]: helps system designers think about the security threats that their systems/applications might face. Threat modeling can therefore be seen as risk assessment for applications.

c) Code Review [6]: Source code review is the process of manually checking a web application's source code for security issues. Many serious security vulnerabilities cannot be detected with any other form of analysis or testing. Examples of issues that are particularly conducive to being found through source code reviews include concurrency problems, flawed business logic, access control problems, and cryptographic weaknesses, as well as backdoors, Trojans, Easter eggs, time bombs, logic bombs, and other forms of malicious code.

d)Penetration Testing [6]: Penetration testing (for web applications) is essentially the “art” of testing a running application remotely, without knowing the inner workings of the application itself, to find security vulnerabilities. Typically, the penetration test team would have access to an application as if they were users. The tester acts like an attacker and attempts to find and exploit vulnerabilities.