Simulator requirements

- a comparative evaluation of tools

Editor

Henrik Christiansen

COM, Technical University of Denmark

Contributors:

Gerben Kuijpers, Ericsson Telebit

Hiroyuki Yomo, WING, Aalborg University

Hanane Fathi, WING, Aalborg University

Version 2.1

September 8th, 2003

Executive summary

This document compares a total of ?? simulation tools….

Table of Contents

1.Introduction

1.1.Document overview

2.Simulation tools in general

2.1.A classification of simulation tools

2.2.The tasks of the simulation process

2.3.Using general-purpose languages

2.4.Using general-purpose simulation tools

2.5.Special-purpose simulation tools

2.6.Comments

3.List of requirements / evaluation criteria

3.1.Goals / overall requirements

3.2.Functional Requirements

3.3.Statistics generation

3.4.Performance/Scalability

3.5.Extensibility

3.6.Usability

3.7.Cost / Availability

3.8.Verifiability / Correctness

4.Description of existing tools

4.1.Description / comparison procedure

4.2.Candidates

4.3.WIPSIM

4.3.1.Overall description

4.3.2.Pros and cons

4.4.OPNET

4.4.1.Overall description

4.4.2.Pros and cons

4.5.NS2

4.5.1.Overall description

4.5.2.Pros and cons

4.6.GloMoSim

4.6.1.Overall description

4.6.2.Pros and cons

5.Comparison

5.1.Candidates

5.2.Functional Requirements

5.2.1.WIPSIM

5.2.2.OPNET

5.2.3.NS2

5.2.4.GloMoSim

5.2.5.A new tool

5.2.6.A combination of existing tools

5.2.7.Summary

5.2.8.List of protocols supported

5.2.9.List of configurable parameters

5.3.Statistics generation

5.3.1.WIPSIM

5.3.2.OPNET

5.3.3.NS2

5.3.4.GloMoSim

5.3.5.A new tool

5.3.6.A combination of existing tools

5.3.7.Summary

5.4.Performance/Scalability

5.4.1.WIPSIM

5.4.2.OPNET

5.4.3.NS2

5.4.4.GloMoSim

5.4.5.A new tool

5.4.6.A combination of existing tools

5.4.7.Summary

5.5.Extensibility

5.5.1.WIPSIM

5.5.2.OPNET

5.5.3.NS2

5.5.4.GloMoSim

5.5.5.A new tool

5.5.6.A combination of existing tools

5.5.7.Summary

5.6.Usability

5.6.1.WIPSIM

5.6.2.OPNET

5.6.3.NS2

5.6.4.GloMoSim

5.6.5.A new tool

5.6.6.A combination of existing tools

5.6.7.Summary

5.7.Cost / Availability

5.7.1.WIPSIM

5.7.2.OPNET

5.7.3.NS2

5.7.4.GloMoSim

5.7.5.A new tool

5.7.6.A combination of existing tools

5.7.7.Summary

5.8.Verifiability / Correctness

5.8.1.WIPSIM

5.8.2.OPNET

5.8.3.NS2

5.8.4.GloMoSim

5.8.5.A new tool

5.8.6.A combination of existing tools

5.8.7.Summary

6.Conclusion

7.References

Appendix A.Simulation tools under evaluation

Appendix B.Tool requirements

Document history

Version / Date / Author / Comments
0.1 / 06-08-2003 / Henrik Christiansen / Document created…
0.3 / 12-08-2003 / Henrik Christiansen / Added contribution on general properties of simulation tools
1.0 / 19-08-2003 / Henrik Christiansen / Added list of contributors + minor changes
1.5 / 23-08-2003 / Henrik Christiansen / Simulator requirements merged with requirements from the “use cases” document. Document changed as agreed upon at the meeting August 22nd.
2.0 / 05-09-2003 / Henrik Christiansen, Gerben Kuijpers, Hiroyuki Yomo, Hanane Fathi / Contributions from all authors merged into one document.
2.1 / 08-09-2003 / Henrik Christiansen / Minor improvements and corrections
2.2 / 15-09-2003 / Henrik Christiansen / Document finalized

Abbreviations and acronyms

Acronym / Meaning
AODV / Ad hoc On-demand Distance Vector routing protocol
ATM / Asynchronous Transfer Mode
CNTK / Center for Netværks TjenesteKonvergens
MIT / Massachusetts Institute of Technology
NS2 / Network Simulator 2
OLSR / Optimized Link State Routing protocol
OPNET / Optimized Network Engineering Tools
WIPSIM / Wireless IP SIMulator

1.Introduction

Within the CNTK framework, a need for a common simulation tool has been identified [cntk1].

1.1.Document overview

The structure of the document is as follows: first, a set of requirements is defined; second, a number of existing tools are described; third, these tools are compared; finally, the conclusion and a list of references are given.

2.Simulation tools in general

Contributors: Henrik Christiansen (COM, Denmark), Gérard Hébuterne (INT, France)

This section gives an overview of some general properties of simulation tools and tries to categorize a number of existing tools based on their properties.

2.1.A classification of simulation tools

By “simulation tool” is meant today not only the basic tool that runs the simulation model, but also all the related utility programs attached to a simulation project, such as graphical aids and development tools, which are of growing importance. Comparing simulation tools is, to some extent, impossible, as they have different goals and fields of application, offer different utilities and possibly address different populations of users.

A first step towards a classification is the nature of the language the simulation makes use of. To explain this, and to understand the implications of the choice, one has to consider the tasks involved in running a simulated model of any system.

2.2.The tasks of the simulation process

Naturally, the system and the way it works have to be described. Each building block of the system may be described at a variable level of detail, which depends first on the precision the study is to provide (think of a system involving communication between sub-systems: the description may incorporate the details of the communication protocol, or may adopt a more global view). But it also depends on the basic tools the language provides. For instance, the language may offer capabilities such as “put object X in queue” and “extract an object from queue”; otherwise, the complete set of operations corresponding to queuing has to be described from scratch.

A second set of tasks to be taken into account is the whole set of operations that must be performed in the framework of any simulation study. The simplest example is “draw a random variable according to a given probability distribution”. Such variables represent the duration of a task, the time interval between events such as node failures, etc. These operations (drawing random variables, managing the event list, collecting basic statistics, etc.) are repeated for each simulation experiment.
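
For concreteness, the sketch below shows how the simplest of these support operations can be coded in plain C++ (the function name and constants are invented for this illustration): an exponentially distributed variable is drawn by inverse transform from a uniform pseudo-random number.

    #include <cmath>
    #include <cstdlib>

    // Inverse-transform draw of an exponentially distributed variable with
    // rate lambda: X = -ln(U)/lambda, with U uniform on (0,1].
    double draw_exponential(double lambda)
    {
        // std::rand() keeps the sketch self-contained; a real simulator
        // would use a better generator with explicit seeding (section 3.8).
        double u = (std::rand() + 1.0) / (RAND_MAX + 1.0);   // in (0,1]
        return -std::log(u) / lambda;
    }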

2.3.Using general-purpose languages

First, the development of the simulation study may be done using general-purpose languages such as C, C++, Fortran, etc. As these are not oriented towards simulation tasks, they offer no help for that goal: the developer has to perform the whole set of actions described above, i.e. describe the system in its finest details as well as all “simulation-related” tasks. As this approach allows building a simulation program perfectly tailored to the needs of the study, the resulting product will have the highest possible performance (e.g. in terms of consumed run time). Usually, the choice of this approach is motivated by the need for extensive use of a program that would otherwise be prohibitively slow. However, the gain in the exploitation phase is balanced by an increased effort in the development phase (analyzing the model, coding it, and, last but not least, debugging it). Simulation development then often consists in building (or assembling) a library of basic routines from which the final package is constituted. The literature offers numerous examples of such libraries (see e.g. [Feldman]).
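
As an illustration of what such a library of basic routines contains at its core, the following is a minimal sketch of the heart of any discrete-event kernel: a time-ordered event list and the loop that consumes it. All names are invented for this example; it is not the code of any particular library.

    #include <functional>
    #include <queue>
    #include <vector>

    struct Event {
        double time;   // simulated time at which the event fires
        int    type;   // e.g. ARRIVAL, DEPARTURE, ...
        bool operator>(const Event& o) const { return time > o.time; }
    };

    // Min-heap ordered on event time: the earliest event is always on top.
    typedef std::priority_queue<Event, std::vector<Event>,
                                std::greater<Event> > EventList;

    void run(EventList& events, double end_time)
    {
        double clock = 0.0;
        while (!events.empty()) {
            Event e = events.top();
            if (e.time > end_time) break;   // simulation horizon reached
            events.pop();
            clock = e.time;                 // advance the simulation clock
            // Dispatch on e.type here: update model state, collect
            // statistics, and schedule new events by pushing onto 'events'.
        }
    }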

2.4.Using general-purpose simulation tools

Here, the developer makes use of a simulation language, i.e. a computer language aimed at easing the description of the model by providing a high-level instruction set with which the system is described much more easily than in a general-purpose language. Typical instructions allow one to enter or extract “customers” from queues, to choose a service discipline, to synchronize tasks, to draw random variables, etc.
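
The difference in abstraction level can be made concrete: what a simulation language offers as built-in instructions must first be programmed in a general-purpose language. The fragment below is a hypothetical C++ facade for the queueing primitives just mentioned, not the API of any actual tool.

    #include <queue>

    struct Customer { double arrival_time; };

    // The kind of primitives a simulation language provides built in,
    // sketched here as a thin facade over a FIFO queue.
    class Queue {
    public:
        void     enter(const Customer& c) { q_.push(c); }  // "put object X in queue"
        Customer extract() { Customer c = q_.front(); q_.pop(); return c; }  // "extract an object"
        bool     empty() const { return q_.empty(); }
    private:
        std::queue<Customer> q_;
    };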

The first simulation tools were proposed in the 1960s and resemble the general-purpose languages of the same period. Examples of such tools are Simula, GPSS and Simscript (but many others have been elaborated). Some of them have had quite a long career and have been continuously improved. However, most of today’s users prefer tools of the following generation, characterized by a more or less sophisticated graphical interface: the simulation model may be built from the interface through a few “mouse clicks”, and the tool often provides utilities to visualize the results and even to produce the final report. The trend is also towards ever larger libraries of built-in sub-models. OPNET is perhaps the most widely known example of this category, but many others exist (BONeS, SES-Workbench; see the next section).

In fact, the difference between these two categories tends to vanish. First, even with a “genuine graphical” simulation tool, the study of any elaborate model (in fact, any model of real size, apart from toy cases) requires the developer to “open” the basic building blocks and write pieces of code (most frequently in C or C++). Second, most of the text-based languages of the 1960s that are still in use have been greatly improved and provide most of the functionality of true graphical tools. This is especially the case for Simscript II.5 (the latest version of that popular language), but others have evolved the same way.

2.5.Special-purpose simulation tools

While general-purpose simulation languages are not specific to an application, the third category offers languages through which the user simply describes the system by specifying the topology, the kind of equipment, the traffic figures, etc. The simulator is tailored to the study of a quite specific application, such as a data network, and is of no help outside this application. The development effort is minimal and restricted to the definition of the simulation experiment and the analysis of the results. Examples of such tools are COMNET, SIMFACTORY (simulation of manufacturing applications), NETWORK II.5 (from CACI), etc. These tools are sometimes referred to as simulators (as opposed to the simulation languages of the previous section); see e.g. [Law, Kelton].

However, such a tool is of limited help in that it can only be used for studying existing and well-documented technologies. It is thus poorly suited as soon as new equipment, new protocols or new networking paradigms are concerned. Rather, its field of application is pure network planning and dimensioning, in the operational phase of a technology. In the domain of networking, COMNET is a typical example of this category.

2.6.Comments

The above classification is admittedly rather “rough”, and the frontier between general-purpose and special-purpose simulation tools is somewhat fuzzy. For instance, one may build network models in OPNET much as with a special-purpose language (using the specific libraries it provides); nevertheless, OPNET has to be seen as a general-purpose simulation tool, as new network devices may be freely developed.

However, the classification emphasizes a major difference between languages.

There is, however, a class of special-purpose tools that may be of more general help: free software tools, mostly devised by U.S. universities. Examples are NETSIM (MIT), NIST (for ATM networks, based upon the previous one), INSANE (Berkeley), NS (the VINT project), etc. They offer the user the possibility of developing new modules or altering the existing code, thus enlarging the scope of the tool. NS is probably the best-known example of this class.

Other classifications could be proposed, e.g. emphasizing the technical aspects of the simulation kernel. For instance, some tools use parallel simulation. Other tools are presented as based on an “object-oriented” approach; this is not, however, a sound criterion, as most simulation languages use these concepts naturally.

3.List of requirements / evaluation criteria

This chapter defines the set of criteria that will be used when comparing the different simulation tools. The criteria list is an extended version of the requirement list set up in [cntk1]. Requirements from [cntk1] are not repeated here, but are referenced directly as Rx, referring to the tags from chapter 5 of [cntk1] (the requirements are also listed in appendix B of this document).

The requirements from [cntk1] have been subdivided into a number of categories, namely (some requirements appear in more than one category):

  • Performance evaluation capabilities (R1, R12)
  • Modeling capabilities (R2, R3, R6, R13, R14, R17, R18, R19, R20, R21)
  • User interface

–Configuration: (R5, R7, R11, R12, R17, R22)

–Presentation: (R15)

  • General properties of the tool: (R8, R9, R10, R17, R23)

The total list of criteria has been subdivided into groups: functional requirements, statistics generation, performance/scalability, extensibility, usability, cost / availability and verifiability / correctness. A more precise definition of these requirements is given hereafter.

3.1.Goals / overall requirements

  • TIP integration
  • Evaluation of ad-hoc routing protocols under device power-supply constraints
  • (2nd priority) QoS in such scenarios

3.2.Functional Requirements

Definition:
This criterion is related to the specific use of the simulation tool in the CNTK project.

More detailed list of requirements:

  • Performance evaluation capabilities (R1, R12)
  • Modeling capabilities (R2, R3, R6, R13, R14, R17, R18, R19, R20, R21)
  • ad-hoc connectivity
  • mobility of nodes (mobility model)
  • L1: abstract channel model (e.g. described by bit error patterns, throughput and delay, depending on the number of active sources in some geographical region)
  • Model of power consumption for data transmission (an illustrative sketch follows this list)
  • L2: complete model of MAC behavior, at least for 802.11 and Bluetooth
  • L3: support of as many different routing protocols as possible (mainly for ad-hoc routing, but also 'traditional' protocols, e.g. OSPF, for comparison); extensibility for self-developed protocols/protocol modifications (see section 3.5)
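
As an indication of what the power-consumption model for data transmission could look like, the sketch below implements the widely used first-order radio model (an electronics cost per bit plus a distance-dependent amplifier cost). The constants are illustrative placeholders, not values prescribed by this document.

    // First-order radio model (illustrative constants, not project values):
    // sending k bits over distance d costs E_ELEC per bit for the
    // electronics plus E_AMP * d^2 per bit for the amplifier.
    const double E_ELEC = 50e-9;    // J/bit, electronics (placeholder)
    const double E_AMP  = 100e-12;  // J/bit/m^2, amplifier (placeholder)

    double tx_energy(int k_bits, double d_meters)
    {
        return E_ELEC * k_bits + E_AMP * k_bits * d_meters * d_meters;
    }

    double rx_energy(int k_bits)
    {
        return E_ELEC * k_bits;     // receiving only pays the electronics cost
    }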

2nd priority:

  • DiffServ support (scheduling/marking)
  • L4-7: Traffic models (a sketch of a bursty ON/OFF source follows this list):

–CBR (UDP based)

–Bursty ON/OFF (UDP based)

  • Extensibility (see section 3.5)
  • TCP traffic models, traffic generated by actual code of sensor applications
  • Support for node-failure modeling
  • Support for modeling of power-consumption due to processing in nodes
  • Support for modeling processing times
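
To make the bursty ON/OFF traffic model above concrete, a minimal sketch follows: ON and OFF periods are drawn from exponential distributions, and packets depart at a constant rate while the source is ON. The names are invented here, and draw_exponential() is the helper sketched in section 2.2.

    #include <vector>

    // Bursty ON/OFF source: exponentially distributed ON and OFF periods;
    // while ON, fixed-size packets depart at a constant rate (CBR-like).
    struct OnOffSource {
        double mean_on;       // mean ON period (s)
        double mean_off;      // mean OFF period (s)
        double pkt_interval;  // packet inter-departure time while ON (s)
    };

    // Fill 'departures' with packet departure times up to end_time.
    void generate(const OnOffSource& s, double end_time,
                  std::vector<double>& departures)
    {
        double t = 0.0;
        while (t < end_time) {
            double on_end = t + draw_exponential(1.0 / s.mean_on);
            for (; t < on_end && t < end_time; t += s.pkt_interval)
                departures.push_back(t);               // one packet departs
            t = on_end + draw_exponential(1.0 / s.mean_off);
        }
    }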

3.3.Statistics generation

Definition:
The ability of the tool to take measurements on the model while the simulation is running and to present them afterwards.

More detailed list of requirements:

  • User interface, Presentation: (R15)
  • Dropped packets, packet delay/jitter, throughput
  • Statistics for protocol overhead
  • Fairness
  • Allow for on-line/integrated computation of statistics (see the sketch after this list)
  • Simulation run-time
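
The on-line computation requirement means that statistics such as mean delay must be updated sample by sample, without storing every observation. Welford's classical one-pass recurrence, sketched below (class name invented here), is a standard way to do this.

    // One-pass (Welford) running mean and variance: O(1) memory per metric,
    // suitable for updating statistics while the simulation is running.
    class RunningStat {
    public:
        RunningStat() : n_(0), mean_(0.0), m2_(0.0) {}
        void add(double x) {
            ++n_;
            double delta = x - mean_;
            mean_ += delta / n_;
            m2_   += delta * (x - mean_);   // uses the updated mean
        }
        double mean()     const { return mean_; }
        double variance() const { return n_ > 1 ? m2_ / (n_ - 1) : 0.0; }
    private:
        long   n_;
        double mean_, m2_;
    };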

3.4.Performance/Scalability

Definition:
The performance of the simulator determines the relation between real time (the wall-clock time the computer needs to run the simulation) and simulated time (the timeframe that is being modeled); for example, if simulating 60 seconds of network time takes 600 seconds of computation, the real-time factor is 10. Scalability is the impact the size of the model has on this performance.

More detailed list of requirements:

  • Real-time simulation of 200-500 nodes under high traffic load
  • Support of distributed simulation

3.5.Extensibility

Definition:
Extensibility is a measure of whether a tool is built so that it can be extended by the user, e.g. with new features. An important factor here is the amount of work/time needed to extend the tool.

More detailed list of requirements:

  • General properties of the tool: (R8, R9, R10, R17, R23)
  • Integration of TIP stack in simulator
  • Integration of new routing protocols (see the interface sketch after this list)
  • Definition of new mobility models
  • Processing in real-time
  • Definition of new traffic models (including possibly the use of real sensor applications for traffic generation)
  • Definition of measurement procedures for new statistics
  • Good development/source-code documentation
  • Simulator source code including compilation environment needs to be available
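
One common way to obtain this kind of extensibility is for the simulator to expose an abstract interface that user-written modules implement. The sketch below shows the idea for a routing protocol; it is a hypothetical interface, not the actual API of any of the candidate tools.

    class Packet;   // defined by the simulator kernel

    // Hypothetical plug-in interface for user-written routing protocols.
    // A kernel with this kind of hook can load new protocols without
    // modification; a new protocol (e.g. an AODV variant) subclasses it
    // and registers itself with the kernel's protocol factory.
    class RoutingProtocol {
    public:
        virtual ~RoutingProtocol() {}
        virtual void initialize(int node_id)     = 0;
        virtual int  next_hop(const Packet& p)   = 0;  // routing decision
        virtual void on_receive(const Packet& p) = 0;  // control traffic
        virtual void on_timer(double now)        = 0;  // periodic updates
    };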

3.6.Usability

Definition:
The “user-friendliness” of a system: is it easy to use for an experienced/inexperienced user, what is daily work with the tool like, and how easy is it to learn?

More detailed list of requirements:

  • User interface

–Configuration: (R5, R7, R11, R12, R17, R22)

–Presentation: (R15)

  • Good documentation of the simulator (users' manual)
  • Short 'learning period' for new users
  • Existing experience with the simulator within CNTK

3.7.Cost / Availability

Definition:
The cost of the tool is simply the price of acquiring the product. Availability is whether the tool is available to the general public, to CNTK project members only or limited to employees of a certain company.

More detailed list of requirements:

  • Either the cost of the simulator (source code + compilation environment) should be as low as possible, or the simulator should already be available to all CNTK partners

3.8.Verifiability / Correctness

Definition:
The correctness of a tool defines how confident one can be that a given implementation of, say, a protocol conforms to the relevant standards or otherwise works correctly. Verifiability is the ability of the user to test, by means of the tool, whether a new protocol implementation works as it is supposed to.

More detailed list of requirements:

  • Used simulator features should be known to work correctly
  • Results should be reproducible by other scientific organizations (see the seeding sketch below)
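
A precondition for reproducibility is explicit control of the pseudo-random number generator: given the same tool version, model and seed, another organization should obtain the same sample path. A minimal sketch (illustrative only):

    #include <cstdlib>

    // Reproducible runs: seed the generator explicitly and report the seed
    // together with the results, so others can rerun the exact experiment.
    void init_run(unsigned int seed)
    {
        std::srand(seed);   // same seed + same code => same random draws
    }

    // e.g. init_run(12345);   // seed 12345 documented with the results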

4.Description of existing tools

This document is not the only attempt to compare different simulation tools; a number of comparisons of existing tools have already been carried out [Bragg2000, Ince2002].

4.1.Description / comparison procedure

Each new tool under consideration must undergo the following screening procedure:

  1. Answer the following question: “Is it possible to extend the tool with new protocols?” If no, stop evaluating this tool; otherwise continue.
  2. Answer the following question: “Is it possible to extend the tool with new statistics?” If no, stop evaluating this tool; otherwise continue.
  3. Answer the following question: “Is the tool able to carry out simulations on large network models?” If no, stop evaluating this tool; otherwise continue.
  4. Create a list of built-in protocols for this tool.
  5. Is the number of built-in protocols sufficient for our interests? If no, stop evaluating this tool; otherwise continue.
  6. If the list is very long, limit it to protocols relevant to this project. Integrate the list into the appropriate table in the summary part of section 5.2.
  7. Create a list of configurable parameters for this tool (this might be limited to parameters relevant to the list of protocols created in step 4). Integrate the list into the appropriate table in the summary part of section 5.2.
  8. Generate a sub-section for section 4 of this document (i.e. section 4.x) giving an overall description of the tool. The description must begin with the name of the contributor and a statement of his/her experience with the tool. At the end of the section, a summary of the tool’s pros and cons should be given.
  9. Generate a sub-section for each criterion treated in section 5 of this document (i.e. section 5.x.y, where y is a unique number for the tool under study). In each sub-section it must be specified whether the requirements from section 3 are met or not. In addition, the comparison table in the summary part of each sub-section must be filled out.

4.2.Candidates

The following tools will be evaluated:

  • WIPSIM [WIPSIM]
  • OPNET [OPNET]
  • NS2 [NS2]
  • Extensible and High-Fidelity TCP/IP Network Simulator [Exten]
  • MPLS Network Simulator [MPLSSim]
  • Bluehoc [Bluehoc]
  • CDMA Wireless Network Simulator [CDMASim]
  • GloMoSim [GLOMOSIM]
  • QualNet [QUALNET]
  • CNET [CNET]
  • REAL [REAL]
  • NetSIM [NetSIM]
  • FLAN [FLAN]
  • NCTUns [NCTUns]
  • SimMan 1.0 [SimMAN]
  • VENUS [VENUS]
  • AnSIM [AnSIM]
  • NIST [NIST]
  • INSANE [INSANE]

The initial screening procedure outlined in section 4.1 yields the following set of simulation tools for further comparison; the complete outcome of the screening can be seen in Appendix A: