ABSTRACTS
SIMULATION IN INDUSTRY AND SERVICES
Brussels, December 9, 2005
DECISION MAKING
Microsimulation for Political decision making
Istvan MOLNAR and Pal BELYO
Simulation and the design of evacuation policies
Javier OTAMENDI
OPTIMIZATION
Finding optimal tours
Liesje DE BOECK, Jan COLPAERT, Carmen JACOB, Frank COLE
Neural Networks
Gábor NÉMETH and György LIPOVSZKI
EDUCATION
Rare events simulation
Isolina Alberto Moralejo, Pedro Mateo Collazos, Fermín Mallor Giménez
Simulation in Education on Statistics
P. DARIUS, E. SCHREVENS, K.PORTIER, O. THAS,
H. VAN DER KNAAP, A. CARBONEZ
FINANCE
Jump driven financial asset models
Wim SCHOUTENS
Portfolio simulations
Eddy VERBIEST
USE OF MICROSIMULATION MODELS FOR POLITICAL DECISION MAKING
Istvan Molnar (°) and Pal Belyo (°°)
(°) School of Business, Department of Computer and Information Systems
Bloomsburg University of Pennsylvania, Bloomsburg, Pennsylvania, 17815, U.S.A.
E-Mail:
(°°) ECOSTAT - Institute for Economic Analysis and Informatics
H-1119 Budapest, Andor u. 47-49., Hungary
E-Mail:
Abstract
Socioeconomic systems are complex, extremely sensitive and of great social and economic importance. Microsimulation is a method able to handle complex socioeconomic systems by creating and studying a model that makes intensive use of the statistical data of the observed objects. These objects are the so-called micro units of the socioeconomic system: the person, the family or the household. Microsimulation models use simulation techniques to study the behavior of micro-level units over time.
Microsimulation is generally accepted by decision-makers and widely used in Australia, Canada, Europe and the USA to prepare political decisions. In the European Union, more and more signs indicate an increasing demand for instruments of macroeconomic analysis and prediction, coupled with a growing willingness to devote budgetary spending to microsimulation. However, not only highly developed economies but also economies in transition face many problems, especially in demography, pension systems, health care and taxation, for which microsimulation could be a very useful tool for a model-based study of the related problems and possible solutions.
In this contribution, first a short overview of recent microsimulation model developments is presented. Next, the feasibility of introducing capital income taxation in Hungary is analyzed and discussed based on a microsimulation model. The current economic environment and the special Hungarian economic characteristics are briefly described, followed by the economic justifications, the basic data collection and analysis, the microsimulation model, the model results and validation. Finally, the impact of different taxation policies on various social layers of the population and the macroeconomic consequences are discussed and briefly analyzed.
Keywords: Microsimulation, Socioeconomic applications, Capital income taxation.
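As an illustration of the mechanics described in the abstract, the following is a minimal sketch of one static microsimulation step, not the authors' model: the micro-unit file, the sampling weights, the income distribution and the flat tax rate are all hypothetical.

```python
# Minimal microsimulation sketch (hypothetical data and tax rate, not the
# authors' model): apply a flat capital income tax to a synthetic file of
# weighted micro units and aggregate the revenue.
import random

random.seed(1)

# Synthetic micro units: each household carries a sampling weight and income.
households = [
    {"weight": random.uniform(500, 1500),              # households represented
     "capital_income": random.lognormvariate(6, 1.2)}  # hypothetical units
    for _ in range(10_000)
]

TAX_RATE = 0.20  # hypothetical flat capital income tax

def simulate(units, rate):
    """Total revenue and mean tax per represented household."""
    revenue = sum(u["weight"] * rate * u["capital_income"] for u in units)
    population = sum(u["weight"] for u in units)
    return revenue, revenue / population

revenue, mean_tax = simulate(households, TAX_RATE)
print(f"estimated revenue: {revenue:,.0f}  mean tax per household: {mean_tax:,.2f}")
```

A real model of this kind would replace the synthetic file with survey or administrative microdata and rerun the same aggregation under each candidate tax policy.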
A SIMULATION MODEL FOR THE DESIGN OF THE EVACUATION POLICIES IN THE CASE OF FIRE IN PUBLIC INTEGRATED STATIONS
Javier Otamendi
Universidad Rey Juan Carlos ()
Abstract
When designing a public transportation building which integrates underground and bus services, and which therefore must hold many passengers during rush hour, it is important to include rules and procedures for escaping from fire if this undesirable situation arises. There are institutional requisites that must be satisfied: minimum width of the stairs, minimum distance between ramps and exit gates, … These are all static values included in an international regulation, which also specifies some dynamic rates, such as the capacities of the stairs or hallways expressed in passengers per minute per unit length.
What is not included in any rule is how the crowd is going to behave when the fire starts. The flow of passengers becomes chaotic, and some of the hallways and exits will collapse and reduce the overall capacity of the system. The time required for any passenger to exit the facilities is no longer easy to calculate, since the dynamic behavior of the crowd is not directly related to the average behavior of the individuals.
This is where Monte Carlo simulation comes into play. A detailed model of the station that reflects all the possible flows and the capacities of the different exits helps calculate the time required for the crowd to get outside the building. Once it is verified and validated, the simulation model helps design not only the building but also the policies that should be followed if the disaster happens.
This article describes how the fire policies for one of the main integrated stations in Madrid were developed using simulation. The presentation follows the traditional breakdown of phases that any simulation specialist should complete. First, the available data is presented (layout, capacities, volume of passengers…). Second, the simulation models are introduced. A detailed discrete-event model was developed first but could not be run because it was too slow. A continuous model was then developed, but it had the same problem. Finally, a black-box discrete-event model was used which, despite lacking a good graphical interface, was very useful for calculating the necessary statistics.
To overcome its lack of user-friendliness, an Excel spreadsheet was developed to enter the necessary input values and retrieve the results. The spreadsheet was also used for verification purposes, since additional worksheets were designed to compare the theoretical results calculated from the input values with the output values obtained by simulation. The next step was the parameterization of the model so that it could be run from the spreadsheet for different input values. For example, from the architectural design point of view, the number of fare collection gates could be changed and its influence on the total evacuation time assessed. From the fire policies point of view, ramps could be closed or opened to study the effect on the outgoing flows. As it turned out, this experimentation phase showed that the evacuation time could be reduced by about 20%, which as a byproduct makes it possible to evacuate about 30% more passengers than are currently expected to use the facilities.
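To make the experimentation step concrete, here is a toy Monte Carlo sketch in the spirit of the parameterized runs described above; the passenger volume, gate throughput and walking-time distribution are invented for illustration and are not the station's data.

```python
# Toy Monte Carlo evacuation sketch (hypothetical capacities, not the
# station's data): estimate mean evacuation time as a function of the number
# of open fare collection gates.
import random

random.seed(2)

def evacuation_time(passengers, gates, gate_rate=25, walk_mean=90, walk_sd=30):
    """One replication, in seconds.

    gate_rate : passengers per minute per gate (hypothetical)
    walk_*    : walking time to the gates, sampled per group (hypothetical)
    """
    queue_delay = passengers / (gates * gate_rate) * 60
    slowest_walk = max(random.gauss(walk_mean, walk_sd) for _ in range(30))
    return slowest_walk + queue_delay

def mean_time(passengers=3000, gates=8, reps=1000):
    return sum(evacuation_time(passengers, gates) for _ in range(reps)) / reps

for g in (6, 8, 10):
    print(f"{g:2d} gates: mean evacuation ~ {mean_time(gates=g):.0f} s")
```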
On the best tours for suppliers and food teams in Limburg
Liesje De Boeck (°), Jan Colpaert (°), Carmen Jacob (°°), Frank Cole (°°°)
(°)Centre for Modelling and Simulation, Ehsal, Stormstraat 2, 1000 Brussels
(°°) Student, Catholic University of Leuven
(°°°)Ehsal, Stormstraat 2, 1000 Brussels
Abstract
This paper presents an application of the ordered cluster traveling salesman problem. The application consists of finding tours with minimal traveling distances between suppliers and food teams. A tour always consists of a trip to all suppliers followed by a trip to all food teams. We propose different alternatives for the tours depending on who is responsible for performing the tour or parts of the tour. Since we are dealing with a small real-life application, we can rely on exact methods to find the best tours. To obtain the tours with optimal (minimal) distances for all alternatives, we rely in a first stage on linear (integer) programming. In a second stage, we use simulation to generate a number of best solutions for all alternatives, because the optimal solutions of the alternatives are not always feasible in real life. We show that both methods are very valuable and flexible in obtaining adequate answers to small-scale real-life problems.
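Because the instance is small enough for exact methods, the structure of the problem can be shown with a brute-force sketch: every supplier order is combined with every food-team order, exactly as the tours are defined above. The locations and the distance matrix below are hypothetical, not the Limburg data.

```python
# Exact brute-force sketch of the ordered cluster TSP described above:
# visit all suppliers first, then all food teams, minimising total distance.
# Node names and distances are hypothetical.
from itertools import permutations

suppliers = ["S1", "S2", "S3"]
teams = ["T1", "T2"]
depot = "D"

# Symmetric hypothetical distances between every pair of locations.
dist = {
    ("D", "S1"): 4, ("D", "S2"): 6, ("D", "S3"): 3,
    ("D", "T1"): 8, ("D", "T2"): 7,
    ("S1", "S2"): 2, ("S1", "S3"): 5, ("S1", "T1"): 6, ("S1", "T2"): 9,
    ("S2", "S3"): 4, ("S2", "T1"): 3, ("S2", "T2"): 5,
    ("S3", "T1"): 7, ("S3", "T2"): 4,
    ("T1", "T2"): 2,
}

def d(a, b):
    return dist.get((a, b), dist.get((b, a)))

def tour_length(order):
    route = [depot, *order, depot]
    return sum(d(a, b) for a, b in zip(route, route[1:]))

# Enumerate every supplier order followed by every food-team order.
best = min(
    ((*s, *t) for s in permutations(suppliers) for t in permutations(teams)),
    key=tour_length,
)
print("best tour:", " -> ".join([depot, *best, depot]),
      "| length:", tour_length(best))
```

For instances too large to enumerate, the same ordered-cluster constraint can be encoded in the integer program mentioned in the abstract.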
System identification by neural network
GÁBOR NÉMETH (°) and GYÖRGY LIPOVSZKI (°°)
Department of Production Informatics, Management and Controls
Budapest University of Technology and Economics
H-1111 Budapest, Muegyetem rkp. 3-9, Building D, Room 422, Hungary
(°)PhD student []
(°°)Associate professor []
Abstract
In financial systems it is hard to find the correct relationship between the input and the output data. These systems are usually described by nonlinear, stochastic and time-dependent differential equations. It is already a significant success if the static functions (input-output balance) of the system can be determined; the final goal, of course, is the construction of the dynamic state functions of the whole process. Unfortunately, the structure of the state functions and their variables change continuously in the real system; however, it is possible to 'freeze' the model and determine the model type and its parameters for a certain period. Since in identification processes neither the structure of the differential equation nor its parameters are known, system identification is a truly complex optimization procedure.
In our research a special task was given to the computer: applying soft computing technology, it should behave, within a certain precision, like the examined system. Genetic algorithms were used in the teaching procedure of the neural network. In this article we report the development steps and the first experiments with the identification method.
Keywords: Neural Network, Simulation, Genetic Algorithms and Hierarchical Optimization
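The following minimal sketch illustrates the kind of GA-based identification described above; it is not the authors' setup. The 'unknown' system, the network size and the GA hyperparameters are all hypothetical. The genome holds the weights of a one-hidden-layer network, and selection, crossover and mutation drive the fit to the observed input-output data.

```python
# Minimal sketch of GA-based neural network training for system
# identification (hypothetical target system and hyperparameters): evolve
# the weights of a one-hidden-layer network to match input-output data.
import math, random

random.seed(0)
xs = [i / 20 for i in range(-20, 21)]
ys = [math.tanh(2 * x) + 0.3 * x for x in xs]   # "unknown" system to identify

H = 5                      # hidden units; genome = all weights and biases
GENOME = 3 * H + 1         # w1[h], b1[h], w2[h], b2

def net(genome, x):
    w1, b1, w2 = genome[:H], genome[H:2 * H], genome[2 * H:3 * H]
    b2 = genome[-1]
    return sum(w2[h] * math.tanh(w1[h] * x + b1[h]) for h in range(H)) + b2

def fitness(genome):   # negative mean squared error
    return -sum((net(genome, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

pop = [[random.gauss(0, 1) for _ in range(GENOME)] for _ in range(60)]
for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:20]                      # truncation selection
    children = []
    while len(children) < 40:
        a, b = random.sample(parents, 2)
        cut = random.randrange(GENOME)      # one-point crossover
        child = a[:cut] + b[cut:]
        for i in range(GENOME):             # Gaussian mutation
            if random.random() < 0.1:
                child[i] += random.gauss(0, 0.2)
        children.append(child)
    pop = parents + children

print("final MSE:", -fitness(pop[0]))
```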
Applications of rare events simulation by using Restart
Isolina Alberto Moralejo (°), Pedro Mateo Collazos (°), Fermín Mallor Giménez (°°)
(°)Universidad de Zaragoza, Spain
(°°) Universidad Pública de Navarra, Spain ()
Abstract
Discrete event simulation as a method for performance evaluation has become an indispensable tool in many fields, e.g., teletraffic engineering, management, logistics, reliability, etc. But in some important application areas the system performance is closely linked to the occurrence of certain rare events. For example, new communication networks and services pose extreme requirements regarding the quality of service: in packet switching over telecommunications networks, an important parameter is the probability of packet loss at a switch. These probabilities are required to be of the order of 10⁻⁹. The same occurs in reliability, where, for example, false alarm probabilities of radar and sonar receivers are usually constrained not to exceed values of 10⁻⁶.
In such situations, conventional Monte Carlo simulation becomes ineffective because of the excessively long run times required to generate rare events in sufficiently large numbers to obtain statistically significant results. Thus, new methods of implementing the simulation model need to be investigated and employed.
Several speed-up techniques have been proposed in the literature, such as parallel programming, importance sampling, cross entropy and importance splitting. In our work we briefly present these different techniques and focus more extensively on Restart, which falls into the last category. The idea of the Restart method, introduced by the Villén-Altamirano brothers at the beginning of the 1990s, is to restart the simulation in certain system states in order to make the rare event more likely to be observed. For this purpose, a sequence of nested events is defined, the rare event being the intersection of all of them. The probability of the rare event is then expressed as the product of the successive conditional probabilities, each of which can be estimated more accurately than the rare event probability itself.
We show the application of this method to assess the quality of service in a telecommunications context, specifically by estimating the probability of packet loss at an antenna. We compare the computational effort needed to attain a certain accuracy level with and without the Restart method. We also give indications for the practical implementation of the Restart technique in other contexts.
Furthermore, we present a new method that improves the implementation of the Restart method when it is used successively to optimise a certain parameter.
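To make the product-of-conditional-probabilities idea concrete, here is a toy splitting sketch, simplified relative to full Restart; the biased random walk and the thresholds are hypothetical. Each nested threshold is reached from trajectories restarted at the previous one, and the stage estimates multiply out to the rare event probability.

```python
# Toy splitting sketch of the Restart idea (hypothetical model, simplified
# relative to full Restart): the rare event "a negatively drifted random walk
# reaches level 10 before returning to 0" is split into nested thresholds,
# and its probability is the product of the conditional probabilities.
import random

random.seed(42)
P_UP = 0.4                     # probability of an upward step (downward drift)
LEVELS = [2, 4, 6, 8, 10]      # nested thresholds; rare event = reach 10

def reaches(start, target):
    """Does the walk started at `start` hit `target` before falling to 0?"""
    x = start
    while 0 < x < target:
        x += 1 if random.random() < P_UP else -1
    return x >= target

N = 2000                       # trajectories per stage
estimate, start = 1.0, 1
for level in LEVELS:
    hits = sum(reaches(start, level) for _ in range(N))
    if hits == 0:
        estimate = 0.0
        break
    estimate *= hits / N       # estimated P(next level | current level)
    start = level              # restart trajectories from the reached threshold

r = (1 - P_UP) / P_UP          # gambler's-ruin check of the same probability
print("splitting estimate:", estimate)
print("exact probability :", (1 - r) / (1 - r ** LEVELS[-1]))
```

The Markov property of the walk makes the product exact here; full Restart adds mechanisms such as multiple retrials per threshold hit and weight bookkeeping.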
The use of simulation in Education on Statistics
P. Darius (°), E. Schrevens (°), K. Portier (°°), O. Thas (°°°), H. van der Knaap (°°°°), A. Carbonez (°)
(°)Katholieke Universiteit Leuven, Belgium
(°°) University of Florida, Gainesville, Florida, USA
(°°°)Universiteit Gent, Belgium
(°°°°) Unilever Research, Vlaardingen, The Netherlands
Abstract
Teachers of statistics encounter a number of didactic problems. It has been argued in the literature that supplementing the traditional material with tools based on a visual approach and a more active form of learning could improve the effectiveness of the teaching.
This paper describes two possible tools.
First, we show the use of Java applets called VESTAC (Visualization of and Experimentation with STAtistical Concepts). They cover selected topics useful for an introductory course and for a second course on regression and/or ANOVA. Each applet gives a visual representation of the topic, with ample possibilities to see the asymptotic results build up, or to experiment with the data and the parameters and see the effects immediately. Topics were selected where an interactive visual applet was thought to offer didactic value beyond that attainable with more traditional didactic tools.
The VESTAC applets are intended for use by the teacher in the classroom, as well as for supervised use by students during practical exercises in a PC classroom, and for unsupervised use by students at home. They are currently used in a variety of courses at the authors’ universities, as well as in some others.
Next, we demonstrate the use of VIRTEX, a collection of applets that allow virtual experimentation. They are meant as an opportunity to train data collection and analysis skills, and could be used as supplementary exercises within a Statistics or a Design of Experiments course.
Each VIRTEX applet presents a software environment that mimics a real situation of interest. Data can be easily collected, but this can be done in so many ways that many nontrivial decisions must be taken beforehand. Once the data are collected, they can be copied and pasted into a statistical software package. The user can then relate the quality of the analysis results to the data collection strategy used.
Two environments will be shown:
- an industrial process that must be optimized
- a greenhouse experiment to compare the effect of different treatments on plant growth.
A Multivariate Jump-Driven Financial Asset Model
Wim Schoutens
Katholieke Universiteit Leuven, Dept. of Mathematics, Leuven, Belgium ()
Abstract
We propose a multivariate model for financial assets which incorporates jumps, skewness, kurtosis and stochastic volatility, and discuss its applications in the context of equity and credit risk. In the former case we describe the stochastic behavior of a series of stocks or indexes; in the latter we apply the model in a multi-firm, value-based default model. Starting from an independent Brownian world, we introduce jumps and other deviations from normality, as well as non-Gaussian dependence, by the simple but very powerful technique of stochastic time-changing. We work out the details in the case of a Gamma time-change, thus obtaining a multivariate Variance Gamma (VG) setting. We are able to characterize the model from an analytical point of view by writing down the joint distribution function of the assets at any point in time and by studying their association via the copula technique. The model is also computationally friendly, since numerical results require a modest amount of time and the number of parameters grows linearly with the number of assets. The main feature of the model, however, is the fact that - in contrast to other non-jointly-Gaussian settings - its risk-neutral dependence can be calibrated from univariate derivative prices. Examples from the equity and credit markets show the goodness of fit attained. Finally, we focus on the pricing of Basket and FtD options using Monte Carlo simulations.
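As a flavour of the Monte Carlo side of such a model, the following is a toy sketch of a bivariate VG pair obtained by time-changing correlated Brownian motions with a common gamma clock; the parameters are invented for illustration and are not the paper's calibration.

```python
# Illustrative Monte Carlo sketch of a bivariate Variance Gamma model via a
# common gamma time change (hypothetical parameters): the shared subordinator
# induces jumps and non-Gaussian dependence between the two assets.
import math, random

random.seed(7)
NU = 0.2                         # variance rate of the common gamma clock
DT, STEPS, PATHS = 1 / 252, 252, 2000
assets = [                       # per-asset VG parameters (hypothetical)
    {"theta": -0.10, "sigma": 0.2},
    {"theta": -0.15, "sigma": 0.3},
]
RHO = 0.5                        # Brownian correlation before the time change

terminal = []
for _ in range(PATHS):
    x = [0.0, 0.0]
    for _ in range(STEPS):
        dg = random.gammavariate(DT / NU, NU)     # common gamma increment
        z1 = random.gauss(0, 1)
        z2 = RHO * z1 + math.sqrt(1 - RHO ** 2) * random.gauss(0, 1)
        for i, z in enumerate((z1, z2)):
            a = assets[i]
            x[i] += a["theta"] * dg + a["sigma"] * math.sqrt(dg) * z
    terminal.append(x)

mean0 = sum(t[0] for t in terminal) / PATHS
print("mean terminal log-return of asset 0:", round(mean0, 4))
```

Pricing a basket or first-to-default payoff then amounts to averaging the discounted payoff over such joint terminal values.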
Understanding style investing portfolio simulations
Eddy Verbiest
ING Investment Management ()
Abstract
Brokers publish research showing that quantitative stock selections simply based on ranked indicators outperform the market by several percent annually, but descriptions often remain anecdotal and partial. On the other hand, there is little convincing evidence that managed funds outperform, and the debate on market efficiency lingers.
This paper shows portfolio simulations based on simple indicators that evidently outperform. The potential of size and value styles is confirmed, and growth is shown to be effective if applied to a value base. Price momentum proves more complex and requires non-linear allocation models. The focus is on the dependence of the results on simulation parameters such as backtesting time spans, periodicity of reallocation, allocation schemes and indicator models. The variation of results across regions and sectors is discussed, as well as turnover and liquidity issues and simulation pitfalls like survival bias and data availability lags. The goal is to provide a sufficiently broad and consistent picture and understanding of stock portfolio simulations, so that one can interpret published research confidently and understand stock markets at a statistical level in order to make one's own assessment of market and manager efficiency.
Given the broad area that must be covered to deliver a coherent overview, this paper favours listing full results over selective interpretation. Popular time-dependent style allocation falls outside the scope of this paper.
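As a self-contained illustration of the kind of backtest discussed (synthetic data and a hypothetical indicator, so the numbers carry no empirical weight), the sketch below ranks stocks each period, holds the top quintile equally weighted, and compares it with the equally weighted market.

```python
# Minimal sketch of a ranked-indicator backtest (synthetic data, hypothetical
# indicator; not the paper's universe or results): each period, rank stocks
# on the indicator, hold the top quintile, and compare with the market.
import random

random.seed(3)
N_STOCKS, PERIODS = 100, 120          # e.g. 10 years of monthly reallocation

def one_period():
    """Synthetic cross-section: the indicator is weakly informative."""
    signal = [random.gauss(0, 1) for _ in range(N_STOCKS)]
    # return = small loading on the signal + idiosyncratic noise
    rets = [0.005 + 0.002 * s + random.gauss(0, 0.06) for s in signal]
    return signal, rets

port, market = 1.0, 1.0
for _ in range(PERIODS):
    signal, rets = one_period()
    ranked = sorted(range(N_STOCKS), key=lambda i: signal[i], reverse=True)
    top = ranked[: N_STOCKS // 5]                 # top quintile
    port *= 1 + sum(rets[i] for i in top) / len(top)
    market *= 1 + sum(rets) / N_STOCKS

annualize = lambda growth: growth ** (12 / PERIODS) - 1
print(f"strategy {annualize(port):.2%}/yr vs market {annualize(market):.2%}/yr")
```

Real backtests of this kind must additionally handle the issues the abstract names: turnover costs, survival bias in the universe, and lags in data availability when forming the ranking.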