Solucient.com/eSachs

Load Test Plan

V 1.0

1. Test Plan Identifier

2. Introduction

Whenever new features, additions, or improvements are made in the development of the Solucient / eSachs.com online application(s), there needs to be a way to measure the impact those changes have on both the server environment and client browsing software. Potential impacts include better load handling, more efficient data retrieval, and improved download speeds to client browsers. Those are just a few of the measures that could be calculated; a full list of what is measured can be found in a later section of this document.

To determine comparative gains or losses most effectively, a structured, repeatable testing methodology is needed. The methods of the entire process should be documented, archived, and available for reference as a snapshot of a particular point in time.

The goal of this document is to lay down the definitions and procedures needed to implement this kind of benchmarking methodology in the eSachs staging (or production candidate) environment.

3. Constraints

The test environment and/or the automated load-testing tool impose the following constraints.

·  Solucient has a maximum of 100 Virtual User licenses for the e-Load automated load-testing tool.

·  Each Virtual User requires approximately 6 MB of RAM for execution. This severely limits the number of Virtual Users that can be hosted by a single machine (Pic. X; a sizing sketch follows this list).

·  The e-Load tool measures only the response time for a request to be completely sent to the browser. Browser rendering time is not captured.

·  Some of the testing procedures will be exercised over the WAN, which can cause performance fluctuations.

·  Local AA machines have insufficient RAM & CPU to act as controllers.

·  Limitations in the current version of the e-Test Suite software (5.12) require all test machines to run Internet Explorer 5.x, with Service Pack 6 installed on the NT platform or Service Pack 2 on Windows 2000.
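
As a rough sizing aid, the approximately 6 MB-per-Virtual-User figure above can be used to estimate how many Virtual Users each test machine can host. The Python sketch below illustrates the arithmetic only; the amount of RAM reserved for the operating system and other software is an assumption and should be adjusted to match observed usage.

    # Rough estimate of how many Virtual Users an agent machine can host,
    # based on the ~6 MB-per-VU figure cited in the constraints above.
    # The 64 MB reserved for the OS and other processes is an assumption.

    VU_FOOTPRINT_MB = 6      # approximate RAM per Virtual User (from constraints)
    OS_RESERVE_MB = 64       # assumed RAM kept free for the OS and other software

    def max_virtual_users(total_ram_mb):
        """Return an estimate of the Virtual Users a machine can host."""
        usable = max(total_ram_mb - OS_RESERVE_MB, 0)
        return usable // VU_FOOTPRINT_MB

    # Example: a 130 MB agent (AARSW01-05) vs. the 523 MB controller.
    for name, ram in [("AARSW01", 130), ("QATESTBOX-WKS", 523)]:
        print(name, max_virtual_users(ram))   # AARSW01 ~11 VUs, controller ~76 VUs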

4. Test Items

The following list summarizes the objectives of the load test. The detailed test scenarios of each test objective are provided in the Testing Scenarios section.

  1. Measure the effect of incrementally increasing load for users viewing reports. Users should be added at regular intervals, until either the resource limits of the test machines or the limit of 100 Virtual User licenses is reached, whichever comes first.
  2. Measure the effect of incrementally increasing load for users viewing charts. Users should be added at regular intervals, until either the resource limits of the test machines or the limit of 100 Virtual User licenses is reached, whichever comes first.
  3. Measure the effect of incrementally increasing load for users viewing maps. Users should be added at regular intervals, until either the resource limits of the test machines or the limit of 100 Virtual User licenses is reached, whichever comes first.
  4. Measure user response time for a single user viewing a map repeatedly over a 12-hour time period.
  5. Measure user response time under estimated peak user load at T1 line speeds (28.8K and 56K modem speeds will not be tested).
  6. Determine the system failure point for simultaneously requested maps. This test should begin with X users, ramping up to Y users in increments of Z. Users should be ramped up every X minutes/iterations (a ramp-up schedule sketch follows this list).
  7. Measure the length of time the system can operate under estimated average user load with less than X% failures.
  8. Measure the length of time the system can operate under estimated peak user load with less than X% failures.
  9. Measure the effect of all users logging on with the same user ID versus all users logging on with unique user IDs. Measurements should be taken with the system under estimated peak user load.
  10. Determine the system failure point for throughput volume. Maximum throughput should be measured independently for performing searches, viewing maps, viewing charts, and viewing reports.
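
For the incremental-load and failure-point tests above, the sketch below shows one way to lay out a stepped ramp-up schedule so the number of Virtual Users added per interval is explicit before the run. The starting load, step size, and interval are placeholder assumptions, not the X/Y/Z values to be filled in above; the 100-user ceiling reflects the Virtual User license limit from the Constraints section.

    # Sketch of a stepped ramp-up schedule for the incremental-load tests.
    # start_users, step, and interval_minutes are hypothetical placeholders;
    # the 100-user ceiling reflects the Virtual User license limit above.

    LICENSE_CEILING = 100

    def rampup_schedule(start_users, step, interval_minutes, ceiling=LICENSE_CEILING):
        """Yield (elapsed_minutes, active_virtual_users) pairs for the ramp-up."""
        users, elapsed = start_users, 0
        while users <= ceiling:
            yield elapsed, users
            users += step
            elapsed += interval_minutes

    # Example: start with 10 VUs and add 10 more every 5 minutes.
    for minute, vus in rampup_schedule(start_users=10, step=10, interval_minutes=5):
        print(f"t+{minute:3d} min: {vus} virtual users")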

All tests are to be conducted in the staging environment unless specified otherwise. The test scenarios will be executed using the e-Test Suite tools from Empirix Software, on the machines shown in Table X below.

Machine / Name / Processor Speed (MHz) / Memory (MB) / Operating System / Browser
1 / QATESTBOX-WKS (Controller) / 1500 / 523.344 / MS Win2000 SP2 / IE 5.5
2 / AARSW01 / 350 / 130 / MS Win2000 SP2 / IE 5.5
3 / AARSW02 / 350 / 130 / MS Win2000 SP2 / IE 5.5
4 / AARSW03 / 350 / 130 / MS Win2000 SP2 / IE 5.5
5 / AARSW04 / 350 / 130 / MS Win2000 SP2 / IE 5.5
6 / AARSW05 / 350 / 130 / MS Win2000 SP2 / IE 5.5
7 / KKOETZ-WKS / 761 / 129 / WinNT SP6 / IE 6.0
8 / ANATANOV-WKS / 598 / 261 / WinNT SP6 / IE 5.5
9 / LLANG-WKS / 598 / 130 / WinNT SP6 / IE 5.5
10 / GFELICIANO-WKS / 598 / 130 / WinNT SP6 / IE 5.5
11 / JMARKS-WKS / 347 / 130 / WinNT SP6 / IE 5.5

Table X. Test Machine Configurations

Note: all AA machines are part of the WGAASTRESS workgroup.

5. Definitions

Agent & Controller e-Load Stations – The agent/controller workstation relationship is a client/server relationship (more precisely, a slave/server relationship). The controller (server) workstation (ex: storres-wks) is where all of the e-Load testing options are set: scenario profiles, Virtual Users, and even which agent workstations to use. The agent (slave) workstations act as slave e-Load workstations to the controller and can perform their e-Load functions in the background of normal workstation activities. The user of an agent workstation has no control over any of the parameters of the load testing being performed (other than normal use of, and competition for, system cycles).

Caching Type – You have the choice of “First Time User” or “Repeat User”. “First Time User” produces more load on the server because each iteration of the visual script is performed as if the user were visiting the site for the first time (i.e., all pages and images are downloaded rather than served from a cache). If you choose “Repeat User”, pages and images are cached, so the user operates faster and with less load on the server.
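
A minimal illustration of the difference, assuming a simple in-memory cache and made-up page assets: a “First Time User” fetches every resource on every iteration, while a “Repeat User” skips resources already cached, which is why the former places more load on the server.

    # Simplified illustration of "First Time User" vs. "Repeat User" caching.
    # The resource list and fetch() are hypothetical stand-ins for real page assets.

    def fetch(url):
        print("GET", url)               # stands in for a real HTTP request

    def run_iteration(resources, cache, first_time_user):
        for url in resources:
            if first_time_user or url not in cache:
                fetch(url)              # request goes to the web server
                cache.add(url)
            # Repeat Users serve already-cached resources locally: no server hit.

    page_assets = ["/report.asp", "/img/logo.gif", "/styles.css"]
    cache = set()
    run_iteration(page_assets, cache, first_time_user=True)   # 3 server requests
    run_iteration(page_assets, cache, first_time_user=False)  # 0 server requests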

Connection Speed – The line speed that e-Load will simulate for the Virtual User’s Internet connection. You can simulate a modem connection of 28.8K or 56K, or the maximum speed at which your workstations are connected to the Internet (i.e., T1, DSL, etc.).
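
As a rough worked example of what the simulated line speed implies for transfer times (the page size is an assumed figure, not a measured eSachs page, and modem compression and protocol overhead are ignored):

    # Rough transfer-time comparison for the connection speeds e-Load can simulate.
    # The 150 KB page size is an assumed example, not a measured eSachs page.

    PAGE_KB = 150
    LINE_SPEEDS_KBPS = {"28.8K modem": 28.8, "56K modem": 56.0, "T1": 1544.0}

    for line, kbps in LINE_SPEEDS_KBPS.items():
        seconds = (PAGE_KB * 8) / kbps          # kilobits divided by kilobits/sec
        print(f"{line:12s}: ~{seconds:5.1f} s to transfer a {PAGE_KB} KB page")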

Delay Between Iterations – The amount of time (in seconds) to wait between iterations of Virtual User Runs.

Download Images – When selected, images on the Web site or application will be downloaded as part of virtual user playback.

Duration – After each initial request for a page by the Scenario Profile script, a period of time passes before the page is completely retrieved from the web server. This time is the “duration”. The duration measurement only shows in e-Tester for the individual scenario scripts; it is a parameter defined under the Address selections of every page. In e-Load you can set the maximum duration before a timeout under Options > Preferences > “Timeout in Seconds for VUs”.
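
A minimal sketch of what the duration measurement amounts to, assuming plain HTTP and using the staging web server address only as an illustration: the elapsed time from issuing a request until the response body has been completely retrieved, bounded by a timeout comparable to the “Timeout in Seconds for VUs” setting.

    # Sketch of a duration measurement: time from request until the full
    # response body has been retrieved, bounded by a timeout.
    # The URL below is illustrative only, not an address to rely on.

    import time
    import urllib.request

    def measure_duration(url, timeout_seconds=60):
        """Return the seconds taken to completely retrieve a page, or None on timeout."""
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=timeout_seconds) as response:
                response.read()                  # wait for the complete body
        except OSError:                          # timeout or connection error
            return None
        return time.monotonic() - start

    print(measure_duration("http://web0.hcia.com/"))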

Iteration – One complete cycle (run-through) of the script or scripts making up a Scenario Profile. If the scripts have run through to completion once, they are defined as having completed one iteration. This variable is used to define a point at which to stop the entire load test and/or add more Virtual Users (ramp up) to the load test.

Number of Virtual Users – Defined on a per-Scenario-Profile basis. Each Scenario Profile can have its own number of Virtual Users, independent of the other Scenario Profiles. This number is the maximum number of VUs that will be activated (per the ramp-up rules) for this particular Scenario Profile.

Number of V.U.s Reporting – Defined on a per-Scenario-Profile basis. This is the maximum number of VUs that will report data back to e-Load for use in e-Reporter for run-time (i.e., performance, errors, etc.) and in-session reports.

On Error View HTML – When selected, the virtual user display automatically shows the page with errors that was returned by the web server. This gives you a running view and record of all pages with HTML output that had errors.

Page Request – Each step in a Scenario Profile makes a page request from the web server by submitting a particular web address (URL), which may also contain custom parameters that define certain aspects of a dynamically created web page. In a visual testing script in e-Test, each line of the script is an individual “page request”.

Run Virtual Users in Separate Processes – When selected, all Virtual Users run as separate processes. Use this setting if your Web application uses cookies to manage session and other context information, or if it requires user authentication (for example, username and password). When cleared, all Virtual Users run as threads inside one process. Clear this setting if your Web application does not use any cookies; this is the most efficient way of running Virtual Users and consumes the smallest amount of system resources.
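
The distinction roughly mirrors threads versus processes in general-purpose code: separate processes give each Virtual User isolated state (such as cookies and authentication context) at a higher resource cost, while threads share one process and are cheaper. The sketch below is a general illustration of that trade-off, not a description of e-Load internals.

    # General illustration of the trade-off described above: processes isolate
    # per-user state (cookie stores, credentials); threads share one process
    # and are cheaper. This is an analogy, not e-Load's implementation.

    import threading
    import multiprocessing

    # A module-level cookie store: shared by all threads in one process,
    # but copied per process when virtual users run as separate processes.
    COOKIES = {}

    def virtual_user(user_id):
        COOKIES["session_id"] = user_id   # last thread wins when state is shared
        # ... scenario playback would go here ...

    def run_as_threads(count):
        workers = [threading.Thread(target=virtual_user, args=(i,)) for i in range(count)]
        for w in workers:
            w.start()
        for w in workers:
            w.join()
        print("threads share state:", COOKIES)          # one entry, overwritten

    def run_as_processes(count):
        workers = [multiprocessing.Process(target=virtual_user, args=(i,)) for i in range(count)]
        for w in workers:
            w.start()
        for w in workers:
            w.join()
        print("parent unaffected by child processes:", COOKIES)

    if __name__ == "__main__":    # guard required for multiprocessing on Windows
        run_as_processes(3)
        run_as_threads(3)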

Scenario Profile – Either a single e-Test visual script used to emulate a user’s operating characteristics, or a custom-defined user profile consisting of a combination of several individual e-Test visual scripts used to emulate a particular user’s profile of operating characteristics. The visual scripts are initially recorded in e-Test.

User Mode – You have the choice of “Thick Client” or “Thin Client”. Thick Client allows Virtual Users to run with full browser capabilities, including support for VBScript, JavaScript, and Java applets; it also consumes more RAM per Virtual User (approx. 1 MB per Virtual User). Thin Client allows Virtual Users to run with limited browser capabilities and does not support VBScript, JavaScript, or Java applets.

Virtual Users (Vusers, VUs) – A sub-process running on a controlling load-test workstation or an agent load-test workstation. The process emulates the properties of a software user as defined by the Scenario Profile. A Virtual User has many attributes defined in e-Load, such as the number of Virtual Users, the number of Virtual Users reporting, which workstation the Virtual User runs on (i.e., the controller or one of the agents), which browser to emulate, the delay time between iterations, the pacing of visual scripts, the type of user to simulate, image downloading, use of databanks, error handling, and setup of the VU display component, etc.
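
The attributes listed above amount to a per-profile configuration record. As an illustration only (the field names and defaults are assumptions, not e-Load’s actual schema), such a record might look like the sketch below.

    # Illustrative grouping of the virtual-user attributes listed above.
    # Field names and defaults are assumptions, not e-Load's actual schema.

    from dataclasses import dataclass

    @dataclass
    class VirtualUserProfile:
        scenario_profile: str            # which visual script(s) to play back
        virtual_users: int               # maximum VUs activated for this profile
        virtual_users_reporting: int     # maximum VUs sending data to e-Reporter
        workstation: str                 # controller or agent machine name
        browser_emulation: str = "IE 5.5"
        caching_type: str = "First Time User"    # or "Repeat User"
        user_mode: str = "Thick Client"          # or "Thin Client"
        connection_speed: str = "T1"             # "28.8K", "56K", or line speed
        delay_between_iterations_s: int = 0
        pacing: str = "Recorded"                 # "Recorded", "Random", or "None"
        download_images: bool = True
        separate_processes: bool = True          # needed for cookies / authentication

    profile = VirtualUserProfile("ViewReports", virtual_users=25,
                                 virtual_users_reporting=25, workstation="AARSW01")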

Virtual User Pacing – After a requested page is completely retrieved, a delay can occur before the next step in the Scenario Profile (visual script). This delay is the Virtual User pacing and can be defined as Recorded, Random, or None. When “Recorded” is chosen, e-Load uses the delay that was recorded when the original e-Tester script was created. Choosing “None” plays back the scripts at the fastest possible speed.
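
As a rough sketch of how the three pacing modes translate into a delay before the next page request (the bounds used for “Random” are an assumption; e-Load lets you configure them):

    # Sketch of how the three pacing modes translate into a pre-request delay.
    # The bounds used for "Random" are assumptions; e-Load lets you set them.

    import random

    def pacing_delay(mode, recorded_delay_s):
        if mode == "Recorded":
            return recorded_delay_s                      # delay captured by e-Tester
        if mode == "Random":
            return random.uniform(0, recorded_delay_s)   # assumed 0..recorded range
        return 0.0                                       # "None": fastest playback

    print(pacing_delay("Recorded", 4.0), pacing_delay("None", 4.0))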

V.U. Display Ready – When selected, this profile of Virtual Users is set up to be viewed immediately in the virtual user display, a separate browser-type window where you can view the actions of specific Virtual Users. If VU Display Ready is cleared and you choose to view the progress of one or more Virtual Users at run time, there will be a delay while the initial communication is established between the simulation servers and the virtual user display; pages start to appear in the virtual user display after that initial delay.

6. Testing Environment

(This section details the staging servers and network environment. Each server should be described in its own right, including hardware specs, OS version/service packs/hotfixes, and software versions. Also included should be what is running on the servers at the time of the testing. Not every little piece of software installed on each server needs to be documented, but anything that is running during testing should be documented, including such things as anti-virus software and so on.)

e-Sachs Staging Web Server

DELL PE 4350

Web0.hcia.com

157.199.248.24

Hostname: WEB0

RAM: --

NIC: --

OS: WindowsNT Server

Software:

·  IIS –

·  MS Scripting Run-Time Library –

·  SQL Server Driver –

·  Etc.

Staging SQL Server

Hostname: DB0

Dell PE 4350

db0.hcia.com

157.199.248.22

CPU: --

RAM: --

NIC: --

OS: WindowsNT Server

Software:

·  SQL Server-

·  Etc.

Staging OLAP Server

Hostname: OLAP0

Dell PE 6350

Olap0.hcia.com

157.199.248.20

CPU: --

RAM: --

NIC: --

OS: WindowsNT Server

Software:

·  SQL Server- Version 7.0, Service Pack 1

·  SQL Server OLAP Services – Version 1.0, Service Pack 1

·  Etc.

Staging Map Server

Hostname: MAP0

Dell PE 6350
map0.hcia.com

157.199.248.21

CPU: --

RAM: --

NIC: --

OS: WindowsNT Server 4.0, SP3

Software:

·  Etc.

Staging Applications Server

Hostname: APP0

Dell PE 4350
app0.hcia.com