Test Management Forum meeting @ Balls Brothers, Mincing Lane, London 29th Oct 2008
Notes from the Performance Group sessions
In Session A, Gordon McKeown of Facilita and Alan Gordon of SQS highlighted ways NOT to do Performance Testing (see slides)
In Session B (with a suitably Halloween-inspired title) - Back From The Grave: Your Worst Performance Testing Nightmares
What keeps you (performance testers) awake at night? The list started with contributions from Gordon and Alan and, with further contributions from the audience, included the following:
- Emerging problems
- green computing (power consumption, virtualisation, heat generation, auto closedown/startup)
- increasing complexity (web 2.0, cloud)
- user expectations
- Unrealistic expectations of management, users, etc.
- Overenthusiastic technicians
- Lack of education/understanding at management level
- This was compared with the education needed to understand functional testing 20 years ago
- Performance experts need to make what they do understandable
- Performance requirements
- How do we get them (and other non-functional requirements)?
- Are NF requirements simply requirements that are not well understood?
- Every functional requirement has a corresponding (and usually unstated) non-functional requirement
- Multi-user testing of functional requirements often overlooked (a small illustrative sketch follows this list)
- Agile/Test-driven development (does this ever include anything relating to performance?)
- Most plans assume successful testing
- Need for contingency plans for a failing performance test
- Performance testers should be trained in bedside manner (the patient may not survive)
- Performance testers often detect the need for change
- But are often not in a position to instigate change
- They need to have clear communication links to the change makers
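As a flavour of the multi-user point above, here is a minimal Python sketch of re-running an ordinary single-user functional check with several concurrent simulated users. The URL and user count are invented placeholders, not anything discussed at the meeting:

```python
# A minimal, hypothetical sketch: the same check a single-user functional test
# would make, issued by several concurrent simulated users.
from concurrent.futures import ThreadPoolExecutor
import urllib.request

URL = "http://example.com/"   # placeholder endpoint for the functional check
USERS = 10                    # number of simulated concurrent users (placeholder)

def functional_check(user_id: int) -> bool:
    # The single-user check, here issued concurrently by each simulated user.
    with urllib.request.urlopen(URL, timeout=10) as response:
        return response.status == 200

with ThreadPoolExecutor(max_workers=USERS) as pool:
    results = list(pool.map(functional_check, range(USERS)))

print(f"{sum(results)} of {USERS} concurrent checks passed")
```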
There was then an informal panel discussion. On the panel we had Gordon McKeown of Facilita, Alan Gordon of SQS, and an independent performance test manager, Graham Dwyer, who agreed to chair the session.
The point was made that successful performance testing projects need a wide range of skills but are frequently carried out by very small teams, often teams of only 1 or 2 people. So it is hard for them to be expert in all these fields, which include:
- Software & script development
- Configuration management
- Infrastructure management
- Data management
- Requirements management
- etc...
A recent blog post by Goranka Bjedov, Senior Test Engineer at Google, was mentioned as highlighting many common issues for performance testers in general and offering some good advice for similar projects (see this link:
There was also discussion about potential topics for future Performance Group sessions, and the panel welcomed suggestions for speakers and volunteers to facilitate sessions. The following topics were noted - let us know if we left anything out.
- Terminology for performance testing seemed to be unclear/unstandardised
- ISEB glossary definitions may be a start
- Boilerplates for performance testing
- Best practices
- Templates
- Bridging the gap between Non-functional Requirements and Functional Requirements
- Communication and organisational structure
- ownership of QA etc.
- SLAs
- Analysis and reporting of performance testing
- introduction to statistics for performance engineers (see the sketch after this list)
- Different tools for different tests
- open source tools
- proprietary tools
- how, why & when?
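As a flavour of the statistics topic, here is a minimal Python sketch of the kind of response-time summary a performance engineer might report, and of why a mean alone can mislead when a few slow requests are present. The timing figures are invented for illustration only:

```python
# A minimal, hypothetical sketch of basic response-time statistics;
# the measurements below are invented for illustration only.
import statistics

response_times_ms = [120, 135, 128, 410, 131, 140, 125, 980, 133, 127]

mean = statistics.mean(response_times_ms)
median = statistics.median(response_times_ms)
p90 = statistics.quantiles(response_times_ms, n=10)[-1]   # ~90th percentile
spread = statistics.stdev(response_times_ms)

# A couple of slow outliers pull the mean well above the median, which is
# why percentiles are usually reported alongside (or instead of) averages.
print(f"mean={mean:.0f}ms  median={median:.0f}ms  90th %ile={p90:.0f}ms  stdev={spread:.0f}ms")
```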
Graham Dwyer 31st October 2008