S/W FAQs
S.No | Title | Category | Views | Posted On
1 | SQA and testing frequently asked definitions | TESTING | 999 | 01/01/08
2 | Load testing interview questions | TESTING | 2547 | 01/01/08
3 | Performance Testing Considerations | TESTING | 525 | 01/01/08
4 | What is testing? | TESTING | 658 | 01/01/08
5 | Blackbox testing tips | TESTING | 4254 | 01/01/08
6 | Tester Tips | TESTING | 6589 | 01/01/08
7 | Interview with Brian Marick on How to do Good Test.. | TESTING | 254 | 01/01/08
8 | WEB Testing Interview Questions For software teste... | TESTING | 5846 | 02/02/08
9 | General interview questions | TESTING | 5554 | 02/02/08
10 | Latest Questions in Testing Definitions | TESTING | 5885 | 02/02/08
11 | Software Testing Interview Questions | TESTING | 556 | 02/02/08
12 | Interview Questions for Software Testers | TESTING | 658 | 02/02/08
13 | Testing Interview Questions | TESTING | 2135 | 02/02/08
14 | Testing Tools Interview Questions | TESTING | 245 | 02/02/08
15 | TESTING TOOLS INTERVIEW QUESTIONS-Part2 | TESTING | 546 | 02/02/08
16 | TESTING TOOLS INTERVIEW QUESTIONS-Part1 | TESTING | 879 | 02/02/08
17 | Fuzz testing | TESTING | 1245 | 02/02/08
18 | Defect Tracking & Formal Verification | TESTING | 471 | 02/02/08
19 | Test Cases, Suites, Scripts | TESTING | 501 | 02/02/08
20 | Compatibility Testing | TESTING | 2456 | 02/02/08
21 | System Testing & Regression Testing | TESTING | 4511 | 02/02/08
22 | Beta Testing & Product Testing | TESTING | 6548 | 02/02/08
23 | Installation Testing & Alpha Testing | TESTING | 235 | 02/02/08
24 | Stability Testing & Acceptance Testing | TESTING | 546 | 02/02/08
25 | Usability Testing | TESTING | 546 | 02/02/08
26 | Stress Testing & Security Testing | TESTING | 856 | 02/02/08
27 | Performance Testing | TESTING | 214 | 02/02/08
28 | Unit Testing & Integration Testing | TESTING | 568 | 02/02/08
29 | White Box & Black Box Testing | TESTING | 546 | 02/02/08
30 | Interview questions on WinRunner | TESTING | 125 | 03/02/08
31 | Testing Tools Interview Questions | TESTING | 658 | 03/02/08
32 | Testing Tools Interview Questions-2 | TESTING | 5488 | 03/02/08
33 | Testing Tools Interview Questions-3 | TESTING | 254 | 03/02/08
34 | Testing Tools Interview Questions-4 | TESTING | 987 | 03/02/08
35 | Testing Tools Interview Questions | TESTING | 2456 | 03/02/08
36 | Testing Tools Interview Questions | TESTING | 2145 | 03/02/08
37 | Software Testing 10 Rules-Bugs and fixes | TESTING | 985 | 03/02/08
38 | How to Write a Fully Effective Bug Report | TESTING | 357 | 03/02/08
39 | Testing Reviews--methodology and techniques | TESTING | 159 | 03/02/08
40 | Load and Performance Test Tools | TESTING | 658 | 03/02/08
41 | | TESTING | 856 | 03/02/08
42 | Debugging Strategies, Tips, and Gotchas | TESTING | 2145 | 03/02/08
43 | Web services programming tips and tricks: Stress t... | TESTING | 84754 | 03/02/08
44 | Web services programming tips and tricks: improve ... | TESTING | 2358 | 03/02/08
45 | WinRunner Interview Questions | TESTING | 3569 | 03/02/08
46 | LoadRunner Interview Questions | TESTING | 1245 | 03/02/08
47 | SilkTest Interview Question | TESTING | 845 | 03/02/08
48 | Software QA and Testing Frequently-Asked-Questions... | TESTING | 21 | 03/02/08
49 | Systematic Software Testing | TESTING | 254 | 03/02/08
50 | Software Testing-Introduction | TESTING | 2586 | 03/02/08
51 | Tips for Releasing Software for Customer Testing | TESTING | 358 | 03/02/08
52 | Software Regression Testing | TESTING | 951 | 03/02/08
53 | TestComplete 4 - Automate the Non-Automatable | TESTING | 32558 | 03/02/08
54 | webtest tools | TESTING | 245 | 03/02/08
55 | webtest tools | TESTING | 956 | 03/02/08
56 | Applying Patterns to Software Testing | TESTING | 845 | 03/02/08
57 | The Software Testing Automation Framework | TESTING | 326 | 03/02/08
58 | Testing Tools Interview Questions and Faqs-unanswe... | TESTING | 745 | 03/02/08
59 | Latest and unanswered Questions in Rational Robot ... | TESTING | 5125 | 03/02/08
60 | Buttons | TESTING | 648 | 03/02/08
61 | XPLANNER | TESTING | 213 | 03/02/08
62 | Testing Tools Interview Questions | TESTING | 9547 | 03/02/08
63 | Web services programming tips and tricks: | TESTING | 852 | 03/02/08

Performance Testing

In software engineering, performance testing is testing performed to determine how fast some aspect of a system performs under a particular workload.

Performance testing can serve different purposes. It can demonstrate that the system meets performance criteria. It can compare two systems to find which performs better. Or it can measure which parts of the system or workload cause the system to perform badly. In the diagnostic case, software engineers use tools such as profilers to measure which parts of a device or of the software contribute most to the poor performance, or to establish the throughput levels (and thresholds) at which acceptable response times can still be maintained.
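To make the diagnostic case concrete, a profiler simply reports where the time goes. The following is a minimal sketch using Python's built-in cProfile module; the "slow" code path (parse_records, slow_report) is invented purely for illustration:

# Diagnostic-case sketch: use Python's built-in profiler to see which
# functions consume the most time in a hypothetical slow code path.
import cProfile

def parse_records(n):
    # Stand-in for some expensive per-record work.
    return [str(i).rjust(10, "0") for i in range(n)]

def slow_report():
    rows = parse_records(200_000)
    return sorted(rows, reverse=True)

cProfile.run("slow_report()")   # prints per-function call counts and times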

In performance testing, it is often crucial (and often difficult to arrange) for the test conditions to be similar to the expected actual use.

Technology

Performance testing technology employs one or more PCs to act as injectors, each emulating the presence of a number of users and each running an automated sequence of interactions (recorded as a script, or as a series of scripts to emulate different types of user interaction) with the host whose performance is being tested. Usually, a separate PC acts as a test conductor, coordinating and gathering metrics from each of the injectors and collating performance data for reporting purposes. The usual sequence is to ramp up the load, starting with a small number of virtual users and increasing the number over a period to some maximum.
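A toy injector can illustrate the shape of this. The sketch below is not any particular tool's API: the target URL, the user counts and the ramp-up timings are all invented, and each thread stands in for one scripted virtual user.

# Minimal load-injector sketch: ramp up virtual users against a
# hypothetical host and record per-request response times.
import threading, time, urllib.request

TARGET = "http://host-under-test/login"   # hypothetical URL, for illustration
results = []                              # (timestamp, elapsed_seconds) pairs
results_lock = threading.Lock()

def virtual_user(stop_at):
    # One emulated user: repeat a scripted interaction until told to stop.
    while time.time() < stop_at:
        start = time.time()
        try:
            urllib.request.urlopen(TARGET, timeout=30).read()
        except OSError:
            pass                          # a real tool counts errors separately
        with results_lock:
            results.append((start, time.time() - start))

# Ramp-up: add 5 virtual users every 10 seconds, to a maximum of 50,
# and let the whole run last 300 seconds.
stop_at = time.time() + 300
threads = []
for _ in range(10):                       # 10 ramp steps
    for _ in range(5):                    # 5 new virtual users per step
        t = threading.Thread(target=virtual_user, args=(stop_at,))
        t.start()
        threads.append(t)
    time.sleep(10)
for t in threads:
    t.join()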

The test result shows how performance varies with the load, given as number of users versus response time. Various tools, including Compuware Corporation's QACenter Performance Edition, are available to perform such tests. Tools in this category usually execute a suite of tests that emulate real users against the system. Sometimes the results reveal oddities: for example, while the average response time might be acceptable, a few key transactions are outliers that take considerably longer to complete, something that might be caused by inefficient database queries and the like.
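Collating raw measurements into these headline figures is straightforward. Continuing the injector sketch above (and assuming results is non-empty), average, percentile and worst-case figures can be derived like this:

# Collate the measurements from the sketch above into the usual
# load-vs-response-time summary; 'results' holds (timestamp, elapsed) pairs.
import math

elapsed = sorted(r[1] for r in results)
average = sum(elapsed) / len(elapsed)
p95 = elapsed[math.ceil(0.95 * len(elapsed)) - 1]   # nearest-rank 95th percentile
worst = elapsed[-1]
print(f"avg {average:.2f}s, 95th percentile {p95:.2f}s, worst {worst:.2f}s")
# An acceptable average alongside a very large 'worst' value is exactly
# the outlier pattern described above, e.g. one slow database-backed transaction.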

Performance testing can be combined with stress testing, in order to see what happens when an acceptable load is exceeded: does the system crash? How long does it take to recover once a large load is reduced? Does it fail in a way that causes collateral damage?

Performance specifications

Performance testing is frequently not performed against a specification; i.e., no one will have expressed what the maximum acceptable response time for a given population of users is. However, performance testing is frequently used as part of the process of performance profile tuning. The idea is to identify the "weakest link": there is inevitably a part of the system which, if made to respond faster, will result in the overall system running faster. It is sometimes difficult to identify which part of the system represents this critical path, and some test tools come provided with (or can have add-ons that provide) instrumentation that runs on the server and reports transaction times, database access times, network overhead and so on, which can be analysed together with the raw performance statistics. Without such instrumentation, one might have to have someone crouched over Windows Task Manager at the server to see how much CPU load the performance tests are generating.

There is an apocryphal story of a company that spent a large amount of money optimising its software without having performed a proper analysis of the problem. It ended up rewriting the system's "idle loop", where it had found the system spent most of its time; but even having the most efficient idle loop in the world obviously didn't improve overall performance one iota!
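Server-side instrumentation of this kind can be as simple as timing each named transaction. The sketch below is a generic illustration, not any vendor's agent: the transaction name and the stand-in database query are hypothetical, and a real tool would ship the figures to the test conductor rather than print them.

# Hypothetical server-side instrumentation: time each named transaction
# so server-side figures can be analysed alongside the injector's numbers.
import functools, time

def timed(transaction_name):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                elapsed = time.perf_counter() - start
                # A real tool would report this centrally; here we just log it.
                print(f"{transaction_name}: {elapsed * 1000:.1f} ms")
        return wrapper
    return decorator

@timed("db.lookup_user")        # hypothetical transaction name
def lookup_user(user_id):
    time.sleep(0.05)            # stand-in for a database query
    return {"id": user_id}

lookup_user(42)                 # prints "db.lookup_user: 50.x ms"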

Performance testing almost invariably identifies that it is parts of the software (rather than hardware) that contribute most to delays in processing users’ requests.

Performance testing can be performed across the web, and even from different parts of the country, since the response times of the internet itself are known to vary regionally. It can also be done in-house, although routers would then need to be configured to introduce the lag that would typically occur on public networks.

It is always helpful to have a statement of the likely peak number of users that might be expected to use the system at peak times. If there can also be a statement of what constitutes the maximum allowable 95th-percentile response time, then an injector configuration could be used to test whether the proposed system meets that specification.
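As a sketch of what such a check amounts to (the 2.0-second limit and the sample figures here are invented), the nearest-rank 95th percentile of a run's sorted response times is simply compared against the agreed limit:

# Pass/fail check of a measured run against a 95th-percentile specification.
import math

def meets_spec(elapsed_sorted, limit_seconds, percentile=0.95):
    # elapsed_sorted: ascending list of response times from a test run.
    index = max(0, math.ceil(percentile * len(elapsed_sorted)) - 1)
    return elapsed_sorted[index] <= limit_seconds

print(meets_spec(sorted([0.4, 0.6, 0.7, 1.1, 1.8]), limit_seconds=2.0))  # True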

Tasks to undertake

Tasks to perform such a test would include:

* Analysis of the types of interaction that should be emulated, and production of the scripts to perform those emulations.

* Decision whether to use internal or external resources to perform the tests.

* Set-up of a configuration of injectors and a controller.

* Set-up of the test configuration (ideally hardware identical to the production platform), router configuration, a quiet network (so that results are not upset by other users) and deployment of server instrumentation.

* Running the tests, probably repeatedly, in order to see whether any unaccounted-for factor might affect the results.

* Analysing the results: either a simple pass/fail, or an investigation of the critical path and a recommendation of corrective action (a minimal harness along these lines is sketched after this list).
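To make the last two tasks concrete, here is a hypothetical harness: run_load_test is a stand-in that fabricates response times, and the 2-second 95th-percentile limit is an assumed specification, but the repeat-and-verdict shape is the point.

# Hypothetical harness: repeat the test several times and flag any run
# whose 95th-percentile response time breaks the assumed specification.
import math, random

def run_load_test():
    # Stand-in for one full injector run; returns sorted response times.
    return sorted(random.gauss(1.0, 0.3) for _ in range(1000))

LIMIT = 2.0                        # assumed 95th-percentile limit, in seconds
for run in range(1, 6):            # repeated runs, as the task list suggests
    elapsed = run_load_test()
    p95 = elapsed[math.ceil(0.95 * len(elapsed)) - 1]
    verdict = "pass" if p95 <= LIMIT else "FAIL - investigate critical path"
    print(f"run {run}: 95th percentile {p95:.2f}s -> {verdict}")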