S/W FAQs
S.No  Title  Category  Views  Posted On
1 SQA and testing frequently asked definitions TESTING 999 01/01/08
2 Load testing interview questions TESTING 2547 01/01/08
3 Performance Testing Considerations TESTING 525 01/01/08
4 What is testing? TESTING 658 01/01/08
5 Blackbox testing tips TESTING 4254 01/01/08
6 Tester Tips TESTING 6589 01/01/08
7 Interview with Brian Marick on How to do Good Test.. TESTING 254 01/01/08
8 WEB Testing Interview Questions For software teste... TESTING 5846 02/02/08
9 General interview questions TESTING 5554 02/02/08
10 Latest Questions in Testing Definitions TESTING 5885 02/02/08
11 Software Testing Interview Questions TESTING 556 02/02/08
12 Interview Questions for Software Testers TESTING 658 02/02/08
13 Testing Interview Questions TESTING 2135 02/02/08
14 Testing Tools Interview Questions TESTING 245 02/02/08
15 Testing Tools Interview Questions-Part 2 TESTING 546 02/02/08
16 Testing Tools Interview Questions-Part 1 TESTING 879 02/02/08
17 Fuzz testing TESTING 1245 02/02/08
18 Defect Tracking & Formal Verification TESTING 471 02/02/08
19 Test Cases, Suites, Scripts TESTING 501 02/02/08
20 Compatibility Testing TESTING 2456 02/02/08
21 System Testing & Regression Testing TESTING 4511 02/02/08
22 Beta Testing & Product Testing TESTING 6548 02/02/08
23 Installation Testing & Alpha Testing TESTING 235 02/02/08
24 Stability Testing & Acceptance Testing TESTING 546 02/02/08
25 Usability Testing TESTING 546 02/02/08
26 Stress Testing & Security Testing TESTING 856 02/02/08
27 Performance Testing TESTING 214 02/02/08
28 Unit Testing & Integration Testing TESTING 568 02/02/08
29 White Box & Black Box Testing TESTING 546 02/02/08
30 Interview questions on WinRunner TESTING 125 03/02/08
31 Testing Tools Interview Questions TESTING 658 03/02/08
32 Testing Tools Interview Questions-2 TESTING 5488 03/02/08
33 Testing Tools Interview Questions-3 TESTING 254 03/02/08
34 Testing Tools Interview Questions-4 TESTING 987 03/02/08
35 Testing Tools Interview Questions TESTING 2456 03/02/08
36 Testing Tools Interview Questions TESTING 2145 03/02/08
37 Software Testing 10 Rules-Bugs and fixes TESTING 985 03/02/08
38 How to Write a Fully Effective Bug Report TESTING 357 03/02/08
39 Testing Reviews--methodology and techniques TESTING 159 03/02/08
40 Load and Performance Test Tools TESTING 658 03/02/08
41 TESTING 856 03/02/08
42 Debugging Strategies, Tips, and Gotchas TESTING 2145 03/02/08
43 Web services programming tips and tricks: Stress t... TESTING 84754 03/02/08
44 Web services programming tips and tricks: improve ... TESTING 2358 03/02/08
45 WinRunner Interview Questions TESTING 3569 03/02/08
46 LoadRunner Interview Questions TESTING 1245 03/02/08
47 SilkTest Interview Question TESTING 845 03/02/08
48 Software QA and Testing Frequently-Asked-Questions... TESTING 21 03/02/08
49 Systematic Software Testing TESTING 254 03/02/08
50 Software Testing-Introduction TESTING 2586 03/02/08
51 Tips for Releasing Software for Customer Testing TESTING 358 03/02/08
52 Software Regression Testing TESTING 951 03/02/08
53 TestComplete 4 - Automate the Non-Automatable. TESTING 32558 03/02/08
54 webtest tools TESTING 245 03/02/08
55 webtest tools TESTING 956 03/02/08
56 Applying Patterns to Software Testing TESTING 845 03/02/08
57 The Software Testing Automation Framework TESTING 326 03/02/08
58 Testing Tools Interview Questions and Faqs-unanswe... TESTING 745 03/02/08
59 Latest and unanswered Questions in Rational Robot ... TESTING 5125 03/02/08
60 Buttons TESTING 648 03/02/08
61 XPLANNER TESTING 213 03/02/08
62 Testing Tools Interview Questions TESTING 9547 03/02/08
63 Web services programming tips and tricks: TESTING 852 03/02/08

Performance Testing Considerations

The Need For Performance Testing

Performance is a "must-have" feature. No matter how rich your product is functionally, if it fails to meet the performance expectations of your customer, the product will be branded a failure.

Application architectural design decisions may be greatly influenced by the importance placed by the customer on one or more specific requirements. Incorrect design decisions, made at the outset of a project as a result of invalid assumptions, may become impossible to remedy downstream. (Remember: What we 'ASSUME' can often make an '***' out of 'YOU' and 'ME'.)
Set Performance Testing Objectives

It is useful to begin performance testing by setting clear objectives. More often than not, your performance tests will seek to achieve one or more of the following objectives:
Identify system bottlenecks.
Verify current system capacity.
Verify scalability of the system.
Determine optimal hardware/software configuration for your product.

Dealing with the identification of system bottlenecks is a good place to start. The scalability and capacity of your system will often be directly constrained by a bottleneck, although identifying and removing a bottleneck often leads to the discovery of yet another bottleneck, so be prepared for the long haul.
Determine Customer Requirements Early

It is extremely important that you fully understand, as early as possible, your customer's intentions and requirements regarding software performance, i.e. the operating environment (both hardware and software) in which your product will be deployed and the manner in which it will be used.
To begin to identify your customer's requirements you must determine:
Transaction mix.
Usage patterns.
Data volumes.
Maximum allowable response times.
Minimum transaction throughput rates.

Note: Determining the above information is particularly important when a customer is seeking your advice on suitable hardware and software to purchase in order to best deploy your product. As there will undoubtedly be lead times for purchasing and configuring such items, your customer is likely to want this information at the earliest opportunity.
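To illustrate, the figures gathered above can be recorded in a simple machine-readable form so that they later drive test design and pass/fail checks. The following Python sketch is purely illustrative; all names and numbers are hypothetical, not taken from any real project:

    # Hypothetical record of customer performance requirements; the keys mirror
    # the checklist above (transaction mix, usage, data volumes, response times,
    # throughput). All figures are made up for illustration.
    requirements = {
        "transaction_mix": {              # relative frequency of each transaction
            "search": 0.60,
            "place_order": 0.25,
            "admin_report": 0.15,
        },
        "usage": {
            "normal_concurrent_users": 200,
            "peak_concurrent_users": 500,
        },
        "data_volumes": {
            "orders_year_1": 1_000_000,   # forecast row counts
            "orders_year_3": 5_000_000,
        },
        "max_response_time_s": {"search": 2.0, "place_order": 5.0},
        "min_throughput_tps": 50,
    }

    def meets_response_time(transaction: str, measured_s: float) -> bool:
        """Pass/fail check of a measured time against the agreed target."""
        return measured_s <= requirements["max_response_time_s"][transaction]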
Determine The Transaction Mix

The "transaction mix" that your product must cope with is determined by the number of functions your product implements and the way in which those functions are executed by users as part of the activities they each perform in relation to their individual roles. Try to identify key user groups and list the activities associated with each user role.

In addition to the activities performed by users in different roles, the transaction mix is also dependent upon the number of concurrent users and the frequency of their activities.
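As a rough sketch of how a load script might reproduce such a mix, the transaction names and weights below are hypothetical and would in practice be derived from the user roles and activity frequencies identified above:

    import random

    # Hypothetical transaction mix: transaction name -> relative weight derived
    # from user roles, their activities and how frequently they perform them.
    TRANSACTION_MIX = {
        "search": 60,
        "view_item": 25,
        "place_order": 10,
        "admin_report": 5,
    }

    def next_transaction() -> str:
        """Pick the next simulated user action in proportion to the agreed mix."""
        names = list(TRANSACTION_MIX)
        weights = list(TRANSACTION_MIX.values())
        return random.choices(names, weights=weights, k=1)[0]

    # Each virtual-user loop would call next_transaction() repeatedly and
    # dispatch to the corresponding request function.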
Determine Usage Patterns

To accurately simulate system usage it is important to understand the intended usage patterns for your product. By studying user roles and quantifying the frequency and concurrency of user activities it becomes possible to begin to predict user behaviour and usage patterns that can be simulated in the test environment. Your test cases must simulate real usage patterns to be meaningful.

Try to determine the different levels of usage occurring over time, such as normal usage and peak usage. From this information you can estimate double-peak usage levels and design appropriate system stress tests to push your product to limits that it might not normally reach.

Tip: The number of virtual user licenses required for your automated testing tools need not match the size of the intended user base. On most occasions an automated testing tool will out-perform an individual user, and real users' activities need not all run concurrently, so a smaller number of virtual users can simulate the load of a much larger number of 'real' users. A little effort with your calculator can help save on the cost of expensive virtual user licenses.
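For example, a quick Little's Law style calculation (all figures below are invented for illustration) shows how far fewer virtual users than real users may be needed to generate a given load:

    import math

    def virtual_users_needed(target_tps: float,
                             response_time_s: float,
                             think_time_s: float) -> int:
        """Concurrent virtual users needed to sustain target_tps transactions/sec.

        Little's Law: concurrency = throughput x time per iteration, where one
        iteration is a transaction plus any scripted think time.
        """
        return math.ceil(target_tps * (response_time_s + think_time_s))

    # 2,000 real users submitting roughly one transaction per minute is ~33 tps.
    # With a 1.5 s response time and 0.5 s of scripted think time, that load can
    # be generated by about 66 virtual users rather than 2,000 licenses.
    print(virtual_users_needed(target_tps=33, response_time_s=1.5, think_time_s=0.5))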
Data Volumes

If your application creates data then you need to consider the impact of increasing data volumes over time on application performance. You will need to forecast data volume sizes based on usage patterns and then create test data volumes as appropriate to simulate future data volume scenarios.

Creating or obtaining large data volumes can be a problem. Large volumes of test data can take considerable time to create and may introduce unexpected hardware requirements during development.
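One low-tech approach is to generate the data in bulk from a script and load it with your database's bulk-import facility; the sketch below assumes a simple, made-up 'orders' table layout:

    import csv
    import random

    def generate_orders_csv(path: str, rows: int) -> None:
        """Write a CSV of synthetic order rows suitable for bulk import."""
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["order_id", "customer_id", "amount"])
            for order_id in range(1, rows + 1):
                writer.writerow([order_id,
                                 random.randint(1, 50_000),           # random customer
                                 round(random.uniform(5.0, 500.0), 2)])

    # Forecast 'year 3' volume, generated once and then backed up so it can be
    # restored quickly between test cycles rather than regenerated each time.
    generate_orders_csv("orders_5m.csv", rows=5_000_000)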

Expect to have to repeat your test cycles many times. Automated testing tools are therefore essential along with a fast means of backing up and restoring your test data and environments.
Design Tests Carefully

Designing credible tests requires an intimate understanding of the system transaction mix, i.e. user activities and behaviours. Even armed with this knowledge, time and cost constraints will make it impossible to test every conceivable scenario, so tests need to be chosen carefully.

All tests must simulate real user activities under a variety of circumstances and provide sufficient data to allow meaningful graphs to be plotted. Unfortunately this is easier said than done, so expect to spend a reasonable amount of time designing tests and their associated pass/fail criteria.

Tip: Beware of false (variable) results caused by a hot data cache, or one-off costs caused by JIT (just-in-time) compilation. To overcome such issues you may wish to avoid cyclic use of primary key values, or allow an operational warm-up time before you start gathering metrics.
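A small sketch of how a test harness might apply such a warm-up period before recording timings; do_transaction is a placeholder for whatever request the test actually issues:

    import time

    def run_timed_test(do_transaction, warmup_iterations: int, measured_iterations: int):
        """Run a transaction repeatedly, discarding warm-up timings.

        The warm-up loop lets data caches fill and JIT compilation settle so the
        recorded timings reflect steady-state behaviour.
        """
        for _ in range(warmup_iterations):
            do_transaction()                          # results deliberately ignored

        timings = []
        for _ in range(measured_iterations):
            start = time.perf_counter()
            do_transaction()
            timings.append(time.perf_counter() - start)
        return timings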
Keep Tight Control Of Your Test Environment

For performance tests to be meaningful, your test environment must be kept under strict control, with no unauthorised changes (such as the installation of service packs or the tweaking of configuration settings) that might skew test results and lead to incorrect conclusions.

Take steps to isolate your test environment from sources of variance such as network traffic or scheduled tasks so that test results are repeatable. Only variations which you choose to make as part of your test strategy should be allowed. (Beware of memory leaks, which can prevent repeatable test results!)

Your test environment (hardware and software) should mimic the intended deployment environment of your customer as closely as possible. The manner in which you install and configure test software builds of your product should also mimic the way your customer intends to install and configure your product.
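One simple safeguard, sketched below with an illustrative file list, is to fingerprint the test environment's configuration before and after each run so that any unauthorised change is at least detected:

    import hashlib
    import platform

    def environment_fingerprint(config_files) -> str:
        """Hash the OS description plus the named configuration files.

        Record the value with every test run; if it differs between runs that
        were supposed to be identical, the environment has been altered.
        """
        digest = hashlib.sha256()
        digest.update(platform.platform().encode())
        for path in config_files:
            with open(path, "rb") as f:
                digest.update(f.read())
        return digest.hexdigest()

    # e.g. environment_fingerprint(["/etc/myapp/app.conf", "/etc/myapp/db.conf"])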
Collect Metrics As You Go

During performance testing, be sure to take and record precise measurements in a controlled environment. Treat each test as a controlled experiment. Don't just measure response times, the number of users or transaction throughput rates. Take note of system performance counters such as processor usage, memory usage, network traffic, disk input/output etc. as these can provide developers with valuable clues regarding the cause of a bottleneck or failure.
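A minimal sketch of sampling such counters alongside the test, assuming the third-party psutil library is available:

    import time
    import psutil  # third-party library, assumed to be installed

    def sample_counters() -> dict:
        """Snapshot a few system performance counters at one point in time."""
        return {
            "timestamp": time.time(),
            "cpu_percent": psutil.cpu_percent(interval=None),
            "memory_percent": psutil.virtual_memory().percent,
            "disk_io": psutil.disk_io_counters()._asdict(),
            "net_io": psutil.net_io_counters()._asdict(),
        }

    # Called on a timer for the duration of a test run, the resulting series can
    # be plotted alongside response times to help pinpoint which resource is the
    # bottleneck.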

Performance rarely degrades gracefully. More often than not performance degrades drastically when circumstances suddenly change. Under such circumstances, the more operational data you have at your fingertips, the sooner you are likely to diagnose problems which cause sudden performance degradation.

Performance counter information can also be used to derive the possible impact of vertical scaling on performance i.e. how upgrades to an individual computer such as adding additional memory, processors or faster disks might improve performance.
Conclusion

Performance testing requires a different mindset and skill set from functional testing, and is best started early in the development life cycle whenever possible.

Understanding customer requirements and expectations, as well as user activities and behaviours, is key to designing suitable tests.

Ensure tests represent realistic usage of the application.

Test environments must be carefully controlled to prevent unauthorised modifications which might falsify test results.

Automated test tools coupled with fast backup and restore mechanisms are essential due to the need to repeat tests many times.

System bottlenecks can rapidly become very technical in nature and consume considerable resources and effort to diagnose. Resolution may require considerable re-work and even re-design of your product.

Even after performing a significant number of tests and gathering a considerable amount of data and test results there is still a possibility that the wrong conclusions may be drawn by developers inexperienced in performance testing.

Seeking advice from specialist consultants can offer a cost effective means of designing tests, diagnosing bottlenecks, interpreting test results and resolving performance problems quickly.