Software FAQs

S.No | Title | Category | Views | Posted On
1 | SQA and testing frequently asked definitions | TESTING | 999 | 01/01/08
2 | Load testing interview questions | TESTING | 2547 | 01/01/08
3 | Performance Testing Considerations | TESTING | 525 | 01/01/08
4 | What is testing? | TESTING | 658 | 01/01/08
5 | Blackbox testing tips | TESTING | 4254 | 01/01/08
6 | Tester Tips | TESTING | 6589 | 01/01/08
7 | Interview with Brian Marick on How to do Good Test.. | TESTING | 254 | 01/01/08
8 | WEB Testing Interview Questions For software teste... | TESTING | 5846 | 02/02/08
9 | General interview questions | TESTING | 5554 | 02/02/08
10 | Latest Questions in Testing Definitions | TESTING | 5885 | 02/02/08
11 | Software Testing Interview Questions | TESTING | 556 | 02/02/08
12 | Interview Questions for Software Testers | TESTING | 658 | 02/02/08
13 | Testing Interview Questions | TESTING | 2135 | 02/02/08
14 | Testing Tools Interview Questions | TESTING | 245 | 02/02/08
15 | Testing Tools Interview Questions - Part 2 | TESTING | 546 | 02/02/08
16 | Testing Tools Interview Questions - Part 1 | TESTING | 879 | 02/02/08
17 | Fuzz testing | TESTING | 1245 | 02/02/08
18 | Defect Tracking & Formal Verification | TESTING | 471 | 02/02/08
19 | Test Cases, Suites, Scripts | TESTING | 501 | 02/02/08
20 | Compatibility Testing | TESTING | 2456 | 02/02/08
21 | System Testing & Regression Testing | TESTING | 4511 | 02/02/08
22 | Beta Testing & Product Testing | TESTING | 6548 | 02/02/08
23 | Installation Testing & Alpha Testing | TESTING | 235 | 02/02/08
24 | Stability Testing & Acceptance Testing | TESTING | 546 | 02/02/08
25 | Usability Testing | TESTING | 546 | 02/02/08
26 | Stress Testing & Security Testing | TESTING | 856 | 02/02/08
27 | Performance Testing | TESTING | 214 | 02/02/08
28 | Unit Testing & Integration Testing | TESTING | 568 | 02/02/08
29 | White Box & Black Box Testing | TESTING | 546 | 02/02/08
30 | Interview questions on WinRunner | TESTING | 125 | 03/02/08
31 | Testing Tools Interview Questions | TESTING | 658 | 03/02/08
32 | Testing Tools Interview Questions-2 | TESTING | 5488 | 03/02/08
33 | Testing Tools Interview Questions-3 | TESTING | 254 | 03/02/08
34 | Testing Tools Interview Questions-4 | TESTING | 987 | 03/02/08
35 | Testing Tools Interview Questions | TESTING | 2456 | 03/02/08
36 | Testing Tools Interview Questions | TESTING | 2145 | 03/02/08
37 | Software Testing 10 Rules-Bugs and fixes | TESTING | 985 | 03/02/08
38 | How to Write a Fully Effective Bug Report | TESTING | 357 | 03/02/08
39 | Testing Reviews--methodology and techniques | TESTING | 159 | 03/02/08
40 | Load and Performance Test Tools | TESTING | 658 | 03/02/08
41 | | TESTING | 856 | 03/02/08
42 | Debugging Strategies, Tips, and Gotchas | TESTING | 2145 | 03/02/08
43 | Web services programming tips and tricks: Stress t... | TESTING | 84754 | 03/02/08
44 | Web services programming tips and tricks: improve ... | TESTING | 2358 | 03/02/08
45 | WinRunner Interview Questions | TESTING | 3569 | 03/02/08
46 | LoadRunner Interview Questions | TESTING | 1245 | 03/02/08
47 | SilkTest Interview Questions | TESTING | 845 | 03/02/08
48 | Software QA and Testing Frequently-Asked-Questions... | TESTING | 21 | 03/02/08
49 | Systematic Software Testing | TESTING | 254 | 03/02/08
50 | Software Testing-Introduction | TESTING | 2586 | 03/02/08
51 | Tips for Releasing Software for Customer Testing | TESTING | 358 | 03/02/08
52 | Software Regression Testing | TESTING | 951 | 03/02/08
53 | TestComplete 4 - Automate the Non-Automatable | TESTING | 32558 | 03/02/08
54 | webtest tools | TESTING | 245 | 03/02/08
55 | webtest tools | TESTING | 956 | 03/02/08
56 | Applying Patterns to Software Testing | TESTING | 845 | 03/02/08
57 | The Software Testing Automation Framework | TESTING | 326 | 03/02/08
58 | Testing Tools Interview Questions and FAQs-unanswe... | TESTING | 745 | 03/02/08
59 | Latest and unanswered Questions in Rational Robot ... | TESTING | 5125 | 03/02/08
60 | Buttons | TESTING | 648 | 03/02/08
61 | XPLANNER | TESTING | 213 | 03/02/08
62 | Testing Tools Interview Questions | TESTING | 9547 | 03/02/08
63 | Web services programming tips and tricks: | TESTING | 852 | 03/02/08

Software Regression Testing

Attain A Known State And Stay There

It is always preferable to release software in a 'known' state rather than in an unknown state, which may contain surprises for both developers and users.

Making changes to software which is in a known state can, and often does, pose a serious threat to that known state. Every developer knows that even the smallest change can render software useless if the implications of the change were not properly understood or if the change was insufficiently tested during and after implementation.

Changes to software are generally unavoidable. Eventually the need arises to amend software, either to correct defects or to modify functionality, and those changes may be required at short notice. It is under these circumstances that the investment in automated regression testing tools can begin to pay back big time.
The Need For Automation

Without automated regression testing tools the emphasis remains on manual testing to re-affirm the 'known' state of software after changes. This may tie up a large number of expensive resources or simply prevent changes from being delivered successfully at short notice. Under such circumstances, pressure can lead to shipping software that has not been sufficiently re-tested after changes, sometimes with dire consequences.

To remain competitive, software developers must be able to implement changes to software quickly and reliably. Whilst no one doubts that changes can be made quickly, doubts about the reliability of the changes must be dispelled with proof. To support rapid change, testing must therefore be both thorough and quick, leaving little option but to automate the testing process.
Sow And Ye Shall Reap ... Eventually

The decision to use automated regression testing tools is not one to be taken lightly. Acquiring the tools, which may be expensive, is only the beginning. Expect to incur further costs for training and test script creation as well as for maintenance to keep the test scripts in synchronisation with your changing software.

For feature-rich software, the lead time needed to create test scripts that provide adequate coverage of both the software's key features and its code base should not be underestimated.

If adequate resource is not applied in the early stages of development, the test scripts may continually lag behind the software and thereby never offer full coverage, leaving the state of the software 'unknown'. It is therefore essential that your project schedule reflects both the resource and the effort required to create, maintain and execute regression tests that achieve full coverage of the code base.
Ensure Tests Provide Full Coverage

For software with a large code base, ensuring full coverage is no small undertaking, and you should therefore plan to deliver coverage in stages, perhaps choosing to address the most important or most commonly used features first.

Achieving full coverage requires two technical challenges to be overcome. Firstly, there is the challenge of identifying which lines of source code are executed during testing and which are not. Secondly, there is the problem of designing new tests to exercise the code paths that are not currently being reached.
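
The first challenge is usually met with a coverage measurement tool. The sketch below is a minimal example, assuming Python and the third-party coverage.py package; the "tests" directory name is hypothetical, and the same idea applies to coverage tools for other languages.

    # Minimal sketch: record which lines the existing regression suite executes.
    # Assumes the third-party coverage.py package; the "tests" directory is hypothetical.
    import unittest
    import coverage

    cov = coverage.Coverage()
    cov.start()

    # Run the existing regression suite while line execution is being recorded.
    suite = unittest.defaultTestLoader.discover("tests")
    unittest.TextTestRunner().run(suite)

    cov.stop()
    cov.save()

    # show_missing=True lists the line numbers that were never executed,
    # i.e. the code paths that still need new tests to reach full coverage.
    cov.report(show_missing=True)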

If full coverage of the code base is not achieved by testing then the software remains in an unknown state and thus carries an uncertain risk of failure.

Note: Unless source code contains suitable reference comments that allow code to be traced back to functional requirements, it can prove difficult to design new tests that ensure coverage of all code paths. Dead code can also be misleading and is best removed.
Design Applications With Testing In Mind

The purpose of regression testing is to exercise all code paths fully and to confirm that the software under test continues to function as expected. The tests thus serve as an 'insurance policy' to aid identification of unexpected status changes during software operation.

To achieve this goal, regression tests must contain regular check points where the current status can be compared with the expected status and any mismatch reported immediately. By taking regular snapshots of screen images, data and so on, comparisons of current versus expected status can easily be made and differences detected quickly.
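
As a sketch of what such a check point can look like, the fragment below compares a freshly captured snapshot against a stored baseline and reports any mismatch immediately. It assumes Python; the baselines directory and the capture_report() routine are hypothetical stand-ins for whatever produces the screen or data snapshot.

    # Minimal check point sketch: compare the current snapshot against the stored
    # baseline and fail immediately on any difference. Paths and names are hypothetical.
    import difflib
    from pathlib import Path

    def check_point(name, current_text):
        expected = (Path("baselines") / f"{name}.txt").read_text()
        if current_text != expected:
            diff = "\n".join(difflib.unified_diff(
                expected.splitlines(), current_text.splitlines(),
                fromfile="expected", tofile="current", lineterm=""))
            raise AssertionError(f"Check point '{name}' failed:\n{diff}")

    # Usage inside a regression test, where capture_report() is the hypothetical
    # routine that captures the current screen or data snapshot:
    # check_point("customer_report", capture_report())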

Because not every element of functionality can be linked to visual display elements, it helps if features within the software can be enabled during testing to create additional trace output which can be used in snapshot comparisons. (Debug builds created by conditional compilation switches enabling verbose trace information provide an excellent means of achieving this.)
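
Where conditional compilation is not available, a similar effect can be approximated at run time. The sketch below assumes Python and a hypothetical TEST_VERBOSE_TRACE environment variable that switches on detailed logging only while the regression tests run, so the extra trace lines can be captured and used in the snapshot comparisons.

    # Sketch of run-time verbose trace, a rough analogue of a debug build switch.
    # The TEST_VERBOSE_TRACE variable and logger name are hypothetical.
    import logging
    import os

    if os.environ.get("TEST_VERBOSE_TRACE") == "1":
        # Emit detailed trace output only when the regression tests request it.
        logging.basicConfig(level=logging.DEBUG, format="%(name)s: %(message)s")
    else:
        logging.basicConfig(level=logging.WARNING)

    logger = logging.getLogger("app.trace")
    logger.debug("recalculated order total: %s", 42)  # appears only in verbose test runs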

For some components it is possible to perform isolated testing using test harness applications, while database tools may provide facilities for extracting data, or details of data that has changed, which can then be used in making comparisons.
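
As an illustration of the test-harness idea, the sketch below (in Python, with hypothetical component and stub names) exercises a single component in isolation by substituting its external dependency with a stub.

    # Minimal test-harness sketch: exercise one component in isolation by
    # replacing its external dependency with a stub. All names are hypothetical.
    import unittest
    from unittest.mock import Mock

    def calculate_invoice(order_id, price_service):
        # Component under test: depends on an external price lookup service.
        unit_price = price_service.lookup(order_id)
        return round(unit_price * 1.2, 2)  # e.g. add 20% tax

    class InvoiceHarness(unittest.TestCase):
        def test_invoice_includes_tax(self):
            stub = Mock()
            stub.lookup.return_value = 10.00
            self.assertEqual(calculate_invoice("A-1", stub), 12.00)

    if __name__ == "__main__":
        unittest.main()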

The greater the number of check point comparisons, the more likely the regression tests are to find differences when they occur. Tests must be designed carefully so as to minimise unnecessary maintenance work on test scripts whenever software changes are made, yet retain their effectiveness at detecting differences.

Failure to detect a difference means that the impact of a change may pass unnoticed, leading to potential software failure downstream. Regression tests around software integration points in particular must be constructed to ensure that any impact on external systems is detected as early as possible.

Note: Comparisons must take into account (or preferably design out) differences between operational and baseline data occurring due to date/time-related fields and values.
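
One common way to 'design out' such differences is to mask date/time values on both sides before the comparison is made. The sketch below assumes Python and ISO-style timestamps; the pattern will need adapting to the date/time formats actually present in the data.

    # Sketch: mask timestamps so baseline comparisons ignore date/time noise.
    # Assumes ISO-style values such as "2008-02-03 14:05:06"; adjust as required.
    import re

    TIMESTAMP = re.compile(r"\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}")

    def normalise(text):
        return TIMESTAMP.sub("<TIMESTAMP>", text)

    # Both the operational data and the baseline are normalised before comparing.
    assert normalise("run at 2008-02-03 14:05:06 ok") == normalise("run at 2009-12-31 23:59:59 ok")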
Schedule With Testing In Mind

Just as coding tasks require careful planning and sequencing if they are to be implemented efficiently, so too do testing tasks.

Changes to functionality should be implemented in stages whenever possible, allowing regression tests to be kept in synchronisation with the software more easily and new baseline comparisons to be established.

New functionality added to software should always be matched by new regression tests, together with confirmation that new baseline comparisons have been created and that all new code paths are covered by the tests. Bug fixes, on the other hand, are likely to require modification of existing tests and baseline comparisons.
Conclusion

Successful integration of automated regression testing tools into the development process offers greater confidence that changes introduced into software at short notice will not go untested, nor cause unexpected consequences when the software is later shipped to users.

Automated testing enables thorough testing of software to be conducted both quickly and repeatedly. No longer are expensive resources tied up for long periods of time to repeat tedious manual tests.

The efforts of testers can instead focus on designing and implementing high quality test cases. Once designed and scripted, each test can be repeated quickly on demand in a fraction of the time taken to create the test case.

The benefits of automated regression testing tools, however, do not come for free and require considerable up-front effort before they can be realised. Without automation, the emphasis remains on tedious and slow manual testing to ensure that all changes are thoroughly re-tested before the software is shipped. If you need faster turn-around of changes, combined with higher confidence that your changes didn't 'break' anything, then perhaps automated regression testing can help you.