- Anything not covered in the regression series goes untested.
- Repeating the same tests means not looking for the bugs that can be found by other tests.
- Pesticide paradox.
- Low yield from automated regression tests.
- Maintenance of this standard list can be costly and distracting from the search for defects.
- The tests exist already (no need for new design or new implementation, but there will be a maintenance cost).
- Many regulators and process inspectors like them.
- Because we are investing in reuse, we can afford to take time to craft each test carefully, making it more likely to be powerful in future use.
- This is the dominant paradigm for automated testing. So it is relatively easy to justify and there are lots of commercial tools.
- Implementation of automation tools is often relatively quick and easy (though maintenance can be a nightmare).
- Good regression testing gives clients confidence that they can change the product (or product environment).
Regression: software changes
System design changes
Low-level design document changes
Change algorithm logic
Change component structure
Internal data types & names
Internal structures, such as class relationships
Control flow & data flow
Component interface changes
Protocol messages & formats
Technology or language changes
Types of system changes → types of product impacts
Affect design, coding & testing documents
Affect coding & tests
Affect associated components
Affect system architecture
Affect related component interactions
Affect test cases, test data, test scripts
Affect test specifications
Code change impacts
Affect other tests
Affect test documentation
Affect other documents
Why do we do regression testing?
In any application, new functionality can be added, so the application has to be tested to see whether the added functionality has affected the existing functionality or not. Here, instead of manually retesting all the existing functionality, the baseline scripts created for it can be rerun.
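The baseline idea above can be sketched as a minimal "golden results" regression check. This is an illustrative sketch only: the `discount_price` function, its inputs, and the recorded `BASELINE` values are made-up examples, not from any real product.

```python
def discount_price(price, pct):
    """Existing functionality that the regression suite protects."""
    return round(price * (1 - pct / 100), 2)

# Baseline results recorded on a known-good build (the "golden" outputs).
BASELINE = {"100.0|10": 90.0, "59.99|25": 44.99, "10.0|0": 10.0}

def current_results():
    # Rerun the same baseline inputs against the current build.
    cases = [(100.0, 10), (59.99, 25), (10.0, 0)]
    return {f"{p}|{d}": discount_price(p, d) for p, d in cases}

def run_regression(baseline):
    # Any mismatch against the baseline means a change broke old behavior.
    actual = current_results()
    diffs = {k: (v, actual[k]) for k, v in baseline.items() if actual[k] != v}
    return "pass" if not diffs else f"regressions: {diffs}"
```

Rerunning `run_regression(BASELINE)` after every change is the automated equivalent of retesting all existing functionality.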
Regression: types of changes
Requirements spec changes:
Add new functional features.
Change current functional features.
Delete existing functional features.
System architecture changes:
Change component interactions.
Add new components/subsystems.
Update existing components.
Delete existing components.
High-level design doc changes:
Change state-based behaviors.
Change component interfaces.
Change database design.
Change GUI design.
Change function design.
How do we do regression testing?
Various automation tools can be used to perform regression testing, such as WinRunner, Rational Robot and SilkTest.
Testing to see that system functions complete in an acceptable time frame under a simultaneous user load.
Ex: response time, bottlenecks (declining performance), benchmarks (upper limits)
Performance testing focuses
System process speed (max./min./average)
- system processes, tasks, transactions, responses
- data retrieval, data loading
System throughput (max./min./average)
- loads, messages, tasks, processes
- message/event, task/process
System utilization (max./min./average)
- network, server/client machines
System availability (component-level/system-level)
- system network, computer hardware/software
System reliability (component-level/system-level)
- system network, computer hardware/software
System scalability (component-level/system-level)
- load/speed/throughput boundary
- improvements on process speed, throughput
System success/failure rates for
- communications, transactions, connections
- call processing, recovery
Domain-specific/application-specific
- real-time report generation speed
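The speed and throughput focuses above can be sketched with a small measurement harness. Assumptions: `handle_request` is a stand-in for a real transaction, and the run counts are arbitrary; only the min/max/average and throughput arithmetic is the point.

```python
import time

def handle_request(n):
    # Stand-in for a real transaction; does a small amount of CPU work.
    return sum(i * i for i in range(n))

def measure(task, runs=50, arg=10_000):
    # Record per-call response times, then derive speed and throughput figures.
    times = []
    start = time.perf_counter()
    for _ in range(runs):
        t0 = time.perf_counter()
        task(arg)
        times.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start
    return {
        "min_s": min(times),
        "max_s": max(times),
        "avg_s": sum(times) / len(times),
        "throughput_per_s": runs / elapsed,  # completed transactions per second
    }
```

A real tool would additionally vary the load and sample utilization, but the max./min./average pattern is the same.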
Performance testing process
Understand system and identify performance requirements
Identify performance test objectives and focuses
Define performance test strategy
Define /select performance evaluation models
Define/select performance test criteria
Define and identify performance test metrics
Identify the needs of performance test tools
Define performance test environment
Write performance test plan
Develop performance test tools and support environment
Setup target system and performance test beds
Design performance test cases and test suite
Execute performance tests and collect data
Analyze performance data and report results
Performance testing tools
They are classified into:
- Message-based or table-based simulators
- State-based simulators
Model-based data generators
- Pattern-based data generators
- Random data generators
Performance data collectors & tracking tools
- Performance tracking tools
Performance evaluation & analysis tools
- Performance metric computation
- Model-based performance evaluation tools
- Sniffers, Microsoft Performance Monitor
- External third-party tools
Performance report generators
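The two data-generator styles in the classification above can be sketched as follows. The field names (`user_id`, `name`, `balance`) are invented for illustration; the contrast is that a pattern-based generator follows a fixed template while a random generator draws arbitrary values within type constraints.

```python
import random
import string

def random_data(n_rows, seed=0):
    # Random data generator: arbitrary values within type/range constraints.
    rng = random.Random(seed)  # seeded so test runs are repeatable
    return [
        {"user_id": rng.randint(1, 10_000),
         "name": "".join(rng.choice(string.ascii_lowercase) for _ in range(8)),
         "balance": round(rng.uniform(0, 1000), 2)}
        for _ in range(n_rows)
    ]

def pattern_data(n_rows):
    # Pattern-based generator: every value follows a fixed template/sequence.
    return [{"user_id": i, "name": f"user{i:04d}", "balance": 100.0 * i}
            for i in range(1, n_rows + 1)]
```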
What is performance testing?
- Refers to test activities that check system performance
- To confirm and validate the specified system performance requirements
- To check the current product capacity, to answer questions from customers and marketing people
- To identify performance issues and performance degradation in a given system
It uses a well-defined approach to study, analyze, and measure the performance of a given system.
Tasks & scope
Collect system performance data
Define system performance metrics
Model system performance
Measure, analyze, estimate system performance
Present and report system performance
Understand product capacity
Discover system performance issues
Measure and evaluate system performance
Estimate and predict system performance
Well-defined performance metrics
Well-defined performance evaluation models
Performance evaluation tools and supporting environment
Performance Evaluation Metrics
- Call request process time
- Event interaction latency
Throughput metrics (component & system level)
- Call processing throughput
- Call load throughput rate
Availability metrics (component & system level)
Reliability metrics (component & system level)
Commonly used Performance Metrics
- Functional or process speed metrics
- User response time metric
- Communication metric
- Transaction speed metric
- Latency metric
Performance metrics for portal V.5. products
- Call process time metric
- Call process speed metric
- Event latency metric
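The metric families above reduce to simple arithmetic over collected samples. A minimal sketch, assuming call outcomes are logged as booleans, uptime/downtime as seconds, and latencies as millisecond samples (all shapes invented for illustration):

```python
def success_rate(outcomes):
    # outcomes: list of True (call succeeded) / False (call failed).
    return sum(outcomes) / len(outcomes)

def availability(up_seconds, down_seconds):
    # Fraction of the observation window the component could serve calls.
    return up_seconds / (up_seconds + down_seconds)

def latency_stats(latencies_ms):
    # Event-latency metric: min / max / average over the collected samples.
    s = sorted(latencies_ms)
    return {"min": s[0], "max": s[-1], "avg": sum(s) / len(s)}
```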
PERFORMANCE EVALUATION - MODELS
A well-defined formal model that depicts different aspects of a system's performance.
Why do we need performance evaluation models?
- To present the system performance properties
- To provide a guideline for engineers to find the strategy on performance evaluation
- To set up a foundation to define performance metrics
- To identify the needs of the target performance environment.
Performance Evaluation Approaches
Performance testing (during production)
- Measure and analyze the system performance based on performance test data and results
Performance simulation (pre-production)
- Study and estimate system performance using a simulation approach
Performance measurement at the customer site (post-production)
- Measure and evaluate system performance during system operations
Difference between regression automation tools & performance automation tools
Regression test tools capture tests and play them back at a later time; the capture-and-playback feature is fundamental to regression testing.
Performance test tools determine the load a server can handle. They must have features to simulate many users from one machine, to schedule and synchronize the different users, and to measure the network load under different numbers of simulated users.
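The "many users from one machine" feature can be sketched with threads, each playing one virtual user. The `time.sleep` call is a stand-in for a real server round trip, and the user/request counts are arbitrary; the point is starting the users, synchronizing on their completion, and aggregating response times.

```python
import threading
import time

def simulated_user(results, user_id, n_requests=5):
    # Each thread plays one virtual user issuing requests in a loop.
    for _ in range(n_requests):
        t0 = time.perf_counter()
        time.sleep(0.001)  # stand-in for a real server round trip
        results.append((user_id, time.perf_counter() - t0))

def run_load(n_users=10):
    results = []  # list.append is atomic in CPython, safe across threads
    threads = [threading.Thread(target=simulated_user, args=(results, u))
               for u in range(n_users)]
    for t in threads:
        t.start()
    for t in threads:   # synchronize: wait for all virtual users to finish
        t.join()
    times = [rt for _, rt in results]
    return {"requests": len(results), "avg_response_s": sum(times) / len(times)}
```

Commercial tools add scheduling, ramp-up profiles, and network-load measurement on top of this basic pattern.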
Testing the functional requirements of an application
Connectivity among components of an application
Ex: the properties and methods of an object are working correctly
Test engineers should perform this type of testing