
System Testing in Software Verification and Validation Process
Explore the importance of system testing in the software verification and validation process, focusing on functional and non-functional aspects at different testing levels. Understand the techniques and criteria involved in ensuring that an integrated system meets its specified requirements.
Presentation Transcript
Software Verification and Validation Lecture No. 9
Module 51: System Testing
System Testing
Assumptions:
- Unit-level testing has been performed
- Integration-level testing has been performed
- Therefore, individual modules are working correctly
System-level focus:
- Functionality that spans multiple modules
System Testing
System-level focus:
- External interfaces
- Non-functional aspects, e.g., security, recovery, performance
- Operational and user business process requirements
System Testing
[Slide figure: the testing levels shown side by side, each with its techniques and exit criteria. Unit testing uses CFG- and DFG-based techniques; integration testing uses MM-path and state-machine techniques; system testing uses specification-based and model-based techniques. Each level has its own exit criteria (specific coverage criteria, with confidence criteria added at one level), and "techniques on the boundary" connect adjacent levels. The figure also shows the roles involved: development manager, quality assurance, project manager, configuration manager, and the pairs Tester 1..n / Developer 1..n.]
System Testing
The process of testing an integrated hardware/software system to verify that the system meets its specified requirements.
Verification: confirmation by examination and provision of objective evidence that specified requirements have been fulfilled.
Definitions taken from IEEE and ...
System Testing
More definitions:
- Testing to confirm that all code modules work as specified, and that the system as a whole performs adequately on the platform on which it will be deployed.
- Testing conducted on a complete, integrated system to evaluate compliance with specified requirements.
Module 52: System Testing Aspects
System Testing Aspects
Functional aspects:
- Business processes
- System goals
Non-functional aspects:
- Load, stress, volume
- Performance, security, usability
- Storage, installability, documentation, recovery
There could be specific functional as well as non-functional requirements.
System Testing Aspects
Functional aspects testing is based on:
- Requirements / specifications
- System models: context diagrams, ERDs
- Design documents
Our focus while deriving test cases:
- Data
- Action
- Device
- Event
System Testing Aspects
Non-functional aspects testing:
- We identify non-functional user requirements from the specifications
- User-defined non-functional requirements
- Local regulations, e.g., each computer in the UK must display a warning for user awareness that system usage could be harmful under certain conditions (e.g., disconnect power before opening)
- Standard SOPs.. END
Module 53: System Testing Non-functional aspects
System Testing Non-functional Aspects
We study a couple of system testing techniques, considering non-functional testing methods first.
We then move on to system testing considering functional aspects.
System Testing Non-functional Aspects
Load Testing:
- The system is subjected to a statistically calculated load of transactions, processing, parallel connections, etc.
- Test case format: a setup mimicking a real-life boundary situation
- Example tool for load testing: JMeter
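As a language-level illustration only (a real load test would use a tool such as the JMeter named above), the following sketch drives a stand-in transaction function with many parallel connections and reports throughput. The function process_transaction and the load figures are hypothetical placeholders, not part of the lecture.

```python
# Minimal load-test sketch: fire a calculated number of transactions over
# many parallel connections and report throughput.
import time
from concurrent.futures import ThreadPoolExecutor

def process_transaction(txn_id: int) -> bool:
    """Placeholder for a single transaction against the system under test."""
    time.sleep(0.01)          # simulated processing latency
    return True

def run_load(transactions: int = 500, parallel_connections: int = 50) -> None:
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=parallel_connections) as pool:
        results = list(pool.map(process_transaction, range(transactions)))
    elapsed = time.perf_counter() - start
    print(f"{sum(results)}/{transactions} transactions succeeded "
          f"in {elapsed:.2f} s ({transactions / elapsed:.1f} txn/s)")

if __name__ == "__main__":
    run_load()
```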
System Testing Non-functional Aspects
Stress Testing:
- A form of load testing in which resources are denied
- Finds the boundary conditions at which the system would crash
- Finds situations in which system usage would become harmful
- We test the system's behaviour when resources are lacking
System Testing Non-functional Aspects
Stress Testing (contd.):
- Test case format: a setup mimicking a real-life boundary situation
- Bugs found are not (normally) repaired but reported to the end user for avoidance.
System Testing Non-functional Aspects
Performance Testing:
We study the performance requirements, e.g.,
- Response time: worst, best, and average case time to complete a specified set of operations, e.g., transactions per second
- Memory usage (wastage)
- Handling of extraordinary situations
Test case format: a setup or simulation
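A minimal timing harness for the response-time figures mentioned above (worst, best, and average case) could look as follows; perform_operation is an assumed placeholder for whatever operation the performance requirement actually names.

```python
# Sketch of a response-time measurement harness for performance testing.
import statistics
import time

def perform_operation() -> None:
    time.sleep(0.005)  # placeholder for the real operation under test

def measure_response_times(runs: int = 100) -> None:
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        perform_operation()
        samples.append(time.perf_counter() - start)
    print(f"best   : {min(samples) * 1000:.2f} ms")
    print(f"worst  : {max(samples) * 1000:.2f} ms")
    print(f"average: {statistics.mean(samples) * 1000:.2f} ms")

if __name__ == "__main__":
    measure_response_times()
```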
System Testing Non-functional Aspects
Volume Testing:
We test whether the system is able to handle the expected volume of data, e.g.,
- Backend systems, e.g., PRAL
- Affiliated resources, e.g., Xerox printers
- A Word document having 200,000 pages
We need to measure and report to the user the system's boundaries with respect to the volume or capacity of its processing.
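One possible way to "measure and report the system's boundaries" is to grow the input volume until the system fails and report the last volume it handled. The handle_volume wrapper and its cut-off below are invented for the sketch, not taken from the lecture.

```python
# Sketch of a volume test: feed the system increasing data volumes until it
# fails, then report the observed capacity boundary to the user.
def handle_volume(records: int) -> bool:
    """Return True if the system processed `records` records successfully."""
    return records <= 1_000_000   # placeholder behaviour for illustration

def find_volume_boundary(start: int = 1_000, factor: int = 10) -> int:
    volume = start
    while handle_volume(volume):
        volume *= factor
    return volume // factor       # last volume the system handled

if __name__ == "__main__":
    print(f"Largest volume handled: {find_volume_boundary():,} records")
```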
System Testing Non-functional Aspects
Security Testing:
We want to see that use cases are allowed and misuse cases are not allowed.
Aimed at breaking the system.
Test case format:
- Negative intent or penetration
- Specific situations: SQL injections, network security features
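A small, self-contained illustration of the use-case/misuse-case idea for SQL injection: the legitimate login must succeed and the classic injection string must be rejected. The in-memory SQLite table, schema, and credentials are invented for the example.

```python
# Misuse-case sketch for security testing: an SQL-injection string must NOT
# log the attacker in.
import sqlite3

def login(conn: sqlite3.Connection, user: str, password: str) -> bool:
    # Parameterized query: user input is never spliced into the SQL text.
    row = conn.execute(
        "SELECT 1 FROM users WHERE name = ? AND password = ?",
        (user, password),
    ).fetchone()
    return row is not None

def test_sql_injection_is_rejected() -> None:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")
    # Use case: legitimate credentials are allowed.
    assert login(conn, "alice", "s3cret")
    # Misuse case: the injection attempt is not allowed.
    assert not login(conn, "alice", "' OR '1'='1")

if __name__ == "__main__":
    test_sql_injection_is_rejected()
    print("use case allowed, misuse case rejected")
```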
System Testing Non-functional Aspects
GUI Testing:
Verifying that HCI principles are properly followed and that documented and de facto standards are met.
Uses:
- Scenarios
- Questionnaires
- User activity logs
- Inspections
Examples of test targets: the lower part of each website; every Windows OS based application.
System Testing Non-functional Aspects
Storage testing, installability testing, documentation testing, recovery testing:
We follow what was promised to the client, what is required of such systems, and what the regulatory requirements are.. END
Module 54: System Testing Functional Aspects
System Testing Functional Aspects
Functional aspects are represented as:
- User stories
- Use cases and their descriptions
- Formal specifications
Functional aspects are part of the SRS document.
We consider them for the functional testing part of system testing.
System Testing Functional Aspects
The test phase must ensure that the final software system covers these functional requirements.
There are two widely used techniques for extracting test cases from use cases:
- Scenario analysis, based on scenario identification
- Test data analysis, based on the category-partition method
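To illustrate the category-partition idea mentioned above: each input is split into categories with a small set of choices, and their combinations form candidate test frames. The categories and choices below are invented for illustration, not taken from the lecture.

```python
# Sketch of category-partition test data analysis: enumerate combinations of
# choices as candidate test frames.
from itertools import product

categories = {
    "door":         ["closed", "open"],
    "food":         ["present", "absent"],
    "cooking time": ["zero", "valid", "maximum"],
}

def test_frames(cats: dict[str, list[str]]) -> list[dict[str, str]]:
    names = list(cats)
    return [dict(zip(names, combo)) for combo in product(*cats.values())]

if __name__ == "__main__":
    for frame in test_frames(categories):
        print(frame)
```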
System Testing Functional Aspects We need mechanisms to extract test cases from functional aspects presented in SRS document We need to report coverage so that we may prove sufficiency of testing. We discuss a couple of such techniques.. END
Module 55: Use Case Analysis Based System Testing
Use Case Analysis based System Testing
Use cases are a widely used technique for defining functional requirements.
Use case scenario:
In the case of a business scenario, the scenario is written from the user's view of the system; it reflects a business interaction or a business process.
Use Case Analysis based System Testing
In the case of extracting scenarios from use-case diagrams, a scenario is an instance of a use case: the execution view of a specific user, i.e., how a user would execute the use case in a specific way.
Use Case Analysis based System Testing
Deriving test cases from use cases. Steps:
1. Identify the use case scenarios.
2. For each scenario, identify one or more test cases.
3. For each test case, identify the conditions that will cause it to execute.
4. Complete the test case by adding data values.
Use Case Analysis based System Testing
Step 1: Use a simple matrix that can be implemented in a spreadsheet, database, or test management tool. Number the scenarios and define the combinations of basic and alternative flows that lead to them. Many scenarios are possible for one use case.
Use Case Analysis based System Testing
Step 1 (contd.): Not all scenarios may be documented; use an iterative process. Not all documented scenarios may be tested. Use cases may be at a level that is insufficient for testing. The team's review process may discover additional scenarios.
Use Case Analysis based System Testing
Step 2: We identify the parameters of a test case:
- Conditions
- Input (data values)
- Expected result
- Actual result
Use Case Analysis based System Testing
Step 3: For each test case, identify the conditions that will cause it to execute. Use a matrix with a column for each condition, and for each condition state whether it is:
- Valid (V): must be true for the basic flow to execute
- Invalid (I): will invoke an alternative flow
- Not applicable (N/A) to the test case
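A minimal sketch of such a Step-3 condition matrix in code, using condition names that anticipate the microwave-oven example of Module 56; the test-case identifiers and conditions are illustrative assumptions, not part of the lecture.

```python
# Step-3 condition matrix: for each test case, every condition is marked
# Valid (V), Invalid (I), or not applicable (N/A).
CONDITIONS = ["door closed", "food present", "cooking time > 0"]

condition_matrix = {
    # test case id: state of each condition, in CONDITIONS order
    "TC-1 basic flow":        ["V", "V", "V"],
    "TC-2 door open":         ["I", "N/A", "N/A"],
    "TC-3 no food":           ["V", "I", "N/A"],
    "TC-4 zero cooking time": ["V", "V", "I"],
}

def flows_exercised(matrix: dict[str, list[str]]) -> None:
    # Only an all-Valid row drives the basic flow; any Invalid condition
    # invokes an alternative flow.
    for test_case, states in matrix.items():
        flow = "basic flow" if all(s == "V" for s in states) else "alternative flow"
        print(f"{test_case:<25} -> {flow}")

if __name__ == "__main__":
    flows_exercised(condition_matrix)
```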
Use Case Analysis based System Testing
Step 4: Design real input data values that will make these conditions valid or invalid and hence cause the scenarios to happen.
Options: look at the use case constructs and branches; consider category partitioning and boundary value analysis.
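For Step 4, boundary value analysis can be sketched as follows; the valid cooking-time range of 1 to 5999 seconds is an assumed example, not taken from the lecture.

```python
# Step-4 sketch: derive concrete data values with boundary value analysis.
def boundary_values(low: int, high: int) -> list[int]:
    """Classic boundary value selection around a valid range [low, high]."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

if __name__ == "__main__":
    for value in boundary_values(1, 5999):
        expected = "accepted" if 1 <= value <= 5999 else "rejected"
        print(f"cooking time {value:>5} s -> expected {expected}")
```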
Use Case Analysis based System Testing
Coverage analysis: model-based coverage, i.e., all MCAs (main courses of action) covered, all ACAs (alternative courses of action) covered.
Question: How can we relate or equate this to actual code coverage? Do we need to make any assumptions?
Module 56: Use case analysis - example
Use case analysis - example
We take as an example the control system of a microwave oven.
We consider its specifications and extract test cases from them.
Use case analysis - example
[Use case diagram: the Cook Food use case of the Microwave Oven System, with Cook as the initiating actor and Timer as a supporting actor.]
Reference: Gomaa, H., Designing Software Product Lines with UML: From Use Cases to Pattern-Based Software Architectures, Addison-Wesley, Reading, 2004.
Use case analysis - example
Brief description: This use case describes the user interaction and operation of a microwave oven. Cook invokes this use case in order to cook food in the microwave.
Added value: Food is cooked.
Scope: A microwave oven.
Primary actor: Cook.
Supporting actor: Timer.
Preconditions: The microwave oven is waiting to be used.
Use case analysis - example
Main Course of Action:
1. Cook opens the microwave oven door, puts food into the oven, and then closes the door.
2. Cook specifies a cooking time.
3. System displays the entered cooking time to Cook.
4. Cook starts the system.
5. System cooks the food and continuously displays the remaining cooking time.
6. Timer indicates that the specified cooking time has been reached and notifies the system.
7. System stops cooking and displays a visual and audio signal to indicate that cooking is completed.
8. Cook opens the door, removes the food, then closes the door.
9. System resets the display.
Use case analysis - example
Alternative flows:
1a. If Cook does not close the door before starting the system (step 4), the system will not start.
4a. If Cook starts the system without placing food inside, the system will not start.
4b. If Cook enters a cooking time equal to zero, the system will not start.
5a. If Cook opens the door during cooking, the system will stop cooking. Cook can either close the door and restart the system (continue at step 5), or cancel cooking.
5b. Cook cancels cooking. The system stops cooking. Cook may start the system and resume at step 5. Alternatively, Cook may reset the microwave to its initial state (cancel the timer and clear the displays).
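To make the scenario-based test cases in the following slides concrete, here is an assumed, minimal model of the oven's control logic implied by the flows above (1a, 4a, and 4b block starting; 5a stops cooking when the door opens; 5b is cancel). It is a sketch for illustration, not the lecture's own design.

```python
# Assumed model of the microwave-oven control logic described by the use case.
class MicrowaveOven:
    def __init__(self) -> None:
        self.door_open = False
        self.food_present = False
        self.cooking_time = 0
        self.cooking = False

    def open_door(self) -> None:
        self.door_open = True
        if self.cooking:                  # alternative flow 5a
            self.cooking = False

    def close_door(self) -> None:
        self.door_open = False

    def place_food(self) -> None:
        self.food_present = True

    def set_time(self, seconds: int) -> None:
        self.cooking_time = seconds

    def start(self) -> bool:
        # Alternative flows 1a, 4a, 4b: the system will not start.
        if self.door_open or not self.food_present or self.cooking_time == 0:
            return False
        self.cooking = True               # main course of action, step 5
        return True

    def cancel(self) -> None:             # alternative flow 5b
        self.cooking = False

    def timer_expired(self) -> None:      # main course of action, steps 6-7
        self.cooking = False
        self.cooking_time = 0
```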
Use case analysis - example
Test case generation:
1. For each use case, generate a full set of use-case scenarios.
2. For each scenario, identify at least one test case and the conditions that will make it execute.
3. For each test case, identify the data values with which to test.
Use case analysis - example
Step 1: Read the use-case textual description, identify each combination of main and alternative flows that forms a scenario, and create a scenario matrix.

U/C ID | S ID | Scenario Description                 | Starting Flow | Target Flow
1      | 1-1  | Scenario 1: Successful Cooking       | MCA           | MCA
1      | 1-2  | Scenario 2: Door Open                | MCA           | ACA1
1      | 1-3  | Scenario 3: No Food                  | MCA           | ACA2
1      | 1-4  | Scenario 4: Door Open during Cooking | MCA           | ACA3
1      | 1-5  | Scenario 5: Cooking Cancelled        | MCA           | ACA4
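One scenario-based test case per row of the matrix might look as follows, exercising the assumed MicrowaveOven sketch shown earlier in this module (assumed to be in scope); the comments record the flows each test covers (MCA = main course of action, ACA = alternative course of action).

```python
# Scenario-based test cases derived from the scenario matrix above.
def new_loaded_oven() -> "MicrowaveOven":
    oven = MicrowaveOven()
    oven.open_door()
    oven.place_food()
    oven.close_door()
    return oven

def test_1_1_successful_cooking() -> None:           # MCA -> MCA
    oven = new_loaded_oven()
    oven.set_time(30)
    assert oven.start()
    oven.timer_expired()
    assert not oven.cooking

def test_1_2_door_open() -> None:                    # MCA -> ACA1 (flow 1a)
    oven = new_loaded_oven()
    oven.set_time(30)
    oven.open_door()
    assert not oven.start()

def test_1_3_no_food() -> None:                      # MCA -> ACA2 (flow 4a)
    oven = MicrowaveOven()
    oven.set_time(30)
    assert not oven.start()

def test_1_4_door_open_during_cooking() -> None:     # MCA -> ACA3 (flow 5a)
    oven = new_loaded_oven()
    oven.set_time(30)
    assert oven.start()
    oven.open_door()
    assert not oven.cooking

def test_1_5_cooking_cancelled() -> None:            # MCA -> ACA4 (flow 5b)
    oven = new_loaded_oven()
    oven.set_time(30)
    assert oven.start()
    oven.cancel()
    assert not oven.cooking

if __name__ == "__main__":
    for test in (test_1_1_successful_cooking, test_1_2_door_open,
                 test_1_3_no_food, test_1_4_door_open_during_cooking,
                 test_1_5_cooking_cancelled):
        test()
    print("all scenario test cases passed")
```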