Failures in Technology: Lessons Learned from Historic Incidents

Explore common failures in technology, including the Airbus 320, the Y2K bug, and more, and examine the impact of these incidents on both large-scale systems and personal devices. Understand the significance of quality assurance and testing in software engineering.



Presentation Transcript


  1. Informatics 43: Introduction to Software Engineering. Lecture 8-2, May 21, 2015. Emily Navarro, Department of Informatics, UC Irvine. Software Design and Collaboration Laboratory (SDCL), sdcl.ics.uci.edu. Duplication of course material for any commercial purpose without the explicit written permission of the professor is prohibited.

  2. Today's Lecture: Quality assurance; Testing

  3. Today's Lecture: Quality assurance; Testing

  4. What Do These Have in Common? Airbus 320, Audi 5000, Mariner 1 launch, AT&T telephone network, Ariane 5, Word 3.0 for Mac, radiation therapy machine, NSA, Y2K

  5. They All Failed! Airbus 320, Audi 5000, Mariner 1 launch, AT&T telephone network, Ariane 5, Word 3.0 for Mac, radiation therapy machine, NSA, Y2K

  6. They All Failed! Airbus 320: http://catless.ncl.ac.uk/Risks/10.02.html#subj1.1. Audi 5000: unintended acceleration problem. Mariner 1 launch: http://catless.ncl.ac.uk/Risks/5.73.html#subj2.1. AT&T telephone network: a ripple effect from switch to switch left the network down/dark for 2-3 days. Ariane 5: http://catless.ncl.ac.uk/Risks/18.24.html#subj2.1. Word 3.0 for Mac: plagued with bugs; later replaced for free by Word 3.0.1. Radiation therapy machine (Therac-25): http://courses.cs.vt.edu/~cs3604/lib/Therac_25/Therac_5.html. Y2K.

  7. Y2K Facts. Bug description: date formats were MM/DD/YY, e.g., 01/01/98, 02/02/99, 03/03/00; 98 -> 1998, 99 -> 1999, but does 00 mean 2000 or 1900? Does 1999 turn into 19100? Effects: relatively minor. Cost: $300 billion!
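
A minimal sketch, not from the original slides, of how a hypothetical legacy two-digit-year conversion produces the ambiguity described above; both the 1900-vs-2000 confusion and the "19100" display bug show up in the output:

```java
// Hypothetical illustration of the Y2K two-digit-year bug.
public class TwoDigitYear {
    // Legacy-style conversion: store only the last two digits and assume the 1900s.
    static int legacyFullYear(int yy) {
        return 1900 + yy;   // 98 -> 1998, 99 -> 1999, but 00 -> 1900 instead of 2000
    }

    public static void main(String[] args) {
        System.out.println(legacyFullYear(98));  // 1998 (correct)
        System.out.println(legacyFullYear(99));  // 1999 (correct)
        System.out.println(legacyFullYear(0));   // 1900 (intended: 2000)

        // A related display bug: code that prints "19" followed by (year - 1900)
        // shows "19100" once the counter rolls over to 100 on January 1, 2000.
        int yearsSince1900 = 100;
        System.out.println("19" + yearsSince1900); // prints 19100
    }
}
```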

  8. Impact of Failures. Not just "out there" (space shuttle, Mariner 1, Ariane 5), but also at home: your car, your call to your mom, your wireless network, social network, or mobile app, your homework, your hospital visit. Peter Neumann's Risks Digest: http://catless.ncl.ac.uk/Risks

  9. Verification and Validation. Verification: ensure the software meets its specifications; internal consistency; "Are we building the product right?"; e.g., testing, inspections, program analysis. Validation: ensure the software meets the customer's intent; external consistency; "Are we building the right product?"; e.g., usability testing, user feedback.

  10. Verification and Validation. Verification: ensure the software meets its specifications; internal consistency; "Are we building the product right?" (implement the idea properly); e.g., testing, inspections, program analysis. Validation: ensure the software meets the customer's intent; external consistency; "Are we building the right product?" (implement the proper idea); e.g., usability testing, user feedback.

  11. Software Qualities: correctness, reliability, efficiency, integrity, usability, maintainability, testability, flexibility, portability, reusability, interoperability, performance, etc.

  12. Quality Assurance: all activities designed to measure and improve quality in a product; assure that each of the software qualities is met. Goals are set in the requirements specification and realized in the implementation. Sometimes easy, sometimes difficult (portability versus safety); sometimes immediate, sometimes delayed (understandability versus evolvability); sometimes provable, sometimes doubtful (size versus correctness).

  13. An Idealized View of QA: complete formal specification of the problem to be solved -> (correctness-preserving transformation) -> design, in formal notation -> (correctness-preserving transformation) -> code, in a verifiable language -> (correctness-preserving transformation) -> executable machine code -> (correctness-preserving transformation) -> execution on verified hardware.

  14. A Realistic View of QA: a mixture of formal and informal specifications -> (manual transformation) -> design, in mixed notation -> (manual transformation) -> code, in C++, Java, Ada, ... -> (compilation by a commercial compiler) -> (Intel Pentium-based) machine code -> (commercial firmware) -> execution on commercial hardware.

  15. First Complication. [Diagram: the customer's real needs, the actual specification, and a correct specification do not necessarily coincide.] No matter how sophisticated the QA process, the problem of creating the initial specification remains.

  16. Second Complication: correctness versus efficiency. There are often multiple, sometimes conflicting, qualities to be tested, making QA a challenge.

  17. Third Complication. Complex data communications (electronic fund transfer). Distributed processing (web search engine). Stringent performance objectives (air traffic control system). Complex processing (medical diagnosis system). Sometimes the software system is extremely complicated, making it tremendously difficult to perform QA.

  18. Fourth Complication. [Diagram: project management, the quality assurance group, and the development group.] It is difficult to divide the particular responsibilities involved when performing quality assurance.

  19. Fourth Complication. Quality assurance lays out the rules: "You will check in your code every day", "You will comment your code", "You will ...". Quality assurance also uncovers the faults: it taps developers on their fingers and creates an image of competition. Quality assurance is viewed as cumbersome and heavy ("Just let me code"). Quality assurance has a negative connotation.

  20. Available Techniques. Formal program verification. Static analysis of program properties (concurrent programs: deadlock, starvation, fairness; performance: min/max response time). Code reviews and inspections. Testing. Most techniques are geared towards verifying correctness.

  21. Which Technique to Use? There is no "silver bullet" of testing; a mixture of techniques is needed. Different approaches are needed for different faults (e.g., testing for race conditions vs. performance issues). Different approaches are needed at different times (e.g., unit testing vs. usability testing vs. system testing).

  22. How do we know when we are done? We can never find all faults, but we cannot test forever! We aim to reveal as many faults as possible in a given period of time. More faults found and fixed = good. More bugs found = more bugs not found. Aim to meet the quality requirements established for the project.

  23. How do we know when we are done? [Chart: number of problems found per hour, plotted over time from Day 1 through Day 5.]

  24. How do we know when we are done? We could stop testing when the problem find rate stabilizes to near zero. [Chart: number of problems found per hour, plotted over time from Day 1 through Day 5.]

  25. How do we know when we are done? [Chart: confidence in the module being tested (up to 100%) versus the number of test cases with correct outputs, with a possible "sweet spot" marked.]

  26. How do we know when we are done? We can pepper the code with defects and observe how many of the seeded defects are discovered. Scenario: the program is seeded with 10 defects; after some test cases are executed, 7 of the seeded defects are found, along with 45 nonseeded defects. Since 70% of the seeded defects are found (and 30% are not), assume that the nonseeded defects follow the same pattern: 45 is 70% of roughly 64, so there are about 19 (64 - 45) defects left to be found. This technique assumes that nonseeded defects are similar to the seeded ones.
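
The arithmetic on this slide can be written out directly. The following short sketch, not part of the original deck, simply recomputes the slide's scenario:

```java
// Defect-seeding estimate, using the numbers from the scenario on the slide.
public class DefectSeedingEstimate {
    public static void main(String[] args) {
        int seededTotal = 10;   // defects deliberately inserted
        int seededFound = 7;    // seeded defects discovered by testing
        int realFound   = 45;   // nonseeded (real) defects discovered

        // Assume real defects are found at the same rate as seeded ones.
        double findRate          = (double) seededFound / seededTotal; // 0.70
        double realTotalEstimate = realFound / findRate;               // ~64
        double realRemaining     = realTotalEstimate - realFound;      // ~19

        System.out.printf("Estimated total real defects: %.0f%n", realTotalEstimate);
        System.out.printf("Estimated real defects remaining: %.0f%n", realRemaining);
    }
}
```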

  27. Reminder: Use the Principles. Rigor and formality, separation of concerns, modularity, abstraction, anticipation of change, generality, incrementality.

  28. Today's Lecture: Quality assurance; Testing

  29. Testing: using a set of techniques to detect and correct errors in a software product. Exercise a module, collection of modules, or system: use predetermined inputs (a "test case"), capture actual outputs, and compare actual outputs to expected outputs. Actual outputs equal to expected outputs: the test case succeeds. Actual outputs not equal to expected outputs: the test case fails.

  30. Testing Process Model. 1. Decide what to test. 2. Select a test case input. 3. Determine the expected output E. 4. Run the system with the test case input. 5. Capture the actual output A. 6. Compare E and A; if they differ, inform the programmer. 7. Loop back to 1 or 2, if time permits.
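
A minimal sketch, not from the slides, of steps 2 through 6 of this process for a single test case; the add method is a hypothetical unit under test, introduced only for illustration:

```java
// Minimal test-case runner illustrating the expected-vs-actual comparison.
public class TestCaseRunner {
    // Hypothetical unit under test.
    static int add(int a, int b) {
        return a + b;
    }

    public static void main(String[] args) {
        int inputA = 2, inputB = 3;        // step 2: select a test case input
        int expected = 5;                  // step 3: determine the expected output E
        int actual = add(inputA, inputB);  // steps 4-5: run the system, capture the actual output A

        // Step 6: compare E and A; if they differ, inform the programmer.
        if (actual == expected) {
            System.out.println("Test case succeeds");
        } else {
            System.out.println("Test case fails: expected " + expected + ", got " + actual);
        }
    }
}
```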

  31. V-Model of Development and Testing. [Diagram: each development activity is paired with a testing activity. Develop requirements (requirements review) pairs with developing acceptance tests (acceptance test review) and executing system tests; design (design review) pairs with developing integration tests (integration tests review) and executing integration tests; code (code review) pairs with developing unit tests (unit tests review) and executing unit tests.]

  32. Spiral. [Diagram: the spiral model, with a risk analysis activity in each cycle.]

  33. The RUP Model. [Diagram: phases (Inception, Elaboration, Construction, Transition) crossed with process workflows (Business Modeling, Requirements, Analysis & Design, Implementation, Test, Deployment) and supporting workflows (Configuration Mgmt, Management, Environment), spanning Preliminary Iteration(s) and Iterations #1 through #m+1.]

  34. Extreme Programming

  35. Extreme Programming

  36. Test-Driven Development

  37. Testing Terminology. Error: a human action that produces an incorrect result; may or may not produce a fault. Fault: a condition that may (or may not) cause a failure; caused by an error; commonly referred to as a bug. Failure: the inability of a system to perform a function according to its specifications; the result of a fault.
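
To illustrate the chain from error to fault to failure, here is a small hypothetical example (the max method and its assumed specification are not from the slides); note that the fault produces a failure for some inputs but not others:

```java
// Hypothetical illustration of the error -> fault -> failure chain.
public class MaxExample {
    // Assumed specification: return the larger of a and b.
    // Error: the programmer mistakenly typed '<' instead of '>'.
    // Fault (defect in the code): the comparison is reversed.
    static int max(int a, int b) {
        return (a < b) ? a : b;   // returns the smaller of two distinct values
    }

    public static void main(String[] args) {
        System.out.println(max(5, 5)); // 5 -- no failure observed, though the fault is present
        System.out.println(max(2, 7)); // 2 -- failure: the output contradicts the specification
    }
}
```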

  38. Error, Fault, Failure: error (in the programmer's mind) -> fault or defect (in the code) -> failure (in execution or output).

  39. Error, Fault, Failure: error (in the programmer's mind) -> fault or defect (in the code) -> failure (in execution or output).

  40. Error, Fault, Failure

  41. Error, Fault, Failure

  42. Testing Goals. Detect failures/faults/errors. Locate failures/faults/errors. Fix failures/faults/errors. Show system correctness, within the limits of optimistic inaccuracy. Improve confidence that the system performs as specified (verification). Improve confidence that the system performs as desired (validation). "Program testing can be used to show the presence of bugs, but never to show their absence." [Dijkstra]

  43. Testing Process Quality Goals: accurate, complete, thorough, repeatable, systematic.

  44. Test Planning. Set a quality goal for the project. Choose test methodologies and techniques. Assign resources. Bring in tools. Set a schedule.

  45. Who Does the Testing? Programmers (unit testing). Testers (non-programmers). Users (acceptance testing, alpha testing, beta testing).

  46. Levels of Testing. Unit testing: testing of a single code unit; requires the use of test drivers. Functional/integration testing: testing of interfaces among integrated units; incremental or "big bang"; often requires test drivers and test stubs. System/acceptance testing: testing of the complete system for satisfaction of requirements.

  47. Levels of Testing. [Class diagram of a ZotMyHealth app with classes such as CalorieTracker (List<Meal>: mealList, App: connectedApp, void addMeal(...), void deleteMeal(...)), Meal (String: mealName, int: numCalories, void setNumCalories(...)), MealList, WorkoutTracker, SleepTracker, LoginManager, and SettingsManager. Unit testing targets individual methods such as Meal.setNumCalories(...) and CalorieTracker.addMeal(Meal m); functional/integration testing targets interactions among the classes; system/acceptance testing targets end-to-end scenarios such as add a meal, delete a workout, login, and logout.]
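
To make the driver/stub distinction concrete, here is a minimal sketch, not part of the slides; the MealStore dependency and the exact shape of MealList are assumptions, loosely named after the ZotMyHealth example above:

```java
// Hypothetical test driver and test stub for unit/integration testing.
import java.util.ArrayList;
import java.util.List;

// Dependency that the unit under test would normally call.
interface MealStore {
    void save(String mealName);
}

// Test stub: stands in for the real MealStore and just records calls.
class MealStoreStub implements MealStore {
    List<String> saved = new ArrayList<>();
    public void save(String mealName) { saved.add(mealName); }
}

// Unit under test (assumed shape).
class MealList {
    private final MealStore store;
    private final List<String> meals = new ArrayList<>();
    MealList(MealStore store) { this.store = store; }
    void addMeal(String name) { meals.add(name); store.save(name); }
    int size() { return meals.size(); }
}

// Test driver: sets up inputs, invokes the unit, and checks the outputs.
public class MealListTestDriver {
    public static void main(String[] args) {
        MealStoreStub stub = new MealStoreStub();
        MealList list = new MealList(stub);
        list.addMeal("breakfast");
        boolean passed = list.size() == 1 && stub.saved.contains("breakfast");
        System.out.println(passed ? "Test case succeeds" : "Test case fails");
    }
}
```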

  48. Test Tasks. Test case: a group of input values that cause a program to take some defined action, with an expected output. Devise test cases: target specific areas of the system, create specific inputs, create expected outputs. Choose test cases: not all need to be run all the time (regression testing). Run test cases: can be labor intensive; an opportunity for automation. All in a systematic, repeatable, and accurate manner.

  49. How to Choose Test Cases (I). There are usually an infinite number of possible test cases for a given function, and there are too many input-output pairs to exhaustively verify, so we must take a small sample. Example: a multiplier. Input: two integers. Output: one integer. int multiplier(int a, int b) { return a * b; }

  50. How to Choose Test Cases (II). Practically, testing can only select a very small set of inputs, so our goal should be to choose the best ones. What are the best five test cases for a multiplier? (In other words, what five test cases, if they don't reveal any bugs, will give you the most confidence that the multiplier is working correctly?)
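
One possible, non-authoritative selection, sketched here as a tiny test driver; the particular choice of zero, identity, sign, and overflow-boundary inputs is an assumption offered for illustration, not the slide's own answer:

```java
// Hypothetical sample of test cases for the multiplier above.
public class MultiplierTests {
    static int multiplier(int a, int b) {
        return a * b;
    }

    // Runs one test case: compares the actual output to the expected output.
    static void check(int a, int b, long expected) {
        int actual = multiplier(a, b);
        System.out.println((actual == expected ? "PASS " : "FAIL ")
                + "multiplier(" + a + ", " + b + ") = " + actual + ", expected " + expected);
    }

    public static void main(String[] args) {
        check(0, 5, 0);                            // zero operand
        check(1, 7, 7);                            // identity
        check(-3, 4, -12);                         // mixed signs
        check(-3, -4, 12);                         // both negative
        check(Integer.MAX_VALUE, 2, 4294967294L);  // overflow boundary: int arithmetic wraps, so this one fails
    }
}
```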
