Review Techniques SEII - Lecture 16


Dr. Muzafar Khan
Assistant Professor
Department of Computer Science
CIIT, Islamabad

Recap
- Multi-aspect concept of quality: transcendental view, user view, manufacturer's view, product view, value-based view
- Software quality: an effective software process, a useful product, added value for the producer and the user of a software product
- Software quality models: Garvin's quality dimensions, McCall's quality factors, the ISO 9126 quality model
- The software quality dilemma
- Achieving software quality

Software Reviews
- A filter for the software process
- To err is human; people are good at catching others' errors
- Three steps:
  - Point out needed improvements
  - Confirm those parts that are OK
  - Achieve technical work of more uniform quality than is possible without reviews
- Different types of reviews exist

Cost Impact of Software Defects
- Here, "defect" and "fault" are synonymous
- The primary objective of reviews is to find errors; the primary benefit is the early discovery of errors, so they do not propagate to the next step
- Design activities introduce 50-65% of all errors
- Review techniques are up to 75% effective in uncovering design flaws, which reduces cost at later stages

Defect Amplification Model
(Figure: Software Engineering: A Practitioner's Approach, R. S. Pressman, 7th ed., p. 419; a small simulation sketch appears after the Reference Model section below)

Example: No Reviews
(Figure: Pressman, 7th ed., p. 419)

Example: Reviews Conducted
(Figure: Pressman, 7th ed., p. 419)

Review Metrics and Their Use [1/2]
- Each action requires dedicated human effort, and project effort is finite
- Metrics are therefore needed to assess the effectiveness of each action
- Review metrics:
  - Preparation effort (Ep): number of person-hours spent prior to the actual review
  - Assessment effort (Ea): number of person-hours required for the actual review

Review Metrics and Their Use [2/2]
- Rework effort (Er): number of person-hours needed to correct errors uncovered during the review
- Work product size (WPS): size of the work product reviewed, e.g. the number of UML models
- Minor errors found (Errminor): number of minor errors found
- Major errors found (Errmajor): number of major errors found

Analyzing Metrics [1/2]
- Ereview = Ep + Ea + Er
- Errtot = Errminor + Errmajor
- Error density: the number of errors found per unit of work reviewed
  - Error density = Errtot / WPS
- Example: 18 UML diagrams in a 32-page document, with 18 minor and 4 major errors found
  - Error density = 22/18 ≈ 1.2 errors per UML diagram, or 22/32 ≈ 0.69 errors per page
  (these computations are worked in code after the Reference Model section below)

Analyzing Metrics [2/2]
- As different work products are reviewed, the percentage of errors uncovered in each review is computed against the total number of errors found across all reviews
- Error density is computed for each work product
- Once many reviews have been conducted, the average values indicate how many errors to expect in new documents

Cost Effectiveness of Reviews
- Difficult to measure directly
- Example: 0.6 errors per page; 4 person-hours to correct a minor error; 18 person-hours to correct a major error
- Based on the review data, minor errors occur about six times more frequently than major errors
- A requirements-related error needs 6 person-hours to correct when found in review, but 45 person-hours if uncovered during testing
- Effort saved per error = Etesting - Ereviews = 45 - 6 = 39 person-hours (see the sketch below)

Effort Expended With and Without Reviews
(Figure: Pressman, 7th ed., p. 422)

Reviews: A Formality Spectrum
- The level of formality depends on several factors: the product, time, and people
- Four characteristics of the reference model:
  - Roles
  - Planning and preparation for the review
  - Review structure
  - Follow-up

Reference Model
(Figure: Pressman, 7th ed., p. 423)
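The defect amplification model above lends itself to a short simulation. The following Python sketch is illustrative only: the per-step parameters (fraction amplified, amplification factor, newly generated defects, detection efficiency) are assumed values, not the numbers from Pressman's figures.

```python
# Minimal sketch of a defect amplification model: each development step
# passes some incoming defects through, amplifies a fraction of them,
# generates new ones, and a review then removes a share of the total.
# All parameter values below are assumed for illustration.

def amplification_step(defects_in, pct_amplified, factor,
                       new_defects, detection_efficiency):
    passed = defects_in * (1 - pct_amplified)        # errors passed through
    amplified = defects_in * pct_amplified * factor  # errors amplified 1:factor
    total = passed + amplified + new_defects         # plus newly generated errors
    return total * (1 - detection_efficiency)        # review filters a fraction

def defects_reaching_test(steps, detection_efficiency):
    defects = 0.0
    for pct_amplified, factor, new_defects in steps:
        defects = amplification_step(defects, pct_amplified, factor,
                                     new_defects, detection_efficiency)
    return defects

# Hypothetical three-step pipeline: preliminary design, detail design, coding.
steps = [(0.0, 1.0, 10), (0.5, 1.5, 25), (0.3, 2.0, 25)]

print(f"No reviews (0% detection): {defects_reaching_test(steps, 0.0):.0f} defects")
print(f"Reviews at 60% detection:  {defects_reaching_test(steps, 0.6):.0f} defects")
```

Even with these made-up numbers, the simulation shows the point of the two example slides: errors that escape one step are amplified downstream, so moderate review detection at every step sharply cuts the defects that reach testing.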
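The metric definitions and the error-density example from the Analyzing Metrics slides translate directly into code. A minimal sketch using the numbers from the example; the effort figures Ep, Ea, and Er are assumed, since the slide does not give them.

```python
# Review metrics from the lecture: total review effort, total error count,
# and error density per unit of work reviewed (WPS).

E_p, E_a, E_r = 4, 6, 8            # preparation, assessment, rework (assumed person-hours)
err_minor, err_major = 18, 4       # error counts from the slide example
uml_diagrams, pages = 18, 32       # two possible measures of work product size

E_review = E_p + E_a + E_r         # Ereview = Ep + Ea + Er
err_total = err_minor + err_major  # Errtot = Errminor + Errmajor

print(f"Total review effort: {E_review} person-hours")
print(f"Error density: {err_total / uml_diagrams:.1f} errors per UML diagram")
print(f"Error density: {err_total / pages:.2f} errors per page")
```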
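The cost-effectiveness figures also check out arithmetically. In the sketch below, the 6 person-hour cost of a review-time fix is derived as a frequency-weighted average of minor and major fix costs; that derivation is my reading of the slide, which states the figures without showing the calculation.

```python
# Cost-effectiveness arithmetic from the Cost Effectiveness of Reviews slide.

minor_fix, major_fix = 4, 18   # person-hours to correct a minor / major error
minors_per_major = 6           # minor errors occur ~6x as often as major ones

# Frequency-weighted average fix cost when an error is caught in review:
avg_review_fix = (minors_per_major * minor_fix + major_fix) / (minors_per_major + 1)
print(f"Average review-time fix: {avg_review_fix:.0f} person-hours")  # 6

E_testing = 45                 # person-hours if the same error escapes to testing
saved = E_testing - avg_review_fix
print(f"Effort saved per error: {saved:.0f} person-hours")            # 39
```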
Informal Reviews
- A simple desk check, a casual meeting, or the review-oriented aspects of pair programming
- Effectiveness is considerably lower: there is no advance planning or preparation, no agenda, and no follow-up
- Still useful for uncovering errors that may otherwise propagate
- A simple review checklist improves efficiency

Example: Review Checklist
- Work product: a prototype; reviewers: the designer and a colleague
- Is the layout designed using standard conventions? Left to right? Top to bottom?
- Does the presentation need to be scrolled?
- Are color and placement, typeface, and size used effectively?
- Are all navigation options or functions represented at the same level of abstraction?
- Are all navigation choices clearly labeled?

Formal Technical Reviews
- Objectives:
  - To uncover errors in function, logic, or implementation
  - To verify that the software under review meets its requirements
  - To ensure that the software conforms to predefined standards
  - To ensure that the software is developed in a uniform manner
  - To make projects more manageable
- Also serve as a training ground for junior engineers, and promote backup and continuity
- Walkthroughs and inspections are examples

The Review Meeting [1/2]
- Constraints:
  - 3-5 people should be involved
  - No more than two hours of advance preparation per person
  - Meeting duration should be less than two hours
- Focus should be on a specific and small part of the overall software
- The producer informs the project leader that the work product is ready; the project leader contacts the review leader, who is responsible for the rest of the arrangements

The Review Meeting [2/2]
- The meeting is attended by the review leader, all reviewers, and the producer; one reviewer serves as the recorder
- The meeting starts with an introduction of the agenda and of the producer
- The producer "walks through" the work product
- Decisions:
  - Accept the product without further modification
  - Reject the product due to severe errors
  - Accept the product provisionally
- Sign-off at the end of the meeting

Review Reporting and Record Keeping
- A review issues list is produced; it identifies problem areas and serves as an action item checklist for corrections
- A formal technical review summary report (a single page with possible attachments) answers:
  - What was reviewed?
  - Who reviewed it?
  - What were the findings and conclusions?
- A follow-up procedure ensures that the rework is done (a sketch of such a report as a record type follows the Summary below)

Review Guidelines [1/2]
- Review the product, not the producer
- Set an agenda and maintain it
- Limit debate and rebuttal
- Enunciate problem areas, but don't attempt to solve every problem noted
- Take written notes
- Limit the number of participants and insist upon advance preparation

Review Guidelines [2/2]
- Develop a checklist for each product that is likely to be reviewed
- Allocate resources and schedule time for formal reviews
- Conduct meaningful training for all reviewers
- Review your early reviews

Summary
- Software reviews
- Cost impact of software defects
- Defect amplification model
- Review metrics and their use: preparation effort (Ep), assessment effort (Ea), rework effort (Er), work product size (WPS), minor errors found (Errminor), major errors found (Errmajor)
- Formal and informal reviews
- Review meeting, review reporting and record keeping, review guidelines
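To make the record keeping concrete, the review summary report described under Review Reporting and Record Keeping can be captured as a small record type. A minimal sketch: the field names and example values are illustrative assumptions, since the lecture only prescribes the questions the report must answer.

```python
# Minimal sketch of a formal technical review summary report as a record.
# Field names are illustrative; the lecture prescribes only the questions
# the report answers (what was reviewed, who reviewed it, findings, follow-up).

from dataclasses import dataclass, field

@dataclass
class ReviewIssue:
    description: str          # problem area identified during the review
    severity: str             # "minor" or "major"
    action_item: str          # correction to be tracked in follow-up
    resolved: bool = False

@dataclass
class ReviewSummaryReport:
    work_product: str         # what was reviewed?
    reviewers: list[str]      # who reviewed it?
    decision: str             # accept / reject / accept provisionally
    issues: list[ReviewIssue] = field(default_factory=list)

report = ReviewSummaryReport(
    work_product="Login module detailed design",
    reviewers=["review leader", "recorder", "second reviewer"],
    decision="accept provisionally",
    issues=[ReviewIssue("Timeout behaviour unspecified", "major",
                        "Producer to specify timeout handling")],
)
open_items = [i for i in report.issues if not i.resolved]
print(f"{len(open_items)} action item(s) pending follow-up")
```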
