
Fagan Inspections

This is an inspection technique named after Michael Fagan who developed it in the mid 1970s.

Benefits of Inspections

Any software engineering textbook will tell you that the cost of fixing a defect rises dramatically the later it is found in the program's development lifecycle. Peer review is therefore a vital step in any software development effort.

Inspections shorten delivery time by reducing the time spent in the integration and system test/debug phases, since a cleaner product is passed into those late-stage quality filters. Better quality in the completed product saves on maintenance time, too. Reducing the effort that you have to spend fixing bugs after delivery frees up time that can be used for new development work. Cutting down on rework always improves productivity. The excuse of "I don't have time to do inspections" simply doesn't hold water.

The Fagan Inspection Process

The process typically involves an inspection team of 3-8 members, filling these roles:

Moderator - leads the inspection, schedules meetings, controls the meetings, reports inspection results, and follows up on rework issues. Moderators should be trained in how to conduct inspections, including how to keep participants with strong technical skills but low social skills from killing each other. The project or quality manager often performs this role.

Author - creates or maintains the work product being inspected. The author may answer questions about the product during the inspection, and he also looks for defects. The author cannot serve as moderator, reader, or recorder.

Reader - describes the sections of the work product to the team as they proceed through the inspection. The reader may paraphrase what is happening in the product, such as describing what a section of code is supposed to do, but he does not usually read the product verbatim.

Recorder - classifies and records defects and issues raised during the inspection. The moderator might perform this role in a small inspection team.

Inspector - attempts to find errors in the product. All participants actually are acting as inspectors, in addition to any other responsibilities. Good people to invite as inspectors include: the person who created the predecessor specification for the work product being inspected (e.g., the designer for a code inspection); those responsible for implementing, testing, or maintaining the product; a quality assurance representative to act as standards enforcer; other project members; and someone who is not involved in the project at all but who has the skill set and defect-detection abilities to be able to contribute usefully to inspecting any work product of this type. We also require that user representatives participate in requirements inspections.
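The role constraints above (3-8 members, and an author who may not also moderate, read, or record) can be sketched as a simple validation. This is only an illustration; the participant names are hypothetical, and the constraints are the ones stated in the text.

```python
# Sketch: validate an inspection team against the role rules above.
# Participant names are hypothetical; the constraints come from the text.

def validate_team(roles):
    """roles maps role name -> participant name. Returns problems found."""
    problems = []
    members = set(roles.values())
    if not 3 <= len(members) <= 8:
        problems.append("team should have 3-8 members")
    author = roles.get("author")
    for conflicting in ("moderator", "reader", "recorder"):
        if roles.get(conflicting) == author:
            problems.append(f"author cannot also serve as {conflicting}")
    return problems

team = {"moderator": "Pat", "author": "Sam", "reader": "Lee",
        "recorder": "Pat", "inspector": "Ravi"}
print(validate_team(team))  # [] -- the moderator doubling as recorder is allowed
```

Note that the moderator doubling as recorder passes, matching the allowance made for small teams above.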

A formal inspection consists of several activities:

Planning - The moderator selects the inspection team, obtains materials to be inspected from the author, and distributes them and any other relevant documents to the inspection team in advance (for example the defect sheet). Materials should be distributed at least two or three days prior to the inspection. We leave the responsibility for requesting an inspection to the author, but you should establish some entry criteria for assessing readiness of a product for inspection. For code, you might require that it compiles cleanly and that the listing provided to inspectors includes line numbers.
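Entry criteria like the two just mentioned (the code compiles cleanly, and the listing carries line numbers) can be expressed as a simple checklist gate. A minimal sketch, in which the artifact field names are hypothetical:

```python
# Sketch: an entry-criteria gate for scheduling a code inspection.
# The two criteria named are the ones from the text; the artifact fields
# (compile_errors, numbered_listing) are hypothetical.

ENTRY_CRITERIA = {
    "compiles cleanly": lambda a: a["compile_errors"] == 0,
    "listing has line numbers": lambda a: a["numbered_listing"],
}

def ready_for_inspection(artifact):
    """Return (ready?, list of unmet criteria)."""
    failures = [name for name, check in ENTRY_CRITERIA.items()
                if not check(artifact)]
    return (not failures, failures)

print(ready_for_inspection({"compile_errors": 0, "numbered_listing": True}))
# (True, [])
```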

Overview meeting - This meeting gives the author an opportunity to describe the important features of the product to the inspection team. It can be omitted if this information is already known to the other participants.

Preparation - Each participant is responsible for examining the work products prior to the actual inspection meeting, independently noting any defects found or issues to be raised. In other words, the inspection work is done before you turn up at the inspection meeting; the meeting itself is just the method of collating all the feedback. If you do not turn up to the inspection meeting having reviewed the work product and listed the defects found, the moderator can exclude you from the inspection meeting. Perhaps 75% of the errors found during inspections are identified during the preparation step. The product should be compared against any predecessor documents or standards to assess completeness and correctness.

Checklists of defects commonly found in this type of work product should be used during preparation to hunt for anticipated types of errors.

Inspection meeting - During this session, the team convenes and is led through the work product by the moderator and reader. If the moderator determines at the beginning of the meeting that insufficient time has been devoted to preparation by the participants, the meeting should be rescheduled. During the discussion, all inspectors can report defects or raise other issues, which are documented on a form by the recorder. The author can ask for clarification on points raised but is not allowed to defend or explain any defects.

The meeting should last no more than two hours. At its conclusion, the group agrees on an assessment of the product: accepted as is (I have never seen this happen); accepted with minor revisions; major revisions needed and a second inspection required; or rebuild the product.

Causal analysis - An important long-term benefit of an inspection program is the insight it can provide into the kinds of defects being created and the process changes you can make to prevent them. This "causal analysis" step provides that understanding. While not essential to the current inspection, it can improve quality on future work by helping the team avoid repeating the same mistakes.

Rework - The author is responsible for resolving all issues raised during the inspection. This does not necessarily mean making every change that was suggested, but an explicit decision must be made about how each issue or defect will be dealt with.

Follow-up - To verify that the necessary rework has been performed properly, the moderator is responsible for following up with the author. If a significant fraction (say, 10 percent) of the work product was modified, an additional inspection may be required. This is the final gate through which the product must pass in order for the inspection to be completed. You may wish to define explicit exit criteria for completing an inspection. These criteria might require that all defects are corrected and issues resolved, or that uncorrected defects are properly documented in a defect tracking system.
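The 10 percent rework threshold just described can be captured as a small rule of thumb. The figure is the one suggested above; the function itself is only a sketch:

```python
def needs_reinspection(lines_modified, total_lines, threshold=0.10):
    """Re-inspect when the fraction of the product changed during rework
    exceeds the threshold -- 10 percent, per the text."""
    return lines_modified / total_lines > threshold

print(needs_reinspection(60, 400))  # True: 15% of the product changed
print(needs_reinspection(20, 400))  # False: only 5% changed
```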

Guiding Principles

Every inspection or review process should follow some basic guiding principles.

  • Leave your egos at the door.

It's not easy to bare your soul and expose your carefully crafted products to a mob of critical co-workers, and it is easy to become defensive about every prospective bug that is brought up in the meeting. A climate of mutual respect helps ensure that this polarization does not occur. The group culture must encourage an attitude of, "We prefer to have a peer find a defect, rather than a user." Inspections are then viewed as a non-threatening way to achieve this goal, rather than as an ego-destroying experience.

  • Look for positive things to say about the product.

Critique the products, not the producers. The purpose of the inspection is not to point out how much smarter the inspector is compared to the author. The purpose is to make a work product as error-free as possible, for the benefit of both the development team and its users. The moderator should promptly deal with any behaviors that violate this principle.

  • Find problems during the review; don't try to fix them.

It is easy to fall into the trap of amusing and interminable technical arguments about the best way to handle a particular problem. The moderator should stop these tangents within a few seconds. The author is responsible for fixing defects after the inspection, so just concentrate on identifying them during the meeting.

  • Limit inspection meetings to a maximum of two hours.

Our attention spans are not conducive to longer meetings, and their effectiveness decays quickly after this duration. If the material was not completely covered in two hours, schedule a second inspection meeting to complete the task.

Data in the software literature indicates that slowing the preparation and inspection rates increases the number of bugs found, with the optimum balance around 150-200 lines of code per hour. This rule limits the quantity of material that can be covered in a single inspection to about 8-12 pages of design or text documents, or 300-400 lines of source code.
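The rate and duration limits above imply a simple capacity calculation: at 150-200 lines of code per hour with a two-hour ceiling, one meeting covers roughly 300-400 lines, and larger products need multiple sessions. A sketch, using the midpoint of that range:

```python
import math

def meetings_needed(total_loc, rate_loc_per_hour=175, max_hours=2.0):
    """Two-hour meetings needed to inspect total_loc lines, using a
    midpoint of the 150-200 LOC/hour optimum rate quoted above."""
    loc_per_meeting = rate_loc_per_hour * max_hours  # 350 at the midpoint
    return math.ceil(total_loc / loc_per_meeting)

print(meetings_needed(1000))  # 3 meetings for a 1,000-line module
```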

  • Avoid style issues unless they impact performance or comprehension.

Everyone writes, designs, and programs a little differently. When a group is beginning to use reviews, there is a tendency to raise a lot of style issues that are probably not defects but rather preferences. Style issues that impact clarity, readability, or maintainability (such as nesting functions five levels deep) should certainly be raised, but discussing whether code indentation should be done in 2-character or 3-character increments should not. The use of coding standards or code beautifiers can eliminate many style issues.

Inspectors should be checking for completeness, correctness, clarity, and traceability to predecessor documents (designs back to requirements, code back to design, and so on).

Focus on uncovering errors in logic, function or implementation, not just physical appearance.

  • Inspect early and often, formally and informally.

There is a tendency to not want to share incomplete products with your peers for review. This is a mistake. If a product has a systematic weakness, such as a C program that is using literal constants where #defined macros should be used, you want to point this out when a small amount of code has been written, not in a completed 5,000 line program. So long as reviewers know what state the product is in, most people can examine it from an appropriate perspective.

Keeping Records

Record keeping is a major distinction between informal and formal review activities. There are three aspects to this task: recording defects during the inspection meeting; collecting data from multiple inspections; and analysing the defect trends to assess inspection effectiveness and identify ways to improve your software development process to prevent common types of defects.

As inspectors raise issues during the review meeting, the recorder enters them on the issues list. Try to distinguish "defects" from "issues." Defects are subdivided into the categories missing, wrong, extra, performance, and usability, while issues can be questions, points of style, or requests for clarification. We also try to note the development phase in which the underlying error was introduced (requirements, design, implementation). In addition, we classify errors found according to a severity scale. Our scale (also used for errors found in testing) includes the categories cosmetic (such as misspelled screen text), minor (nuisance, or a workaround exists), severe (some functionality is not available), and fatal (program will probably crash). A simpler method is to just classify defects as major (some product failure is expected) or minor (the product may be misunderstood, but the program will work).
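The defect taxonomy just described (type, severity, and phase of origin) maps naturally onto a small record. The category names below are taken from the text; the record layout itself is only an illustrative sketch:

```python
from dataclasses import dataclass
from typing import Optional

# Category names below are taken directly from the text.
TYPES = {"missing", "wrong", "extra", "performance", "usability"}
SEVERITIES = {"cosmetic", "minor", "severe", "fatal"}
PHASES = {"requirements", "design", "implementation"}

@dataclass
class Defect:
    description: str
    defect_type: str       # one of TYPES
    severity: str          # one of SEVERITIES
    phase_of_origin: str   # one of PHASES
    line: Optional[int] = None  # None for product-wide observations

    def __post_init__(self):
        assert self.defect_type in TYPES
        assert self.severity in SEVERITIES
        assert self.phase_of_origin in PHASES

d = Defect("loop bound off by one", "wrong", "severe", "implementation", line=42)
```

Leaving the line number optional accommodates the general observations (standards violations, style suggestions) that apply to the whole product.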

Each error found is described in enough detail that the author can refer to this list during the rework step and understand the point that was raised. References to the line numbers where errors were found are important for specific defects, although more general observations about standards violations or style suggestions may apply to the entire product.

The defect list from a single inspection should be distilled down to a summary report with a count of the defects in each category you are using for classification. This summary can be entered into an inspection database, if you are maintaining one. An inspection management report also should be prepared for (guess who) project management. The management report contains information about the material that was inspected and the disposition of the product (accepted with minor changes, etc.), but no actual information about the defects found is included. The purpose is to allow managers to know how the project is progressing and to look for areas where improvements should be made. The moderator usually is responsible for preparing these post-inspection reports.
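Distilling a defect list into the per-category counts described above is a one-line counting operation. The defect list here is hypothetical:

```python
from collections import Counter

# Hypothetical defect list from one inspection: (type, severity) pairs.
defects = [("wrong", "severe"), ("missing", "minor"),
           ("wrong", "minor"), ("extra", "cosmetic")]

summary = Counter(dtype for dtype, _ in defects)
print(dict(summary))  # {'wrong': 2, 'missing': 1, 'extra': 1}
```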

An effective, ongoing inspection process permits us to combine data from multiple inspections to gain insight into the quality of both the review process and the products being reviewed. The ultimate objective is to have a database of inspection data so that quantitative conclusions can be drawn from defect trends and inspection process information.

Making Inspections Work

  1. You need to have a desire to make every work product generated by any team member as defect-free as possible before it passes to the next development stage or to the user.
  2. A level of mutual respect, which ensures that problems found are with the product and not the author, and which makes each participant receptive to suggestions for improvement
  3. A sense of unease when the author is the only person who has viewed a completed product
  4. A recognition that spending the time on quality activities up front will save time for the whole department in the long run, as well as increasing user satisfaction by releasing cleaner products in version 1.0.
  5. Build inspections into the project schedule. Don't forget to factor in time for the inevitable rework that follows a successful inspection.
  6. Inform the participants and, if appropriate, your users of the benefits of inspection. Those members of our team who regularly have their work products reviewed by others regard the activity as highly valuable.
  7. Recognize that the time you devote to inspecting another person's work product is for the benefit of the whole department, and also that others will assist your project efforts in the same way. This quid pro quo balances the resistance team members may express to taking time away from their own work to review someone else's products.