A colleague asked me about code reviews. I mentioned that I had written a white paper on code reviews for a company some time ago and, after re-reading it, thought it should be given a fresh lick of paint.

I posted it earlier this year to my blog but it definitely needed a bit of work… so here it is:

Peer Reviews

Peer review — an activity in which people other than the author of a software deliverable examine it for defects and improvement opportunities.

It's one of the most powerful software quality tools available. Peer review methods include inspections, walkthroughs, peer desk checks, and other similar activities. After experiencing the benefits of peer reviews for nearly ten years, I would think twice before working in a team that did not perform them. Peer reviews can really benefit a project, especially in this age of “agile programming”. BUT, and it’s a big but: if they are not run correctly they can go spectacularly wrong and really damage a development project.

Why do a peer review?

The projects where I have found peer reviews most valuable are those with little documentation, no really formalised development process, or iterative projects where the documentation never keeps up with the actual project or isn’t in depth enough. This is because the peer review acts as:

Training

Risk management

Playing to everyone’s strengths

Quality Assurance

But if that hasn’t convinced you, just take a look at the statistics:

.. software testing alone has limited effectiveness — the average defect detection rate is only 25 percent for unit testing, 35 percent for function testing, and 45 percent for integration testing. In contrast, the average effectiveness of design and code inspections are 55 and 60 percent. Case studies of review results have been impressive:

  • In a software-maintenance organization, 55 percent of one-line maintenance changes were in error before code reviews were introduced. After reviews were introduced, only 2 percent of the changes were in error. When all changes were considered, 95 percent were correct the first time after reviews were introduced. Before reviews were introduced, under 20 percent were correct the first time.
  • In a group of 11 programs developed by the same group of people, the first 5 were developed without reviews. The remaining 6 were developed with reviews. After all the programs were released to production, the first 5 had an average of 4.5 errors per 100 lines of code. The 6 that had been inspected had an average of only 0.82 errors per 100. Reviews cut the errors by over 80 percent.
  • The Aetna Insurance Company found 82 percent of the errors in a program by using inspections and was able to decrease its development resources by 20 percent.
  • IBM’s 500,000 line Orbit project used 11 levels of inspections. It was delivered early and had only about 1 percent of the errors that would normally be expected.
  • A study of an organization at AT&T with more than 200 people reported a 14 percent increase in productivity and a 90 percent decrease in defects after the organization introduced reviews.
  • Jet Propulsion Laboratories estimates that it saves about $25,000 per inspection by finding and fixing defects at an early stage.

Code Complete
By Steve McConnell

[ Steve’s book Code Complete is HIGHLY recommended to anyone who codes; it is full of very useful information ]

Training

As you go through the review, the junior developers learn from the developers with decades of experience. The review introduces the junior developers to advanced concepts that they wouldn’t usually have access to, thereby raising their level of coding.

Risk Management

Problems and design flaws come up before it’s too late. And as a manager you get a chance to see how the developers who work for you think.

The risk of the key people who wrote the code going on holiday or leaving, taking the knowledge of how the system works with them, is reduced, because everyone knows about the code and how it works, or at least knows where to start if something goes wrong.

This of course is not as good as a documented system, but I’ve found it useful when management can’t see the reasons for spending time on documentation, especially if it’s out of date the moment it’s written.

Playing to everyone’s strengths

There may be people on the team with strengths in different areas of development: security, databases/SQL, OO patterns, etc. The review lets them have their say in improving the code and also pass that knowledge on to other developers.

Quality Assurance

There is the obvious quality assurance that comes from developers peer reviewing the code: more manageable code, more readable code, fewer bugs!

But there are many other benefits: giving developers the chance to talk about their code in front of their peers builds up their “public speaking” experience, and giving them responsibility for identifying specific code issues empowers them; the list goes on…

How to run a peer review:

In the reviews I’ve run in the past, we have given specific roles to people in each review meeting:

Moderator – This really needs to be someone “grownup”: one person with authority who isn’t going to be commenting directly on the code, a referee with the power to shut someone down if need be. Developers take code personally, and it’s important to remember that! The moderator’s prime role is to keep the review moving fast enough to be productive but slow enough to find the most issues. They also distribute the design or code to be reviewed along with an inspection checklist, set up the meeting, report the inspection results and follow up on any actions.

Author – Quite obvious: the person who wrote the code.

Reviewer – Anyone who has an interest in the code. The reviewers take a look at the code before the meeting and discuss any bugs or questions during the meeting.

Documenter – This could be the role of the moderator if the group is small, but should not be the role of the author or a reviewer (as that’s a bit of a conflict of interest). The documenter’s role is to record the bugs/issues and the assignment of any actions during the review meeting.

Management – No way. This is purely a technical review, so the focus is technical, not political. Managers could receive a review report (a list of actions from the review or notes on the checklist), a very simple report which can be written during the meeting. But this shouldn’t be used to evaluate developers in any way.

REVIEW RULES:
I’ve followed a number of rules in the past for the review:

Have checklists which focus the reviewers’ attention on areas that have been problems in the past.

The emphasis is on bug/issue detection, not correction.

Reviewers prepare for the review meeting beforehand and arrive with a list of issues/questions.

Information is passed on to the next review to improve future reviews (e.g. adding/removing questions on the checklist).

The meeting is for technical people and not management.

Under no circumstances should the review results be used for performance appraisals; that’s just killing the goose that lays the golden eggs. It doesn’t make sense anyway: performance should be based on final products/projects, not on work that isn’t finished. This is another reason why there is no management present; it means everyone is much freer to say what they think.

The meeting should have a minimum of 3 and a maximum of 6 people; any bigger and the review gets too big to manage and ends up becoming a hindrance.

GENERAL PROCEDURE FOR THE REVIEW:

In the past I’ve broken reviews down into 9 possible stages, but it’s really up to you how things work. Experiment with different ways of running the review until it fits your organisation:

Planning: The author gives the code/design to the moderator. The moderator then distributes the code to the people they think should review it, along with the checklist, which focuses the reviewers’ attention so they don’t waste too much time looking at the code. Usually the developer has already been working against the checklist, which highlights the key corporate issues.

Overview (optional): The overview is used only if the reviewers aren’t familiar with the project; the author spends some time just describing it. This is generally a dangerous practice because the reviewers could end up glossing over issues. The design or code should speak for itself; the overview shouldn’t speak for it, as that’s what the whole exercise is about.

Preparation: Each reviewer works alone, using the checklist, to become familiar with the design or code. For every 700 lines of code the reviewer “should” spend about an hour, but it’s really down to what works best in the organisation.

If you provide printouts for the review meeting make sure they have page numbers and line numbers.

At one company I worked for, we even had an additional step during preparation: checking the code with a static code analysis tool (PMD for Java development) to reduce the meeting time by pre-catching issues. Static code checkers also tend to catch a lot of important issues that many developers miss.
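PMD itself is Java-specific, but the idea of a mechanical pre-check is easy to illustrate. Here is a toy sketch in Python (not any real tool’s API) using the standard-library ast module to flag “magic numbers”, one of the checklist items further down, before the meeting even starts:

```python
import ast

# Toy pre-review check in the spirit of a static analyser: flag numeric
# literals ("magic numbers") that are not bound to an ALL_CAPS constant.
# Real tools such as PMD check far more than this.

ALLOWED = {0, 1, -1}  # values conventionally not treated as magic


def find_magic_numbers(source: str) -> list[int]:
    """Return the line numbers of suspicious numeric literals."""
    tree = ast.parse(source)
    # Lines where a value is bound to an ALL_CAPS name count as
    # deliberate constant definitions, not magic numbers.
    const_lines = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Assign) and all(
            isinstance(t, ast.Name) and t.id.isupper() for t in node.targets
        ):
            const_lines.add(node.lineno)
    issues = []
    for node in ast.walk(tree):
        if (
            isinstance(node, ast.Constant)
            and isinstance(node.value, (int, float))
            and not isinstance(node.value, bool)
            and node.value not in ALLOWED
            and node.lineno not in const_lines
        ):
            issues.append(node.lineno)
    return sorted(issues)
```

Running something like this over a file before the meeting lets the documenter pre-fill the issue list, so meeting time goes on the problems a machine can’t spot.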

Review Meeting: To start the review meeting, the author usually reads through the code/design, explaining all the logic, including each branch, loop and logical structure/object. While they are talking, the documenter records any issues/bugs, but all discussion of an issue stops as soon as it has been recognised. The documenter also notes the type and severity of the issue, and the review moves on.

As I said above, the meeting needs to keep moving or you’ll lose concentration, so depending on the project it shouldn’t take longer than 30 minutes to an hour. Everything needs to stay focused. I worked in one place where the group was banned from even discussing whether an issue really was an issue: they assumed that if one person is confused enough to think an issue exists, then it is an issue, and the design, code or documentation needs to be clarified.

Another good idea is to do a few peer reviews at the same time, as long as people stay focused. This isn’t always possible, but having three or four people go on the same day can really take the edge off. If you put your most good-natured, thick-skinned, or well-liked developer first, you can set a tone of academic civility that will keep everything smooth and help eliminate passionate discussions between developers.

Developers will find many faults and list them gleefully; they feel they have to, but it’s up to the moderator to keep everything on task. I also find that people fresh out of uni or college are very self-confident and ready to argue. They also take things a lot more personally, and you can end up in long drawn-out discussions.

Inspection report: The moderator creates a review report, usually during the meeting, that lists each defect, including its type and severity. You could even enter the defects straight into an issue tracking system during the meeting to save time, with the report simply being a list of those bugs. This helps ensure all the issues will be corrected, feeds back into the checklist (which will in time highlight recurring problems in coding), and gives a good indication of the number of issues/bugs found. From the report you can work out the time spent on issues over time, which can be used to prove the usefulness of reviews with hard data; otherwise you are just left saying reviews are good… because they are! It is also a good guide to whether the reviews are working or not.
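The report itself can be as simple as a structured list of those defects. A minimal sketch in Python (the field names here are my own, not from any particular tracker):

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class ReviewIssue:
    """One defect recorded by the documenter during the meeting."""
    file: str
    line: int
    kind: str      # e.g. "magic number", "missing header"
    severity: str  # e.g. "minor", "major"
    action: str    # what the assignee should do in the rework stage


def severity_summary(issues: list[ReviewIssue]) -> dict[str, int]:
    """Count defects per severity: the 'hard data' for proving reviews work."""
    return dict(Counter(i.severity for i in issues))
```

Summaries like this, collected over several reviews, are exactly the kind of evidence that lets you argue for reviews with numbers rather than faith.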

Rework: The moderator assigns the issues/actions to someone (usually the author, unless it’s a larger project) for repair, and the assignee resolves each defect on the list.

Follow-up: The moderator just checks to see how the issues/actions are being carried out.

Third-hour meeting: The peer review is just that: a review, so the people in the meeting shouldn’t be discussing solutions. Of course you will, so it’s better for those who wish to discuss different solutions to do so at another meeting, either straight after the review or at another time. This also allows things to be a bit less formal, gets more people involved, and doesn’t tie up the time of people who are not interested.

Fine-tuning the review: Hopefully over time the checklist will become more refined, with items added and dropped. You should see the same common issues appearing on the checklist, and you can work on refining how you work to fix them.

The Checklist:
As mentioned above, this is used to track common issues, so developers use it as a guide when developing. It shouldn’t exceed one sheet of paper. An example, based on a PHP project, is below:

Peer Review Checklist
1. Are the standard headers / footers being used?
2. Are functional JavaDoc/phpDoc comments being used, and are the exports correct?
3. Is there enough documentation to work out what the system is doing?
4. Is there enough error handling relating to the importance of the work?
5. Does the code follow naming conventions?
6. Does the code follow formatting conventions (bracketing)?
7. No Magic Numbers/Text
8. Any duplicate code or code which could be marked as reusable in the repository?
9. Removal of code which is not being used.
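To make items 5, 7 and 9 concrete, here is a small before-and-after sketch (the discount example and constant names are invented for illustration; the original project was PHP, but the idea carries over):

```python
# Before review: magic numbers, an unclear name, dead code left behind.
#
#     def calc(d):
#         if d > 30:           # what is 30?
#             return d * 0.85  # what is 0.85?
#         # old_rate = 0.9     # commented-out code, never removed
#         return d

# After review: named constants and a descriptive name (checklist 5, 7),
# with the dead code deleted (checklist 9).
BULK_ORDER_THRESHOLD = 30   # units above which the discount applies
BULK_DISCOUNT_RATE = 0.85   # 15% off bulk orders


def apply_bulk_discount(units: int) -> float:
    """Return the billable unit count after any bulk discount."""
    if units > BULK_ORDER_THRESHOLD:
        return units * BULK_DISCOUNT_RATE
    return float(units)
```

Fixes like this are cheap in the rework stage, and the named constants mean the next reviewer never has to ask what 30 or 0.85 meant.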


An Example of the Review Notes:
This was for a unified login system: a simple system allowing single-point sign-on to log in to an ASP- and PHP-based system:


Peer Review
23/03/2004

Uni-Login Script Review

Moderator: Mr X
Author: James X (PHP), Peter Y (Architect)
Reviewer: Alistair X (Senior DB)
Reviewer: John X (Java Developer)
Reviewer: Simon X (Technical Documentation)
Reviewer: Tim X


Reviewers: Please make sure you prepare for the review meeting beforehand so you can become familiar with the code and arrive with a list of issues/questions and make use of the Checklist below.
Author: In the meeting you will be expected to read the code/design and explain all the logic, including each branch, loop and logical structure/object.

Code to be Reviewed:
http://internal-cvs/cvs/cvs.php/Common_SQL/LMSUniLogout.asp
http://internal-cvs/cvs/cvs.php/Common_SQL/LMSUnifier.asp
http://internal-cvs/cvs/cvs.php/Common_SQL/LMSkeepalive.asp
http://internal-cvs/cvs/cvs.php/forumsUnifier.php
http://internal-cvs/cvs/cvs.php/account.php

Related Files:
S:\web-sw development\Documentation\Code Reviews\Code Review Checklist.doc
S:\web-sw development\Documentation\Code Reviews\Uni-Login Script.doc

Results From Review:
ForumsUnifier.asp
ACTION: Missing Header/Footer
ACTION: ASP Comments (including related files) missing
ACTION: Use include to lmskeepalive.asp rather than static code

LMSUniLogout.asp
ACTION: Header/Footer missing
ACTION: Point to gatekeeper (update link)

ForumsUnifier.php
ACTION: Header/Footer Missing
ACTION: Not enough documentation
ACTION: Remove code which is not being used (Line 15,17,34 to 40, 41)

account.php
ACTION: Header/Footer needs to be updated with latest versions, with more information
ACTION: Line 247: Needs a header refresh to index.php and remove acountDetails() reference and function.
ACTION: PHPDoc comments missing.
ACTION: Line 97 – 99, 229 comment removal
ACTION: Lines 127 to 171; Remove echo statements and drop out of PHP instead.
ACTION: Line 113; Remove magic numbers
ACTION: Line 114, 113; Remove rand functionality
ACTION: Line 212; replace “relevant” with “Company Systems”
ACTION: Line 218; change to “Back to website.com”

CheckList Items Added:
9. Any duplicate code should be put in a separate module for reusability.
10. Removal of code which is not being used.

Duration: 1 Hour
