Code Review Best Practices
Sources: SmartBear, Perforce, and JetBrains
Best Practices from SmartBear
Ref: Best Practices for Peer Code Review (somewhat self-promotional)
- Review fewer than 200-400 lines of code at a time
- Aim for an inspection rate of less than 300-400 LOC/hour
- Use enough time for a proper review, but not longer than 60-90 minutes
- reviewer effectiveness drops after 60 minutes
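Taken together, the size and rate limits bound the length of a review session. A quick sanity check (using the 200-400 LOC and 300-400 LOC/hour figures above):

```python
# Sanity check: do the SmartBear limits fit in one review session?
max_loc = 400            # upper bound on lines reviewed at a time
rate_loc_per_hour = 300  # conservative inspection rate
minutes = max_loc / rate_loc_per_hour * 60
print(f"{minutes:.0f} minutes")  # 400 LOC at 300 LOC/hour = 80 minutes
```

So reviewing the maximum amount at the conservative rate takes 80 minutes, which still fits within the 60-90 minute window.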
- The author should annotate the source code before the review begins. By annotating their own code,
- the author finds many bugs themselves
- the annotations help reviewers understand the code or changes, which saves time
- Establish quantifiable goals for code review and capture metrics so you can improve your process
- this requires recording data about defect rates, support calls, etc., for comparison
- not a vague, fuzzy goal like “find more bugs”
- Use a checklist. Checklists improve results
- start with a standard checklist
- add a personal checklist of your own common mistakes, or things you forget to review
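For example, a starter checklist might look something like the following (the items are illustrative, not taken from the sources above):

```
Code Review Checklist (sample)
[ ] Does the code do what the commit message / issue says?
[ ] Are edge cases and error conditions handled?
[ ] Are there tests for the new or changed behavior?
[ ] Are names clear and consistent with project conventions?
Personal additions (your own common mistakes):
[ ] Did I check for off-by-one errors in loops?
[ ] Did I close files and release other resources?
```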
- Verify that defects are actually fixed!
- the issues/defects often are not things that go into the project bug tracker
- use a tool for defect tracking; for class projects, GitHub Issues are OK
- reviewers should verify the issues really were fixed, not just “closed”.
- Managers must foster a good code review culture in which finding defects is viewed positively
- must take the positive view that finding defects improves the code and improves the developers’ skills/knowledge
- fosters good communication
- don’t criticize developers when defects are found. If you do, they won’t look for defects.
- Beware the “Big Brother” effect
- be careful how metrics are used to avoid discouraging defect reporting
- don’t single out individuals based on metrics
- The Ego Effect: Do at least some code review
- Reviews encourage developers to improve their code quality out of a sense of pride or ego
- Lightweight code reviews are efficient, practical, and effective at finding bugs
- SmartBear claims that lightweight reviews find almost as many defects as formal inspections, but take less time
Best Practices from Perforce
Ref: 9 Code Review Best Practices
For Participating in Peer Code Reviews
- Know What to Look for in Code Reviews
- Build and Test Before Code Reviews
- use CI to build and test your code
- use tools for static analysis and code quality
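As a sketch of the "build and test before review" advice, a minimal CI configuration could look like this (a hypothetical GitHub Actions workflow; it assumes a project with a `make test` target):

```yaml
# .github/workflows/ci.yml (illustrative)
name: ci
on: [pull_request]          # run before reviewers look at the change
jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: make test      # assumes the project defines a "make test" target
```

This way reviewers never spend time on code that does not even build or pass its own tests.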
- Don’t Review Code for Longer Than 60 Minutes
- ensure the reviewer has a fresh mind before the review starts
- Review No More than 400 Lines at a Time
- if you try to review too much you will miss (overlook) more defects
- Give Feedback That Helps (Not Hurts)
- be constructive in feedback, not critical
- ask questions rather than making statements
- give (sincere) praise along with constructive feedback
- approach code review as a learning process
For Running Code Reviews
- Communicate Goals and Expectations
- be clear what the goals are, and what you expect from reviewers
- give reviewers a checklist to ensure reviews are consistent
- Include Everyone in the Code Review Process
- both senior and junior developers should review and have their work reviewed
- Foster a Positive Culture
- create a positive culture for code improvement
- teams should appreciate code reviews, not dread them
- Automate to Save Time
- use tools to automate what you can. Don’t waste reviewers’ time.
- static code analysis to help find problems
- Ex: uninitialized or unused variables, wrong parameters passed to a function
- IDE to find syntax errors
- style checker(s) for coding style
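As a sketch of the kind of issue static analysis catches automatically, here is a minimal, stdlib-only Python check (a toy example, not a real analyzer) that flags unused parameters and unused local variables:

```python
import ast

# A toy snippet containing an unused parameter and an unused local variable.
source = """
def area(width, height, scale):
    w = width
    return width * height
"""

func = ast.parse(source).body[0]
params = {a.arg for a in func.args.args}
# Names that are read (Load context) vs. assigned (Store context) in the function.
loads = {n.id for n in ast.walk(func)
         if isinstance(n, ast.Name) and isinstance(n.ctx, ast.Load)}
stores = {n.id for n in ast.walk(func)
          if isinstance(n, ast.Name) and isinstance(n.ctx, ast.Store)}

print("unused parameters:", sorted(params - loads))  # ['scale']
print("unused locals:", sorted(stores - loads))      # ['w']
```

Letting a tool report such mechanical issues before the review lets human reviewers spend their time on design and logic instead.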
Best Practices from JetBrains
Ref: Code Review Best Practices at https://blog.jetbrains.com (transcript of their video)
- Automate as much as possible
- Agree on goals for Code Review
- Use a common check list for review
- Prepare for review
- the author should check their own code first
- annotate (comment) the code to help reviewers understand it
- During Review
- Respond promptly
- Set clear expectations
- Try to resolve review issues quickly
- Close the Review when everyone is satisfied
Some of the best practices in the video apply to online reviews, where reviewers work separately and asynchronously. I tried to extract the best practices that also apply to face-to-face reviews.