On code reviews

Yesterday I was asked this question in a tech interview. It is not a usual interview question; usually we jump straight to inverting the binary tree (figuratively) right after the introductions.
I thoroughly enjoyed thinking back through my past experiences reviewing code, the kinds of comments we encounter (and tend to, or aspire to, provide to other developers), and how code reviews were handled in my team and the broader organisation.
Reviewing code is hard intellectual labour.
First off, I want to acknowledge the effort that goes into every code review. Here is why I believe it is hard work:
- The reviewers have their own tasks to take care of.
- To provide meaningful feedback, at the bare minimum the reviewer must understand both the problem and your solution.
- On top of that, they often need to think through a lot more:
- Alternate or better approaches to solving the same problem.
- The corner cases that the change owner might have missed.
- How this change will behave in the larger system, i.e. whether it will introduce side effects. (Say, during application initialisation/exit, or the even more subtle case of upgrades: will the code behave correctly after an upgrade that involves DB and other external schema changes?)
- Have you considered on-prem deployments? They often have longer release cycles, and in the wild the system might not even be able to reach the internet for security reasons.
- Is your solution backwards compatible? (Think of APIs and other systems that will use this functionality.)
- Will this change surprise users? Will they lose anything (data entered, extra input effort, configuration, or, God forbid, data in storage or, worse, a backup)?
- In case of a major feature or behavioural change in the product, how easy is it to salvage the situation? Do you have a kill switch? (It should require no code change; ideally it is toggled through the UI, backed by a config file.)
- Is the code developer friendly? Are log messages clearly worded and at the right logging level (INFO, DEBUG, ERROR)? Do they include all the data points/variables needed to identify the issue with minimal effort?
- A more aspirational one: is the change IT/Admin/Ops friendly? Does the logging/monitoring trail include all the data points that might help them (and us) isolate the problem before even looking at the code? (Imagine the SWAT, IT admin or operations teams who might be rolling out your code late at night, during off-peak hours or, worse, on Sundays.)
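To make the kill-switch idea above concrete, here is a minimal sketch in Python. The file name, flag names, and defaults are my own illustrative assumptions, not from any particular product; the point is only that flipping the flag requires editing a config file, not shipping code.

```python
import json

# Hypothetical config file; in a real product this would live with the
# rest of the deployment configuration and ideally be editable from a UI.
FLAGS_FILE = "feature_flags.json"

def is_enabled(flag_name: str, path: str = FLAGS_FILE, default: bool = False) -> bool:
    """Read a feature flag from the config file on every call, so that
    flipping the flag requires no code change, rebuild, or redeploy."""
    try:
        with open(path) as f:
            flags = json.load(f)
        return bool(flags.get(flag_name, default))
    except (OSError, ValueError):
        # Missing or malformed config: fall back to the safe default.
        return default

def process_invoice(invoice):
    # Guard the risky new behaviour behind the flag; the old, proven
    # code path stays reachable until the new one has earned trust.
    if is_enabled("new_billing_pipeline"):
        return "new-path"   # placeholder for the new behaviour
    return "old-path"       # placeholder for the existing behaviour
```

Reading the file on every call is deliberately naive (a real system would cache with a short TTL), but it keeps the example honest: one edit to `feature_flags.json` and the next request takes the other branch.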

Now that we have established that reviewing code is hard work and deserves recognition and care, let us think about how to make it as easy as possible for the people who have to review your code.
Before posting the code for review, make sure you follow a checklist. (Make one for your team; it will pay rich dividends!)

1. Make the change small.
Post incremental reviews for large changes. E.g. for APIs, a good starting point is the interfaces or the OpenAPI specs.
2. Do the first-pass review yourself.
You will be surprised by what you catch, and grateful that you did. Typos and weird naming are just a few of the things you can fix immediately.
3. Name the classes, methods, variables, and everything in between appropriately. Let it be a long name if it has to be. Be descriptive in comments and code.
4. When in doubt, prioritise (IMHO, in this order):
- Security.
- Usability.
- Maintainability.
- Readability — especially a few months down the line.
- Testability — consider mock-ability and automated testability without expensive application-context initialisations or, worst of all, bringing up an entire dependent system like a DB or a cache.
- Algorithmic complexity.
- Space complexity. (Storage is cheap; sacrifice it if that improves any of the above.)
5. Strive for automated tests (preferably unit tests, which are fastest and usually run with builds, followed by integration tests).
Tests are immensely useful in helping reviewers understand the solution, expectations, and assumptions. I love to start the review from the tests whenever they are available. It is a joy to read through a code review with unit tests.
6. For humanity’s general well-being and advancement:
- Write a concise summary of the change: state assumptions, alternate approaches considered, and why you zeroed in on this solution.
- Links to static analysis results (for security, style, and inclusive terminology).
- Links to the user story/bug/spec/design document, etc.
- A snippet of the test results and, ideally, the code-coverage diff.
- State any potential issues anticipated. (E.g. we tend to assume APIs always receive a response, while in reality the connection can simply fail, the authentication token might expire, you might get rate-limited, the API might have been deprecated… you get the idea.)
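As a small illustration of points 5 and 6 together, here is a hedged sketch of how a unit test can document the "APIs do not always answer" assumption for the reviewer. The `fetch_display_name` function and the client it wraps are hypothetical, invented for this example:

```python
from unittest.mock import Mock

# Hypothetical function under review: wraps an API client call and
# turns transport failures into a safe placeholder instead of crashing.
def fetch_display_name(client, user_id: str) -> str:
    try:
        return client.get_user(user_id)["display_name"]
    except Exception:
        # Connection failures, expired tokens, rate limits, a
        # deprecated endpoint... all degrade to a placeholder value.
        return "<unknown>"

def test_happy_path():
    client = Mock()
    client.get_user.return_value = {"display_name": "Ada"}
    assert fetch_display_name(client, "u1") == "Ada"

def test_api_failure_degrades_gracefully():
    # This test IS the documentation: it tells the reviewer that the
    # API may not answer at all, and callers still get a usable value.
    client = Mock()
    client.get_user.side_effect = ConnectionError("rate limited")
    assert fetch_display_name(client, "u1") == "<unknown>"
```

A reviewer who starts from these two tests already knows the contract, the failure mode considered, and the chosen fallback, before reading a single line of the implementation.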
Cheers,
Gopi.