For various reasons, I’ve been doing more than my usual number of code reviews recently, including three graduate code reviews on the same problem in Java over the weekend. One of the reviews I thought was relatively poor, one was average and one was quite good.
And it occurred to me that it wouldn’t have taken much for the two weaker submissions to be much, much closer to the best one. Many of the ways in which their code fell down are quite simple to address, hence this blog post.
Note: As with the other times I’ve written about our recruitment process, I don’t think I’m giving the game away with anything here. If anything, the people who read this post are probably already the sort who would do well in our recruitment process anyway 😦
Anyway, here are some quick ways to give your graduate code review submission the best chance to shine.
1. Make your README worth reading
From the perspective of a reviewer, I want to minimise the amount of time between when I first get my hands on your submission and when I can run your code. This means the following tasks need to be as frictionless as possible:
- Installation
- Build/package
- Run
The things likely to become roadblocks in these activities are:
Not having an easy way to build your application: although I’ve long moaned about Maven’s way of working, I’ve come to really appreciate its ability to let me build and package Java solutions without really needing to think about it. Likewise for projects that use Ant, Gradle, make or any of a number of other language-specific or language-agnostic build tools.
Not knowing which version of the language to use: Java 7? Java 8? Ruby 1.9.x? Ruby 2.x? I can find the right version by trial-and-error compile/run cycles until I have no errors… or you can put the versions into your README and put me out of my misery.
Looking for good examples of README files? Try any of the most popular libraries on GitHub to see what their authors have chosen to include.
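As a sketch, a README covering those three frictionless tasks might look something like this (the project name, versions and commands are invented placeholders — substitute your own):

```markdown
# Widget Sorter

Solution to the widget-sorting problem.

## Requirements

- Java 8 (built and tested with JDK 1.8.0)
- Maven 3.x

## Build

    mvn clean package

## Run

    java -jar target/widget-sorter-1.0.jar input.txt

## Assumptions

- Input lines are comma-separated; blank lines are ignored.
```

Even a README this short answers the reviewer's first three questions — what do I need, how do I build it, how do I run it — before they have to ask.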
2. Do out-of-the-box testing
There are sets of test data supplied with each of our coding problems. By all means, make sure your solution works according to this data, but don’t consider this data to be exhaustive! Try another couple of variants of the test data to see how your application behaves in these cases. This sort of testing might well expose limitations in your assumptions that result in unexpected behaviour in your code. Whether you choose to address these limitations in the code, or simply state them in your README is then your choice.
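To make that concrete, suppose the problem asked you to sum a list of balances read as strings — an invented example, not one of our actual problems. A naive solution can pass the supplied data while still hiding assumptions that a couple of extra variants quickly expose:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class EdgeCaseProbe {

    // A naive solution that handles the supplied data fine,
    // but makes unstated assumptions about its input.
    static long sumBalances(List<String> lines) {
        long total = 0;
        for (String line : lines) {
            total += Long.parseLong(line.trim());
        }
        return total;
    }

    public static void main(String[] args) {
        // The supplied test data: everything works.
        System.out.println(sumBalances(Arrays.asList("10", "20", "30")));

        // Variants the supplied data omits, each probing an assumption:
        System.out.println(sumBalances(Collections.<String>emptyList())); // empty input
        System.out.println(sumBalances(Arrays.asList("-5", "5")));        // negative values

        try {
            sumBalances(Arrays.asList("10", "", "30"));                   // blank line
        } catch (NumberFormatException e) {
            // The blank line blows up: either handle it in code,
            // or state "blank lines are not supported" in the README.
            System.out.println("blank line rejected");
        }
    }
}
```

Each variant either works (great — say so) or reveals a limitation you can then fix or document.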
3. If using an IDE, listen to the IDE
If you’re writing in a compiled language like C# or Java and using an IDE (IntelliJ, Eclipse or Visual Studio, say, rather than TextMate, Sublime Text, Vim or Emacs), you can vastly improve the first impression your code makes simply by turning on the IDE’s code-quality warnings and addressing them where needed:
- If your IDE highlights an unused variable or method, delete it.
- If your IDE highlights a statement that can be simplified, simplify it.
- If your IDE highlights an unnecessary initialisation, remove it.
- You get the idea!
None of these warnings is critical, but addressing them removes noise that would otherwise distract anyone reviewing your code.
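To illustrate, here's an invented Java snippet showing those three warnings (marked as comments), followed by the same behaviour with each warning addressed:

```java
// Before: three common IDE warnings, marked inline.
class ReportBefore {
    String buildHeader(String title) {
        String header = null;     // warning: unnecessary initialisation
        int unusedCount = 0;      // warning: unused variable
        if (title.length() > 0) { // warning: can be simplified to !title.isEmpty()
            header = "== " + title + " ==";
        } else {
            header = "== untitled ==";
        }
        return header;
    }
}

// After: identical behaviour, every warning addressed.
class ReportAfter {
    String buildHeader(String title) {
        if (!title.isEmpty()) {
            return "== " + title + " ==";
        }
        return "== untitled ==";
    }
}
```

The second version does exactly the same thing, but a reviewer can take it in at a glance — nothing on screen is there by accident.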
Good luck with your submissions!