Avoiding The Needless Multiplication Of Forms

Automated Testing In Docker

Dr Andrew Moss

2015-09-03

Part 1/5

Everyone has an itch that they want to scratch using automated testing. Where exactly it itches depends on which parts of the job they find the most routine and boring. As a teacher I need to assess my students. It's not a particularly fun, or rewarding, part of the job. Sadly, though, it is highly necessary. In any given term I run through the following list hundreds of times:
  1. Retrieve the submitted item from the next student.
  2. Unpack their submission.
  3. Perform some syntactic checks; depending on the course this could be:
     - passing their code through a compiler to see if their executable will be graded, or
     - running a custom parser against the syntax specified in their assignment.
  4. Run a suite of tests. Sometimes this involves feeding specific input to their executable and checking the output. Sometimes this involves running shell commands in a known environment and checking how it has changed afterwards, in addition to any explicit output.
  5. Do some semantic analysis: i.e. read their code, poke around a bit and estimate their understanding.
  6. Take the results of the testing, combine them with an estimate of their understanding and award a grade.
Some of these steps are easier than others. Some of these steps are more boring than others. It is only the final step that I actually get paid to do: the rest are just functional dependencies. Looking at this process from a student's perspective, steps 1-5 often produce more important information than step 6. During the first five steps there is a strong possibility that they will learn something new about their submission. Step six is really just a confirmation.
Learning something new about their submission is a highly profitable development for a student, and as a teacher I want to keep them in that state as often as I can get away with it. Possibly it is because interactive processes drive retention rates higher and improve learning outcomes, in the modern parlance. Or possibly it is because that is the bit that is actually teaching.
So, we're all highly accomplished computer scientists and formidable command-line ninjas. Why not just write a script for the first five steps? The simplest answer is that it involves running untrusted code in a live production environment. We don't do that round here. Normally, an experienced teacher will do step five ahead of step four. Just in case. It's not that many students would place something malicious in their submission, although honestly a few would. It's because...
I must have put a decimal point in the wrong place or something. Shit! I always do that. I always mess up some mundane detail.
--- Michael Bolton
Accidental damage from code is a much larger risk than facing a malicious adversary (who knows that you are about to grade their work). So the real problem is isolation. How can we script the first five steps in a way that isolates them from the real system? Hello, Docker.
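As a taste of what the rest of this series builds up to: a container gives the grading script a disposable sandbox, so a decimal point in the wrong place trashes the container rather than the host. The image name `grading-env` (assumed to have a compiler installed) and the mount paths below are illustrative assumptions, not a finished recipe:

```shell
# Sketch: run the compile-and-test step inside a throwaway container.
#   --rm        : the container and its filesystem vanish afterwards
#   --net=none  : the student's code gets no network access
#   -v ...:ro   : the submission is mounted read-only; the host stays untouched
docker run --rm --net=none \
    -v "$PWD/submission:/submission:ro" \
    grading-env \
    sh -c 'cp /submission/*.c /tmp && cd /tmp && cc -o prog *.c && echo built'
```

Whatever the submission does when it runs, the blast radius is one short-lived container.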