It seems as though every time I run the same code I may get a vastly different runtime depending on the time or day I run it.
Me too. This can be due to many different things, but a common reason is that the execution server uses dynamic frequency scaling on its CPU. This means the server's CPU runs at a lower frequency when it gets too hot, in order to avoid overheating and the risk of component damage.
Conversely, when the server is under light load because very few people are submitting jobs, it is normal for your code to run at maximum speed.
Another reason may be OS preemption: the operating system takes control of the CPU (to handle an interrupt, for example), temporarily pausing your program.
A common way to work in these situations is to measure performance in terms of the number of CPU cycles the program requires, rather than wall-clock time. There are profiling tools (such as Linux's `perf`) that do exactly that. This way, it doesn't matter if the CPU frequency varies or if the OS pushes your program out for a few milliseconds.
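As a middle ground, even without a profiler you can reduce the effect of preemption by measuring CPU time instead of wall-clock time. Here is a minimal Python sketch comparing the two (the workload function `busy_work` is just an illustrative placeholder); note that CPU time still varies with frequency scaling, so for fully stable numbers you'd still want a cycle counter like `perf stat`:

```python
import time

def busy_work(n=2_000_000):
    # Illustrative CPU-bound loop; the result is returned so the
    # work can't be trivially skipped.
    total = 0
    for i in range(n):
        total += i * i
    return total

wall_start = time.perf_counter()   # wall-clock time: inflated by preemption
cpu_start = time.process_time()    # CPU time charged to this process only

busy_work()

wall_elapsed = time.perf_counter() - wall_start
cpu_elapsed = time.process_time() - cpu_start

# If the OS preempts the process, wall time grows but CPU time does not.
print(f"wall time: {wall_elapsed:.4f} s")
print(f"CPU time:  {cpu_elapsed:.4f} s")
```

On a loaded machine you will typically see the wall time exceed the CPU time, which is exactly the scheduling noise you want to exclude.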