Runtime speed comparison


  • 1
    J

    The runtime speed comparison feature seems to depend heavily on the
    system load and scheduler.

    Maybe another metric, such as the number of instructions executed,
    or the sum of sys + user time (instead of real time), would give a better
    performance indicator (the Linux kernel provides the tools to measure those values). A minimal sketch of the CPU-time idea is shown below.


  • 0
    M

    @jeremagician Late, but +1 on this because I noticed the same thing. I just had identical solutions (literally resubmitting the same code without modifications) vary between beating 10%, 40%, and 90% of submissions, within 5 minutes.


  • 0
    O

    I know it's quite an old thread, but...
    I'm a new user here and have just done two tasks to try out how the site works, and one thing I'm curious about is the runtime speed distribution graph. It's cool to see how fast my implementation is compared to other people's solutions, but when I tried to learn from their code (that is, I started to use some of their approaches) I usually failed. So I tried copying and pasting whole solutions, and that code didn't even pass all the test cases, which means new test cases have been added since those implementations were submitted. This brings me to the question: after any change to the test cases or task description, should all old (already submitted) solutions still be shown on this graph?

