The two fastest reported times so far are both for Python solutions. As much as I'd like to think Python is faster for once, I have to admit I doubt it. I just tested my Python solution and two posted C++ solutions on a large test case on my own PC, and the C++ solutions were far faster, by roughly a factor of 20. At the OJ it's the reverse: my Python solution is far faster, by about a factor of 5. Together, that's a mysterious factor of 100 (!) between my PC and the OJ that I can't explain.
What's going on? Are the C++/Java solutions given harder test cases here? The count is the same, 32 test cases. Or is compilation time included? Or is it something else?
Currently the reported runtime includes everything from program startup to exit. So please don't put too much weight on comparisons between different languages, as they use different methods to parse input, etc.
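For illustration, here's a sketch of the difference between whole-process timing and solution-only timing. The `solve` function and the input format are hypothetical stand-ins, not the judge's actual harness; the point is just that a harness can separate parse time from solve time with `time.perf_counter()`:

```python
import time

def solve(nums):
    # Placeholder "solution": sum of squares (stands in for a real submission).
    return sum(x * x for x in nums)

def run():
    # Stand-in for a large OJ input: 200,000 lines of integers.
    raw = "1 2 3 4 5\n" * 200000

    t0 = time.perf_counter()
    # Parsing work: this is the part a whole-process timer charges to the solution.
    data = [int(tok) for line in raw.splitlines() for tok in line.split()]
    t1 = time.perf_counter()

    answer = solve(data)
    t2 = time.perf_counter()

    # A judge that records only (t2 - t1) would exclude parsing overhead.
    print(f"parse: {t1 - t0:.3f}s  solve: {t2 - t1:.3f}s")
    return answer

run()
```

If the harness only recorded the second interval, the language-to-language parsing differences mentioned above would stop dominating the reported times.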
Yes, I agree that C++ is usually faster than Java/Python, but in this case C++ is slower, and I believe the bottleneck is the input parsing, which uses a third-party library.
Aw, that's a pity. I don't care that much about comparing the languages, but I'd much prefer all timings to reflect how long the actual solutions took, not how long the framework took. This case is really extreme: if the C++ solutions are 20 times as fast as my Python solution on the OJ's test cases as well, then the framework accounts for about 99% of their reported runtime! Could that be fixed, so that only the time taken by our solutions is measured?