Another quite typical week drifted past, with a TopCoder round and a Codeforces round. First, TopCoder SRM 634 took place very early on Friday morning (problems, results, top 5 on the left). The early start was too much for me, but I want to congratulate mikhailOK and piob on solving all three problems and thus leaving the rest of the competition far behind — great job!
Codeforces Round 270 occupied Sunday evening (problems, results, top 5 on the left). The problemset had a very nice common background story: each problem explained how one can come up with problems of this type; consider reading them all if you're looking for inspiration for creating new problems.
The hardest problem brought an interesting aspect of programming competitions into the limelight: sometimes the expected algorithm is so complicated that the constant factor hidden in the O-notation of its running time is large enough for a very straightforward, efficiently implemented solution to be faster despite its worse asymptotic complexity. In this particular problem, the reference solution from the editorial has complexity O((n log n)^1.5), while most, if not all, accepted solutions have complexity O(n^2). With n as high as 200000, one might think that the extra factor of roughly the square root of n would dominate, but since the O(n^2) solutions are very straightforward and thus easy to optimize, for example using bitwise operations, this is not the case.
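To give a feel for the constant factor involved, here's a minimal generic sketch of the word-packing trick (an illustration, not the actual contest solution): storing n boolean flags in n/64 machine words means that a single AND plus a single popcount handles 64 elements of the inner loop at once.

    #include <cstdint>
    #include <vector>

    // Count positions set in both bitsets, with bits packed 64 per word.
    // One iteration of this loop replaces 64 iterations of a naive
    // element-by-element loop: that factor is exactly the small constant
    // of the bitwise O(n^2) solutions.
    long long countCommonBits(const std::vector<uint64_t>& a,
                              const std::vector<uint64_t>& b) {
        long long total = 0;
        for (std::size_t i = 0; i < a.size() && i < b.size(); ++i)
            total += __builtin_popcountll(a[i] & b[i]);  // GCC/Clang builtin
        return total;
    }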
ilyakor's accepted solution is the fastest, solving the worst testcase in just 2.5 seconds, well under the 7-second time limit, by reusing publicly available SSE3-based code for counting the number of set bits in a bitset (a sketch of this technique appears below): a great use of all available resources to solve the problem! But as far as programming competitions in general are concerned, we've mostly given up on separating O(n^2 * polylog(n)) algorithms from O(n^3) ones, and it's sometimes even difficult to separate O(n * polylog(n)) algorithms from O(n^2) ones. It's a pity, since we exclude a large class of quite beautiful algorithms and data structures this way :( Any ideas on reversing the trend?
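For the curious, the vectorized bit counting mentioned above usually boils down to the classic nibble-lookup popcount (technically built on pshufb, which is an SSSE3 instruction); the sketch below illustrates that general technique and is not ilyakor's actual code:

    #include <emmintrin.h>
    #include <tmmintrin.h>  // _mm_shuffle_epi8 (pshufb) is SSSE3

    // Popcount of 16 bytes at once: look up the popcount of every 4-bit
    // nibble in a 16-entry table via pshufb, then let _mm_sad_epu8 add
    // the per-byte counts horizontally into two 64-bit lanes.
    static inline __m128i popcount128(__m128i v) {
        const __m128i lookup = _mm_setr_epi8(0, 1, 1, 2, 1, 2, 2, 3,
                                             1, 2, 2, 3, 2, 3, 3, 4);
        const __m128i lowMask = _mm_set1_epi8(0x0f);
        __m128i lo = _mm_and_si128(v, lowMask);
        __m128i hi = _mm_and_si128(_mm_srli_epi16(v, 4), lowMask);
        __m128i perByte = _mm_add_epi8(_mm_shuffle_epi8(lookup, lo),
                                       _mm_shuffle_epi8(lookup, hi));
        return _mm_sad_epu8(perByte, _mm_setzero_si128());
    }

Processing 128 bits per step, this tends to beat calling a scalar popcount on every word, especially on hardware of that era without a native POPCNT instruction.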
Thanks for reading, and check back next week!