I wondered which of std::list and std::vector is the more efficient option, so I ran a benchmark to see for myself.
Scenario 1: Insert n items, traverse n times.
When n=100, there isn't a huge difference, but list is better. For vector it's between 15ms and 78ms, and for list it's between 15ms and 62ms.
When I average the ratios, list is ~11% faster.
When n=300, I can't see a clear winner. For vector it's between 200ms and 310ms, and for list it's between 203ms and 343ms.
According to the average of the ratios, vector is ~3% faster.
Here's the source for this benchmark: http://pastebin.com/f41f684cb
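That pastebin link is no longer available, so here is a minimal sketch of what Scenario 1 measures, assuming int elements and a simple begin-to-end traversal (the original code may have differed in the details):

#include <chrono>
#include <iostream>
#include <list>
#include <vector>

// Rough reconstruction of Scenario 1: insert n items, then traverse the container n times.
template <typename Container>
void benchmark(const char* name, int n)
{
    auto start = std::chrono::steady_clock::now();

    Container c;
    for (int i = 0; i < n; ++i)
        c.push_back(i);                          // insert n items

    long long sum = 0;
    for (int pass = 0; pass < n; ++pass)         // traverse n times
        for (auto it = c.begin(); it != c.end(); ++it)
            sum += *it;                          // touch each element so the traversal isn't optimized out

    auto end = std::chrono::steady_clock::now();
    std::cout << name << ": "
              << std::chrono::duration_cast<std::chrono::microseconds>(end - start).count()
              << " us (checksum " << sum << ")\n";
}

int main()
{
    const int n = 300;
    benchmark<std::vector<int>>("vector", n);
    benchmark<std::list<int>>("list", n);
}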
Scenario 2: Insert n items, traverse n times, and replace 20% of the items on every iteration whilst traversing.
When n=100, list mostly wins. For vector it's between 170ms and 250ms, and for list it's between 172ms and 190ms.
List is ~5% faster on average.
When n=200, list is always the winner. For vector it's between 1360ms and 1531ms, and for list it's between 1297ms and 1422ms.
List is ~7% faster.
Here's the source for this benchmark: http://pastebin.com/f522f7e6b
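This pastebin link is gone as well. Below is a rough sketch of Scenario 2, under the assumption that "replace" means erasing an element and inserting a new one in its place (which matches the insert & remove discussion in the conclusion); the original code may have done it differently:

#include <chrono>
#include <iostream>
#include <list>
#include <vector>

// Rough reconstruction of Scenario 2: insert n items, traverse n times, and on every
// traversal "replace" about 20% of the elements by erasing them and re-inserting a value.
template <typename Container>
void benchmark(const char* name, int n)
{
    Container c;
    for (int i = 0; i < n; ++i)
        c.push_back(i);

    auto start = std::chrono::steady_clock::now();
    for (int pass = 0; pass < n; ++pass)
    {
        int index = 0;
        for (auto it = c.begin(); it != c.end(); ++it, ++index)
        {
            if (index % 5 == 0)                  // every 5th element, i.e. roughly 20%
            {
                it = c.erase(it);                // remove the element...
                it = c.insert(it, pass);         // ...and insert a replacement in the same spot
            }
        }
    }
    auto end = std::chrono::steady_clock::now();

    std::cout << name << ": "
              << std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count()
              << " ms\n";
}

int main()
{
    const int n = 200;
    benchmark<std::vector<int>>("vector", n);
    benchmark<std::list<int>>("list", n);
}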
Conclusion
If the number of items in your world is around 300 or more, and you don't perform many insert & remove operations, pick vector. Otherwise, pick list.
I think the reason vector can't cope with continuous insert & remove operations is that it has to relocate the remaining elements of the array when you delete one of them. For list it's far easier: it's implemented as a linked list, so removing an element is just a matter of modifying two pointers.
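To illustrate the difference (this is not part of the benchmark): erasing from the middle of a vector shifts every later element one slot to the left, while erasing from a list only rewires the neighbouring nodes.

#include <iterator>
#include <list>
#include <vector>

int main()
{
    std::vector<int> v = {1, 2, 3, 4, 5};
    v.erase(v.begin() + 2);            // 4 and 5 are moved down one slot: O(n) work

    std::list<int> l = {1, 2, 3, 4, 5};
    auto it = l.begin();
    std::advance(it, 2);               // finding the node is O(n)...
    l.erase(it);                       // ...but the removal itself is O(1): two pointer updates
}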
On the other hand, I think the reason the list beats the vector at 100 items comes down to the allocation strategy the vector uses. I don't know exactly how MS implemented it in their standard library, but it's generally something like doubling the allocated memory every time it's completely consumed. If it starts with a number like 5, it has to be doubled 5 times to reach 100, and every time it grows, the memory manager has to find a bigger contiguous chunk of memory for it. After 100, it only has to be doubled twice to reach 300.
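The exact growth factor is implementation-defined (MSVC is commonly reported to grow by about 1.5x, while many other implementations double). A quick way to see what your compiler actually does is to watch capacity() change while pushing elements:

#include <iostream>
#include <vector>

int main()
{
    std::vector<int> v;
    std::vector<int>::size_type last = 0;
    for (int i = 0; i < 300; ++i)
    {
        v.push_back(i);
        if (v.capacity() != last)              // a change here means a reallocation happened
        {
            last = v.capacity();
            std::cout << "size " << v.size() << " -> capacity " << last << '\n';
        }
    }
}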
6 comments:
Good post, thanks for your benchmarking. I have some code that uses vector to do processing similar to your second scenario. I ran your benchmarking code, and I will be switching to list now. Only buys about 3% performance but hey, every little bit helps. Thank you.
Thanks :)
Good post!! I did like the way you benchmarked.
Your benchmark is flawed. The reason you're seeing std::vector perform so poorly is due to the way vectors handle resizing.
Every time you insert an item that exceeds the vector's internal buffer size, the vector doubles the buffer size. In your code, you're starting from a vector with a buffer size of 0. I've modified your code for Scenario 1 and now std::vector handily beats std::list. The larger the value of i, the larger the performance difference between std::list and std::vector.
Scenario 2 is custom-made for std::list to shine. Lists excel at insertions, while vectors excel at random access. Thus if your code is going to do a lot of deletes and inserts into the middle, list is your best bet. That said, your code can be better written by jumping to the position of the item to be deleted, instead of iterating through every item in the vector. See http://pastebin.com/f2edc593f
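The modified code isn't shown and the linked pastebin has expired, so the snippet below is only a guess at what the two suggested changes look like: reserving the final size up front so the buffer never has to grow, and using the vector's random access to jump straight to the element being erased.

#include <vector>

int main()
{
    const int n = 300;

    // Presumed fix for Scenario 1: reserve the final size so the buffer never has to grow.
    std::vector<int> v;
    v.reserve(n);                      // one allocation instead of a series of doublings
    for (int i = 0; i < n; ++i)
        v.push_back(i);

    // Presumed "jump to the position" change: erase by index instead of walking an iterator.
    int pos = n / 2;                   // hypothetical position of the item to delete
    v.erase(v.begin() + pos);          // random access: no pass over the earlier elements
}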
If you are using Visual Studio, you should try adding
#define _SECURE_SCL 0
before including any headers; that will remove all the nasty extra checks MS added to their STL code. I tried it and saw a 6x increase in performance, with vector easily thumping list.
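For reference, a minimal example of the suggested change; note that _SECURE_SCL applies to the older Visual Studio versions this post targets, while VS2010 and later use _ITERATOR_DEBUG_LEVEL instead:

// _SECURE_SCL must be defined before any standard header is pulled in.
#define _SECURE_SCL 0

#include <list>
#include <vector>

int main()
{
    std::vector<int> v(100, 0);        // iterating these no longer pays for checked iterators
    std::list<int> l(100, 0);
}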