Question about sparse_hash::find_impl. #1
Comments
Hello,

Thank you for the report. It should not be possible as the … Do you call `erase` on the map?
Yes, I do call `erase` on the map.
… If every bucket is either marked as deleted or containing a value (no free bucket), the `find` and `erase` operations would loop forever.
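The failure mode described above can be sketched with a hypothetical miniature of an open-addressing table with tombstones. This is not the `tsl::sparse_map` code; the names (`Table`, `State`, etc.) are made up for illustration. A naive probe loop that only stops on an empty bucket never terminates once `erase` has turned every remaining bucket into "full" or "deleted"; bounding the probe count by the bucket count, as `find` does here, is one way to guarantee termination.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Illustrative open-addressing table with linear probing and tombstones.
struct Table {
    enum class State { EMPTY, FULL, DELETED };
    std::vector<State> state;
    std::vector<int> keys;

    explicit Table(std::size_t n) : state(n, State::EMPTY), keys(n, 0) {}

    std::size_t bucket(int k) const { return static_cast<std::size_t>(k) % state.size(); }

    void insert(int k) {
        std::size_t i = bucket(k);
        while (state[i] == State::FULL) i = (i + 1) % state.size();
        state[i] = State::FULL;
        keys[i] = k;
    }

    void erase(int k) {
        std::size_t i = bucket(k);
        for (std::size_t p = 0; p < state.size(); ++p) {
            if (state[i] == State::EMPTY) return;  // key cannot be further along
            if (state[i] == State::FULL && keys[i] == k) {
                state[i] = State::DELETED;          // tombstone, not EMPTY
                return;
            }
            i = (i + 1) % state.size();
        }
    }

    bool find(int k) const {
        std::size_t i = bucket(k);
        for (std::size_t p = 0; p < state.size(); ++p) {  // bounded probe count
            if (state[i] == State::EMPTY) return false;    // free bucket: key absent
            if (state[i] == State::FULL && keys[i] == k) return true;
            i = (i + 1) % state.size();                    // skip DELETED buckets
        }
        return false;  // probed every bucket: no EMPTY bucket left in the table
    }
};
```

Without the `p < state.size()` bound, a `find` for an absent key on a table whose buckets are all FULL or DELETED would spin forever, which matches the 100% CPU symptom reported here.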
Hello,

It could effectively come from a bug in the … I pushed a fix (and a test that reproduces the previous problem); warn me if it ever occurs again. I'll probably also add this weekend a way to automatically shrink the map when the number of deleted buckets is too high compared to the number of free ones (kind of like …).
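The shrink heuristic mentioned above could be sketched roughly as follows. This is a hypothetical helper, not the `tsl::sparse_map` policy, and the threshold is purely illustrative: when deleted (tombstoned) buckets dominate the free ones, probe chains rarely hit an empty bucket, so triggering a rehash to reclaim tombstones becomes worthwhile.

```cpp
#include <cassert>
#include <cstddef>

// Illustrative rehash trigger: rehash once tombstones outnumber free buckets.
bool should_rehash(std::size_t nb_buckets,
                   std::size_t nb_elements,
                   std::size_t nb_deleted) {
    const std::size_t nb_free = nb_buckets - nb_elements - nb_deleted;
    return nb_deleted > nb_free;  // more tombstones than free buckets
}
```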
Hi,

Thank you for the fast fix. I'm going to pull it and start using it today.

Pavel.
I'll close it. Don't hesitate to reopen it if the problem still occurs.
Hi there,
I'm using your `tsl::sparse_map` with its default settings (growth policy, etc.) in the memory-cache subsystem of our P2P proxy.
In the last two or three weeks I have twice observed one thread of the proxy spinning in some functionality at 100% CPU, and when I forced the process to print a stack trace it was inside the `sparse_map::find` function both times.
I'm still investigating whether the bug is related to `tsl::sparse_map` or to some other functionality at the upper levels. However, looking at `sparse_hash::find_impl`, I was wondering whether it's possible for this function to loop forever in some edge case, for example if every checked `sparse_ibucket` has some value but none of the keys equals the searched key.
As I said, we use the `sparse_map` with the default growth policy, which I think is power-of-two, and with the default probing policy, which is linear, as far as I checked.
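For context, the point of a power-of-two growth policy is that the bucket count is always 2^n, so the bucket for a hash can be selected with a bitmask instead of an integer modulo. A sketch of the idea (an assumption about the general technique, not the `tsl::sparse_map` implementation):

```cpp
#include <cassert>
#include <cstddef>

// Maps a hash to a bucket index; valid only when bucket_count is a power of two.
std::size_t bucket_for_hash(std::size_t hash, std::size_t bucket_count) {
    return hash & (bucket_count - 1);  // equivalent to hash % bucket_count
}
```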
The number of entries in the map reaches 262144 (the maximum we allow) and then usually stays there, give or take 10–50 entries, but it never goes beyond the set limit.
Regards,
Pavel.