maxSize is not correctly enforced #17
It's confusing, but this is expected behavior. The trick described in the readme uses an old/new cache pair and swaps items out of the old cache if they're accessed before the entire old cache is thrown away. See the example code below for what I mean.
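A sketch of that two-generation scheme (hypothetical, simplified code to illustrate the idea, not quick-lru's actual source):

```js
class TwoGenerationCache {
  constructor(maxSize) {
    this.maxSize = maxSize;
    this.cache = new Map(); // "new" generation
    this.oldCache = new Map(); // "old" generation
  }

  set(key, value) {
    this.cache.set(key, value);
    if (this.cache.size >= this.maxSize) {
      // Swap generations: whatever is left in the old generation
      // is discarded wholesale.
      this.oldCache = this.cache;
      this.cache = new Map();
    }
  }

  get(key) {
    if (this.cache.has(key)) {
      return this.cache.get(key);
    }

    if (this.oldCache.has(key)) {
      // Promote a recently accessed item back into the new generation.
      const value = this.oldCache.get(key);
      this.oldCache.delete(key);
      this.set(key, value);
      return value;
    }
  }

  has(key) {
    return this.cache.has(key) || this.oldCache.has(key);
  }
}
```

The upshot: up to two times `maxSize` items can be alive at once, and an item in the old generation is only dropped at the next swap, which is why a third insert into a `maxSize: 2` cache doesn't immediately evict the first item.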
It might be "expected behavior" if you've read the code and know that it's going to behave that way. General users of this npm package should only have to read the public documentation to understand what the package will do for them. If the implementation fails to produce the documented behavior, then the implementation is buggy and should be fixed.
The interaction I am describing is implied by the readme.
Are we looking at the same readme? https://github.com/sindresorhus/quick-lru/blob/master/readme.md Anyway, I don't really care whether this is fixed; I implemented my own fast LRU that uses only one Map and runs in O(1) time. I left this issue open for the benefit of your 2.4 million users, so that you might either fix the implementation or improve the documentation. It's up to you whether you want to close it.
Ah, I misremembered. I had clicked through to the description of the algorithm, but that's on another page linked from the readme. I'm not the developer of quick-lru; that would be @sindresorhus. I'm just a normal user of the library, like you. I only happened to be doing a review of our installed libraries when I noticed your issue and thought to give an answer based on what I had read months ago.
@BTOdell it would be great if you could share your Map-based LRU cache implementation. Maybe we can take inspiration and fix this one?
It looks like you already merged a fix for this issue. If my original code snippet runs and prints `false`, then it's fixed.
As of today, on 5.1.1, the original code snippet still prints `true`.
@adityapatadia A Map can be iterated in insertion order according to the MDN docs. This can be exploited to evict only the least recently accessed item. I'm not sure how fragile relying on this enumeration-order promise is. Otherwise, just use a doubly linked list to keep track of age.
@aureooms I ended up using a single `Map`, roughly along the lines of the sketch below.
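A minimal sketch of that single-Map approach, relying on the insertion-order iteration mentioned above (illustrative code, not the exact implementation):

```js
class SingleMapLRU {
  constructor(maxSize) {
    this.maxSize = maxSize;
    this.map = new Map();
  }

  get(key) {
    if (!this.map.has(key)) {
      return undefined;
    }

    // Delete and re-insert to move the key to the
    // "most recently used" end of the iteration order.
    const value = this.map.get(key);
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  set(key, value) {
    this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxSize) {
      // The first key in iteration order is the least recently used.
      this.map.delete(this.map.keys().next().value);
    }
  }

  has(key) {
    return this.map.has(key);
  }
}
```

Note that Map's insertion-order iteration is guaranteed by the ECMAScript specification, not just by MDN, so relying on it is safe; each operation here is O(1) on average.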
@BTOdell I see three issues with the current implementation of quick-lru. It has the advantage of being somewhat simple, though. Do you see other issues or advantages?
Ran into the same issue with the latest 5.1.1. Took me hours to debug.
Version 7, four years after the latest comment, and it's still doing it :)
The implementation of this LRU cache does not enforce the documented behavior: running the code snippet below prints `true` to the console. The cache should only be able to store 2 items, and when adding a total of 3 items, the first key-value pair (`key1`-`value1`) should be evicted. However, from the result of executing the code, you can see that `key1` is still in the cache.