maxSize is not correctly enforced · Issue #17 · sindresorhus/quick-lru · GitHub

maxSize is not correctly enforced #17


Open
BTOdell opened this issue Sep 11, 2019 · 14 comments

@BTOdell
BTOdell commented Sep 11, 2019

The implementation of this LRU cache does not enforce the documented behavior:

maxSize
Required
Type: number

The maximum number of items before evicting the least recently used items.

Running this code snippet produces true in the console:

const QuickLRU = require("quick-lru");

const m = new QuickLRU({
    maxSize: 2
});
m.set("key1", "value1");
m.set("key2", "value2");
m.set("key3", "value3");
console.log(m.has("key1")); // prints true

The cache should only be able to store 2 items, so when a total of 3 items are added, the first key-value pair (key1/value1) should be evicted. However, as the output shows, key1 is still in the cache.

@KaySackey
KaySackey commented Oct 25, 2019

It's confusing, but this is expected behavior. The trick described in the readme uses an old/new cache pair, moving items out of the old cache into the new one if they're accessed before the entire old cache is thrown away.

The has method checks whether the key is in either the old or the new cache. So maxSize will end up using up to 2x the memory: in this case, 2 values permissible in the new cache and 2 values permissible in the old.

Example code to illustrate what I mean.

var QuickLRU = require("quick-lru");
const m = new QuickLRU({
    maxSize: 2
});
// New cache has 0 values. Old has 0 values
m.set("key1", "value1");
// New cache has 1 value. Old has 0 values
m.set("key2", "value2");
// New cache has 2 values & becomes the old cache.
m.set("key3", "value3");
// New cache has 1 value. Old cache has 2 values.
console.log(`has value? ${m.has("key1")}`)  // <-- This will be true
m.set("key4", "value4");
// New cache now has 2 values; new cache becomes old cache
// and previous values thrown away
console.log(`has value? ${m.has("key1")}`); // <-- This will be false
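The old/new swap described above can be sketched roughly as follows. This is a simplified illustration of the two-cache idea, not quick-lru's actual source; the class and method names are made up for this example.

```javascript
// Simplified sketch of the old/new two-cache trick (illustrative only,
// not quick-lru's actual implementation).
class TwoCacheLRU {
  constructor(maxSize) {
    this.maxSize = maxSize;
    this.oldCache = new Map();
    this.newCache = new Map();
  }

  set(key, value) {
    this.newCache.set(key, value);
    if (this.newCache.size >= this.maxSize) {
      // Swap: the new cache becomes the old one;
      // the previous old cache is discarded wholesale.
      this.oldCache = this.newCache;
      this.newCache = new Map();
    }
  }

  get(key) {
    if (this.newCache.has(key)) {
      return this.newCache.get(key);
    }
    if (this.oldCache.has(key)) {
      // Promote a hit in the old cache into the new one.
      const value = this.oldCache.get(key);
      this.oldCache.delete(key);
      this.set(key, value);
      return value;
    }
    return undefined;
  }

  has(key) {
    // A key counts as cached if it is in either cache,
    // which is why up to 2 * maxSize entries can be alive at once.
    return this.newCache.has(key) || this.oldCache.has(key);
  }
}

const c = new TwoCacheLRU(2);
c.set("key1", "value1");
c.set("key2", "value2"); // new cache reaches maxSize and becomes the old cache
c.set("key3", "value3");
console.log(c.has("key1")); // true: key1 still lives in the old cache
c.set("key4", "value4");    // swap again; key1 is finally discarded
console.log(c.has("key1")); // false
```

Eviction here is amortized: individual items are never removed one by one, which is what makes the approach fast, at the cost of the loose maxSize bound discussed in this issue.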

@BTOdell
Author
BTOdell commented Oct 25, 2019

It might be "expected behavior" if you've read the code and know that it's going to behave that way, but general users of this NPM package should only have to read the public documentation to understand what it will do for them. If the implementation fails to produce the documented behavior, then the implementation is buggy and should be fixed.

@KaySackey

The interaction I'm describing is implied by the readme.
But another option is simply to make the documentation clearer, rather than changing the implementation and thereby affecting the performance characteristics that drew many of us to quick-lru in the first place.

@BTOdell
Author
BTOdell commented Oct 25, 2019

Are we looking at the same readme? https://github.com/sindresorhus/quick-lru/blob/master/readme.md
Nowhere in the quick-lru readme does it describe the behavior you're describing.

Anyway, I don't really care if this is fixed; I implemented my own fast LRU that only uses one Map and it runs in O(1) time. I left this issue open for the benefit of your 2.4 million users that you might either fix the implementation or improve the documentation. It's up to you whether you want to close it.

@KaySackey

Ah I misremembered. I had clicked through to the description of the algorithm, but that's on another page linked from the readme.

I'm not the developer of quick-lru. That would be @sindresorhus.

I'm just a normal user of the library, like you. I happened to be doing a review of our installed libraries when I noticed your issue and thought to give an answer based on what I had read months ago.

@adityapatadia
Contributor

@BTOdell it would be great if you could share your Map-based LRU cache implementation. Maybe we can take inspiration from it and fix this one?

@BTOdell
Author
BTOdell commented Jul 4, 2020

It looks like you already merged a fix to this issue. If my original code snippet runs and prints false, then feel free to close this issue.

@alexmuch
alexmuch commented Jul 7, 2020

As of today, on 5.1.1, the original code snippet still prints true.

@make-github-pseudonymous-again

@adityapatadia A Map can be iterated in insertion order according to the MDN docs. This can be exploited to evict only the least recently used item. I'm not sure how fragile relying on this enumeration-order guarantee is. Otherwise, just use a doubly linked list to keep track of recency.
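The single-Map technique described above can be sketched like this. It relies on Map's guaranteed insertion-order iteration; the class name and API here are hypothetical, chosen only for illustration.

```javascript
// Hypothetical single-Map LRU sketch, exploiting Map's insertion-order
// iteration: the first key in iteration order is the least recently used.
class SingleMapLRU {
  constructor(maxSize) {
    this.maxSize = maxSize;
    this.cache = new Map();
  }

  get(key) {
    if (!this.cache.has(key)) {
      return undefined;
    }
    // Delete and re-insert to move the key to the most-recently-used end.
    const value = this.cache.get(key);
    this.cache.delete(key);
    this.cache.set(key, value);
    return value;
  }

  set(key, value) {
    // Delete first so an existing key is refreshed to the MRU end.
    this.cache.delete(key);
    this.cache.set(key, value);
    if (this.cache.size > this.maxSize) {
      // Evict exactly one item: the oldest entry in iteration order.
      const oldest = this.cache.keys().next().value;
      this.cache.delete(oldest);
    }
  }

  has(key) {
    return this.cache.has(key);
  }
}

const lru = new SingleMapLRU(2);
lru.set("key1", "value1");
lru.set("key2", "value2");
lru.set("key3", "value3"); // evicts key1 immediately
console.log(lru.has("key1")); // false
```

Unlike the two-cache approach, this enforces maxSize strictly after every set, at the cost of a delete-and-reinsert on each access.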


@BTOdell
Author
BTOdell commented Sep 26, 2020

@aureooms I ended up using a single Map to implement my own LRU using the technique you described. It's fast and uses a minimal amount of memory. The iteration order is defined in the ES6 spec, so it's okay to rely on it.

@make-github-pseudonymous-again

@BTOdell I see three issues with the current implementation of quick-lru:

The current implementation has the advantage of being somewhat simple.

Do you see other issues or advantages?

@missing1984

Ran into the same issue with the latest 5.1.1. It took me hours to debug.
Arguably this is a blocker, especially when an LRU cache does not enforce its size.

@venil7
venil7 commented Sep 18, 2024

Version 7, four years after the last comment, and it's still doing it :)
