LRU remove by key #143

Open
yosiat opened this issue Oct 17, 2020 · 6 comments

Comments


yosiat commented Oct 17, 2020

Hi,

I am trying to use the LRUCache class with DataLoader (https://github.com/graphql/dataloader), and I need to implement removal by key.

The only option I have is to call set(key, undefined), which is an ugly hack and carries the risk that the call can evict something else from the cache.
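To illustrate that risk, here is a toy Map-based LRU (hypothetical, not mnemonist's actual LRUCache): "removing" a key by writing undefined still occupies a slot, and if the key is not currently cached, the write evicts an unrelated entry.

```javascript
// Toy LRU built on a Map, which preserves insertion order
// (first key = least recently used). Hypothetical sketch only.
function makeToyLRU(capacity) {
  const map = new Map();
  return {
    set(key, value) {
      if (map.has(key)) map.delete(key);            // refresh recency
      else if (map.size >= capacity)                // full: evict LRU entry
        map.delete(map.keys().next().value);
      map.set(key, value);
    },
    has: (key) => map.has(key),
    get: (key) => map.get(key),
  };
}

const cache = makeToyLRU(2);
cache.set('a', 1);
cache.set('b', 2);
// Trying to "delete" a key that isn't cached: this inserts an
// undefined entry AND evicts 'a' as collateral damage.
cache.set('x', undefined);
```

After this, the cache holds a useless `x -> undefined` entry and has lost `a`, even though the caller only intended a removal.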

Would it be possible to add support for this (I can submit a PR)?

@Yomguithereal
Owner

Hello @yosiat. The issue with key deletion in this particular, very efficient JavaScript LRU cache implementation is that supporting it would incur a performance and memory cost. The reason is that, to support arbitrary deletions, the cache needs to maintain a stack of the pointers freed when unsetting keys, so they can be reused later when setting new keys. That said, I am not opposed to adding a variant of the structure to the library that lets you delete keys if you know you want to pay this cost (which you will pay one way or another in any LRU cache that supports deletion).
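The pointer-stack idea can be sketched as follows. This is a minimal illustration, not mnemonist's actual code: the class name TinyLRU and its fields are hypothetical, and eviction plus the recency-order bookkeeping a real LRU needs are elided to keep the free-list mechanics in focus.

```javascript
// Array-backed cache where delete() pushes the freed slot index onto a
// `removed` stack, and set() reuses those slots before growing.
class TinyLRU {
  constructor(capacity) {
    this.capacity = capacity;
    this.items = new Map();           // key -> slot index
    this.keys = new Array(capacity);
    this.values = new Array(capacity);
    this.size = 0;
    this.removed = [];                // free-list: the O(n) memory cost
  }

  set(key, value) {
    let slot = this.items.get(key);
    if (slot !== undefined) {         // key exists: overwrite in place
      this.values[slot] = value;
      return;
    }
    if (this.removed.length > 0) {    // reuse a freed slot first
      slot = this.removed.pop();
    } else if (this.size < this.capacity) {
      slot = this.size;               // grow into the next fresh slot
    } else {
      // A real LRU would evict the least recently used entry here.
      throw new Error('cache full (eviction elided in this sketch)');
    }
    this.keys[slot] = key;
    this.values[slot] = value;
    this.items.set(key, slot);
    this.size++;
  }

  get(key) {
    const slot = this.items.get(key);
    return slot === undefined ? undefined : this.values[slot];
  }

  delete(key) {
    const slot = this.items.get(key);
    if (slot === undefined) return false;
    this.items.delete(key);
    this.removed.push(slot);          // slot becomes available for reuse
    this.size--;
    return true;
  }
}

const cache = new TinyLRU(3);
cache.set('a', 1);
cache.set('b', 2);
cache.delete('a');                    // slot 0 goes onto the free-list
cache.set('c', 3);                    // reuses slot 0 instead of growing
```

Both set() and delete() stay O(1); the only extra cost is the `removed` array, which can hold at most `capacity` indices.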

What do you think? Do you want to check the implementation's code and tell me if you see another way to implement deletions (keeping them constant-time, while adding only a very marginal O(n) memory cost)? Or would it be cool to have a version of the structure that allows deletions?


yosiat commented Nov 20, 2020

Hi @Yomguithereal !

Sorry for the long delay in replying :)

I tried implementing deletes and it was harder than I thought. But then I looked at how many deletes happen in my apps and saw that they were making my existing lru-cache less effective than I had hoped.

So we removed that lru-cache, and I am moving our other caches to mnemonist because they don't need deletes.

TL;DR - I don't delete now :)

@Yomguithereal
Owner

In any case, yes: the upper performance bound for deletion in the cache is indeed the fairly abysmal performance of the delete keyword and of the ES6 Map #.delete method :). Anyway, if you find you still need deletes in the future, note that we can still implement them in mnemonist if required. I would just create a separate structure for that, so as not to hamper the default LRU cache exposed by the lib.


trivikr commented May 5, 2021

> I want to be able to support arbitrary deletions in the cache, I need to maintain a stack of pointers made available when unsetting keys, so I can use them subsequently when setting new keys.

I've posted a PR to introduce remove functionality in LRUCache and LRUMap in #154

@Yomguithereal I would like to get your feedback on the implementation.


trivikr commented May 5, 2021

> I will just create a different structure for that not to hamper the default lru cache exposed by the lib.

The time complexity of the default lru cache implementation is not affected, as set and setpop simply pop a pointer from the this.removed array. The additional memory cost is O(n), to store the this.removed array.

@Yomguithereal
Owner

Hello @trivikr. I will answer in the related PR then :)
