
feat: Add LRU cache alongside redis #6

Merged
merged 4 commits into main on Jun 25, 2024

Conversation


@CCristi CCristi commented Jun 24, 2024

This PR adds an LRU cache alongside Redis for better performance.
Right now the LRU cache size is set to 100 items.
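
For reviewers skimming the diff, here's a minimal sketch of the read path this adds, assuming the `lru-cache` and `redis` npm packages; `getCachedIdl` and the env-based URL are illustrative, not the exact code in the diff:

```ts
import { LRUCache } from "lru-cache";
import { createClient } from "redis";

const LRU_CACHE_MAX_ITEMS_COUNT = 100;

// In-memory LRU sitting in front of redis.
const lruCache = new LRUCache<string, string>({ max: LRU_CACHE_MAX_ITEMS_COUNT });

// The service reads the URL from its config; process.env keeps the sketch self-contained.
const redisClient = createClient({ url: process.env.REDIS_URL });
await redisClient.connect();

// Check the LRU first, fall back to redis, and backfill the LRU on a redis
// hit so the next lookup for the same program stays in memory.
async function getCachedIdl(programId: string): Promise<string | null> {
  const inMemory = lruCache.get(programId);
  if (inMemory !== undefined) {
    return inMemory;
  }

  const fromRedis = await redisClient.get(programId);
  if (fromRedis !== null) {
    lruCache.set(programId, fromRedis);
  }
  return fromRedis;
}
```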

@CCristi CCristi requested a review from kimpers June 24, 2024 11:26
@CCristi CCristi self-assigned this Jun 24, 2024
Member

@kimpers kimpers left a comment


Noice, just a couple things!

Comment on lines 20 to 22
await Promise.allSettled(
  cachedIdls.map(async (res, i) => {
    const programId = programIds[i]!;
Member


Why do we run `allSettled` on an awaited `Promise<(string | null)[]>`?

Author


It's mostly because missing IDLs need an RPC call to solana-fm to retrieve them and store the result in the cache. See the line below:

const idl = await getProgramIdl(programId);
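
A rough sketch of that fallback, assuming `getProgramIdl` comes from `@solanafm/explorer-kit-idls`; `CACHE_TTL_SECONDS` and the exact write-back shape are illustrative rather than the literal diff:

```ts
// allSettled so a single failed solana-fm fetch doesn't reject the whole batch.
await Promise.allSettled(
  cachedIdls.map(async (cached, i) => {
    const programId = programIds[i]!;
    if (cached !== null) {
      return; // already in the cache, nothing to fetch
    }

    // Cache miss: fetch the IDL from solana-fm over RPC...
    const idl = await getProgramIdl(programId);
    if (idl === null) {
      return;
    }

    // ...and write it back to both the in-memory LRU and redis.
    const serialized = JSON.stringify(idl);
    lruCache.set(programId, serialized);
    await redisClient.set(programId, serialized, { EX: CACHE_TTL_SECONDS });
  }),
);
```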

await redisClient.connect();

const lruCache = new LRUCache<string, string>({
  max: LRU_CACHE_MAX_ITEMS_COUNT,
Member


We should use the same TTL for the in-memory cache as for Redis.
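
For example, one way to share a single TTL constant between the two (names and the value are illustrative; note lru-cache's `ttl` is in milliseconds while redis `EX` is in seconds):

```ts
const CACHE_TTL_SECONDS = 24 * 60 * 60; // hypothetical value, not taken from this PR

const lruCache = new LRUCache<string, string>({
  max: LRU_CACHE_MAX_ITEMS_COUNT,
  ttl: CACHE_TTL_SECONDS * 1000, // lru-cache expects milliseconds
});

// ...and reuse the same constant when writing to redis:
await redisClient.set(programId, serialized, { EX: CACHE_TTL_SECONDS });
```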

const client = createClient({
  url: config.REDIS_URL,
});
const LRU_CACHE_MAX_ITEMS_COUNT = 100;
Member


We could probably bump this to 1000 without any problems, right?

Author


The IDLs are quite big, but I think Node would be fine even with that.
Btw, I've tested that even an LRU cache of 100 items decreases the Redis call rate by >90%.
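
As an aside (not something this PR does), lru-cache can also bound the cache by approximate total size instead of item count, which is another way to handle the "IDLs are big" concern:

```ts
const lruCache = new LRUCache<string, string>({
  max: 1000,
  maxSize: 50 * 1024 * 1024, // hypothetical ~50 MB budget for cached IDL JSON
  sizeCalculation: (value) => Math.max(value.length, 1), // rough per-entry size
});
```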

Member

@kimpers kimpers Jun 25, 2024


> The IDLs are quite big, but I think Node would be fine even with that. Btw, I've tested that even an LRU cache of 100 items decreases the Redis call rate by >90%.

Ahh nice! 👍 I'm fine with either 100 or 1000 then

@CCristi CCristi requested a review from kimpers June 24, 2024 13:41
Member

@kimpers kimpers left a comment


LGTM 👍 👍 don't forget to update the ExplorerKit Grafana dashboard with these new metrics once you have deployed 🙏

@CCristi CCristi merged commit ae76e16 into main Jun 25, 2024
2 checks passed