Feature: async #446
Comments
Yep, that's definitely possible! I'm not sure if I'd be happy adding an async method for every sync method though, because the API would look a bit messy then. Right now it's very simple and minimal. 😅 Maybe we can find a way around that. Either way, it's a bit of work; not sure how much exactly, but we need to set up a Promise resolver that runs on a separate thread, custom batched methods, etc.
API-wise, maybe an async counterpart to all the standard methods, eg. something like `storage.async.set()` next to `storage.set()`. Do you like that? @mrousavy

Another hacky thing you could do is have […]. Personally I think the API […].
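The per-method async counterpart idea could be sketched as a thin wrapper. Everything below (the `makeAsync` helper, the `SyncStore` interface, the in-memory map) is hypothetical and not part of the real react-native-mmkv API; it only illustrates the API shape being discussed:

```typescript
// Hypothetical sketch of an async counterpart namespace.
interface SyncStore {
  set(key: string, value: string): void
  getString(key: string): string | undefined
}

// Wraps each sync method so the call resolves on a later microtask.
// Note: this changes only the API shape; the work still runs on the
// JS thread unless the native side actually uses a background thread.
function makeAsync(store: SyncStore) {
  return {
    set: (k: string, v: string): Promise<void> =>
      Promise.resolve().then(() => store.set(k, v)),
    getString: (k: string): Promise<string | undefined> =>
      Promise.resolve().then(() => store.getString(k)),
  }
}

// In-memory stand-in for an MMKV instance
const map = new Map<string, string>()
const storage: SyncStore = {
  set: (k, v) => { map.set(k, v) },
  getString: (k) => map.get(k),
}
const asyncStorage = makeAsync(storage)
```

With this shape, `await asyncStorage.set('foo', 'bar')` mirrors the sync `storage.set('foo', 'bar')` without a second hand-written method per operation.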
Just one question around thread safety: say, for example, we implement the method […].
Arg parsing will need to happen outside, but that's fast anyways.
I'm thinking of a potential solution for setting. I can implement one additional method on the MMKV instance called `batch`:

```ts
const storage = new MMKV()

await storage.batch((s) => {
  s.set('hello', 'world')
  s.set('test', 'another')
  // ...
})
```

Inside, you will use the […]. The only problem I see with this now is that you cannot use getters ([…]). A "solution" for that would be to make those funcs async, but that's a bit complicated again.
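One way to realize such a `batch()` callback is the command-recording pattern: the `s` handed to the callback only queues operations, and the queue is applied in one pass afterwards. The sketch below is hypothetical (the `BatchRecorder` and `Op` names are made up, and a plain `Map` stands in for the native store), and it also shows concretely why getters are a problem: a recorder has no value to hand back synchronously.

```typescript
// Hypothetical recorder behind a batch() API: set() only queues an
// operation; nothing touches storage until the batch executes.
type Op = { key: string; value: string }

class BatchRecorder {
  readonly ops: Op[] = []
  set(key: string, value: string): void {
    this.ops.push({ key, value })
  }
  // No getString() here: the batch has not run yet, so there is no
  // value to return synchronously -- exactly the getter problem
  // discussed in this thread.
}

// Applies all recorded ops in one pass. A real implementation would
// run this pass on a background thread and resolve a Promise when done.
function applyBatch(store: Map<string, string>, recorder: BatchRecorder): void {
  for (const { key, value } of recorder.ops) {
    store.set(key, value)
  }
}
```

The design choice here is that the callback itself does no I/O at all, so the expensive part can later be moved wholesale to another thread.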
🤔 Perhaps batch would have to return an array of items; does that solve it?

```ts
const storage = new MMKV()

const [foo] = await storage.batch((s) => [
  s.getString('foo'),
  s.set('hello', 'world'),
  s.set('test', 'another')
])
```
Also, could you not make storage context-aware? Then you could have this API:

```ts
const storage = new MMKV()

const [foo] = await storage.batch(() => [
  storage.getString('foo'),
  storage.set('hello', 'world'),
  storage.set('test', 'another')
])
```

Provided you enforce that batch is always a synchronous function. BTW, I love the batch() idea!
That's a good idea, but then you'd be very limited in terms of doing stuff contextually (e.g. only set if the value in storage is …), but yeah, that would probably work.
Wdym? The whole idea is to have batch asynchronous and awaitable.
Sorry, I mean the batch implementation would enforce that the batch argument callback is synchronous; this is so that the batchFn doesn't need to be injected with a proxy.

```ts
class Storage {
  // ...
  async batch(fn) {
    ensureSynchronous(fn)
    this._batchingEnabled = true
    try {
      const items = fn().filter(x => x)
      items.forEach(x => ensureIsBatchItem(x))
      return await this._executeBatch(items)
    } finally {
      this._batchingEnabled = false
    }
  }
}
```
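The `ensureSynchronous` and `ensureIsBatchItem` helpers in the snippet above are left undefined. A sketch of what they might check follows; note that JavaScript cannot verify up front that a callback is synchronous, so the closest practical check, assumed here, is rejecting a return value that is a Promise (which catches accidentally-`async` callbacks). The `MMKVBatchItem` shape and both helper names below are made up for illustration.

```typescript
// Hypothetical item shape produced by recorded batch calls.
type MMKVBatchItem = { kind: 'set'; key: string; value: string }

// Rejects a callback result that is a Promise, i.e. the callback was
// declared `async`. (True synchronicity cannot be checked before the
// function is actually called.)
function ensureNotAsync<T>(result: T): T {
  if (result instanceof Promise) {
    throw new Error('batch() callback must be synchronous')
  }
  return result
}

// Validates one entry of the array returned by the batch callback.
function ensureIsBatchItem(x: unknown): boolean {
  const item = x as MMKVBatchItem
  if (!item || item.kind !== 'set' || typeof item.key !== 'string') {
    throw new Error('batch() callback must return only batch items')
  }
  return true
}
```

Because the helpers throw, a plain `items.forEach(x => ensureIsBatchItem(x))` is enough to fail fast on a bad batch.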
What use case might be limited?
eg. this (but on a larger scale, with tons of items like arrays):

```ts
const storage = new MMKV()

await storage.batch((s) => {
  const x = s.getBoolean('bla')
  if (x === true) {
    s.set('hello', 'world')
  } else {
    s.set('test', 'another')
  }
  // ...
})
```
Hmm, yeah, it's true that wouldn't work, and having a callback does give the impression that it should be possible. One saving grace is that TypeScript users will get the following error:

```ts
const storage = new MMKV()

await storage.batch((s) => {
  const x = s.getBoolean('bla')
  // ❌ This condition will always return 'false' since the types 'MMKVBatchItem' and 'boolean' have no overlap
  if (x === true) {
    s.set('hello', 'world')
  } else {
    s.set('test', 'another')
  }
  // ...
})
```

I don't see how you could support the above API, unless... you implemented a similar strategy to react-native-multithreading and all […]
BTW, one thing which is way better about the `storage.async` approach:

```ts
await Promise.all([
  storage.async.getString('foo'),
  storage.async.getString('bar'),
  storage.async.getString('baz')
])
```

Although, thinking about it more, even with the […]
Also, maybe just Twitter-poll the options?
+1 for async, as it will make larger buffer storage play nicely :) In the meantime, is there any way to wrap the existing methods so they don't block the main thread?

We have a project where our MMKV usage has grown to the point that it now often blocks the main thread.
Yea, I think the async batch methods are cool; there are some things we need to think about though. I also need a lot of free time to work on this, so if you want to see this happening, consider sponsoring me / emailing me about those efforts.
Not really, if the underlying implementation is synchronous. The best you could do would be to defer execution to the next tick as a microtask, using promises, requestAnimationFrame, a timeout, etc. E.g.:

```ts
const resolvedPromise = Promise.resolve()
const setAsync = (k: string, v: string) => resolvedPromise.then(() => storage.set(k, v))

setAsync('foo', 'bar')
```

This would defer to the next tick, running before any timeouts, but it would still block. Async in this way is only useful if your underlying implementation eventually uses threads and doesn't block the main thread. Otherwise, you're just blocking later instead of now.
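The "runs before any timeouts" claim can be seen with a small ordering probe; this is plain JavaScript event-loop behavior, nothing MMKV-specific:

```typescript
// Shows that a Promise-based deferral is a microtask: it runs after the
// current synchronous code finishes, but before any setTimeout callback
// queued at the same time.
const order: string[] = []

setTimeout(() => order.push('timeout'), 0)            // macrotask
Promise.resolve().then(() => order.push('microtask')) // microtask
order.push('sync')                                    // runs immediately

// Synchronously, only 'sync' has been pushed; once the event loop
// turns, the final order is ['sync', 'microtask', 'timeout'].
```

Either way, the deferred storage write itself still executes on the JS thread, which is the point made above.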
I was wondering if it would be possible to delegate saving to MMKV to a worklet, so it doesn't block the main thread? I was trying to implement that, but I'm facing issues with executing MMKV instance methods within the worklet. And I'm actually worried that the reason is that I don't really understand the concept of worklets properly 😅

This shouldn't be implemented with worklets, because sharing data with another thread is a big overhead.

That makes sense, thanks for the explanation @XantreGodlike!
Is this feature planned for the future?
Maybe when I rewrite the library to Nitro. But please stop spamming.

Thank you very much, we are looking forward to it!
While MMKV is super fast, and that's awesome 😎, it still blocks the main JS thread, and in situations where you have to get a bunch of keys in some kind of loop, that can add up. It would be great if MMKV could provide a threaded async version where `jsCallInvoker` […]. Do you think that would be possible?