Discussion: Improve consistency of stream interface #32
Would decorating a promise object with a stream under an added property be an option? I do something like that for git-client. So you'd do something like:

```js
const importPromise = await gitSheets.import(opts)
req.body.pipe(csvParser()).pipe(importPromise.stream)
const commitHash = await importPromise
```

Or instead:

```js
const { promise, stream } = await gitSheets.import(opts)
req.body.pipe(csvParser()).pipe(stream)
const commitHash = await promise
```
Hm, I'm not sure that would work the way you're describing. An …
I believe either would work fine. In the former case you'd just instantiate the promise object yourself:

```js
// (illustrative: "import" is a reserved word in JavaScript, so in real code
// this would be a method like gitSheets.import)
async function import() {
  const stream = await prepareStream()
  const commitPromise = new Promise((resolve, reject) => {
    stream.on('close', async () => {
      await stuff()
      buildCommit().then(resolve).catch(reject)
    })
  })
  commitPromise.stream = stream
  return commitPromise
}

const commitPromise = import() // stream is ready synchronously
commitPromise.stream.pipe(stuff)
const commitHash = await commitPromise
```

In the latter case you'd double-wrap the promise (the first one, returned directly, would be a promise for the import stream being ready, and the one within the compound return value would be a promise for when the import was done and a commit was ready):

```js
async function import() {
  const stream = await prepareStream()
  const commitPromise = new Promise((resolve, reject) => {
    stream.on('close', async () => {
      await stuff()
      buildCommit().then(resolve).catch(reject)
    })
  })
  return { promise: commitPromise, stream }
}

const { promise: commitPromise, stream } = await import() // waits for stream to be ready
stream.pipe(stuff)
const commitHash = await commitPromise
```

I think it's important that the … The latter is probably a better way to do it: it allows the "opening" of the import to be async, avoids decorating unexpected properties onto a promise object, lets you await the final result directly, and gives direct access to the stream. Another approach might be to wrap a subset of the stream API so you can't possibly do things out of order (like awaiting the result before piping a stream in), but you lose the flexibility of having direct access to the stream API to, for example, write one chunk at a time:

```js
async function import() {
  const stream = await prepareStream()
  const commitPromise = new Promise((resolve, reject) => {
    stream.on('close', async () => {
      await stuff()
      buildCommit().then(resolve).catch(reject)
    })
  })
  return {
    fromStream: async (inputStream) => {
      stream.pipe(inputStream)
      return commitPromise
    },
    withStream: async (fn) => {
      fn(stream)
      return commitPromise
    }
  }
}

const myImport = await import() // waits for stream to be ready
const commitHash = await myImport.fromStream(stuff)
// or
const commitHash = await myImport.withStream(stream => {
  stream.pipe(stuff)
})
```

This offers a lot of flexibility but leaves you with a stateful object with ambiguous state, whereas in the middle example you're just getting a compound value containing a stream and a promise, which both have well-defined states.
I've been knee-deep in stream land for the past few days and finally remember enough to be able to contribute to this conversation again. What you've proposed may work, but it feels rather unconventional: I'm trying to think of another example of a promise that resolves to an object containing another promise that resolves later, let alone splitting the asynchrony into a stream and a promise that relates to that stream. It just feels like we shouldn't need to be pioneers here... surely someone's encountered and solved this? 🤔 Unless you've seen this pattern elsewhere and I've just misunderstood? I've been considering a few alternatives. A. Use Node's `stream.pipeline` …
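A rough sketch of what alternative A could look like here (hedged: `pipeline` from `stream/promises` is a real Node ≥15 API, but `gitSheets.import` resolving to a writable stream is an assumption for illustration, not the current interface):

```js
const { pipeline } = require('stream/promises')

async function handleImport(req, opts) {
  // assumption: import resolves to a writable destination stream
  const destination = await gitSheets.import(opts)
  // pipeline wires source -> transform -> destination and rejects on the
  // first error from any stage, so early parser errors aren't silently missed
  await pipeline(req.body, csvParser(), destination)
}
```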
Re: …

Why would it be any of …

```js
const treeHash = await pipeline(
  payload,
  csvParser({ strict: true }),
  await gitSheets.import({
    parentRef: ref,
    saveToBranch: branch
  })
)
```

Then …
Hm, I forget... will JavaScript await the argument before it runs the surrounding `pipeline()` call? What would be even better is if …
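Re that first question, a quick generic sketch (placeholder names, nothing to do with the real API) suggests the argument is indeed awaited before the surrounding call runs:

```js
// generic names; just illustrating argument evaluation order
async function makeStream() {
  console.log('1: argument awaited')
  return 'a-stream'
}

function pipelineLike(value) {
  console.log('2: call runs with', value)
}

async function main() {
  // the argument expression (including its await) settles before
  // pipelineLike itself is invoked
  pipelineLike(await makeStream())
}
main() // logs "1: argument awaited", then "2: call runs with a-stream"
```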
At the moment:

- `export` is an async method that takes one string param and returns a promise that resolves to a readable stream. Callers can then encode it to CSV by piping the returned stream to a writable CSV encoder stream.
- `import` is an async method that takes several params, including a readable stream, and returns a promise (no resolved value applicable). CSV decoding happens within the method.
- `compare` is an async method that takes two string params and returns a promise that resolves to an array of objects.
- `merge` is an async method that takes a few string params and returns a promise (no resolved value applicable).

The main issue I have is with where CSV encoding/decoding happens (it's different in `import` vs `export`). I'm thinking it should always happen outside (at the server/CLI level) and the methods should just deal with object streams.
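For what it's worth, a rough sketch of "CSV at the edge" (hedged: `csvParser` is the decoder already used in this thread, while `csvStringify` and the exact method signatures are assumptions for illustration):

```js
// server-side sketch: CSV encoding/decoding stays at the server layer,
// gitSheets methods deal only in object streams
const records = req.body.pipe(csvParser())   // decode CSV -> objects at the edge
await gitSheets.import({ data: records, saveToBranch: branch })

const rows = await gitSheets.export(ref)     // readable object stream
rows.pipe(csvStringify()).pipe(res)          // objects -> CSV at the edge
```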
But moving CSV decoding to the server is a bit tricky because of the way we pass the readable stream to the `import` method, as an argument. The method then calls a couple of async functions, and then attaches the `.on('data')` and `.on('error')` handlers. The trouble is that the flow is activated back in the server when we pass `{ data: req.body.pipe(csvParser()) }`, and the error handler isn't attached until after some async functions, so early errors are missed. Normally a stream interface would look something like …
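Presumably something like this (a hedged sketch; the synchronous `import` shape is hypothetical, but it shows why attaching handlers before data flows matters):

```js
// conventional shape: the stream comes back synchronously, so handlers
// are attached before any data starts flowing
const importStream = gitSheets.import(opts)   // hypothetical synchronous variant
importStream.on('error', handleError)         // attached immediately; early errors caught
req.body.pipe(csvParser()).pipe(importStream)
```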
So that means the `import` method should probably take the opts and return a writable stream. And while we're at it, `compare` may as well return a readable stream of the diffs since that's what `export` does. The combination of promises and streams can be a bit confusing, but the alternative of just dealing with promises and not streams makes us less equipped to deal with large CSV files (though maybe it's premature optimisation).
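Pulling that together, a sketch of the proposed shape (hedged: `PassThrough` stands in for the real implementations, and the signatures reflect this thread's proposal, not the current API):

```js
const { PassThrough } = require('stream')

// proposed consistent shape: object streams in/out, CSV handled by callers
const gitSheets = {
  import(opts) { return new PassThrough({ objectMode: true }) },        // writable object stream
  export(ref) { return new PassThrough({ objectMode: true }) },         // readable object stream of records
  compare(refA, refB) { return new PassThrough({ objectMode: true }) }, // readable stream of diffs
  async merge(refA, refB) { /* promise, no resolved value */ }
}

// server usage: decode CSV at the edge, pipe objects into the import stream
req.body.pipe(csvParser()).pipe(gitSheets.import({ parentRef: ref, saveToBranch: branch }))
```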