
Resuming in a later session results in corrupted uploads #56

Closed
Rambarani opened this issue Jun 1, 2019 · 3 comments

@Rambarani

This is a duplicate of this issue, but that one was never answered and I am facing exactly the same problem. Can you help me, please?

How is one supposed to track parts as they are uploaded? Our app needs to upload huge files, and we need to handle poor internet connections, so the upload may never get the chance to emit a pause event.

Currently we are tracking parts from the part event and adding them to an array. When I use the UploadId and Parts from a previous session, s3-upload-stream always starts from the beginning and ends up uploading too much, corrupting the file.

Here's how we're calling s3-upload-stream:

const destinationDetails = {
  // ...
}

const sessionDetails = {
  "UploadId": "SECRET_UPLOAD_ID",
  "Parts": [
    {
      "ETag": "\"ee72b288175a979f6475b7241dcb9913\"",
      "PartNumber": 1
    },
    {
      "ETag": "\"e33260ba6180f1294592f25985c0109e\"",
      "PartNumber": 2
    }
  ]
}

const uploader = s3Stream.upload(destinationDetails, sessionDetails)

uploader.on('part', details => {
  console.log(details)
})
And here is what the part event logs:

{
  "ETag": "\"ee72b288175a979f6475b7241dcb9913\"",
  "PartNumber": 3
}
It's the exact same ETag we passed in via sessionDetails, and it gets uploaded as another part, which corrupts the file.
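For reference, this is roughly how we track and persist the parts between sessions. A minimal sketch only; savePart is a hypothetical helper standing in for whatever storage is used (a local DB in our case):

const parts = []

uploader.on('part', details => {
  // details is { ETag, PartNumber }
  parts.push(details)
  savePart(details) // hypothetical helper: persist so a later session can rebuild sessionDetails.Parts
})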

@nathanpeck
Owner

nathanpeck commented Jun 1, 2019 via email

@Rambarani
Author

Rambarani commented Jun 2, 2019

Yes, I keep tracking the uploaded parts in a DB and send them as sessionDetails:
const uploader = s3Stream.upload(destinationDetails, sessionDetails)

But the final uploaded file is larger than the original file, and it is corrupted and cannot be opened.

I think the part buffer we are sending is wrong.

My case is: the application is closed and reopened, and the upload should resume from where it stopped. This issue only happens in that situation.

We have to send the part buffer starting from where it paused, but I think it starts sending from the beginning.

Note: pause and resume work well under normal conditions (when the application is not closed). The issue only happens when the application is closed and reopened. I am using this package in an Electron app with React.

@Rambarani
Author

Finally, I got it. I read the stream starting from where it stopped last time:

fs.createReadStream(path, { start: this.state.uploadedSize });
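Putting it together, a rough sketch of what the resume flow looks like now. loadPersistedState is a placeholder for however the UploadId, Parts, and uploaded byte count were saved in the previous session; the bucket, key, and file path are placeholders too:

const fs = require('fs')
const AWS = require('aws-sdk')
const s3Stream = require('s3-upload-stream')(new AWS.S3())

// Placeholder: restore whatever the previous session persisted
// (the UploadId, the Parts array collected from 'part' events, and
// the number of bytes those parts cover).
const { uploadedSize, sessionDetails } = loadPersistedState()

const destinationDetails = {
  Bucket: 'my-bucket',        // placeholder
  Key: 'path/to/destination'  // placeholder
}

// Resume the existing multipart upload instead of starting a new one.
const uploader = s3Stream.upload(destinationDetails, sessionDetails)

// Read only the bytes that were not uploaded yet. Without the start
// offset the read stream begins at byte 0, so the already-uploaded
// data gets appended again as extra parts and the object is corrupted.
fs.createReadStream('/path/to/source/file', { start: uploadedSize })
  .pipe(uploader)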
