Resuming in a later session results in corrupted uploads #56
This is a duplicate of issue <#45>, but that one was never answered. I am facing exactly the same problem. Can you help me, please?

How is one supposed to track parts as they are uploaded? Our app needs to upload giant files, and we need to handle poor internet connections, so there's no guarantee the upload will ever emit a pause event.

Currently we are tracking parts from the part event and adding them to an array. When I use the UploadId and Parts from a previous session, s3-upload-stream always starts from the beginning and ends up uploading too much, corrupting the file.

Here's how we're calling s3-upload-stream:

const destinationDetails = {
  // ...
}
const sessionDetails = {
  "UploadId": "SECRET_UPLOAD_ID",
  "Parts": [
    {
      "ETag": "\"ee72b288175a979f6475b7241dcb9913\"",
      "PartNumber": 1
    },
    {
      "ETag": "\"e33260ba6180f1294592f25985c0109e\"",
      "PartNumber": 2
    }
  ]
}
const uploader = s3Stream.upload(destinationDetails, sessionDetails)
uploader.on('part', details => {
  console.log(details)
})

And here is what the part event logs:

{
  "ETag": "\"ee72b288175a979f6475b7241dcb9913\"",
  "PartNumber": 3
}

It's the exact same ETag we passed in via sessionDetails, and it gets uploaded as another part, which corrupts the file.
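For context, the snippet above assumes s3Stream was already initialized with an S3 client, as in the library's README; a minimal setup sketch (credentials and region configuration omitted):

// Initialize s3-upload-stream with an AWS SDK v2 S3 client,
// matching the usage shown in the library's README.
const AWS = require('aws-sdk')
const s3Stream = require('s3-upload-stream')(new AWS.S3())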
Comments

Pause happens locally in the browser, so you don't have to worry about the internet connection preventing the pause. But you do have to keep track of the uploaded parts yourself, perhaps in local storage.
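A minimal sketch of that idea, hedged: it assumes a browser-like context where localStorage exists (an Electron renderer qualifies) and that the 'part' event fires once per completed part, as the snippet in the issue shows. The key name 'upload-session' is made up for illustration, and SECRET_UPLOAD_ID stands in for the real multipart upload id, which your app must also record.

const STORAGE_KEY = 'upload-session'

const uploader = s3Stream.upload(destinationDetails)

uploader.on('part', details => {
  // details is { ETag, PartNumber } for the part that just completed
  const saved = JSON.parse(
    localStorage.getItem(STORAGE_KEY) || '{"UploadId":"SECRET_UPLOAD_ID","Parts":[]}'
  )
  saved.Parts.push(details)
  localStorage.setItem(STORAGE_KEY, JSON.stringify(saved))
})

// On the next launch, read it back and resume:
// const sessionDetails = JSON.parse(localStorage.getItem(STORAGE_KEY))
// const uploader = s3Stream.upload(destinationDetails, sessionDetails)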
Yes, I keep tracking the uploaded parts in a DB and send them back as sessionDetails, but the final uploaded file is larger than the original, and it is corrupted and cannot be opened. I think the part data we are sending is wrong. My case is: the application is closed and reopened, and the upload should resume where it stopped; the issue only happens then. We have to send the part data from the point where the upload paused, but I think it starts sending from the beginning again. Note: pause and resume work well under normal conditions (when the application is not closed); the issue happens only after closing and reopening the application. I am using this package in Electron with a React app.
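A hedged sketch of what that could look like after a restart: since sessionDetails only restores the part numbering, the source stream itself has to skip the bytes that were already uploaded, instead of piping the whole file again. Here filePath is a stand-in, and PART_SIZE assumes every part of the first session had the same fixed size; both are assumptions you would need to verify for your configuration, not something the library documents.

const fs = require('fs')

// Assumed fixed part size of the original session; it must match what was
// actually used, or the offset will be wrong and the file still corrupts.
const PART_SIZE = 5 * 1024 * 1024
const alreadyUploaded = sessionDetails.Parts.length * PART_SIZE

const uploader = s3Stream.upload(destinationDetails, sessionDetails)

// Start reading at the first byte that was never sent, instead of
// re-sending data that S3 already has as completed parts.
fs.createReadStream(filePath, { start: alreadyUploaded })
  .pipe(uploader)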
Finally, I got it working.