
Problem with Data Integrity Protections for Amazon S3 #160

Open
ErwanLeroux opened this issue Feb 2, 2025 · 4 comments

Comments

@ErwanLeroux

With version 2.30.0 of the AWS SDK for Java BOM (https://mvnrepository.com/artifact/software.amazon.awssdk/bom/), byte[] tarContent = s3Resource.getContentAsByteArray(); returns more than the previously uploaded file, resulting in a corrupted file.

I found this issue:

And configured my S3Client with

S3Client.builder()
        .responseChecksumValidation(ResponseChecksumValidation.WHEN_REQUIRED)
        .requestChecksumCalculation(RequestChecksumCalculation.WHEN_REQUIRED)
        .build();

That made byte[] tarContent = s3Resource.getContentAsByteArray(); return the exact file.
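
For completeness, a self-contained sketch of that workaround might look like the following (the region is a placeholder, not a value from my setup):

import software.amazon.awssdk.core.checksums.RequestChecksumCalculation;
import software.amazon.awssdk.core.checksums.ResponseChecksumValidation;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;

// Opt out of the default checksum behaviour introduced in SDK 2.30.x:
// checksums are then only calculated/validated when the operation requires them.
S3Client s3 = S3Client.builder()
        .region(Region.EU_WEST_1) // placeholder region
        .responseChecksumValidation(ResponseChecksumValidation.WHEN_REQUIRED)
        .requestChecksumCalculation(RequestChecksumCalculation.WHEN_REQUIRED)
        .build();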

It appears to be linked to aws/aws-sdk-java-v2#5802

Could you look into it, to see whether there is anything to do besides configuration?

@Robothy (Owner) commented Feb 6, 2025

@ErwanLeroux Thanks for reporting this issue. Please try to resolve it by upgrading the LocalS3 version to 1.20.

@ErwanLeroux (Author)

Sadly, it doesn't work without setting

                .responseChecksumValidation(ResponseChecksumValidation.WHEN_REQUIRED)
                .requestChecksumCalculation(RequestChecksumCalculation.WHEN_REQUIRED)

even with version 1.20.

@ErwanLeroux (Author)

For security reasons, I can't provide the full file that causes the problem, but I can show you the extraneous bytes at the start of the file:
"20000;chunk-signature=c9e25a83116f8f2210860984da2119e9cb0e0e6061898a6b125aa817bd9155ca\nEPE"
I'm not sure if there really is a return char between 'ca' and 'EPE'; it could be a misinterpretation from my binary visualiser.
In bytes it's '0D 0A' (CRLF).
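
For context: that prefix matches the aws-chunked content encoding the SDK uses for streaming SigV4 uploads, where each chunk is framed as '<hex-size>;chunk-signature=<sig>' + CRLF, then the chunk data, then another CRLF (0x20000 = 131072 bytes). A rough diagnostic sketch for stripping one such header, assuming the corrupted object starts with exactly one chunk frame:

import java.nio.charset.StandardCharsets;
import java.util.Arrays;

// Assumption: 'raw' begins with one "<hex-size>;chunk-signature=<sig>\r\n" header,
// immediately followed by the chunk payload.
static byte[] stripChunkHeader(byte[] raw) {
    for (int i = 0; i + 1 < raw.length; i++) {
        if (raw[i] == '\r' && raw[i + 1] == '\n') {
            String header = new String(raw, 0, i, StandardCharsets.US_ASCII);
            int chunkSize = Integer.parseInt(header.substring(0, header.indexOf(';')), 16);
            return Arrays.copyOfRange(raw, i + 2, Math.min(raw.length, i + 2 + chunkSize));
        }
    }
    return raw; // no chunk header found, return unchanged
}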

@Robothy (Owner) commented Feb 7, 2025

@ErwanLeroux It appears that LocalS3 is not decoding the stream correctly when uploading objects. Could you please provide a test code snippet to reproduce this issue?
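
A minimal reproduction along these lines might do it (the endpoint, bucket, key, and credentials below are placeholders for a local LocalS3 instance, not values taken from this thread):

import java.net.URI;
import java.util.Random;
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

public class ChecksumRepro {
    public static void main(String[] args) {
        // Payload large enough to span more than one 128 KiB aws-chunked chunk.
        byte[] original = new byte[200_000];
        new Random(42).nextBytes(original);

        // Default builder: SDK 2.30.x enables flexible checksums (WHEN_SUPPORTED) out of the box.
        S3Client s3 = S3Client.builder()
                .endpointOverride(URI.create("http://localhost:19090")) // placeholder LocalS3 endpoint
                .region(Region.US_EAST_1)
                .credentialsProvider(StaticCredentialsProvider.create(
                        AwsBasicCredentials.create("local", "local")))
                .forcePathStyle(true)
                .build();

        s3.createBucket(b -> b.bucket("test-bucket"));
        s3.putObject(PutObjectRequest.builder().bucket("test-bucket").key("test.tar").build(),
                RequestBody.fromBytes(original));

        byte[] downloaded = s3.getObjectAsBytes(
                GetObjectRequest.builder().bucket("test-bucket").key("test.tar").build()).asByteArray();

        // With the bug, 'downloaded' is longer than 'original' and starts with
        // a "<hex-size>;chunk-signature=..." header.
        System.out.println("original=" + original.length + " downloaded=" + downloaded.length);
    }
}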
