
oci os object get OOMs when the file it is getting is too big #883

Open
adamsb6 opened this issue Dec 27, 2024 · 0 comments
adamsb6 commented Dec 27, 2024

While testing our cloud-init scripts on smaller instances with 2GB of memory, I found that my call to oci os object get gets OOM-killed. Here's where this happens in the script output:

+ oci os object get -bn <redacted> --name releases/<redacted>.gz --file <redacted>.gz                                                                                   
Downloading object                                                                                                                                                                                                    
/var/lib/cloud/instance/scripts/part-001: line 111: 38203 Killed                  oci os object get -bn <redacted> --name releases/<redacted>.gz --file <redacted>.gz

And the corresponding line in dmesg:

[  354.613716] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=/,mems_allowed=0,global_oom,task_memcg=/system.slice/cloud-final.service,task=oci,pid=38203,uid=0                                           
[  354.617651] Out of memory: Killed process 38203 (oci) total-vm:6534412kB, anon-rss:1497088kB, file-rss:5336kB, shmem-rss:0kB, UID:0 pgtables:6484kB oom_score_adj:0                                                

The target file here is 2.9GB. You can see that the OOM happens when the oci process is consuming about 1.5GB of anonymous RSS (with total-vm around 6.5GB, per the dmesg line above).

Bumping RAM up to 8GB on these instances results in no OOM.

I see references to streaming interfaces in the source, but perhaps they aren't actually being used here, they're buggy, or there's a memory leak somewhere else.
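For context, a streaming download should only ever hold one chunk in memory at a time, so peak RSS stays near the chunk size regardless of object size. A minimal sketch of that pattern (illustrative only; the function and names below are hypothetical, not taken from the oci CLI source):

```python
# Hypothetical sketch of a constant-memory download path: read the
# response body in fixed-size chunks and write each chunk to disk
# before reading the next, instead of buffering the whole object.

CHUNK_SIZE = 1024 * 1024  # 1 MiB per read keeps peak memory small

def stream_to_file(src, dest_path, chunk_size=CHUNK_SIZE):
    """Copy a readable binary stream to dest_path in fixed-size chunks.

    src can be any object with a .read(n) method, e.g. an HTTP
    response body opened in streaming mode.
    """
    with open(dest_path, "wb") as dest:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:  # EOF: read() returns b"" at end of stream
                break
            dest.write(chunk)
```

If the CLI's download path instead collects the whole body (e.g. via a single read() or by accumulating chunks in a list) before writing, memory use would scale with object size, which would match the behavior reported here.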
