[Question] Proper way to utilize output-directory and temp-extension #1713

Closed · BastionNtB opened this issue May 14, 2024 · 7 comments

@BastionNtB

Hello, I'm trying to figure out the best way to utilize temp directories and related options.

I have automated download handling from Radarr/Sonarr, so they each handle the import processing. It seems the original file is being copied as-is to the import location. Is this Sonarr/Radarr's doing? Essentially, it copies the original MKV, and only after that is mp4_automator called to encode the file. This process wastes a lot of time: I have a slow networked HDD RAID (kept for redundancy only) that transfers at about 1 MB/s, so a 20+ GB movie can take a long while just to copy over.

My questions are:

1. What options are there to avoid copying the file before encoding, and are they compatible with Sonarr/Radarr automatic download handling? Ideally it would just reference the file from the download folder, since that is the exact same storage location as the target import folder.
2. If it must copy the file first, how can I keep the copy out of the import folder it will be encoding to? As it stands, Emby discovers the file first, generates metadata for it, and then, after the encoding is done, is left with orphaned files due to the likely name change of the file in question. (e.g. a temp folder for the original file, not the encoded file)

Thank you!
autoProcess.ini settings

[Converter]
ffmpeg = /usr/bin/ffmpeg
ffprobe = /usr/bin/ffprobe
threads = 0
hwaccels = 
hwaccel-decoders = 
hwdevices = 
hwaccel-output-format = 
output-directory = 
output-directory-space-ratio = 0.0
output-format = mp4
output-extension = mp4
temp-extension = temp
minimum-size = 0
ignored-extensions = nfo, ds_store
copy-to = 
move-to = 
delete-original = True
process-same-extensions = True
bypass-if-copying-all = False
force-convert = True
post-process = False
wait-post-process = False
detailed-progress = False
opts-separator = ,
preopts = 
postopts = -max_muxing_queue_size, 9999
regex-directory-replace = [^\w\-_\. ]
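
For reference on the title question, a minimal sketch of the two options in play, with assumed semantics (not confirmed in this thread): output-directory would relocate where the conversion output is written, and temp-extension would be appended to the file while conversion is still in progress so media scanners like Emby skip the unfinished file. Both values below are placeholders:

[Converter]
output-directory = /mnt/scratch/converting
temp-extension = temp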
@mdhiggins (Owner)

This is a limitation of Sonarr/Radarr which has been discussed quite a bit in the past.

Sonarr has some new script triggers, which don't look to be quite ready yet, that might allow running the work before the file gets moved; for now the only trigger fires after the file is moved to its destination.

If you want conversion before this, your best bet is to apply the conversion step to the downloader instead of Sonarr/Radarr (e.g. Deluge, SAB, etc.).

This itself has some limitations, since the metadata that gets provided by Sonarr/Radarr isn't available at this stage, but at least the bulk of the work can be done.

#1674

This would be the new option, but a new script will need to be written; I don't think Radarr has anything yet, and I believe this was still missing some things last I looked.
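
A hedged sketch of the downloader-side setup being suggested, using SABnzbd as the example; the script name matches the repo's SAB post-processing script, but the category keys and paths below are illustrative, not a confirmed configuration:

# SABnzbd: Config > Folders > Scripts Folder = /opt/sickbeard_mp4_automator
# SABnzbd: Config > Categories > assign SABPostProcess.py to the tv/movies categories

[SABNZBD]
convert = True
sonarr-category = tv
radarr-category = movies

The idea is that conversion runs in the download folder before Sonarr/Radarr ever touch the file, so the slow copy to the array happens only once, with the already-converted file.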

@eric9876-git commented Nov 23, 2024

Hi @mdhiggins! Was just reading this thread and the linked issue. Your last comment was May 2024 here and Sept 2023 in #1674. Just wondering if there has been any change in the ability to trigger your post-processing script before transfer? I do see a Connection trigger called "On Manual Interaction Required" that doesn't appear to be documented in the Radarr Settings wiki. It would be really, really nice to trim down the post-processing time. In my case, it's the initial download file transfer from the SSD drive to spinning disk on the NAS, then the conversion to MP4 on the NAS, and then a third conversion for the QTFS/atom stuff. The entire thing can take up to 30 minutes on the NAS, depending on the media, when I have a blazing fast server locally for the initial download.

@mdhiggins (Owner)

I don't believe Radarr has implemented this yet, and the Sonarr option still has some limitations that make it less than ideal.

I would suggest implementing the post-downloader scripts to leverage your SSD before the file is handed back to the *arrs, which is discussed in the readme.

@eric9876-git

Thanks @mdhiggins. Will definitely try that.

@mdhiggins (Owner)

Biggest tip for this that people miss: you need to disable completed download handling; otherwise Sonarr/Radarr will grab the file while it's being converted, which causes issues. The post-downloader (SAB, Deluge, etc.) scripts will notify Sonarr/Radarr when they are done instead of the file being grabbed automatically.
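
A minimal sketch of what that flow relies on, assuming the post-downloader script reads the *arr API details from autoProcess.ini to send the notification; hosts and API keys below are placeholders (8989 and 7878 are the Sonarr/Radarr defaults):

[Sonarr]
host = localhost
port = 8989
apikey = 0123456789abcdef0123456789abcdef
ssl = False

[Radarr]
host = localhost
port = 7878
apikey = 0123456789abcdef0123456789abcdef
ssl = False

Completed Download Handling itself is disabled under Settings > Download Clients in each *arr, so nothing gets imported mid-conversion.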

@eric9876-git

> Biggest tip for this that people miss: you need to disable completed download handling; otherwise Sonarr/Radarr will grab the file while it's being converted, which causes issues. The post-downloader (SAB, Deluge, etc.) scripts will notify Sonarr/Radarr when they are done instead of the file being grabbed automatically.

@mdhiggins do I still need postRadarr.sh/postSonarr.sh in the *arrs if I am now using the post-downloader SABPostProcess.sh?

@mdhiggins (Owner)

Not required, but if you want the tagging then you still want them enabled; the SAB script doesn't know anything about the content, so it doesn't tag, it just does the format conversion.
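
A hedged sketch of the combined setup described above (paths and trigger names are illustrative; Sonarr shown, Radarr analogous):

# Downloader side: SABPostProcess.sh handles the format conversion and notifies the *arr
# *arr side: Settings > Connect > Custom Script, pointed at the tagging script
#   On Import: enabled
#   Path: /opt/sickbeard_mp4_automator/postSonarr.sh

Since the file arriving from the downloader is already converted, the *arr-side script would then effectively just add the metadata tags on import.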
