YTDL: Huge memory usage per subtitle #3456
Comments
You need to add "write-sub=" to ytdl-raw-options.
You don't need youtube-dl.exe in your PATH; you can place it in the same directory as mpv.exe or in mpv's config dir.
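For reference, the suggestion above would look something like this in mpv.conf (a sketch; exact behaviour depends on the mpv and youtube-dl versions in use):

```
# mpv.conf: pass --write-sub to youtube-dl so subtitles are fetched.
# ytdl-raw-options takes comma-separated key=value pairs; flags that
# take no value are written with an empty value, hence the trailing "=".
ytdl-raw-options=write-sub=
```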
I can't confirm such excessive memory usage, but it starts a cache thread for each file and keeps the HTTP connection open, which doesn't seem ideal.
It's not real memory usage; it's probably just memory fragmentation.
I think most of this is caused by the cache creating a 75MB backbuffer for each subtitle file (which obviously doesn't make too much sense). |
This is for text subtitles. libavformat currently always reads text subtitles completely on init. This means the underlying stream is useless and will consume resources for various reasons (network connection, file handles, cache memory). Take care of this by closing the underlying stream if we think the demuxer has read everything. Since libavformat doesn't export whether it did (or whether it may access the stream again in the future), we rely on a whitelist.

Also, instead of setting the stream to NULL or so, set it to an empty dummy stream. This way we don't have to litter the code with NULL checks. demux_lavf.c needs extra changes, because it tries to do clever things for the sake of subtitle charset conversion.

The main reason we keep the demuxer etc. open is because we fell for libavformat being so generic, and we tried to remove corresponding special-cases in the higher-level player code. Some of this is forced due to ass/srt mkv/mp4 demuxing being very similar to external text files. In the future it might be better to do this in a more straight-forward way, such as reading text subtitles into libass and then discarding the demuxer entirely, but for aforementioned reasons this could be more of a mess than the solution introduced by this commit.

Probably fixes #3456.

This should significantly reduce resource usage of those subtitle tracks.
When playing a YouTube video, mpv needs about 80MB (private bytes) for a 4KB subtitle. This gets problematic when playing a video with lots of subtitles. For example, most of pewdiepie's videos (like this one: "https://www.youtube.com/watch?v=LZ0rGTsdfwk") have almost 40 subtitles and therefore generate over 3GB of RAM usage.
I tried using "ytdl-raw-options=sub-lang=en": the high RAM usage doesn't occur anymore and the video loads about 3 times faster, but no subtitles are loaded, not even the English one.
Maybe the long loading time only comes from downloading all the different subtitles, and the 3GB of RAM are just reserved and aren't actually used, but it's still strange behaviour. And if this doesn't get fixed, it would be really nice to be able to limit the number of subtitles that get loaded, e.g. with "ytdl-raw-options=sub-lang=...".
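Combining the two options discussed in this thread would look roughly like this in mpv.conf (a sketch; per the comment above, sub-lang alone doesn't load subtitles because write-sub also needs to be passed):

```
# mpv.conf: fetch subtitles, but only request English ones,
# instead of downloading all ~40 tracks. Options are passed
# through to youtube-dl as comma-separated key=value pairs.
ytdl-raw-options=write-sub=,sub-lang=en
```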
BTW: Why is there no parameter to set ytdl's path?
(I used the latest youtube-dl and mpv.srsfckn.biz builds for windows with --no-config)