
Advice on lowering memory usage #215

Closed
Darkle opened this issue Aug 10, 2023 · 4 comments
Labels
question Further information is requested

Comments

Darkle (Contributor) commented Aug 10, 2023

Hi, I was hoping to get some tips or tricks on lowering memory usage and fixing a possible memory leak in my code.

The F# code here is a simplified version of what I have in my regular code: https://github.com/Darkle/netvips-memtest/blob/main/Program.fs
You should be able to run it with a dotnet restore and a dotnet run.

Some notes:

  • Using Parallel.ForEach, the memory usage goes past 4GB and slowly keeps going up.
  • I also tried using Array.Parallel in case there was an issue with Parallel.ForEach, but the memory keeps rising for that too.
    • You can try it out by commenting out the Parallel.ForEach code and uncommenting the Array.Parallel code.
  • I also tried a simple for loop, and while it uses much less memory, the memory does slowly rise.
  • Switching from Enums.Access.Sequential to Enums.Access.SequentialUnbuffered helped a bit to lower the memory usage.
  • I'm totally willing to sacrifice speed for lower memory usage.

Details:

  • OS: Debian Linux Bullseye
  • .Net version: 7.0.306
  • NetVips version: 2.3.1
  • NetVips.Native version: 8.14.3

Any advice would be appreciated.
Thanks.

@Darkle
Copy link
Contributor Author

Darkle commented Aug 11, 2023

So I was looking through the Go lib for libvips and it has this section: https://github.com/davidbyttow/govips#memory-usage-note
So I tried setting MALLOC_ARENA_MAX=2 and now I have no more memory leak! 🎉
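
For anyone wanting to try the same, a minimal sketch of setting the variable before launching the process (the project invocation is just an example):

```shell
# Cap the number of glibc malloc arenas; fewer arenas means less
# per-thread heap fragmentation in long-running multi-threaded processes.
export MALLOC_ARENA_MAX=2
dotnet run
```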

kleisauke (Owner) commented

I think this is a memory fragmentation issue. You might consider using a different memory allocator such as jemalloc to prevent memory fragmentation in long-running, multi-threaded, glibc-based Linux processes. See lovell/sharp#955 to learn more about memory allocation and why the choice of a memory allocator is important.

For the same reason Redis ships with jemalloc by default since version 2.4.

Since we introduced the specially encoded data types Redis started suffering from fragmentation. We tried different things to fix the problem, but basically the Linux default allocator in glibc sucks really, really hard. […] Every single case of fragmentation in real world systems was fixed by this change, and also the amount of memory used dropped a bit.

Please see lovell/sharp-libvips#95 for a future possible enhancement that relates to this.
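
A common way to swap in jemalloc without rebuilding anything is LD_PRELOAD. A sketch, assuming Debian's libjemalloc2 package (the library path varies by distribution and architecture):

```shell
# Install jemalloc (Debian/Ubuntu), then preload it so glibc's malloc
# is bypassed for this process only.
sudo apt-get install -y libjemalloc2
LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libjemalloc.so.2 dotnet run
```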

kleisauke (Owner) commented

Alternatively, you could consider switching to an Alpine-based Docker image, as the memory allocator in musl, on which Alpine Linux is based, is generally considered very good in terms of avoiding fragmentation and returning freed memory.
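
A sketch of running the project in an Alpine-based (musl) container, assuming the official Alpine variant of the .NET SDK image (adjust the tag to your .NET version):

```shell
# Mount the project directory and run it inside an Alpine/musl container,
# so musl's allocator is used instead of glibc's.
docker run --rm -v "$PWD":/app -w /app \
  mcr.microsoft.com/dotnet/sdk:7.0-alpine \
  dotnet run
```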

FWIW, I'm also aware that sharp reduces the default concurrency to 1 when glibc's memory allocator is used, see commit:
lovell/sharp@5a9cc83

This can be controlled in NetVips with the VIPS_CONCURRENCY environment variable or the NetVips.Concurrency setter.

/// <summary>
/// Gets or sets the number of worker threads libvips should create to process each image.
/// </summary>
public static int Concurrency
{
    get => Vips.ConcurrencyGet();
    set => Vips.ConcurrencySet(value);
}
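
For example, the environment-variable form (the equivalent of assigning 1 to the Concurrency setter above):

```shell
# Limit libvips to a single worker thread per image.
VIPS_CONCURRENCY=1 dotnet run
```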

@kleisauke kleisauke added the question Further information is requested label Aug 11, 2023
Darkle (Contributor, Author) commented Aug 12, 2023

Thanks for the info. 👍

I also tried running my code in an Alpine Linux docker container and that also fixed the issue.

So for anyone reading this thread in the future who is also running into memory issues, try the following:

  1. Set the env variables: VIPS_CONCURRENCY=1 MALLOC_ARENA_MAX=2
  2. If the above doesn't work, try running your code in an Alpine Linux docker container
  3. If the above doesn't work, try changing the default memory allocator you are using to jemalloc
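
Step 1 can be done as a single invocation, with no code changes:

```shell
# Single libvips worker thread, capped glibc malloc arenas.
VIPS_CONCURRENCY=1 MALLOC_ARENA_MAX=2 dotnet run
```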

@Darkle Darkle closed this as completed Aug 12, 2023