stars_proxy memory hog #708
Intending to build a high-dimensional data cube from raster files in plain-text ASCII grid format, I (1) read all files' metadata (file path and attributes) into a data frame, (2) group by dimensions and concatenate the files in each group into a stars_proxy, and (3) summarize/concatenate the stars_proxy objects into a higher-dimensional stars_proxy, similar to the process described in this post on StackExchange or this GitHub issue.
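A minimal sketch of the three steps above. The directory, file-naming scheme, and dimension names (`band`, `time`) are assumptions for illustration, not the actual setup:

```r
library(stars)
library(dplyr)

# (1) collect metadata for all ASCII grid files into a data frame
#     (the naming scheme "<band>_<time>.asc" is hypothetical)
paths <- list.files("rasters", pattern = "\\.asc$", full.names = TRUE)
meta <- data.frame(
  path = paths,
  band = sub("_.*$", "", basename(paths)),
  time = sub("^[^_]*_|\\.asc$", "", basename(paths))
)

# (2) per group, read the files lazily and concatenate them
#     into one stars_proxy along a new dimension
proxies <- meta |>
  group_by(band) |>
  group_map(~ read_stars(.x$path, proxy = TRUE, along = "time"))

# (3) combine the group-wise proxies into a higher-dimensional stars_proxy
my_star_proxy <- do.call(c, c(proxies, along = "band"))

# materializing the proxy is where the memory usage reportedly explodes
cube <- st_as_stars(my_star_proxy)
```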
Upon loading the stars_proxy via my_star_proxy |> st_as_stars(), memory usage climbs into the tens of GB, even when only a couple of files of 5-10 MB each are read. The problem occurs only with files of the following format; with standard data no such problem arises and only a few hundred MB are used.
I suspect I should supply some options to the read_stars routine, but so far I have no good guess.