
When Inferencing OOM error even for just one image #104

Open
nareshmungpara opened this issue Dec 12, 2019 · 3 comments

Comments

@nareshmungpara

> Hi, the input shape is 196 * 720 * 1280. It looks like you are feeding 196 images at once. It's obvious that you don't have enough memory to process them at once.
>
> I don't know how you are using the model, but that is the reason. Thx!

Hi @UrYuWang and @jiny2001,

I am loading only 1 image, with dimensions [1809, 1164], and I am getting this error:

Resource exhausted: OOM when allocating tensor with shape[1,108,1809,1164] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc

Now I don't understand why it is showing 108 even though I have loaded only one image. Can you please help me with that?

Originally posted by @nareshmungpara in #81 (comment)

@nareshmungpara changed the title from the quoted comment ("> Hi, the input shape is 196 * 720 * 1280 …") to "When Inferencing OOM error even for just one image" on Dec 12, 2019
@nareshmungpara
Author

Update:

  • I tried to freeze the model and then run inference from the .pb file, and now I am getting the following error:
    Resource exhausted: OOM when allocating tensor with shape[1,196,1809,1164] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc [[{{node prefix/CNN1/prelu/Abs}}]]

  • In shape[1,196,1809,1164] the second dimension has now become 196, and I am still passing only 1 image as input.
    I am not sure why this happened; can you please help me understand and resolve it?

  • For freezing, I used TensorRT and converted the model to a .pb file, which is what I am using above (a minimal sketch of this load-and-run flow is shown below for reference).
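For readers following along, here is a minimal sketch of loading and running a frozen .pb graph with TensorFlow 1.x (the version this repository targets). This is not the repository's own inference code: the file name and the input/output tensor names are placeholders that have to be looked up in your actual graph; only the "prefix" import name is taken from the error message above.

```python
import numpy as np
import tensorflow as tf

tf1 = tf.compat.v1  # frozen-graph APIs live under compat.v1 on TF 2.x

# Load the frozen graph; "frozen_model.pb" is a placeholder file name.
with tf1.gfile.GFile("frozen_model.pb", "rb") as f:
    graph_def = tf1.GraphDef()
    graph_def.ParseFromString(f.read())

graph = tf.Graph()
with graph.as_default():
    # "prefix" matches the node path in the error message (prefix/CNN1/...).
    tf1.import_graph_def(graph_def, name="prefix")

# Placeholder tensor names -- list the real ones with:
#   print([n.name for n in graph_def.node])
x = graph.get_tensor_by_name("prefix/x:0")
y = graph.get_tensor_by_name("prefix/output:0")

with tf1.Session(graph=graph) as sess:
    lr = np.zeros((1, 120, 160, 1), dtype=np.float32)  # small dummy input
    sr = sess.run(y, feed_dict={x: lr})
    print(sr.shape)
```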

@Sylvus

Sylvus commented Nov 15, 2020

I know this is too late for you, but for others: you can split the image into patches using functionality like this: https://github.com/idealo/image-super-resolution/blob/master/ISR/utils/image_processing.py

@jiny2001
Owner

Thank you, @Sylvus !

Yes, "Resource exhausted: OOM" error simply means memory was not sufficient. This is because, the model size is proportional to the input image resolution.

So in your case, as @Sylvus suggested, you can split the image into smaller pieces to feed the model. Thx.
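To make the suggestion concrete, below is a minimal NumPy sketch of patch-based inference. It is not the ISR utility linked above (which handles the stitching more carefully); `sr_fn` is a hypothetical callable wrapping whatever super-resolution model you run on a single small patch.

```python
import numpy as np

def sr_by_patches(image, sr_fn, scale=2, patch=256, overlap=16):
    """Upscale a large image by feeding the model overlapping patches.

    image   -- H x W x C float array (low-resolution input)
    sr_fn   -- hypothetical callable: upscales one small patch by `scale`
    patch   -- patch size fed to the model (shrink this to fit GPU memory)
    overlap -- pixels shared between neighbouring patches to hide seams
    """
    h, w, c = image.shape
    out = np.zeros((h * scale, w * scale, c), dtype=np.float32)
    step = patch - overlap
    for top in range(0, h, step):
        for left in range(0, w, step):
            bottom, right = min(top + patch, h), min(left + patch, w)
            tile = image[top:bottom, left:right]
            sr_tile = sr_fn(tile)  # ((bottom-top)*scale, (right-left)*scale, c)
            # Later tiles overwrite the overlap region; crude but simple blending.
            out[top * scale:bottom * scale, left * scale:right * scale] = sr_tile
    return out
```

Shrinking `patch` trades speed for memory: the largest tensors the model allocates then scale with the patch area instead of the full 1809 x 1164 frame.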
