Inference Speed / Optimization Tips #214

Closed
raunakdoesdev opened this issue May 24, 2023 · 2 comments
Comments

@raunakdoesdev

Could you give us some more information on how the web portal runs? Information about the hardware or any optimizations done would be greatly appreciated.

I've noticed that the speed of OPSIN on the portal is a lot better than when running the jar locally, even on a pretty beefy server with 128 CPU cores and plenty of RAM.

@raunakdoesdev
Author

Ooh, got a huge speed-up running with Java 20 instead of Java 8.

@dan2097
Owner

dan2097 commented May 24, 2023

The hardware behind opsin.ch.cam.ac.uk is very underwhelming: it's a virtual machine that appears to have been allocated one core of an Intel Xeon Silver 4210R and 1 GB of RAM. It's currently running Java 11.
I'm not aware of OPSIN having any performance issues on Java 8, but given that both Java 11 and 17 are long-term supported releases, there's little reason to still be using Java 8 unless you have to.

OPSIN processes each name on the thread that called it, so to take advantage of a system with many cores, process multiple names in parallel from different threads. The NameToStructure instance is thread-safe and can safely be called from multiple threads simultaneously.
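
A minimal sketch of that pattern, not from the thread itself: it shares one `NameToStructure` instance across worker threads via a Java parallel stream. It assumes OPSIN's `NameToStructure.getInstance()` and `parseToSmiles(String)` convenience API (which returns `null` when a name cannot be parsed); the input names here are just placeholders.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import uk.ac.cam.ch.wwmm.opsin.NameToStructure;

public class ParallelOpsinSketch {
    public static void main(String[] args) {
        // One shared instance; NameToStructure is thread-safe, so every
        // worker thread can call it concurrently.
        NameToStructure nts = NameToStructure.getInstance();

        // Hypothetical input, purely for illustration.
        List<String> names = List.of("ethanol", "benzene", "acetic acid", "2,2-dimethylpropane");

        Map<String, String> smiles = new ConcurrentHashMap<>();

        // parallelStream() fans the names out across the common fork-join
        // pool, so each name is parsed on whichever worker thread picks it up.
        names.parallelStream().forEach(name -> {
            String result = nts.parseToSmiles(name); // null if the name could not be parsed
            if (result != null) {
                smiles.put(name, result);
            }
        });

        smiles.forEach((name, s) -> System.out.println(name + " -> " + s));
    }
}
```

An `ExecutorService` with a fixed-size thread pool would work just as well if you want explicit control over how many threads are used.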
