Inference Speed /Optimization Tips #214
Ooh, got a huge speed-up running with Java 20 instead of Java 8.
The hardware behind opsin.ch.cam.ac.uk is very underwhelming: it's a virtual machine that appears to have been allocated 1 core of an Intel Xeon Silver 4210R and 1 GB of RAM. It's currently running Java 11. OPSIN processes each name on the thread that called it, so to take advantage of a system with many cores, multiple names can be processed in parallel from different threads.
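The parallelization described above can be sketched as follows. This is a minimal example, not OPSIN's own code: `parseName` here is a hypothetical stand-in so the snippet is self-contained; with OPSIN on the classpath you would instead call its `NameToStructure` API from each worker thread.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ParallelNames {
    // Hypothetical stand-in for a name-to-structure call. With OPSIN on the
    // classpath this would delegate to its parser instead of returning a
    // placeholder string.
    static String parseName(String name) {
        return "parsed:" + name;
    }

    public static void main(String[] args) {
        List<String> names = List.of("ethanol", "benzene", "acetic acid");

        // parallelStream() distributes the names over the common fork-join
        // pool, so each name is parsed on whichever worker thread picks it up.
        Map<String, String> results = new ConcurrentHashMap<>();
        names.parallelStream().forEach(n -> results.put(n, parseName(n)));

        results.forEach((n, s) -> System.out.println(n + " -> " + s));
    }
}
```

Because each call is confined to the calling thread, throughput should scale with the number of worker threads up to the core count, which is why a single-core VM like the one above cannot benefit from this.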
Could you give us some more information on how the web portal runs? Information about the hardware or any optimizations done would be greatly appreciated.
I've noticed that OPSIN's speed on the portal is a lot better than when running the jar locally, even on a pretty beefy server with 128 CPU cores and plenty of RAM.