GPT-2 state and GPT-j-6B #3
Comments
I would like to ask about the state of the GPT-2 model. Will it arrive at Hugging Face soon?

I would also like to ask whether you intend to train GPT-j-6B. Training this model would be impossible for some people due to its hardware requirements, but you have Mare Nostrum, the dataset, and the previous GPT-2 version.

They will be published soon, but I cannot tell you when. We hope no later than this October.

In the short term we do not have plans to do so. However, we are continuously gathering corpora for this purpose. I will keep this issue open in order to provide updates.

I am pleased to announce that GPT-2 base and large are already available. I will leave this thread open for the next steps.