
mT5 data sampling for pre-training #106

Open
cyk1337 opened this issue Aug 26, 2022 · 0 comments

Comments

@cyk1337

cyk1337 commented Aug 26, 2022

Hi @craffel, I have some questions about the data sampling and the T5 span-corruption setup used for mT5 pre-training. Could you please point me to the corresponding implementation?

  • Regarding sampling of high- and low-resource languages: do you use one dataloader per language (out of the 100+ languages) and sample among them during pre-training? (A sampling sketch follows this list.)
  • Regarding span corruption without padding (merge_examples_to_reduce_padding): is mT5 trained without padding? If so, does each sampled batch contain a single language? Otherwise, a concatenated example may mix different languages, and span corruption could cross language boundaries. (A packing sketch also follows below.)