May 2020 [arXiv]
What's Unique
This paper presents a technique for predicting the supersense of a masked word, using weak supervision that infuses word supersenses during pretraining. It surpasses state-of-the-art results on the WiC task of SuperGLUE.
How It Works
- It selects a fixed set of 45 supersenses as the candidates, both for infusion as weak supervision and for prediction.
- The following figure lays out the architecture:
- The following example illustrates the concept of weak supervision:
- Sense language modelling, allowed-senses prediction: since the true sense of a masked word is unknown, the model is trained to place probability mass on any of the word's allowed supersenses.
- Sense language modelling, regularisation: a regularisation term spreads the predicted probability across all of the word's allowed supersenses, preventing collapse onto a single sense.
- The following figure shows an example of how it predicts supersenses for masked words as well as for an unmasked sentence.
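The two sense-language-modelling terms above can be sketched as a simple loss over supersense logits. This is a minimal illustration, not the paper's implementation: the tiny supersense inventory, the word-to-allowed-senses map, and the regulariser weight are hypothetical toy values.

```python
import math

# Hypothetical toy inventory: a few supersense labels standing in for the
# full set of 45 used in the paper.
SUPERSENSES = ["noun.food", "noun.artifact", "verb.motion", "adj.all"]

# Weak supervision: each word maps to ALL of its allowed supersenses
# (derived from a lexical resource in the paper; hard-coded here).
ALLOWED = {
    "apple": {"noun.food", "noun.artifact"},
    "run":   {"verb.motion"},
}

def softmax(logits):
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    z = sum(exps)
    return [e / z for e in exps]

def sense_lm_loss(logits, word, reg_weight=0.1):
    """Combine the two sense-LM terms described above.

    Allowed-senses term: -log of the total probability mass placed on the
    word's allowed supersenses (predicting any allowed sense is rewarded).
    Regularisation term: the negative mean log-probability over the allowed
    senses, which spreads mass across all of them instead of letting the
    model collapse onto a single sense.
    """
    probs = softmax(logits)
    allowed = [probs[SUPERSENSES.index(s)] for s in ALLOWED[word]]
    allowed_term = -math.log(sum(allowed))
    reg_term = -sum(math.log(p) for p in allowed) / len(allowed)
    return allowed_term + reg_weight * reg_term

# Example: supersense logits predicted for the masked word "apple".
loss = sense_lm_loss([2.0, 1.5, -1.0, -1.0], "apple")
print(round(loss, 3))
```

Note that the allowed-senses term is already small when the mass sits on one allowed sense; only the regulariser pushes it to cover both.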
Results
- It shows a 2.5% improvement on the WiC task.