NLP/NLU/Singularity discussions #3
Comments
We've been reaching a bit of a new era since 2018's BERT. Since then, there have been massive improvements in many NLP tasks, mostly using Transformers in some way. Very exciting, but sadly a bit outside my wheelhouse still. I'm excited to jump into it, but the lack of explainability of deep neural networks is bothersome. That said, the future is neural, and most of the genuinely interesting new developments use some (Hugging Face) neural model. https://huggingface.co/ is super interesting, and I'd love to work there, to be honest.
Welcome back @tomaarsen ! :)
Shall we begin: Watching the evolution of the state of the art in NLP has been a pleasure. On transformers and progressAs you say, transformers and more generally pre-trained language models have been disrupting, enabling significant F1 score accuracy gains almost universally. Edit: |
Woah, thanks for the info. My assumption was that BERT-based models were still constantly improving, given the large number of BERT clones (as you mentioned). I also hadn't heard about XLNet yet! As for NLTK: the work is very interesting to me. It's essentially a large collection of small programs supplied by hundreds of contributors over the years, many of which weren't really maintained much beyond their initial contribution. To me, it's great and fun practice in solving issues in a somewhat neglected codebase. Modernizing, bug fixing, and optimizing, all while learning about some of the core NLP tasks, is just a lot of fun. I have great respect for spaCy, and have recently been trying to learn more about it. For example, I did some work on PyTextRank, a spaCy pipeline extension. I'm even considering trying to do an internship or something at Explosion (the company behind spaCy). That said, I'm unsure if I'm sufficiently qualified.
Hi @tomaarsen, welcome to my collection of interesting brains! You have been recognized for your interesting and useful project on word inflection generation.
Since GitHub does not provide a chat, if you want to share thoughts/facts about NLP/NLU/information extraction/other topics, feel free to do it here :)