
💫 Support simple training format in nlp.evaluate and add tests #4033

Merged 2 commits into master from feature/nlp-evaluate-simple on Jul 27, 2019

Conversation

ines (Member) commented on Jul 27, 2019

Description

  • Support passing in (text, annotations) tuples in nlp.evaluate instead of just (doc, gold) tuples, to match nlp.update (see the sketch after this list).
  • Add tests to make sure updating and evaluating with both formats works.

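A minimal usage sketch of the change, assuming spaCy v2.x with the en_core_web_sm model installed; the example texts and entity annotations are illustrative, not taken from the PR:

```python
import spacy

# Development data in the "simple" training format already accepted by nlp.update:
# (text, annotations) tuples rather than pre-built (Doc, GoldParse) pairs.
DEV_DATA = [
    ("Apple is looking at buying a U.K. startup", {"entities": [(0, 5, "ORG")]}),
    ("I like London and Berlin.", {"entities": [(7, 13, "GPE"), (18, 24, "GPE")]}),
]

nlp = spacy.load("en_core_web_sm")

# With this change, nlp.evaluate accepts the tuples directly and returns a Scorer.
scorer = nlp.evaluate(DEV_DATA)
print(scorer.ents_p, scorer.ents_r, scorer.ents_f)
```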
Types of change

enhancement

Checklist

  • I have submitted the spaCy Contributor Agreement.
  • I ran the tests, and all new and existing tests passed.
  • My changes don't require a change to the documentation, or if they do, I've added all required information.

ines added the enhancement (Feature requests and improvements) and training (Training and updating models) labels on Jul 27, 2019
ines merged commit fc69da0 into master on Jul 27, 2019
ines deleted the feature/nlp-evaluate-simple branch on July 27, 2019 at 15:30
polm pushed a commit to polm/spaCy that referenced this pull request on Aug 18, 2019:

Support simple training format in nlp.evaluate and add tests (explosion#4033)

* Support simple training format in nlp.evaluate and add tests

* Update docs [ci skip]