Bug - labels are only taken from true data #39
Labels: bug (Something isn't working)

Comments
@icoxfog417 isn't it a bug?

@icoxfog417 Can you explain why it is labeled as a question and not a bug?
Hironsan added the bug label (Something isn't working) and removed the question label (Further information is requested) on Sep 30, 2020.
@yoeldk As of v1.0.0:

>>> from seqeval.metrics import classification_report
>>> from seqeval.scheme import IOB2
>>> y_true = [['O', 'O', 'O', 'B-MISC', 'I-MISC', 'I-MISC', 'O'], ['B-PER', 'I-PER', 'O']]
>>> y_pred = [['O', 'O', 'B-MISC', 'I-MISC', 'I-MISC', 'I-MISC', 'O'], ['B-PER', 'I-PER', 'B-TEST']]
>>> print(classification_report(y_true, y_pred, mode='strict', scheme=IOB2))
              precision    recall  f1-score   support

        MISC       0.00      0.00      0.00         1
         PER       1.00      1.00      1.00         1
        TEST       0.00      0.00      0.00         0

   micro avg       0.33      0.50      0.40         2
   macro avg       0.33      0.33      0.33         2
weighted avg       0.50      0.50      0.50         2
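The strict-mode numbers above can be reproduced by hand. The sketch below is not seqeval's implementation (`iob2_entities` is a hypothetical helper); it extracts IOB2 entity spans and counts only exact span-and-type matches, which is what strict mode rewards:

```python
def iob2_entities(seq):
    """Extract (type, start, end) entity spans from an IOB2 tag sequence."""
    entities, etype, start = [], None, None
    for i, tag in enumerate(seq):
        if tag.startswith('B-'):
            if etype is not None:
                entities.append((etype, start, i))
            etype, start = tag[2:], i
        elif tag.startswith('I-') and etype == tag[2:]:
            continue  # the current entity continues
        else:
            if etype is not None:
                entities.append((etype, start, i))
            etype, start = None, None
    if etype is not None:
        entities.append((etype, start, len(seq)))
    return entities

y_true = [['O', 'O', 'O', 'B-MISC', 'I-MISC', 'I-MISC', 'O'], ['B-PER', 'I-PER', 'O']]
y_pred = [['O', 'O', 'B-MISC', 'I-MISC', 'I-MISC', 'I-MISC', 'O'], ['B-PER', 'I-PER', 'B-TEST']]

# Tag each span with its sequence index so spans from different sentences never collide.
true_ents = {(i,) + e for i, s in enumerate(y_true) for e in iob2_entities(s)}
pred_ents = {(i,) + e for i, s in enumerate(y_pred) for e in iob2_entities(s)}

tp = len(true_ents & pred_ents)  # only PER (0, 2) in the second sentence matches exactly
precision = tp / len(pred_ents)  # 1 / 3: the MISC span is shifted, TEST is spurious
recall = tp / len(true_ents)     # 1 / 2
```

The predicted MISC span (2, 6) overlaps the true span (3, 6) but does not match it exactly, so strict mode scores MISC as both a false positive and a false negative.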
Example:
from seqeval.metrics import classification_report
y_true = [['O', 'O', 'O', 'B-MISC', 'I-MISC', 'I-MISC', 'O'], ['B-PER', 'I-PER', 'O']]
y_pred = [['O', 'O', 'B-MISC', 'I-MISC', 'I-MISC', 'I-MISC', 'O'], ['B-PER', 'I-PER', 'B-TEST']]
print(classification_report(y_true, y_pred))
As you can see, B-TEST does not appear in the table even though it is a false positive (it appears only in the predicted labels). The complete label list should be the union of the true and predicted labels.
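A minimal sketch of the fix this report suggests (not seqeval's actual code; `collect_labels` is a hypothetical name): build the per-type label list from the union of both tag sets, so a type that occurs only in predictions, like TEST, still gets a row in the report:

```python
from itertools import chain

def collect_labels(y_true, y_pred):
    """Gather entity types from BOTH the true and the predicted tag sequences."""
    tags = chain.from_iterable(chain(y_true, y_pred))
    # Strip the B-/I- prefix and drop the 'O' (outside) tag.
    return sorted({tag.split('-', 1)[1] for tag in tags if tag != 'O'})

y_true = [['O', 'O', 'O', 'B-MISC', 'I-MISC', 'I-MISC', 'O'], ['B-PER', 'I-PER', 'O']]
y_pred = [['O', 'O', 'B-MISC', 'I-MISC', 'I-MISC', 'I-MISC', 'O'], ['B-PER', 'I-PER', 'B-TEST']]

print(collect_labels(y_true, y_pred))  # ['MISC', 'PER', 'TEST'] -- TEST is no longer dropped
```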