Error in deserializing Spacy doc from disk #3468
Check for Token.is_sent_start first (which is serialized/deserialized correctly)
Thanks for the report! I think what's going on here is that the `is_parsed` flag on the Doc isn't restored when it's deserialized. As a quick workaround, you can do the following: new_doc.is_parsed = True. It's a hack, but it should work. Basically, if the flag is set, spaCy will trust the sentence boundaries that are already stored on the tokens.
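The workaround above can be sketched as follows. This is a hedged reconstruction, not the commenter's exact code: the blank English pipeline, example text, and file path are illustrative assumptions (the thread used a pretrained model), and the try/except accounts for `is_parsed` no longer being writable in spaCy 3.x.

```python
import spacy
from spacy.tokens import Doc

# Illustrative setup (assumption): a blank pipeline and a throwaway path.
nlp = spacy.blank("en")
doc = nlp("One sentence here. Another one here.")
doc.to_disk("/tmp/test_result.bin")

# Deserialize the Doc, then apply the suggested workaround.
new_doc = Doc(nlp.vocab).from_disk("/tmp/test_result.bin")
try:
    # Mark the Doc as parsed so .sents trusts the stored boundaries.
    new_doc.is_parsed = True
except AttributeError:
    # spaCy 3.x no longer exposes a writable is_parsed flag; the
    # underlying serialization bug was fixed, so no hack is needed there.
    pass
```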
Thank you, but it doesn't work: it raises an error while printing the content of new_doc.
Same problem here! It takes 7 minutes to load... strange indeed!
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
How to deserialize Spacy results from disk
I need to run an algorithm on a lot of text files. To pre-process them, I use Spacy, which has pre-trained models for different languages. Since the pre-processed results are used in several parts of the algorithm, it is better to save them to disk once and load them many times. However, the Spacy deserialization step raises an error. I wrote a short script to reproduce it:
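The reproduction script did not survive in this copy of the thread. The following is a minimal sketch reconstructed from the traceback below; the model choice, example text, and file path are assumptions (the thread used a pretrained model such as de_core_news_sm, while a blank pipeline is used here so the sketch runs without a model download):

```python
import spacy
from spacy.tokens import Doc

# Assumption: a blank pipeline stands in for the pretrained model.
nlp = spacy.blank("en")
doc = nlp("This is the first sentence. This is the second sentence.")

doc.to_disk("/tmp/test_result.bin")                          # serialize
new_doc = Doc(nlp.vocab).from_disk("/tmp/test_result.bin")   # deserialize

# Iterating the deserialized Doc's sentences triggers the error.
try:
    for ix, sent in enumerate(new_doc.sents, 1):
        print(ix, sent.text)
except ValueError as err:
    print(err)  # [E030] Sentence boundaries unset. ...
```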
However, the above example code makes the following error:
Traceback (most recent call last):
  File "/tmp/test_result.bin", line 14, in <module>
    for ix, sent in enumerate(new_doc.sents, 1):
  File "doc.pyx", line 535, in __get__
ValueError: [E030] Sentence boundaries unset. You can add the 'sentencizer' component to the pipeline with: nlp.add_pipe(nlp.create_pipe('sentencizer')) Alternatively, add the dependency parser, or set sentence boundaries by setting doc[i].is_sent_start.
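The fix suggested in the error message can be sketched like this. It is a hedged example rather than the asker's code; the try/except covers both the spaCy 2.x API (`nlp.create_pipe`, as the message shows) and the spaCy 3.x API, where `add_pipe` takes the component name directly:

```python
import spacy

nlp = spacy.blank("en")
try:
    nlp.add_pipe("sentencizer")                   # spaCy 3.x: pass the name
except ValueError:
    nlp.add_pipe(nlp.create_pipe("sentencizer"))  # spaCy 2.x: pass the component

# With the sentencizer setting boundaries, .sents no longer raises E030.
doc = nlp("First sentence. Second sentence.")
print([sent.text for sent in doc.sents])
```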
I tried to change the first two lines to de_nlp = spacy.load("de_core_news_sm"), but it still raises other errors.
I really need your help. Any information about this topic is appreciated.
My Environment