💫 Try multi-processing in v2 nlp.pipe()? #1303
@honnibal I am interested in working on the issue.
@souravsingh Great! Here's the method that would need to change: https://github.com/explosion/spaCy/blob/develop/spacy/language.py#L433 I would suggest first getting the empty pipeline working (i.e. just the tokenizer), and then trying the models. The main complication you might encounter is that the v2 models use numpy, which multi-threads the matrix multiplications via OpenBLAS. I'm not sure whether this will cause trouble in child processes. I also don't know whether the GPU will complain in child processes or not.
@honnibal Are we free to use |
@souravsingh Yes, I like joblib.
In case this is helpful, I've had success getting multiprocessing to work with spaCy by using the |
Hey, what's the status here? Is anyone working on this?

Just out of curiosity, what is stopping you from releasing the GIL?

@honnibal just wanna check that this was still the right method to be changing (since this link is from two years back). I'm interested in picking this up, since it seems like it hasn't been completed yet.
* refactor: separate formatting docs and golds in Language.update
* fix return typo
* add pipe test
* unpickleable object cannot be assigned to p.map
* passed test pipe
* passed test!
* pipe terminate
* try pipe
* passed test
* fix ch
* add comments
* fix len(texts)
* add comment
* add comment
* fix: multiprocessing of pipe is not supported in 2
* test: use assert_docs_equal
* fix: is_python3 -> is_python2
* fix: change _pipe arg to use functools.partial
* test: add vector modification test
* test: add sample ner_pipe and user_data pipe
* add warnings test
* test: fix user warnings
* test: fix warnings capture
* fix: remove islice import
* test: remove warnings test
* test: add stream test
* test: rename
* fix: multiproc stream
* fix: stream pipe
* add comment
* mp.Pipe seems to be able to use with relative small data
* test: skip stream test in python2
* sort imports
* test: add reason to skiptest
* fix: use pipe for docs communication
* add comments
* add comment
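The commit list mentions using `mp.Pipe` to stream docs between processes for relatively small payloads. The pattern can be sketched like this — a minimal, hypothetical example using only the standard library, with uppercasing as a toy stand-in for real spaCy processing:

```python
import multiprocessing as mp

def worker(conn):
    # Child process: receive batches of texts over the pipe and send
    # back processed results; a None sentinel ends the loop.
    # (Toy stand-in: uppercasing instead of running a spaCy pipeline.)
    while True:
        batch = conn.recv()
        if batch is None:
            break
        conn.send([text.upper() for text in batch])
    conn.close()

if __name__ == "__main__":
    parent_conn, child_conn = mp.Pipe()
    proc = mp.Process(target=worker, args=(child_conn,))
    proc.start()
    parent_conn.send(["hello", "world"])
    print(parent_conn.recv())  # -> ['HELLO', 'WORLD']
    parent_conn.send(None)     # tell the worker to shut down
    proc.join()
```

`Connection.send` pickles its argument, which is why the commits note that objects passed this way must be picklable and that `mp.Pipe` suits relatively small data.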
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
In spaCy 1 multi-processing was a non-starter, for a variety of reasons. The model took a long time to load, and the integer ID mapping was stateful. These have been fixed in v2. At the same time, the v2 neural network model can't yet release the GIL, making multi-threading inefficient. We should therefore consider whether multi-processing would be a better solution.
The nlp.pipe() method is already a generator that takes a batch_size argument. I think it should be pretty easy to try out multi-processing here.