
Transformer inference on CPU #10543

Mar 24, 2022 · 1 comment · 2 replies

Transformers should run on CPU, and if they don't, that could be a bug. In most cases, though, they will be too slow to be useful.

To be clear, are you getting that ValueError when running prefer_gpu() on a machine with no GPU? Or when running spacy.load()? Please provide the actual code / configuration you are using so we can understand the issue.
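For reference, here is a minimal sketch of CPU-only transformer inference, assuming `spacy-transformers` and the `en_core_web_trf` package are installed (the model name is only an example):

```python
import spacy

# spacy.prefer_gpu() returns a bool and silently falls back to CPU when
# no GPU is available; spacy.require_gpu() is the call that raises an
# error if no GPU is found.
gpu_activated = spacy.prefer_gpu()
print(f"GPU activated: {gpu_activated}")

# Loading and running a transformer pipeline works on CPU as well,
# it is just considerably slower than on GPU.
nlp = spacy.load("en_core_web_trf")
doc = nlp("Transformer inference also works on CPU, just more slowly.")
print([(ent.text, ent.label_) for ent in doc.ents])
```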

Answer selected by kanayer
Labels: feat / transformer