Currently the Ollama class assumes Ollama is running on localhost. This is an unnecessary limitation.
The `config` object is not passed down to the "native wrapper", although it looks like it is in the code; as a result, the wrapper falls back to the default host and port.
As a developer, I would like to supply a host and port to be used when creating my Ollama instance.
Example of intended use (see config object)
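The original snippet is not preserved here, so the following is a hedged sketch of the intended use. The class shape, field names (`baseUrl`, `OllamaConfig`), and default URL are illustrative assumptions, not the actual LlamaIndexTS API:

```typescript
// Hypothetical sketch of the requested behavior: honor a host supplied
// via the config object instead of unconditionally using localhost.
// All names here are illustrative, not the real LlamaIndexTS types.
interface OllamaConfig {
  host?: string; // e.g. "http://192.168.1.10:11434"
}

class Ollama {
  readonly baseUrl: string;

  constructor(init: { model: string; config?: OllamaConfig }) {
    // Fall back to localhost only when no host is configured.
    this.baseUrl = init.config?.host ?? "http://127.0.0.1:11434";
  }
}

const llm = new Ollama({
  model: "llama2",
  config: { host: "http://192.168.1.10:11434" },
});
console.log(llm.baseUrl); // "http://192.168.1.10:11434"
```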
The "native wrapper" class where the config object never arrives:
LlamaIndexTS/packages/llamaindex/src/internal/deps/ollama.js
Line 274 in 41fe871
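A minimal, self-contained sketch of the failure mode (simplified stand-ins for the real classes in the vendored `ollama.js`; all names here are hypothetical): the outer class accepts a config object but never forwards it when constructing the wrapper, so the wrapper silently falls back to localhost.

```typescript
// Simplified reproduction of the reported bug; names are illustrative.
// The wrapper defaults to localhost when it receives no config.
class NativeWrapper {
  readonly host: string;
  constructor(config?: { host?: string }) {
    this.host = config?.host ?? "http://127.0.0.1:11434";
  }
}

// Buggy outer class: it accepts a config object but constructs the
// wrapper without it, so the caller's host is silently ignored.
class BuggyOllama {
  readonly wrapper: NativeWrapper;
  constructor(init: { config?: { host?: string } }) {
    this.wrapper = new NativeWrapper(); // config is dropped here
  }
}

const o = new BuggyOllama({ config: { host: "http://10.0.0.5:11434" } });
console.log(o.wrapper.host); // "http://127.0.0.1:11434" — not the configured host
```

The fix suggested by the issue is simply to forward the object: `new NativeWrapper(init.config)`.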