Regarding the Studio AI Assistant and Local LLMs #22014
Replies: 2 comments
-
Hey, hoping to revitalize this discussion, which would be very useful. It would be trivial to update the OpenAI client as pointed out above, but it references a custom ai-commands/edge package, and specifically the clippy function here: supabase/packages/ai-commands/src/docs.ts (line 169 in 879a668), which seems to have hardcoded OpenAI URLs, model names, etc. It would still be relatively easy to clean up overall, but probably not as easy as changing the one-line instantiation of the client.
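For reference, a minimal sketch of what moving that configuration out of the code might look like, assuming hypothetical `OPENAI_BASE_URL` and `OPENAI_MODEL` environment variables (the actual clippy helper in ai-commands is structured differently and does more than this):

```ts
import OpenAI from 'openai'

// Hypothetical: read the endpoint and model from the environment instead of
// hardcoding them inside ai-commands. Names and defaults are illustrative.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: process.env.OPENAI_BASE_URL ?? 'https://api.openai.com/v1',
})

const model = process.env.OPENAI_MODEL ?? 'gpt-3.5-turbo'

export async function answerDocsQuestion(question: string) {
  // Simplified: only the chat-completion call, no doc context or streaming.
  const response = await openai.chat.completions.create({
    model,
    messages: [{ role: 'user', content: question }],
  })
  return response.choices[0]?.message?.content ?? ''
}
```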
-
I would love it if the LLM could just run in the browser, like https://chat.webllm.ai/. After I use postgres.new for a few minutes, I get an error saying that I am going to melt their servers with all of my LLM requests. I would love not to be limited by this.
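For what it's worth, WebLLM already exposes an OpenAI-style chat API in the browser; a rough sketch, assuming the `@mlc-ai/web-llm` package and an example model ID (check WebLLM's prebuilt model list for current names):

```ts
import { CreateMLCEngine } from '@mlc-ai/web-llm'

// Downloads the weights into the browser cache and runs them locally via WebGPU.
// The model ID is an example; valid IDs come from WebLLM's prebuilt model list.
const engine = await CreateMLCEngine('Llama-3.1-8B-Instruct-q4f16_1-MLC')

// The chat interface deliberately mirrors OpenAI's chat.completions shape.
const reply = await engine.chat.completions.create({
  messages: [{ role: 'user', content: 'List all tables in a Postgres database.' }],
})

console.log(reply.choices[0]?.message?.content)
```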
-
I was wondering if there are plans to support connecting to something like LM Studio, which can host a server that mimics the OpenAI API.

Please let me know if I'm misunderstanding this, but would reading the environment variable `OPENAI_BASE_URL` and overriding the one used in each instantiation of the OpenAI client (`const openai = new OpenAI({ apiKey: openAiKey, baseURL: baseUrl })`) allow this to work? I would appreciate any insight offered.
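If it helps, here is a rough sketch of that override against a local LM Studio server, assuming a hypothetical `OPENAI_BASE_URL` variable (LM Studio's local server defaults to http://localhost:1234/v1 and generally ignores the API key):

```ts
import OpenAI from 'openai'

// Hypothetical: fall back to the real OpenAI endpoint when no override is set,
// otherwise point the client at a local OpenAI-compatible server like LM Studio.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY ?? 'lm-studio', // local servers typically ignore the key
  baseURL: process.env.OPENAI_BASE_URL ?? 'https://api.openai.com/v1',
})

// Usage stays the same; only the endpoint the requests are sent to changes.
const completion = await openai.chat.completions.create({
  model: 'local-model', // LM Studio serves whichever model is currently loaded
  messages: [{ role: 'user', content: 'Hello from a local LLM!' }],
})

console.log(completion.choices[0]?.message?.content)
```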