
Allows User to Set System Prompt via "Additional Options" in Chat Interface #1353

Merged into zylon-ai:main (17 commits) on Dec 10, 2023

Conversation

@aly-shehata (Contributor) commented Dec 1, 2023

This PR allows users to set the system prompt from within the Gradio interface (#1329). The default system prompt shows up as a placeholder in the input box, and the user may choose to override it.

I plan to add some documentation updates that provide example prompts; these updates could be linked to from the Gradio interface.

Recording.2023-12-01.142954.mp4
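
To illustrate the behavior described above, here is a minimal, self-contained sketch (not the code from this PR) of how a system prompt override can be exposed through a Gradio additional input, with the default prompt surfacing as a placeholder. `DEFAULT_SYSTEM_PROMPT` and the echo-style chat function are illustrative stand-ins.

```python
# Minimal sketch, not the PR's implementation: a Gradio chat UI where the system
# prompt can be overridden from an additional input, and the default prompt is
# shown as the textbox placeholder.
import gradio as gr

DEFAULT_SYSTEM_PROMPT = "You are a helpful assistant."  # assumed default


def chat(message, history, system_prompt):
    # Use the user-provided override when present, otherwise fall back to the default.
    prompt = (system_prompt or "").strip() or DEFAULT_SYSTEM_PROMPT
    return f"(system prompt in effect: {prompt!r}) You said: {message}"


demo = gr.ChatInterface(
    chat,
    additional_inputs=[
        gr.Textbox(
            label="System Prompt",
            placeholder=DEFAULT_SYSTEM_PROMPT,  # default appears only as a placeholder
            lines=2,
        )
    ],
)

if __name__ == "__main__":
    demo.launch()
```

Gradio renders `additional_inputs` in a collapsible section below the chat box, which roughly matches the "Additional Options" interaction described in the title.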

@aly-shehata marked this pull request as ready for review on December 3, 2023, 23:12
@aly-shehata (Contributor, Author) commented:
I noticed that the preview docs published above do not contain the changes I made in ui.mdx. I'm not sure whether that is a doc-generation issue or whether some additional change is needed to review the documentation updates.

Three review threads on private_gpt/ui/ui.py (outdated, resolved).
…ly add system prompt as a system message if prompt is defined. Add new settings fields "default_query_system_prompt" and "default_chat_system_prompt". Updated documentation with new settings and minor corrections.
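
For context on the "only add the system prompt as a system message if a prompt is defined" part of that commit, a hedged sketch along these lines is possible. It uses llama_index 0.9-era chat message types (the import path is an assumption about the version in use at the time), and the helper name `build_messages` is illustrative, not from the PR.

```python
# Sketch only: build the chat message list and include a system message only when
# a system prompt is actually set. Import path assumes llama_index 0.9.x.
from llama_index.llms import ChatMessage, MessageRole


def build_messages(user_text: str, system_prompt: str | None) -> list[ChatMessage]:
    messages: list[ChatMessage] = []
    if system_prompt:  # skip the system message entirely when no prompt is defined
        messages.append(ChatMessage(role=MessageRole.SYSTEM, content=system_prompt))
    messages.append(ChatMessage(role=MessageRole.USER, content=user_text))
    return messages
```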
Review thread on settings.yaml (outdated, resolved).
@@ -30,6 +30,8 @@

SOURCES_SEPARATOR = "\n\n Sources: \n"

MODES = ["Query Docs", "Search in Docs", "LLM Chat"]
A contributor commented on this line:

Note for a later refactoring: this could be replaced by an enum.Enum class.
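
A possible shape for that suggested refactor (not part of this PR; the member names are illustrative):

```python
# Sketch of the suggested later refactor: replace the MODES list with an Enum so
# modes are referenced by name instead of raw strings.
import enum


class Modes(str, enum.Enum):
    QUERY_DOCS = "Query Docs"
    SEARCH_IN_DOCS = "Search in Docs"
    LLM_CHAT = "LLM Chat"


# If a plain list of labels is still needed (e.g. for the Gradio radio choices):
MODES = [mode.value for mode in Modes]
```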

lopagela previously approved these changes Dec 8, 2023
@lopagela (Contributor) left a comment:

Looks good to me!

lopagela added a commit to lopagela/privateGPT that referenced this pull request Dec 8, 2023
As discussed on Discord, the decision has been made to remove the system prompts by default, to better segregate the API and UI usages.

A concurrent PR (zylon-ai#1353) is enabling the dynamic setting of a system prompt in the UI.

Therefore, if UI users want to use a custom system prompt, they can specify one directly in the UI. If API users want to use a custom prompt, they can pass it directly in the messages they send to the API.

In light of the two use cases above, it becomes clear that a default system_prompt does not need to exist.
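
For the API side of that argument, a client can carry its own system prompt inside the messages it sends. The snippet below is illustrative only; the endpoint path, port, and payload shape assume an OpenAI-style chat completions API and may not match privateGPT's exact schema.

```python
# Illustrative request: the caller supplies its own system prompt as a
# system-role message. URL, port, and payload shape are assumptions.
import requests

response = requests.post(
    "http://localhost:8001/v1/chat/completions",
    json={
        "messages": [
            {"role": "system", "content": "Answer using only the ingested documents."},
            {"role": "user", "content": "Summarize the main findings."},
        ],
    },
    timeout=60,
)
print(response.json())
```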
imartinez previously approved these changes Dec 8, 2023
Two review threads on fern/docs/pages/manual/ui.mdx (outdated, resolved).
imartinez pushed a commit that referenced this pull request Dec 8, 2023
@aly-shehata dismissed stale reviews from imartinez and lopagela via 626a9e0 December 8, 2023 22:34
…-prompt

# Conflicts:
#	private_gpt/components/llm/llm_component.py
#	private_gpt/settings/settings.py
Review threads (outdated, resolved): settings.yaml (2), private_gpt/settings/settings.py (3), private_gpt/ui/ui.py (2).
…nd update settings.py comments for new fields. Removed usage of llama_index.DEFAULT_SYSTEM_PROMPT.
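
The two settings fields named earlier could look roughly like this in private_gpt/settings/settings.py. This is a sketch assuming the project's pydantic-based settings models; the enclosing class name and the optional defaults are assumptions, and only the field names come from the commit messages.

```python
# Sketch, not the actual settings.py: the two default-system-prompt fields with
# descriptive field documentation, and no fallback to llama_index.DEFAULT_SYSTEM_PROMPT.
from pydantic import BaseModel, Field


class UISettings(BaseModel):  # enclosing model name is an assumption
    default_chat_system_prompt: str | None = Field(
        None,
        description="The default system prompt used for the chat mode.",
    )
    default_query_system_prompt: str | None = Field(
        None,
        description="The default system prompt used for the query mode.",
    )
```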
imartinez previously approved these changes Dec 9, 2023
Review thread on private_gpt/settings/settings.py (outdated, resolved).
@aly-shehata (Contributor, Author) commented:
@imartinez I believe this is (finally) ready to merge. Please let me know if there are any other suggestions or findings. Thank you!

@imartinez (Collaborator) left a comment:

LGTM!

@imartinez merged commit 145f3ec into zylon-ai:main Dec 10, 2023
8 checks passed
simonbermudez pushed a commit to simonbermudez/saimon that referenced this pull request Feb 24, 2024
simonbermudez pushed a commit to simonbermudez/saimon that referenced this pull request Feb 24, 2024