[META] ML Inference Processor Enhancements I #2839
@opensearch-project/admin -- Can you please move this to our roadmap https:/orgs/opensearch-project/projects/1 under 2.17?
- Support one-to-one inference in the ml inference search response processor (#2801)
- Support lists in string substitution during the Predict API, so GenAI/RAG use cases can work in the ml inference search response processor with a prompt defined at the connector level (#2871)
- Add a to_string() method to HttpConnector to support custom prompts (#2871)
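To illustrate the first item, below is a minimal sketch of a search pipeline that attaches an ml_inference response processor running one inference call per document. The pipeline name, model ID, and field names are placeholders; the `one_to_one` flag and `input_map`/`output_map` structure are assumptions based on the processor's documented configuration, not taken from this issue.

```json
PUT /_search/pipeline/my_rag_pipeline
{
  "response_processors": [
    {
      "ml_inference": {
        "model_id": "<your_model_id>",
        "input_map": [
          { "context": "passage_text" }
        ],
        "output_map": [
          { "generated_answer": "response" }
        ],
        "one_to_one": true
      }
    }
  ]
}
```

With `one_to_one` set to `true`, the processor invokes the model once per search hit instead of batching all hits into a single prediction call, which is what the per-document GenAI/RAG flow described above requires.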
Is your feature request related to a problem?
Search response processor:
What solution would you like?
What alternatives have you considered?
Do you have any additional context?