Issues: konveyor/kai
Write end-to-end test verifying the codeplan flow works under a few different inputs
#420 opened Oct 11, 2024 by pranavgaikwad
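A minimal sketch of what such an end-to-end test could look like, using pytest parametrization; `run_codeplan` and `CodeplanResult` are hypothetical stand-ins for the real codeplan entry point, which this issue does not name:

```python
import pytest
from dataclasses import dataclass, field

# Hypothetical stand-ins for the actual codeplan interfaces; Kai's real
# API may differ.
@dataclass
class CodeplanResult:
    succeeded: bool
    changed_files: list[str] = field(default_factory=list)

def run_codeplan(scenario: str) -> CodeplanResult:
    # Stub: a real test would drive the full codeplan flow end to end here.
    return CodeplanResult(succeeded=True, changed_files=[f"{scenario}.java"])

@pytest.mark.parametrize(
    "scenario",
    ["single-file-fix", "multi-file-fix", "fix-with-new-dependency"],
)
def test_codeplan_flow(scenario: str) -> None:
    result = run_codeplan(scenario)
    assert result.succeeded
    assert result.changed_files, "expected at least one modified file"
```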
Amazon Bedrock: InvokeModel operation: The provided model doesn't support on-demand throughput - "meta.llama3-2-90b-instruct-v1:0" (label: wontfix)
#411 opened Oct 4, 2024 by jwmatthews
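For context, Bedrock typically raises this error when a model that is only served through an inference profile is invoked by its bare model ID. A hedged boto3 sketch of the usual workaround; the `us.` cross-region profile ID and the region are assumptions about the deployment:

```python
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Calling invoke_model with the bare ID "meta.llama3-2-90b-instruct-v1:0"
# produces the ValidationException quoted in this issue; routing the call
# through an inference profile ID is the usual fix.
response = client.invoke_model(
    modelId="us.meta.llama3-2-90b-instruct-v1:0",  # cross-region profile ID
    body=json.dumps({"prompt": "Hello", "max_gen_len": 64}),
)
print(json.loads(response["body"].read()))
```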
Merging `kai/config.toml` and `build/config.toml` is causing confusion with changing models (label: bug)
#400 opened Sep 27, 2024 by jwmatthews
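A minimal sketch of how a last-writer-wins merge of two TOML files produces exactly this confusion; the `merge` function here is an assumption for illustration, not Kai's actual config-loading logic:

```python
import tomllib  # stdlib on Python 3.11+

def merge(base: dict, override: dict) -> dict:
    """Shallow last-writer-wins merge: keys in `override` silently win."""
    return {**base, **override}

with open("kai/config.toml", "rb") as f:
    kai_cfg = tomllib.load(f)
with open("build/config.toml", "rb") as f:
    build_cfg = tomllib.load(f)

merged = merge(kai_cfg, build_cfg)
# Changing the model in kai/config.toml has no visible effect when
# build/config.toml also sets one: the override wins without warning.
print(merged.get("models"))
```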
podman compose - broken log/trace to disk when using the supplied build/config_example.toml (label: bug)
podman compose up: psycopg2.OperationalError: connection to server at "127.0.0.1", port 5432 failed: Connection refused (label: bug)
#395 opened Sep 27, 2024 by jwmatthews
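This error usually means the application tried to connect before Postgres finished starting, since compose does not wait for a service to be ready. A common workaround sketch; the connection parameters are assumptions matching the error message:

```python
import time
import psycopg2

def wait_for_postgres(dsn: str, retries: int = 30, delay: float = 1.0) -> None:
    """Retry until Postgres accepts connections or retries are exhausted."""
    for _ in range(retries):
        try:
            psycopg2.connect(dsn).close()
            return
        except psycopg2.OperationalError:
            time.sleep(delay)
    raise RuntimeError(f"Postgres not reachable after {retries} attempts")

wait_for_postgres("host=127.0.0.1 port=5432 dbname=kai user=kai password=kai")
```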
[Experiment] Explore solved incident generated via semantic diff instead of llm_summary (label: experiment)
#383 opened Sep 21, 2024 by jwmatthews
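To illustrate the idea, a sketch using Python's `difflib` to represent a solved incident as a textual diff rather than an LLM-written summary; a true semantic diff would need an AST-aware tool, so this is only an approximation:

```python
import difflib

# A javax -> jakarta import change, the kind of fix a solved incident records.
before = ["import javax.inject.Inject;\n"]
after = ["import jakarta.inject.Inject;\n"]

diff = difflib.unified_diff(before, after, fromfile="before", tofile="after")
print("".join(diff))
```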
Solved incident "llm_lazy" - "llm_summary" running against coolstore and llama3 is seeing multiple issues (label: bug)
#382 opened Sep 21, 2024 by jwmatthews
Capture number of tokens in a request and response when possible
#373 opened Sep 17, 2024 by jwmatthews
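One way this could be done, sketched with the `usage_metadata` field that recent langchain-core versions expose on `AIMessage` (providers do not always report it, hence "when possible"); whether Kai would read it this way is an assumption:

```python
from langchain_core.messages import AIMessage

# Constructed by hand here; in practice this is what `llm.invoke(...)` returns
# when the provider reports token usage.
msg = AIMessage(
    content="...model output...",
    usage_metadata={"input_tokens": 120, "output_tokens": 45, "total_tokens": 165},
)

if msg.usage_metadata is not None:
    print(f"request tokens:  {msg.usage_metadata['input_tokens']}")
    print(f"response tokens: {msg.usage_metadata['output_tokens']}")
```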
Consistent: "No codeblocks detected in LLM response" for several files with
#350 opened Sep 4, 2024 by jwmatthews
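A sketch of the kind of fenced-codeblock extraction that yields this message: if the model replies without code fences, the regex finds nothing. The regex is illustrative, not Kai's actual parser:

```python
import re

FENCE = "`" * 3  # written this way to avoid a literal fence inside this example

CODEBLOCK_RE = re.compile(FENCE + r"(?:\w+)?\n(.*?)" + FENCE, re.DOTALL)

def extract_codeblocks(response: str) -> list[str]:
    """Return the bodies of all fenced code blocks in an LLM response."""
    return CODEBLOCK_RE.findall(response)

fenced = f"Here is the fix:\n{FENCE}java\nSystem.out.println();\n{FENCE}"
plain = "Here is the fix: System.out.println();"

print(extract_codeblocks(fenced))  # one extracted block
print(extract_codeblocks(plain))   # [] -> "No codeblocks detected in LLM response"
```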