
Adding ignore_eos_token support in Chat Completions API Schema #3387

Open
jiahong-liu opened this issue Aug 6, 2024 · 1 comment
Assignees
Labels
enhancement New feature or request

Comments

jiahong-liu commented Aug 6, 2024

Description

`ignore_eos_token` is a commonly used additional parameter that helps standardize LLM benchmarks by forcing requests to generate a consistent output sequence length.

- Will this change the current API? How?

It will add `ignore_eos_token` as an additional optional field in the request body.

- Who will benefit from this enhancement?

Anyone who wants to benchmark the server or gain a better understanding of its performance.
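A minimal sketch of what a request body with the proposed field might look like. The field name `ignore_eos_token` comes from this issue; the model name, endpoint shape, and surrounding fields are illustrative assumptions following the OpenAI-style Chat Completions schema, not a confirmed implementation:

```python
import json

# Hypothetical Chat Completions request body with the proposed
# optional field. "my-model" is a placeholder model name.
request_body = {
    "model": "my-model",
    "messages": [
        {"role": "user", "content": "Write a short story."}
    ],
    "max_tokens": 256,
    # Proposed optional field: when true, generation continues past the
    # EOS token until max_tokens is reached, so every benchmark request
    # produces the same output sequence length.
    "ignore_eos_token": True,
}

print(json.dumps(request_body, indent=2))
```

Because the field is optional, existing clients that omit it would be unaffected; the server would simply default to the current behavior of stopping at EOS.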

References

@jiahong-liu jiahong-liu added the enhancement New feature or request label Aug 6, 2024
lanking520 (Contributor) commented:

@sindhuvahinis
