Problem Statement
Hello,
I would like to use Sentry's new LLM monitoring feature, but I am using async calls with different 3rd party APIs.
In my case it is mainly Anthropic and OpenAI integration.
The problem is that the `AnthropicIntegration` and `OpenAIIntegration` don't work, because currently the `sentry-python` library only patches the synchronous functions. For example:
OpenAI patches these:
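Roughly, paraphrasing the sentry-python source (exact wrapper names may differ between versions), only the synchronous resource methods are replaced:

```python
# Sketch of sentry_sdk/integrations/openai.py -- only the sync methods
# are wrapped; the Async* resources are left untouched.
Completions.create = _wrap_chat_completion_create(Completions.create)
Embeddings.create = _wrap_embeddings_create(Embeddings.create)
```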
Anthropic patches these:
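Similarly for Anthropic (again paraphrased; only the sync `Messages.create` is wrapped, `AsyncMessages` is not):

```python
# Sketch of sentry_sdk/integrations/anthropic.py -- AsyncMessages.create
# is never touched, so async calls produce no monitoring data.
Messages.create = _wrap_message_create(Messages.create)
```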
But it should be fairly simple to also patch the async functions.
Solution Brainstorm
Most of the code in the `wrap_message_create` function should be the same. It seems that the only difference is:
Anthropic patches:
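A sketch of what the Anthropic integration could patch in addition (the `_wrap_message_create_async` helper is hypothetical):

```python
# Wrap the async resource alongside the sync one.
Messages.create = _wrap_message_create(Messages.create)
AsyncMessages.create = _wrap_message_create_async(AsyncMessages.create)
```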
Make `_sentry_patched_create` async and await the wrapped function. For example, something like this:
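The idea can be sketched generically; the wrapper name `_wrap_message_create_async` and the stubbed `create` function below are illustrative, not sentry-python's actual code:

```python
import asyncio
import functools


def _wrap_message_create_async(f):
    """Hypothetical async counterpart of the existing sync wrapper.

    The monitoring bookkeeping would stay the same as in the sync
    version; the only real change is that the call is awaited.
    """

    @functools.wraps(f)
    async def _sentry_patched_create(*args, **kwargs):
        # ... start span / record request data, as the sync wrapper does ...
        result = await f(*args, **kwargs)
        # ... record token usage / finish span ...
        return result

    return _sentry_patched_create


# Stub standing in for an async SDK method such as AsyncMessages.create:
async def create(prompt):
    return f"echo: {prompt}"


create = _wrap_message_create_async(create)
print(asyncio.run(create("hi")))  # prints "echo: hi"
```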
OpenAI patches:
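And correspondingly for OpenAI (again a sketch; the `*_async` helper names are hypothetical):

```python
Completions.create = _wrap_chat_completion_create(Completions.create)
AsyncCompletions.create = _wrap_chat_completion_create_async(AsyncCompletions.create)
Embeddings.create = _wrap_embeddings_create(Embeddings.create)
AsyncEmbeddings.create = _wrap_embeddings_create_async(AsyncEmbeddings.create)
```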
etc. etc.
I tested it locally by changing the original code to this and it seems to work okay.
Would you be able to add this feature?