When deploying an MLflow model using MLServer, we can't specify `params` as part of the request.
Here is the error returned by the server:
```
{"error":"mlflow.utils.proto_json_utils.MlflowInvalidInputException: Invalid input. One of \"instances\" and \"inputs\" must be specified (not both or any other keys).Received: ['inputs', 'params']"}
```
Here is how I'm running the server:
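A minimal sketch of such a setup, assuming the standard `mlserver-mlflow` runtime and an MLflow model saved under `./model` (the model name and path here are illustrative):

```json
{
  "name": "my-mlflow-model",
  "implementation": "mlserver_mlflow.MLflowRuntime",
  "parameters": {
    "uri": "./model"
  }
}
```

With this `model-settings.json` in the working directory, the server is started with `mlserver start .`.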
Here is an example of the request:
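For illustration, a payload that triggers this error combines `inputs` with `params`, as the MLflow pyfunc scoring protocol allows (the field values and endpoint below are assumptions, not taken from the original report):

```python
import json

# A request body combining "inputs" with "params", as the MLflow pyfunc
# scoring protocol allows. MLServer's MLflow runtime rejects the extra
# "params" key and returns the MlflowInvalidInputException shown above.
payload = {
    "inputs": {"prompt": ["hello"]},   # illustrative model input
    "params": {"temperature": 0.5},    # illustrative inference parameters
}
body = json.dumps(payload)

# POSTing this body to the server's MLflow-compatible scoring endpoint
# fails, e.g.:
#   curl -X POST http://localhost:8080/invocations \
#        -H "Content-Type: application/json" -d "$BODY"
```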
It seems that MLServer doesn't support parameters as part of the request, which breaks compatibility with MLflow models that require them.