This document describes how to resolve errors that you might encounter when using an application.
Operation schemas list is empty
If your application returns an empty list from .operation_schemas(), it might be caused by one of the following issues:
Failure generating a schema during application creation
Issue:
When you deploy your application, you receive a warning similar to the following:
WARNING:vertexai.reasoning_engines._reasoning_engines:failed to generate schema: issubclass() arg 1 must be a class
Possible cause:
This warning might occur if you deploy an application using the prebuilt LangchainAgent template on a version of google-cloud-aiplatform that's earlier than 1.49.0. To check which version you're using, run the following command in the terminal:
pip show google-cloud-aiplatform
Recommended solution:
Run the following command in your terminal to update your google-cloud-aiplatform package:
pip install google-cloud-aiplatform --upgrade
After you update your google-cloud-aiplatform package, run the following command to verify that its version is 1.49.0 or later:
pip show google-cloud-aiplatform
If you're in a notebook instance (for example, Jupyter, Colab, or Workbench), you might need to restart your runtime to use the updated package. After you've verified that your version of google-cloud-aiplatform is 1.49.0 or later, try to deploy your application again.
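If you prefer to check the version from code (for example, inside a notebook cell) rather than with pip show, a minimal sketch follows. The helper name is_at_least is illustrative and not part of any SDK; it compares only the numeric parts of the version string.

```python
# Sketch: verify the installed google-cloud-aiplatform version is
# 1.49.0 or later. `is_at_least` is a hypothetical helper, not an SDK call.
from importlib import metadata


def is_at_least(version: str, minimum: str = "1.49.0") -> bool:
    """Return True if `version` is `minimum` or later (numeric parts only)."""
    def parse(v: str) -> tuple:
        return tuple(int(part) for part in v.split(".")[:3])
    return parse(version) >= parse(minimum)


try:
    installed = metadata.version("google-cloud-aiplatform")
    print(f"google-cloud-aiplatform {installed}, recent enough: {is_at_least(installed)}")
except metadata.PackageNotFoundError:
    print("google-cloud-aiplatform is not installed")
```

If the version is too old, upgrade the package and restart the runtime before deploying again.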
PermissionDenied error when querying your application
Your query might fail if you don't have the required permissions.
LLM permissions
Issue:
You might receive a PermissionDenied error that's similar to the following:
PermissionDenied: 403 Permission 'aiplatform.endpoints.predict' denied on resource
'//aiplatform.googleapis.com/projects/{PROJECT_ID}/locations/{LOCATION}/publishers/
google/models/{MODEL}' (or it may not exist). [reason: "IAM_PERMISSION_DENIED"
domain: "aiplatform.googleapis.com"
metadata {
key: "permission"
value: "aiplatform.endpoints.predict"
}
metadata {
key: "resource"
value: "projects/{PROJECT_ID}/locations/{LOCATION}/publishers/google/models/{MODEL}"
}
]
Possible cause:
Your service account might not have the required permissions to query your large language model (LLM).
Recommended solution:
Make sure your service account has the proper Identity and Access Management (IAM) permissions listed in the error message. An example of an IAM permission you might be missing is aiplatform.endpoints.predict. See Set up your service agent permissions for more information.
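One way to grant the missing permission is to bind a role that includes aiplatform.endpoints.predict to the service account, for example roles/aiplatform.user. This is a sketch, not the only valid role choice; PROJECT_ID and SERVICE_ACCOUNT_EMAIL are placeholders for your own values.

```shell
# Grant a role that includes aiplatform.endpoints.predict to the
# service account that runs your queries. PROJECT_ID and
# SERVICE_ACCOUNT_EMAIL are placeholders.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
  --role="roles/aiplatform.user"
```

After the binding propagates, retry your query.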
Invalid request
If you run into issues with invalid requests when you query your application, it might be due to one of the issues described in this section.
FailedPrecondition
Issue:
You might receive a FailedPrecondition error that's similar to the following:
FailedPrecondition: 400 Reasoning Engine Execution failed. Error Details:
{"detail":"Invalid request: `{'query': ...}`"}
Possible cause:
This might happen if you call agent.query(query_str) instead of agent.query(input=query_str), that is, if you specify the inputs to the query as positional arguments instead of keyword arguments.
Recommended solution:
When querying an instance of a reasoning engine that has been deployed, specify all inputs as keyword arguments (for example, agent.query(input=query_str)).
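The keyword-only requirement can be sketched locally with a stub. The class below is illustrative and not the real SDK object; it raises a TypeError as a stand-in for the service's FailedPrecondition error when inputs are passed positionally.

```python
# Sketch: a deployed reasoning engine's query method accepts keyword
# arguments only. `DeployedAgentStub` is a hypothetical stand-in for
# the real deployed agent; the real service returns FailedPrecondition.
class DeployedAgentStub:
    def query(self, **kwargs):
        # Only named inputs are accepted; a positional call never
        # reaches this body because **kwargs takes no positionals.
        return {"output": f"echo: {kwargs.get('input', '')}"}


agent = DeployedAgentStub()

# Correct: all inputs as keyword arguments.
print(agent.query(input="What is the exchange rate?"))

# Incorrect: a positional argument fails before the request is built.
try:
    agent.query("What is the exchange rate?")
except TypeError as err:
    print(f"TypeError: {err}")
```

The same pattern applies to any other named input your agent defines: pass each one as key=value rather than positionally.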