Reasoning Engine API

The Reasoning Engine API provides the managed runtime for your customized agentic workflows in generative AI applications. You can create an application using orchestration frameworks such as LangChain, and deploy it with Reasoning Engine. This service has all the security, privacy, observability, and scalability benefits of Vertex AI integration.

For more conceptual information about Reasoning Engine, see Deploy the application.

Limitations

  • The Reasoning Engine API only supports Python orchestration frameworks.
  • The Reasoning Engine API is only supported in the us-central1 region.

Example syntax

Syntax to create and register a reasoning engine resource.

Python

class SimpleAdditionApp:
    def query(self) -> str:
        """
        ...
        """
        return
...

reasoning_engine = reasoning_engines.ReasoningEngine.create(
    SimpleAdditionApp(),
    display_name="",
    description="",
    requirements=[...],
    extra_packages=[...],
)

Parameter list

Parameters
display_name

Required: string

The display name of the ReasoningEngine.

description

Optional: string

The description of the ReasoningEngine.

spec

Required: ReasoningEngineSpec

Configurations of the ReasoningEngine.

package_spec

Required: PackageSpec

A user-provided package specification, such as pickled objects and package requirements.

class_methods

Optional: protobuf.Struct

Declarations for object class methods.

PackageSpec

PackageSpec contains references to the Cloud Storage URIs of the user-provided package, such as the pickled Python object, dependency files, and package requirements. A sketch of how these fields nest follows the parameter list below.

Parameters
pickle_object_gcs_uri

Optional: string

The Cloud Storage URI of the pickled Python object.

dependency_files_gcs_uri

Optional: string

The Cloud Storage URI of the dependency files with the tar.gz extension.

requirements_gcs_uri

Optional: string

The Cloud Storage URI of the requirements.txt file.

python_version

Optional: string

The Python version. Supported versions include Python 3.8, 3.9, 3.10, and 3.11. If not specified, the default value is 3.10.
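
To make the relationships concrete, the following minimal sketch illustrates how the fields described in this reference nest inside a ReasoningEngine resource body, using the field names as listed above. It is not a verbatim request body, and the Cloud Storage object paths are hypothetical placeholders.

# A minimal sketch of a ReasoningEngine resource body, assuming the field
# nesting implied by the parameter lists above. The gs:// object paths are
# hypothetical placeholders, not values from this guide.
reasoning_engine_body = {
    "display_name": "Demo Addition App",
    "description": "A simple demo addition app",
    "spec": {
        "package_spec": {
            "pickle_object_gcs_uri": "gs://YOUR_BUCKET_NAME/reasoning_engine.pkl",
            "dependency_files_gcs_uri": "gs://YOUR_BUCKET_NAME/dependencies.tar.gz",
            "requirements_gcs_uri": "gs://YOUR_BUCKET_NAME/requirements.txt",
            "python_version": "3.10",
        },
        # class_methods (optional, protobuf.Struct) declares the callable
        # methods of the deployed object, such as query.
    },
}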

QueryReasoningEngine

Parameters
input

protobuf.Struct

The arguments inside input should be consistent with the query class method defined in the creation step.
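
For example, for the SimpleAdditionApp used later in this guide, whose query method is defined as def query(self, a: int, b: int), the input payload carries the same keyword arguments. A minimal illustration:

# Illustrative only: the keys inside "input" mirror the keyword arguments
# of the deployed object's query method.
request_body = {
    "input": {
        "a": 1,
        "b": 2,
    }
}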

Examples

Deploy a basic app configuration

The following example creates an application that adds two integers and deploys it as a remote app with Reasoning Engine:

Python

To learn how to install or update the Vertex AI SDK for Python, see Install the Vertex AI SDK for Python. For more information, see the Python API reference documentation.

import vertexai
from vertexai.preview import reasoning_engines

# TODO(developer): Update and un-comment below lines
# project_id = "PROJECT_ID"
# staging_bucket = "gs://YOUR_BUCKET_NAME"

vertexai.init(
    project=project_id, location="us-central1", staging_bucket=staging_bucket
)

class SimpleAdditionApp:
    def query(self, a: int, b: int) -> str:
        """Query the application.

        Args:
            a: The first input number
            b: The second input number

        Returns:
            str: The addition result.
        """

        return f"{int(a)} + {int(b)} is {int(a + b)}"

# Locally test
app = SimpleAdditionApp()
app.query(a=1, b=2)

# Create a remote app with reasoning engine.
# This may take 1-2 minutes to finish.
reasoning_engine = reasoning_engines.ReasoningEngine.create(
    SimpleAdditionApp(),
    display_name="Demo Addition App",
    description="A simple demo addition app",
    requirements=[],
    extra_packages=[],
)
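
After the create call returns, you can query the deployed app remotely with the same keyword arguments used in the local test. A brief sketch (the full query options are covered in Query Reasoning Engine below):

# Query the deployed app. The keyword arguments match the query method
# signature of SimpleAdditionApp.
response = reasoning_engine.query(a=1, b=2)
print(response)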

Deploy an advanced app configuration

This is an advanced example that uses a LangChain chain, prompt templates, and the Gemini API:

Python

To learn how to install or update the Vertex AI SDK for Python, see Install the Vertex AI SDK for Python. For more information, see the Python API reference documentation.


from typing import Dict, List, Union

import vertexai
from vertexai.preview import reasoning_engines

# TODO(developer): Update and un-comment below lines
# project_id = "PROJECT_ID"
# location = "us-central1"
# staging_bucket = "gs://YOUR_BUCKET_NAME"

vertexai.init(project=project_id, location=location, staging_bucket=staging_bucket)

class LangchainApp:
    def __init__(self, project: str, location: str) -> None:
        self.project_id = project
        self.location = location

    def set_up(self) -> None:
        from langchain_core.prompts import ChatPromptTemplate
        from langchain_google_vertexai import ChatVertexAI

        system = (
            "You are a helpful assistant that answers questions "
            "about Google Cloud."
        )
        human = "{text}"
        prompt = ChatPromptTemplate.from_messages(
            [("system", system), ("human", human)]
        )
        chat = ChatVertexAI(project=self.project_id, location=self.location)
        self.chain = prompt | chat

    def query(self, question: str) -> Union[str, List[Union[str, Dict]]]:
        """Query the application.

        Args:
            question: The user prompt.

        Returns:
            str: The LLM response.
        """
        return self.chain.invoke({"text": question}).content

# Locally test
app = LangchainApp(project=project_id, location=location)
app.set_up()
print(app.query("What is Vertex AI?"))

# Create a remote app with reasoning engine
# This may take 1-2 minutes to finish because it builds a container and turns up HTTP servers.
reasoning_engine = reasoning_engines.ReasoningEngine.create(
    LangchainApp(project=project_id, location=location),
    requirements=[
        "google-cloud-aiplatform==1.50.0",
        "langchain-google-vertexai",
        "langchain-core",
    ],
    display_name="Demo LangChain App",
    description="This is a simple LangChain app.",
    # sys_version="3.10",  # Optional
    extra_packages=[],
)
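
As with the basic app, you can then query the deployed LangChain app remotely using the same signature as the local test. A minimal sketch:

# Query the deployed LangChain app. The keyword argument matches the
# query(self, question: str) signature defined above.
response = reasoning_engine.query(question="What is Vertex AI?")
print(response)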

Query Reasoning Engine

Query a reasoning engine.

This example uses the SimpleAdditionApp class from the Deploy a basic app configuration example.

REST

Before using any of the request data, make the following replacements:

  • PROJECT_ID: Your project ID.
  • LOCATION: The region to process the request. Must be us-central1.
  • REASONING_ENGINE_ID: The ID of the reasoning engine.
  • INPUT: protobuf.Struct: The arguments inside input must match the arguments of the def query(self, a: int, b: int) method defined in the Deploy a basic app configuration example.

HTTP method and URL:

POST https://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/reasoningEngines/REASONING_ENGINE_ID:query

Request JSON body:

{
  "input": {
    INPUT
  }
}

To send your request, choose one of these options:

curl

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"http://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/reasoningEngines/REASONING_ENGINE_ID:query"

PowerShell

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "http://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/reasoningEngines/REASONING_ENGINE_ID:query" | Select-Object -Expand Content

Python

To learn how to install or update the Vertex AI SDK for Python, see Install the Vertex AI SDK for Python. For more information, see the Python API reference documentation.

import vertexai
from vertexai.preview import reasoning_engines

# TODO(developer): Update and un-comment below lines
# project_id = "PROJECT_ID"
# reasoning_engine_id = "REASONING_ENGINE_ID"

vertexai.init(project=project_id, location="us-central1")
reasoning_engine = reasoning_engines.ReasoningEngine(reasoning_engine_id)

# Replace with kwargs for `.query()` method.
response = reasoning_engine.query(a=1, b=2)
print(response)

List Reasoning Engines

List reasoning engines in a project.

REST

Before using any of the request data, make the following replacements:

  • PROJECT_ID: Your project ID.
  • LOCATION: The region to process the request. Must be us-central1.

HTTP method and URL:

GET https://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/reasoningEngines

To send your request, choose one of these options:

curl

Execute the following command:

curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
"http://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/reasoningEngines"

PowerShell

Execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method GET `
-Headers $headers `
-Uri "http://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/reasoningEngines" | Select-Object -Expand Content

Python

To learn how to install or update the Vertex AI SDK for Python, see Install the Vertex AI SDK for Python. For more information, see the Python API reference documentation.

import vertexai
from vertexai.preview import reasoning_engines

# TODO(developer): Update and un-comment below lines
# project_id = "PROJECT_ID"

vertexai.init(project=project_id, location="us-central1")

reasoning_engine_list = reasoning_engines.ReasoningEngine.list()
print(reasoning_engine_list)

Get Reasoning Engine

Get details of a reasoning engine.

REST

Before using any of the request data, make the following replacements:

  • PROJECT_ID: Your project ID.
  • LOCATION: The region to process the request. Must be us-central1.
  • REASONING_ENGINE_ID: The ID of the reasoning engine.

HTTP method and URL:

GET https://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/reasoningEngines/REASONING_ENGINE_ID

To send your request, choose one of these options:

curl

Execute the following command:

curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
"http://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/reasoningEngines/REASONING_ENGINE_ID"

PowerShell

Execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method GET `
-Headers $headers `
-Uri "http://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/reasoningEngines/REASONING_ENGINE_ID" | Select-Object -Expand Content

Python

To learn how to install or update the Vertex AI SDK for Python, see Install the Vertex AI SDK for Python. For more information, see the Python API reference documentation.

import vertexai
from vertexai.preview import reasoning_engines

# TODO(developer): Update and un-comment below lines
# project_id = "PROJECT_ID"
# reasoning_engine_id = "REASONING_ENGINE_ID"

vertexai.init(project=project_id, location="us-central1")

reasoning_engine = reasoning_engines.ReasoningEngine(reasoning_engine_id)
print(reasoning_engine)

Delete Reasoning Engine

Delete a reasoning engine.

REST

Before using any of the request data, make the following replacements:

  • PROJECT_ID: Your project ID.
  • LOCATION: The region to process the request. Must be us-central1.
  • REASONING_ENGINE_ID: The ID of the reasoning engine.

HTTP method and URL:

DELETE https://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/reasoningEngines/REASONING_ENGINE_ID

To send your request, choose one of these options:

curl

Execute the following command:

curl -X DELETE \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
"http://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/reasoningEngines/REASONING_ENGINE_ID"

PowerShell

Execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method DELETE `
-Headers $headers `
-Uri "http://LOCATION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/reasoningEngines/REASONING_ENGINE_ID" | Select-Object -Expand Content

Python

To learn how to install or update the Vertex AI SDK for Python, see Install the Vertex AI SDK for Python. For more information, see the Python API reference documentation.

import vertexai
from vertexai.preview import reasoning_engines

# TODO(developer): Update and un-comment below lines
# project_id = "PROJECT_ID"
# reasoning_engine_id = "REASONING_ENGINE_ID"

vertexai.init(project=project_id, location="us-central1")

reasoning_engine = reasoning_engines.ReasoningEngine(reasoning_engine_id)
reasoning_engine.delete()
