Generate text responses using the Gemini API with external function calls in a chat scenario

Generate text responses using the Gemini API with external function calls. This example demonstrates a chat scenario with two functions and two sequential prompts.

Code sample

Python

Before trying this sample, follow the Python setup instructions in the Vertex AI quickstart using client libraries. For more information, see the Vertex AI Python API reference documentation.

To authenticate to Vertex AI, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
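
Before running the sample, you can optionally confirm that Application Default Credentials resolve in your environment. The snippet below is a minimal check, assuming the google-auth library (installed alongside the Vertex AI SDK for Python) is available:

import google.auth

# Resolve Application Default Credentials using the standard lookup order
# (GOOGLE_APPLICATION_CREDENTIALS, gcloud ADC file, attached service account).
# Raises DefaultCredentialsError if no credentials are found.
credentials, detected_project_id = google.auth.default()
print(f"Application Default Credentials found for project: {detected_project_id}")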

import vertexai
from vertexai.generative_models import (
    FunctionDeclaration,
    GenerativeModel,
    Part,
    Tool,
)

def generate_function_call_chat(project_id: str, location: str) -> tuple:
    prompts = []
    summaries = []

    # Initialize Vertex AI
    vertexai.init(project=project_id, location=location)

    # Specify a function declaration and parameters for an API request
    get_product_info_func = FunctionDeclaration(
        name="get_product_sku",
        description="Get the SKU for a product",
        # Function parameters are specified in OpenAPI JSON schema format
        parameters={
            "type": "object",
            "properties": {
                "product_name": {"type": "string", "description": "Product name"}
            },
        },
    )

    # Specify another function declaration and parameters for an API request
    get_store_location_func = FunctionDeclaration(
        name="get_store_location",
        description="Get the location of the closest store",
        # Function parameters are specified in OpenAPI JSON schema format
        parameters={
            "type": "object",
            "properties": {"location": {"type": "string", "description": "Location"}},
        },
    )

    # Define a tool that includes the above functions
    retail_tool = Tool(
        function_declarations=[
            get_product_info_func,
            get_store_location_func,
        ],
    )

    # Initialize Gemini model
    model = GenerativeModel(
        "gemini-1.0-pro", generation_config={"temperature": 0}, tools=[retail_tool]
    )

    # Start a chat session
    chat = model.start_chat()

    # Send a prompt for the first conversation turn that should invoke the get_product_sku function
    prompt = "Do you have the Pixel 8 Pro in stock?"
    response = chat.send_message(prompt)
    prompts.append(prompt)

    # Check the function name that the model responded with, and make an API call to an external system
    if response.candidates[0].content.parts[0].function_call.name == "get_product_sku":
        # Extract the arguments to use in your API call
        product_name = (
            response.candidates[0].content.parts[0].function_call.args["product_name"]
        )

        # Here you can use your preferred method to make an API request to retrieve the product SKU, as in:
        # api_response = requests.post(product_api_url, data={"product_name": product_name})

        # In this example, we'll use synthetic data to simulate a response payload from an external API
        api_response = {"sku": "GA04834-US", "in_stock": "yes"}

    # Return the API response to Gemini so it can generate a model response or request another function call
    response = chat.send_message(
        Part.from_function_response(
            name="get_product_sku",
            response={
                "content": api_response,
            },
        ),
    )

    # Extract the text from the summary response
    summary = response.candidates[0].content.parts[0].text
    summaries.append(summary)

    # Send a prompt for the second conversation turn that should invoke the get_store_location function
    prompt = "Is there a store in Mountain View, CA that I can visit to try it out?"
    response = chat.send_message(prompt)
    prompts.append(prompt)

    # Check the function name that the model responded with, and make an API call to an external system
    if (
        response.candidates[0].content.parts[0].function_call.name
        == "get_store_location"
    ):
        # Extract the arguments to use in your API call
        location = (
            response.candidates[0].content.parts[0].function_call.args["location"]
        )

        # Here you can use your preferred method to make an API request to retrieve store location closest to the user, as in:
        # api_response = requests.post(store_api_url, data={"location": location})

        # In this example, we'll use synthetic data to simulate a response payload from an external API
        api_response = {"store": "2000 N Shoreline Blvd, Mountain View, CA 94043, US"}

    # Return the API response to Gemini so it can generate a model response or request another function call
    response = chat.send_message(
        Part.from_function_response(
            name="get_store_location",
            response={
                "content": api_response,
            },
        ),
    )

    # Extract the text from the summary response
    summary = response.candidates[0].content.parts[0].text
    summaries.append(summary)

    return prompts, summaries
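
To run the sample end to end, call the function with your own project ID and a supported region. The values below are placeholders, not part of the original sample:

prompts, summaries = generate_function_call_chat(
    project_id="your-project-id",  # placeholder: replace with your project ID
    location="us-central1",        # placeholder: any supported Vertex AI region
)

# Print each user prompt alongside the model's summarized answer
for prompt, summary in zip(prompts, summaries):
    print(f"Prompt: {prompt}")
    print(f"Summary: {summary}")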

What's next

To search and filter code samples for other Google Cloud products, see the Google Cloud sample browser.