Agent Engine is a managed service that helps you build and deploy agent frameworks. LangGraph is a library for building stateful, multi-actor applications with LLMs, used to create agent and multi-agent workflows.

This notebook demonstrates how to build, deploy, and test a simple LangGraph application using Agent Engine in Vertex AI. You'll learn how to combine LangGraph's workflow orchestration with the scalability of Vertex AI, which enables you to build custom generative AI applications.

Note that the approach used in this notebook defines a custom application template in Agent Engine, which can be extended to LangChain or other orchestration frameworks. If you just want to use Agent Engine to build agentic generative AI applications, refer to the documentation for developing with the LangChain template in Agent Engine.

This notebook covers the following steps:
Define Tools: Create custom Python functions to act as tools your AI application can use.
Define Router: Set up routing logic to control conversation flow and tool selection.
Build a LangGraph Application: Structure your application using LangGraph, including the Gemini model and custom tools that you define.
Local Testing: Test your LangGraph application locally to ensure functionality.
Deploying to Vertex AI: Seamlessly deploy your LangGraph application to Agent Engine for scalable execution.
Remote Testing: Interact with your deployed application through Vertex AI, testing its functionality in a production-like environment.
Cleaning Up Resources: Delete your deployed application on Vertex AI to avoid incurring unnecessary charges.
By the end of this notebook, you’ll have the skills and knowledge to build and deploy your own custom generative AI applications using LangGraph, Agent Engine, and Vertex AI.
To use the newly installed packages in this Jupyter runtime, you must restart the runtime. You can do this by running the cell below, which restarts the current kernel. The restart might take a minute or longer. After it's restarted, continue to the next step.
Import the necessary Python libraries. These libraries provide the tools you need to interact with LangGraph, Vertex AI, and other components of your application.
You'll start by defining a tool for your LangGraph application: a custom Python function that acts as a tool in your agentic application. In this case, you'll define a simple tool that returns a product description based on the product that the user asks about. In practice, you can write functions that call APIs, query databases, or perform any other tasks that you might want your agent to be able to use.
def get_product_details(product_name: str):
    """Gathers basic details about a product."""
    details = {
        "smartphone": "A cutting-edge smartphone with advanced camera features and lightning-fast processing.",
        "coffee": "A rich, aromatic blend of ethically sourced coffee beans.",
        "shoes": "High-performance running shoes designed for comfort, support, and speed.",
        "headphones": "Wireless headphones with advanced noise cancellation technology for immersive audio.",
        "speaker": "A voice-controlled smart speaker that plays music, sets alarms, and controls smart home devices.",
    }
    return details.get(product_name, "Product details not found.")
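Since this tool is plain Python, you can sanity-check it directly before wiring it into the graph. A minimal self-contained sketch (the function is repeated here so the snippet runs on its own):

```python
def get_product_details(product_name: str):
    """Gathers basic details about a product."""
    details = {
        "smartphone": "A cutting-edge smartphone with advanced camera features and lightning-fast processing.",
        "coffee": "A rich, aromatic blend of ethically sourced coffee beans.",
        "shoes": "High-performance running shoes designed for comfort, support, and speed.",
        "headphones": "Wireless headphones with advanced noise cancellation technology for immersive audio.",
        "speaker": "A voice-controlled smart speaker that plays music, sets alarms, and controls smart home devices.",
    }
    return details.get(product_name, "Product details not found.")

# A known product returns its description.
print(get_product_details("coffee"))

# An unknown product falls back to the default message.
print(get_product_details("spaceship"))  # Product details not found.
```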
Then, you’ll define a router to control the flow of the conversation, determining which tool to use based on user input or the state of the interaction. Here we’ll use a simple router setup, and you can customize the behavior of your router to handle multiple tools, custom logic, or multi-agent workflows.
def router(state: list[BaseMessage]) -> Literal["get_product_details", "__end__"]:
    """Initiates product details retrieval if the user asks for a product."""
    # Get the tool_calls from the last message in the conversation history.
    tool_calls = state[-1].tool_calls

    # If there are any tool_calls
    if len(tool_calls):
        # Return the name of the tool to be called
        return "get_product_details"
    else:
        # End the conversation flow.
        return "__end__"
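Because the router only inspects the last message's `tool_calls`, you can exercise it in isolation without calling a model. In this sketch, `FakeMessage` is a hypothetical stand-in for a LangChain `AIMessage` with a `tool_calls` attribute, and the router body is a simplified copy of the one above:

```python
from dataclasses import dataclass, field
from typing import Literal


@dataclass
class FakeMessage:
    """Hypothetical stand-in for an AIMessage carrying tool_calls."""
    tool_calls: list = field(default_factory=list)


def router(state) -> Literal["get_product_details", "__end__"]:
    # Route to the tool node if the last message requested a tool call.
    tool_calls = state[-1].tool_calls
    return "get_product_details" if tool_calls else "__end__"


# A message with a pending tool call routes to the tool node.
print(router([FakeMessage(tool_calls=[{"name": "get_product_details"}])]))  # get_product_details

# A message with no tool calls ends the conversation.
print(router([FakeMessage()]))  # __end__
```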
Now you'll bring everything together to define your LangGraph application as a custom template in Agent Engine. This application will use the tool and router that you just defined. LangGraph provides a powerful way to structure these interactions and leverage the capabilities of LLMs.
class SimpleLangGraphApp:
    def __init__(self, project: str, location: str) -> None:
        self.project_id = project
        self.location = location

    # The set_up method is used to define application initialization logic
    def set_up(self) -> None:
        # Arize instrumentation begin
        from arize.otel import register

        tracer_provider = register(
            space_id="<INSERT YOUR SPACE KEY>",  # in app space settings page
            api_key="<INSERT YOUR ARIZE API KEY>",  # in app space settings page
            project_name="agent-framework-langgraph",  # name this whatever you would like
        )

        from openinference.instrumentation.langchain import LangChainInstrumentor

        LangChainInstrumentor().instrument(tracer_provider=tracer_provider)
        # Arize instrumentation end

        model = ChatVertexAI(model="gemini-2.0-flash")
        builder = MessageGraph()

        model_with_tools = model.bind_tools([get_product_details])
        builder.add_node("tools", model_with_tools)

        tool_node = ToolNode([get_product_details])
        builder.add_node("get_product_details", tool_node)
        builder.add_edge("get_product_details", END)

        builder.set_entry_point("tools")
        builder.add_conditional_edges("tools", router)

        self.runnable = builder.compile()

    # The query method will be used to send inputs to the agent
    def query(self, message: str):
        """Query the application.

        Args:
            message: The user message.

        Returns:
            str: The LLM response.
        """
        chat_history = self.runnable.invoke(HumanMessage(message))

        return chat_history[-1].content
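Before deploying, you can exercise the application locally by instantiating the class, calling `set_up()` yourself, and sending a query. A minimal sketch, assuming `PROJECT_ID` and `LOCATION` are already defined earlier in the notebook and your environment is authenticated to Google Cloud (this cell calls the live Gemini API, so it won't run offline):

```python
# Instantiate the app and run its initialization logic manually.
# When deployed, Agent Engine calls set_up() for you.
agent = SimpleLangGraphApp(project=PROJECT_ID, location=LOCATION)
agent.set_up()

# Ask about a product covered by the get_product_details tool.
response = agent.query(message="Get product details for shoes")
print(response)
```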
Now that you've verified that your LangGraph application is working locally, it's time to deploy it to Agent Engine! This will make your application accessible remotely and allow you to integrate it into larger systems or provide it as a service.
remote_agent = agent_engines.create(
    SimpleLangGraphApp(project=PROJECT_ID, location=LOCATION),
    display_name="Agent Engine with LangGraph",
    description="This is a sample custom application in Agent Engine that uses LangGraph",
    extra_packages=[],
)
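Once the deployment finishes, you can query the remote instance the same way you queried the local one, using the `remote_agent` object returned by the `create` call above. A sketch, assuming the deployment succeeded (this runs against the live deployed service):

```python
# Send a message to the deployed application; Agent Engine routes it
# to the query() method you defined on SimpleLangGraphApp.
response = remote_agent.query(message="Get product details for coffee")
print(response)
```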
After you’ve finished experimenting, it’s a good practice to clean up your cloud resources. You can delete the deployed Agent Engine instance to avoid any unexpected charges on your Google Cloud account.
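A sketch of the cleanup step, assuming `remote_agent` is the instance returned by `agent_engines.create` above (this permanently deletes the live deployed resource):

```python
# Delete the deployed Agent Engine instance; force=True also removes
# any child resources associated with it.
remote_agent.delete(force=True)
```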