The AI Travel Agents is a robust enterprise application (hosted on Azure Container Apps) that leverages MCP and multiple LlamaIndex.TS AI agents to enhance travel agency operations.
:star: To stay updated and get notified about changes, star this repo on GitHub!
Overview • Architecture • Features • Preview locally FOR FREE • Cost estimation • Join the Community
The AI Travel Agents is a robust enterprise application that leverages multiple AI agents to enhance travel agency operations. The application demonstrates how LlamaIndex.TS orchestrates multiple AI agents to assist employees in handling customer queries, providing destination recommendations, and planning itineraries. Multiple MCP (Model Context Protocol) servers, built with Python, Node.js, Java and .NET, are used to provide various tools and services to the agents, enabling them to work together seamlessly.
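As a rough illustration of the handoff pattern (not the repository's actual code), a triage step maps a customer query to the agent best suited to handle it. In the real application an LLM-driven LlamaIndex.TS workflow makes this decision; in the hypothetical sketch below, simple keyword checks stand in for it:

```typescript
// Toy router sketch: agent names mirror the agents listed in this README,
// but the keyword-based routing here is purely illustrative -- the sample
// uses LlamaIndex.TS orchestration with an LLM deciding the handoff.
type AgentName =
  | "customer-query-understanding"
  | "destination-recommendation"
  | "itinerary-planning";

function routeQuery(query: string): AgentName {
  const q = query.toLowerCase();
  if (/itinerary|schedule|day-by-day|plan my trip/.test(q)) {
    return "itinerary-planning";
  }
  if (/where should|recommend|destination|suggest/.test(q)) {
    return "destination-recommendation";
  }
  // Default: extract preferences first, then hand off downstream.
  return "customer-query-understanding";
}

console.log(routeQuery("Can you recommend a warm destination in March?"));
// → "destination-recommendation"
```

The design point this sketch captures is that each agent has a narrow responsibility, and a triage layer decides which one acts next.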
| Agent Name | Purpose |
|---|---|
| Customer Query Understanding | Extracts key preferences from customer inquiries. |
| Destination Recommendation | Suggests destinations based on customer preferences. |
| Itinerary Planning | Creates a detailed itinerary and travel plan. |
| Code Evaluation | Executes custom logic and scripts when needed. |
| Model Inference | Runs a custom LLM using ONNX and vLLM on Azure Container Apps' serverless GPUs for high-performance inference. |
| Web Search | Uses Grounding with Bing Search to fetch live travel data. |
| Echo Ping | Echoes back any received input (used as an MCP server example). |
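The Echo Ping server exists mainly to demonstrate the MCP request/response shape. As a self-contained sketch using plain objects (the sample's actual servers are built on the official MCP SDKs for Python, Node.js, Java and .NET), an echo tool handler could look like this:

```typescript
// Minimal sketch of an MCP-style "echo" tool call, assuming the JSON-RPC
// message shape MCP uses. The interfaces below are simplified stand-ins,
// not the real SDK types.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: { message: string } };
}

interface ToolCallResponse {
  jsonrpc: "2.0";
  id: number;
  result: { content: { type: "text"; text: string }[] };
}

function handleToolCall(req: ToolCallRequest): ToolCallResponse {
  if (req.params.name !== "echo") {
    throw new Error(`Unknown tool: ${req.params.name}`);
  }
  // Echo the received input back, wrapped in MCP text content.
  return {
    jsonrpc: "2.0",
    id: req.id,
    result: {
      content: [{ type: "text", text: req.params.arguments.message }],
    },
  };
}

const res = handleToolCall({
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: { name: "echo", arguments: { message: "ping" } },
});
console.log(res.result.content[0].text); // → "ping"
```

Because every tool, regardless of which language its server is written in, answers in this same content shape, the orchestrator can treat all seven agents' tools uniformly.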
The architecture of the AI Travel Agents application is designed to be modular and scalable:
> [!NOTE]
> New to the Model Context Protocol (MCP)? Check out our free MCP for Beginners guide.
To run and preview the application locally, we will use Docker Model Runner.
> [!IMPORTANT]
> The Phi-4 14B model requires significant resources (at least 16GB RAM and a modern CPU or GPU) to run efficiently. GPU acceleration is only supported on Apple Silicon on macOS and on NVIDIA GPUs on Windows. If your machine does not have enough resources to run LLMs locally, you can still run the application using Azure AI Foundry. Please refer to the Advanced Setup documentation for more information.
In order to run the application locally, ensure you have the following installed before running the preview script:

- **PowerShell 7+** (Windows): verify that you can run `pwsh.exe` from a PowerShell terminal. If this fails, you likely need to upgrade PowerShell.

Then run the preview script for your platform. On Linux/macOS:

```bash
bash <(curl -fsSL https://aka.ms/azure-ai-travel-agents-preview)
```

On Windows (PowerShell):

```powershell
iex "& { $(irm https://aka.ms/azure-ai-travel-agents-preview-win) }"
```

The script will set up and start the application locally.
When provisioning resources in Azure, it's important to consider the costs associated with running the application. The AI Travel Agents sample uses several Azure services, and the costs can vary based on your usage and configuration.
Pricing varies per region and usage, so it isn't possible to predict exact costs for your usage. However, you can use the Azure pricing calculator for the resources below to get an estimate.
> [!IMPORTANT]
> To avoid unnecessary costs, remember to take down your app if it's no longer in use, either by deleting the resource group in the Portal or running `azd down --purge` (see Clean up).
1. Authenticate with Azure:

   ```bash
   azd auth login
   ```

2. Deploy the application to Azure:

   ```bash
   azd up
   ```

   This will provision Azure resources, deploy this sample with all the containers, and set up the necessary configurations. By default, resources are provisioned in the `swedencentral` location. You can set a different location with `azd env set AZURE_LOCATION <location>`. Currently only a short list of locations is accepted; that list is based on the OpenAI model availability table and may become outdated as availability changes.

The deployment process will take a few minutes. Once it's done, you'll see the URL of the web app in the terminal. You can now open the web app in your browser and start chatting with the bot.
To clean up all the Azure resources created by this sample:
```bash
azd down --purge
```

When asked if you are sure you want to continue, enter `y`. The resource group and all the resources will be deleted.
To run the application in a more advanced local setup or deploy to Azure, please refer to the troubleshooting guide in the Advanced Setup documentation. This includes setting up the Azure Container Apps environment, using local LLM providers, configuring the services, and deploying the application to Azure.
We welcome contributions to the AI Travel Agents project! If you have suggestions, bug fixes, or new features, please feel free to submit a pull request. For more information on contributing, please refer to the CONTRIBUTING.md file.
We encourage you to join our Azure AI Foundry Developer Community to share your experiences, ask questions, and get support: