Server for context-aware translations with language detection support
A Model Context Protocol (MCP) server for the Lara Translate API, enabling powerful translation capabilities with support for language detection, context-aware translations, and translation memories.
Model Context Protocol (MCP) is an open standardized communication protocol that enables AI applications to connect with external tools, data sources, and services. Think of MCP like a USB-C port for AI applications - just as USB-C provides a standardized way to connect devices to various peripherals, MCP provides a standardized way to connect AI models to different data sources and tools.
Lara Translate MCP Server enables AI applications to access Lara Translate's powerful translation capabilities through this standardized protocol.
More info about the Model Context Protocol is available at: https://modelcontextprotocol.io/
Lara Translate MCP Server implements the Model Context Protocol to provide seamless translation capabilities to AI applications.
This integration architecture allows AI applications to access professional-grade translations without implementing the API directly, while maintaining the security of your API credentials and offering flexibility to adjust translation parameters through natural language instructions.
Integrating Lara with LLMs creates a powerful synergy that significantly enhances translation quality for non-English languages.
While large language models possess broad linguistic capabilities, they often lack the specialized expertise and up-to-date terminology required for accurate translations in specific domains and languages.
Lara overcomes this limitation by leveraging Translation Language Models (T-LMs) trained on billions of professionally translated segments. These models provide domain-specific machine translation that captures cultural nuances and industry terminology that generic LLMs may miss. The result: translations that are contextually accurate and sound natural to native speakers.
Lara has a strong focus on non-English languages, addressing the performance gap found in models such as GPT-4. The dominance of English in datasets such as Common Crawl and Wikipedia results in lower quality output in other languages. Lara helps close this gap by providing higher quality understanding, generation, and restructuring in a multilingual context.
By offloading complex translation tasks to specialized T-LMs, Lara reduces computational overhead and minimizes latency, a common issue for LLMs handling non-English input. Its architecture processes translations in parallel with the LLM, enabling real-time, high-quality output without compromising speed or efficiency.
Lara also lowers the cost of using models like GPT-4 in non-English workflows. Since tokenization (and pricing) is optimized for English, using Lara allows translation to take place before hitting the LLM, meaning that only the translated English content is processed. This improves cost efficiency and supports competitive scalability for global enterprises.
Translates text between languages, with support for language detection and context-aware translation.
Inputs:
- text (array): An array of text blocks to translate, each with:
  - text (string): The text content
  - translatable (boolean): Whether this block should be translated
- source (optional string): Source language code (e.g., 'en-EN')
- target (string): Target language code (e.g., 'it-IT')
- context (optional string): Additional context to improve translation quality
- instructions (optional string[]): Instructions to adjust translation behavior
- source_hint (optional string): Guidance for language detection
Returns: Translated text blocks maintaining the original structure
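For illustration, here is a minimal sketch of how an MCP client might call this translation tool through the protocol's standard tools/call request. The tool name "translate" and all values are assumptions made for the example; the actual tool name exposed by the server may differ, and in practice your MCP client builds this request for you.

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "translate",
    "arguments": {
      "text": [
        { "text": "Hello world", "translatable": true }
      ],
      "target": "it-IT",
      "context": "Greeting shown on a product landing page",
      "instructions": ["Use an informal tone"]
    }
  }
}

Omitting source lets the server detect the source language; source_hint can be supplied to guide that detection.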
Lists saved translation memories.
Returns: Array of memories and their details
Creates a new translation memory.
Inputs:
- name (string): Name of the new memory
- external_id (optional string): ID of the memory to import from MyMemory (e.g., 'ext_my_[MyMemory ID]')
Returns: Created memory data
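As an illustrative example, the arguments for creating a memory could be as simple as the object below (the name is made up); external_id would only be added when importing an existing MyMemory memory (e.g., 'ext_my_[MyMemory ID]').

{
  "name": "Product catalogue IT"
}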
Updates an existing translation memory.
Inputs:
- id (string): ID of the memory to update
- name (string): The new name for the memory
Returns: Updated memory data
Deletes a translation memory.
Inputs:
- id (string): ID of the memory to delete
Returns: Deleted memory data
Adds a translation unit to one or more memories.
Inputs:
- id (string | string[]): ID or IDs of memories where to add the translation unit
- source (string): Source language code
- target (string): Target language code
- sentence (string): The source sentence
- translation (string): The translated sentence
- tuid (optional string): Translation Unit unique identifier
- sentence_before (optional string): Context sentence before
- sentence_after (optional string): Context sentence after
Returns: Added translation details
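A hypothetical set of arguments for storing a single translation unit in a memory (the memory ID, language codes, and sentences below are illustrative):

{
  "id": "mem_xyz123",
  "source": "en-US",
  "target": "it-IT",
  "sentence": "Add to cart",
  "translation": "Aggiungi al carrello",
  "sentence_before": "Browse our catalogue.",
  "sentence_after": "Proceed to checkout."
}

The optional sentence_before and sentence_after fields record the surrounding context in which the sentence pair was used.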
Removes a translation unit from a memory.
Inputs:
- id (string): ID of the memory
- source (string): Source language code
- target (string): Target language code
- sentence (string): The source sentence
- translation (string): The translated sentence
- tuid (optional string): Translation Unit unique identifier
- sentence_before (optional string): Context sentence before
- sentence_after (optional string): Context sentence after
Returns: Removed translation details
Imports a TMX file into a memory.
Inputs:
- id (string): ID of the memory to update
- tmx (file path): The path of the TMX file to upload
- gzip (boolean): Indicates if the file is compressed (.gz)
Returns: Import details
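A hypothetical arguments object for importing a gzip-compressed TMX file (the memory ID and file path are illustrative):

{
  "id": "mem_xyz123",
  "tmx": "/path/to/memory.tmx.gz",
  "gzip": true
}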
Checks the status of a TMX import job.
Inputs:
- id (string): The ID of the import job
Returns: Import details
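Assuming the import details returned by the TMX import include a job ID, that ID can be passed back to poll the import, for example:

{
  "id": "import_job_123"
}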
The installation process is standardized across all MCP clients. It involves manually adding a configuration object to your client's MCP configuration JSON file.
If you're unsure how to configure an MCP with your client, please refer to your MCP client's official documentation.
Lara Translate MCP supports multiple installation methods, including NPX and Docker.
Below, we'll use NPX as an example.
Step 1: Open your client's MCP configuration JSON file with a text editor, then copy and paste the following snippet:
{
  "mcpServers": {
    "lara-translate": {
      "command": "npx",
      "args": [
        "-y",
        "@translated/lara-mcp@latest"
      ],
      "env": {
        "LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
        "LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
      }
    }
  }
}
Step 2: Replace <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your Lara Translate API credentials (refer to the Official Documentation for details).
Step 3: Restart your MCP client.
After restarting your MCP client, you should see Lara Translate MCP in the list of available MCPs.
The method for viewing installed MCPs varies by client. Please consult your MCP client's documentation.
To verify that Lara Translate MCP is working correctly, try translating with a simple prompt:
Translate with Lara "Hello world" to Spanish
Your MCP client will begin generating a response. If Lara Translate MCP is properly installed and configured, your client will either request approval for the action or display a notification that Lara Translate is being used.
This option requires Node.js to be installed on your system.
{
  "mcpServers": {
    "lara-translate": {
      "command": "npx",
      "args": ["-y", "@translated/lara-mcp@latest"],
      "env": {
        "LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
        "LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
      }
    }
  }
}
Replace <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your actual Lara API credentials.

This option requires Docker to be installed on your system.
{
  "mcpServers": {
    "lara-translate": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "LARA_ACCESS_KEY_ID",
        "-e",
        "LARA_ACCESS_KEY_SECRET",
        "translatednet/lara-mcp:latest"
      ],
      "env": {
        "LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
        "LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
      }
    }
  }
}
Replace <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your actual Lara API credentials.

To build from source, clone the repository:

git clone https://github.com/translated/lara-mcp.git
cd lara-mcp
# Install dependencies
pnpm install
# Build
pnpm run build
Then add the following to your MCP configuration, pointing to the built output:

{
  "mcpServers": {
    "lara-translate": {
      "command": "node",
      "args": ["<FULL_PATH_TO_PROJECT_FOLDER>/dist/index.js"],
      "env": {
        "LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
        "LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
      }
    }
  }
}
Replace <FULL_PATH_TO_PROJECT_FOLDER> with the absolute path to your project folder, and <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your actual Lara API credentials.

To build a local Docker image instead, clone the repository and build it:

git clone https://github.com/translated/lara-mcp.git
cd lara-mcp
docker build -t lara-mcp .
Then add the following to your MCP configuration, using the locally built image:

{
  "mcpServers": {
    "lara-translate": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "LARA_ACCESS_KEY_ID",
        "-e",
        "LARA_ACCESS_KEY_SECRET",
        "lara-mcp"
      ],
      "env": {
        "LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
        "LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
      }
    }
  }
}
Replace <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your actual credentials.

For a complete list of MCP clients and their feature support, visit the official MCP clients page.
| Client | Description |
|---|---|
| Claude Desktop | Desktop application for Claude AI |
| Aixplain | Production-ready AI Agents |
| Cursor | AI-first code editor |
| Cline for VS Code | VS Code extension for AI assistance |
| GitHub Copilot MCP | VS Code extension for GitHub Copilot MCP integration |
| Windsurf | AI-powered code editor and development environment |
{ "mcpServers": { "lara-translate": { "command": "npx", "args": [ "-y", "@translated/lara-mcp@latest" ], "env": { "LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>", "LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>" } } } }