TikTok frontend engineers Qun Lin and Pete Gonzalez introduce Rush MCP Server, an open-source project their team developed to improve the effectiveness of AI coding tools when managing a large codebase.
Our build engineering team oversees large monorepos used to develop TypeScript projects at TikTok. As AI-assisted programming tools like Trae, Cursor, and Cline have gained popularity among engineers, we've seen real productivity gains. However, the quality of results from these tools depends heavily on how well they understand your specific projects and workspace. The scale of TikTok's codebase makes it impractical for AI to analyze every file. Also, AI cannot easily navigate project relationships without understanding your build tools, of which the frontend ecosystem has many.
The Model Context Protocol (MCP) can help with this problem. It's an open standard for implementing specialized services that an AI assistant can consult while performing its tasks. Our monorepos run on RushJS, a popular build orchestrator for TypeScript. Until recently, Rush did not provide its own MCP server. So we decided to build one!
What does an MCP interaction look like?
Below are some transcripts showing real interactions with the Cursor AI assistant. Wherever you see "Called MCP tool," Cursor is communicating with our Rush MCP server.
- "Does Rush have a command called install-autoinstaller?"
- "What's the difference between the `rush update` and `rush install` commands?"
- "Which subspace does rush-lib-test belong to?"
- "Please add the React dependency to the rush-lib-test project."
- "How does rush-lib-test indirectly depend on @rushstack/node-core-library? Please illustrate it as a dependency tree."
How does MCP work?
The Model Context Protocol enables your coding environment to communicate with one or more MCP servers, which may be hosted in a cloud environment or on the local machine.
Our Rush MCP server is a local process that gets launched on your own computer, so that it can access the source files in your monorepo and even invoke Rush commands. Behind the scenes, the AI agent will start the Rush MCP server process, communicate with it using STDIO, and then terminate the process when the session is over.
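Under the hood, that STDIO channel carries newline-delimited JSON-RPC 2.0 messages as defined by the MCP specification. A session opens with an `initialize` handshake before any tools are invoked; the sketch below shows the rough shape of that exchange (field values here are illustrative, not a captured transcript):

```json
{"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {"protocolVersion": "2024-11-05", "capabilities": {}, "clientInfo": {"name": "example-agent", "version": "1.0.0"}}}
{"jsonrpc": "2.0", "id": 1, "result": {"protocolVersion": "2024-11-05", "capabilities": {"tools": {}}, "serverInfo": {"name": "rush-mcp-server", "version": "0.2.2"}}}
```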
MCP tools
In the protocol, MCP tools are specialized actions that the server offers to the AI agent, like a menu of actions that it can choose to perform. The Rush MCP server includes the following built-in tools, some of which were invoked in the transcripts above:
- `rush_docs`: Retrieves relevant documentation sections based on your queries
- `rush_workspace_details`: Retrieves detailed workspace information
- `rush_project_details`: Gets detailed information about a specific project
- `rush_command_validator`: Validates whether commands are compliant and follow best practices
- `rush_migrate_project`: Migrates a project from one directory to another or into a different subspace
- `rush_pnpm_lock_file_conflict_resolver`: Resolves pnpm-lock.yaml conflicts
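When the agent decides to use one of these tools, it sends the server a `tools/call` request naming the tool and supplying its arguments. A sketch of such a request (the argument names here are illustrative assumptions, not the server's actual input schema):

```json
{"jsonrpc": "2.0", "id": 2, "method": "tools/call", "params": {"name": "rush_project_details", "arguments": {"projectName": "rush-lib-test"}}}
```

The server's response carries the tool's output back to the agent, which then weaves it into its answer.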
Plugin system
Those are the open-source MCP tools that everyone can use; however, most organizations will also want MCP tools that access internal systems. For example, at TikTok we might want to query team wikis, search company chat groups, or access our private issue tracking system. This could be handled by implementing a second MCP server, but now every engineer would need to correctly install and configure two things. Also, when providing oncall support for a large team, we need to ensure that everyone is using the same version of the MCP tooling with consistent behavior.
To address these requirements, we added a plugin system to the open-source Rush MCP Server. Rush itself will automatically install the MCP Server and plugins based on configuration stored in Git. If your plugin is reused across different monorepos, you can also define config files to customize it.
The Rush MCP plugins documentation has step-by-step instructions for making your own plugin.
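To give a feel for what a plugin involves, here is a minimal sketch of a plugin that contributes one custom tool. The type names and registration shape below are illustrative assumptions for this example only; the real API is defined by `@rushstack/mcp-server`, so consult the plugin documentation for the actual contracts.

```typescript
// Illustrative sketch only -- these interfaces model the general idea of an
// MCP tool; they are NOT the real @rushstack/mcp-server API.
interface IMcpToolResult {
  content: Array<{ type: 'text'; text: string }>;
}

interface IMcpTool {
  name: string;
  description: string;
  executeAsync(args: Record<string, unknown>): Promise<IMcpToolResult>;
}

// A hypothetical internal tool that searches a company wiki:
class WikiSearchTool implements IMcpTool {
  public readonly name: string = 'acme_wiki_search';
  public readonly description: string =
    'Searches the internal team wiki for relevant pages';

  public async executeAsync(
    args: Record<string, unknown>
  ): Promise<IMcpToolResult> {
    const query: string = String(args.query ?? '');
    // In a real plugin, this would call your wiki's search API.
    const hits: string[] = [`(stub) top result for "${query}"`];
    return { content: hits.map((text) => ({ type: 'text' as const, text })) };
  }
}

// A real plugin would export an entry point that the MCP server discovers
// and calls to collect the plugin's tools:
function createPlugin(): IMcpTool[] {
  return [new WikiSearchTool()];
}
```

Because the plugin runs inside the same server process that Rush installs from Git-tracked configuration, every engineer on the team gets the same tool versions and behavior without any manual setup.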
Context files for AI agents
Besides the MCP server, most AI agents can also accept context files: plain text files that contain general information about your workspace. For example, Cursor supports .cursor/rules, Trae supports project_rules.md, and GitHub Copilot supports .github/copilot-instructions.md. To help the community, we've created standard context file templates for use in any Rush monorepo.
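A context file typically spells out facts the agent cannot cheaply discover on its own. The excerpt below is illustrative wording, not the official template:

```markdown
# Rush monorepo rules

- This repository is a Rush monorepo; do not run `npm install` or `yarn` directly.
- To install dependencies, run `rush update` from the repository root.
- To build a single project and its dependencies, run `rush build --to <project-name>`.
- Project locations are listed in `rush.json`; consult it before creating new packages.
```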
The diagram below summarizes how all these components fit together:
Give it a try!
We've published complete documentation on the Rush website explaining how to use these new features. In a nutshell, the steps are:
- Decide what AI assistant you will use.
- For best results, add the appropriate Rush agent context file to your Git repository.
- Configure the Rush MCP Server by creating an MCP config file. This file gets added directly to your Git repository, allowing the monorepo owners to control the specific version of `@rushstack/mcp-server` to be used, as well as its configuration, ensuring a consistent experience for everyone on the team.
- Open your coding tool and use the appropriate menu command to launch the Rush MCP Server. Now you're ready to start coding!
The documentation provides an example file for each AI assistant. Although the filenames vary, the file formats are very similar. Below is an example file for Cursor:
<your-repo>/.cursor/mcp.json
```json
{
  "mcpServers": {
    "rush-mcp-server": {
      "command": "node",
      "args": [
        "./common/scripts/install-run.js",
        "@rushstack/mcp-server@0.2.2",
        "mcp-server",
        "."
      ]
    }
  }
}
```
The road ahead
We have many exciting ideas to improve MCP tools, including making them easier to debug and better integrated with existing Rush workflows. There is also interest in supporting custom toolchains and cross-repo indexing. If you have suggestions or want to contribute, please open an issue or PR in the Rush repository. We'd love to hear from you!