MCP configuration in AI clients
Learn how to configure an MCP server in AI clients.
AI clients/Plugins
You can use MCP servers in any AI client that supports MCP.
If you haven't used an AI client before, you can try Cherry Studio (https://www.cherry-ai.com/).
The sections below show how to configure scanpy-mcp as an example.
MCP setting
Running locally
Locate the MCP server executable with which (Linux/Mac) or where (Windows):
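A minimal sketch of locating the executable, assuming scanpy-mcp is installed and on your PATH (the commands only print the path; they depend on your local environment):

```shell
# Linux/Mac: print the full path of the scanpy-mcp executable
which scanpy-mcp

# Windows (cmd/PowerShell): locate the executable on PATH
# where scanpy-mcp
```

Use the printed path as the command field in your MCP client configuration.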
Refer to your client's configuration documentation for where this path goes.
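As an illustration, many MCP clients (Cherry Studio, Cursor, Trae, and others) accept a JSON configuration of roughly this shape; the "mcpServers" key follows common client conventions, and the path is a placeholder you replace with the output of which/where:

```json
{
  "mcpServers": {
    "scanpy-mcp": {
      "command": "/path/to/scanpy-mcp"
    }
  }
}
```

Check your specific client's documentation for the exact key names and file location.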
Running remotely
The MCP server runs on a remote server, while you chat in a local AI client.
Start the MCP server on your remote machine:
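A hedged sketch of starting the server over a network transport; the transport and port flags shown here are assumptions, not the confirmed scanpy-mcp CLI, so check the project's README or the command's help output for the actual options:

```shell
# Hypothetical invocation: serve over a network transport on a chosen port
# (flag names are illustrative; consult `scanpy-mcp --help` for real ones)
scanpy-mcp run --transport sse --port 8000
```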
Make sure you have forwarded the necessary ports. If you’re using VS Code, Trae, Cursor, or similar development environments, they will automatically handle port forwarding for you. If automatic port forwarding is not available, you’ll need to run the following command locally:
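If your editor does not forward the port automatically, an SSH tunnel is a standard way to do it manually. In this sketch, user, remote-host, and port 8000 are placeholders for your own values:

```shell
# Forward local port 8000 to port 8000 on the remote server,
# so the local AI client can reach the MCP server at localhost:8000
ssh -L 8000:localhost:8000 user@remote-host
```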
Then configure the MCP server in your local AI client, for example:
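For a remote server, the client config typically points at a URL instead of a local command. This is an illustrative fragment, assuming the server listens on port 8000 with an SSE endpoint; the "mcpServers" and "url" keys follow common MCP client conventions and may differ in your client:

```json
{
  "mcpServers": {
    "scanpy-mcp": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```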
Prompt setting
It is important to configure a system prompt that directs the LLM to focus on scRNA-seq analysis and tool calling. Below is an example that you can adapt to your specific use case:
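A short illustrative system prompt, written here as an assumption of what such a prompt might contain rather than the project's official wording:

```
You are a bioinformatics assistant specializing in single-cell RNA-seq analysis.
Use the available MCP tools to load data, run quality control, normalization,
dimensionality reduction, clustering, and visualization with scanpy.
Prefer calling tools over describing steps in prose, and report the key
parameters and results of each tool call back to the user.
```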
Configure in Trae
See the video walkthrough: https://youtu.be/HXPqaDvjKvg