This article details the architecture and implementation of a local proxy designed to enable interoperability between Cursor IDE and GitHub Copilot. It explores the challenges of bypassing proprietary routing and transforming API request schemas in real-time to bridge two different AI model ecosystems. The solution highlights practical techniques for HTTP interception, payload manipulation, and AST cleansing within a proxy architecture.
Read original on Dev.to

The core problem addressed is the lack of interoperability between proprietary AI tool ecosystems. Cursor IDE, while feature-rich, routes all of its AI agent requests through its own billing and backend infrastructure, tailored specifically for Anthropic models. This creates a 'walled garden' scenario, preventing users from leveraging existing subscriptions like GitHub Copilot with Cursor's advanced UI features. The solution is a Man-in-the-Middle (MITM) proxy that intercepts, modifies, and redirects traffic, effectively making Cursor communicate with a different LLM provider than it expects.
The proxy architecture consists of two local services working in tandem to achieve the desired re-routing and payload transformation. This split enforces a separation of concerns: one service (the bridge) handles authentication against GitHub, while the other (the router) intercepts and rewrites Cursor's network requests.
Request Flow
The system design establishes a clear request flow: `Cursor UI → Proxy Router (4142) → Copilot Bridge (4141) → GitHub Servers`. This sequential processing allows for multiple layers of manipulation and translation before the request reaches its final destination.
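As a rough illustration of the first hop, the router can simply re-point each incoming request at the bridge. This is a hypothetical sketch assuming a Fetch-style server (Bun.serve or Node 18+); the names `BRIDGE_URL` and `toBridgeRequest` are illustrative, not from the original project.

```javascript
const BRIDGE_URL = "http://localhost:4141"; // the local Copilot bridge

// Re-point a request that arrived at the router (port 4142) toward the
// bridge (port 4141), preserving path, query string, method, and headers.
// The real proxy would also forward the (possibly rewritten) JSON body.
function toBridgeRequest(req) {
  const incoming = new URL(req.url);
  const target = new URL(incoming.pathname + incoming.search, BRIDGE_URL);
  return new Request(target, { method: req.method, headers: req.headers });
}
```

Because only the origin changes, the bridge sees the exact API path Cursor asked for and can concentrate on authentication and upstream delivery.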
A significant architectural challenge was Cursor's internal routing logic, which hardcodes known model names (e.g., 'claude') to its proprietary backend, overriding user-defined custom API endpoints. The solution involved discovering a "loophole": prepending a unique prefix (e.g., `cus-`) to the model name. This tricks Cursor into not recognizing the model, causing it to gracefully fall back to the user-specified local proxy URL. The proxy then strips this prefix before forwarding the request, ensuring the target LLM receives the correct model identifier.
```javascript
let json = await req.json();
const PREFIX = "cus-";

// The Heist: Strip the prefix so Copilot gets the real model name
if (json.model && json.model.startsWith(PREFIX)) {
  const targetModel = json.model.slice(PREFIX.length);
  console.log(`🔄 Rewriting model: ${json.model} -> ${targetModel}`);
  json.model = targetModel;
}
```

The most complex part of the system design was handling the differences in API schema for tool calling. Cursor's frontend sends Anthropic-flavored tool schemas, which are incompatible with GitHub Copilot's strict OpenAI-flavored function-calling expectations. This required a recursive Abstract Syntax Tree (AST) cleaner within the proxy to intercept, traverse, and mutate the JSON payload in real time. The cleaner removes illegal JSON Schema properties (like `additionalProperties`, `$schema`, `title`) and translates the schema structure (e.g., `input_schema` to `parameters` within a `function` object) to ensure Copilot's API accepts the request without errors.
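A minimal sketch of such a cleaner might look like the following. The helper names (`ILLEGAL_KEYS`, `cleanSchema`, `toOpenAITool`) are assumptions for illustration; only the stripped keys and the `input_schema` → `function.parameters` translation come from the article.

```javascript
// Keys that Copilot's OpenAI-flavored API rejects in tool schemas.
const ILLEGAL_KEYS = new Set(["additionalProperties", "$schema", "title"]);

// Recursively walk a JSON schema tree, deleting illegal keys in place.
function cleanSchema(node) {
  if (Array.isArray(node)) {
    node.forEach(cleanSchema);
  } else if (node && typeof node === "object") {
    for (const key of Object.keys(node)) {
      if (ILLEGAL_KEYS.has(key)) {
        delete node[key];
      } else {
        cleanSchema(node[key]);
      }
    }
  }
  return node;
}

// Translate one Anthropic-style tool definition into OpenAI's
// function-calling shape: input_schema becomes function.parameters.
function toOpenAITool(tool) {
  return {
    type: "function",
    function: {
      name: tool.name,
      description: tool.description,
      parameters: cleanSchema(tool.input_schema ?? {}),
    },
  };
}
```

Note that a blunt recursive delete like this would also remove a legitimately named `title` property nested under `properties`; a production cleaner would need to skip keys that are property names rather than schema keywords.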