
ChatGPT Now Taps Company Data via MCP

2025-06-18 · Radhika Rajkumar · 3 minute read
AI
ChatGPT
MCP

Colorful server racks representing AI data integration (Image: Floriana/Getty Images)

The Rise of Model Context Protocol

Anthropic's Model Context Protocol (MCP) is rapidly becoming the go-to industry standard for integrating data with AI systems. In an unusual display of collaboration among competitors, major tech companies including Google, OpenAI, and Microsoft have embraced the open protocol. Its adoption in recent months stems from its ability to empower AI agents by connecting them to previously inaccessible data sources, a significant advantage for businesses dealing with siloed information.

OpenAI Integrates MCP into ChatGPT

Earlier this month, OpenAI incorporated MCP into ChatGPT as part of a suite of business-oriented features. The integration lets users connect their internal tools to OpenAI's Deep Research feature, so employees can access company-specific data directly through the chatbot. OpenAI initially adopted MCP in April, starting with its SDK. For now, the functionality is focused on search and document retrieval.

For further insights, you might also be interested in this comparison of ChatGPT's Deep Research against competitors like Gemini, Perplexity, and Grok AI.

(Disclosure: Ziff Davis, ZDNET's parent company, filed an April 2025 lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

How to Connect ChatGPT to Your MCP Server

In a detailed platform guide, OpenAI explains the setup process. You first build an MCP server that is compatible with Deep Research and exposes search and fetch tools. You then create a custom deep research connector within ChatGPT; the connector needs thorough instructions so the bot interacts with your system correctly. ChatGPT Enterprise, Edu, and Team users get an added benefit: they can publish the connector to their broader workspace, where it serves as a centralized knowledge source within Deep Research.
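To make the server-side step concrete, here is a minimal sketch of an MCP server exposing search and fetch tools, written with the official Python MCP SDK. The server name, the in-memory document store, and the transport choice are all illustrative assumptions, not part of OpenAI's guide; a real deployment would back these tools with your own data sources.

```python
# Minimal sketch of an MCP server exposing the "search" and "fetch" tools that a
# Deep Research connector can call. Assumes the official Python MCP SDK
# (`pip install mcp`). The document store below is a hypothetical placeholder.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("internal-docs")

# Hypothetical in-memory corpus standing in for a real company knowledge base.
DOCS = {
    "doc-1": {"title": "Onboarding guide", "text": "How new hires get set up..."},
    "doc-2": {"title": "Expense policy", "text": "Expenses under $50 need no receipt..."},
}

@mcp.tool()
def search(query: str) -> list[dict]:
    """Return id/title pairs for documents whose title or text matches the query."""
    q = query.lower()
    return [
        {"id": doc_id, "title": doc["title"]}
        for doc_id, doc in DOCS.items()
        if q in doc["title"].lower() or q in doc["text"].lower()
    ]

@mcp.tool()
def fetch(id: str) -> dict:
    """Return the full document for an id previously returned by search."""
    doc = DOCS.get(id)
    if doc is None:
        raise ValueError(f"Unknown document id: {id}")
    return {"id": id, "title": doc["title"], "text": doc["text"]}

if __name__ == "__main__":
    # A Deep Research connector needs a remotely reachable server, so a network
    # transport (SSE here) is assumed; exact options depend on your SDK version.
    mcp.run(transport="sse")
```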

OpenAI also provides guidance for establishing authentication mechanisms and for testing the MCP server using the API Playground to ensure proper functionality before full deployment.
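If you would rather script that sanity check than click through the API Playground, a rough equivalent is to call the server through the OpenAI API's remote MCP tool. The sketch below assumes the Python `openai` client and the Responses API; the server URL, server label, and model name are placeholders you would swap for your own.

```python
# Hedged sketch of exercising a remote MCP server through the OpenAI Responses API
# before wiring up a ChatGPT connector. URL, label, and model are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.responses.create(
    model="o4-mini",  # placeholder; use whichever model your account has access to
    tools=[{
        "type": "mcp",
        "server_label": "internal_docs",
        "server_url": "https://mcp.example.com/sse",  # placeholder URL for your server
        "require_approval": "never",
    }],
    input="Search our internal docs for the expense policy and summarize it.",
)

print(response.output_text)
```

If the model can call your search and fetch tools and return grounded answers here, the server is likely ready to be registered as a Deep Research connector.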

As with any emerging technology, implementing MCP carries inherent risks and requires adherence to safety best practices. A key point to understand is that custom MCP servers are not vetted by OpenAI; a malicious server could introduce prompt injections or hidden directives that manipulate ChatGPT's behavior. OpenAI therefore urges users to report any suspicious servers they encounter to security@openai.com.

For more on Large Language Models, consider reading about what Apple's controversial research paper reveals about LLMs.

OpenAI explicitly warns, "We recommend that you do not connect to a custom MCP server unless you know and trust the underlying application. For example, pick official servers hosted by the service providers themselves (e.g., we recommend connecting to the Stripe server hosted by Stripe themselves on mcp.stripe.com, instead of a custom Stripe MCP server hosted by a third party)."

It is crucial to remember that MCP servers grant OpenAI systems access to your company's data. Therefore, OpenAI strongly cautions users to meticulously review their tools and data sources for any sensitive information before enabling such connections.

Read Original Post