ChatGPT With Plugins: An Ecosystem Under Construction
Updated: 2026-05-03
In March 2023 OpenAI opened the ChatGPT plugins beta with an ambitious promise: turn the chat into a platform where ChatGPT could talk to external services — banks, hotels, data APIs, internal tools — the same way it talks to the user. Three months later, the ecosystem has around 500 plugins available and a clear pattern of which use cases work and which don’t.
Key takeaways
- A plugin is an OpenAPI spec plus a manifest that describes to the model when to invoke it.
- Works well for live data lookup, low-risk actions, and internal data exposure via OpenAPI.
- Main friction points are multi-plugin orchestration, real-money transactions, and cumulative latency.
- Function calling is the more powerful alternative for developers building their own experiences.
- The ecosystem converges toward enterprise vertical plugins and, longer term, toward multi-step agents.
What a plugin does
A plugin is, in essence, an OpenAPI specification plus a manifest that describes to the model when to invoke it. ChatGPT reads both, and when the user asks a relevant question, the model decides to call the plugin with the right parameters, interprets the response, and integrates it into the conversation.
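To make the manifest concrete, here is a trimmed example in the shape of OpenAI’s ai-plugin.json, written as a Python dict so each field can be annotated. The field names follow the published manifest schema; the service, URLs, and descriptions are invented for illustration.

```python
# A trimmed plugin manifest (ai-plugin.json), written as a Python dict so the
# fields can be annotated. Field names follow OpenAI's manifest schema; every
# value here is a hypothetical example.
manifest = {
    "schema_version": "v1",
    "name_for_human": "Flight Finder",      # shown to users in the store
    "name_for_model": "flight_finder",      # identifier the model uses
    "description_for_human": "Search flights and compare fares.",
    "description_for_model": (
        # The critical field: the model reads this to decide *when* to call.
        "Use this plugin whenever the user asks about flights, fares, or "
        "travel dates. Do not use it for hotels or restaurants."
    ),
    "auth": {"type": "none"},
    "api": {
        "type": "openapi",
        "url": "https://example.com/openapi.yaml",  # the spec ChatGPT reads
    },
    "logo_url": "https://example.com/logo.png",
    "contact_email": "dev@example.com",
    "legal_info_url": "https://example.com/legal",
}
```

Of these fields, description_for_model does the heavy lifting: it is, in effect, a standing prompt that tells the model when the plugin is relevant.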
The typical flow follows five steps:
1. The user installs the plugin from the ChatGPT store.
2. They ask a question: “What’s the best time to fly from Madrid to Rome on July 15?”
3. ChatGPT decides the Kayak[1] plugin is relevant and invokes it.
4. The plugin returns structured data (flights, prices, schedules).
5. ChatGPT presents the results to the user in natural language.
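To make steps 4 and 5 concrete: a hypothetical payload a flight-search plugin might return, and the gist of what the model then does with it, in natural language rather than code. This is not a real Kayak response, only an illustration of the shape of the exchange.

```python
# Hypothetical structured payload for step 4, and the gist of step 5.
plugin_response = {
    "query": {"origin": "MAD", "destination": "FCO", "date": "2023-07-15"},
    "results": [
        {"carrier": "IB", "depart": "07:40", "arrive": "09:55", "price_eur": 86},
        {"carrier": "VY", "depart": "16:10", "arrive": "18:25", "price_eur": 64},
    ],
}

# Step 5 is roughly this transformation, performed by the model in prose:
for f in plugin_response["results"]:
    print(f'{f["carrier"]} {f["depart"]}→{f["arrive"]}: {f["price_eur"]} EUR')
```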
Key concept: the model doesn’t learn the plugin — it decides to invoke it case by case, based on the manifest description and conversation context. It’s the same logic as mature prompt engineering: the model acts on precise instructions, not implicit knowledge.
What works well
Three scenarios where plugins shine:
- Live data lookup. Plugins like WolframAlpha[2], Kayak, OpenTable[3], or Instacart[4] address a real ChatGPT limitation: access to dynamic or specific information its training data doesn’t cover. Finding a restaurant with availability tonight in a particular city is a textbook case.
- Low-transaction-risk actions. Generating a document with the Zapier[5] plugin, creating a diagram, summarising a PDF: tasks where an error is easy to detect and doesn’t cost money.
- Internal data exposure. Companies that already publish OpenAPI specs can build private plugins (within an Enterprise plan) so their teams can query corporate data in natural language.
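As a sketch of that third scenario, assuming FastAPI (which auto-generates an OpenAPI spec at /openapi.json), an internal plugin backend could start as small as this; the endpoint and the figures are invented:

```python
# Minimal internal API that doubles as a private plugin backend, assuming
# FastAPI. The endpoint and revenue figure are hypothetical stand-ins.
from fastapi import FastAPI

app = FastAPI(
    title="Internal Sales API",
    description="Lets ChatGPT answer questions about quarterly revenue.",
)

@app.get("/sales/{quarter}")
def sales_for_quarter(quarter: str) -> dict:
    """Return revenue for a quarter (e.g. '2023-Q2'). The docstring and
    types flow into the auto-generated OpenAPI spec, which is what the
    model actually reads."""
    return {"quarter": quarter, "revenue_eur": 1_250_000}  # stub data
```

Pointing the manifest’s api.url at the generated /openapi.json is then all ChatGPT needs in order to discover the endpoint.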

Where friction appears
Four areas where plugins don’t deliver on the initial promise:
- Orchestrating multiple plugins in a flow. ChatGPT allows up to three active plugins per conversation, but routing between them is fragile. Asked to “find a flight and book a restaurant at the destination”, the model often stalls or invokes the wrong plugin.
- Real-money transactions. Commercial plugins stay cautious: Kayak searches flights but redirects to its website for payment; Instacart fills the basket but leaves you to complete the purchase yourself. The friction of leaving the chat to pay reduces the perceived value.
- Cumulative latency. Each plugin call adds 2-5 seconds to the response time. A conversation with two or three invocations goes from feeling instant to noticeably slow.
- Discovery. The plugin store is a very long list without reliable rankings. Finding the right plugin for a specific case is manual work.
Plugins vs. function calling
In June 2023 OpenAI introduced function calling[6] in their public API — a more granular capability for developers to define functions the model can invoke, without going through the store. Both coexist with different roles:
- Plugins: end-consumer experience in ChatGPT. They must pass an OpenAI review and target non-technical use. Users install them from the store.
- Function calling: direct API for developers building their own experiences. No store, no gatekeeper, no predefined UX.
For a team integrating AI into their product, function calling is almost always the better primitive because it maintains control over UX and transactions. Plugins shine only when the goal is for ChatGPT itself to be the interface.
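A minimal sketch of the function-calling handshake, using the Chat Completions API as it shipped in June 2023 (openai-python pre-1.0, which reads OPENAI_API_KEY from the environment); the get_price function and its schema are hypothetical:

```python
import json
import openai  # openai-python pre-1.0

# Hypothetical function schema, declared in JSON Schema as the API expects.
functions = [{
    "name": "get_price",
    "description": "Look up the current price of a product by SKU.",
    "parameters": {
        "type": "object",
        "properties": {
            "sku": {"type": "string", "description": "Product SKU, e.g. A-113"},
        },
        "required": ["sku"],
    },
}]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "How much does SKU A-113 cost?"}],
    functions=functions,
    function_call="auto",  # let the model decide whether to call the function
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The model returns a name plus JSON-encoded arguments; executing the
    # call and deciding what to show the user stays entirely in your code.
    args = json.loads(message["function_call"]["arguments"])
    print("Model requested get_price with:", args)
```

The model never executes anything itself: it only proposes the call, which is exactly the control over UX and transactions that makes this the better primitive for product teams.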
This architecture connects with generative AI applied to code assistants and with the wider ChatGPT 4 chatbot ecosystem: the conversational interface is evolving toward deep integration with external systems.
Where the ecosystem is heading
Three reasonable trends:
- Consolidation. Of the 500+ plugins, those solving a clear and maintainable use case will survive. The long tail of low-use plugins will fade.
- Enterprise vertical plugins. The next successful plugins will be specific: ChatGPT inside tools like Notion[7] or Salesforce[8], with integrated business context.
- Convergence with agents. The distinction between “plugin” and “agent” tends to blur. The next systems will orchestrate multiple plugins in multi-step plans rather than answer a single question, as the loop sketched below suggests. This echoes the analysis of OpenAI’s Code Interpreter, where the model autonomously manages complex tools.
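A rough sketch of what such an agent loop looks like with today’s primitives, reusing the June 2023 function-calling API from the previous section; run_tool and the stopping rule are hypothetical placeholders, not a production pattern:

```python
import json
import openai  # openai-python pre-1.0

def run_tool(name: str, args: dict) -> dict:
    """Placeholder for dispatching to a real plugin or internal API."""
    return {"status": "ok", "tool": name, "args": args}

def agent(user_request: str, functions: list, max_steps: int = 5) -> str:
    """Call the model repeatedly, executing each requested function,
    until it answers in plain text or the step budget runs out."""
    messages = [{"role": "user", "content": user_request}]
    for _ in range(max_steps):
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo-0613",
            messages=messages,
            functions=functions,
        )
        message = response["choices"][0]["message"]
        if not message.get("function_call"):
            return message["content"]  # plain text: the plan is finished
        call = message["function_call"]
        result = run_tool(call["name"], json.loads(call["arguments"]))
        messages.append(message)  # keep the model's own call in the history
        messages.append({
            "role": "function",   # feed the tool result back to the model
            "name": call["name"],
            "content": json.dumps(result),
        })
    return "Stopped after max_steps without a final answer."
```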
Conclusion
ChatGPT plugins are a valuable experiment in how a conversational assistant can open its behaviour to external services. Three months of real adoption show both the potential (live data lookup, internal API exposure) and the limits (complex orchestration, real-money transactions). As an industry standard for developers, the momentum is with function calling; plugins are primarily a consumer product. The ecosystem that emerges from both paths will define the next generation of conversational interfaces.