OpenAI’s New Apps SDK (Preview): What It Means and How to Experiment

Veselina Staneva

OpenAI just dropped something big - the new Apps SDK, a toolkit that lets developers build fully interactive apps inside ChatGPT.

If that sentence alone made your brain spark with “wait… what does that even mean?”, you’re not alone.

Let’s unpack what we know so far, what it could mean for developers, and how you can start experimenting, especially if you’re already building on SashiDo or another mBaaS platform.

What the Apps SDK Actually Is

The Apps SDK is OpenAI’s way of turning ChatGPT into a platform, not just a chatbot. Instead of relying on a plain API integration, developers can now build apps that live inside ChatGPT and respond to users in real-time.

Think of it like giving your backend a front-row seat in the conversation.

With the Apps SDK you can:

  • Define tools your app exposes (e.g. fetch data, trigger actions, send updates)
  • Build interactive UI components - cards, tables, even full-screen layouts
  • Use the Model Context Protocol (MCP) to connect ChatGPT with your external APIs
  • Handle authentication, state, and logic in your own backend
  • Prepare for monetization via Instant Checkout and the Agentic Commerce Protocol (app submissions are expected to open later this year)
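To make "define tools" concrete: at the MCP level, a tool is a named operation plus a JSON Schema describing its inputs, and a call to it returns structured content. The sketch below mirrors that shape with plain Python; the tool name and data are hypothetical, and a real server would wrap this in MCP's JSON-RPC transport.

```python
import json

# A tool descriptor: roughly what a client sees when it lists your
# server's tools (name, description, JSON Schema for inputs).
ORDER_STATUS_TOOL = {
    "name": "get_order_status",  # hypothetical tool name
    "description": "Look up the shipping status of an order by its ID.",
    "inputSchema": {
        "type": "object",
        "properties": {"order_id": {"type": "string"}},
        "required": ["order_id"],
    },
}

def call_tool(name, arguments):
    """Dispatch a tool call to your own backend logic."""
    if name == "get_order_status":
        # In a real app this would query your database or API.
        status = {"order_id": arguments["order_id"], "status": "shipped"}
        # Tool results are returned as structured content blocks.
        return {"content": [{"type": "text", "text": json.dumps(status)}]}
    raise ValueError(f"unknown tool: {name}")
```

The point of the schema is that ChatGPT can decide when and how to invoke your tool; your backend only has to implement the dispatch.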


Photo by Solen Feyissa on Unsplash

The Apps SDK is currently in preview, meaning it’s an early-access version still evolving. It’s available to some developers, but not everyone can use it yet. If you’re amongst the lucky ones, you can experiment with it today, but expect changes and limited availability as OpenAI continues testing and rollout.

The docs are live on developers.openai.com. It's very early days, but the implications are huge and interest is already loud.

⚠️ Note: this isn't GA yet - expect limited regional availability, evolving APIs, and discovery/monetization still in progress. Build prototypes, not production.

Why You Should Care

For anyone who builds products with GPT-based functionality, this unlocks an entirely new layer of distribution and UX. And even if you're just exploring GPT-powered UX, it's worth a look.

Here are just a few reasons to consider it:

Reach new users directly inside ChatGPT.

Hundreds of millions of people (some reports estimate ~800M) use ChatGPT every week. The Apps SDK lets them invoke your service by name or discover it through ChatGPT's suggestions. This goes beyond SEO, GEO, or any kind of content optimization.

Build a conversational UI without building a chat UI.

Your users can talk to your app naturally. ChatGPT handles the conversation, context, and follow-ups. You just focus on logic and data. And if we have to be honest, some users are already doing this, so why not make it an even smoother experience?

Photo by Solen Feyissa on Unsplash

Reuse your existing backend.

You don't need to reinvent your infrastructure. At least that's how it looks at this point - this is only a preview, and it will surely evolve. If you already have APIs, data models, and logic running on your backend, you're halfway there.

Open door to future monetization.

OpenAI hinted at an upcoming "Agentic Commerce Protocol" and instant checkout flow for paid apps. Early builders will likely have the advantage when that ecosystem matures. Back in the day, OpenAI mentioned monetization for GPTs as well, but we haven't yet seen a direct way to do it released. This time the claim sounds stronger, so I do believe it's worth the early bet.

⚠️ Note: The Apps functionality in ChatGPT is already available to all logged-in ChatGPT users outside of the EU on Free, Go, Plus, and Pro plans. Among the first apps are Booking.com, Canva, Coursera, Figma, Expedia, Spotify, and Zillow in English-speaking markets. More pilot apps are expected to launch by the end of the year, while the timeline for bringing apps to EU users is a vague "soon."

But… You’ll Still Need a Backend That Scales

Here’s the catch: Apps SDK handles the front-end experience inside ChatGPT, but all the real work - fetching data, running logic, processing actions - still happens on your servers.

That means your backend needs to be reliable (no timeouts or lag - users won't wait), secure (handle tokens, data, and permissions cleanly), flexible (connect to APIs, store state, run business logic), and, of course, easy to extend as you iterate.
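The reliability and security points above can be sketched as guardrails around each request. This is a minimal illustration, not an Apps SDK API: the token set and time budget are assumptions, and the check here only flags overruns after the fact (a real server would enforce a hard deadline in its framework).

```python
import time

VALID_TOKENS = {"secret-demo-token"}  # stand-in for real auth
TIME_BUDGET_S = 3.0  # keep responses snappy; users won't wait

def handle_request(token, work_fn):
    """Run one unit of backend work behind auth and a latency budget."""
    if token not in VALID_TOKENS:
        return {"error": "unauthorized"}
    start = time.monotonic()
    result = work_fn()
    elapsed = time.monotonic() - start
    if elapsed > TIME_BUDGET_S:
        # Log and surface slow paths so they get fixed before launch.
        return {"error": "timeout budget exceeded"}
    return {"result": result}
```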

Photo by ThisisEngineering on Unsplash

According to the officially released docs, OpenAI's Apps SDK is essentially a client/runtime + protocol (MCP) that invokes standard web APIs. Most managed backend providers already expose and scale such web APIs. So the capability is architectural, not a product toggle, and those backends should be compatible by design.

If you’ve selected SashiDo for your backend, please contact us at support@sashido.io so we can discuss and advise if and how your use case can be implemented.

What to Do Next

If you’re curious about experimenting with the Apps SDK, here’s a simple way to start:

1. Check access to the Apps SDK preview

In ChatGPT, go to Settings → Connectors → Advanced and toggle Developer mode (some accounts show "Apps & Connectors" → Advanced → Developer mode). If you see it, you can start building right away. If you don't see Developer mode, you may need to be added to the current developer experiment or (for Enterprise) ask your workspace admin to enable connector creation.

2. Start with the official examples

Clone the Apps SDK examples and run them locally. Expose your local server securely (e.g., via ngrok) so ChatGPT can reach it while you iterate. This is the fastest way to understand the server ↔️ ChatGPT loop.

3. Plan your backend approach (keep it tiny at first)

Any HTTPS server can speak MCP (the protocol the Apps SDK uses). Begin with a single, well-scoped tool (one endpoint, one job), add logging, and version your payloads so you can adapt as the preview evolves.
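A "single, well-scoped tool with logging and versioned payloads" can be this small. The endpoint's job, the version string, and all names below are illustrative assumptions, but the pattern - log every call, stamp every response - is what lets you adapt as the preview evolves.

```python
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("mcp-tool")

# Version every payload so clients can adapt as the preview evolves.
PAYLOAD_VERSION = "2024-01-draft"  # hypothetical version tag

def lookup_weather(city):
    # Stand-in for the one job this tool does.
    return {"city": city, "forecast": "sunny"}

def handle_tool_call(request):
    """One endpoint, one job: log the call, do the work, stamp the payload."""
    log.info("tool call: %s", json.dumps(request))
    result = lookup_weather(request["arguments"]["city"])
    return {"version": PAYLOAD_VERSION, "result": result}
```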

4. Link it in ChatGPT and test

In ChatGPT, Settings → Connectors → Create, point to your HTTPS server, and link it in Developer mode. Run “golden prompts,” record which tools fire, and iterate.
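One lightweight way to record which tools fire during your golden-prompt runs is a thin recorder wrapped around your dispatcher. All names here are hypothetical; the idea is simply to tally invocations per run so you can compare behavior across prompt and preview changes.

```python
from collections import Counter

class ToolRecorder:
    """Wraps a tool dispatcher and counts which tools get invoked."""

    def __init__(self, dispatch):
        self.dispatch = dispatch  # your real tool-call handler
        self.fired = Counter()    # tool name -> invocation count

    def __call__(self, name, arguments):
        self.fired[name] += 1
        return self.dispatch(name, arguments)

# Usage: wrap your dispatcher, replay a golden prompt, inspect .fired
# to see which tools ChatGPT actually chose to call.
```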

⚠️ Reminder: The Apps SDK is in preview - APIs and UX may change (OpenAI notes app submissions open later this year). Plan for iteration and keep your rollout gated behind Developer mode while you experiment.

Fin

The Apps SDK is new and in preview. We haven’t shipped production apps with it yet, and that’s intentional. It will change, and that’s what makes it worth exploring. If you’re curious, start small: wire one narrowly scoped MCP tool, measure latency and UX, and learn. We’re doing the same on our side.

SashiDo is a managed mBaaS that lets you spin up APIs fast, simply, and on a minimal budget - our prices start at $4.95/mo per app. We believe our platform can become the perfect playground for such experiments, so we're open to collaborating with you and exploring this new and uncharted path together.

👉Have a use case? Send a short paragraph to support@sashido.io and we’ll suggest a lean experiment plan.

👉Want a place to tinker? Spin up a project on SashiDo’s free trial and prototype an MCP tool against a tiny endpoint.

Curious, cautious, experimental - that’s the energy. Let’s see what this new surface can unlock together.

Useful Links:
Official release announcement from OpenAI
Apps SDK official docs
Announcement for OpenAI's Apps on TechCrunch

Veselina Staneva

Business Dev & Product Manager @ SashiDo.
