Build a ChatGPT App with NitroStack

A practical, developer-first guide to building a production-ready ChatGPT App using NitroStack and the Model Context Protocol (MCP). Go from zero to a live app with CLI scaffolding, ngrok tunneling, and NitroCloud deployment in under 15 minutes.

Abhishek Pandit

OpenAI has opened up ChatGPT to third-party apps via MCP. NitroStack is the fastest way to get a production-grade MCP server in front of that integration — from scaffold to live ChatGPT App in minutes.

The Model Context Protocol (MCP) is rapidly becoming the standard layer for connecting AI models to external tools. And now that ChatGPT supports MCP-based app registration natively, the opportunity to ship apps directly inside ChatGPT is very real.

The problem is that setting up an MCP server from scratch is still tedious — schema definitions, transport wiring, tool registration, and server infrastructure all have to be handled before you write a single line of business logic.

NitroStack eliminates that entirely. It gives you a decorator-based TypeScript framework that handles all MCP infrastructure so you can focus on what your app actually does. In this guide, you'll go from zero to a working ChatGPT App backed by a live MCP server.

Here's what you'll build, step by step:

  • Scaffold an MCP server using the NitroStack CLI
  • Configure it to run in openai mode
  • Expose it to the internet using ngrok for local testing
  • Register it as a ChatGPT App inside ChatGPT settings
  • Invoke MCP tools directly from a ChatGPT conversation
  • Deploy to NitroCloud for a permanent production URL

Prerequisites

Make sure your environment is ready before starting.

  • Node.js: >= 18.0.0
  • npm: >= 9.0.0
  • ChatGPT plan: Plus or Pro

Verify your Node and npm versions:

node -v
npm -v

💡 ChatGPT App support is available on ChatGPT Plus and Pro plans. You'll need Developer Mode enabled in ChatGPT Settings — covered in Step 6 below.

Step 01 — Install the NitroStack CLI

Install the NitroStack CLI globally using npm.

npm install -g @nitrostack/cli

Verify the installation:

nitrostack-cli --help

If the CLI help menu prints successfully, you're ready to move on.

Step 02 — Initialize a New MCP Project

Create a new project with the CLI init command.

nitrostack-cli init pizza-shop-finder

The CLI walks you through a short setup flow. When prompted to select a template, choose:

Advanced Template
Pizza Shop Finder with Maps and Widgets

This template ships with fully functional MCP tools, UI widgets, pizza data, and map integration — everything needed to demonstrate a real app end-to-end.

Next, enter a project description when prompted. For example:

An MCP app for pizza shops where consumers can find and purchase their favorite pizza through AI models

Enter your name as the author and the CLI scaffolds the complete project automatically.

Step 03 — Open the Project in Your Editor

Navigate into the project folder and open it in your preferred code editor.

# VS Code
code pizza-shop-finder

# Cursor
cursor pizza-shop-finder

NitroStack uses a module-based architecture — tools, services, and data are organized into dedicated modules, keeping concerns cleanly separated as your project grows.

Step 04 — Explore the MCP Tools

Open the pizza tools file to see how NitroStack defines MCP tools:

src/modules/pizza/pizza.tools.ts

Inside, tools are declared using the @Tool decorator — no manual schema wiring required:

@Tool({
  name: "show_pizza_list",
  description: "Shows available pizzas"
})
async showPizzaList() {
  // Returns the pizza list with a widget
  return this.pizzaService.getAll();
}

The @Tool decorator automatically registers the function as an MCP-compliant tool, generates the JSON schema, and exposes it to any MCP client — including ChatGPT. You don't need to modify anything here for this tutorial.
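NitroStack's actual internals aren't shown here, but conceptually a decorator like @Tool boils down to recording the handler and a generated JSON schema in a registry that the MCP transport serves to clients. A simplified, framework-free sketch of that idea (names and shapes are illustrative assumptions, not NitroStack code):

```typescript
// Illustrative only -- not NitroStack's actual internals.
// A decorator like @Tool amounts to: record the handler plus a
// JSON Schema description in a registry the MCP transport can serve.

type ToolHandler = (args: Record<string, unknown>) => Promise<unknown>;

interface ToolEntry {
  name: string;
  description: string;
  inputSchema: object; // JSON Schema advertised to the MCP client
  handler: ToolHandler;
}

const toolRegistry = new Map<string, ToolEntry>();

function registerTool(entry: ToolEntry): void {
  toolRegistry.set(entry.name, entry);
}

// What a client sees when it asks the server to list tools
function listTools(): { name: string; description: string }[] {
  return [...toolRegistry.values()].map(({ name, description }) => ({
    name,
    description,
  }));
}

// Register a tool the way @Tool would behind the scenes
registerTool({
  name: "show_pizza_list",
  description: "Shows available pizzas",
  inputSchema: { type: "object", properties: {} },
  handler: async () => [{ name: "Margherita", price: 9 }],
});

// Dispatch an incoming tools/call request to the registered handler
async function callTool(name: string, args: Record<string, unknown>) {
  const tool = toolRegistry.get(name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.handler(args);
}
```

The point of the decorator is that registration, schema generation, and dispatch all happen once at the framework level, so each tool method stays pure business logic.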

Step 05 — Configure OpenAI Mode & Start the Server

This is the critical configuration step for ChatGPT compatibility. Open the .env file at the project root and set the app mode to openai:

NITROSTACK_APP_MODE=openai

⚠️ Don't skip this step. Without NITROSTACK_APP_MODE=openai, your server will start in standard MCP mode and the ChatGPT integration will not function correctly.

Now start the server:

npm run start

The server starts on port 3000. Your MCP endpoint is now live at http://localhost:3000/sse.

Step 06 — Expose the Server with ngrok

ChatGPT needs a publicly accessible URL to reach your MCP server. ngrok creates a secure tunnel from the internet to your localhost — perfect for development testing.

Install ngrok

Follow the ngrok quickstart guide to install and authenticate. Once installed, run:

ngrok http 3000

ngrok will output something like:

Forwarding https://abc123.ngrok.io -> http://localhost:3000

Copy the https:// URL. You'll need it in the next step.

⚠️ ngrok is for development only. It generates a temporary URL that changes each session. For a permanent production URL, you'll deploy to NitroCloud in the final step.

Step 07 — Register Your App in ChatGPT

With your server running and ngrok proxying traffic, it's time to register the app in ChatGPT. Follow these steps precisely.

Enable Developer Mode

  • Open ChatGPT and go to Settings
  • Navigate to the Apps section
  • Under Advanced, toggle on Developer Mode


Create the App

With Developer Mode active, click Create App. A modal will appear asking for the following details:

  • Logo — Upload an icon for your app
  • Name — e.g., Pizza Shop Finder
  • Description — A brief summary of what the app does
  • MCP URL — Your ngrok URL with /sse appended

The MCP URL should look exactly like this:

https://abc123.ngrok.io/sse

Click Create App. ChatGPT will connect to your MCP server and load the available tools automatically.

💡 Always append /sse to your ngrok URL when registering. The /sse path is the Server-Sent Events transport endpoint that ChatGPT uses to maintain a connection with your MCP server.
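For context, Server-Sent Events is a plain-text streaming format: each event is a block of `field: value` lines terminated by a blank line. You never parse this yourself when using NitroStack, but a minimal parser sketch shows what travels over that `/sse` connection:

```typescript
// Minimal SSE frame parser -- illustrative of the wire format only.
// Each event is a block of "field: value" lines ending with a blank line.

interface SseEvent {
  event: string; // event type, defaults to "message"
  data: string;  // concatenated data lines
}

function parseSse(stream: string): SseEvent[] {
  const events: SseEvent[] = [];
  for (const block of stream.split("\n\n")) {
    if (!block.trim()) continue;
    let event = "message";
    const data: string[] = [];
    for (const line of block.split("\n")) {
      if (line.startsWith("event:")) event = line.slice(6).trim();
      else if (line.startsWith("data:")) data.push(line.slice(5).trim());
    }
    events.push({ event, data: data.join("\n") });
  }
  return events;
}
```

For example, `parseSse("event: result\ndata: {\"ok\":true}\n\n")` yields a single event of type `result` carrying the JSON payload as its data.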

Step 08 — Test the App in ChatGPT

Your app is registered. Now test it end-to-end from a ChatGPT conversation.

Open a new chat with ChatGPT and type:

Show pizza list

ChatGPT will analyze the prompt, identify the show_pizza_list tool in your MCP server, call it, and render the response — including the NitroStack widget — directly in the chat interface.

What's happening under the hood

Tool invocation flow

User sends message → ChatGPT identifies tool → Calls MCP via SSE → NitroStack executes tool → Result + widget rendered
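The flow above can be sketched end to end as a toy simulation (this is not the real ChatGPT pipeline; tool matching and the SSE round trip are stubbed out):

```typescript
// Toy simulation of the invocation flow -- not the real ChatGPT pipeline.

type Tool = { name: string; run: () => Promise<string> };

// Tools the MCP server advertises (stub data)
const serverTools: Tool[] = [
  { name: "show_pizza_list", run: async () => "Margherita, Pepperoni, Veggie" },
];

// 1. ChatGPT matches the user's message to an advertised tool.
function pickTool(msg: string): Tool | undefined {
  if (/pizza list/i.test(msg)) return serverTools[0];
  return undefined;
}

// 2-4. Selected tool is called over SSE (simulated) and the
// result is rendered back into the conversation.
async function handleMessage(msg: string): Promise<string> {
  const tool = pickTool(msg);
  if (!tool) return "No tool matched; answer normally.";
  const result = await tool.run();
  return `Tool result: ${result}`;
}
```

In the real flow, step 1 is done by the model against your tools' names and descriptions, which is why clear @Tool descriptions matter.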

You can continue the conversation naturally — ask to find a pizza shop near me or order a Margherita and ChatGPT will call the corresponding MCP tools on your server.

Step 09 — Deploy to NitroCloud

ngrok is temporary — your URL changes every time you restart the tunnel and your server goes offline when your laptop does. For a persistent production URL, deploy to NitroCloud.

Go to nitrostack.ai/cloud and follow the deployment flow. NitroCloud gives you:

  • A stable, permanent HTTPS URL for your MCP server
  • Serverless auto-scaling — no infrastructure to manage
  • Built-in observability, logs, and analytics
  • Zero-downtime deploys from your Git repo

Once deployed, update the MCP URL in your ChatGPT App settings — replace the ngrok URL with your NitroCloud URL (still with /sse appended). Your ChatGPT App is now fully production-ready.

🚀 Production tip: NitroCloud URLs are stable across deployments. Once you've updated the ChatGPT App registration to point at your NitroCloud URL, future deploys won't require updating the registration.

What You Just Built

In a few minutes, without writing any MCP infrastructure code, you:

  • Scaffolded a production-grade MCP server using the NitroStack CLI
  • Configured it for OpenAI/ChatGPT compatibility with a single environment variable
  • Exposed it locally using ngrok for rapid testing
  • Registered it as a native ChatGPT App via Developer Mode
  • Invoked MCP tools and rendered widgets directly inside ChatGPT
  • Deployed to NitroCloud for a permanent production URL

Why NitroStack Works for This

Traditional MCP development forces you to handle SDK wiring, schema generation, transport configuration, and server scaffolding before you ship a single tool. That's the wrong order of operations for moving fast.

NitroStack inverts this. You declare tools with decorators, the framework handles all protocol-level complexity, and you ship. The same codebase that runs in NitroStudio connects to ChatGPT, Claude, Gemini, and any other MCP-compatible client — with one environment variable determining the mode.

Where to Go from Here

The pizza shop template is a starting point. From here you can extend it with:

  • Authentication — JWT tokens or API key middleware built into NitroStack
  • Database integration — swap the static pizza data for a live Postgres or MongoDB connection
  • Custom widgets — build interactive UI components that render inside NitroChat and ChatGPT
  • Long-running tasks — background jobs with progress streaming back to the client
  • Rate limiting — per-tenant invocation limits with built-in NitroStack support
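For example, swapping the template's static pizza data for a database is mostly a service-layer change. A rough sketch, assuming a store interface similar to the template's service (all names here are illustrative, not the template's actual code):

```typescript
// Illustrative service-layer swap -- interface and names are
// assumptions, not the template's actual code.

interface Pizza {
  id: number;
  name: string;
  price: number;
}

interface PizzaStore {
  getAll(): Promise<Pizza[]>;
}

// What the template effectively ships: static in-memory data.
class StaticPizzaStore implements PizzaStore {
  async getAll(): Promise<Pizza[]> {
    return [{ id: 1, name: "Margherita", price: 9 }];
  }
}

// Drop-in replacement backed by a database; the client is
// injected as a query function so this sketch stays self-contained.
class DbPizzaStore implements PizzaStore {
  constructor(private query: (sql: string) => Promise<Pizza[]>) {}

  async getAll(): Promise<Pizza[]> {
    return this.query("SELECT id, name, price FROM pizzas");
  }
}
```

Because the @Tool methods depend only on the store interface, they don't change at all when you swap the implementation: that separation is what the module-based architecture buys you.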

Start building on NitroStack

Everything you need to go from MCP server to live ChatGPT App — in minutes.

View on GitHub →

Read the Docs

NitroCloud →

Abhishek Pandit

Author