OpenAI Apps SDK for Flutter #

Pub License: BSD-3-Clause

An unofficial Flutter/Dart package that provides seamless integration with OpenAI's Apps SDK for ChatGPT, enabling developers to build interactive widgets and web applications that run natively inside ChatGPT conversations.

Build rich, interactive experiences that leverage ChatGPT's AI capabilities while maintaining full access to your Flutter codebase and UI components.

📱 Screenshots #

Flutter app running inside ChatGPT (View 1 and View 2)

Note: Both screenshots show the app running in inline mode within ChatGPT conversations, displaying different features of the application.

💡 Background & Motivation #

This package was created in response to OpenAI's new Apps SDK, which enables developers to serve applications and interactive components directly within ChatGPT conversations.

Why This Package Exists #

OpenAI's official documentation focuses exclusively on React applications and components. However, this package makes something unexpected possible: running Flutter applications inside ChatGPT.

The Flutter Advantage #

While Flutter can compile to web, running a Flutter web app inside ChatGPT's Apps SDK environment is different from running a traditional Flutter web application:

  • 🎨 Theme Integration: Your Flutter app must dynamically adapt to ChatGPT's theme changes
  • πŸ“± Display Modes: Handle inline, fullscreen, and picture-in-picture modes
  • πŸ”— Interoperability: Bridge between Flutter's Dart code and OpenAI's JavaScript SDK
  • πŸ’¬ Conversation Context: Your app runs within an active AI conversation, not standalone
  • πŸ”§ MCP Integration: Connect to Model Context Protocol servers for backend functionality

This package provides the bridge between Flutter's powerful UI framework and OpenAI's Apps SDK, enabling you to build rich, cross-platform Flutter applications that run natively inside ChatGPT conversationsβ€”something that wasn't officially designed but is now possible.

What Makes This Novel #

  • ✨ First of its kind: Use Flutter (not just vanilla web components) inside ChatGPT
  • πŸŒ‰ JavaScript ↔ Dart Bridge: Seamless interop between OpenAI's JS SDK and Flutter's Dart
  • πŸ“¦ Type-Safe API: Converts JavaScript types to idiomatic Dart types
  • πŸš€ Flutter Ecosystem: Access to the entire Flutter widget library and pub.flutter-io.cn packages
  • 🎯 Developer-Friendly: Familiar Flutter development experience with reactive streams

✨ Features #

  • 🎨 Automatic Theme Synchronization - Your app automatically adapts to ChatGPT's light/dark theme
  • πŸ“± Responsive Display Modes - Support for inline, fullscreen, and picture-in-picture display modes
  • 🌍 Locale Detection - Automatically detect and respond to user's language preferences
  • πŸ”§ MCP Tool Integration - Call Model Context Protocol server tools from your Flutter app
  • πŸ’¬ Follow-up Messages - Programmatically send messages to ChatGPT on behalf of the user
  • πŸ”— External Navigation - Open external URLs while keeping ChatGPT session active
  • πŸ’Ύ Persistent Widget State - Save and restore state across sessions
  • πŸ“ Safe Area Management - Automatic handling of notches, status bars, and system UI
  • πŸ–±οΈ Device Capability Detection - Detect hover and touch capabilities for optimal UX
  • πŸ“‘ Reactive Streams - Observable streams for all global state changes

🚀 Installation #

Add openai_apps_sdk to your pubspec.yaml:

dependencies:
  openai_apps_sdk: ^1.0.2

Or install from the command line (use dart pub add in a Dart project, or flutter pub add in a Flutter project):

dart pub add openai_apps_sdk
flutter pub add openai_apps_sdk

📋 Prerequisites #

  • Dart SDK: ^3.9.0 or higher
  • Platform: Web only (this package is designed to run inside ChatGPT)
  • ChatGPT subscription: A paid OpenAI plan (Plus, Pro, Business, Enterprise, or Education) is required
  • Developer Mode: Must be enabled in your OpenAI account settings

🔧 Enabling Developer Mode #

To test and use applications built with this SDK, you must enable Developer Mode in your OpenAI account. This feature is only available with paid subscriptions.

Requirements #

  • ✅ Active OpenAI subscription (Pro, Plus, Business, Enterprise or Education)
  • ✅ Access to ChatGPT's Advanced Settings

Setup Steps #

  1. Open ChatGPT Settings

    • Click on your profile icon in the bottom-left corner
    • Select Settings
  2. Navigate to Apps & Connectors

    • In the settings menu, click on Apps & Connectors
  3. Access Advanced Settings

    • Click on Advanced settings
    • ⚠️ Note: This option is only visible if you have an active paid subscription
  4. Enable Developer Mode

    • Toggle the Developer Mode switch to ON

OpenAI Advanced Settings - Developer Mode

Important: Without Developer Mode enabled, your Flutter apps won't be able to run inside ChatGPT conversations.

What Developer Mode Enables #

Once activated, Developer Mode allows you to:

  • 🚀 Test and run custom apps inside ChatGPT
  • 🔧 Connect to Model Context Protocol (MCP) servers
  • 🧪 Debug and iterate on your applications

🌐 Connecting Your Flutter App to ChatGPT #

Once you've built your Flutter web application with this SDK, you need to connect it to ChatGPT so it can be served within conversations.

For step-by-step instructions on how to deploy and connect your app, visit the official OpenAI documentation:

📚 Connect to ChatGPT - Official Guide

📚 Usage #

Getting Started #

Import the package and access the singleton instance:

import 'package:openai_apps_sdk/openai_apps_sdk.dart';

void main() {
  final sdk = OpenAiAppsSDKBridge();
  
  // Your app is now connected to ChatGPT!
  print('Current theme: ${sdk.theme}');
  print('Device type: ${sdk.deviceType}');
}

1. Theme Synchronization #

Automatically adapt your app's theme to match ChatGPT's current theme:

import 'package:flutter/material.dart';
import 'package:openai_apps_sdk/openai_apps_sdk.dart';

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    final sdk = OpenAiAppsSDKBridge();
    
    return StreamBuilder<OpenAiTheme>(
      stream: sdk.themeStream,
      initialData: sdk.theme,
      builder: (context, snapshot) {
        final isDark = snapshot.data == OpenAiTheme.dark;
        
        return MaterialApp(
          theme: isDark ? ThemeData.dark() : ThemeData.light(),
          home: HomePage(),
        );
      },
    );
  }
}

2. Display Mode Management #

Request and respond to display mode changes:

import 'package:openai_apps_sdk/openai_apps_sdk.dart';

// Request fullscreen mode
Future<void> enterFullscreen() async {
  final sdk = OpenAiAppsSDKBridge();
  
  final grantedMode = await sdk.requestDisplayMode(
    OpenAiDisplayMode.fullscreen,
  );
  
  if (grantedMode == OpenAiDisplayMode.fullscreen) {
    print('Successfully entered fullscreen');
  } else {
    print('Fullscreen request denied, current mode: $grantedMode');
  }
}

// Listen to display mode changes
void listenToDisplayMode() {
  final sdk = OpenAiAppsSDKBridge();
  
  sdk.displayModeStream.listen((mode) {
    switch (mode) {
      case OpenAiDisplayMode.inline:
        // Optimize UI for inline display (limited space)
        print('Switched to inline mode');
        break;
      case OpenAiDisplayMode.fullscreen:
        // Expand UI to use full screen
        print('Switched to fullscreen mode');
        break;
      case OpenAiDisplayMode.pip:
        // Minimize UI for picture-in-picture
        print('Switched to PiP mode');
        break;
      default:
        break;
    }
  });
}
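
Beyond logging, the same stream can drive your layout directly. Here is a minimal sketch built only on the API shown above; CompactSummary and FullDetail are placeholder widgets for illustration, not part of this package:

import 'package:flutter/material.dart';
import 'package:openai_apps_sdk/openai_apps_sdk.dart';

// Swap between a compact and an expanded layout whenever ChatGPT
// changes the display mode.
class ModeAwareLayout extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    final sdk = OpenAiAppsSDKBridge();

    return StreamBuilder<OpenAiDisplayMode>(
      stream: sdk.displayModeStream,
      initialData: sdk.displayMode,
      builder: (context, snapshot) {
        if (snapshot.data == OpenAiDisplayMode.fullscreen) {
          return FullDetail(); // plenty of room available
        }
        return CompactSummary(); // inline, PiP, or unknown: stay compact
      },
    );
  }
}

// Placeholder widgets used only for this sketch.
class CompactSummary extends StatelessWidget {
  @override
  Widget build(BuildContext context) => Text('Compact summary');
}

class FullDetail extends StatelessWidget {
  @override
  Widget build(BuildContext context) => Text('Full detail');
}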

3. Configure Inline Mode Height (Experimental) #

Set custom heights for inline mode based on device type:

import 'package:openai_apps_sdk/openai_apps_sdk.dart';

void configureInlineMode() {
  final sdk = OpenAiAppsSDKBridge();
  
  sdk.initInlineModeSizeConfig(
    desktopHeight: 400,
    mobileHeight: 280,
    tabletHeight: 350,
    unknownDeviceHeight: 300,
  );
}
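
The package does not document a required call site for this configuration; a reasonable assumption is to apply it once at startup, before the first frame is built:

import 'package:flutter/material.dart';
import 'package:openai_apps_sdk/openai_apps_sdk.dart';

void main() {
  // Assumption for this sketch: configuring inline heights once, up front,
  // is sufficient. Adjust if your app needs to reconfigure at runtime.
  OpenAiAppsSDKBridge().initInlineModeSizeConfig(
    desktopHeight: 400,
    mobileHeight: 280,
    tabletHeight: 350,
    unknownDeviceHeight: 300,
  );

  runApp(MyApp()); // MyApp as defined in the theme example above
}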

4. Call MCP Server Tools #

Invoke tools on your Model Context Protocol server:

import 'package:openai_apps_sdk/openai_apps_sdk.dart';

Future<void> fetchUserData() async {
  final sdk = OpenAiAppsSDKBridge();
  
  try {
    final result = await sdk.callTool(
      'getUserData',
      {
        'userId': 42,
        'includeProfile': true,
      },
    );
    
    print('Result: $result');
  } catch (e) {
    print('Error calling tool: $e');
  }
}
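
callTool resolves to the tool's response as a String (see the API reference below). If your MCP tool replies with JSON, which is an assumption about your own server rather than something the SDK guarantees, you can decode the payload with dart:convert:

import 'dart:convert';
import 'package:openai_apps_sdk/openai_apps_sdk.dart';

// Sketch: decode the response of the hypothetical getUserData tool.
Future<Map<String, dynamic>?> fetchUserProfile(int userId) async {
  final sdk = OpenAiAppsSDKBridge();

  final raw = await sdk.callTool('getUserData', {'userId': userId});

  try {
    return jsonDecode(raw) as Map<String, dynamic>;
  } catch (_) {
    // The tool returned something other than a JSON object.
    return null;
  }
}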

5. Send Follow-up Messages #

Programmatically send messages to ChatGPT:

import 'package:openai_apps_sdk/openai_apps_sdk.dart';

Future<void> askForMoreInfo() async {
  final sdk = OpenAiAppsSDKBridge();
  
  await sdk.sendFollowUpMessage(
    'Can you provide more details about this data?',
  );
}

Future<void> guideConversation(String userSelection) async {
  final sdk = OpenAiAppsSDKBridge();
  
  await sdk.sendFollowUpMessage(
    'The user selected: $userSelection. What are the next steps?',
  );
}

6. External Navigation #

Navigate users to external resources:

import 'package:openai_apps_sdk/openai_apps_sdk.dart';

void openDocumentation() {
  final sdk = OpenAiAppsSDKBridge();
  sdk.openExternal('https://docs.example.com/api');
}

void openSupportPage() {
  final sdk = OpenAiAppsSDKBridge();
  sdk.openExternal('https://support.example.com');
}

7. Persistent Widget State #

Save and restore state across sessions (widget mode only):

import 'package:openai_apps_sdk/openai_apps_sdk.dart';

// Save state
Future<void> saveState(int counter, Map<String, dynamic> preferences) async {
  final sdk = OpenAiAppsSDKBridge();
  
  await sdk.setWidgetState({
    'counter': counter,
    'lastUpdated': DateTime.now().toIso8601String(),
    'preferences': preferences,
  });
}

// Restore state
void restoreState() {
  final sdk = OpenAiAppsSDKBridge();
  final state = sdk.widgetState;
  
  if (state != null) {
    final counter = state['counter'] ?? 0;
    final lastUpdated = state['lastUpdated'];
    final preferences = state['preferences'] as Map<String, dynamic>? ?? {};
    
    print('Restored counter: $counter');
    print('Last updated: $lastUpdated');
  }
}
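
In a widget, you would typically restore once when the state is created and save whenever the value changes. A sketch under that assumption (the SDK itself does not prescribe a particular lifecycle, and this example replaces the whole saved map on every write):

import 'package:flutter/material.dart';
import 'package:openai_apps_sdk/openai_apps_sdk.dart';

class PersistentCounter extends StatefulWidget {
  @override
  State<PersistentCounter> createState() => _PersistentCounterState();
}

class _PersistentCounterState extends State<PersistentCounter> {
  final _sdk = OpenAiAppsSDKBridge();
  int _counter = 0;

  @override
  void initState() {
    super.initState();
    // Restore the previously saved value, if any.
    _counter = (_sdk.widgetState?['counter'] as int?) ?? 0;
  }

  Future<void> _increment() async {
    setState(() => _counter++);
    // Persist the new value so it survives the next session.
    await _sdk.setWidgetState({'counter': _counter});
  }

  @override
  Widget build(BuildContext context) {
    return ElevatedButton(
      onPressed: _increment,
      child: Text('Count: $_counter'),
    );
  }
}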

8. Locale Detection and Localization #

Adapt your app to user's language preferences:

import 'package:flutter/material.dart';
import 'package:openai_apps_sdk/openai_apps_sdk.dart';

class LocalizedApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    final sdk = OpenAiAppsSDKBridge();
    
    return StreamBuilder<String>(
      stream: sdk.localeStream,
      initialData: sdk.locale,
      builder: (context, snapshot) {
        if (!snapshot.hasData) {
          return MaterialApp(home: HomePage());
        }
        
        final locale = snapshot.data!;
        final parts = locale.split('-');
        final languageCode = parts[0];
        final countryCode = parts.length > 1 ? parts[1] : null;
        
        return MaterialApp(
          locale: countryCode == null
              ? Locale(languageCode)
              : Locale(languageCode, countryCode),
          localizationsDelegates: [
            // Add your localization delegates
          ],
          supportedLocales: [
            Locale('en', 'US'),
            Locale('es', 'ES'),
            Locale('fr', 'FR'),
            // Add your supported locales
          ],
          home: HomePage(),
        );
      },
    );
  }
}

9. Safe Area Management #

Handle device notches, status bars, and system UI:

import 'package:flutter/material.dart';
import 'package:openai_apps_sdk/openai_apps_sdk.dart';

class SafeAreaExample extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    final sdk = OpenAiAppsSDKBridge();
    
    return StreamBuilder<OpenAiSafeArea>(
      stream: sdk.safeAreaStream,
      initialData: OpenAiSafeArea(insets: sdk.safeAreaInsets),
      builder: (context, snapshot) {
        if (!snapshot.hasData) return YourContent();
        
        final insets = snapshot.data!.insets;
        
        return Padding(
          padding: EdgeInsets.only(
            top: insets.top,
            bottom: insets.bottom,
            left: insets.left,
            right: insets.right,
          ),
          child: YourContent(),
        );
      },
    );
  }
}

10. Device Capability Detection #

Optimize UX based on device capabilities:

import 'package:flutter/material.dart';
import 'package:openai_apps_sdk/openai_apps_sdk.dart';

class AdaptiveButton extends StatelessWidget {
  final VoidCallback onPressed;
  final String label;
  
  const AdaptiveButton({
    required this.onPressed,
    required this.label,
  });
  
  @override
  Widget build(BuildContext context) {
    final sdk = OpenAiAppsSDKBridge();
    
    // Check device capabilities
    final hasHover = sdk.hasHoverCapability;
    final hasTouch = sdk.hasTouchCapability;
    final deviceType = sdk.deviceType;
    
    return ElevatedButton(
      onPressed: onPressed,
      style: ElevatedButton.styleFrom(
        // Increase touch target size for touch devices
        minimumSize: hasTouch ? Size(48, 48) : Size(36, 36),
      ),
      child: hasHover
          ? Tooltip(
              message: 'Click to $label',
              child: Text(label),
            )
          : Text(label),
    );
  }
}

// Device-specific layout
Widget buildAdaptiveLayout() {
  final sdk = OpenAiAppsSDKBridge();
  
  switch (sdk.deviceType) {
    case OpenAiDeviceType.mobile:
      return MobileLayout();
    case OpenAiDeviceType.tablet:
      return TabletLayout();
    case OpenAiDeviceType.desktop:
      return DesktopLayout();
    default:
      return ResponsiveLayout();
  }
}

11. Access Tool Input Parameters #

Retrieve the parameters passed when your tool was invoked:

import 'package:openai_apps_sdk/openai_apps_sdk.dart';

void handleToolInput() {
  final sdk = OpenAiAppsSDKBridge();
  final input = sdk.toolInput;
  
  final userId = input['userId'] as int?;
  final options = input['options'] as Map<String, dynamic>?;
  
  print('Tool invoked with:');
  print('  User ID: $userId');
  print('  Options: $options');
}

12. Access Tool Output and Metadata #

Retrieve the output and metadata from the MCP tool that triggered your app:

import 'package:openai_apps_sdk/openai_apps_sdk.dart';

void handleToolData() {
  final sdk = OpenAiAppsSDKBridge();
  
  // Access tool output
  final output = sdk.toolOutput;
  if (output != null) {
    final result = output['result'];
    final data = output['data'] as Map<String, dynamic>?;
    print('Tool output: $result');
    print('Data: $data');
  }
  
  // Access tool execution metadata
  final metadata = sdk.toolResponseMetadata;
  if (metadata != null) {
    final executionTime = metadata['executionTime'];
    final status = metadata['status'];
    print('Execution time: $executionTime ms');
    print('Status: $status');
  }
}

13. Listen to Global State Changes #

Monitor all global state changes with a single stream:

import 'package:openai_apps_sdk/openai_apps_sdk.dart';

void monitorGlobalChanges() {
  final sdk = OpenAiAppsSDKBridge();
  
  sdk.globalsStream.listen((globals) {
    if (globals.theme != null) {
      print('Theme changed to: ${globals.theme}');
    }
    if (globals.displayMode != null) {
      print('Display mode changed to: ${globals.displayMode}');
    }
    if (globals.maxHeight != null) {
      print('Max height changed to: ${globals.maxHeight}');
    }
    if (globals.locale != null) {
      print('Locale changed to: ${globals.locale}');
    }
    if (globals.safeArea != null) {
      print('Safe area changed: ${globals.safeArea}');
    }
  });
}
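
If you subscribe with listen() from inside a widget, cancel the subscription when the widget is disposed; this is standard Flutter stream hygiene rather than anything specific to this package:

import 'dart:async';

import 'package:flutter/material.dart';
import 'package:openai_apps_sdk/openai_apps_sdk.dart';

class GlobalsLogger extends StatefulWidget {
  @override
  State<GlobalsLogger> createState() => _GlobalsLoggerState();
}

class _GlobalsLoggerState extends State<GlobalsLogger> {
  StreamSubscription<PartialOpenAiGlobals>? _subscription;

  @override
  void initState() {
    super.initState();
    _subscription = OpenAiAppsSDKBridge().globalsStream.listen((globals) {
      debugPrint('Globals changed: $globals');
    });
  }

  @override
  void dispose() {
    // Stop listening when the widget leaves the tree.
    _subscription?.cancel();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) => const SizedBox.shrink();
}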

🎯 Example Application #

A complete example application demonstrating all the features of this package is available in the example folder.

Running the Example #

Testing Flutter apps inside ChatGPT requires a different setup than traditional Flutter web apps. Follow these steps carefully:

Prerequisites

  • Node.js: Required to run the MCP server
  • ngrok or Cloudflared: To expose your localhost to the public internet
  • Developer Mode: Must be enabled in ChatGPT (see Enabling Developer Mode)

Step 1: Build the Flutter App

First, build your Flutter web application:

  1. Navigate to the example directory:
cd example
  2. Build the Flutter app for web:
flutter build web

This will generate the web build in the example/build/web directory. The important files generated are:

  • main.dart.js: The compiled Dart code
  • assets/: All app assets (images, fonts, etc.)

These files will be served by the MCP server. To see how the MCP server uses these files, check mcp_example/src/server.ts.

Step 2: Expose Your Localhost

Expose your local server (port 8000) to the public internet using one of these tools:

Option A: Using ngrok

ngrok http 8000

Option B: Using Cloudflared

cloudflared tunnel --url http://localhost:8000

Copy the public URL provided (e.g., https://abc123.ngrok.io).

Step 3: Configure the MCP Server

  1. Navigate to the MCP example directory:
cd mcp_example
  2. Create a .env file with your configuration:
PORT=8000
CDN_URL=<YOUR_PUBLIC_URL>/

Replace <YOUR_PUBLIC_URL> with the URL from ngrok or Cloudflared.

Step 4: Install Dependencies and Run MCP Server

  1. Install Node.js dependencies:
npm install
  2. Start the MCP server:
npm run dev

The server should now be running on port 8000 and accessible via your public URL.

Step 5: Connect MCP to ChatGPT

  1. Open ChatGPT and go to Settings
  2. Navigate to Apps & Connectors
  3. Click Create to add a new connection
  4. Fill in the form:
    • App Name: Your app name
    • MCP URL: Your public URL from Step 2
    • Icon/Image: Upload an icon (optional)
    • Other fields: As explained in the official documentation

Form to connect your app in ChatGPT Settings

Step 6: Test Your App in ChatGPT

  1. Ensure Developer Mode is enabled (see requirements above)

  2. In the ChatGPT interface, click the "+" (More) button at the bottom

  3. Search for and select your app from the list

Selecting your app in ChatGPT

  4. Send a prompt to activate your app:
show me the demo app

Your Flutter app should now appear inside the ChatGPT conversation! 🎉

What the Example Demonstrates #

The example app showcases:

  • ✅ Theme synchronization with ChatGPT
  • ✅ Display mode management (inline, fullscreen, PiP)
  • ✅ Locale detection and localization
  • ✅ Safe area handling
  • ✅ Device capability detection
  • ✅ MCP tool integration
  • ✅ Widget state persistence

For more details, check out the example application.

📖 API Reference #

Core Methods #

| Method | Description | Returns |
|---|---|---|
| callTool(name, args) | Invoke a tool on your MCP server | Future<String> |
| sendFollowUpMessage(prompt) | Send a message to ChatGPT | Future<void> |
| openExternal(href) | Open an external URL | void |
| requestDisplayMode(mode) | Request a display mode change | Future<OpenAiDisplayMode> |
| setWidgetState(state) | Update persistent widget state | Future<void> |
| initInlineModeSizeConfig() | Configure inline mode height | void |

Reactive Streams #

| Stream | Description | Type |
|---|---|---|
| globalsStream | All global state changes | Stream<PartialOpenAiGlobals> |
| themeStream | Theme changes | Stream<OpenAiTheme> |
| displayModeStream | Display mode changes | Stream<OpenAiDisplayMode> |
| safeAreaStream | Safe area inset changes | Stream<OpenAiSafeArea> |
| localeStream | Locale changes | Stream<String> |

Global State Getters #

| Getter | Description | Type |
|---|---|---|
| theme | Current ChatGPT theme | OpenAiTheme |
| locale | User's locale | String? |
| maxHeight | Maximum available height | double? |
| displayMode | Current display mode | OpenAiDisplayMode |
| deviceType | Device type | OpenAiDeviceType? |
| hasHoverCapability | Supports hover | bool |
| hasTouchCapability | Supports touch | bool |
| safeAreaInsets | Safe area insets | OpenAiSafeAreaInsets |
| toolInput | Tool input parameters | Map<String, dynamic> |
| toolOutput | Tool output result | Map<String, dynamic>? |
| toolResponseMetadata | Tool execution metadata | Map<String, dynamic>? |
| widgetState | Persistent widget state | Map<String, dynamic>? |
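
As a quick recap of the getters above, here is a small helper that dumps the current global state; every name comes straight from the table, and the values depend on the ChatGPT host at runtime:

import 'package:openai_apps_sdk/openai_apps_sdk.dart';

void dumpGlobals() {
  final sdk = OpenAiAppsSDKBridge();

  print('theme:         ${sdk.theme}');
  print('locale:        ${sdk.locale}');
  print('displayMode:   ${sdk.displayMode}');
  print('deviceType:    ${sdk.deviceType}');
  print('maxHeight:     ${sdk.maxHeight}');
  print('hover / touch: ${sdk.hasHoverCapability} / ${sdk.hasTouchCapability}');
  print('safeArea:      ${sdk.safeAreaInsets}');
  print('widgetState:   ${sdk.widgetState}');
}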

Enums #

OpenAiTheme

  • light - Light theme
  • dark - Dark theme
  • unknown - Unknown theme

OpenAiDisplayMode

  • inline - Inline display (within conversation)
  • fullscreen - Fullscreen display
  • pip - Picture-in-picture display
  • unknown - Unknown display mode

OpenAiDeviceType

  • mobile - Mobile phone
  • tablet - Tablet device
  • desktop - Desktop computer
  • unknown - Unknown device type

🌐 Platform Support #

| Platform | Supported |
|---|---|
| Web | ✅ |
| Android | ❌ |
| iOS | ❌ |
| macOS | ❌ |
| Windows | ❌ |
| Linux | ❌ |

Note: This package is specifically designed to run inside ChatGPT's web environment and requires the OpenAI Apps SDK to be available.

🀝 Contributing #

Contributions are welcome!

Development Setup #

  1. Clone the repository:
git clone https://github.com/gbmarcos/openai_apps_sdk_pub.git
cd openai_apps_sdk_pub
  2. Install dependencies:
dart pub get
  3. Check code quality:
dart analyze
dart format .

📄 License #

This project is licensed under the BSD 3-Clause License - see the LICENSE file for details.

⚠️ Disclaimer #

This is an unofficial package and is not affiliated with, maintained, authorized, endorsed, or sponsored by OpenAI.

🙏 A Personal Note #

Buy me a... No, no, no. I don't want your money; I want you to succeed. If you succeed, and you will, just take a second to bless my path.


Made with ❤️ for the Flutter community
