Masamune AI
[GitHub](https://github.com/mathrunet) | [YouTube](https://www.youtube.com/c/mathrunetchannel) | [Packages](https://pub.flutter-io.cn/publishers/mathru.net/packages) | [X](https://x.com/mathru) | [LinkedIn](https://www.linkedin.com/in/mathrunet/) | [mathru.net](https://mathru.net)
Overview
`masamune_ai` is a generative AI integration library for the Masamune framework.

- Configure AI behavior through the adapter (`AIMasamuneAdapter`), including Model Context Protocol (MCP) settings.
- Use `AIThread` to manage multi-turn conversations and `AISingle` for single-turn prompts.
- Manage tool integrations via the MCP client (`McpClient`) and `AITool`.
Setup
- Add the package to your project.
Note: This is the base package. You'll also need a concrete implementation:

- `masamune_ai_firebase` for Firebase Vertex AI / Gemini
- `masamune_ai_openai` for OpenAI ChatGPT

```bash
flutter pub add masamune_ai
flutter pub add masamune_ai_firebase # or masamune_ai_openai
```

- Register an AI adapter in `MasamuneApp`.
```dart
void main() {
  runApp(
    MasamuneApp(
      appRef: appRef,
      adapters: const [
        // Use FirebaseAIMasamuneAdapter or OpenaiAIMasamuneAdapter
        // See implementation-specific packages for details
      ],
    ),
  );
}
```
AIMasamuneAdapter
- Implements common AI initialization and configuration logic.
- Provide the following when extending:
  - `defaultConfig`: an `AIConfig` describing the model, system prompt, and response schema.
  - `initialize` and `generateContent`: connect to the actual AI service.
  - `mcpServerConfig`, `mcpClientConfig`, `mcpFunctions`: settings for MCP integration.
  - `contentFilter` and `onGenerateFunctionCallingConfig`: preprocess content or control function-calling retries.
Runtime Adapter Example
Use `RuntimeAIMasamuneAdapter` for tests or mock implementations.
```dart
class MockAdapter extends RuntimeAIMasamuneAdapter {
  // Note: the constructor cannot be const because the function literal
  // passed to super is not a constant expression.
  MockAdapter()
      : super(
          onGenerateContent: (contents, config) async {
            final response = AIContent.model(text: "Mock response!");
            response.complete();
            return response;
          },
        );
}
```
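The mock adapter can then be registered like any other adapter, following the `MasamuneApp` setup shown earlier (a sketch, not a verified configuration):

```dart
// Sketch: registering the MockAdapter from the example above.
void main() {
  runApp(
    MasamuneApp(
      appRef: appRef,
      adapters: [MockAdapter()],
    ),
  );
}
```

This makes the whole app respond with the mock output, which is useful for widget tests that should not hit a real AI service.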
Concrete Implementations
Firebase AI Adapter (masamune_ai_firebase)
- Provides `FirebaseAIMasamuneAdapter` for Firebase Vertex AI / Gemini models
- Requires Firebase initialization and supports `FirebaseAIModel` (gemini-2.0-flash, etc.)
- Includes Vertex AI function-calling support for MCP tools
- See the `masamune_ai_firebase` package for detailed setup
OpenAI Adapter (masamune_ai_openai)
- Provides `OpenaiAIMasamuneAdapter` for OpenAI ChatGPT models
- Requires an OpenAI API key and supports `OpenaiAIModel` (gpt-4o, gpt-5-mini, etc.)
- Streams responses from the Chat Completions API
- Includes `OpenaiChatGPTFunctionsAction` for backend Functions integration
- See the `masamune_ai_openai` package for detailed setup
AIConfig and AITool
- `AIConfig`: holds the model name, system prompt (`AIContent.system`), and response schema (`AISchema`).
- `AITool`: defines a tool that can be invoked through MCP or function calls.
```dart
final weatherTool = AITool(
  name: "weather",
  description: "Get the current weather",
  parameters: {
    "city": AISchema.string(description: "City name"),
  },
);
```
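A matching `AIConfig` might be sketched from the description above; the `systemPromptContent` parameter name is taken from the customization notes later in this document, but the exact constructor signature is an assumption:

```dart
// Hypothetical sketch: the systemPromptContent parameter name is an
// assumption based on the descriptions in this document.
final config = AIConfig(
  model: "gemini-2.0-flash",
  systemPromptContent: AIContent.system(
    "You are a weather assistant. Use the weather tool when asked.",
  ),
);
```

Passing this config together with `{weatherTool}` to `generateContent` lets the model decide when to call the tool.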
Conversation Management with AIThread
- Maintains a list of `AIContent` exchanges and supports multi-turn conversations.
- Call `initialize` to set up the model, then `generateContent` to send user input and receive responses.
```dart
final thread = ref.app.controller(
  AIThread.query(
    threadId: "chat-1",
    initialContents: [
      AIContent.text("Hello!"),
    ],
  ),
);

final updatedContents = await thread.generateContent(
  [AIContent.text("Tell me the latest news")],
  config: AIConfig(model: "gpt-4o-mini"),
  tools: {weatherTool},
);
```
Single Interaction with AISingle
- Use when only one response is needed.
```dart
final single = ref.app.controller(
  AISingle.query(
    config: AIConfig(model: "gpt-4o-mini"),
  ),
);

final result = await single.generateContent([
  AIContent.text("Summarize this"),
]);
```
MCP Client Integration
`McpClient` loads tools from an MCP server and processes AI function calls.
```dart
final mcpClient = ref.app.controller(McpClient.query());
await mcpClient.load();
final toolResult = await mcpClient.call("weather", {"city": "Tokyo"});
```
- When `mcpClientConfig` and `mcpFunctions` are configured in the adapter, `AIThread` and `AISingle` automatically invoke tools through MCP during `generateContent`.
Working with AIContent
- Represents messages exchanged with the model.
- Factory constructors support multiple data types: `AIContent.text`, `AIContent.png`, `AIContent.system`, and more.
- Use `AIContentFunctionCallPart` and `AIContentFunctionResponsePart` to model structured function calls.
- Streamed responses can be handled via `add`, `complete`, and `error`.
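For illustration, the factories and part classes named above might be used like this; the constructor parameter names (`name`, `args`, `response`) are assumptions, not confirmed API:

```dart
// Illustrative only: parameter names (name, args, response) are assumed.
final question = AIContent.text("What's the weather in Tokyo?");

// A function call requested by the model...
final call = AIContentFunctionCallPart(
  name: "weather",
  args: {"city": "Tokyo"},
);

// ...and the tool result sent back to the model.
final reply = AIContentFunctionResponsePart(
  name: "weather",
  response: {"condition": "sunny"},
);
```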
Typical Workflow
- Register a custom `AIMasamuneAdapter` in `MasamuneApp`.
- Obtain `AIThread` or `AISingle` via `ref.app.controller`.
- Optionally load MCP tools.
- Call `generateContent` with user `AIContent`.
- Render the AI response from the controller's `value`.
- Handle function calls via MCP or custom logic.
Common Customizations
- System Prompt: Provide `AIConfig.systemPromptContent` with `AIContent.system`.
- Response Schema: Define structured JSON outputs using `AISchema`.
- Usage Tracking: Inspect `onGeneratedContentUsage` for prompt and candidate token counts.
- Content Filtering: Apply `contentFilter` or per-call filters to sanitize requests.
- Function Calling Control: Customize retries or forced execution with `onGenerateFunctionCallingConfig`.
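A structured response schema might look like the following sketch. Only `AISchema.string` appears in this document; `AISchema.object`, `AISchema.number`, and the `responseSchema` parameter are assumptions:

```dart
// Hedged sketch of a structured JSON output schema.
// AISchema.object / AISchema.number and the responseSchema parameter
// are assumptions; only AISchema.string appears in this document.
final config = AIConfig(
  model: "gpt-4o-mini",
  responseSchema: AISchema.object(
    properties: {
      "summary": AISchema.string(description: "One-line summary"),
      "confidence": AISchema.number(description: "Score from 0.0 to 1.0"),
    },
  ),
);
```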
Development Tips
- Follow Masamune controller patterns (`appRef`, `ref.app.controller`) for lifecycle-aware access.
- Monitor the `loading` future on `AIContent` for streaming updates.
- Even without MCP, the adapter's `onFunctionCall` must return an empty list to satisfy the interface.
GitHub Sponsors
Sponsors are always welcome. Thank you for your support!
Libraries
- masamune_ai
- This package is designed to interact with generative AI within the Masamune framework.
- models/vector