
Together AI (Flux.1 Schnell)
Generates high-quality images from text prompts using Together AI's Flux.1 Schnell model, with customizable dimensions, clear error handling, and optional saving of images to disk.
What it does
- Generate images from text descriptions
- Customize image dimensions and quality settings
- Save generated images as PNG files
- Handle multiple image generation requests
- Validate prompts and API parameters
About Together AI (Flux.1 Schnell)
Together AI (Flux.1 Schnell) is a community-built MCP server published by manascb1344 that provides AI assistants with tools and capabilities via the Model Context Protocol. It generates images with Together AI's Flux.1 Schnell model and offers customizable dimensions. It is categorized under AI/ML.
How to install
You can install Together AI (Flux.1 Schnell) in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
License
Together AI (Flux.1 Schnell) is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
Image Generation MCP Server
A Model Context Protocol (MCP) server that enables seamless generation of high-quality images using the Flux.1 Schnell model via Together AI. This server provides a standardized interface to specify image generation parameters.
Features
- High-quality image generation powered by the Flux.1 Schnell model
- Support for customizable dimensions (width and height)
- Clear error handling for prompt validation and API issues
- Easy integration with MCP-compatible clients
- Optional image saving to disk in PNG format
Installation
npm install together-mcp
Or run directly:
npx together-mcp@latest
Configuration
Add to your MCP server configuration:
{
"mcpServers": {
"together-image-gen": {
"command": "npx",
"args": ["together-mcp@latest -y"],
"env": {
"TOGETHER_API_KEY": "<API KEY>"
}
}
}
}
Usage
The server provides one tool: generate_image
Using generate_image
This tool has only one required parameter - the prompt. All other parameters are optional and use sensible defaults if not provided.
Parameters
{
// Required
prompt: string; // Text description of the image to generate
// Optional with defaults
model?: string; // Default: "black-forest-labs/FLUX.1-schnell-Free"
width?: number; // Default: 1024 (min: 128, max: 2048)
height?: number; // Default: 768 (min: 128, max: 2048)
steps?: number; // Default: 1 (min: 1, max: 100)
n?: number; // Default: 1 (max: 4)
response_format?: string; // Default: "b64_json" (options: ["b64_json", "url"])
image_path?: string; // Optional: Path to save the generated image as PNG
}
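The constraints above can be checked client-side before a request is sent. A minimal sketch, assuming Node; the helper name is hypothetical and the clamp ranges mirror the documented min/max values and defaults:

```javascript
// Hypothetical client-side validator mirroring the documented constraints.
// Throws on a missing/empty prompt; clamps numeric parameters into range.
function validateImageArgs(args) {
  if (typeof args.prompt !== "string" || args.prompt.trim() === "") {
    throw new Error("prompt is required and must be a non-empty string");
  }
  const clamp = (value, min, max, fallback) =>
    value === undefined ? fallback : Math.min(max, Math.max(min, value));
  return {
    prompt: args.prompt,
    model: args.model ?? "black-forest-labs/FLUX.1-schnell-Free",
    width: clamp(args.width, 128, 2048, 1024),
    height: clamp(args.height, 128, 2048, 768),
    steps: clamp(args.steps, 1, 100, 1),
    n: clamp(args.n, 1, 4, 1),
    response_format: args.response_format ?? "b64_json",
  };
}
```

Validating locally surfaces out-of-range values before they cost an API call.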
Minimal Request Example
Only the prompt is required:
{
"name": "generate_image",
"arguments": {
"prompt": "A serene mountain landscape at sunset"
}
}
Full Request Example with Image Saving
Override any defaults and specify a path to save the image:
{
"name": "generate_image",
"arguments": {
"prompt": "A serene mountain landscape at sunset",
"width": 1024,
"height": 768,
"steps": 20,
"n": 1,
"response_format": "b64_json",
"model": "black-forest-labs/FLUX.1-schnell-Free",
"image_path": "/path/to/save/image.png"
}
}
Response Format
The response will be a JSON object containing:
{
"id": string, // Generation ID
"model": string, // Model used
"object": "list",
"data": [
{
"timings": {
"inference": number // Time taken for inference
},
"index": number, // Image index
"b64_json": string // Base64 encoded image data (if response_format is "b64_json")
// OR
"url": string // URL to generated image (if response_format is "url")
}
]
}
If image_path was provided and the save was successful, the response will include confirmation of the save location.
Default Values
If not specified in the request, these defaults are used:
- model: "black-forest-labs/FLUX.1-schnell-Free"
- width: 1024
- height: 768
- steps: 1
- n: 1
- response_format: "b64_json"
Important Notes
- Only the prompt parameter is required - all optional parameters use defaults if not provided
- When provided, parameters must meet their constraints (e.g., width/height ranges)
- Base64 responses can be large - use URL format for larger images
- When saving images, ensure the specified directory exists and is writable
Prerequisites
- Node.js >= 16
- Together AI API key
- Sign in at api.together.xyz
- Navigate to API Keys settings
- Click "Create" to generate a new API key
- Copy the generated key for use in your MCP configuration
Dependencies
{
"@modelcontextprotocol/sdk": "0.6.0",
"axios": "^1.6.7"
}
Development
Clone and build the project:
git clone https://github.com/manascb1344/together-mcp-server
cd together-mcp-server
npm install
npm run build
Available Scripts
- npm run build - Build the TypeScript project
- npm run watch - Watch for changes and rebuild
- npm run inspector - Run the MCP inspector
Contributing
Contributions are welcome! Please follow these steps:
- Fork the repository
- Create a new branch (feature/my-new-feature)
- Commit your changes
- Push the branch to your fork
- Open a Pull Request
Feature requests and bug reports can be submitted via GitHub Issues. Please check existing issues before creating a new one.
For significant changes, please open an issue first to discuss your proposed changes.
License
This project is licensed under the MIT License. See the LICENSE file for details.