fireflies-migration-deep-dive
Plan and execute major Fireflies.ai migrations and re-architectures using the strangler fig pattern. Use when migrating to or from Fireflies.ai, performing major version upgrades, or re-platforming existing integrations to Fireflies.ai. Trigger with phrases like "migrate fireflies", "fireflies migration", "switch to fireflies", "fireflies replatform", "fireflies upgrade major".
Install
```shell
mkdir -p .claude/skills/fireflies-migration-deep-dive && curl -L -o skill.zip "https://mcp.directory/api/skills/download/5386" && unzip -o skill.zip -d .claude/skills/fireflies-migration-deep-dive && rm skill.zip
```
Installs to `.claude/skills/fireflies-migration-deep-dive`.
About this skill
Fireflies.ai Migration Deep Dive
Current State
```shell
!npm list graphql graphql-request 2>/dev/null || echo 'No graphql packages'
```
Overview
Migrate to Fireflies.ai from other transcription platforms or custom recording systems. Covers historical recording import via uploadAudio, adapter pattern for gradual cutover, and data validation post-migration.
Migration Types
| Scenario | Approach | Timeline |
|---|---|---|
| Fresh start (no history) | Configure Fireflies bot, done | 1 day |
| Import recordings | Batch uploadAudio | 1-2 weeks |
| Switch from competitor | Parallel run + gradual cutover | 2-4 weeks |
| Enterprise rollout | Phased department-by-department | 1-2 months |
Instructions
Step 1: Pre-Migration Assessment
```typescript
// Inventory your current meeting data
interface MigrationInventory {
  totalRecordings: number;
  totalHours: number;
  formats: string[]; // mp3, mp4, wav, m4a, ogg
  averageDuration: number; // minutes
  dateRange: { oldest: string; newest: string };
  platforms: string[]; // Zoom, Teams, etc.
}

// Fireflies supports: mp3, mp4, wav, m4a, ogg
// Size limits: 200MB audio, 100MB video (free), 1.5GB video (paid)
// Minimum: 50KB (can bypass with bypass_size_check: true)
```
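The inventory can be computed mechanically from whatever metadata your current system exports. A minimal sketch, assuming a hypothetical `RecordingFile` shape for your source system's records (these field names are illustrative, not any Fireflies API):

```typescript
interface MigrationInventory {
  totalRecordings: number;
  totalHours: number;
  formats: string[];
  averageDuration: number; // minutes
  dateRange: { oldest: string; newest: string };
  platforms: string[];
}

// Hypothetical record exported from your current system
interface RecordingFile {
  path: string;
  durationMinutes: number;
  recordedAt: string; // ISO date, so string sort = chronological sort
  platform: string;
}

function buildInventory(files: RecordingFile[]): MigrationInventory {
  const dates = files.map(f => f.recordedAt).sort();
  const totalMinutes = files.reduce((sum, f) => sum + f.durationMinutes, 0);
  return {
    totalRecordings: files.length,
    totalHours: totalMinutes / 60,
    formats: [...new Set(files.map(f => f.path.split(".").pop() ?? ""))],
    averageDuration: files.length ? totalMinutes / files.length : 0,
    dateRange: { oldest: dates[0] ?? "", newest: dates[dates.length - 1] ?? "" },
    platforms: [...new Set(files.map(f => f.platform))],
  };
}
```

Cross-check the resulting `formats` list against the supported-format and size-limit notes above before committing to a batch upload.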
Step 2: Batch Upload Historical Recordings
```typescript
const FIREFLIES_API = "https://api.fireflies.ai/graphql";

interface UploadJob {
  url: string; // Must be a publicly accessible HTTPS URL
  title: string;
  attendees?: { displayName: string; email: string }[];
  referenceId: string; // Your internal ID for tracking
}

async function batchUpload(jobs: UploadJob[]) {
  const results: { id: string; status: string; error?: string }[] = [];
  for (const job of jobs) {
    try {
      const res = await fetch(FIREFLIES_API, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          Authorization: `Bearer ${process.env.FIREFLIES_API_KEY}`,
        },
        body: JSON.stringify({
          query: `
            mutation($input: AudioUploadInput) {
              uploadAudio(input: $input) {
                success title message
              }
            }
          `,
          variables: {
            input: {
              url: job.url,
              title: job.title,
              attendees: job.attendees,
              client_reference_id: job.referenceId,
              webhook: process.env.WEBHOOK_URL,
            },
          },
        }),
      });
      const json = await res.json();
      if (json.errors) {
        results.push({ id: job.referenceId, status: "error", error: json.errors[0].message });
      } else {
        results.push({ id: job.referenceId, status: "uploaded" });
      }
    } catch (err) {
      results.push({ id: job.referenceId, status: "error", error: (err as Error).message });
    }
    // Rate limit: wait between uploads
    await new Promise(r => setTimeout(r, 2000));
  }
  return results;
}
```
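After a batch run, it helps to condense the per-job results into a quick report before deciding what to retry. A small helper over the result shape `batchUpload` returns (the helper name is illustrative):

```typescript
interface JobResult {
  id: string;
  status: string; // "uploaded" | "error"
  error?: string;
}

// Summarize a batchUpload() result set and collect the IDs worth retrying.
function summarizeBatch(results: JobResult[]) {
  const failed = results.filter(r => r.status === "error");
  return {
    total: results.length,
    uploaded: results.filter(r => r.status === "uploaded").length,
    failed: failed.length,
    retryIds: failed.map(r => r.id), // feed these back into the next pass
  };
}
```

Because `client_reference_id` deduplicates on the Fireflies side, re-running only the `retryIds` subset is safe.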
Step 3: Upload with Authenticated URLs
If your recordings are behind auth (S3, GCS):
```typescript
// Bearer token auth (e.g., pre-signed URLs with auth headers)
const upload = {
  url: "https://storage.example.com/recordings/meeting-123.mp3",
  title: "Q1 Planning",
  download_auth: {
    type: "bearer_token",
    bearer: { token: "your-storage-access-token" },
  },
  client_reference_id: "meeting-123",
};

// Basic auth
const uploadBasicAuth = {
  url: "https://recordings.example.com/files/meeting-456.mp3",
  title: "Sprint Review",
  download_auth: {
    type: "basic_auth",
    basic: { username: "api-user", password: "api-pass" },
  },
  client_reference_id: "meeting-456",
};
```
Step 4: Direct File Upload (No Public URL)
For files that can't be made publicly accessible:
```typescript
// Step 1: Get a pre-signed upload URL from Fireflies
const { createUploadUrl } = await firefliesQuery(`
  mutation($input: CreateUploadUrlInput!) {
    createUploadUrl(input: $input) {
      url
      meeting_id
    }
  }
`, { input: { /* file metadata */ } });

// Step 2: Upload the file directly to the pre-signed URL
await fetch(createUploadUrl.url, {
  method: "PUT",
  body: fileBuffer,
});

// Step 3: Confirm the upload
await firefliesQuery(`
  mutation($input: ConfirmUploadInput!) {
    confirmUpload(input: $input) {
      success
    }
  }
`, { input: { meeting_id: createUploadUrl.meeting_id } });
```
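Steps 4 through 7 call a `firefliesQuery` helper that is not defined in the snippets themselves. A minimal sketch, assuming a Node 18+ global `fetch` and the same endpoint and bearer auth used in Step 2:

```typescript
const FIREFLIES_API = "https://api.fireflies.ai/graphql";

// Thin GraphQL wrapper: POSTs a query, throws on GraphQL errors,
// and returns the unwrapped `data` payload.
async function firefliesQuery<T = any>(
  query: string,
  variables?: Record<string, unknown>
): Promise<T> {
  const res = await fetch(FIREFLIES_API, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.FIREFLIES_API_KEY}`,
    },
    body: JSON.stringify({ query, variables }),
  });
  const json = await res.json();
  if (json.errors?.length) {
    throw new Error(json.errors[0].message);
  }
  return json.data as T;
}
```

A production version would also surface HTTP status codes and back off on 429 responses; this sketch only covers the happy path plus GraphQL errors.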
Step 5: Track Migration Progress via Webhooks
```typescript
// Webhook handler tracks which uploads have completed
const migrationTracker = new Map<string, { status: string; meetingId?: string }>();

async function handleMigrationWebhook(event: any) {
  if (event.eventType === "Transcription completed" && event.clientReferenceId) {
    migrationTracker.set(event.clientReferenceId, {
      status: "completed",
      meetingId: event.meetingId,
    });
    // Check progress
    const completed = [...migrationTracker.values()].filter(v => v.status === "completed").length;
    const total = migrationTracker.size;
    console.log(`Migration progress: ${completed}/${total} (${Math.round((completed / total) * 100)}%)`);
  }
}
```
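Before trusting webhook events for migration bookkeeping, verify they actually came from Fireflies. Fireflies signs webhook payloads with an HMAC-SHA256 of the raw body; the exact header name and digest format ("sha256=&lt;hex&gt;" is assumed here) should be confirmed against the current Fireflies webhook docs. A sketch using Node's standard crypto module:

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Returns true when the signature header matches HMAC-SHA256(rawBody, secret).
// Assumed header format: "sha256=<hex digest>" — confirm against Fireflies docs.
function verifyWebhookSignature(
  rawBody: string,
  signatureHeader: string,
  secret: string
): boolean {
  const expected = "sha256=" + createHmac("sha256", secret).update(rawBody).digest("hex");
  const a = Buffer.from(expected);
  const b = Buffer.from(signatureHeader);
  // timingSafeEqual requires equal lengths; compare in constant time otherwise
  return a.length === b.length && timingSafeEqual(a, b);
}
```

Call this on the raw request body (before JSON parsing) and reject the event when it returns false, so a forged POST cannot mark uploads as completed.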
Step 6: Validate Migration
```typescript
async function validateMigration(expectedIds: string[]) {
  const results = {
    found: 0,
    missing: [] as string[],
    hasSummary: 0,
    hasSentences: 0,
  };
  for (const refId of expectedIds) {
    const tracker = migrationTracker.get(refId);
    if (!tracker?.meetingId) {
      results.missing.push(refId);
      continue;
    }
    results.found++;
    // Verify transcript quality
    const { transcript } = await firefliesQuery(`
      query($id: String!) {
        transcript(id: $id) {
          id title
          sentences { text }
          summary { overview action_items }
        }
      }
    `, { id: tracker.meetingId });
    if (transcript.summary?.overview) results.hasSummary++;
    if (transcript.sentences?.length > 0) results.hasSentences++;
    await new Promise(r => setTimeout(r, 1100)); // Rate limit
  }
  console.log(`Validation: ${results.found}/${expectedIds.length} found`);
  console.log(`With summary: ${results.hasSummary}`);
  console.log(`With sentences: ${results.hasSentences}`);
  console.log(`Missing: ${results.missing.length}`);
  return results;
}
```
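The `missing` reference IDs from validation map straight back to upload jobs for another pass. A small helper, reusing the `UploadJob` shape from Step 2 (the helper name is illustrative):

```typescript
interface UploadJob {
  url: string;
  title: string;
  referenceId: string;
}

// Select the original jobs whose reference IDs came back missing,
// ready to feed into another batchUpload() pass.
function jobsToRetry(allJobs: UploadJob[], missing: string[]): UploadJob[] {
  const wanted = new Set(missing);
  return allJobs.filter(j => wanted.has(j.referenceId));
}
```

Because uploads carry `client_reference_id`, re-submitting this subset will not create duplicates on the Fireflies side.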
Step 7: Adapter Pattern for Gradual Cutover
```typescript
interface TranscriptionService {
  getTranscript(id: string): Promise<any>;
  searchTranscripts(query: string): Promise<any[]>;
}

class FirefliesService implements TranscriptionService {
  async getTranscript(id: string) {
    return firefliesQuery(`
      query($id: String!) {
        transcript(id: $id) {
          id title date duration
          sentences { speaker_name text start_time end_time }
          summary { overview action_items }
        }
      }
    `, { id });
  }

  async searchTranscripts(query: string) {
    const data = await firefliesQuery(`
      query($keyword: String) {
        transcripts(keyword: $keyword, limit: 20) {
          id title date duration
        }
      }
    `, { keyword: query });
    return data.transcripts;
  }
}

// Gradual cutover with feature flag.
// LegacyTranscriptionService is your wrapper around the outgoing provider,
// implementing the same TranscriptionService interface.
function getTranscriptionService(): TranscriptionService {
  if (process.env.USE_FIREFLIES === "true") {
    return new FirefliesService();
  }
  return new LegacyTranscriptionService();
}
```
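A boolean flag cuts everyone over at once. For a gradual ramp, you can instead route a deterministic percentage of users or teams to Fireflies, so a given ID always lands on the same side. A sketch (the hash choice and function names are assumptions, not part of either API):

```typescript
// Deterministic bucket in [0, 100) from a user/team ID (FNV-1a hash).
function rolloutBucket(id: string): number {
  let h = 2166136261;
  for (let i = 0; i < id.length; i++) {
    h ^= id.charCodeAt(i);
    h = Math.imul(h, 16777619);
  }
  return (h >>> 0) % 100;
}

// Route this ID to Fireflies when its bucket falls under the ramp percentage.
function useFireflies(userId: string, rampPercent: number): boolean {
  return rolloutBucket(userId) < rampPercent;
}
```

Raising the ramp from 10 to 50 to 100 over the parallel-run window only ever moves users from the legacy service to Fireflies, never back and forth, which keeps side-by-side comparisons clean.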
Error Handling
| Issue | Cause | Solution |
|---|---|---|
| payload_too_small | File < 50KB | Set bypass_size_check: true |
| Upload rejected | Free plan | Uploads require Pro+ plan |
| Auth download fails | Token expired | Refresh storage credentials |
| Missing transcription | Audio quality poor | Check file format and audio clarity |
| Duplicate uploads | Re-running batch | Use client_reference_id for dedup |
Output
- Pre-migration inventory assessed
- Historical recordings uploaded via the `uploadAudio` mutation
- Migration progress tracked via webhooks and `clientReferenceId`
- Post-migration validation confirming transcript quality
- Adapter layer enabling gradual platform cutover
Next Steps
For monitoring the migrated system, see fireflies-observability.