# langfuse-upgrade-migration
Upgrade Langfuse SDK versions and migrate between API changes. Use when upgrading Langfuse SDK, handling breaking changes, or migrating between Langfuse versions. Trigger with phrases like "upgrade langfuse", "langfuse migration", "update langfuse SDK", "langfuse breaking changes", "langfuse version".
## Install

```bash
mkdir -p .claude/skills/langfuse-upgrade-migration && \
  curl -L -o skill.zip "https://mcp.directory/api/skills/download/7934" && \
  unzip -o skill.zip -d .claude/skills/langfuse-upgrade-migration && \
  rm skill.zip
```

Installs to `.claude/skills/langfuse-upgrade-migration`.
## Langfuse Upgrade & Migration
## Current State

```
!npm list langfuse @langfuse/client @langfuse/tracing @langfuse/otel 2>/dev/null | head -10 || echo 'No langfuse packages found'
!pip show langfuse 2>/dev/null | grep -E "Name|Version" || echo 'Python langfuse not installed'
```
## Overview
Step-by-step guide for upgrading the Langfuse SDK across major versions. Covers v3 to v4 (OTel rewrite), v4 to v5, breaking changes, and automated codemods.
## Prerequisites
- Existing Langfuse integration
- Test suite covering traced operations
- Git branch for the upgrade
## Version Roadmap

| SDK | Package | Architecture | Status |
|---|---|---|---|
| v3 | `langfuse` (single package) | Custom, `Langfuse` class | Legacy |
| v4 | `@langfuse/client`, `@langfuse/tracing`, `@langfuse/otel` | OpenTelemetry-based | Stable |
| v5 | `@langfuse/client`, `@langfuse/tracing`, `@langfuse/otel` | OpenTelemetry + improvements | Latest |
## Instructions

### Step 1: Check Current Version and Plan

```bash
set -euo pipefail

# Check what you have installed
npm list langfuse @langfuse/client @langfuse/tracing 2>/dev/null

# Check the latest published versions
npm info @langfuse/client version
npm info @langfuse/tracing version
npm info langfuse version

# Python
pip show langfuse 2>/dev/null | grep Version
pip index versions langfuse 2>/dev/null | head -3
```
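The major version alone determines the migration path. A minimal sketch of turning the version check into a plan (the `migrationPath` helper and its return strings are illustrative, not part of any Langfuse tooling):

```typescript
// Map an installed langfuse major version to a migration path.
// Hypothetical planning helper; feed it the version reported by
// `npm list --json` or `pip show langfuse`.
function migrationPath(installedVersion: string): string {
  const major = parseInt(installedVersion.split(".")[0], 10);
  if (Number.isNaN(major)) {
    throw new Error(`unparseable version: ${installedVersion}`);
  }
  if (major <= 3) return "v3 -> v4 (OTel rewrite), then v4 -> v5";
  if (major === 4) return "v4 -> v5";
  return "already on the latest major";
}
```

For example, an installed `3.28.0` points at the full v3 -> v4 -> v5 path, while a v4 install only needs the smaller v4 -> v5 step.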
### Step 2: v3 to v4 Migration (TypeScript)

This is the biggest migration: v4 rewrites tracing on OpenTelemetry.

#### 2a. Install new packages

```bash
set -euo pipefail

# Install the v4+ packages
npm install @langfuse/client @langfuse/tracing @langfuse/otel @opentelemetry/sdk-node

# Keep langfuse v3 temporarily for comparison.
# Remove it after the migration: npm uninstall langfuse
```
#### 2b. Update initialization

```typescript
// BEFORE (v3):
import { Langfuse } from "langfuse";

const langfuse = new Langfuse({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY,
  secretKey: process.env.LANGFUSE_SECRET_KEY,
  baseUrl: process.env.LANGFUSE_HOST,
});
```

```typescript
// AFTER (v4+):
import { LangfuseClient } from "@langfuse/client";
import { LangfuseSpanProcessor } from "@langfuse/otel";
import { NodeSDK } from "@opentelemetry/sdk-node";

// OTel setup (once, at the entry point)
const sdk = new NodeSDK({
  spanProcessors: [new LangfuseSpanProcessor()],
});
sdk.start();

// Client for prompts, datasets, and scores
const langfuse = new LangfuseClient();
```
#### 2c. Update tracing calls

```typescript
// BEFORE (v3): manual trace/span/generation objects
const trace = langfuse.trace({ name: "my-op", input: data });

const span = trace.span({ name: "step-1", input: data });
await doWork();
span.end({ output: result });

const gen = trace.generation({ name: "llm", model: "gpt-4o" });
gen.end({ output: response, usage: { promptTokens: 10 } });

await langfuse.flushAsync();
```

```typescript
// AFTER (v4+): startActiveObservation with automatic nesting
import { startActiveObservation, updateActiveObservation } from "@langfuse/tracing";

await startActiveObservation("my-op", async () => {
  updateActiveObservation({ input: data });

  await startActiveObservation("step-1", async () => {
    updateActiveObservation({ input: data });
    const result = await doWork();
    updateActiveObservation({ output: result });
  });

  // Observation type goes in the trailing options object
  await startActiveObservation("llm", async () => {
    updateActiveObservation({ model: "gpt-4o" });
    const response = await callLLM();
    updateActiveObservation({ output: response, usage: { promptTokens: 10 } });
  }, { asType: "generation" });
});
```
#### 2d. Update the OpenAI wrapper

```typescript
// BEFORE (v3):
import { observeOpenAI } from "langfuse";

// AFTER (v4+): npm install @langfuse/openai
import { observeOpenAI } from "@langfuse/openai";
```
#### 2e. Update the environment variable

```bash
# BEFORE: LANGFUSE_HOST or LANGFUSE_BASEURL
# AFTER:  LANGFUSE_BASE_URL (LANGFUSE_BASEURL still works in v4 but not in v5)
```
#### 2f. Update prompt management

```typescript
// BEFORE (v3): version as a positional argument
const prompt = await langfuse.getPrompt("my-prompt", 2);

// AFTER (v4+): version in an options object
const prompt = await langfuse.prompt.get("my-prompt", {
  version: 2,
  type: "text", // explicit type
});
```
#### 2g. Update shutdown

```typescript
// BEFORE (v3):
await langfuse.shutdownAsync();

// AFTER (v4+): shuts down the OTel SDK and flushes pending spans
await sdk.shutdown();
```
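In long-running services it is easy to lose the final batch of spans when the process receives a signal before `sdk.shutdown()` runs. A generic sketch of flush-on-exit wiring, assuming only that the SDK exposes an async `shutdown()` (as `NodeSDK` does); the helper name is hypothetical:

```typescript
// Wrap an async shutdown() so duplicate signals trigger only one flush.
// Hypothetical helper; `shutdown` stands in for () => sdk.shutdown().
function makeShutdownHandler(shutdown: () => Promise<void>): () => Promise<boolean> {
  let called = false;
  return async () => {
    if (called) return false; // ignore repeated SIGTERM/SIGINT
    called = true;
    await shutdown();         // flush buffered spans before exit
    return true;
  };
}

// Wiring at the entry point (sketch):
// const handler = makeShutdownHandler(() => sdk.shutdown());
// process.on("SIGTERM", handler);
// process.on("SIGINT", handler);
```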
### Step 3: Python SDK Migration (v2 to v3)

In v2 the decorator API lives in `langfuse.decorators`; v3 moves `observe` to the package root and replaces `langfuse_context` with methods on the client returned by `get_client()`.

```python
# BEFORE (v2): decorators module plus langfuse_context
from langfuse.decorators import observe, langfuse_context

@observe()
def my_function():
    langfuse_context.update_current_observation(
        metadata={"key": "value"}
    )

# AFTER (v3): top-level imports; update via the client
from langfuse import observe, get_client

langfuse = get_client()

@observe()
def my_function():
    langfuse.update_current_span(
        metadata={"key": "value"}
    )
```
### Step 4: Run Tests and Verify

```bash
set -euo pipefail

# Run the existing test suite
npm test

# Send a verification trace, then confirm it appears in the dashboard
node -e "
const { startActiveObservation, updateActiveObservation } = require('@langfuse/tracing');
startActiveObservation('upgrade-verify', async () => {
  updateActiveObservation({ input: { test: true }, output: { migrated: true } });
}).then(() => console.log('Migration verified'));
"
```
### Step 5: Remove the Old Package

```bash
set -euo pipefail

# After all tests pass
npm uninstall langfuse

# Verify no lingering imports remain
grep -rn "from ['\"]langfuse['\"]" src/ || echo "No old imports found"
```
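The same leftover-import check can live in a script when grep is unavailable (for instance on Windows CI). A sketch, covering the v3 package name `langfuse` in both import styles (the regexes are a simplification, not a full parser):

```typescript
// Report 1-based line numbers that still import the legacy `langfuse` package.
// Mirrors the grep check above; hypothetical helper for CI scripts.
function findLegacyImports(source: string): number[] {
  const legacy = [/from\s+["']langfuse["']/, /require\(\s*["']langfuse["']\s*\)/];
  return source
    .split("\n")
    .map((line, i) => (legacy.some((re) => re.test(line)) ? i + 1 : -1))
    .filter((n) => n !== -1);
}
```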
## Breaking Changes Quick Reference

| Change | v3 | v4+ |
|---|---|---|
| Package | `langfuse` | `@langfuse/client` + `@langfuse/tracing` + `@langfuse/otel` |
| Client class | `Langfuse` | `LangfuseClient` |
| Base URL env | `LANGFUSE_HOST` | `LANGFUSE_BASE_URL` |
| Tracing | `langfuse.trace()` / `.span()` / `.generation()` | `startActiveObservation()` / `observe()` |
| Flush | `langfuse.flushAsync()` | `sdk.shutdown()` |
| Prompt version | `getPrompt(name, version)` | `prompt.get(name, { version })` |
| OpenAI | `import { observeOpenAI } from "langfuse"` | `import { observeOpenAI } from "@langfuse/openai"` |
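Several of these renames are mechanical enough to sketch as a naive text codemod. This is an illustration only; a real codemod should operate on an AST (for example with jscodeshift), and the regexes below cover just a few of the renames:

```typescript
// Naive regex codemod for a few v3 -> v4 renames.
// Illustrative sketch: does not handle aliased imports, comments, or strings.
const RENAMES: Array<[RegExp, string]> = [
  [/from\s+["']langfuse["']/g, 'from "@langfuse/client"'],
  [/\bnew Langfuse\(/g, "new LangfuseClient("],
  [/\bLANGFUSE_HOST\b/g, "LANGFUSE_BASE_URL"],
];

function applyRenames(source: string): string {
  return RENAMES.reduce((acc, [pattern, replacement]) => acc.replace(pattern, replacement), source);
}
```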
## Error Handling

| Error | Cause | Solution |
|---|---|---|
| `Cannot find module '@langfuse/tracing'` | Package not installed | `npm install @langfuse/tracing @langfuse/otel @opentelemetry/sdk-node` |
| `langfuse.trace is not a function` | Using the v4 `LangfuseClient` for tracing | Use `startActiveObservation` from `@langfuse/tracing` |
| Flat traces (no nesting) | OTel SDK not started | Register `LangfuseSpanProcessor` with `NodeSDK` |
| `LANGFUSE_HOST` ignored | v5 dropped the legacy env var | Rename to `LANGFUSE_BASE_URL` |
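During a staged rollout, transition code can accept both old and new variable names to avoid the last error above. A sketch of one possible fallback order (the precedence and the hosted default URL are assumptions for migration glue, not SDK behavior):

```typescript
// Resolve the Langfuse base URL, preferring the v4/v5 name over legacy spellings.
// Transition-period helper only; drop the fallbacks once v5 is fully rolled out.
function resolveBaseUrl(env: Record<string, string | undefined>): string {
  return (
    env.LANGFUSE_BASE_URL ??       // v4/v5 canonical name
    env.LANGFUSE_BASEURL ??        // v4 legacy spelling
    env.LANGFUSE_HOST ??           // v3 name
    "https://cloud.langfuse.com"   // hosted default (assumption)
  );
}
```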