perplexity-core-workflow-b
Execute Perplexity secondary workflow: Core Workflow B. Use when implementing the secondary use case or complementing the primary workflow. Trigger with phrases like "perplexity secondary workflow" or "secondary task with perplexity".
Install

Run the following from your project root (installs to .claude/skills/perplexity-core-workflow-b):

```shell
mkdir -p .claude/skills/perplexity-core-workflow-b && curl -L -o skill.zip "https://mcp.directory/api/skills/download/7739" && unzip -o skill.zip -d .claude/skills/perplexity-core-workflow-b && rm skill.zip
```
About this skill
Perplexity Core Workflow B: Multi-Query Research
Overview
Multi-turn research workflow using Perplexity Sonar API. Decomposes a broad topic into focused sub-queries, runs them with context continuity, deduplicates citations, and synthesizes a structured research document. Use sonar for fast passes and sonar-pro for deep dives.
Prerequisites

- Completed perplexity-install-authsetup
- Familiarity with perplexity-core-workflow-a
- PERPLEXITY_API_KEY set
Instructions
Step 1: Conversational Research Session
```typescript
import OpenAI from "openai";

const perplexity = new OpenAI({
  apiKey: process.env.PERPLEXITY_API_KEY,
  baseURL: "https://api.perplexity.ai",
});

type Message = OpenAI.ChatCompletionMessageParam;

class ResearchSession {
  private messages: Message[] = [];
  private allCitations: Set<string> = new Set();

  constructor(systemPrompt: string = "You are a research assistant. Provide thorough, cited answers.") {
    this.messages.push({ role: "system", content: systemPrompt });
  }

  async ask(question: string, model: "sonar" | "sonar-pro" = "sonar"): Promise<{
    answer: string;
    citations: string[];
  }> {
    this.messages.push({ role: "user", content: question });
    const response = await perplexity.chat.completions.create({
      model,
      messages: this.messages,
    } as any);

    const answer = response.choices[0].message.content || "";
    // Citations are a Perplexity extension, not typed in the OpenAI SDK
    const citations = (response as any).citations || [];
    // Maintain conversation context
    this.messages.push({ role: "assistant", content: answer });
    // Accumulate all citations across the session
    citations.forEach((url: string) => this.allCitations.add(url));
    return { answer, citations };
  }

  getAllCitations(): string[] {
    return [...this.allCitations];
  }

  // Keep context manageable (Perplexity searches per turn)
  trimHistory(keepLast: number = 6) {
    const system = this.messages[0];
    const recent = this.messages.slice(-(keepLast * 2));
    this.messages = [system, ...recent];
  }
}
```
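The trimming rule above (keep the system prompt plus the last N question/answer pairs) is pure list manipulation, so it can be checked without an API key. A minimal Python sketch of the same logic, using a hypothetical `trim_history` helper:

```python
def trim_history(messages: list[dict], keep_last: int = 6) -> list[dict]:
    """Keep the system prompt plus the last keep_last user/assistant pairs."""
    system, rest = messages[0], messages[1:]
    return [system] + rest[-(keep_last * 2):]

# Build a session with a system prompt and 10 question/answer pairs
history = [{"role": "system", "content": "You are a research assistant."}]
for i in range(10):
    history.append({"role": "user", "content": f"question {i}"})
    history.append({"role": "assistant", "content": f"answer {i}"})

trimmed = trim_history(history, keep_last=6)
print(len(trimmed))           # 13: system prompt + 6 pairs
print(trimmed[1]["content"])  # "question 4" (oldest surviving turn)
```

The system prompt is always preserved; only the middle of the conversation is dropped, which keeps Perplexity's per-turn search grounded in the original instructions.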
Step 2: Batch Query Pipeline
```typescript
interface ResearchPlan {
  topic: string;
  questions: string[];
}

interface ResearchReport {
  topic: string;
  sections: Array<{ question: string; answer: string; citations: string[] }>;
  allCitations: string[];
  totalTokens: number;
}

async function conductResearch(plan: ResearchPlan): Promise<ResearchReport> {
  const sections: ResearchReport["sections"] = [];
  const allCitations = new Set<string>();
  let totalTokens = 0;

  for (const question of plan.questions) {
    const response = await perplexity.chat.completions.create({
      model: "sonar-pro", // deeper research for each sub-question
      messages: [
        { role: "system", content: `Research context: ${plan.topic}` },
        { role: "user", content: question },
      ],
    } as any);

    const answer = response.choices[0].message.content || "";
    const citations = (response as any).citations || [];
    sections.push({ question, answer, citations });
    citations.forEach((url: string) => allCitations.add(url));
    totalTokens += response.usage?.total_tokens || 0;

    // Rate limit protection: 50 RPM for most tiers
    await new Promise((r) => setTimeout(r, 1500));
  }

  return {
    topic: plan.topic,
    sections,
    allCitations: [...allCitations],
    totalTokens,
  };
}
```
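The pipeline's shape — sequential queries, per-section citations, session-wide dedup — can be dry-run by injecting a stubbed query function in place of the API call. A Python sketch under that assumption (the `query` parameter and `fake_query` stub are hypothetical, not part of any SDK):

```python
def conduct_research(topic: str, questions: list[str], query) -> dict:
    """Sequential pipeline; query(topic, question) returns (answer, citations)."""
    sections, all_citations = [], set()
    for q in questions:
        answer, citations = query(topic, q)
        sections.append({"question": q, "answer": answer, "citations": citations})
        all_citations.update(citations)  # dedup across the whole run
    return {"topic": topic, "sections": sections, "citations": sorted(all_citations)}

# Stubbed query: both questions cite one shared source, exercising the dedup
def fake_query(topic, q):
    return f"Answer to: {q}", ["https://example.com/shared", f"https://example.com/{len(q)}"]

report = conduct_research("AI in drug discovery", ["What is new?", "Who leads?"], fake_query)
print(len(report["sections"]))   # 2
print(len(report["citations"]))  # 3: the shared URL is counted once
```

Note the throughput implication of the real loop: at a 1.5 s delay per query plus API latency, a 6-question plan takes roughly 10-20 s end to end.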
Step 3: Topic Decomposition
```typescript
async function decomposeTopic(topic: string): Promise<string[]> {
  const response = await perplexity.chat.completions.create({
    model: "sonar",
    messages: [
      {
        role: "system",
        content: "Break this research topic into 4-6 specific, focused questions. Return one question per line, no numbering.",
      },
      { role: "user", content: topic },
    ],
    max_tokens: 500,
  });

  return (response.choices[0].message.content || "")
    .split("\n")
    .map((q) => q.trim())
    .filter((q) => q.length > 10);
}
```
Step 4: Compile Research Report
```typescript
function compileReport(report: ResearchReport): string {
  let md = `# Research: ${report.topic}\n\n`;

  for (const section of report.sections) {
    md += `## ${section.question}\n\n`;
    md += `${section.answer}\n\n`;
  }

  md += `## Bibliography\n\n`;
  report.allCitations.forEach((url, i) => {
    md += `${i + 1}. ${url}\n`;
  });

  md += `\n---\n`;
  md += `*${report.sections.length} queries | ${report.allCitations.length} unique sources | ${report.totalTokens} tokens*\n`;
  return md;
}
```
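Report compilation is pure string assembly, so its output layout is easy to verify in isolation. A Python mirror of the same structure, run against a small hypothetical report:

```python
def compile_report(report: dict) -> str:
    """Assemble a markdown document: topic heading, one section per question, bibliography."""
    md = f"# Research: {report['topic']}\n\n"
    for section in report["sections"]:
        md += f"## {section['question']}\n\n{section['answer']}\n\n"
    md += "## Bibliography\n\n"
    for i, url in enumerate(report["citations"], 1):
        md += f"{i}. {url}\n"
    md += f"\n---\n*{len(report['sections'])} queries | {len(report['citations'])} unique sources*\n"
    return md

report = {
    "topic": "AI in drug discovery",
    "sections": [{"question": "What changed in 2025?", "answer": "Several advances...", "citations": []}],
    "citations": ["https://example.com/a", "https://example.com/b"],
}
md = compile_report(report)
print(md.splitlines()[0])  # "# Research: AI in drug discovery"
```

Bibliography entries are numbered from the deduplicated session-wide citation list, not per section, so a source cited by several questions appears exactly once.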
Step 5: Full Pipeline
```typescript
async function researchTopic(topic: string): Promise<string> {
  console.log(`Decomposing: ${topic}`);
  const questions = await decomposeTopic(topic);
  console.log(`Generated ${questions.length} sub-questions`);

  const report = await conductResearch({ topic, questions });
  console.log(`Found ${report.allCitations.length} unique sources`);

  return compileReport(report);
}

// Usage
const markdown = await researchTopic("Impact of AI on drug discovery in 2025");
console.log(markdown);
```
Step 6: Python Multi-Query Research
```python
import os
import time

from openai import OpenAI

client = OpenAI(api_key=os.environ["PERPLEXITY_API_KEY"], base_url="https://api.perplexity.ai")

def research_topic(topic: str, questions: list[str]) -> dict:
    sections = []
    all_citations = set()
    for q in questions:
        r = client.chat.completions.create(
            model="sonar-pro",
            messages=[
                {"role": "system", "content": f"Research context: {topic}"},
                {"role": "user", "content": q},
            ],
        )
        # Citations are a Perplexity extension; read them from the raw payload
        raw = r.model_dump()
        citations = raw.get("citations", [])
        sections.append({"question": q, "answer": r.choices[0].message.content, "citations": citations})
        all_citations.update(citations)
        time.sleep(1.5)  # rate-limit protection, mirroring the TypeScript pipeline
    return {"topic": topic, "sections": sections, "citations": list(all_citations)}
```
Error Handling
| Error | Cause | Solution |
|---|---|---|
| 429 Too Many Requests | Batch queries too fast | Add a 1-2 s delay between queries |
| Context overflow | Too many conversation turns | Call trimHistory() to keep the last 6 turns |
| Contradictory answers | Different sources disagree | Flag contradictions for manual review |
| High cost | Using sonar-pro for all queries | Use sonar for decomposition, sonar-pro for deep dives |
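For the 429 row, a fixed delay works but wastes time when the limit isn't hit; retrying with exponential backoff adapts instead. A sketch with a hypothetical `RateLimitError` stand-in and an injectable sleep function, so the backoff logic itself is testable without hitting the API:

```python
import time

class RateLimitError(Exception):
    """Stand-in for the SDK's 429 error type."""

def with_backoff(call, max_retries: int = 4, base_delay: float = 1.0, sleep=time.sleep):
    """Retry `call` on rate-limit errors, doubling the delay each attempt."""
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the error
            sleep(base_delay * (2 ** attempt))

# Simulated call that fails twice with 429, then succeeds
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RateLimitError()
    return "ok"

delays = []
print(with_backoff(flaky, sleep=delays.append))  # "ok"
print(delays)  # [1.0, 2.0]
```

In a real run you would catch the OpenAI SDK's rate-limit exception instead of the stand-in class and pass the actual `time.sleep`.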
Output
- Structured research document with multiple sections
- Consolidated bibliography of all cited sources
- Token usage for cost tracking
- Conversation session with context continuity
Next Steps
For common errors, see perplexity-common-errors.