verify-claims
Verify claims and information using professional fact-checking services. Use this skill when users want to verify facts, check claims in articles/videos/transcripts, validate news authenticity, cross-reference information with trusted fact-checkers, or investigate potentially false or misleading content. Triggers include requests to "fact check", "verify this", "is this true", "check if this is accurate", or when users share content they want validated against misinformation.
Install
```shell
mkdir -p .claude/skills/verify-claims && curl -L -o skill.zip "https://mcp.directory/api/skills/download/6952" && unzip -o skill.zip -d .claude/skills/verify-claims && rm skill.zip
```
Installs to `.claude/skills/verify-claims`.
About this skill
Fact-Checking Skill
Verify claims and information using professional fact-checking services from around the world.
Core Principles
- Multiple sources - Cross-reference findings from several fact-checking organizations
- Regional relevance - Prioritize fact-checkers appropriate to the content's context
- Language matching - Use fact-checkers in the native language of the content when possible
- Credible sources only - Never use fraudulent or unreliable fact-checking services
- Balanced presentation - Present both confirming and contradicting findings fairly
When to Use This Skill
Trigger this skill when the user:
- Explicitly asks to fact-check, verify, or validate information
- Shares an article, video transcript, or claim and asks "is this true?"
- Wants to check if something is misinformation or a hoax
- Asks about the credibility of specific claims or statements
- Requests verification of news, social media posts, or viral content
- Wants to cross-reference information with trusted sources
Do NOT trigger for:
- General research or information gathering (use web search instead)
- Checking grammar, spelling, or writing quality
- Verifying code functionality or technical documentation
- Questions about opinions rather than factual claims
Workflow
Step 1: Understand the Content
Before beginning verification, analyze what needs to be checked:
- Identify specific claims - Extract concrete, verifiable statements from the content
- Note the context - Identify:
- Geographic references (countries, regions, cities)
- Named individuals (politicians, public figures, organizations)
- Languages used in the content
- Time period or dates mentioned
- Subject matter (politics, health, science, etc.)
- Determine user context:
- User's native language (for selecting appropriate fact-checkers)
- User's location if relevant
Example Analysis:
- Content: "Video claiming vaccines cause autism, mentions Andrew Wakefield, references UK study"
- Claims to verify: Vaccine-autism link, Wakefield's research
- Context: Medical/health topic, UK origin, English language
- Key entities: Andrew Wakefield, MMR vaccine, UK medical establishment
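The analysis above can be captured as a structured record before any searching begins. The sketch below is a minimal, hypothetical schema (the field names are illustrative, not part of the skill's interface):

```python
from dataclasses import dataclass

@dataclass
class ClaimAnalysis:
    """Hypothetical structure for the pre-verification analysis."""
    claims: list[str]     # concrete, verifiable statements
    regions: list[str]    # countries/regions referenced
    entities: list[str]   # named people and organizations
    languages: list[str]  # languages used in the content
    subject: str          # e.g. "health", "politics", "science"

# The worked example from the text, expressed as data:
analysis = ClaimAnalysis(
    claims=["MMR vaccine causes autism", "Wakefield's UK study was valid"],
    regions=["UK"],
    entities=["Andrew Wakefield", "MMR vaccine"],
    languages=["en"],
    subject="health",
)
```

Keeping the analysis in one place like this makes the later steps (service selection, query construction) mechanical rather than ad hoc.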
Step 2: Select Fact-Checking Services
CRITICAL: Begin by fetching the current list of fact-checking services:
Fetch: https://en.wikipedia.org/wiki/List_of_fact-checking_websites
From this list, select 3-7 relevant fact-checking services based on:
Selection Criteria
1. User's language/location - Always include fact-checkers in the user's native language
2. Content language/location - If different from the user's language, also include fact-checkers in the content's language and region
3. Geographic relevance - If the content mentions specific countries or regions:
   - Include fact-checkers from those countries
   - Example: content about French politics → include French fact-checkers
4. Subject matter specialists - Some fact-checkers specialize:
   - Health/medical claims → Health Feedback, Science Feedback
   - Politics → country-specific political fact-checkers
   - General → Snopes, FactCheck.org, Full Fact
5. Person-specific - If the content focuses on specific public figures:
   - Include fact-checkers from their home countries
   - Example: claims about a US politician → include US fact-checkers
Exclusion Rule
NEVER use services listed under "Fraudulent fact-checking websites" on the Wikipedia page, regardless of how well they match other criteria.
Prioritization
When you must limit selections:
- Prioritize: User's language > Content's language > Geographic relevance
- Prefer well-established services (FactCheck.org, Snopes, Full Fact, AFP Fact Check, etc.)
- Include at least one international/general service
Example Selection:
- User: Polish speaker
- Content: English article about US vaccines
- Selected services:
- Demagog.pl (Polish, for user)
- FactCheck.org (US, for content geography)
- Snopes (US, general/medical)
- Health Feedback (health specialist)
- Full Fact (UK, English-speaking, general)
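The prioritization rule (user's language > content's language > geographic relevance) can be sketched as a simple scoring function. This is an illustrative sketch only; the service records and weights are assumptions, not a defined API:

```python
def rank_services(services, user_lang, content_lang, regions):
    """Rank candidate fact-checkers: user's language outweighs
    content's language, which outweighs geographic relevance."""
    def score(svc):
        s = 0
        if user_lang in svc["langs"]:
            s += 4  # highest priority: user's native language
        if content_lang in svc["langs"]:
            s += 2  # next: language of the content
        if svc["country"] in regions:
            s += 1  # lowest: geographic match
        return s
    return sorted(services, key=score, reverse=True)

# Hypothetical candidates for the Polish-user / US-vaccine example:
services = [
    {"name": "Demagog.pl", "langs": ["pl"], "country": "PL"},
    {"name": "FactCheck.org", "langs": ["en"], "country": "US"},
    {"name": "Full Fact", "langs": ["en"], "country": "GB"},
]
ranked = rank_services(services, user_lang="pl", content_lang="en", regions=["US"])
# Demagog.pl ranks first (user's language), then FactCheck.org
# (content language plus US geography), then Full Fact.
```

The exclusion rule still applies before ranking: any service on the "Fraudulent fact-checking websites" list is dropped outright, regardless of score.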
Step 3: Search Each Fact-Checking Service
For each selected service, conduct targeted searches:
Search Strategy
1. Extract 2-4 search terms from the content:
   - Key person names
   - Main topics/subjects
   - Specific claims or events
   - Important keywords
2. Translate terms to the fact-checker's native language if needed
3. Construct search queries using DuckDuckGo with the site operator. Format: `site:domain.com [search terms in appropriate language]`. Examples:
   - `site:fullfact.org vaccines autism`
   - `site:demagog.org.pl szczepionki autyzm`
   - `site:factcheck.org Andrew Wakefield MMR`
   - `site:healthfeedback.org vaccine safety`
4. Execute 1-3 searches per fact-checker (depending on content complexity)
Search Best Practices
- Keep queries concise (2-4 words typically)
- Start broad, then narrow if needed
- Don't repeat very similar queries
- If first search yields good results, proceed to analysis
- If first search yields poor results, try alternative terms
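Query construction from the strategy above is purely mechanical. A minimal sketch, assuming the search terms have already been extracted and translated:

```python
def build_queries(domain, term_sets, max_queries=3):
    """Build DuckDuckGo site-restricted queries, capped at 1-3 per
    fact-checker as the strategy recommends."""
    return [f"site:{domain} {' '.join(terms)}" for terms in term_sets[:max_queries]]

queries = build_queries(
    "fullfact.org",
    [["vaccines", "autism"], ["Andrew", "Wakefield", "MMR"]],
)
# → ['site:fullfact.org vaccines autism',
#    'site:fullfact.org Andrew Wakefield MMR']
```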
Step 4: Analyze Search Results
For each fact-checking service:
1. Review search results - Examine the first 5-10 results from each search
2. Select relevant articles - Choose articles where:
   - The headline directly addresses the claim being verified
   - The content appears substantial (not just a brief mention)
   - The publication date is relevant (recent for ongoing issues, any date for historical debunks)
3. Fetch and read articles - Use `web_fetch` to retrieve the full text of the 2-4 most relevant articles per fact-checker
4. Extract key findings from each article:
   - Verdict - What did the fact-checker conclude? (True, False, Misleading, Mixed, Unproven, etc.)
   - Evidence - What evidence did they cite?
   - Context - Any important nuance or context
   - Relevance - How directly does this address the user's claim?
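Once each article has been reduced to a verdict record, checking for consensus across fact-checkers is a one-line tally. A sketch, assuming findings are stored as plain dicts with a `verdict` key (a hypothetical representation, not part of the skill):

```python
from collections import Counter

def consensus(findings):
    """Return the unanimous verdict across fact-checkers,
    or 'Mixed' if they disagree."""
    tally = Counter(f["verdict"] for f in findings)
    verdict, count = tally.most_common(1)[0]
    return verdict if count == len(findings) else "Mixed"

findings = [
    {"service": "FactCheck.org", "verdict": "False"},
    {"service": "Full Fact", "verdict": "False"},
    {"service": "Snopes", "verdict": "False"},
]
# consensus(findings) → 'False' (unanimous across all three)
```

A "Mixed" result is itself useful output: it signals that the response should present both sides rather than declare a single verdict.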
Step 5: Synthesize and Present Results
Organize findings into a clear, user-friendly format:
Handle Fresh Content First
Before presenting results, check if the content is very recent (3 days old or less):
1. If fact-checks are found: proceed normally with the presentation
2. If no fact-checks are found AND the content is ≤3 days old:
   - Note that the content is too fresh for fact-checkers to have covered it yet
   - If task scheduling is available:
     - Schedule a follow-up fact-check for 3 days from now
     - Inform the user: "I've scheduled a follow-up check for [date]. I'll notify you if fact-checkers have published verification by then."
   - If task scheduling is NOT available:
     - Suggest: "This content is very recent (published [date]). Fact-checkers typically need a few days to verify claims. I recommend checking back in 3 days for updated verification."
     - Offer a preliminary analysis using general web search
     - Proceed with any available information from general sources
3. If no fact-checks are found AND the content is older:
   - Note that fact-checkers haven't specifically covered this claim
   - Offer general web research instead
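The branching above reduces to a small decision function. A sketch under the assumption that the publication date is known (the return strings are illustrative labels, not prescribed wording):

```python
from datetime import date, timedelta

def freshness_plan(published, found_factchecks, today):
    """Apply the 3-day freshness rule when no fact-checks were found."""
    if found_factchecks:
        return "present findings"
    if today - published <= timedelta(days=3):
        # too fresh: fact-checkers likely haven't covered it yet
        return f"too fresh; re-check on {published + timedelta(days=3)}"
    return "offer general web research"

plan = freshness_plan(date(2024, 5, 1), found_factchecks=False, today=date(2024, 5, 2))
# → 'too fresh; re-check on 2024-05-04'
```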
Structure Your Response
1. Opening summary (2-3 sentences)
   - Overall consensus from fact-checkers
   - A brief answer to the user's question
2. Key findings by claim (if there are multiple claims)
   - Group related findings together
   - Present contradicting evidence if it exists
3. Detailed evidence (organized by fact-checker or by claim)
   - Include specific verdicts
   - Cite the evidence fact-checkers used
   - Note any disagreements between fact-checkers
4. Important context (if relevant)
   - Historical background
   - Why the claim persists
   - Common misconceptions
5. Source citations
   - Provide direct links to all fact-checking articles referenced
   - Format: `[Fact-Checker Name]: Article Title (Date if available) - [URL]`
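The citation format can be produced mechanically. A sketch with a hypothetical helper (the URL below is a placeholder, not a real article):

```python
def format_citation(name, title, url, date=None):
    """Render one citation line in the skill's prescribed format:
    [Fact-Checker Name]: Article Title (Date if available) - [URL]"""
    suffix = f" ({date})" if date else ""
    return f"[{name}]: {title}{suffix} - [{url}]"

line = format_citation(
    "Full Fact", "MMR vaccine and autism", "https://fullfact.org/health/...", "2023"
)
# → '[Full Fact]: MMR vaccine and autism (2023) - [https://fullfact.org/health/...]'
```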
Presentation Guidelines
- Be objective - Present findings without inserting personal judgment
- Be nuanced - Avoid oversimplifying complex issues
- Be clear about uncertainty - If fact-checkers disagree or evidence is inconclusive, say so
- Be balanced - If some evidence supports and some contradicts, present both
- Use accessible language - Avoid jargon, explain technical terms
- Highlight consensus - When multiple fact-checkers agree, emphasize this
Formatting
- Use clear headers to organize different claims or themes
- Use natural prose, not bullet points, for the main findings
- Only use lists for: multiple similar items, source citations, or when explicitly helpful
- Include clickable citations throughout (not just at the end)
Example Response Structure
Based on verification from five established fact-checking organizations, the claim that vaccines cause autism has been thoroughly debunked. Multiple independent reviews of the evidence have found no causal link between vaccination and autism spectrum disorder.
The origins of this claim trace back to a fraudulent 1998 study by Andrew Wakefield, which was later retracted by The Lancet. Fact-checkers consistently note that Wakefield lost his medical license, and subsequent large-scale studies involving millions of children have found no connection.
[Full Fact reviewed the evidence in 2023](link), concluding "There is no link between the MMR vaccine and autism." Their analysis examined 12 major studies and found consistent results across different populations and methodologies.
[FactCheck.org's comprehensive analysis](link) explains that "The myth persists despite overwhelming scientific consensus against it" and details how the original study was not only retracted but shown to involve falsified data.
However, [Demagog.pl](link) notes that while the vaccine-autism link is false, concerns about vaccine safety in general are legitimate and should be addressed th
---
*Content truncated.*