
SlopWatch
Monitors AI coding claims by comparing what an AI assistant says it implemented against what actually changed in your code files, and provides an accountability score to catch AI lies about code modifications.
Local (stdio)
What it does
- Register implementation claims and verify against actual file changes
- Track slop scores showing AI accuracy statistics
- Generate accountability rules for AI development workflows
- Monitor file content changes for verification
- Analyze code modifications through content comparison
Best for
- Developers using AI pair programming tools
- Teams tracking AI coding accuracy
- Code review processes involving AI assistance
- Real-time AI lie detection
- Single-call claim verification
- Cursor IDE compatible
Tools (3)
slopwatch_claim_and_verify
Registers a claim and verifies the implementation in a single call, reducing two tool calls to one
slopwatch_status
Get current slop score and statistics
slopwatch_setup_rules
Generate .cursorrules file with AI accountability enforcement
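Since this is a local stdio MCP server, a client such as Cursor invokes these tools with standard JSON-RPC `tools/call` requests. The sketch below shows the shape of such a request for `slopwatch_claim_and_verify`; the argument names (`claim`, `files`) are assumptions for illustration — consult the server's published tool schema for the real parameters.

```python
import json

# Shape of an MCP "tools/call" JSON-RPC request sent over stdio.
# The "arguments" keys here are hypothetical, not the server's real schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "slopwatch_claim_and_verify",
        "arguments": {
            "claim": "Added input validation to the config parser",
            "files": ["src/config.py"],
        },
    },
}
print(json.dumps(request))
```

The server's response would then carry the verification verdict and updated slop score back to the editor.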