windsurf-debugging-ai: Claude Skill
Use Cascade for intelligent debugging and error analysis.
| name | windsurf-debugging-ai |
| description | Use Cascade for intelligent debugging and error analysis. Activate when users mention "debug with ai", "error analysis", "cascade debug", "find bug", or "troubleshoot code". Handles AI-assisted debugging workflows. Use when debugging issues or troubleshooting. Trigger with phrases like "windsurf debugging ai", "windsurf ai", "windsurf". |
| allowed-tools | Read,Grep,Glob,Bash(cmd:*) |
| version | 1.0.0 |
| license | MIT |
| author | Jeremy Longshore <jeremy@intentsolutions.io> |
| compatible-with | claude-code, codex, openclaw |
| tags | ["saas","skill-databases","debugging","workflow"] |
Windsurf Debugging AI
Overview
This skill enables AI-assisted debugging within Windsurf. Cascade analyzes error messages, stack traces, and code context to identify root causes and suggest fixes.
Prerequisites
- Windsurf IDE with Cascade enabled
- Application with reproducible issues
- Debug configuration set up
- Error logs accessible
- Understanding of application architecture
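A quick preflight for the prerequisites above might look like the sketch below. The `check` helper, the command name, and the log path are all placeholders, not something this skill mandates; swap in the `windsurf` CLI and your application's real log file.

```shell
#!/usr/bin/env sh
# Preflight sketch: confirm a required command is available and a log file
# exists before starting a debugging session. Names here are placeholders.
check() {
  missing=0
  command -v "$1" >/dev/null 2>&1 || { echo "missing command: $1"; missing=1; }
  [ -f "$2" ] || { echo "missing log file: $2"; missing=1; }
  return "$missing"
}

# Demo with values that exist everywhere; in practice you would pass
# something like: check windsurf ./logs/app.log
touch demo.log
if check sh demo.log; then
  echo "prerequisites look good"
fi
```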
Instructions
- Capture Error Context
- Analyze with Cascade
- Investigate Root Cause
- Apply Fix
- Document for Prevention
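The "Capture Error Context" step above can be sketched as a small script that gathers the last log lines and any stack-trace markers into one file Cascade can be pointed at. The file names and grep patterns are illustrative assumptions; adapt them to your runtime and log format.

```shell
#!/usr/bin/env sh
# Collect recent log output and stack-trace markers into a single context file.
capture_context() {
  log_file="$1"
  out="$2"
  {
    echo "# Error Context"
    echo "## Last 50 log lines"
    tail -n 50 "$log_file"
    echo "## Stack-trace markers"
    # Patterns cover Python tracebacks, Go panics, and JVM-style frames;
    # adjust for your stack.
    grep -nE 'Traceback|panic:|at .+:[0-9]+' "$log_file" || echo "(none found)"
  } > "$out"
}

# Demo with a fabricated log; replace demo.log with your application's log.
printf 'INFO start\nTraceback (most recent call last):\n  File "app.py", line 3\nZeroDivisionError: division by zero\n' > demo.log
capture_context demo.log error-context.md
cat error-context.md
```

With the context in one file, the remaining steps (analysis, root-cause investigation, fix) happen in the Cascade conversation itself.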
See ${CLAUDE_SKILL_DIR}/references/implementation.md for detailed implementation guide.
Output
- Root cause analysis
- Fix recommendations
- Debug session logs
- Prevention strategies
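One way to produce the "debug session logs" and "prevention strategies" outputs is a simple template generator, sketched below. The section names are a suggestion only; the skill does not prescribe a report format.

```shell
#!/usr/bin/env sh
# Emit a debug-session report skeleton to fill in after each Cascade session.
cat > debug-session.md <<'EOF'
# Debug Session

## Symptom
(observed error / stack trace)

## Root Cause
(what the analysis pointed to)

## Fix Applied
(commit or diff summary)

## Prevention
(test added, lint rule, runbook note)
EOF
echo "Wrote debug-session.md"
```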
Error Handling
See ${CLAUDE_SKILL_DIR}/references/errors.md for comprehensive error handling.
Examples
See ${CLAUDE_SKILL_DIR}/references/examples.md for detailed examples.