Android MCP

CursorTouch

Connects AI agents to Android devices for automated app testing and UI interaction using ADB and the Android Accessibility API. Works with any LLM without requiring computer vision models.

Local (stdio)

What it does

  • Launch Android apps and navigate interfaces
  • Tap, swipe, and input text on Android screens
  • Read UI element hierarchies and device state
  • Capture screenshots and device information
  • Execute shell commands on Android devices
  • Automate gesture sequences and keystrokes
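Under the hood, actions like these map onto standard ADB shell commands (`input tap`, `input swipe`, `input text`). As a rough illustration of that mapping, the sketch below builds such commands in Python; the helper names are hypothetical, not this server's actual API, and executing them assumes `adb` is on your PATH with a device attached.

```python
import subprocess
from typing import List, Optional

def build_adb_cmd(args: List[str], serial: Optional[str] = None) -> List[str]:
    """Assemble an adb command, optionally targeting one device via -s <serial>."""
    cmd = ["adb"]
    if serial:
        cmd += ["-s", serial]
    return cmd + args

def tap(x: int, y: int, serial: Optional[str] = None) -> List[str]:
    # `input tap` injects a touch event at screen coordinates (x, y).
    return build_adb_cmd(["shell", "input", "tap", str(x), str(y)], serial)

def swipe(x1: int, y1: int, x2: int, y2: int, ms: int = 300,
          serial: Optional[str] = None) -> List[str]:
    # `input swipe` drags from (x1, y1) to (x2, y2) over `ms` milliseconds.
    return build_adb_cmd(
        ["shell", "input", "swipe", str(x1), str(y1), str(x2), str(y2), str(ms)],
        serial)

def type_text(text: str, serial: Optional[str] = None) -> List[str]:
    # `input text` types a string; adb encodes spaces as %s.
    return build_adb_cmd(
        ["shell", "input", "text", text.replace(" ", "%s")], serial)

def run(cmd: List[str]) -> str:
    # Execute the command and return stdout; requires adb and a connected device.
    return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
```

For example, `run(tap(540, 960))` would tap the center of a 1080x1920 screen, and `run(swipe(540, 1500, 540, 500))` would scroll the current view upward.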

Best for

  • QA engineers automating mobile app testing
  • Developers building Android automation workflows
  • AI agents performing mobile device tasks

Highlights

  • No computer vision pipeline required
  • Works with any LLM
  • 2-4 second action latency
