
Android Mobile
Control Android devices remotely through AI agents with full UI interaction capabilities including touch, swipe, typing, and app management.
Enable AI agents to control Android devices
What it does
- Click and swipe on Android screens
- Type text into input fields
- Take screenshots of device screen
- Launch and list installed apps
- Extract UI elements as structured JSON
- Press system buttons (back, home, recent)
About Android Mobile
Android Mobile is a community-built MCP server published by erichung9060 that provides AI assistants with tools and capabilities via the Model Context Protocol. It enables AI agents to control Android devices easily, using the Tasker app for automated tasks and seamless Android integration. It is categorized under developer tools and exposes 9 tools that AI clients can invoke during conversations and coding sessions.
How to install
You can install Android Mobile in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
License
Android Mobile is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
Tools (9)
Initialize the Android device connection. Must be called before using any other mobile tools.
mobile_dump_ui - Get UI elements from the Android screen as JSON with a hierarchical structure. Returns a JSON tree where elements contain their child elements, showing parent-child relationships. Only includes focusable elements or elements with text, content_desc, or hint attributes.
mobile_click - Click on a specific coordinate on the Android screen. Args: x (X coordinate to click), y (Y coordinate to click).
mobile_type - Input text into the currently focused text field on Android. Args: text (the text to input), submit (whether to submit the text by pressing Enter after typing).
mobile_key_press - Press a physical or virtual button on the Android device. Args: button (button name: BACK, HOME, RECENT, ENTER).
Android Mobile MCP
Overview
Android Mobile MCP bridges the Model Context Protocol with Android device automation, enabling AI agents to interact with Android devices through UI manipulation, app management, and screen capture.
MCP Configuration
{
  "mcpServers": {
    "android-mobile-mcp": {
      "command": "uvx",
      "args": ["android-mobile-mcp"]
    }
  }
}
Prerequisites
- Connect Android device via USB or network
- Enable USB debugging on your Android device
- Install ADB (Android Debug Bridge)
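Before initializing the server, it is worth confirming that adb actually sees an authorized device. As a sketch, the helper below (hypothetical; the sample output is illustrative) parses the text that `adb devices` prints and keeps only devices in the authorized `device` state:

```python
# Hypothetical helper: parse `adb devices` output to find usable devices.
def connected_devices(adb_devices_output: str) -> list[str]:
    """Return serials of devices in the authorized 'device' state."""
    serials = []
    for line in adb_devices_output.splitlines()[1:]:  # skip header line
        parts = line.split()
        if len(parts) >= 2 and parts[1] == "device":
            serials.append(parts[0])
    return serials

sample = """List of devices attached
emulator-5554\tdevice
192.168.1.42:5555\tunauthorized"""
print(connected_devices(sample))  # ['emulator-5554']
```

A device listed as `unauthorized` means the USB-debugging prompt on the phone has not been accepted yet.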
Tools Reference
Screen Analysis
mobile_dump_ui - Extract UI elements as hierarchical JSON
- Parses screen XML to identify focusable elements and text content
- Calculates center coordinates for each interactive element
- Returns structured parent-child element relationships
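The steps above can be sketched in Python. This is not the server's actual source; it assumes the dump comes from uiautomator's XML format (where bounds look like `[x1,y1][x2,y2]`), and the `SAMPLE` dump and helper names are illustrative:

```python
import xml.etree.ElementTree as ET

# Illustrative uiautomator-style dump: a non-focusable wrapper with one button.
SAMPLE = """<hierarchy>
  <node class="android.widget.FrameLayout" bounds="[0,0][1080,1920]" focusable="false" text="">
    <node class="android.widget.Button" bounds="[100,200][300,280]" focusable="true" text="OK"/>
  </node>
</hierarchy>"""

def center(bounds):
    # bounds strings look like "[x1,y1][x2,y2]"
    (x1, y1), (x2, y2) = (tuple(map(int, p.split(","))) for p in bounds[1:-1].split("]["))
    return ((x1 + x2) // 2, (y1 + y2) // 2)

def interesting(n):
    # keep focusable elements or elements carrying text / content-desc
    return n.get("focusable") == "true" or n.get("text") or n.get("content-desc")

def to_tree(node):
    kids = [e for c in node for e in to_tree(c)]
    if node.tag == "node" and interesting(node):
        return [{"class": node.get("class"), "text": node.get("text"),
                 "center": center(node.get("bounds")), "children": kids}]
    return kids  # skip boring wrappers but promote their children

tree = to_tree(ET.fromstring(SAMPLE))
print(tree)  # the FrameLayout is dropped; the Button survives with center (200, 240)
```

Promoting the children of filtered-out wrappers keeps the parent-child relationships of the remaining elements intact.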
mobile_take_screenshot - Capture current screen state
- Returns PNG image data for visual analysis
Touch Interactions
mobile_click - Click at specific coordinates
- Validates coordinates against current UI state
- Requires a prior mobile_dump_ui call for coordinate verification
- Prevents clicking on invalid or non-interactive areas
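A minimal sketch of that validation step, assuming the server checks the tap point against element bounds from the latest dump (the bounds strings and element list below are illustrative):

```python
# Assumed validation: allow a tap only if it lands inside the bounds of
# some interactive element reported by the last UI dump.
def in_bounds(x, y, bounds):
    # bounds strings look like "[x1,y1][x2,y2]"
    (x1, y1), (x2, y2) = (tuple(map(int, p.split(","))) for p in bounds[1:-1].split("]["))
    return x1 <= x <= x2 and y1 <= y <= y2

interactive_bounds = ["[100,200][300,280]", "[0,1800][1080,1920]"]

def click_allowed(x, y):
    return any(in_bounds(x, y, b) for b in interactive_bounds)

print(click_allowed(200, 240))  # True: inside the first element
print(click_allowed(50, 50))    # False: not on any interactive element
```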
mobile_swipe - Perform swipe gestures
- Executes directional swipes between two coordinate points
- Configurable duration for gesture speed control
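The server most plausibly drives gestures through adb's standard `input swipe` command; that wrapping is an assumption, but the adb command shape itself is standard (`input swipe x1 y1 x2 y2 [duration_ms]`):

```python
# Build the adb command a swipe would map to (assumed implementation).
def swipe_cmd(x1, y1, x2, y2, duration_ms=300):
    return ["adb", "shell", "input", "swipe",
            str(x1), str(y1), str(x2), str(y2), str(duration_ms)]

# A scroll-up gesture on a 1080-wide screen:
print(" ".join(swipe_cmd(540, 1500, 540, 500)))
# adb shell input swipe 540 1500 540 500 300
```

A longer duration produces a slower drag, which matters for gestures the app interprets differently at different speeds.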
Text Input
mobile_type - Input text into focused fields
- Sends text to currently active input field
- Optional automatic submission with Enter key
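If typing is implemented over adb's `input text` (an assumption about this server), there is one well-known wrinkle: `input text` does not accept literal spaces, so they must be sent as `%s`. The optional submit then maps to an Enter keyevent (KEYCODE_ENTER is 66):

```python
# Sketch of mobile_type on top of adb (assumed wrapping).
def type_cmds(text, submit=False):
    # `input text` treats %s as a space; literal spaces are rejected.
    cmds = [["adb", "shell", "input", "text", text.replace(" ", "%s")]]
    if submit:
        cmds.append(["adb", "shell", "input", "keyevent", "66"])  # KEYCODE_ENTER
    return cmds

cmds = type_cmds("hello world", submit=True)
print(cmds[0][-1])  # hello%sworld
print(len(cmds))    # 2
```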
Navigation
mobile_key_press - Press system buttons
- Supports hardware and virtual keys: BACK, HOME, RECENT, ENTER
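The four button names line up with standard Android KeyEvent constants; the dict-based dispatch below is an assumed implementation detail, but the keycode values themselves are Android's documented constants:

```python
# Standard Android KeyEvent codes for the supported buttons.
KEYCODES = {
    "BACK": 4,      # KEYCODE_BACK
    "HOME": 3,      # KEYCODE_HOME
    "RECENT": 187,  # KEYCODE_APP_SWITCH
    "ENTER": 66,    # KEYCODE_ENTER
}

def key_press_cmd(button):
    return ["adb", "shell", "input", "keyevent", str(KEYCODES[button.upper()])]

print(" ".join(key_press_cmd("back")))  # adb shell input keyevent 4
```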
App Management
mobile_list_apps - List installed applications
- Filters out system apps and non-launchable packages
- Returns only user-accessible applications
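Filtering out system packages is what adb's `pm list packages -3` flag does; assuming the server builds on that (the sample output below is illustrative), the remaining work is stripping the `package:` prefix from each line:

```python
# Illustrative `adb shell pm list packages -3` output.
SAMPLE_PM = """package:com.example.notes
package:org.mozilla.firefox"""

def user_packages(pm_output):
    return [line[len("package:"):]
            for line in pm_output.splitlines()
            if line.startswith("package:")]

print(user_packages(SAMPLE_PM))  # ['com.example.notes', 'org.mozilla.firefox']
```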
mobile_launch_app - Start applications by package name
- Validates package existence before launch attempt
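One plausible shape for that check, assuming the server launches apps via adb (the `installed` set is illustrative; `monkey` with a single event is a common adb trick for firing an app's launcher intent by package name):

```python
# Assumed pre-launch validation plus launch command construction.
installed = {"com.example.notes", "org.mozilla.firefox"}

def launch_cmd(package):
    if package not in installed:
        raise ValueError(f"package not installed: {package}")
    return ["adb", "shell", "monkey", "-p", package,
            "-c", "android.intent.category.LAUNCHER", "1"]

print(" ".join(launch_cmd("com.example.notes")))
# adb shell monkey -p com.example.notes -c android.intent.category.LAUNCHER 1
```

Failing fast on a missing package gives the AI agent a clear error instead of a silent no-op launch attempt.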
Related Skills
- Comprehensive guide for creating professional UI/UX designs in Penpot using MCP tools. Use this skill when: (1) Creating new UI/UX designs for web, mobile, or desktop applications, (2) Building design systems with components and tokens, (3) Designing dashboards, forms, navigation, or landing pages, (4) Applying accessibility standards and best practices, (5) Following platform guidelines (iOS, Android, Material Design), (6) Reviewing or improving existing Penpot designs for usability. Triggers: "design a UI", "create interface", "build layout", "design dashboard", "create form", "design landing page", "make it accessible", "design system", "component library".
- Mobile-first design and engineering doctrine for iOS and Android apps. Covers touch interaction, performance, platform conventions, offline behavior, and mobile-specific decision-making. Teaches principles and constraints, not fixed layouts. Use for React Native, Flutter, or native mobile apps.
- Master Material Design 3 and Jetpack Compose patterns for building native Android apps. Use when designing Android interfaces, implementing Compose UI, or following Google's Material Design guidelines.
- UI design system toolkit for Senior UI Designer including design token generation, component documentation, responsive design calculations, and developer handoff tools. Use for creating design systems, maintaining visual consistency, and facilitating design-dev collaboration.
- Guide for building TypeScript CLIs with Bun. Use when creating command-line tools, adding subcommands to existing CLIs, or building developer tooling. Covers argument parsing, subcommand patterns, output formatting, and distribution.
- Workflow guide when working with Android builds or the mobile/ directory.