An MCP skill for iOS Simulator automation: boot devices, tap, swipe, type, take screenshots, and record video. Works with any MCP-compatible AI agent.
macOS only. Requires Xcode with iOS Simulators installed, idb, and xcrun on PATH.
This skill enables AI agents to control iOS Simulators through the Model Context Protocol (MCP). Connect it to Claude or any other MCP-compatible client and automate iOS app testing with natural language.
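For orientation, here is a minimal sketch of how a server for this skill could expose one such tool over MCP, using the official Python SDK's FastMCP helper. The tool name matches this skill's ui_tap, but the server name, argument names, and the idb-backed body are illustrative assumptions, not the skill's actual source.

```python
# Minimal sketch (not this skill's actual source): exposing a ui_tap-style
# tool over MCP with the official Python SDK's FastMCP helper.
import subprocess

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("ios-simulator")  # server name is illustrative


@mcp.tool()
def ui_tap(x: int, y: int, udid: str) -> str:
    """Tap the booted simulator at (x, y); assumes idb as the backend."""
    subprocess.run(["idb", "ui", "tap", str(x), str(y), "--udid", udid], check=True)
    return f"tapped ({x}, {y})"


if __name__ == "__main__":
    mcp.run()  # defaults to stdio transport, which MCP clients spawn directly
```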
Platform: macOS only (requires Xcode)
Prerequisites: Xcode with iOS Simulators installed; idb and xcrun on PATH
Protocol: Model Context Protocol (MCP)
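A quick way to sanity-check these prerequisites before wiring the skill into a client is to confirm that both CLIs resolve on PATH and that Xcode's simulator runtimes are present. The snippet below is an illustrative preflight check, not part of the skill itself.

```python
# Illustrative preflight check: confirm the CLIs this skill shells out to
# (xcrun and idb) are resolvable on PATH, then list installed runtimes.
import shutil
import subprocess

for tool in ("xcrun", "idb"):
    if shutil.which(tool) is None:
        raise SystemExit(f"{tool} not found on PATH")

# Lists simulator runtimes to confirm Xcode's iOS Simulators are installed.
subprocess.run(["xcrun", "simctl", "list", "runtimes"], check=True)
```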
open_simulator: Boot the simulator
get_booted_sim_id: Get device UDID
ui_view: Visual inspection
ui_tap / ui_type / ui_swipe: Interact with UI
screenshot / record_video: Capture results
open_simulator: Boot the iOS Simulator with a specified device type.
get_booted_sim_id: Get the UDID of the currently booted simulator.
ui_view: Inspect the current UI hierarchy for element identification.
ui_tap: Tap at a specific coordinate or element on screen.
ui_swipe: Perform swipe gestures (up, down, left, right).
ui_type: Type text into the currently focused input field.
screenshot: Capture a screenshot of the simulator screen.
record_video: Record video of simulator interactions.
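Under the hood, tools like these typically wrap Apple's simctl and the idb CLI. The sketch below maps each tool to a plausible backing command; the commands themselves are real, but the pairing with this skill's tools is an assumption, not a dump of its implementation.

```python
# Illustrative mapping from this skill's tools to the CLI commands they
# plausibly wrap (the commands exist; the pairing is an assumption).
import json
import subprocess


def open_simulator(device: str = "iPhone 15") -> None:
    # Boot a simulator by device name or UDID.
    subprocess.run(["xcrun", "simctl", "boot", device], check=True)


def get_booted_sim_id() -> str:
    # "booted" filters simctl's JSON device list to running simulators.
    out = subprocess.run(
        ["xcrun", "simctl", "list", "devices", "booted", "-j"],
        check=True, capture_output=True, text=True,
    ).stdout
    for runtime_devices in json.loads(out)["devices"].values():
        for device in runtime_devices:
            return device["udid"]
    raise RuntimeError("no booted simulator")


def ui_view(udid: str) -> str:
    # Dump the accessibility hierarchy for element identification.
    return subprocess.run(
        ["idb", "ui", "describe-all", "--udid", udid],
        check=True, capture_output=True, text=True,
    ).stdout


# ui_tap was sketched earlier: idb ui tap X Y --udid UDID


def ui_swipe(udid: str, x1: int, y1: int, x2: int, y2: int) -> None:
    subprocess.run(
        ["idb", "ui", "swipe", str(x1), str(y1), str(x2), str(y2), "--udid", udid],
        check=True,
    )


def ui_type(udid: str, text: str) -> None:
    subprocess.run(["idb", "ui", "text", text, "--udid", udid], check=True)


def screenshot(path: str = "screen.png") -> None:
    subprocess.run(["xcrun", "simctl", "io", "booted", "screenshot", path], check=True)


def record_video(path: str = "capture.mp4") -> subprocess.Popen:
    # recordVideo streams until interrupted, so run it in the background.
    return subprocess.Popen(["xcrun", "simctl", "io", "booted", "recordVideo", path])
```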
Once connected to an MCP client like Claude, you can control the simulator using natural language. The AI agent translates your requests into the appropriate MCP tool calls.
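At the protocol level, a request like "tap the button and take a screenshot" becomes a sequence of tool calls. The snippet below sketches that sequence with the MCP Python SDK's stdio client; the server launch command, argument names, and coordinates are illustrative assumptions.

```python
# Illustrative client-side view of the tool calls an agent might issue
# (server command, argument names, and coordinates are assumptions).
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Hypothetical launch command for this skill's MCP server.
    server = StdioServerParameters(command="ios-simulator-mcp", args=[])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            await session.call_tool("open_simulator", {"device": "iPhone 15"})
            await session.call_tool("get_booted_sim_id", {})
            await session.call_tool("ui_tap", {"x": 200, "y": 640})
            await session.call_tool("screenshot", {"path": "after_tap.png"})


asyncio.run(main())
```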
Built for iOS developers who want to automate testing with AI. Connect to Claude or any MCP-compatible agent and control iOS Simulators with natural language.