axiom-ios-ai


Use when implementing ANY Apple Intelligence or on-device AI feature. Covers Foundation Models, @Generable, LanguageModelSession, structured output, Tool protocol, iOS 26 AI integration.

Install

mkdir -p .claude/skills/axiom-ios-ai && curl -L -o skill.zip "https://mcp.directory/api/skills/download/4991" && unzip -o skill.zip -d .claude/skills/axiom-ios-ai && rm skill.zip

Installs to .claude/skills/axiom-ios-ai

About this skill

iOS Apple Intelligence Router

You MUST use this skill for ANY Apple Intelligence or Foundation Models work.

When to Use

Use this router when:

  • Implementing Apple Intelligence features
  • Using Foundation Models
  • Working with LanguageModelSession
  • Generating structured output with @Generable
  • Debugging AI generation issues
  • iOS 26 on-device AI

AI Approach Triage

First, determine which kind of AI the developer needs:

Developer Intent → Route To

  • On-device text generation (Apple Intelligence) → Stay here → Foundation Models skills
  • Custom ML model deployment (PyTorch, TensorFlow) → ios-ml → CoreML conversion, compression
  • Computer vision (image analysis, OCR, segmentation) → ios-vision → Vision framework
  • Cloud API integration (OpenAI, etc.) → ios-networking → URLSession patterns
  • System AI features (Writing Tools, Genmoji) → No custom code needed; these are system-provided

Key boundary: ios-ai vs ios-ml

  • ios-ai = Apple's Foundation Models framework (LanguageModelSession, @Generable, on-device LLM)
  • ios-ml = Custom model deployment (CoreML conversion, quantization, MLTensor, speech-to-text)
  • If the developer says "run my own model" → ios-ml. If they say "use Apple Intelligence" → ios-ai.

Cross-Domain Routing

Foundation Models + concurrency (session blocking main thread, UI freezes):

  • Foundation Models sessions are async — blocking usually means a missing await or heavy synchronous work on the @MainActor
  • Fix here first using async session patterns in foundation-models skill
  • If concurrency issue is broader than Foundation Models → also invoke ios-concurrency
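The non-blocking pattern above can be sketched as follows. This is a minimal illustration based on Apple's FoundationModels framework (iOS 26); the `Summarizer` type and prompt text are hypothetical, and the exact API surface may vary by SDK version:

```swift
import FoundationModels

final class Summarizer {
    private let session = LanguageModelSession()

    // respond(to:) is async: awaiting it suspends the caller instead of
    // blocking a thread, so the main thread stays free to render UI
    // while the on-device model generates.
    func summarize(_ text: String) async throws -> String {
        let response = try await session.respond(to: "Summarize briefly: \(text)")
        return response.content
    }
}

// From SwiftUI, wrap the call in a Task so the button action returns
// immediately instead of stalling the main actor:
// Button("Summarize") {
//     Task { result = try? await summarizer.summarize(text) }
// }
```

If the UI still freezes with this shape, the stall is usually elsewhere (e.g. a synchronous wrapper around the async call), which is when ios-concurrency applies.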

Foundation Models + data (@Generable decoding errors, structured output issues):

  • @Generable output problems are Foundation Models-specific, NOT generic Codable issues
  • Stay here → foundation-models-diag handles structured output debugging
  • If developer also has general Codable/serialization questions → also invoke ios-data
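The distinction matters because @Generable output is produced by constrained generation, not decoded from JSON with Codable. A sketch of the structured-output shape, with hypothetical type and property names:

```swift
import FoundationModels

// The @Generable macro derives a schema the model is constrained to
// follow; @Guide attaches per-property generation instructions.
@Generable
struct Itinerary {
    @Guide(description: "A short, catchy trip title")
    var title: String

    @Guide(description: "Three to five suggested activities")
    var activities: [String]
}

func planTrip() async throws -> Itinerary {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Plan a weekend trip to Lisbon",
        generating: Itinerary.self
    )
    // response.content is already a typed Itinerary; no JSONDecoder,
    // so classic Codable debugging techniques do not apply here.
    return response.content
}
```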

Routing Logic

Foundation Models Work

Implementation patterns → /skill axiom-foundation-models

  • LanguageModelSession basics
  • @Generable structured output
  • Tool protocol integration
  • Streaming with PartiallyGenerated
  • Dynamic schemas
  • 26 WWDC code examples
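The session basics covered by that skill look roughly like this. A hedged sketch, assuming the FoundationModels framework; the instructions string and prompts are illustrative:

```swift
import FoundationModels

// Instructions set the model's standing behavior for the whole session.
let session = LanguageModelSession(
    instructions: "You are a concise assistant for a recipe app."
)

func chat() async throws {
    // The session keeps a transcript, so follow-up prompts are
    // automatically multi-turn: the second request sees the first.
    let first = try await session.respond(to: "Suggest a quick pasta dish")
    print(first.content)

    let second = try await session.respond(to: "Make it vegetarian")
    print(second.content)
}
```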

API reference → /skill axiom-foundation-models-ref

  • Complete API documentation
  • All @Generable examples
  • Tool protocol patterns
  • Streaming generation patterns
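For orientation, a Tool protocol conformance typically has this shape. This is a sketch with a stubbed weather lookup; the tool name, arguments, and output are hypothetical:

```swift
import FoundationModels

// A custom tool the model can decide to call during generation.
struct WeatherTool: Tool {
    let name = "getWeather"
    let description = "Returns the current weather for a city"

    // Tool arguments are themselves @Generable, so the model produces
    // them with the same constrained decoding as structured output.
    @Generable
    struct Arguments {
        @Guide(description: "The city to look up")
        var city: String
    }

    func call(arguments: Arguments) async throws -> ToolOutput {
        // Real code would query WeatherKit or a web service here.
        ToolOutput("Sunny, 24°C in \(arguments.city)")
    }
}

// Tools are registered when the session is created.
let session = LanguageModelSession(tools: [WeatherTool()])
```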

Diagnostics → /skill axiom-foundation-models-diag

  • AI response blocked
  • Generation slow
  • Guardrail violations
  • Context limits exceeded
  • Model unavailable
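The "model unavailable" case in particular should be checked before any session is created. A sketch of the availability check, assuming the FoundationModels API as presented at WWDC 2025 (exact unavailability reasons may differ by SDK seed):

```swift
import FoundationModels

// The model can be unavailable on unsupported hardware, when Apple
// Intelligence is switched off, or while model assets are downloading.
switch SystemLanguageModel.default.availability {
case .available:
    break  // Safe to create a LanguageModelSession
case .unavailable(.deviceNotEligible):
    print("This device does not support Apple Intelligence")
case .unavailable(.appleIntelligenceNotEnabled):
    print("Ask the user to enable Apple Intelligence in Settings")
case .unavailable(.modelNotReady):
    print("Model assets are still downloading; retry later")
case .unavailable(let reason):
    print("Unavailable: \(reason)")
}
```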

Decision Tree

  1. Custom ML model / CoreML / PyTorch conversion? → Route to ios-ml (not this router)
  2. Computer vision / image analysis / OCR? → Route to ios-vision (not this router)
  3. Cloud AI API integration? → Route to ios-networking (not this router)
  4. Implementing Foundation Models / @Generable / Tool protocol? → foundation-models
  5. Need API reference / code examples? → foundation-models-ref
  6. Debugging AI issues (blocked, slow, guardrails)? → foundation-models-diag
  7. Foundation Models + UI freezing? → foundation-models (async patterns) + also invoke ios-concurrency if needed

Anti-Rationalization

Thought → Reality

  • "Foundation Models is just LanguageModelSession" → Foundation Models has @Generable, the Tool protocol, streaming, and guardrails. foundation-models covers all of them.
  • "I'll figure out the AI patterns as I go" → AI APIs have specific error-handling and fallback requirements. foundation-models prevents runtime failures.
  • "I've used LLMs before, this is similar" → Apple's on-device models have unique constraints (guardrails, context limits). foundation-models is Apple-specific.

Critical Patterns

foundation-models:

  • LanguageModelSession setup
  • @Generable for structured output
  • Tool protocol for function calling
  • Streaming generation
  • Dynamic schema evolution
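Streaming with PartiallyGenerated can be sketched like this, following the shape of Apple's WWDC 2025 sample code; the `Outline` type is hypothetical, and the stream element shape has shifted between SDK seeds, so treat this as orientation rather than a definitive implementation:

```swift
import FoundationModels

@Generable
struct Outline {
    var title: String
    var points: [String]
}

func streamOutline() async throws {
    let session = LanguageModelSession()
    // streamResponse yields Outline.PartiallyGenerated snapshots, in
    // which every property is optional until the model has produced it,
    // so the UI can render fields incrementally as they arrive.
    let stream = session.streamResponse(
        to: "Outline a talk on Swift concurrency",
        generating: Outline.self
    )
    for try await partial in stream {
        print(partial.title ?? "…", partial.points ?? [])
    }
}
```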

foundation-models-diag:

  • Blocked response handling
  • Performance optimization
  • Guardrail violations
  • Context management
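Blocked responses and context limits surface as thrown errors, so the diagnostic patterns above reduce to catching specific GenerationError cases. A hedged sketch, assuming the error cases from the FoundationModels framework:

```swift
import FoundationModels

func generateSafely(_ session: LanguageModelSession,
                    prompt: String) async -> String {
    do {
        return try await session.respond(to: prompt).content
    } catch LanguageModelSession.GenerationError.guardrailViolation {
        // The safety system blocked the prompt or the response;
        // show a graceful fallback rather than surfacing the error.
        return "That request can't be completed."
    } catch LanguageModelSession.GenerationError.exceededContextWindowSize {
        // The transcript grew past the context limit; start a fresh
        // session, optionally seeded with a summary of the old one.
        return "This conversation is too long; please start a new one."
    } catch {
        return "Generation failed: \(error.localizedDescription)"
    }
}
```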

Example Invocations

User: "How do I use Apple Intelligence to generate structured data?" → Invoke: /skill axiom-foundation-models

User: "My AI generation is being blocked" → Invoke: /skill axiom-foundation-models-diag

User: "Show me @Generable examples" → Invoke: /skill axiom-foundation-models-ref

User: "Implement streaming AI generation" → Invoke: /skill axiom-foundation-models

User: "I want to add AI to my app" → First ask: Apple Intelligence (Foundation Models) or custom ML model? Route accordingly.

User: "My Foundation Models session is blocking the UI" → Invoke: /skill axiom-foundation-models (async patterns) + also invoke ios-concurrency if needed

User: "I want to run my PyTorch model on device" → Route to: ios-ml router (CoreML conversion, not Foundation Models)

axiom-ios-build

CharlesWiltgen

Use when ANY iOS build fails, test crashes, Xcode misbehaves, or environment issue occurs before debugging code. Covers build failures, compilation errors, dependency conflicts, simulator problems, environment-first diagnostics.


axiom-getting-started

CharlesWiltgen

Use when first installing Axiom, unsure which skill to use, want an overview of available skills, or need help finding the right skill for your situation — interactive onboarding that recommends skills based on your project and current focus


axiom-ui-testing

CharlesWiltgen

Use when writing UI tests, recording interactions, tests have race conditions, timing dependencies, inconsistent pass/fail behavior, or XCTest UI tests are flaky - covers Recording UI Automation (WWDC 2025), condition-based waiting, network conditioning, multi-factor testing, crash debugging, and accessibility-first testing patterns


axiom-core-spotlight-ref

CharlesWiltgen

Use when indexing app content for Spotlight search, using NSUserActivity for prediction/handoff, or choosing between CSSearchableItem and IndexedEntity - covers Core Spotlight framework and NSUserActivity integration for iOS 9+


axiom-vision-diag

CharlesWiltgen

subject not detected, hand pose missing landmarks, low confidence observations, Vision performance, coordinate conversion, VisionKit errors, observation nil, text not recognized, barcode not detected, DataScannerViewController not working, document scan issues


axiom-now-playing-carplay

CharlesWiltgen

CarPlay Now Playing integration patterns. Use when implementing CarPlay audio controls, CPNowPlayingTemplate customization, or debugging CarPlay-specific issues.


