axiom-haptics
Use when implementing haptic feedback, Core Haptics patterns, audio-haptic synchronization, or debugging haptic issues - covers UIFeedbackGenerator, CHHapticEngine, AHAP patterns, and Apple's Causality-Harmony-Utility design principles from WWDC 2021
Install
mkdir -p .claude/skills/axiom-haptics && curl -L -o skill.zip "https://mcp.directory/api/skills/download/5666" && unzip -o skill.zip -d .claude/skills/axiom-haptics && rm skill.zip

Installs to .claude/skills/axiom-haptics
About this skill
Haptics & Audio Feedback
Comprehensive guide to implementing haptic feedback on iOS. Excellent haptic feedback is a hallmark of Apple Design Award winners, and Apple's own apps (Camera, Maps, Weather) use haptics masterfully to create delightful, responsive experiences.
Overview
Haptic feedback provides tactile confirmation of user actions and system events. When designed thoughtfully using the Causality-Harmony-Utility framework, haptics transform interfaces from functional to delightful.
This skill covers both simple haptics (UIFeedbackGenerator) and advanced custom patterns (Core Haptics), with real-world examples and audio-haptic synchronization techniques.
When to Use This Skill
- Adding haptic feedback to user interactions
- Choosing between UIFeedbackGenerator and Core Haptics
- Designing audio-haptic experiences that feel unified
- Creating custom haptic patterns with AHAP files
- Synchronizing haptics with animations and audio
- Debugging haptic issues (simulator vs device)
- Optimizing haptic performance and battery impact
System Requirements
- iOS 10+ for UIFeedbackGenerator
- iOS 13+ for Core Haptics (CHHapticEngine)
- iPhone 8+ for Core Haptics hardware support
- Physical device required - haptics cannot be felt in Simulator
Part 1: Design Principles (WWDC 2021/10278)
Apple's audio and haptic design teams established three core principles for multimodal feedback:
Causality - Make it obvious what caused the feedback
Problem: The user can't tell what triggered the haptic.
Solution: Haptic timing must match the visual/interaction moment.
Example from WWDC:
- ✅ Ball hits wall → haptic fires at collision moment
- ❌ Ball hits wall → haptic fires 100ms later (confusing)
Code pattern:

```swift
// ✅ Immediate feedback on touch
@objc func buttonTapped() {
    let generator = UIImpactFeedbackGenerator(style: .medium)
    generator.impactOccurred() // Fire immediately
    performAction()
}

// ❌ Delayed feedback loses causality
@objc func buttonTapped() {
    performAction()
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.2) {
        let generator = UIImpactFeedbackGenerator(style: .medium)
        generator.impactOccurred() // Too late!
    }
}
```
Harmony - Senses work best when coherent
Problem: Visual, audio, and haptic feedback don't match.
Solution: All three senses should feel like a unified experience.
Example from WWDC:
- Small ball → light haptic + high-pitched sound
- Large ball → heavy haptic + low-pitched sound
- Shield transformation → continuous haptic + progressive audio
Key insight: A large object should feel heavy, sound low and resonant, and look substantial. All three senses reinforce the same experience.
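One way to sketch the harmony principle in code is to derive every channel's parameters from a single physical property, so the senses cannot drift apart. The `FeedbackProfile` type, the mapping curve, and its constants below are illustrative assumptions, not Apple's implementation.

```swift
import Foundation

/// Harmony sketch: one physical property (normalized mass, 0...1)
/// drives haptic intensity, haptic sharpness, and audio pitch together.
struct FeedbackProfile {
    let hapticIntensity: Double     // heavier object -> stronger impact
    let hapticSharpness: Double     // heavier object -> duller, more resonant
    let audioPitchSemitones: Double // heavier object -> lower pitch
}

func profile(forMass mass: Double) -> FeedbackProfile {
    let m = min(max(mass, 0), 1) // clamp to the expected range
    return FeedbackProfile(
        hapticIntensity: 0.4 + 0.6 * m,
        hapticSharpness: 0.8 - 0.6 * m,
        audioPitchSemitones: 12 - 24 * m // +12 (light) down to -12 (heavy)
    )
}

// On device, the same profile would feed the generator, e.g.:
// let p = profile(forMass: 0.9)
// UIImpactFeedbackGenerator(style: .heavy)
//     .impactOccurred(intensity: CGFloat(p.hapticIntensity))
```

Because every channel reads from the same profile, tuning "how heavy the ball feels" becomes a single-parameter change instead of three separate edits.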
Utility - Provide clear value
Problem: Haptics used everywhere "just because we can".
Solution: Reserve haptics for significant moments that benefit the user.
When to use haptics:
- ✅ Confirming an important action (payment completed)
- ✅ Alerting to critical events (low battery)
- ✅ Providing continuous feedback (scrubbing slider)
- ✅ Enhancing delight (app launch flourish)
When NOT to use haptics:
- ❌ Every single tap (overwhelming)
- ❌ Scrolling through long lists (battery drain)
- ❌ Background events user can't see (confusing)
- ❌ Decorative animations (no value)
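In practice, the utility principle often comes down to firing fewer haptics. As a minimal sketch (the `HapticThrottle` type and the 0.15-second minimum gap are assumptions, not an Apple API), a small throttle keeps rapid-fire events such as fast list scrolling from buzzing on every tick:

```swift
import Foundation

/// Utility sketch: suppress haptics that arrive faster than `minimumGap`.
/// The clock is injected so the logic can be tested without real time.
final class HapticThrottle {
    private let minimumGap: Double
    private let now: () -> Double
    private var lastFired: Double = -.infinity

    init(minimumGap: Double = 0.15,
         now: @escaping () -> Double = { Date().timeIntervalSinceReferenceDate }) {
        self.minimumGap = minimumGap
        self.now = now
    }

    /// Returns true (and records the time) only if enough time has passed.
    func shouldFire() -> Bool {
        let t = now()
        guard t - lastFired >= minimumGap else { return false }
        lastFired = t
        return true
    }
}

// Usage in a scroll or drag callback:
// if throttle.shouldFire() { impactGenerator.impactOccurred(intensity: 0.4) }
```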
Part 2: UIFeedbackGenerator (Simple Haptics)
For most apps, the three concrete UIFeedbackGenerator subclasses provide all the haptics needed, with no custom patterns required.
UIImpactFeedbackGenerator
Physical collision or impact sensation.
Styles:
- .light - Small, delicate tap
- .medium - Standard tap (most common)
- .heavy - Strong, solid impact
- .rigid - Firm, precise tap (iOS 13+)
- .soft - Gentle, cushioned tap (iOS 13+)
Usage pattern:
```swift
class MyViewController: UIViewController {
    let impactGenerator = UIImpactFeedbackGenerator(style: .medium)

    override func viewDidLoad() {
        super.viewDidLoad()
        // Prepare reduces latency for the next impact
        impactGenerator.prepare()
    }

    @objc func userDidTap() {
        impactGenerator.impactOccurred()
    }
}
```
Intensity variation (iOS 13+):
```swift
// intensity: 0.0 (lightest) to 1.0 (strongest)
impactGenerator.impactOccurred(intensity: 0.5)
```
Common use cases:
- Button taps (.medium)
- Toggle switches (.light)
- Deleting items (.heavy)
- Confirming selections (.rigid)
UISelectionFeedbackGenerator
Discrete selection changes (picker wheels, segmented controls).
Usage:
```swift
class PickerViewController: UIViewController, UIPickerViewDelegate {
    let selectionGenerator = UISelectionFeedbackGenerator()

    func pickerView(_ pickerView: UIPickerView, didSelectRow row: Int,
                    inComponent component: Int) {
        selectionGenerator.selectionChanged()
    }
}
```
Feels like: Clicking a physical wheel with detents
Common use cases:
- Picker wheels
- Segmented controls
- Page indicators
- Step-through interfaces
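For continuous controls that should feel like a detented wheel, a common pattern is to fire selectionChanged() only when the value crosses into a new step. The `DetentTracker` type below is a hypothetical helper, not a UIKit API:

```swift
import Foundation

/// Detent sketch: report a change only when `value` lands on a
/// different step than the previous call, mimicking a clicky wheel.
struct DetentTracker {
    let step: Double
    private(set) var lastDetent: Int? = nil

    /// Returns true when `value` has crossed into a new detent.
    mutating func update(value: Double) -> Bool {
        let detent = Int((value / step).rounded())
        defer { lastDetent = detent }
        return detent != lastDetent
    }
}

// Usage in a slider callback:
// if tracker.update(value: Double(slider.value)) {
//     selectionGenerator.selectionChanged()
// }
```

This keeps the haptic rate proportional to meaningful value changes rather than to touch-move events, which also helps the battery concerns noted above.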
UINotificationFeedbackGenerator
System-level success/warning/error feedback.
Types:
- .success - Task completed successfully
- .warning - Attention needed, but not critical
- .error - Critical error occurred
Usage:
```swift
let notificationGenerator = UINotificationFeedbackGenerator()

func submitForm() {
    // Validate form
    if isValid {
        notificationGenerator.notificationOccurred(.success)
        saveData()
    } else {
        notificationGenerator.notificationOccurred(.error)
        showValidationErrors()
    }
}
```
Best practice: Match haptic type to user outcome
- ✅ Payment succeeds → .success
- ✅ Form validation fails → .error
- ✅ Approaching storage limit → .warning
Performance: prepare()
Call prepare() before the haptic to reduce latency:
```swift
// ✅ Good - prepare before user action
@IBAction func buttonTouchDown(_ sender: UIButton) {
    impactGenerator.prepare() // User's finger is down
}

@IBAction func buttonTouchUpInside(_ sender: UIButton) {
    impactGenerator.impactOccurred() // Immediate haptic
}

// ❌ Bad - unprepared haptic may lag
@IBAction func buttonTapped(_ sender: UIButton) {
    let generator = UIImpactFeedbackGenerator(style: .medium)
    generator.impactOccurred() // May have 10-20ms delay
}
```
Prepare timing: System keeps engine ready for ~1 second after prepare().
Part 3: Core Haptics (Custom Haptics)
For apps needing custom patterns, Core Haptics provides full control over haptic waveforms.
Four Fundamental Elements
- Engine (CHHapticEngine) - Link to the phone's actuator
- Player (CHHapticPatternPlayer) - Playback control
- Pattern (CHHapticPattern) - Collection of events over time
- Events (CHHapticEvent) - Building blocks specifying the experience
CHHapticEngine Lifecycle
```swift
import CoreHaptics

class HapticManager {
    var engine: CHHapticEngine?

    func initializeHaptics() {
        // Check device support
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else {
            print("Device doesn't support haptics")
            return
        }

        do {
            // Create engine
            engine = try CHHapticEngine()

            // Handle interruptions (calls, Siri, etc.)
            // Capture self weakly so the handlers don't retain the manager
            engine?.stoppedHandler = { [weak self] reason in
                print("Engine stopped: \(reason)")
                self?.restartEngine()
            }

            // Handle reset (audio session changes)
            engine?.resetHandler = { [weak self] in
                print("Engine reset")
                self?.restartEngine()
            }

            // Start engine
            try engine?.start()
        } catch {
            print("Failed to create haptic engine: \(error)")
        }
    }

    func restartEngine() {
        do {
            try engine?.start()
        } catch {
            print("Failed to restart engine: \(error)")
        }
    }
}
```
Critical: Always set stoppedHandler and resetHandler to handle system interruptions.
CHHapticEvent Types
Transient Events
Short, discrete feedback (like a tap).
```swift
let intensity = CHHapticEventParameter(
    parameterID: .hapticIntensity,
    value: 1.0 // 0.0 to 1.0
)
let sharpness = CHHapticEventParameter(
    parameterID: .hapticSharpness,
    value: 0.5 // 0.0 (dull) to 1.0 (sharp)
)
let event = CHHapticEvent(
    eventType: .hapticTransient,
    parameters: [intensity, sharpness],
    relativeTime: 0.0 // Seconds from pattern start
)
```
Parameters:
- hapticIntensity: Strength (0.0 = barely felt, 1.0 = maximum)
- hapticSharpness: Character (0.0 = dull thud, 1.0 = crisp snap)
Continuous Events
Sustained feedback over time (like a vibration motor).
```swift
let intensity = CHHapticEventParameter(
    parameterID: .hapticIntensity,
    value: 0.8
)
let sharpness = CHHapticEventParameter(
    parameterID: .hapticSharpness,
    value: 0.3
)
let event = CHHapticEvent(
    eventType: .hapticContinuous,
    parameters: [intensity, sharpness],
    relativeTime: 0.0,
    duration: 2.0 // Seconds
)
```
Use cases:
- Rolling texture as object moves
- Motor running
- Charging progress
- Long press feedback
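Continuous events can also be modulated while they play, using CHHapticDynamicParameter and the player's sendParameters(_:atTime:) method. The sketch below nudges a running "charging" haptic stronger as progress increases; the `chargeIntensity` ramp and its constants are assumptions, and the Core Haptics calls are compiled only where the framework is available:

```swift
import Foundation
#if canImport(CoreHaptics)
import CoreHaptics
#endif

/// Intensity ramp for charging progress: ease from `baseline` up to 1.0.
/// The baseline keeps the haptic perceptible even at 0% progress.
func chargeIntensity(progress: Double, baseline: Double = 0.3) -> Float {
    let p = min(max(progress, 0), 1)
    return Float(baseline + (1 - baseline) * p)
}

#if canImport(CoreHaptics)
/// Modulate an already-playing continuous pattern in real time.
func updateChargeHaptic(_ player: CHHapticPatternPlayer, progress: Double) throws {
    let param = CHHapticDynamicParameter(
        parameterID: .hapticIntensityControl,
        value: chargeIntensity(progress: progress),
        relativeTime: 0
    )
    try player.sendParameters([param], atTime: CHHapticTimeImmediate)
}
#endif
```

Called periodically from whatever reports charging progress, this avoids tearing down and rebuilding the pattern just to change its strength.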
Creating and Playing Patterns
```swift
func playCustomPattern() {
    // Create events
    let tap1 = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.5),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
        ],
        relativeTime: 0.0
    )
    let tap2 = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.7),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.7)
        ],
        relativeTime: 0.3
    )
    let tap3 = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
```

*Content truncated.*
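The example above is cut off mid-pattern, so here is a hedged, self-contained sketch of the remaining steps: build a CHHapticPattern from events, ask the engine for a player, and start it. The `heartbeatTimes` helper and its timing constants are illustrative assumptions, not part of the original:

```swift
import Foundation
#if canImport(CoreHaptics)
import CoreHaptics
#endif

/// Relative times for an accelerating "heartbeat" of `count` taps:
/// each gap shrinks by `decay`, so the rhythm speeds up.
func heartbeatTimes(count: Int, firstGap: Double, decay: Double) -> [Double] {
    var times: [Double] = []
    var t = 0.0
    var gap = firstGap
    for _ in 0..<count {
        times.append(t)
        t += gap
        gap *= decay
    }
    return times
}

#if canImport(CoreHaptics)
func playHeartbeat(on engine: CHHapticEngine) throws {
    // Turn the timing curve into transient events
    let events = heartbeatTimes(count: 4, firstGap: 0.5, decay: 0.8).map { time in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.4)
            ],
            relativeTime: time
        )
    }
    // Pattern -> player -> start is the standard playback sequence
    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
#endif
```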
More by CharlesWiltgen