Embrace
Challenge
Long‑distance couples rely heavily on video calls and messaging, but these channels filter out the non‑verbal, tactile cues that usually sustain intimacy and relationship security. Over time, partners can feel touch‑deprived and physically alone even when they talk frequently. The challenge was to design an intimate mixed‑reality experience that restores a sense of embodied presence and affectionate touch at a distance, rather than “just another Zoom call.”
User & Insights
Design Process
Research & Framing
Reviewed work on affectionate touch, loneliness, and relationship satisfaction to understand why touch is so central to feeling secure with a partner. Also explored social VR and haptic technologies (e.g., pneumatic and acoustic haptic suits) to see how synchronised visual and tactile cues could recreate hugs and strokes at a distance.
"5 Whys" and POV
Used a 5 Whys exercise to dig into why couples feel disconnected despite frequent AV communication, tracing it to filtered non‑verbal cues, a lack of neural synchrony, and a “loneliness alarm” that frequent contact never switches off.
Defined a point‑of‑view statement: long‑distance couples like Maya need ways to recreate embodied presence and affectionate touch at a distance, because existing video‑mediated communication (VMC) tools make them feel physically alone and insecure despite “being in contact.”
Ideation
Mapped opportunities around VR/MR spaces (couch, balcony, rooftop dates), haptic touch patterns (hugs, hand holding, arm strokes), shared activities (films, co‑op games, meditation), safety and consent, and practical constraints (bulky headsets, time zones).
Applied SCAMPER to a mixed‑reality session concept: substitute standard video with a high‑fidelity VR environment, combine haptic feedback with physiological cues, adapt mirror‑touch so that the touch you see your partner make is exactly the touch you feel, and eliminate over‑reliance on emojis and likes.
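The mirror‑touch idea can be sketched as a simple receive‑side translation: the area you see touched on your avatar is the area your suit actuates, with the receiver's limits applied last. The gesture and command shapes below are hypothetical; the case study does not specify Embrace's actual event model.

```python
from dataclasses import dataclass

# Hypothetical gesture type for illustration only.
@dataclass(frozen=True)
class Gesture:
    kind: str        # e.g. "stroke", "hug", "hand_hold"
    area: str        # body area touched on the receiver's avatar
    intensity: float # 0.0–1.0, as performed by the sender

def mirror_touch(gesture: Gesture, allowed_areas: set,
                 max_intensity: float):
    """Translate a partner's gesture into a local haptic command.

    Mirror-touch rule: what you see is what you feel, same area,
    same pattern. Limits are applied on the *receiving* side, so
    each partner's own settings always win.
    """
    if gesture.area not in allowed_areas:
        return None  # touch outside the opt-in areas is dropped
    return {
        "pattern": gesture.kind,
        "area": gesture.area,
        "intensity": min(gesture.intensity, max_intensity),
    }
```

Applying limits on the receiving side is the safer design choice here: a sender can never exceed what the receiver has opted into, regardless of what their own client sends.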
Flow and interaction model
Designed a session flow from launch to summary: pairing headsets and suits, selecting touch modes and allowed areas, choosing a scene (date spot, relaxing couch, MR space), interacting with 3D avatars, triggering bi‑directional haptic patterns, and ending with a gentle wind‑down and session recap.
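The launch‑to‑summary flow above can be modelled as a small state machine with explicitly allowed transitions; the state names are illustrative, not the prototype's actual implementation.

```python
from enum import Enum, auto

# Illustrative session states, mirroring the flow described above.
class SessionState(Enum):
    PAIRING = auto()          # headsets and suits connect
    CONSENT_SETUP = auto()    # touch modes and allowed areas
    SCENE_SELECTION = auto()  # date spot, couch, MR space
    IN_SESSION = auto()       # avatars, talking, haptic touch
    WIND_DOWN = auto()        # gentle exit
    SUMMARY = auto()          # session recap

# Only these transitions are legal; anything else is rejected.
TRANSITIONS = {
    SessionState.PAIRING: {SessionState.CONSENT_SETUP},
    SessionState.CONSENT_SETUP: {SessionState.SCENE_SELECTION},
    SessionState.SCENE_SELECTION: {SessionState.IN_SESSION},
    # From in-session, a couple can switch scenes or wind down.
    SessionState.IN_SESSION: {SessionState.SCENE_SELECTION,
                              SessionState.WIND_DOWN},
    SessionState.WIND_DOWN: {SessionState.SUMMARY},
    SessionState.SUMMARY: set(),
}

def advance(current: SessionState, target: SessionState) -> SessionState:
    if target not in TRANSITIONS[current]:
        raise ValueError(f"cannot go from {current.name} to {target.name}")
    return target
```

Encoding the flow this way makes the “no touch before consent setup” ordering a property of the model rather than of individual screens.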
Emphasised consent and control at every step with explicit touch‑mode selection, adjustable intensity, opt‑in touch areas, and clear pause/stop options.
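One way to sketch that consent‑first rule is a settings object checked before any haptic pattern fires: areas are opt‑in only, intensity is capped, and pausing blocks everything. The field names here are assumptions for illustration, not the prototype's settings schema.

```python
from dataclasses import dataclass, field

# Illustrative consent model; field names are assumptions.
@dataclass
class ConsentSettings:
    allowed_areas: set = field(default_factory=set)  # opt-in only
    max_intensity: float = 0.5                       # 0.0–1.0 cap
    paused: bool = False                             # global stop

    def opt_in(self, area: str) -> None:
        self.allowed_areas.add(area)

    def opt_out(self, area: str) -> None:
        self.allowed_areas.discard(area)

    def permits(self, area: str, intensity: float) -> bool:
        """A touch is delivered only if the session is not paused,
        the area was explicitly opted in, and the intensity is
        positive and within the receiver's cap."""
        return (not self.paused
                and area in self.allowed_areas
                and 0.0 < intensity <= self.max_intensity)
```

Because areas start empty and must be added explicitly, the default state delivers no touch at all, matching the opt‑in principle described above.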
Prototyping
Created low‑fidelity sketches to explore key interaction points: partner invitation, consent, mode selection, in‑room gestures, and error states (network or suit issues).
Built a high‑fidelity Vision Pro‑inspired interface in Figma following visionOS 26 guidelines and SF Symbols, blending real‑world passthrough with spatial UI for partner cards, session settings, and environment selection.
Prototype
The final Embrace prototype is an interactive mixed‑reality experience designed for Apple Vision Pro, showing how a long‑distance couple can meet in a virtual living room, configure touch boundaries, and share hugs and hand‑holding through synchronised haptic suits. It includes flows for account and avatar setup, partner pairing, touch‑mode and consent settings, scene selection, in‑room interactions (talking, environment changes, playing together), error handling, and a session summary screen.


