Embrace

An MR experience that lets long‑distance partners share an intimate virtual space and feel each other’s touch through synchronised haptic suits, turning routine calls into immersive, emotionally secure moments together.

Challenge

Long‑distance couples rely heavily on video calls and messaging, but these channels filter out non‑verbal, tactile cues that usually sustain intimacy and relationship security. Over time, partners can feel physically alone, touch‑deprived, and lonely even when they talk frequently. The challenge was to design an intimate mixed‑reality experience that restores a sense of embodied presence and affectionate touch at a distance, instead of “just another Zoom call.”

User & Insights

Primary user

Maya is a 24‑year‑old master’s student in Brighton whose partner has moved abroad, leaving an 8‑hour time difference between them. She has a secure relationship but misses everyday affection, like cuddling on the sofa, and feels especially lonely at night and after stressful days.

Pain points

  • Video calls feel emotionally important but physically flat; she misses leaning on her partner and subtle touches.

  • She feels touch‑deprived after hanging up, when the room goes quiet.

  • Time‑zone coordination makes calls feel like appointments rather than relaxed, shared experiences.

Behaviour

She schedules 5–6 video calls per week, uses texts and voice notes, and experiments with co‑watching and online games, but conversations often slide back into sitting in front of a laptop talking.

Key insights

  • Affectionate touch has strong psychological and physiological benefits, reducing stress and loneliness in ways verbal affection alone cannot.

  • Loneliness acts like a “biological alarm” that is only switched off by specific environmental cues, especially physical contact.

  • Current audio‑visual (AV) platforms create “reduced‑cue environments” that disrupt interpersonal neural synchrony; symbolic cues like emojis can’t replace sensorimotor touch.

  • An effective solution must combine shared, meaningful activities with reciprocal, believable haptic touch, while respecting safety, consent, and cultural differences around touch.

Design Process

  1. Research & Framing

Reviewed work on affectionate touch, loneliness, and relationship satisfaction to understand why touch is so central to feeling secure with a partner, and explored social VR and haptic technologies (e.g., pneumatic and acoustic haptic suits) to see how synchronised visual and tactile cues could recreate hugs and strokes at a distance.

  2. "5 Whys" and POV

  • Used a 5 Whys exercise to dig into why couples feel disconnected despite frequent AV communication, tracing it to filtered non‑verbal cues, lack of neural synchrony, and an un‑switched “loneliness alarm.”

  • Defined a point‑of‑view statement: long‑distance couples like Maya need ways to recreate embodied presence and affectionate touch at a distance, because existing video‑mediated communication (VMC) tools make them feel physically alone and insecure despite “being in contact.”

  3. Ideation

  • Mapped opportunities around VR/MR spaces (couch, balcony, rooftop dates), haptic touch patterns (hugs, hand holding, arm strokes), shared activities (films, co‑op games, meditation), safety and consent, and practical constraints (bulky headsets, time zones).

  • Applied SCAMPER to a mixed‑reality session concept: substitute standard video with a high‑fidelity VR environment, combine haptic feedback with physiological cues, adapt mirror‑touch so what you see your partner do is exactly what you feel, and eliminate over‑reliance on emojis and likes.
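The mirror‑touch idea from the SCAMPER exercise can be sketched as a one‑to‑one event mapping: the gesture one partner is seen performing is exactly the haptic pattern the other partner feels. The gesture names, body zones, and intensity values below are hypothetical placeholders for illustration, not the project’s actual protocol.

```python
# Sketch of "mirror-touch": the gesture you see your partner perform maps
# one-to-one onto the haptic pattern you feel. All gesture names, zones,
# and parameter values are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class HapticPattern:
    zone: str          # body area on the receiving suit
    intensity: float   # 0.0 (off) to 1.0 (strongest)
    duration_s: float  # how long the pattern plays

# One-to-one mapping: what is seen is exactly what is felt.
MIRROR_TOUCH = {
    "hug": HapticPattern(zone="torso", intensity=0.7, duration_s=3.0),
    "hand_hold": HapticPattern(zone="hand", intensity=0.4, duration_s=5.0),
    "arm_stroke": HapticPattern(zone="forearm", intensity=0.3, duration_s=1.5),
}

def mirror(gesture: str) -> HapticPattern:
    """Return the haptic pattern the remote partner should feel."""
    if gesture not in MIRROR_TOUCH:
        raise ValueError(f"unknown gesture: {gesture}")
    return MIRROR_TOUCH[gesture]
```

The point of the mapping being symmetric is that the visual and tactile channels never disagree, which is what makes the touch believable.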

  4. Flow and interaction model

  • Designed a session flow from launch to summary: pairing headsets and suits, selecting touch modes and allowed areas, choosing a scene (date spot, relaxing couch, MR space), interacting with 3D avatars, triggering bi‑directional haptic patterns, and ending with a gentle wind‑down and session recap.

  • Emphasised consent and control at every step with explicit touch‑mode selection, adjustable intensity, opt‑in touch areas, and clear pause/stop options.

  5. Prototyping

  • Created low‑fidelity sketches to explore key interaction points: partner invitation, consent, mode selection, in‑room gestures, and error states (network or suit issues).

  • Built a high‑fidelity Vision Pro‑inspired interface in Figma, following visionOS 26 design guidelines and using SF Symbols, blending real‑world passthrough with spatial UI for partner cards, session settings, and environment selection.

Prototype

The final Embrace prototype is an interactive mixed‑reality experience designed for Apple Vision Pro, showing how a long‑distance couple can meet in a virtual living room, configure touch boundaries, and share hugs and hand‑holding through synchronised haptic suits. It includes flows for account and avatar setup, partner pairing, touch‑mode and consent settings, scene selection, in‑room interactions (talking, environment changes, playing together), error handling, and a session summary screen.