by Gijs den Butter and Natalia Alvarez Milan – SenseGlove
Over the past decade, haptics has gone from a niche novelty to a critical pillar of immersive technology. As virtual and augmented reality continue to grow, the ability to feel virtual objects, avatars, and holoported peers through gloves, vests, and other tactile interfaces is no longer a futuristic idea; it is a present-day necessity. But while audio and video content have long benefited from mature, global standards, haptics has remained stuck in a fragmented ecosystem of proprietary tools, formats, and platforms. This is finally changing, and PRESENCE is at the forefront of that change.
Haptics Becomes a First-Class Media Citizen
In October 2021, a major shift occurred: MPEG officially recognized haptics as a core media type, placing it on equal footing with audio and video. This milestone paved the way for haptics to be encoded, streamed, and rendered within the same ecosystem that powers today’s media experiences — from mobile devices to cinema and XR headsets. It opened the door to a future where your favorite film or training simulation could include a haptic “track,” adding touch to the sensory mix.
To support this vision, MPEG released Reference Model 0 (RM0), a foundational format built on the Interhaptics technology platform. RM0 laid the groundwork for what is now becoming the MPEG HJIF haptics format, a standard that we use within our PRESENCE haptics API and that we are pushing for wider adoption, for instance through advisory groups to Khronos that advocate adopting it for OpenXR haptics standardization.
PRESENCE: Leading by Example
Within the PRESENCE project’s Work Package 3 (WP3), three major players in the haptics space — SenseGlove, Interhaptics, and Actronika — have joined forces to create the world’s first multi-device reference implementation of the MPEG haptics standard. This isn’t just theory or lab work — it’s a working, integrated system involving three commercial haptic devices (SenseGlove Nova 2, Skinetic vest, and the Meta Quest 3 controller), all operating through a shared, standardized PRESENCE Haptics API.
Until now, the haptics market has been fragmented into isolated technology silos. Each device spoke its own “language,” making it difficult — if not impossible — for developers to build rich, consistent experiences that work across different platforms. PRESENCE changes this by aligning with MPEG’s new standard and enabling real-time transcoding from device-specific formats into the unified HJIF format, and vice versa.
When a user touches a virtual object in a PRESENCE-haptics-enabled XR scene, the system's communication pipeline automatically decodes the appropriate haptic effect from the HJIF file and renders it through the user's device, whether that is a glove, a vest, or another actuator.
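To make the decode-and-dispatch step concrete, here is a minimal sketch in Python. HJIF is a JSON-based interchange format, but the field names, the `HapticDevice` class, and the `dispatch` function below are all hypothetical simplifications for illustration; they are not the normative HJIF schema or the actual PRESENCE Haptics API.

```python
import json

# Illustrative stand-in for a tiny HJIF-like document.
# Field names are hypothetical, not the normative schema.
HJIF_SAMPLE = json.dumps({
    "perceptions": [
        {"modality": "Vibrotactile", "channels": [
            {"body_part": "palm", "bands": [
                {"frequency_hz": 170, "amplitude": 0.8, "duration_ms": 120}
            ]}
        ]}
    ]
})

class HapticDevice:
    """Minimal device abstraction standing in for a glove, vest, etc."""
    def __init__(self, name, supported_parts):
        self.name = name
        self.supported_parts = set(supported_parts)
        self.played = []  # log of effects this device has rendered

    def render(self, band):
        # A real driver would synthesize the signal on hardware;
        # here we simply record what would be played.
        self.played.append(band)

def dispatch(hjif_text, devices):
    """Decode an HJIF-like document and route each channel to every
    device that covers the channel's targeted body part."""
    doc = json.loads(hjif_text)
    for perception in doc["perceptions"]:
        for channel in perception["channels"]:
            for device in devices:
                if channel["body_part"] in device.supported_parts:
                    for band in channel["bands"]:
                        device.render(band)

glove = HapticDevice("glove", {"palm", "fingertip"})
vest = HapticDevice("vest", {"chest", "back"})
dispatch(HJIF_SAMPLE, [glove, vest])
print(glove.played)  # the palm effect lands on the glove
print(vest.played)   # the vest receives nothing from this file
```

The key design point the standard enables is visible even in this toy version: the content file describes effects in device-independent terms (body part, frequency, amplitude), and the pipeline, not the author, decides which actuator renders each channel.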

A user immersed in a VR scene using SenseGlove Nova 2 haptic gloves and Actronika's Skinetic vest.
Why It Matters
This isn’t just about convenience or interoperability (though both are important). It’s about scale. With a standardized format, haptics can now be streamed alongside audio and video. It can be authored once and played anywhere. It can be reliably delivered across networks, devices, and platforms. In short: haptics is finally ready for prime time.
And this vision has been years in the making. All three PRESENCE partners have contributed to the Haptics Industry Forum, working to close the knowledge gap around haptics implementation in XR. Together, they also serve as an advisory group to Khronos, helping shape an OpenXR haptics extension that will make these standards even more accessible to developers worldwide.
What’s Next?
The PRESENCE team is now refining the transcoding pipeline: ensuring full compatibility across the PRESENCE ecosystem, integrating more devices, and enabling haptic content creation tools to encode their proprietary formats in the MPEG HJIF file format. At the same time, user experiments are planned to evaluate how well these standardized haptic signals perform compared to proprietary ones, helping validate the perceptual fidelity of the MPEG format.
The goal is simple but powerful: to make haptics as plug-and-play, reliable, and scalable as sound and vision. With standards in place and working implementations in the wild, PRESENCE is helping bring that future one step closer.
