New Mixed Reality Tool Brings Interactive Physical Items to Life in Virtual Environments
In an era where digital experiences increasingly shape daily life, researchers are exploring new ways to preserve personal memories through immersive technologies. A novel approach developed by the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory (CSAIL) allows users to digitally reconstruct personal items in mixed-reality settings, not just visually, but with their original interactive behaviours intact.
The tool, named InteRecon, is a prototype system designed to reproduce the tactile and dynamic features of physical objects in virtual environments. Whether it’s the nodding head of a cherished bobblehead or the click of a vintage television’s rotary dial, InteRecon captures these behaviours to enhance realism and emotional resonance in digital spaces.
The research team developed an iPhone application to serve as the initial interface. The app guides users through three full rotations of the mobile camera around a physical item, capturing it from every angle. Once the item has been reconstructed as a 3D model, it is imported into the InteRecon mixed reality interface, where specific areas of the model can be selected for interactive behaviour. This segmentation step, performed manually or automatically, designates parts such as limbs, knobs, or screens for animation.
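The paper’s implementation is not public, so the following Python sketch is purely illustrative: `InteractivePart`, `segment_mesh`, and the centroid-based default pivot are assumptions, not InteRecon’s actual code. It shows one plausible way to represent the segmentation step, with each selected region becoming a named part that owns a set of mesh vertices and a pivot point to animate around.

```python
from dataclasses import dataclass

@dataclass
class InteractivePart:
    """A user-selected region of the scanned mesh, e.g. a bobblehead's head."""
    name: str
    vertex_ids: list[int]               # mesh vertices belonging to this part
    pivot: tuple[float, float, float]   # anchor point the part moves around
    motion_preset: str | None = None    # assigned later, e.g. "pendulum"

def segment_mesh(mesh_vertices, selections):
    """Turn user selections (name -> vertex ids) into interactive parts.

    The selections could come from a manual lasso in the headset or from
    an automatic segmentation model; both yield groups of vertices.
    """
    parts = []
    for name, ids in selections.items():
        # Default the pivot to the centroid of the selected vertices.
        xs, ys, zs = zip(*(mesh_vertices[i] for i in ids))
        pivot = (sum(xs) / len(ids), sum(ys) / len(ids), sum(zs) / len(ids))
        parts.append(InteractivePart(name=name, vertex_ids=list(ids), pivot=pivot))
    return parts

# Example: mark the head of a scanned bobblehead for animation.
vertices = [(0.0, 0.0, 0.0), (0.0, 0.1, 0.0), (0.0, 0.2, 0.0)]
parts = segment_mesh(vertices, {"head": [1, 2]})
print(parts[0].pivot)  # -> (0.0, 0.15, 0.0), up to float rounding
```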
Compatible with popular mixed reality headsets such as HoloLens 2 and Meta Quest, the interface allows users to apply motion presets to selected parts. These include movements like flopping, sliding, dangling, and pendulum swings, creating a rich, customisable experience. For instance, a user can replicate the floppy ears of a toy rabbit or emulate the sliding control of an electronic device.
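A motion preset such as the pendulum swing can be thought of as a time-parameterised rotation applied to a segmented part. The sketch below is an assumption about how such presets might be modelled, since the paper does not specify the motion equations; it uses a standard damped-oscillator formula, and `pendulum_angle` and `PRESETS` are hypothetical names.

```python
import math

def pendulum_angle(t, amplitude_deg=25.0, frequency_hz=1.2, damping=0.15):
    """Angle (in degrees) of a part about its pivot at time t (seconds).

    A generic damped oscillator standing in for whatever motion model the
    system actually uses; the parameters would map onto a preset's controls.
    """
    return (amplitude_deg
            * math.exp(-damping * t)
            * math.cos(2.0 * math.pi * frequency_hz * t))

# A preset is just a function of time: assigning one to a part means
# evaluating it each frame and rotating the part's vertices about its pivot.
PRESETS = {
    "pendulum": pendulum_angle,
    "flop": lambda t: 40.0 * math.exp(-0.8 * t) * abs(math.sin(4.0 * t)),
}

for frame in range(3):
    t = frame / 60.0  # 60 frames per second
    print(f"t={t:.3f}s  pendulum angle = {PRESETS['pendulum'](t):+.2f} deg")
```

Representing each preset as a plain function of time keeps the library extensible: adding a new movement style is just adding an entry to the table.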
The project highlights how InteRecon can digitise more than just the physical structure of objects. A key example presented was the recreation of a vintage television, enhanced with virtual widgets such as power buttons, rotating dials, and display screens that stream actual media content. Users can also digitise devices like iPods by embedding MP3 files and adding virtual play buttons, offering the ability to listen to music within a simulated environment.
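The virtual widgets described above amount to attaching controls with callbacks to the reconstruction. As a rough sketch under that assumption (none of these class or method names come from the paper), embedding an audio file and wiring it to a play button might look like this:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Widget:
    """A virtual control (button, dial, screen) attached to a reconstruction."""
    label: str
    on_activate: Callable[[], None]

@dataclass
class ReconstructedObject:
    name: str
    widgets: list[Widget] = field(default_factory=list)

    def add_widget(self, label: str, on_activate: Callable[[], None]) -> None:
        self.widgets.append(Widget(label, on_activate))

    def activate(self, label: str) -> None:
        # In the headset this would be triggered by a gaze-and-pinch
        # gesture landing on the widget's collider.
        for widget in self.widgets:
            if widget.label == label:
                widget.on_activate()

# Example: a digitised music player with an embedded audio file.
# "song.mp3" is a placeholder; playback here is just a print statement.
player = ReconstructedObject("digitised iPod")
player.add_widget("play", lambda: print("playing embedded file: song.mp3"))
player.activate("play")
```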
The research team envisions applications well beyond nostalgia. In educational contexts, animated models could help demonstrate physical laws, such as the effect of gravity, or visualise complex procedures like surgical operations. Museums could use this technology to breathe life into static exhibits, animating artwork or mannequins to increase engagement and realism.
Lead researcher Zisu Li, a PhD student at the Hong Kong University of Science and Technology and a CSAIL visiting researcher, outlined the significance of preserving object interactivity. While traditional images and videos capture appearances, they fall short of retaining the essence of how objects behave. InteRecon addresses this gap by translating both form and function into mixed reality, helping users relive their memories in a more dynamic format.
User studies indicate that the tool is well-received across different professional fields. Participants found it intuitive and appreciated its ability to authentically reproduce imperfections, such as missing buttons on a toy, that often carry sentimental value. This attention to detail lends authenticity to digital replicas and supports memory preservation in meaningful ways.
Collaborating institutions included the Hong Kong University of Science and Technology, ETH Zurich, and MIT, with contributions from researchers across computer science, mechanical engineering, and human-computer interaction. The team plans to improve InteRecon’s physical simulation engine to enable more precise simulations for applications such as medical training and industrial design.
Future developments may also integrate generative AI and large language models. These additions could facilitate the recreation of lost items through descriptive prompts and offer guided explanations of the interface. Researchers are also exploring the possibility of physically recreating digital twins using 3D printing technologies.
Further down the line, the project aims to scale up from individual objects to entire environments—like virtual offices or rooms—preserved with full interactivity. This would allow users not only to revisit but also to interact with entire physical spaces in digital form.
InteRecon is scheduled for presentation at the 2025 ACM CHI Conference on Human Factors in Computing Systems. The research marks a significant step in the evolution of mixed-reality technology, offering an emotionally rich and functionally robust method for preserving the personal and professional value of physical objects in virtual spaces.