MIT researchers have introduced the Interactive Digital Item (IDI) concept, which enhances the digital preservation of personal items by capturing both their appearance and their real-world interactivity. To realize it, they built InteRecon, a program that lets users scan objects through a mobile app and animate them in mixed-reality settings.
This technology can replicate physical interactions, like a bobblehead’s movements or playing videos on a vintage TV, creating more lifelike and memorable digital experiences.
InteRecon has the potential to transform various fields by capturing and reconstructing objects’ interactive features. Teachers could use it to vividly demonstrate concepts like gravity, while museums could animate exhibits, such as paintings or mannequins, to engage visitors in new ways. The technology could also support medical training by simulating surgeries or cosmetic procedures step by step.
According to CSAIL researcher Zisu Li, InteRecon enables users to preserve personal items’ appearance and interactivity, transforming static memories into dynamic, interactive digital experiences within mixed-reality settings. This innovation aims to create deeper, more vivid connections to cherished objects and moments.
To bring objects to life digitally, the team behind InteRecon started with an iPhone app. Using the app’s camera, users scan an item thoroughly by circling it three times, ensuring a comprehensive capture. Once the scan is complete, the item becomes a 3D model, ready to be imported into InteRecon’s mixed-reality interface.
Here, users have the power to define interactivity. They can decide which model parts will be animated by marking specific areas, such as a doll’s arms or a bobblehead’s torso. InteRecon also offers automatic segmentation for a smoother process, making it simple to identify and animate various components.
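The workflow above amounts to tagging segments of a scanned mesh with the motion they should exhibit. A minimal sketch of such a data structure is below; the class and field names are illustrative assumptions, not InteRecon’s actual API.

```python
# Hypothetical sketch: associating user-marked regions of a scanned 3D
# model with animation behaviors. Names are illustrative, not InteRecon's API.
from dataclasses import dataclass, field

@dataclass
class InteractiveRegion:
    name: str                # e.g. "left_arm", "torso"
    vertex_ids: list[int]    # mesh vertices belonging to this segment
    motion: str = "static"   # "swing", "slide", "pendulum", ...

@dataclass
class InteractiveModel:
    label: str
    regions: list[InteractiveRegion] = field(default_factory=list)

    def mark(self, name, vertex_ids, motion="static"):
        """Mark a segment of the scanned mesh, optionally as animatable."""
        self.regions.append(InteractiveRegion(name, vertex_ids, motion))

    def animatable(self):
        """Names of all regions that will be animated."""
        return [r.name for r in self.regions if r.motion != "static"]

bobblehead = InteractiveModel("bobblehead")
bobblehead.mark("head", vertex_ids=[101, 102, 103], motion="swing")
bobblehead.mark("base", vertex_ids=[1, 2, 3])
print(bobblehead.animatable())  # → ['head']
```

Automatic segmentation, as described above, would populate the same structure without the user hand-marking each region.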
The mixed-reality interface, accessible via headsets like HoloLens 2 and Quest, takes interactivity even further. It provides programmable motion options for different object parts, demonstrated through motion previews.
Users can tweak animations—experimenting with sliding, swinging, or pendulum-like movements—to achieve the perfect result, such as bunny ears flopping playfully. With this level of customization, InteRecon transforms static objects into dynamic, interactive digital creations.
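A pendulum-like motion of the kind previewed above can be modeled as a decaying oscillation. This is a hedged sketch of that idea; the function and parameter names are assumptions for illustration, not taken from InteRecon.

```python
# Minimal sketch of a damped pendulum-style swing, the kind of motion
# preview described above (e.g. bunny ears flopping). Parameters are
# illustrative assumptions.
import math

def swing_angle(t, amplitude_deg=30.0, frequency_hz=1.5, damping=0.8):
    """Angle (degrees) of a swinging part at time t: a decaying oscillation."""
    return (amplitude_deg
            * math.exp(-damping * t)
            * math.cos(2 * math.pi * frequency_hz * t))

# Sample the first second of motion at 10 frames per second.
frames = [round(swing_angle(i / 10), 1) for i in range(11)]
print(frames[0])  # → 30.0 (starts at full amplitude, then decays)
```

Tweaking `amplitude_deg`, `frequency_hz`, and `damping` corresponds to experimenting with how far, how fast, and how long a part swings.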
The team behind InteRecon extended its capabilities to recapture the interface of physical and electronic devices, making digital interactivity even more immersive. Imagine digitizing a vintage TV—after creating a detailed 3D model of the item, users can customize its features by adding interactive virtual widgets.
For instance, you could attach an “on/off” button, a rotating knob to adjust volume, a screen to display videos, and even a channel switch. Researchers demonstrated this by embedding old videos into the digital TV, bringing it to life in a mixed-reality space. The customization options don’t stop there: with InteRecon, users can explore widget motions through interactive previews.
Whether it’s a slider for DJ booth settings or a camera viewfinder screen, every movement can be tailored to match the item’s functionality. The possibilities are endless, from enhancing nostalgic gadgets to innovating new ways to engage with digital interfaces.
For music enthusiasts, imagine digitizing an iPod: upload your favorite MP3s, add a “play” button, and suddenly, your cherished device is reborn in mixed reality, ready to deliver your favorite tunes interactively.
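Wiring widgets to a digitized device, as in the TV and iPod examples above, boils down to mapping each widget to a behavior. The sketch below illustrates that pattern; the class, widget names, and filename are hypothetical, not InteRecon’s actual interface.

```python
# Hedged sketch: mapping virtual widgets (button, knob) to a digitized
# device's behavior. Widget names and the MP3 filename are illustrative.
class DigitalDevice:
    def __init__(self, name):
        self.name = name
        self.volume = 5
        self.handlers = {}  # widget name -> callback

    def add_widget(self, widget_name, callback):
        """Attach a virtual widget to the 3D model of the device."""
        self.handlers[widget_name] = callback

    def trigger(self, widget_name, *args):
        """Simulate the user interacting with a widget in mixed reality."""
        return self.handlers[widget_name](*args)

ipod = DigitalDevice("iPod")
ipod.add_widget("play", lambda: "playing favorite_song.mp3")
ipod.add_widget("volume_knob", lambda v: setattr(ipod, "volume", v))

print(ipod.trigger("play"))  # → playing favorite_song.mp3
ipod.trigger("volume_knob", 8)
print(ipod.volume)           # → 8
```

The same pattern covers the vintage TV: an “on/off” button and a channel switch would simply be additional entries in the widget-to-callback map.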
Researchers believe InteRecon could transform virtual environments by making them more lifelike and interactive. A user study showed people from diverse fields found the tool easy to use and effective for preserving rich memories.
Users envisioned professional applications for InteRecon, such as training medical students in surgeries, helping travelers document their trips, and enabling fashion designers to test materials. To support advanced uses, the team plans to improve the precision of their simulation engine, which could enhance tasks requiring accuracy, like surgical training.
Future developments include integrating large language models and generative tools to recreate lost items as 3D models based on descriptions, automating interface explanations, and building interactive digital twins of larger environments like offices. They also aim to explore 3D printing for physically recreating lost items.
Journal Reference:
- Zisu Li, Jiawei Li, et al. InteRecon: Towards Reconstructing Interactivity of Personal Memorable Items in Mixed Reality. arXiv:2502.09973
Source: Tech Explorist