{"id":8501,"date":"2025-04-10T16:43:13","date_gmt":"2025-04-10T16:43:13","guid":{"rendered":"https:\/\/www.europesays.com\/uk\/8501\/"},"modified":"2025-04-10T16:43:13","modified_gmt":"2025-04-10T16:43:13","slug":"a-new-way-to-bring-personal-items-to-mixed-reality-mit-news","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/uk\/8501\/","title":{"rendered":"A new way to bring personal items to mixed reality | MIT News"},"content":{"rendered":"<p>Think of your most prized belongings. In an increasingly virtual world, wouldn\u2019t it be great to save a copy of that precious item and all the memories it holds?<\/p>\n<p>In mixed-reality settings, you can create a digital twin of a physical item, such as an old doll. But it\u2019s hard to replicate interactive elements, like the way it moves or the sounds it makes \u2014 the sorts of unique interactive features that made the toy distinct in the first place.<\/p>\n<p>Researchers from MIT\u2019s Computer Science and Artificial Intelligence Laboratory (CSAIL) sought to change that, and they have a potential solution. Their \u201cInteRecon\u201d program enables users to recapture real-world objects in a mobile app, and then animate them in mixed-reality environments.\u00a0<\/p>\n<p>This prototype could recreate an item\u2019s real-world interactive functions, such as the head motions of your favorite bobblehead, or play a classic video on a digital version of your vintage TV. It creates more lifelike and personal digital surroundings while preserving a memory.<\/p>\n<p>InteRecon\u2019s ability to reconstruct the interactive experience of different items could make it a useful tool for teachers explaining important concepts, like demonstrating how gravity pulls an object down. It could also add a new visual component to museum exhibits, such as animating a painting or bringing a historical mannequin to life (without the scares of characters from \u201cNight at the Museum\u201d). 
Eventually, InteRecon may be able to teach a doctor\u2019s apprentice organ surgery or a cosmetic procedure by visualizing each motion needed to complete the task.<\/p>\n<p><a id=\"article-video-inline\"\/><\/p>\n<p>The \u201cInteRecon\u201d program enables users to recapture real-world objects in a mobile app, and then animate them in mixed-reality environments.<br \/><br \/>\nVideo: MIT CSAIL<\/p>\n<p>The exciting potential of InteRecon comes from its ability to add motions or interactive functions to many different objects, according to CSAIL visiting researcher Zisu Li, lead author of a\u00a0<a href=\"https:\/\/arxiv.org\/abs\/2502.09973\" target=\"_blank\" rel=\"noopener\">paper<\/a> introducing the tool.<\/p>\n<p>\u201cWhile taking a picture or video is a great way to preserve a memory, those digital copies are static,\u201d says Li, who is also a PhD student at the Hong Kong University of Science and Technology. \u201cWe found that users wanted to reconstruct personal items while preserving their interactivity to enrich their memories. With the power of mixed reality, InteRecon can make these memories live longer in virtual settings as interactive digital items.\u201d<\/p>\n<p>Li and her colleagues will present InteRecon at the 2025 ACM CHI Conference on Human Factors in Computing Systems.<\/p>\n<p><strong>Making a virtual world more realistic<\/strong><\/p>\n<p>To make digital interactivity possible, the team first developed an iPhone app. Using your camera, you scan the item all the way around three times to ensure it\u2019s fully captured. The 3D model can then be imported into the InteRecon mixed-reality interface, where you can mark (\u201csegment\u201d) individual areas to select which parts of the model will be interactive (like a doll\u2019s arms, head, torso, and legs). 
Alternatively, you can use InteRecon\u2019s automatic segmentation function.<\/p>\n<p>The InteRecon interface can be accessed via a mixed-reality headset (such as the HoloLens 2 or Quest). Once your model is segmented, it lets you choose a programmable motion for each part of the item you want to animate.<\/p>\n<p>Movement options are presented as motion demonstrations, allowing you to play around with them before deciding on one \u2014 say, a flopping motion that emulates how a bunny doll\u2019s ears move. You can even pinch a specific part and explore different ways to animate it, like sliding, dangling, and pendulum-like turns.<\/p>\n<p><strong>Your old iPod, digitized<\/strong><\/p>\n<p>The team showed that InteRecon can also recapture the interface of physical electronic devices, like a vintage TV. After making a digital copy of the item, you can customize the 3D model with different interfaces.<\/p>\n<p>Users can play with example widgets from different interfaces before choosing one: a screen (either a TV display or camera\u2019s viewfinder), a rotating knob (for, say, adjusting the volume), an \u201con\/off\u201d-style button, and a slider (for changing settings on something like a DJ booth).<\/p>\n<p>Li and colleagues presented an application that recreates the interactivity of a vintage TV by incorporating virtual widgets such as an \u201con\/off\u201d button, a screen, and a channel switch on a TV model, along with embedding old videos into it.\u00a0This makes the TV model come to life. You could also upload MP3 files and add a \u201cplay button\u201d to a 3D model of an iPod to listen to your favorite songs in mixed reality.<\/p>\n<p>The researchers believe InteRecon opens up intriguing new avenues in designing lifelike virtual environments. 
A user study confirmed that people from different fields share this enthusiasm, finding the tool easy to learn and versatile in expressing the richness of users\u2019 memories.<\/p>\n<p>\u201cOne thing I really appreciate is that the items that users remember are imperfect,\u201d\u00a0says Faraz Faruqi SM \u201922, another author on the paper who is also a CSAIL affiliate and MIT PhD student in electrical engineering and computer science. \u201cInteRecon brings those imperfections into mixed reality, accurately recreating what made a personal item like a teddy bear missing a few buttons so special.\u201d<\/p>\n<p>In a related study, users imagined how this technology could be applied to professional scenarios, from teaching medical students how to perform surgeries to helping travelers and researchers log their trips, and even assisting fashion designers in experimenting with materials.<\/p>\n<p>Before InteRecon is used in more advanced settings, though, the team would like to upgrade their physical simulation engine to something more precise. This would enable applications such as helping a doctor\u2019s apprentice learn the pinpoint accuracy needed for certain surgical maneuvers.<\/p>\n<p>Li and Faruqi may also incorporate large language models and generative models that can recreate lost personal items as 3D models from language descriptions, as well as explain the interface\u2019s features.<\/p>\n<p>As for the researchers\u2019 next steps, Li is working toward a more automatic and powerful pipeline that can make interactivity-preserved digital twins of larger physical environments in mixed reality for end users, such as a virtual office space. 
Faruqi is looking to build an approach that can physically recreate lost items via 3D printers.<\/p>\n<p>\u201cInteRecon represents an exciting new frontier in the field of mixed reality, going beyond mere visual replication to capture the unique interactivity of physical objects,\u201d says Hanwang Zhang, an associate professor at Nanyang Technological University\u2019s College of Computing and Data Science, who wasn\u2019t involved in the research. \u201cThis technology has the potential to revolutionize education, health care, and cultural exhibitions by bringing a new level of immersion and personal connection to virtual\u00a0environments.\u201d<\/p>\n<p>Li and Faruqi wrote the paper with Hong Kong University of Science and Technology (HKUST) master\u2019s student Jiawei Li, PhD student Shumeng Zhang, Associate Professor Xiaojuan Ma, and assistant professors Mingming Fan and Chen Liang from HKUST; ETH Zurich PhD student Zeyu Xiong; and Stefanie Mueller, the TIBCO Career Development Associate Professor in the MIT departments of Electrical Engineering and Computer Science and Mechanical Engineering, and leader of the HCI Engineering Group. Their work was supported by the APEX Lab of The Hong Kong University of Science and Technology (Guangzhou) in collaboration with the HCI Engineering Group.<\/p>\n","protected":false},"excerpt":{"rendered":"Think of your most prized belongings. 
In an increasingly virtual world, wouldn\u2019t it be great to save a&hellip;\n","protected":false},"author":2,"featured_media":8502,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3162],"tags":[5546,5542,5543,5540,3596,5539,5541,5544,5547,53,16,15,3243,3244,5545],"class_list":{"0":"post-8501","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-virtual-reality","8":"tag-faraz-faruqi","9":"tag-interactive-3d-reconstruction","10":"tag-interactive-digital-items","11":"tag-interecon","12":"tag-meta-quest","13":"tag-mit-csail","14":"tag-mixed-reality","15":"tag-personal-memory-archive","16":"tag-stefanie-mueller","17":"tag-technology","18":"tag-uk","19":"tag-united-kingdom","20":"tag-virtual-reality","21":"tag-vr","22":"tag-zisu-li"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@uk\/114314674428084276","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/8501","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/comments?post=8501"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/8501\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media\/8502"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media?parent=8501"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/categories?post=8501"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.c
om\/uk\/wp-json\/wp\/v2\/tags?post=8501"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}