Researchers at UNIST have developed an innovative AI technology capable of reconstructing highly detailed three-dimensional (3D) models of companion animals from a single photograph and animating them realistically. This breakthrough lets users experience lifelike digital avatars of their companion animals in virtual reality (VR), augmented reality (AR), and metaverse environments.


Identifying a compass in the human brain
Zhengang Lu and Russell Epstein, from the University of Pennsylvania, led a study exploring how people maintain their sense of direction while navigating naturalistic virtual-reality cities.
As reported in their JNeurosci paper, the researchers collected neuroimaging data while 15 participants performed a taxi-driving task in a virtual reality city. Two brain regions represented the participants' facing direction as they moved around. This neural signal was consistent across variants of the city with different visual features.
The signal was also consistent across different phases of the task (i.e., picking up a passenger versus driving a passenger to their drop-off location) and various locations in the city. Additional analyses suggest that these brain regions represent a broad range of facing directions by keeping track of direction relative to the north–south axis of the environment.


Nanodevice uses sound to sculpt light, paving the way for better displays and imaging
Light can behave in very unexpected ways when you squeeze it into small spaces. In a paper in the journal Science, Mark Brongersma, a professor of materials science and engineering at Stanford University, and doctoral candidate Skyler Selvin describe a novel way of using sound to manipulate light confined to gaps only a few nanometers across, giving the researchers exquisitely fine mechanical control over the light's color and intensity.
The findings could have broad implications in fields ranging from computer and virtual reality displays to 3D holographic imagery, optical communications, and even new ultrafast, light-based neural networks.
The new device is not the first to manipulate light with sound, but it is smaller and potentially more practical and powerful than conventional methods. From an engineering standpoint, acoustic waves are attractive because they can vibrate very fast, billions of times per second.
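For context on the conventional approach the new device improves upon, the textbook acousto-optic relations (standard physics, not details from the Stanford paper) show why fast acoustic waves are so useful: an acoustic wave of frequency $f$ traveling at speed $v$ through a medium sets up a refractive-index grating of period $\Lambda$, and light of wavelength $\lambda$ is deflected from it at the Bragg angle $\theta_B$:

```latex
\Lambda = \frac{v}{f},
\qquad
\sin\theta_B = \frac{\lambda}{2\Lambda} = \frac{\lambda f}{2v}
```

Pushing $f$ into the gigahertz range shrinks $\Lambda$ toward optical wavelengths, which is what lets sound steer and modulate light billions of times per second; the nanometer-gap geometry described in the paper confines this interaction to far smaller scales than a conventional modulator.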


New 3D headset uses holograms and AI to create lifelike mixed reality visuals
Using 3D holograms polished by artificial intelligence, researchers introduce a lean, eyeglass-like 3D headset that they say is a significant step toward passing the “Visual Turing Test.”
“In the future, most virtual reality displays will be holographic,” said Gordon Wetzstein, a professor of electrical engineering at Stanford University, holding his lab’s latest project: a virtual reality display that is not much larger than a pair of regular eyeglasses. “Holography offers capabilities that we can’t get with any other type of display in a package that is much smaller than anything on the market today.”
Holography is a Nobel Prize-winning 3D display technique that uses both the intensity of light reflecting from an object, as in a traditional photograph, and the phase of the light (the way its waves synchronize) to produce a hologram: a highly realistic three-dimensional image of the original object.
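The intensity-plus-phase idea can be written compactly using the standard holographic recording equation (textbook holography, not specific to the Stanford display). Interfering an object wave $O$ with a reference wave $R$ records the intensity pattern

```latex
I = \lvert O + R \rvert^{2}
  = \lvert O \rvert^{2} + \lvert R \rvert^{2} + O R^{*} + O^{*} R
```

The cross terms $O R^{*}$ and $O^{*} R$ preserve the phase of the object wave, which is what allows a hologram to reconstruct a full three-dimensional image rather than a flat photograph.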


New haptic technology adds the sense of touch to virtual reality
USC scientists have developed a wearable system that enables more natural and emotionally engaging interactions in shared digital spaces, opening new possibilities for remote work, education, health care and beyond.
Touch plays a vital role in how humans communicate and bond. From infancy through adulthood, physical contact helps foster emotional bonds, build trust and regulate stress. Yet in today’s increasingly digital world, where screens mediate many of our relationships, it is often missing.
To bridge the gap, researchers at the USC Viterbi School of Engineering have developed a wearable haptic system that lets users exchange physical gestures in virtual reality and feel them in real time, even when they’re miles apart. Their paper is published on the arXiv preprint server.


Virtual reality nature scenes ease pain sensitivity, especially with strong sense of presence
Immersion in virtual reality (VR) nature scenes helped relieve symptoms often seen in people living with long-term pain, with those who felt more present experiencing the strongest effects.
A new study led by the University of Exeter, published in the journal Pain, tested the impact of immersive 360-degree nature films delivered using VR compared with 2D video images in reducing the experience of pain, finding VR almost twice as effective.
The paper is titled “Immersion in nature through virtual reality attenuates the development and spread of mechanical secondary hyperalgesia: a role for insulo-thalamic effective connectivity.”