
Virtual Reality Now Lets You Feel Fluids and Solids Together

A new simulation framework enables real-time touch interaction with complex physics in VR, allowing users to manipulate objects in fluid and feel accurate reaction forces for educational applications.

AI Research
March 26, 2026
4 min read

Imagine reaching into a virtual reality environment and not just seeing water flow around your hand, but actually feeling the resistance and buoyancy as you push through it. This level of tactile realism has been a significant challenge in VR development, especially when simulating interactions between different materials like fluids and solids. Researchers have now developed a unified framework that enables real-time, bidirectional haptic interaction with rigid bodies, deformable objects, and Lagrangian fluids in virtual reality, creating physically meaningful tactile responses that enhance immersion and educational value.

The key finding from this research is that users can now manipulate virtual objects immersed in fluid and feel reaction forces consistent with fluid-structure behavior in real time. The framework integrates Smoothed Particle Hydrodynamics (SPH) with two-way force coupling and feedback smoothing to maintain stability while producing accurate tactile responses. This means that when you stir virtual water with a paintbrush or perform a simulated surgical incision, the haptic device provides force feedback that matches what you would expect from real-world physics, allowing for direct interaction with simulated objects through both motion and touch.
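The idea behind two-way coupling can be sketched in a few lines. The function and data layout below are illustrative assumptions, not the paper's Unity/GPU implementation: every force applied to a fluid particle is mirrored onto the solid (Newton's third law), and the accumulated reaction on a grabbed object is what a haptic device would render.

```python
def couple_forces(fluid_particles, rigid_particles, pair_force):
    """Toy two-way coupling: pair_force(f, r) returns the force a rigid
    particle exerts on a fluid particle; the opposite force is applied
    back to the rigid particle and summed as the haptic reaction."""
    haptic = [0.0, 0.0, 0.0]
    for f in fluid_particles:
        for r in rigid_particles:
            fvec = pair_force(f, r)          # force on the fluid particle
            for k in range(3):
                f["force"][k] += fvec[k]
                r["force"][k] -= fvec[k]     # equal and opposite reaction
                haptic[k] -= fvec[k]         # accumulated for the haptic device
    return haptic
```

A real implementation would restrict the double loop to neighboring particles and run it on the GPU, but the symmetry of action and reaction is the core of "bidirectional" here.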

The methodology centers on using SPH as a unified particle-based approach to represent fluids, soft bodies, and rigid bodies within a single computational model. SPH works by calculating forces on discrete particles that carry mass, momentum, and thermodynamic properties, with interactions governed by smoothing kernels. To handle coupled interactions, each particle is assigned a type (fluid, rigid, or soft), with specific integration features enabled based on that type. For example, rigid bodies are represented as tightly packed particles that repel fluid while maintaining a constant relative displacement from their parent object's transformation, whereas soft bodies use spring forces to smoothly maintain inter-particle distances over time. The simulation runs in the Unity game engine with compute shaders on the GPU, using spatial hashing over a uniform grid to efficiently find neighboring particles and the marching cubes algorithm for real-time surface generation.
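As a rough illustration of the neighbor search, the sketch below computes SPH densities with a poly6 smoothing kernel and a uniform-grid spatial hash, so each particle only inspects the 27 surrounding cells. The kernel choice, smoothing radius, and constants are assumptions for the sketch; the paper's version runs in Unity compute shaders, not Python.

```python
import math
from collections import defaultdict

H = 0.1  # smoothing radius (illustrative value)

def poly6(r2, h=H):
    """Poly6 smoothing kernel, zero outside the support radius h."""
    if r2 >= h * h:
        return 0.0
    return 315.0 / (64.0 * math.pi * h**9) * (h * h - r2) ** 3

def cell(p, h=H):
    """Map a 3D position to its uniform-grid cell index."""
    return (int(p[0] // h), int(p[1] // h), int(p[2] // h))

def densities(positions, mass=1.0, h=H):
    """SPH density per particle; the spatial hash limits each query
    to the 27 cells around the particle instead of all N particles."""
    grid = defaultdict(list)
    for i, p in enumerate(positions):
        grid[cell(p, h)].append(i)
    out = []
    for p in positions:
        cx, cy, cz = cell(p, h)
        rho = 0.0
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    for j in grid.get((cx + dx, cy + dy, cz + dz), []):
                        q = positions[j]
                        r2 = sum((a - b) ** 2 for a, b in zip(p, q))
                        rho += mass * poly6(r2, h)
        out.append(rho)
    return out
```

Because the cell width equals the kernel radius, any particle within range is guaranteed to sit in one of the 27 neighboring cells, which is what makes the grid lookup equivalent to a brute-force scan.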

The results demonstrate that the framework performs well under various conditions, with performance metrics showing real-time operation around 60 Hz for particle counts up to approximately 50,000. In demonstration scenarios, such as one with three balls of different densities interacting with water and a cloth plane, the simulation maintained stability over time with an average framerate of 42 FPS for 45,284 particles. Another scenario involving a surgical scene with a scalpel and a soft body representing skin achieved 89 FPS with 24,217 particles, while a painting simulation with color mixing reached 118 FPS with 7,734 particles. The haptic integration, which uses a 6-DoF device to apply forces from grabbed rigid bodies, was stabilized through rolling averages over several update cycles to damp oscillations without reducing responsiveness, as shown in Figure 1 of the paper.
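The feedback-smoothing idea can be sketched as a rolling average over the last few haptic updates; the window size and exact averaging scheme below are assumptions, not values from the paper.

```python
from collections import deque

class ForceSmoother:
    """Rolling average of the last `window` force samples, used to damp
    high-frequency oscillations before sending forces to a haptic device
    (illustrative sketch; window size is an assumption)."""

    def __init__(self, window=8):
        self.buf = deque(maxlen=window)

    def update(self, force):
        """Add a raw 3D force sample and return the smoothed force."""
        self.buf.append(force)
        n = len(self.buf)
        return tuple(sum(f[i] for f in self.buf) / n for i in range(3))
```

A small window keeps latency low (responsiveness) while still canceling frame-to-frame oscillation, which matches the trade-off described above.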

The implications of this work are significant for educational applications in virtual reality, where the framework can reduce constraints like travel, material setup, and safety risks. By enabling physically coherent interactive experiences that merge visual immersion with tactile realism, it supports learning in fields such as medical procedures, industrial operations, and emergency response. For instance, it allows learners to practice complex tasks like surgical incisions or fluid dynamics experiments in a safe, controlled environment, with force feedback helping them understand material characteristics and dynamic behavior better than visual cues alone.

However, the research acknowledges limitations, including that the force-feedback magnitude is not always well matched to expectations, requiring empirical tuning of parameters. This stems from the simulation's resolution and from handling haptic interaction with the low-granularity data required for real-time applications. Additionally, while the framework is scalable on consumer hardware like an NVIDIA RTX 3080 Mobile GPU, further optimizations of the SPH algorithms and tighter integration with the game engine's physics systems could improve accuracy and stability in more complex scenarios.

Original Source

Read the complete research paper

View on arXiv

About the Author

Guilherme A.


Former dentist (MD) from Brazil, 41 years old, husband, and AI enthusiast. In 2020, he transitioned from a decade-long career in dentistry to pursue his passion for technology, entrepreneurship, and helping others grow.
