

Diploma and Master Theses (authored and supervised):

G. Sörös:
"We present a physically based real-time water simulation and rendering method that brings volumetric foam to the real-time domain, significantly increasing the realism of dynamic fluids. We do this by combining a particlebased fluid model that is capable of accounting for the formation of foam with a layered rendering approach that is able to account for the volumetric properties of water and foam. Foam formation is simulated through Weber number thresholding. For rendering, we approximate the resulting water and foam volumes by storing their respective boundary surfaces in depth maps. This allows us to calculate the attenuation of light rays that pass through these volumes very efficiently. We also introduce an adaptive curvature flow filter that produces consistent fluid surfaces from particles independent of the viewing distance.";
Supervisor: P. Rautek; Institut für Computergraphik und Algorithmen, 2010.
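
The foam criterion and the depth-map attenuation mentioned in the abstract above can be illustrated with a short sketch. The Weber number We = rho * v^2 * L / sigma compares inertial to surface-tension forces; particles whose Weber number exceeds a threshold are treated as foam, and per-pixel layer thicknesses obtained from two depth maps give Beer-Lambert attenuation. All function names, constants, and the threshold value below are illustrative assumptions, not taken from the thesis.

    import numpy as np

    RHO_WATER = 1000.0       # water density [kg/m^3]
    SIGMA_WATER = 0.072      # surface tension of water [N/m]
    WEBER_THRESHOLD = 500.0  # foam-formation threshold (assumed value)

    def weber_number(rel_speed, length_scale,
                     density=RHO_WATER, surface_tension=SIGMA_WATER):
        """Weber number We = rho * v^2 * L / sigma."""
        return density * rel_speed**2 * length_scale / surface_tension

    def classify_foam(rel_speeds, length_scales, threshold=WEBER_THRESHOLD):
        """Mark particles as foam where the Weber number exceeds the threshold."""
        return weber_number(rel_speeds, length_scales) > threshold

    def layer_transmittance(depth_front, depth_back, extinction):
        """Beer-Lambert attenuation through a layer whose per-pixel thickness
        is the difference of a front and a back depth map."""
        thickness = np.maximum(depth_back - depth_front, 0.0)
        return np.exp(-extinction * thickness)

The transmittance of the water and foam layers can then be multiplied per pixel to attenuate the background color, which is one way to realize the layered compositing the abstract describes.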



English abstract:
Contemporary visualization systems often make use of large monitors or projection screens to display complex information. Even very sophisticated visualization systems that offer a wide variety of interaction possibilities and exhibit complex user interfaces usually do not make use of additional advanced input and output devices. The interaction is typically limited to the computer mouse and a keyboard. One of the reasons for the lack of advanced interaction devices is the high cost of special hardware. This thesis introduces the idea of Augmented Visualization. The aim of the project is to develop a novel interaction solution for projection walls as well as PC monitors using inexpensive hardware such as mobile phones or tablets. Several features of mobile devices will be exploited to improve the interaction experience. The main technical challenge of the project is to implement a solution for markerless visual tracking of the changing visualized scene. In the proposed setup, this also requires real-time wireless video streaming between the mobile device and the PC. The real-time tracking of the visualized scene allows estimating the six-degrees-of-freedom pose of the mobile device. The calculated position and orientation information can be used for advanced interaction metaphors like magic lenses. Moreover, for a group of experts who are analyzing the data in front of the same screen, we can provide each user with a personal augmented view of the visualized scene on his/her personal device. The thesis discusses the design questions and the implementation steps of an Augmented Visualization System, describes the prototype setup and presents the experimental results.
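
One common way to obtain a six-degrees-of-freedom pose from markerless tracking is to match features between the streamed camera image and the rendered scene and solve a perspective-n-point problem. The sketch below uses OpenCV's solvePnP for this step; it is a minimal illustration under that assumption, not the implementation used in the thesis, and the correspondences and camera intrinsics are placeholders.

    import numpy as np
    import cv2

    def estimate_pose(points_3d, points_2d, camera_matrix, dist_coeffs=None):
        """Estimate camera rotation and translation from 2D-3D correspondences.

        points_3d: (N, 3) scene points, points_2d: (N, 2) matched image points,
        camera_matrix: 3x3 intrinsic matrix of the mobile device camera.
        """
        if dist_coeffs is None:
            dist_coeffs = np.zeros(4)        # assume no lens distortion
        ok, rvec, tvec = cv2.solvePnP(
            points_3d.astype(np.float32),
            points_2d.astype(np.float32),
            camera_matrix, dist_coeffs,
            flags=cv2.SOLVEPNP_ITERATIVE)
        if not ok:
            raise RuntimeError("pose estimation failed")
        rotation, _ = cv2.Rodrigues(rvec)    # axis-angle -> 3x3 rotation matrix
        return rotation, tvec

The recovered rotation and translation can then be used to render a registered overlay, such as a magic-lens view, on the mobile device.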


Electronic version of the publication:
http://publik.tuwien.ac.at/files/PubDat_217621.pdf


Created from the Publication Database of the Vienna University of Technology.