The IRTG is one of Prof. Dr. Hans Hagen’s most successful research projects.

The Training Group funds 22 Ph.D. students and six postdocs. The students receive an excellent education based on international and interdisciplinary research in the field of visualization of large and unstructured data sets.

A visiting scientists program has been established, through which distinguished researchers from leading universities and research institutes around the world are invited to share their knowledge. Additionally, every IRTG student spends nine months in the US, for example at UC Davis, continuing to work on his or her project there.



The aims of this International Research Training Group (IRTG) are twofold:

  • First, we want to provide the participating students with an excellent education based on international and interdisciplinary research, complemented by additional courses and talks given by renowned researchers.
  • Second, we want to perform excellent interdisciplinary research in the field of the visualization of large and unstructured data sets. By implementing the IRTG according to these goals, we establish an institution that offers our most talented students the best possible prospects for further scientific work.

A further example of work done by IRTG students: the figure shows the simulated temperature of a heptane fire, rendered using raycasting algorithms.
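To give an idea of what such raycasting amounts to, here is a minimal sketch, not the IRTG students' actual implementation: rays are assumed to be axis-aligned (a real renderer casts perspective rays with trilinear interpolation), and the transfer function mapping temperature to opacity is a hypothetical linear one.

```python
import numpy as np

def raycast_max_temperature(volume, axis=0):
    # Maximum-intensity projection: for each axis-aligned ray,
    # keep the hottest temperature sample encountered along the ray.
    return volume.max(axis=axis)

def raycast_composite(volume, step_opacity=0.1, axis=0):
    # Front-to-back compositing along axis-aligned rays.
    # Transfer function (assumption, for illustration only):
    # normalized temperature is used both as opacity scale and color.
    vol = np.moveaxis(volume, axis, 0).astype(float)
    vmin, vmax = vol.min(), vol.max()
    norm = (vol - vmin) / (vmax - vmin + 1e-12)  # normalize to [0, 1]
    color = np.zeros(vol.shape[1:])
    alpha = np.zeros(vol.shape[1:])
    for slab in norm:                 # march front to back through the volume
        a = step_opacity * slab       # per-sample opacity from temperature
        color += (1.0 - alpha) * a * slab
        alpha += (1.0 - alpha) * a
    return color

# Example: a 4x3x3 temperature field with one hot voxel.
temps = np.zeros((4, 3, 3))
temps[2, 1, 1] = 5.0
mip = raycast_max_temperature(temps)   # 3x3 image, hottest value per ray
img = raycast_composite(temps)         # 3x3 composited image
```

Production volume renderers add early ray termination (stop a ray once alpha is near 1) and lookup-table transfer functions; the marching loop above is the structural core they share.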

How the research and educational objectives of the IRTG are linked to practical problems is best illustrated by the following example scenario:

One method of conducting experiments on the effects of earthquakes, for example, is to simulate them in a centrifuge (as done at the Center for Geotechnical Modeling at the University of California, Davis). A scaled-down model is shaken according to measured earthquake data. These experiments produce large, unstructured data sets that describe the behavior of the model in response to the respective stimuli. Visualization is then the central tool for making these data sets understandable to the experimenters.

This begins during the design of the experiment, where visualization methods serve as quality-control tools for the simulation set-up and allow several possible simulation alternatives to be compared visually.

During the execution of the experiment, visualization tools serve as a first means of supervising the experiment and gaining an early impression of its likely outcome. This is especially important when access time to complex experimental facilities is limited. Using visually processed result data, the experimenters can decide, while the experiment is still running, how it should be continued.