Visual Support for Architectural Design and Construction

Research Project 28-1 (RP 28-1)

This project will develop methods to support and evaluate design and construction processes in an architectural context. We will achieve this by combining eye-tracking methodology with visualisation and human-computer interaction techniques. This combination will help us investigate such processes from a perceptual and cognitive perspective, providing insights into how people work with new visual interfaces and how they perceive virtual and real-world architecture. New techniques will be developed to handle the resulting spatio-temporal data, both for post-experimental evaluation and for live processing with gaze-based interaction. In particular, we will focus on the analysis and support of collaborative procedures (e.g., gigamapping) and the exploration of design parameter spaces; both scenarios pose new analysis challenges in social and desktop-based settings. We will then extend our research to spatial contexts, i.e., movement through real, virtual, and augmented buildings. Finally, we will extend it to fabrication and construction environments where, in addition to evaluation procedures, the development of gaze-based human-machine interaction techniques promises new ways to support work routines.
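
To make the kind of spatio-temporal gaze processing concrete, the minimal Python sketch below shows one common building block: dispersion-threshold fixation detection (I-DT) on a stream of gaze samples, followed by mapping the detected fixations onto areas of interest (AOIs) such as regions of a design canvas or building model. This is an illustrative sketch only; the thresholds, data layout, and AOI representation are assumptions made for the example and are not taken from the project's own tools.

    # Illustrative sketch, not project code: I-DT fixation detection and
    # AOI labeling on recorded gaze samples. Thresholds and data layout
    # are assumed values for demonstration purposes.
    from dataclasses import dataclass

    @dataclass
    class GazeSample:
        t: float   # timestamp in seconds
        x: float   # horizontal gaze position in pixels
        y: float   # vertical gaze position in pixels

    @dataclass
    class Fixation:
        start: float
        end: float
        x: float
        y: float

    def detect_fixations(samples, max_dispersion=25.0, min_duration=0.1):
        """I-DT: grow a window of samples while its spatial dispersion stays small."""
        fixations = []
        i, n = 0, len(samples)
        while i < n:
            j = i
            # Expand the window until the dispersion threshold is exceeded.
            while j + 1 < n:
                window = samples[i:j + 2]
                xs = [s.x for s in window]
                ys = [s.y for s in window]
                if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                    break
                j += 1
            if samples[j].t - samples[i].t >= min_duration:
                window = samples[i:j + 1]
                fixations.append(Fixation(
                    start=samples[i].t,
                    end=samples[j].t,
                    x=sum(s.x for s in window) / len(window),
                    y=sum(s.y for s in window) / len(window),
                ))
                i = j + 1
            else:
                i += 1
        return fixations

    def label_with_aois(fixations, aois):
        """Assign each fixation the first AOI rectangle (x, y, w, h) that contains it."""
        labeled = []
        for f in fixations:
            hit = next((name for name, (x, y, w, h) in aois.items()
                        if x <= f.x <= x + w and y <= f.y <= y + h), None)
            labeled.append((f, hit))
        return labeled

In a live, gaze-based interaction setting, the same AOI test would run on incoming samples in real time rather than on recorded data, e.g. to trigger context-dependent visual support while a participant works on a gigamap or explores a parameter space.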

 

INDEPENDENT JUNIOR RESEARCH GROUP LEADER

Dr. Kuno Kurzhals
Visualization Research Center (VISUS), University of Stuttgart

TEAM

Nelusa Pathmanathan (VISUS)

 

PEER-REVIEWED PUBLICATIONS

  1. 2024

    1. Pathmanathan, N., Rau, T., Yang, X., Calepso, A. S., Amtsberg, F., Menges, A., Sedlmair, M., & Kurzhals, K. (2024). Eyes on the Task: Gaze Analysis of Situated Visualization for Collaborative Tasks. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 785–795. https://doi.org/10.1109/VR58804.2024.00098
  2. 2023

    1. Chen, K.-T., Ngo, Q. Q., Kurzhals, K., Marriott, K., Dwyer, T., Sedlmair, M., & Weiskopf, D. (2023). Reading Strategies for Graph Visualizations That Wrap Around in Torus Topology. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications. https://doi.org/10.1145/3588015.3589841
    2. Koch, M., Kurzhals, K., Burch, M., & Weiskopf, D. (2023). Visualization Psychology for Eye Tracking Evaluation. In D. Albers Szafir, R. Borgo, M. Chen, D. J. Edwards, B. Fisher, & L. Padilla (Eds.), Visualization Psychology (pp. 243–260). Springer International Publishing. https://doi.org/10.1007/978-3-031-34738-2_10
    3. Kurzhals, K. (2023). Privacy in Eye Tracking Research with Stable Diffusion. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1–7. https://doi.org/10.1145/3588015.3589842
    4. Pathmanathan, N., Öney, S., Becher, M., Sedlmair, M., Weiskopf, D., & Kurzhals, K. (2023). Been There, Seen That: Visualization of Movement and 3D Eye Tracking Data from Real-World Environments. Computer Graphics Forum, 42(3). https://doi.org/10.1111/cgf.14838
    5. Öney, S., Pathmanathan, N., Becher, M., Sedlmair, M., Weiskopf, D., & Kurzhals, K. (2023). Visual Gaze Labeling for Augmented Reality Studies. Computer Graphics Forum, 42(3). https://doi.org/10.1111/cgf.14837
  3. 2022

    1. Abdelaal, M., Schiele, N. D., Angerbauer, K., Kurzhals, K., Sedlmair, M., & Weiskopf, D. (2022). Comparative Evaluation of Bipartite, Node-Link, and Matrix-Based Network Representations. IEEE Transactions on Visualization and Computer Graphics, 1–11. https://doi.org/10.1109/TVCG.2022.3209427
    2. Koch, M., Kurzhals, K., Burch, M., & Weiskopf, D. (2022). Visualization Psychology for Eye Tracking Evaluation. arXiv. https://doi.org/10.48550/ARXIV.2204.12860
    3. Koch, M., Weiskopf, D., & Kurzhals, K. (2022). A Spiral into the Mind: Gaze Spiral Visualization for Mobile Eye Tracking. Proceedings of the ACM on Computer Graphics and Interactive Techniques, 5(2). https://doi.org/10.1145/3530795
    4. Kurzhals, K., Becher, M., Pathmanathan, N., & Reina, G. (2022). Evaluating Situated Visualization in AR with Eye Tracking. 2022 IEEE Evaluation and Beyond - Methodological Approaches for Visualization (BELIV), 77–84. https://doi.org/10.1109/BELIV57783.2022.00013
  4. 2021

    1. Franke, M., Martin, H., Koch, S., & Kurzhals, K. (2021). Visual Analysis of Spatio-temporal Phenomena with 1D Projections. Computer Graphics Forum, 40(3). https://doi.org/10.1111/cgf.14311
    2. Kurzhals, K. (2021). Image-Based Projection Labeling for Mobile Eye Tracking. ACM Symposium on Eye Tracking Research and Applications, 1–12. https://doi.org/10.1145/3448017.3457382

OTHER PUBLICATIONS

  1. 2024

    1. Kurzhals, K. (2024). Anonymizing Eye-Tracking Stimuli with Stable Diffusion. Computers & Graphics, 119, 103898. https://doi.org/10.1016/j.cag.2024.103898

DATA SETS

  1. 2023

    1. Pathmanathan, N., Öney, S., Becher, M., Sedlmair, M., Weiskopf, D., & Kurzhals, K. (2023). Replication Data for: Been There, Seen That: Visualization of Movement and 3D Eye Tracking Data from Real-World Environments. DaRUS. https://doi.org/10.18419/darus-3383
    2. Öney, S., Pathmanathan, N., Becher, M., Sedlmair, M., Weiskopf, D., & Kurzhals, K. (2023). Replication Data for: Visual Gaze Labeling for Augmented Reality Studies. DaRUS. https://doi.org/10.18419/darus-3384
  2. 2022

    1. Abdelaal, M., Schiele, N. D., Angerbauer, K., Kurzhals, K., Sedlmair, M., & Weiskopf, D. (2022). Supplemental Materials for: Comparative Evaluation of Bipartite, Node-Link, and Matrix-Based Network Representations. DaRUS. https://doi.org/10.18419/darus-3100

    
