Exploring dimensionality reduction effects in mixed reality for analyzing tinnitus patient data

  • Burkhard Hoppenstedt  
  • Manfred Reichert  
  • Christian Schneider  
  • Klaus Kammerer  
  • Winfried Schlee  
  • Thomas Probst  
  • Berthold Langguth  
  • Rüdiger Pryss  
  • Institute of Databases and Information Systems, Ulm University
  • Department of Psychiatry and Psychotherapy, University of Regensburg
  • Department for Psychotherapy and Biopsychosocial Health, Danube University Krems
Cite as
B. Hoppenstedt, M. Reichert, C. Schneider, K. Kammerer, W. Schlee, T. Probst, B. Langguth, R. Pryss (2018). Exploring dimensionality reduction effects in mixed reality for analyzing tinnitus patient data. Proceedings of the 4th International Conference of The Virtual And Augmented Reality In Education (VARE 2018), pp. 163-170. DOI: https://doi.org/10.46354/i3m.2018.vare.024

Abstract

In the context of big data analytics, insights into high-dimensional data sets can be gained, inter alia, through visual analytics. Current developments in the field of immersive analytics, driven mainly by improvements in smart glasses and virtual reality headsets, enable more user-friendly and interactive ways of analyzing data. Following this trend, more and more fields in the medical domain demand this type of technology to analyze medical data in new ways. In this work, a mixed-reality prototype is presented that shall help tinnitus researchers and clinicians analyze patient data more efficiently. In particular, the prototype simplifies the analysis of a high-dimensional real-world tinnitus patient data set through dimensionality reduction. The results are represented as clusters, which are identified through the density of particles, while the information loss is quantified as the share of variance no longer covered after the reduction. Technically, the graphical interface of the prototype comprises a correlation coefficient graph, a plot of the information loss, and a 3D particle system. Furthermore, the prototype provides a voice recognition feature that lets users select or deselect relevant data variables. Moreover, by relying on a machine learning library, the prototype aims at reducing the computational load on the smart glasses used. Finally, in practical sessions, we demonstrated the prototype to clinicians, who reported that such a tool may be very helpful for analyzing patient data on the one hand and, on the other, for educating inexperienced clinicians. Altogether, the presented tool may constitute a promising direction for the medical as well as other domains.
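The information-loss measure sketched above — the share of variance not covered by the retained dimensions — can be illustrated with a minimal PCA example in Python using scikit-learn. The random matrix below is a placeholder standing in for the high-dimensional patient data set, not the actual tinnitus data:

```python
import numpy as np
from sklearn.decomposition import PCA

# Placeholder for the high-dimensional patient data:
# 200 patients, 10 questionnaire variables (random, for illustration only)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))

# Reduce to 3 dimensions, as used for a 3D particle visualization
pca = PCA(n_components=3)
X_reduced = pca.fit_transform(X)

# Covered variance: fraction of total variance retained by the 3 components
covered = pca.explained_variance_ratio_.sum()

# Information loss: the remaining, uncovered share of variance
information_loss = 1.0 - covered
```

Plotting `information_loss` against the number of retained components yields the kind of information-loss curve the prototype displays next to the 3D particle view.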
