Presenters and audience members each have individual dashboards with recording features that support smooth data presentation and self-reflection, along with both 3D and 2D data visualization for better content control.
Individual Dashboard - Presenter
Individual Dashboard - Reflection
Experience data presentation beyond 2D screens, where students can interact with 3D data like scatter plots and spatial GIS maps. By dragging files from the dashboard, they can scale, highlight, and zoom data, switching between a centralized table view and a room view for a fully immersive experience.
Walk-through Experience Mode
Centralized Presentation Mode
Data Presentation in AR Experience
Switching the Level of Immersion
Hybrid working and learning are becoming more common, but interpersonal interaction is missing. DoryVR addresses this by allowing users to easily switch levels of immersion, making meetings more accessible and connected.
We identified two research questions that would help us hone our understanding of education and data presentation in VR.
Competitive Analysis
To answer the first question, we conducted a literature review on data storytelling, VR education, and VR data visualization. Additionally, we performed a comparative analysis of four prominent VR data visualization tools to create a foundational map of what a data storytelling experience in VR could look like.
For this question, we conducted 2 classroom observations to shadow students and instructors. This allowed us to observe both teaching and learning processes in action.
We followed up with 5 contextual inquiries with students to understand their behavioral patterns, issues, and goals regarding VR presentations. We also interviewed 5 experts in data visualization, storytelling, and VR education technology to gain contextual knowledge and field insights.
I attended their class, which was hosted in VR!
After collecting data, we used an affinity diagram to identify common themes and created an empathy map to better understand our findings, which were categorized into general presentation in VR, presenting data in 3D, and the advantages of AR versus VR.
A multi-sensory experience in XR cannot be achieved on a traditional flatscreen display. An Educational XR Technology researcher highlights that “XR enables us to walk through the [virtual] galaxy, or explore the intricacies of the human heart.” Students also pointed out how VR changed their spatial perception and communication.
However, the current app students use does not take advantage of how individual user views can differ in VR: every user sees all the materials that exist in the VR space. Moreover, students often felt a lack of control in their presentation and found it difficult to present without their speaker notes.
Finally, students do not feel they receive enough tailored tips or feedback for self-improvement.
“Most of the time, I would just explore the product on my own because it’s the fastest way to get familiar with this type of product – by trying on your own time” – Student P1
Students present in an unconventional classroom setting: the woods
Our interviews with experts and literature reviews helped us discover common issues with 2D data visualization and storytelling. These include visual clutter (“elements that take up space but don’t increase understanding,” as Storytelling with Data puts it) and information overload. We observed students struggling to convey the right amount of context and information to the audience due to the limited dimensions of 2D flat screens.
We further broke down the pros and cons of 2D and 3D visualization to identify the best ways to present different types of data.
Despite the class being held fully in VR, 80% of our participants reported varying levels of dizziness or motion sickness, suggesting accessibility issues within VR.
In response, we explored Augmented Reality (AR) as an alternative, as it tends to cause fewer of these symptoms. AR allows students to move freely in their physical space while interacting with virtual class elements.
Our competitive analysis and expert interviews suggest that AR will become increasingly popular for business meetings, with one expert stating, “AR is going to be a much more important factor to meeting apps moving forward.”
To find a direction for our ideation and prototyping, we asked ourselves three how-might-we questions, which helped us isolate three areas of concern regarding teaching in Virtual Reality and presenting data within XR spaces:
Based on our research insights and three key questions, our team used Crazy 8s to brainstorm design solutions on a virtual wall, then discussed and compared the ideas. We then conducted a design charrette for eight ideas, sketching on a whiteboard to refine our concepts. This process allowed us to narrow our focus for the lo-fi prototyping stage.
Crazy 8 Ideation
Design Charrette
From our user study, we identified several pain points in how people currently present, such as no control of the slides, over-emphasis on reciting, and action/emotion disconnection in sharing content.
Based on the feedback, we chose to focus on the individual dashboard (with the pointer tool feature) and a feedback interface, as these features were well received and supported our ultimate goal of improving the data storytelling experience in VR.
Created storyboards for all ideas for parallel testing
Used 2D graphics & Spatial to prototype
Individual Dashboard Mid-fi Prototype
Self-reflection Mid-fi Prototype
In our mid-fi testing, we recruited 5 students from the class, and they responded positively to our design. To further test this idea along with other features in an integrated solution, we implemented the hi-fi version of the dashboard using Unity, which I will introduce later in the other design challenges.
In the ideation phase, we proposed the concept of a data presentation table, where students can present their data at different scales. This idea received strong feedback, with people imagining various use cases and perspectives for different modes.
However, when we started lo-fi prototyping, we realized it was crucial to go beyond 2D flat screens. The key question for us became: How might we prototype an immersive and interactive experience?
Centralized Mode & Immersive Mode Storyboards
After some exploration, we decided to use Minecraft for prototyping because it renders a simplified 3D environment that allowed us to roughly prototype both table-style and walk-through-style presentations.
Lo-fi prototype
While we received positive feedback on the scalable and interactive data feature during storyboard testing, when we presented the scenes in Minecraft, users felt the room was too small to engage with the scaled-up data. Users also felt somewhat overwhelmed by the 3D prototype. To better verify this idea, we needed to create a more spacious environment that facilitates user interaction with the main feature.
For our mid-fi prototype, we built an open-air circular auditorium in the middle of an endless ocean to evoke a sense of infinite space and focus audience attention on the presenter. The calm ocean setting and altered game shaders were chosen to mimic a VR aesthetic and improve feedback accuracy.
Mid-fi prototype
After testing with our users and communicating with the client, we found that while this idea was interesting, it did not effectively deliver information from a storytelling perspective. For instance, it was hard to compare the bar graphs in the prototype.
The perception of 3D pie charts is not ideal
Datasets with 3 or more dimensions would be great in VR
Different Views on Dashboard
Centralized Presentation Mode & Walk-through Experience Mode
Almost all students experienced varying levels of motion sickness and fatigue during their 1-hour-20-minute class. Since in-person attendance was not mandatory, some students felt more comfortable staying at home while others attended in the classroom.
To make the tool more accessible and adaptable to hybrid learning and collaboration, we proposed allowing users to control their level of immersion from VR to AR or vice versa. Due to technical limitations, this idea wasn’t prototyped until we started using Unity.
Storyboarding
Mid-fi Version
How do we decide when users need to change their level of immersion, and when would doing so benefit them? We considered three typical use cases: hybrid, fully remote, and fully in-person.
Level of Immersion User Flow
For in-person meetings, this feature is most beneficial when students work in AR mode, where they can interact with 3D datasets while still being able to see other people. For fully virtual meetings, users can adjust immersion based on their comfort and environment. Hybrid meetings, while possible, are less ideal because overlapping virtual and physical objects impact immersion. Consequently, we focused our prototype on the first two scenarios.
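To make the switching idea concrete, here is a minimal, hypothetical Unity sketch of how such a toggle might work. The class and field names (ImmersionToggle, vrEnvironment) are illustrative rather than taken from DoryVR's actual code, and on most headsets the vendor's passthrough layer (e.g., Meta's OVRPassthroughLayer) would also need to be enabled, which is omitted here.

```csharp
// Hypothetical sketch: switching between full VR and AR passthrough.
// Not DoryVR's actual implementation.
using UnityEngine;

public class ImmersionToggle : MonoBehaviour
{
    [SerializeField] private Camera xrCamera;          // the headset's main camera
    [SerializeField] private GameObject vrEnvironment; // the fully virtual classroom scene
    private bool isVR = true;

    // Called from a dashboard button to switch the level of immersion.
    public void SwitchImmersion()
    {
        isVR = !isVR;

        // In VR, render the virtual skybox and environment; in AR, hide the
        // environment and clear to a transparent color so the device's
        // passthrough video can show the physical room behind the data.
        vrEnvironment.SetActive(isVR);
        xrCamera.clearFlags = isVR ? CameraClearFlags.Skybox : CameraClearFlags.SolidColor;
        xrCamera.backgroundColor = isVR ? Color.black : new Color(0f, 0f, 0f, 0f);
    }
}
```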
Previously, we tested the Level of Immersion feature separately from our other features. However, participants often connected all the features together, noting that if switching between AR and VR is possible, then data could go beyond common graphs. Users could experience "live" data immersively or explore it on physical objects.
This insight led us to realize we could dive deeper into an integrated solution, envisioning how users present data in new places and learning by interacting, not just seeing.
Connecting the Dashboard Design with Virtual Materials
Individual Dashboards for Presenters and Students
There was a miscommunication between me and the engineer: I didn't know we only had an iOS version, so I didn't include that in the screener. This led to serious issues with recruiting participants, as not all of them had iOS phones. Although we found ways to help each participant obtain a device, this was a lesson for me to check every single detail of the testing process.
I joined the team as the sole researcher, while the rest of the team members had been collaborating with each other for a long time. Because it was such a small team, I decided to start by learning contextual information and identifying what people were frustrated or concerned about before making any research plans. This not only helped me adjust the research focus and synthesize results that were more useful for each member, but also helped me gain their support and trust.