Virtual, augmented, and mixed reality in education: a NERCOMP workshop

Yesterday I organized a NERCOMP event on virtual, augmented, and mixed reality in education.  I'd like to share some of the proceedings that seemed especially important, and also just plain cool.

Most of the actual work was done by Emory Craig (University of New Rochelle, and of Digital Bodies fame), who hurled the audience into a deep exploration of this technology.  It was a pleasure to see him work, both as presenter and as patient hands-on guide to the hardware. We were also joined by Professor Dmitry Korkin (Worcester Polytechnic Institute), who has done pioneering work on using mixed reality to visualize biology.

First, we asked the audience to describe where their institutions were with this technology.  Overall, they were in very early days, and progress was uneven.  It made for an interesting snapshot of VR and its kin in 2018:

  • Some were doing nothing programmatically, but people (faculty, admin, IT, librarians) were curious.
  • One is about to conduct an inventory of faculty interest and projects.  This suggests bottom-up work going on.
  • One campus maintains a VR cart.  Interesting practice.
  • Several have a VR lab or other fixed station, like a Knowledge Lab or sandbox.
  • Some technology in use: Augment, Unity, 360 video, Google Cardboard.
  • There were several who connected VR/AR/MR work to Maker spaces.
  • Disciplines exploring the technology: chemistry; a “cyborg self” class.
  • Example uses: a virtual fossil; an augmented bus stop redesign.
  • Also on campus: sports analysis and training; campus tours.
  • One looming challenge: accessibility.
  • Simulations seemed especially amenable to VR/AR/MR.

Then I offered my quick introduction to the topic.

Second, Professor Korkin presented on his work in bioinformatics.  He described the challenges of integrating structural bioinformatics, network biology, and next-generation sequencing analysis.  One problem is that visualizing human proteins for network analysis is extremely difficult, as the results usually resemble "a hairy ball" full of edge/node occlusions.  Worse, any visualization needed to scale up to thousands or millions of nodes and edges, while remaining amenable to search and analysis.

Korkin turned to Microsoft HoloLens, in part for its built-in gaze, voice, and gesture recognition, and partly because of the chance to get pre-commercial tech.  His team was also able to reproduce both network and spatial relationships within proteins.  They assembled a data workflow: using BioLayout to embed 3D data -> data preprocessing -> 3D network reconstruction.  They called it HoloNet.

Users can see proteins and networks in space, and manipulate them by voice and gesture. The results are pretty awesome.

Looking ahead, Professor Korkin described applying this approach to mapping brain data, allowing the integration of fMRI and neural circuit information.  "We are now forming a corps of mixed reality enthusiasts."  He showed us an early demo – built earlier that week! – which was mind-blowing.

Third, during discussion throughout the day the audience surfaced more ideas and issues:

    • Live streaming classes in 360, pros and cons (it makes classrooms more present for remote viewers, but poses technical challenges)
    • A recommendation for Jeremy Bailenson’s Experience on Demand: What Virtual Reality Is, How It Works, and What It Can Do
    • Tentative interest in where the porn industry is going
    • Questions of accessibility
    • Questions of applying major visualizations to undergraduate learning
    • How will faculty get inspired by other faculty members?
    • Combining 3d visualization with 2d (video) live supplement: a useful practice

Meanwhile, on Twitter Steve Covello shared this resource:

Coming up next month is a NERCOMP webinar on “Getting Started with Virtual Reality.”  That looks good.

Summing up: academic use of VR/AR/MR is still in early days.  Call it the heroic phase, where people are creating stuff from scratch and/or using first-generation commercial technology.  Professional development and institutional support are in a nascent stage.  This is a time of exploration.

How far will education go with this technology?
