On March 8, Dene Grigar participated in the TEDx event at Marshall University. Her talk, entitled “Making the Virtual Real and the Real Virtual,” focused on the Virtual Reality implementation in the Viz space at The NEXT, a project partially funded by a 2024 WSU Vancouver Research Mini-Grant.
Joining her on stage was Andrew Thompson, the XR programmer in the Electronic Literature Lab, who built the site in open Web languages and WebXR, making the code openly available for public use.
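For readers curious about the underlying technology, the sketch below shows, in plain JavaScript, roughly how an immersive WebXR session with optional hand tracking can be started from a web page. It is only a minimal illustration under stated assumptions (the “enter-vr” button, the canvas setup, and the rendering callback are invented for the example), not the Viz space’s actual code, which the lab has released separately as open source.

```js
// Minimal, hypothetical sketch of entering an immersive WebXR session.
// Not the Viz space's actual implementation.

const canvas = document.createElement('canvas');
const gl = canvas.getContext('webgl', { xrCompatible: true });

async function enterVR() {
  // Check that the browser and headset (e.g., a Quest via its browser) support VR.
  if (!navigator.xr || !(await navigator.xr.isSessionSupported('immersive-vr'))) {
    console.log('Immersive VR is not available on this device.');
    return;
  }

  // Request a VR session; hand tracking is optional so the page still
  // works on devices that lack it.
  const session = await navigator.xr.requestSession('immersive-vr', {
    optionalFeatures: ['hand-tracking'],
  });

  // Route the WebGL canvas into the headset and get a reference space.
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });
  const refSpace = await session.requestReferenceSpace('local');

  // Per-frame loop: this is where a 3D model of an artifact would be drawn
  // and controller or hand poses read so a visitor can turn it around.
  const onXRFrame = (time, frame) => {
    session.requestAnimationFrame(onXRFrame);
    const pose = frame.getViewerPose(refSpace);
    // ...render the artifact model for each view in pose.views...
  };
  session.requestAnimationFrame(onXRFrame);
}

// WebXR sessions must begin from a user gesture, e.g. an "Enter VR" button
// (the element id here is assumed for the example).
document.getElementById('enter-vr').addEventListener('click', enterVR);
```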
Here is the script from the event:
“I remember standing in front of a display case of ancient Greek pottery, mesmerized by the diversity of their shapes and designs. It was at the National Museum in Athens, and, at the time, I was just starting to study ancient Greek literature and language. It was hard to see some of the inscriptions on the vases in the case in front of me. It was also hard to get an idea of the vases’ weight and, in some cases, size.
What I really wanted to do was hold them––one at a time––and turn them around in my hands. Read the writing that was there. Look carefully at the figures painted on them. Get a sense of their wear by seeing the cracks and chips. Look at the parts of the vases I could not see, like behind them. But I couldn’t, right? I mean, they were behind glass for a reason: They are precious physical artifacts from the Neolithic Period of Greek history. No curator in their right mind would let the public handle them.
I had traveled thousands of miles to see them, yet standing just a foot away, I felt so far from them. At the same time, I was glad they were being so well taken care of by the museum and held on to for posterity. Thousands of years had passed since potters turned their wheels to produce these pots, yet here they were, right there, in front of me, to enjoy and study.
Over 30 years have passed since my experience at the National Museum, and I have traveled to many other places to study art and culture in that time, but my yearning to touch the things I see has not passed. In fact, the pandemic––when none of us could travel––only amplified this desire.
A couple of years before the pandemic, I led the effort to build an online repository of digital art, literature, and games––what we eventually built into a virtual museum and library called The NEXT. Along with the digital files we were archiving and curating came many, many boxes of physical archives that were also donated––objects like artists’ notebooks, flyers announcing artists’ readings, floppy disks containing their work, and even beach balls they used in their performances. Among these boxes were some amazing historical artifacts dating back to the mid-1980s, when computers were just starting to be used more readily to make artistic things, which constitutes the beginning of our “digital history.” I thought people into this type of thing would find these artifacts very interesting, just as I did the pots of Greek prehistory.
So, in 2020, at the start of the pandemic, my lab––the Electronic Literature Lab––began to create 3D models of some of these artifacts and feature them at The NEXT in a space we ended up calling the Visualization space, or “Viz space.” This means that when people visit the space on the Web, they can manipulate the models in the browser: turn them around with their hands, read the words, see the images, even spot the cracks and chips across the entire artifact each model represents. The idea we were shooting for was to give visitors the opportunity to interact with the artifacts.
But interaction wasn’t enough. Even though we could simulate the experience of handling the objects, we couldn’t build a sense of immersion, which I see as a way to better understand things. Immersion, you see, suggests being submerged into something, to be inside it. Immersion helps us go beyond the border of reality and lose the feeling of separation from a thing by closing the gap of space between it and us, between the real and the unreal. It makes the experience feel immediate, helping us imagine a 3D model as something more than a reproduction of the artifact but, rather, as the artifact itself.
Immersion can be created in many ways, but what we chose for the artifacts in the Viz space was Virtual Reality, or “VR.” So, a year ago, we began programming the Viz space so that anyone, anywhere, with a Quest 2, Quest 3, or an Apple Vision Pro can put on their headset and immerse themselves in the space with the artifacts. In that VR environment, a visitor living here in Huntington, West Virginia, can, for example, experience John McDaid’s “chocolate box of death” that holds his sci-fi, mystery, hypermedia novel, Uncle Buddy’s Phantom Funhouse. This black box contains clues for solving the mystery: five floppy disks, two music cassettes, a short story for Vortex magazine, a letter from the magazine’s editor, and a user’s manual. Using their hands, visitors can interact with the music cassettes packed in the box, turn the box around, read the words on the floppy disks, and inspect the wear and tear of the box itself.
Another artifact that speaks to what we think of as historic digital art is Deena Larsen’s hand-made pinwheel that she created in the 1990s to teach people, especially those who did not have much experience with computers, about interactive digital poetry. Visitors to the Viz space can hold the pinwheel in their hand and turn its blades, reading the lines of poetry that appear as the blades turn.
Both of these artifacts are precious to those of us working with digital art, literature, and games because they speak of a time when artists were pioneering ideas and techniques that today we take for granted but that were, at the time, considered strange and new. Many of us working in this area of study hope that these artifacts will be as important to future generations as the ancient Greek pottery was to me.
I should mention that all of the code we produced for this project is open source and available for anyone to download and use to create their own interactive, digital artifacts.
I think about the advantages of using VR for experiencing artifacts in a museum or library. I cannot imagine teachers being able to take their students to the National Museum in Athens to see pottery they are studying in class. The cost alone would be prohibitive except, perhaps, for the wealthiest of school districts. But I can imagine a teacher asking their school to purchase a refurbished headset for $250 and having their students enter a VR space to experience Neolithic pottery held by the museum, re-created as 3D models. That seems totally possible to me. I also imagine that museums and libraries that offer some of their artifacts to the public as VR and Augmented Reality experiences will cultivate young patrons who cut their teeth on Pokémon Go and Meta Quest games like Beat Saber and, so, are not just familiar with Extended Reality but prefer these kinds of experiences.
And finally, I can imagine how helpful it would have been, when I was standing in front of that case of pottery 30 years ago, if I could have donned a headset right there in the museum and interacted with the very pot that fascinated me the most, touching it and looking at all of it, as much as I wanted.”