Behind all of the ugly construction surrounding Barus and Holley, the 180 George Street building houses an unparalleled virtual reality system. Named for its shape, outfitted with 69 projectors, and boasting a pixel resolution equivalent to a retina display (the human eye could not perceive anything more detailed), the Yurt Ultimate Reality Theater opened in the summer of 2015. The field of virtual reality isn’t new to Brown; before we had this machine, there was the Cave Automatic Virtual Environment (CAVE). Technology like the CAVE allowed scientists and artists alike to explore their fields with three-dimensional visualizations. MRI scans became interactive for medical students, while poets experimented with words that literally jumped off the projector. In 2009, Brown decided it was time for an upgrade: a 360-degree display with interactive floors and ceilings.
In the spirit of educating the campus about this exciting new feature, we hung out with Computer Science Professor David Laidlaw to talk about his brainchild and the functions of high-quality virtual reality. Before diving into the interview, you should know that Professor Laidlaw is a busy man. It took over two weeks to set up a time to see the YURT, and right before our rendezvous, this texting exchange occurred:
Upon greeting me (after class, of course), his first question was, “What happened to your mustache?” Let it also be noted that I initially forgot to take my shoes off inside the machine, and that every time Professor Laidlaw handed me a piece of equipment, static shocks reverberated in both our hands. Basically, the interview was electric. Jokes aside, it was time to get down to serious business. A.K.A., it was time to play 3D Minecraft, or as David affectionately called it, YURTCraft.
As someone who has toyed around with an Oculus Rift headset before, I was really impressed by how minimalist and comfortable the 3D glasses were. But whoever is “driving” (the person whose head motions the YURT is tracking) has to wear some particularly silly glasses with antennae attached. The only other tool for operating the YURT from within is a small, handheld controller, which doubles as an efficient weapon in YURTCraft.
The 360-degree display is so captivating that I can’t imagine an occasion where one would remember to look up and see the only opening to the outside world. Even the floor is capable of projecting whatever virtual reality you happen to find yourself in. A computer outside the machine itself is used to control which program is running, and it isn’t the only computer linked to the YURT: in the room next door, there are 20 more.
Moving on from video games, I was able to check out a huge satellite image of the Moon. Since I was wearing the funky driving glasses, the image would readjust to my position, always appearing straight-on from my perspective. This one was particularly exciting because it seemed to hang in thin air, so I was able to walk through it and see it from behind. The view from behind wasn’t any more fascinating, but hasn’t everyone dreamed of walking through a virtual barrier and pretending they’ve entered a different dimension?
After running me through a brief demo of a couple more things the YURT can do – explore the surfaces of other planets, visualize mathematical spirals in four dimensions, and project an image of a 300-foot parchment that changes position and zoom based on the eye-tracking 3D glasses – Professor Laidlaw agreed to answer some questions:
BlogDH: For people who aren’t familiar with the literature behind this, what is your best layman’s description of what the YURT is?
BlogDH: What do you think the role of sci-fi is in the inspiration and implementation of technology like this?
L: I think that displays, and faking out our perceptual systems and our brains, are a big piece of sci-fi. In that sense, it’s kind of an inspiration. Sometimes it’s a good metaphor to explain what we’re trying to do. I think of The Matrix as another example of what this is, but not quite . . .
BlogDH: Minus the controlling robots!
L: Well, also minus the rest of your senses. It’s really only hitting vision, and that’s not true for the Holodeck. The Holodeck is not just vision. You can ride the horses, and touch the ruins, and open the doors, and you can’t do that in here. Also, [the Holodeck] works for a bunch of people at the same time, and this only works for one (referring to the one head-tracker device), so they’re not perfect metaphors. For now, they serve as metaphors, or as a vision for where we might want to go in the future.
BlogDH: Do you think that, in the future, maybe there will be multi-sense virtual reality?
L: The Atlantic asked about haptics – haptics is your sense of touch – and I basically said it’s not going to happen for 100 years. I think it’s just too hard.
BlogDH: You talk a lot in interviews about how this is great compared to earlier iterations because it moves toward greater suspension of disbelief. Is that just important for aesthetics, or is it actually important for the research and other work that takes place in here?
L: Well, one of the things that I believe, based on 15 years of playing with this stuff, is that there’s some magic here. When you’re in there and you can get your disbelief suspended – I know it’s good for some things. I know it’s good for architectural walk-throughs, and maybe for looking at complex 3D things like brains. It’s good for training people to use really expensive pieces of equipment you don’t want to crash, like giant ships and airplanes, and I think it’s also good for some other things too. I don’t know what those are, but I want to figure that out. I want to be able to characterize what you need 360-degree surround for – are there sciences that would benefit from that? I think [Geology Professor] Jim Head would say, “Oh yeah, there are sciences that benefit from that; I can walk on the surface of Mars now, and I couldn’t before.”
BlogDH: Which came first, the professional clamor for this, or the idea to advance technology and see what this is good for?
L: You know, it’s a chicken-and-egg thing, and the way you phrased it, it ends up that way in particular. There have been stereo monitors, and stereo big screens, and stereo cubes, and now stereo – I don’t know what shape to call this – yurts, and along the way people have figured out applications, and other applications, and things that don’t work. It’s really been a cycle that’s gone back and forth many times; I don’t know what the first one was *laughs*. I suspect that someone had a use that they thought it was important for, and then they built something.
BlogDH: Lastly, does anyone ever get motion sick in there?
L: Yes, all the time. It’s a very common occurrence.
BlogDH: I don’t know if I was getting it in there, but a little bit – now that I’m out, I’m feeling a little funky . . .
L: And it can actually persist for a little while, so if you don’t feel well, definitely sit down and take it easy!
Right now, the YURT is not hugely accessible to the average student. Demos are time-consuming, as you can only fit between 12 and 15 people in the YURT at once before the sheer number of bodies interferes with a participant’s ability to see everything. (Fun fact: the engineers say you can fit 27 people in before it breaks – YURT-PARTY!) Demos also prevent anyone from doing actual work with the machine while they’re underway. If you’re really interested, contacting the Center for Computation and Visualization could be a good start, or you could seek out the old CAVE, which is now located in the Granoff building. In terms of using the YURT for independent projects, Professor Laidlaw said, “If someone really knows what they want to do . . . they have a vision for it . . . and they have the background – whether that’s modeling experience, or programming experience, or 3D-something experience – I’m certain that we can work something out.”
Hopefully, in the future, the YURT team will be equipped with the resources for any student to come and check out this instrument. On the bright side, Professor David Laidlaw is very accessible if you were to enroll in one of his Computer Science courses, such as CS16!
As an afterthought, we asked Professor Laidlaw why it was so important that technology like this gets funded and created:
“I think that it has the potential to accelerate a lot of things, to teach science at the pace of technology . . . This is another one of those kinds of instruments that can be used to figure out how brains work, how muscles work, how joints work, and how people work. It’s all part of advancing knowledge; sometimes, you need to spend some money to find some knowledge.”
Images via Matteo Mobilio ’16.5 and Caitlin Dorman ’16.