#virtual reality
canmom · 1 year
observations of virtual reality: movement and touch and cameras
So. Since I got this job a few weeks ago, I’ve been getting a very rapid introduction to VR games. I’m coming in after a good few years and several generations of VR games, so by this point tracking is very good, and new tech like AR passthrough and hand tracking is starting to become more common.
Predictably, it’s been absolutely fascinating! On the surface, VR games should be able to do everything that “flat screen” games can do and more, with the added bonus of head tracking and binocular vision. You are adding, not removing. And in some ways this is true. Moving through a VR space is incredibly intuitive - it’s very much like moving in a real physical space. If you want to get a close look at something, you stick your face up close to it. Compared to controlling an orbiting camera with an analogue stick it’s night and day. You never have to ‘fight’ the camera.
But... there’s many exceptions to that ‘same but more’. The weight of the headset is one. Especially with standalone headsets, where you’re essentially strapping a smartphone to your face. VR games often ask for a lot of physical movement and this is sweaty; the foam pad around the headset always gets very damp when you’re done.
But most of all, when you’re designing these things, you’re fighting the devil called ‘motion sickness’. This imposes a hard limit on performance: you cannot afford to drop frames, 72fps is a hard minimum. It also has big implications for game design.
(below: a little tour of design considerations in VR games: movement, interaction and camera mechanics)
One of the most fundamental verbs of videogames is movement. Nearly every genre of 3D game has you moving a character, or at least a camera, through its space. And while motion sickness certainly does affect players, it’s relatively rare, at least among my generation who grew up playing these games. We have, it seems, been able to train ourselves to feel comfortable behind a first or third-person character controller - at least when we’re the ones holding the mouse so the brain can calibrate expectations. (It’s different when watching a stream for many people.) Perhaps, even when you’re not paying attention to it, the presence of a stationary world outside the screen helps anchor the brain’s perception so it doesn’t find a clash between eyes saying ‘movement’ and inner ear saying ‘not movement’.
In VR, though, it’s a puzzle. The most natural way to move through a virtual space would be to physically walk around it, but most players don’t have a play area the size of a typical game level. And very very few people in the world could afford to own an omnidirectional treadmill!
So you have to apply movement that is in some way ‘unphysical’, moving the character without moving the player’s physical body (or equivalently, moving the world around the character). The obvious way to do this would be to use the same analogue sticks as in a typical character controller, but in practice this is disfavoured because it’s very, very likely to cause motion sickness.
Instead, game designers have to adapt their own games. The usual solution is to give players a means to teleport. For some reason the brain is a lot more comfortable with instant teleportation. In Superhot, for example, the game teleports you through a series of preset positions; Half-Life: Alyx has a variety of movement options, but the default is a targeted teleport. A whole lot of other games have you standing in one place while objects come towards you; more recently there are AR games which superimpose virtual objects into your space.
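To make that concrete, here’s a toy sketch of how a targeted teleport might pick its destination (not any engine’s actual API - the function and parameter names are made up): intersect the controller’s aim ray with the floor, validate the range, then jump there instantly rather than gliding.

```python
import math

def teleport_target(origin, direction, floor_y=0.0, max_range=8.0):
    """Intersect the controller's aim ray with a flat floor plane and
    return a teleport destination, or None if it's invalid or too far.
    The move itself is instant (often behind a quick fade), never a
    smooth glide - the glide is what makes people sick."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy >= 0.0:                       # aiming level or upward: no floor hit
        return None
    t = (floor_y - oy) / dy             # ray-plane intersection distance
    hit = (ox + dx * t, floor_y, oz + dz * t)
    if math.dist((ox, floor_y, oz), hit) > max_range:
        return None
    return hit

# aiming ~45 degrees down from a 1.6 m head height:
dest = teleport_target((0.0, 1.6, 0.0), (0.0, -0.707, 0.707))
```

A real implementation would raycast against level geometry and check a navmesh rather than assume a flat floor, but the shape of the mechanic is the same.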
What’s strange is there are some odd exceptions to that motion sickness rule. While moving along the ground seems prone to motion sickness, flying often seems to be a lot more comfortable, at least if the flight is relatively smooth and slow. There’s a whole category of flying games available on the store. Perhaps this is because in these circumstances you wouldn’t expect a very strong acceleration signal from the inner ear, so the brain finds it easier to forgive the missing signal.
There’s one game I tried a bit of, called Echo Arena, which is a multiplayer zero-g football-like game where you have to get a disc to a goal. The game includes a variety of movement mechanics: you can thrust in the direction you’re looking, you can use your hands to apply smaller thrusts, and you can push off solid surfaces. Despite the fact that I was standing on a solid floor (my legs got a bit tired!), the zero-g effect was quite compelling; it was easier to suspend disbelief than I thought it might be.
I imagine you could do some incredibly cool movement tech once you got used to it. Compared to other zero-g games I’ve played like Shattered Horizon, it felt a lot more intuitive to actually be immersed in the 3D space. However, it also had some limitations: making small turns was easy, but to make larger turns you either had to physically turn around on the floor (which feels unintuitive when you’re floating around in a zero-g world!) or use the analogue stick to turn in abrupt 12.5-degree increments. This is a lot fiddlier than mouse-look.
I have a pretty robust stomach against motion sickness, so I don’t know if the average player of Echo Arena would find it too much. But I found it surprising how something I thought would be a route to motion sickness - floating around at fairly high speeds - actually didn’t prove problematic at all. I think it’s perhaps that the majority of the time, movement in Echo Arena is purely inertial with no acceleration. You accelerate in brief boosts. By contrast, if you could constantly move with the analogue sticks, your acceleration would be a lot more variable.
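The distinction is easy to sketch (a toy model in the spirit of Echo Arena’s movement, not its actual code): the only acceleration is a one-off impulse at the moment of a boost, and between boosts velocity is constant.

```python
def zero_g_step(pos, vel, dt, boost=(0.0, 0.0, 0.0)):
    """Impulse-based zero-g movement: a boost adds velocity once,
    then you simply coast. There is no continuous per-frame stick
    acceleration for the inner ear to miss."""
    vel = tuple(v + b for v, b in zip(vel, boost))
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel

pos, vel = (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)
pos, vel = zero_g_step(pos, vel, 0.1, boost=(2.0, 0.0, 0.0))  # one thrust
for _ in range(10):                                           # then pure coasting
    pos, vel = zero_g_step(pos, vel, 0.1)
```

Contrast that with stick-driven movement, where the acceleration term is nonzero on almost every frame the stick is held.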
I do wonder how it compares to really floating around in a spacesuit in zero-g. There’s probably a single digit number of people who would be able to make the comparison!
Another surprising exception is the game Holoception, made by my new employer a few years before I joined the company. This game’s really cute, it’s like the ultimate form of Newgrounds stick figure fighting games. You control a third-person character, viewing the level from above. Your hands control the character’s hands, so you can swing weapons physically, and you can make them walk around with the analogue stick. As strange as it sounds on the surface (a blend of traditional control schemes and VR), it works startlingly well. I found I got used to controlling the little puppet very quickly; the only problems I had were more to do with occasional physics jank. And, oddly, moving with the analogue stick works just fine and does not cause motion sickness when you’re a floating eye in the sky.
The brain is a weird thing.
Anyway, a disproportionate number of VR players seem to be kids, at least going off the voices of people who talk aloud in VC. I don’t know if that’s just because kids find it easier to acclimate to a new way of relating to 3D space, or if it’s just the demographics of videogame players at large, or just the time of day I was playing when most adults would be at work, or maybe it’s just that kids find it easier to stand up for long periods to play games lmao. [Accessibility for people who can’t stand for long periods is a big problem for the current generation of VR games.] In any case, I wonder if it might be the case that if VR ends up getting popular enough, the next generation might find VR movement as intuitive as I find movement in an FPS. (My dad, by contrast, finds it extremely hard to get used to movement in 3D games. He tried Portal once and found it completely overwhelming.)
For any cyberpunk worldbuilders out there, perhaps one day we’ll get inner ear implants that override the sense of acceleration when in VR. Then maybe we get some Ghost in the Shell type scenario where you can hack someone’s cochlear implant and give them severe motion sickness. Or, perhaps we’ll get as comfortable disregarding our inner ear in VR as we do when playing an FPS...
The other interesting challenge of VR is touching solid surfaces. The amount of haptic feedback on most platforms is: you can make the controller vibrate. That’s it. For a game like Beat Saber, that’s enough: the controller gives a little jolt every time you hit a block, which is enough confirmation.
My employer has a line of games based around hand tracking, in the form of Hand Physics Lab and Surgineer. HPL is a collection of small puzzles designed around hand tracking, and the way it works is quite interesting. Essentially you have a virtual hand that is attached to your real hand by springs. The virtual hand will try to follow the position of your real hand as closely as possible, but it is a physics-simulated object and it will collide with the environment. So, for example, you can grasp an object by closing your hand around it. The springs will pull your virtual fingers onto the object. (There is also a grab assist which will glue an object to your hand when it detects a grab motion).
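The spring-follow idea can be sketched in one axis (a toy version assuming unit mass; in the real game the physics engine does the integration and, crucially, the collision response):

```python
def spring_follow(virt_pos, virt_vel, real_pos, dt, k=400.0, c=40.0):
    """One axis of the 'ghost hand': a damped spring (stiffness k,
    damping c, chosen here near critical damping) pulls the
    physics-simulated hand toward the tracked real hand. Because the
    virtual hand is an ordinary physics object, the collision solver
    can stop it at surfaces the real hand passes straight through."""
    force = k * (real_pos - virt_pos) - c * virt_vel
    virt_vel += force * dt              # unit mass, so a = F
    virt_pos += virt_vel * dt           # semi-implicit Euler step
    return virt_pos, virt_vel

# with nothing in the way, the virtual hand settles onto the real one:
pos, vel = 0.0, 0.0
for _ in range(200):                    # ~2 s at dt = 0.01
    pos, vel = spring_follow(pos, vel, 1.0, 0.01)
```

The tuning is the whole game: too stiff and the hand jitters against surfaces, too soft and it lags noticeably behind your real hand.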
Your fingers feel the pressure of closing against your own palm, and even though you don’t feel the weight and texture of the virtual object, it works well enough to sell the suspension of disbelief.
It is, inevitably with this hardware, a bit jank and fiddly. The hand tracking has its limits, and sometimes your virtual hand will get caught on something or bent in a funny way. Amusingly, a lot of the minigames are toys we give to babies: it is like we are relearning how to move, just as we did when we were fresh new brains awash with sense-data. But despite that, it works way better than you’d expect it to. You build an intuitive sense of how your ‘ghost hand’ relates to your real hand, and how to get it to do certain things. It doesn’t feel like really interacting with solid objects, but you can interact with objects with a great deal more dexterity than you can in a regular 3D physics-manipulation game (Garry’s Mod or something).
The VR controller is a small and fairly light piece of plastic. The hand tracking is literally an empty hand. In games, it can become all sorts of things. Most often it’s a hand, or else a weapon like a sword or a gun. The player can swing their hand or controller around freely, so how do you communicate a sense of weight? Well, proprioception - your body’s sense of where your limbs are - isn’t actually that precise. If you push against a heavy object, even if your real hands go straight through it, if your virtual hands strain to get it moving then your brain will override the proprioception with what it’s seeing, and it will still feel like you’re pushing something heavy.
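One common way to fake that weight (a hypothetical sketch - I’m not claiming any particular game does exactly this) is to make the virtual hand’s tracking lag scale with the held object’s mass:

```python
import math

def weighted_follow(virt, real, mass, dt, strength=30.0):
    """Pseudo-weight via tracking lag: the virtual hand chases the
    real hand more sluggishly the heavier the held object, so vision
    reports strain even though the controller itself is weightless.
    The exp() form makes the lerp framerate-independent."""
    rate = strength / mass              # heavy object -> slow follow
    alpha = 1.0 - math.exp(-rate * dt)
    return virt + (real - virt) * alpha

light = weighted_follow(0.0, 1.0, mass=1.0, dt=0.05)   # snaps along quickly
heavy = weighted_follow(0.0, 1.0, mass=20.0, dt=0.05)  # visibly drags behind
```

Since sight wins the argument with proprioception, that visual drag alone reads as heaviness.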
Surgineer builds on HPL’s hand tracking concept to have you play the role of a surgeon, picking up tools to cut a patient’s skin and bones. It’s deliberately quite silly - the surgeries you’re performing quickly escalate to things like brain transplants, and you have some pretty magical tools - but it is a game of fairly fine manipulation. I found this one tended to work a lot better with controllers (the controller buttons are used to determine the positions of your fingers, with the trigger and grip button causing your virtual hand to close), and I was able to complete most of the game, if not with very high scores. It’s a funny game, though quite difficult! I think it would be great for streams.
Now, the final level has you manipulating a robot arm with a joystick. This is where it kind of fell apart for me - a real joystick has resistance against being pushed, which the controller, held freely in your hand, doesn’t, so it’s difficult to rotate the virtual joystick without pulling your hand out of position. There were just too many layers of slippage in between me and the robot arm - real hand position to virtual hand position to virtual joystick position to robot arm position - and it was too hard to predict how the robot would move.
But that’s interesting in itself, for showing the limits of these methods. If you attached the controller to a pivot on your desk, it would probably feel a lot easier to manipulate the virtual joystick, since the physics of the real and virtual object would be similar.
Current VR systems track your hands and head, and that’s it. So if the player is playing as a human, the question is what you do with the rest of the body. If the player is visible to others, or you want them to look down and see their own body, you have to simulate it somehow - a combination of an animation system with IK to match the head and hand positions.
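The core of that IK is a small triangle solve. Here’s the minimal 2D two-bone case (a sketch with made-up arm lengths; full avatar solvers do this per limb in 3D, with joint limits and pole vectors to keep elbows pointing sensibly):

```python
import math

def two_bone_ik(root, target, upper=0.30, lower=0.30):
    """Given shoulder (root) and tracked hand (target) positions in
    2D, place the elbow so the two arm segments exactly reach the
    hand, clamping to maximum reach. The law of cosines gives the
    bend angle at the shoulder."""
    dx, dy = target[0] - root[0], target[1] - root[1]
    d = max(1e-6, min(math.hypot(dx, dy), upper + lower - 1e-6))
    cos_a = (upper * upper + d * d - lower * lower) / (2.0 * upper * d)
    a = math.acos(max(-1.0, min(1.0, cos_a)))
    base = math.atan2(dy, dx)
    return (root[0] + upper * math.cos(base + a),
            root[1] + upper * math.sin(base + a))

elbow = two_bone_ik((0.0, 0.0), (0.5, 0.0))
```

With only three tracked points for a whole body, everything below the chest is guesswork layered on top of solves like this, which is why VR avatars’ legs so often do strange things.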
And that adds its own design considerations. If it’s a combat game, is their body a target? Most games seem to elect to make the head the only point of vulnerability. For example in Beat Saber, it doesn’t matter if you’re standing in an obstacle as long as your head is clear. I’m not entirely sure if your body can get hit in Superhot. In Echo Arena, you have to hit other players on the head.
I’m not sure if anyone’s made a VR fencing game that’s any good. It seems like a very natural fit - a fencing piste is a limited area, and controllers map to swords very nicely. But also it would have a bunch of problems: if your sword gets parried, it will detach from your hand position, which will get very confusing very quickly. If you’re hit, you won’t feel it. I’ll have to look into that, it must have been done...
Another thing you lose out on in VR is framing. VR films do exist - I’ve watched a few - but they have a big problem: the camera frame is vitally important to how regular film communicates a story. Think of Sergei Eisenstein and montage theory; the language of film is the cut, the shot length, the angle.
Videogames already have the problem that you can’t rely on the player looking in a particular direction. They solve this with clever level design (e.g. you walk out of a tunnel into a wide open vista) or by seizing control of the camera in a cutscene. But mostly, they define the ways the camera can move. Indeed, many of the genres of game are defined in large part by their camera mechanics - a side-scrolling shmup versus a 2D platformer versus a third person shooter.
So if you think about games that make heavy use of the camera, such as NieR - how would this translate to VR? All the narrative VR games that I’ve played tend to be fully ‘immersive’ in the manner of Half-Life - you never leave the POV of your character. Perhaps a cutscene is possible, but you still have to accept that the player can move their POV around. A VR cutscene is more akin to a moving diorama than a film.
Before I started playing around in VR, I imagined that we’d have to find ways to realistically simulate the sensations of touch, and VR would never really feel ‘real’ without an incredibly fancy haptic feedback suit, omnidirectional treadmill, etc. The reality is in a way more interesting: videogames have always been about approximations and abstractions, and the same is just as true with VR. We are trying to find suitable representations to communicate what is intuitive in 3D space, give the brain enough hooks that it can adapt itself to a new form of interaction with the world. And brains are plastic! Look how well we’ve all adapted to becoming computer touchers.
Roger (my new boss ^^) compared current VR to the first generation of 3D games, when designers were still figuring out what mechanics and control schemes would make sense. (Nowadays nearly every 3D game controls the same way; it’s essentially a solved problem.) That’s where VR is: some standard patterns are emerging, but there’s still a lot of room for experimentation. So, even if my role is mostly in visuals at the moment, it’s exciting to be part of a new medium being born.
VR definitely isn’t going to replace flatscreen games. There are many genres of game that simply do not have any reason to be in VR at all outside of a gimmick. While VR motion tracking is now very good, and you can use the controllers as laser pointers (which is how most menus in VR work), they don’t have the precision of a mouse. Serving information to the player is another tricky problem: text in VR games tends to be big and anchored to a specific 3D location, so it’s hard to match the density of information you can get on a screen, and I’m told that information hovering in your peripheral vision does not feel good in VR.
That means, for example, that complicated strategy games are not a good fit. On the other hand, first-person shooting is very good in VR: you’re physically aiming your gun. It’s less accurate than mouse-driven first-person shooting, where you’re always perfectly aligned with your sights, but the manipulation of the gun feels more satisfying - it’s good at selling the fantasy. (Though you do lose out on the whole medium of first-person gun-interaction animations!)
It’s funny - something like Superhot is in many ways an evolution of the light gun games from the PS2 era. They’re having a moment once again.
So we’re working out a new set of genres for a new medium. In a few years, design patterns will probably settle down, we’ll work out what feels good, and games will start to mature as they did on PC and consoles. A lot of VR games now are built around one specific high concept or mechanic, in contrast to games on established platforms, where the mechanics are for the most part well-established and it’s more about fleshing them out into a cohesive package of stories and visuals. In a way, the majority of VR games feel a lot more like indie games than AAA games.
Will there be a THRUST//DOLL VR version one day? No promises, and the game will always be flatscreen first, but it’s going to be fun to see if it would work at all.