Posted by JibbSmart on 01 Jul 2021 05:10

Gyro aiming turns your whole controller into a frictionless mouse. It offers precision far beyond traditional stick-only aiming, and is much easier for new players to learn.

But the gyro in modern controllers detects rotation (or, more precisely, speed of rotation: angular velocity) in 3 axes, while a mouse only detects motion in 2. The purpose of this article is to compare the two most common spaces for converting gyro input into a mouse-like input (local space and world space), and to introduce player space as a more robust, more versatile solution that gets the best of both where it matters most.

You can read in the Additional Material about why we're primarily interested in using the gyro as a mouse (as opposed to giving players 3 degrees of rotational freedom), but that's secondary.

Too long? Read this: This article might look very long, but you can probably ignore a lot of it. It aims to give thorough explanations and examples of different gyro spaces, including code examples. Even if you don't want to use the player space solution advocated for here, you can find examples of robust world space and local space solutions. Since local and world space are relatively well-known concepts, feel free to skip those parts if they don't interest you.

The first section, The Basics, is targeted at players looking to better understand the options that should be available to them. This surface-level look is also useful for developers before getting into code. Code examples and breakdowns can be found in the In Depth and in Code section, which gets into the implementation details of player space gyro as compared with good examples of local and world space.

Finally, curious developers may want to explore some of the Additional Material. This includes the problem of "turn" players vs "lean" players, future work for one-size-fits-all gyro controls, examples of how to calculate the gravity vector from the controller's gyroscope and accelerometer, and why gyro as a mouse is so important.

I've spent a fair bit of time playing with player space gyro controls and I can't imagine going back. I shared the first public implementation in JoyShockMapper version 3.2 (before it was called "player space"), and the user response was extremely positive. Players noted some room for improvement at extreme angles, and so version 3.2.1 has the new and improved player space gyro controls described in this article. It also has the option to use the world space solution detailed here or stick with the default local space gyro. Look up the GYRO_SPACE setting in the readme to see what options are available. This article details the most important options there: LOCAL, WORLD_TURN, and PLAYER_TURN.

In the end, player space gyro controls are very simple. 4 lines of code, give or take:

GyroCameraPlayer(Vec3 gyro, Vec3 gravNorm) {
   // use world yaw for yaw direction, local combined yaw for magnitude
   float worldYaw = gyro.Y * gravNorm.Y + gyro.Z * gravNorm.Z; // dot product but just yaw and roll
   float yawRelaxFactor = 1.41;
   Camera.Yaw -= sign(worldYaw) * min(abs(worldYaw) * yawRelaxFactor, Vec2(gyro.Y, gyro.Z).Length())
      * Settings.GyroSensitivity * DeltaSeconds;
 
   // local pitch:
   Camera.Pitch += gyro.X * Settings.GyroSensitivity * DeltaSeconds;
}

But where'd we get that gravity vector from? What is this actually doing? We'll answer those questions soon enough. But first we need to understand the alternatives, what we're taking from them, and what problems we're solving by using player space gyro instead.

The Basics: Local, World, Player

Almost every modern game controller and phone has an IMU in it — an Inertial Measurement Unit. This can include a gyroscope (measuring angular velocity, usually in 3 axes), accelerometer (measuring acceleration, usually in 3 axes), and a magnetometer (measuring the magnetic field, usually in 3 axes). No standard console controller I know of includes a magnetometer, which can be very helpful for tracking the device's orientation in world space. But for using the controller as a mouse, the gyro and accelerometer will suffice — or often just the gyro on its own!

Local Space

The vast majority of gyro-controlled games I've played on Switch and PS4 use local space gyro controls. Same for input remappers like JoyShockMapper, Steam Input, DS4Windows, and reWASD. This is where the game doesn't care about the controller's orientation in real world space. It largely ignores the accelerometer (which can be used to detect gravity). All it cares about, moment to moment, is the controller's angular velocity around its local axes:

GyroAxesLocal_1920.png

In this image you can see the controller's local axes. We start with a top-down view of the controller on the left, then turn it about its local yaw axis (in green), then its local pitch axis (red), and finally its local roll axis (blue). As we turn the controller around each of those axes, the other two axes change relative to our view, but stay the same relative to the controller — that green yaw axis is always pointing in the same direction as the controller's sticks and its blue roll axis points out in the direction of the triggers and shoulder buttons.

Pros:

  • Incredibly simple to implement;
  • Very accurate;
  • Works reliably and consistently regardless of player posture (sitting up, lying back, etc) or environment (living room, bedroom, International Space Station, etc).

That last point makes local gyro ideal for games on handheld devices (phones, tablets, Switch in handheld mode).

Cons:

  • Default behaviour may not be intuitive for players depending on how they prefer to hold their controller;
  • As players pitch the controller up or down from their neutral position, the change in yaw axis can make it difficult to use.

I've advocated for local-only controls for a long time due to their simplicity and accuracy. It's very difficult for developers to get it wrong. By only relying on one sensor (the gyro), we can expect very reliable results. But I want to remove that initial friction for players who don't intuitively hold their controller flat. I don't want them to have to deal with unclear settings, like having to choose between "yaw" and "roll" (which games present inconsistently on Switch!).

In this side-on view of a controller held two ways, notice that when the controller is held flat (on the left), the controller's up/yaw axis lines up with the player's up axis. But if the controller is held upright (on the right), now the controller's forward/roll axis is lined up with the player's up axis. A local solution doesn't know which of these the player is using. Players who are used to holding their controller upright will generally prefer to rotate the controller about their up/yaw axis, which in this position is the controller's forward/roll axis.

GyroAxesSide.png

Ideally, we'd have a solution that can tell how the player is holding the controller and adjust accordingly. Not only would we fit the needs of more players by default, but players would be able to aim accurately in a much wider range of positions than with local gyro aiming. Because no matter how far the player pitches their controller up or down, the yaw axis would remain intuitive for them. So let's look at some such alternatives:

World Space

World space gyro controls calculate the direction of gravity to figure out which way is "up" relative to the player, and then keep the yaw axis aligned with the player's "up". See below how regardless of how the controller is oriented, the green yaw axis stays pointing in the same direction:

GyroAxesWorld_1920.png

By using the accelerometer to detect the direction of gravity, the controller's local space inputs can be converted to world space. Whether you hold the controller flat, upright, or even upside down, players can always turn the camera left and right by turning the controller left and right relative to them.

Pros:

  • More intuitive to players when implemented well;
  • Allows players to continue to aim accurately from a wider range of positions.

Cons:

  • Relatively difficult to implement well;
  • Not as good for handheld / mobile, since players may be reclining;
  • Significantly more prone to error from miscalculating the direction of gravity.

World space solutions are much more difficult to implement than local space solutions. To figure out where "up" is, we'll usually combine input from the accelerometer and gyro. This falls under a broad category called "sensor fusion", and you can read more about it in this article's Additional Material section, Sensor Fusion: Finding Gravity.

Even with the gravity direction in our hand, it's not necessarily obvious how to use that and the local angular velocities we get from the gyro to turn it into an appropriate camera or cursor movement. There aren't many games that offer world space gyro aiming, and some of those that do encounter serious issues (Mario Odyssey). But for brevity's sake, let's point to Splatoon 2 as an example of a good, robust world space gyro aiming solution. You'll find the world space solution recommended here to be very similar.

But here's the big catch with world space gyro controls: the software will often not be exactly correct about which way is up. There will frequently be some error in the calculated gravity vector, and when playing with world space gyro controls, that error is passed on to the player.

If the calculated gravity vector is 10° off, this can mean the player's input vector will be interpreted as 10° off as well. Those 10 degrees translate to a 17.4% displacement error. By that I mean, if the player moves their aimer or cursor 100 units, the resulting position could be as much as 17.4 units away from where it would've been without the error. It's worth noting that because players are self-correcting as they make movements, it shouldn't feel too bad. But thanks to human reaction times, that self-correction is delayed. Fast flick shots will have all the error with no time for self-correction.

To me, this error is the biggest shortcoming of a world space gyro aiming solution. Even though it removes (or rather reduces) a source of player error that comes with local space gyro aiming, it adds a new source of algorithmic error that's out of the player's hands. Is the 10° example typical? I don't know. It depends on how you're calculating gravity and how much the player is moving the controller about. But with the same gravity calculation, with the same error, we can do much better.

And that's the main purpose of this article: to present and explain player space gyro, which offers the same algorithmic accuracy as local space while giving the player even more intuitive control than world space.

Player Space

While local space and world space gyro aiming solutions will usually ignore any rotation that's not in two preferred axes, player space gyro aiming works a bit differently. Player space gyro trusts that the player's inputs in all 3 axes are intentional and meaningful. It still has some constraints, but generally it means players can continue to play in:

  • local space, without having to tell the game whether they prefer to hold the controller flat or upright;
  • world space, without worrying that a miscalculated gravity vector is adding error to their aim;
  • or anything in between.
GyroAxesPlayer_001.png

It's difficult to illustrate, but here's the idea: pitch rotation is read exclusively from the local pitch axis (red). The yaw axis is at right angles to the pitch axis, and is calculated from the combination of the controller's local yaw and roll angular velocities, with some help from gravity. This means the yaw axis could be anywhere on that green ring depicted above.

It might not be intuitive, but using local pitch all the time generally works very well. If the player is rotating their controller in world space yaw and pitch axes, the controller's local pitch actually always lines up with the world space pitch. It's only when players roll their controller that they go out of alignment. But players generally don't roll their controller very far anyway because on its own it isn't doing anything for their aim. And even when they have rolled out of alignment with the world pitch axis, it turns out just using local pitch is very intuitive for players anyway.

Zelda Breath of the Wild actually combines world space yaw and local space pitch, as far as I can tell. Splatoon 2 used to do the same until moving to a full world space solution that we'll explore later.

But player space gyro doesn't necessarily use world space yaw. It may also use local space yaw, or another axis in between, if that's what the player is using. In practice, players don't turn their controller exactly in their intended axis with respect to the controller, and they don't turn it exactly in their intended axis with respect to gravity (let alone whatever direction the application thinks gravity is pointing in). Some of their rotation will be lost to the ignored local roll or world roll axes.

This is what player space handles really well. It calculates a moment-to-moment yaw axis by combining the angular velocities in the local yaw and roll axes. The magnitude of the player's rotation is always respected and expressed in game. By trusting that the rotations in all 3 axes are intentional, it's pretty straightforward to convert these into the intended in-game movement.

What does this actually mean for the player? Player space gyro controls are easy to pick up. Whether you're used to Splatoon or DOOM (world or local, respectively), it should just work as expected. But players looking to make the most of player space gyro controls should consider it the same as a world space solution, because player space and world space gyro both offer far more range of movement than local space. Turn left and right by rotating the controller left and right with respect to your body / gravity. Pitch up and down by pitching the controller up and down. Player space offers even more freedom of movement than world space without any of its algorithmic error.

Pros:

  • Intuitive to just pick up and use (like world space gyro);
  • Very accurate (like local space gyro);
  • Resistant to player error (actual axis of rotation vs intended axis of rotation);
  • Allows players to continue to aim accurately from a wider range of positions (like world space gyro).

Cons:

  • More difficult to implement than local;
  • Reliance on gravity means not ideal for handheld / mobile (like world space);
  • Difficult to explain (so maybe just tell players to treat it like world space).

Ideally, I think it's probably best to default to local space in handheld modes and player space when using a controller. And this is very simple.

Player space gyro mostly avoids the inaccuracy of world space by relying on gravity much less. But we still do need to take gravity into consideration. If you hold your controller pitched up at a 45° angle and then turn your body left and right to turn the controller about your yaw axis, your controller is turning in its local yaw and roll axes at the same time. It'll depend on the coordinate space of your game or your controller, but for simplicity, let's say that as you turn left, the controller's yaw and roll velocities are both positive. But now pitch your controller down at a 45° below the horizon instead of above it and continue to turn left. Now the controller's rolling has been inverted. You still have positive yaw velocity, but negative roll velocity.

I'll go more in-depth in other sections, but to put it simply, we expect different behaviour in the local axes depending on whether the controller is pitched down, pitched up, or even upside-down. And in order to correctly combine these values, we need to use gravity to figure out which way we're pointing.

Remember when we said a 10° error with a world space solution can give you a 17.4% displacement error? With player space gyro controls, despite the fact that we do use the same gravity vector in our calculations, we get none of the error.

Now, because we have to invert yaw or roll input in some orientations, players treating player space gyro as local space gyro don't technically have the full freedom of movement they are used to. As they approach the boundaries where their local input would be inverted, it first gets pinched down towards zero to avoid sudden inversions. However, I think the range afforded to local space players before pinching begins is still very wide and lets players pitch their controller far higher or lower than they'd normally be comfortable with. So I think narrowing that space is a reasonable trade-off for no longer having to change and invert axes to get their aiming set up to their liking in the first place.

But ultimately my recommendation to players would be to treat player space gyro like world space gyro. You can enjoy the wider range of movement and more intuitive aiming that comes with it, while still having all of the algorithmic precision of local aiming.

In Depth and in Code

Let's unpack all 3 spaces with some pseudocode. We'll start extremely simple. For these, we're using the PlayStation controller's coordinate space:

  • X = side/pitch
  • Y = up/yaw
  • Z = forward/roll

This tutorial is based on my work on an input remapper. This means my output is virtual mouse events that will then be received by the game I'm playing. If you're also working on an input remapper, your experience will be similar. But if not, your outputs for camera yaw and pitch may need to be inverted when trying these examples.

Local

Here is your most basic local gyro aiming implementation:

GyroCameraLocal(Vec3 gyro) {
   Camera.Yaw += gyro.Y * Settings.GyroSensitivity * DeltaSeconds;
   Camera.Pitch += gyro.X * Settings.GyroSensitivity * DeltaSeconds;
}

Simple, right? This assumes the gyro and game angles are in the same units — radians per second / radians, or degrees per second / degrees. Almost every game with gyro controls I've played has another multiplier in there to convert to a different sensitivity scale. Why? Because that's what we're used to with mouse and stick aiming. There's no natural scale, so we pick one that looks nice. Maybe we pick a nice max value and call it "100%".

Please stop doing this with gyro controls. We're converting from an angle to an angle! That's it! "1" should mean a real world angle converts to the same in game angle. "2" should mean the camera turns twice as much as the controller. And so on. Anyway. That's not what this is about. If this is new to you, please check out Good Gyro Controls Part 1, or this much shorter summary on Gamasutra.

Enough distractions. Let's get back to it.

Some players like to hold their controller upright. Let's give them the option:

GyroCameraLocal(Vec3 gyro) {
   if (Settings.GyroTurnAxis == Yaw) {
      Camera.Yaw += gyro.Y * Settings.GyroSensitivity * DeltaSeconds;
   } else {
      Camera.Yaw += gyro.Z * Settings.GyroSensitivity * DeltaSeconds;
   }
   Camera.Pitch += gyro.X * Settings.GyroSensitivity * DeltaSeconds;
}

Invert if you need to for your coordinate space. A lot of games let the player choose which axes they invert, which is good.

Finally, we have the Overwatch solution, which just has both axes enabled at the same time. It also has different sensitivities in each axis, which is great! The player will use whichever they're most comfortable with. I first guessed they were just adding the two together: yaw * yawSensitivity + roll * rollSensitivity. But as I fleshed out my own such solution, I encountered edge cases similar to what I saw in Overwatch, so I think I have a pretty good idea how it actually works:

GyroCameraLocal(Vec3 gyro) {
   Vec2 yawAxes = Vec2(gyro.Y, gyro.Z);
   float yawDirection;
   if (abs(yawAxes.X) > abs(yawAxes.Y)) {
      yawDirection = sign(yawAxes.X);
   } else {
      yawDirection = sign(yawAxes.Y);
   }
 
   Camera.Yaw += yawAxes.Length() * yawDirection * Settings.GyroSensitivity * DeltaSeconds;
   Camera.Pitch += gyro.X * Settings.GyroSensitivity * DeltaSeconds;
}

For simplicity I skipped separate sensitivities for each axis and inversion in each axis. They're great, but not the most interesting part here. What's interesting is that the yaw and roll inputs aren't just added together to form the camera yaw output. Instead, we use the length of the vector formed by the yaw and roll components. This has the effect of treating both axes as two components in a rotation about a single axis.

See, for VR and other applications where we want to track the orientation of the controller, we treat the gyro input as a simple rotation about a single axis in 3D space. The three components of the gyro input form the direction of that axis, and the length gives us the size of the rotation. If we were to ignore one axis (in this case, pitch) and combine the others under the assumption they represent a single rotation about one axis, this is how we'd do it.

This means that the player can turn their controller in the local yaw axis or the local roll axis or another axis in between and the magnitude of the turn will be expressed in game.
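A quick numeric check of that property (helper name is mine): the output magnitude depends only on the total rotation speed, not on how it's split between yaw and roll.

```cpp
#include <cmath>

// Length of the (local yaw, local roll) angular velocity vector: the turn
// magnitude used by the combined solution. The same total rotation speed
// gives the same result whether it's all in one axis or split between them.
float CombinedTurnMagnitude(float localYaw, float localRoll) {
    return std::sqrt(localYaw * localYaw + localRoll * localRoll);
}
```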

The only problem is that when we get the length of that vector, we lose the signs of the individual components. What if the yaw component on its own would've turned the camera right while the roll component would've turned the camera left? In that case, as far as I can tell, the game has the larger of the two components decide the yaw direction. And this is where things can start to feel weird.

As long as the player keeps their controller pitched above the horizon but not so far that the sticks point below the horizon, they can turn their controller in the world yaw axis and get the appropriate rotation in game. If they go out of that range, yaw and roll will "disagree" with each other, but as long as they don't go much further, the larger of the two axes will keep the rotation sensible. But as they reach the point where yaw and roll are of similar magnitude, players can experience a weird back-and-forth sudden inversion of their turn direction as one axis momentarily exceeds the other.

So while it sometimes works as a world-space solution, it's probably best for players to treat Overwatch's gyro controls as local-only and try to turn their controller in their preferred local axis. Overwatch will honour their turn completely as long as their real axis of rotation doesn't stray too far from the controller's local axis.

And while there's a lot of good in Overwatch's solution, if you want something that still takes both yaw and roll axes at the same time without dealing with sudden inversions, I wouldn't fault you for sticking with something simpler: Camera.Yaw += (gyro.Y * Settings.YawSensitivity + gyro.Z * Settings.RollSensitivity) * DeltaSeconds. Done like that, when the input axes start to disagree, they'll cancel out gradually, and players are hopefully encouraged to stick to one local axis.

World

A robust world space solution is really simple if you've already got your gravity vector. Sure, calculating the gravity vector yourself can be complicated, but depending on your platform or the libraries you're using, you may already have gravity calculated for you. If not, you can learn about calculating a gravity vector from gyro and accelerometer input in a later section (Sensor Fusion: Finding Gravity). For now we'll just assume you've got it. I'm also assuming it's normalised (we're interested in the direction of gravity, not the strength of it).

Getting the world yaw rotation is very simple, because the world yaw axis is always aligned with gravity. Taking the gyro input as a 3D vector, how much of that vector is aligned with gravity? That's a simple dot product, and that gives us our yaw.

Pitch is a little trickier. The world pitch axis for our purposes is actually still based on the direction the controller is pointing (unlike world yaw). There are a couple of ways to do this. One (bad) way is to calculate the controller's orientation (usually a quaternion), convert to Euler angles, and use the difference in pitch between the previous frame and the current frame. Since we're treating gyro as a mouse, we're not using that absolute orientation in game. More on that in The Importance of Gyro as a Mouse later on in this article. Changes in the controller's real world pitch don't always translate nicely to an in-game pitch velocity, especially when the controller's pitch and the camera's in-game pitch differ. It also means you have to deal with the pitch velocity flipping as the player goes past pointing vertical. Mario Odyssey does this when controlling a tank. It's not fun when that happens.

It also means you have to decide what the "front" of the controller is. This differs from player to player (some hold it flat, some hold it upright, and some in-between). The whole point of us pursuing a world space solution is to avoid that. So let's avoid that issue altogether.

Here's a much simpler, much more robust way of dealing with pitch that will behave well in all practical use: take the controller's local pitch axis and project it onto the horizontal plane (as determined by the direction of gravity). Any rotational velocity around that projected axis will be treated as pitch velocity in game.

As you lean the controller onto its side, the controller's local pitch axis might line up with the gravity direction, and then you can no longer project the pitch axis onto the horizontal plane. That's okay. When gyro aiming, it'd be very unusual for players to lean their controller on the side like that. To avoid the pitch axis changing suddenly, we can just reduce the pitch velocity as we get near that boundary.

Let's look at some code:

GyroCameraWorld(Vec3 gyro, Vec3 gravNorm) {
   // some info about the controller's orientation that we'll use to smooth over boundaries
   float flatness = abs(gravNorm.Y); // 1 when controller is flat
   float upness = abs(gravNorm.Z); // 1 when controller is upright
   float sideReduction = clamp((max(flatness, upness) - 0.125) / 0.125, 0, 1);
 
   // world space yaw velocity (negative because gravity points down)
   Camera.Yaw -= gyro.Dot(gravNorm) * Settings.GyroSensitivity * DeltaSeconds;
 
   // project pitch axis onto gravity plane
   float gravDotPitchAxis = gravNorm.X; // shortcut for (1, 0, 0).Dot(gravNorm)
   Vec3 pitchVector = Vec3(1, 0, 0) - gravNorm * gravDotPitchAxis;
   // that's all it took!
 
   // normalize. it'll be zero if pitch and gravity are parallel, which we ignore
   if (!pitchVector.IsZeroVector()) {
      pitchVector = pitchVector.Normalize();
      // camera pitch velocity just like yaw velocity at the beginning
      // (but squish to 0 when controller is on its side)
      Camera.Pitch += sideReduction * gyro.Dot(pitchVector)
         * Settings.GyroSensitivity * DeltaSeconds;
   }
}

sideReduction is calculated such that it's normally just '1', and it will be '0' if the controller is totally on its side. If the controller is almost on its side, sideReduction will be between 0 and 1, and will reduce the effect of pitching the controller. Players will find similar behaviour in Splatoon 2.
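Here's that calculation in isolation, as I read it (helper name is mine), assuming the PlayStation coordinate space from earlier: a flat or upright controller gets a factor of 1, a controller fully on its side gets 0, and there's a short ramp in between.

```cpp
#include <algorithm>
#include <cmath>

// sideReduction from the normalized gravity vector, in the PlayStation
// coordinate space used above (Y = up/yaw, Z = forward/roll). It's 1 unless
// both |gravY| and |gravZ| drop below 0.25 -- i.e. the controller is close
// to resting on its side -- and it reaches 0 once both are below 0.125.
float SideReduction(float gravY, float gravZ) {
    const float flatness = std::fabs(gravY); // 1 when controller is flat
    const float upness = std::fabs(gravZ);   // 1 when controller is upright
    return std::clamp((std::max(flatness, upness) - 0.125f) / 0.125f, 0.f, 1.f);
}
```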

Because neither pitch nor yaw depend on what the player considers to be the front of their controller, this will work whether players like to hold their controller flat, upright, or somewhere in between. If you pitch the controller so far that it's upside down, both yaw and pitch will still behave as expected from the player's point of view. So if you want a world space solution, I think you'll find this to be a great starting point.

Now that our axes of rotation are determined by our calculated gravity vector, any error in our gravity vector will result in the same amount of error in input direction. A 10° error in gravity direction can translate to a 10° error in input direction. And then there's player error. The player will often not be turning in exactly their intended axis. Some of their intentional rotation can and will be lost if it's not in the world yaw or calculated world pitch axis. So let's finally get to the player space solution, which addresses both of these issues.
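To check the 17.4% figure from earlier (function name is mine): if the input direction is rotated by the angle error, the distance between the intended and actual end points of a movement is the chord 2 · sin(error/2) · length, which for 10° and 100 units comes to about 17.4 units.

```cpp
#include <cmath>

// Displacement between the intended and actual end points of a movement of
// moveLength units when the input direction is off by errorDegrees.
float DisplacementError(float errorDegrees, float moveLength) {
    const float errorRadians = errorDegrees * 3.14159265f / 180.f;
    return 2.f * std::sin(errorRadians * 0.5f) * moveLength;
}
```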

Player

Player space gyro uses the most valuable parts of local and world space gyro. We will:

  • Use local pitch;
  • Combine yaw and roll to get rotation around whatever axis the player is actually using;
  • Use the gravity direction to detect when yaw or roll need to be inverted to combine together more nicely;
  • Also use the gravity direction to smooth over boundaries between axis inversions.

Let's start by taking the simplest parts of world space and local space. World space has a one-liner for yaw, local space has a one-liner for pitch. So let's use those:

GyroCameraPlayer(Vec3 gyro, Vec3 gravNorm) {
   // world yaw:
   Camera.Yaw -= gyro.Dot(gravNorm) * Settings.GyroSensitivity * DeltaSeconds;
   // local pitch:
   Camera.Pitch += gyro.X * Settings.GyroSensitivity * DeltaSeconds;
}

This is a nice and simple start. I think this is basically what Zelda Breath of the Wild does (and Splatoon 2 used to do). You can tell by turning your controller on its side and turning it in its local pitch axis (which happens to also be the world yaw axis). The camera turns diagonally, pitching and yawing at the same time in roughly equal measure.

Let's remove local pitch from the world yaw calculation to avoid that interference:

GyroCameraPlayer(Vec3 gyro, Vec3 gravNorm) {
   // world yaw, but only using local yaw and roll
   float worldYaw = gyro.Y * gravNorm.Y + gyro.Z * gravNorm.Z;
   Camera.Yaw -= worldYaw * Settings.GyroSensitivity * DeltaSeconds;
   // local pitch:
   Camera.Pitch += gyro.X * Settings.GyroSensitivity * DeltaSeconds;
}

This stops local pitch from interfering with world yaw input, although I don't think that was really a big issue. The bigger gain here is that pitch input no longer contributes to yaw output. And we already stopped yaw and roll from contributing to pitch output. This axis cross-contamination was the biggest source of error when the gravity vector is calculated incorrectly. Being 10° out of axis means some of the yaw input can go to the pitch output. But because of the way circles work, that 10° adds 17.4% of the yaw input to the pitch output (sin(10°) = 0.174) while only removing 1.5% of the yaw input from the yaw output (1 - cos(10°) = 1 - 0.985 = 0.015). Check out this interactive unit circle to see what I mean.
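Those two numbers are easy to verify (helper names are mine):

```cpp
#include <cmath>

// For an axis error of errorDegrees, the fraction of yaw input that leaks
// into the pitch output is sin(error)...
float CrossAxisLeak(float errorDegrees) {
    return std::sin(errorDegrees * 3.14159265f / 180.f);
}

// ...while the fraction of yaw input lost from the yaw output is only
// 1 - cos(error).
float InAxisLoss(float errorDegrees) {
    return 1.f - std::cos(errorDegrees * 3.14159265f / 180.f);
}
```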

Now that the yaw and roll inputs can't add to the pitch output (and vice versa), we have eliminated that bigger source of error. We have also made it so that the player can't turn their controller on its side and still operate it in world space, but for mouse-like control in games, players aren't doing that.

But it's not enough to just reduce the error. Let's get rid of it altogether. If we're trusting that the player is intentional with their input rotation, we want to use the full yaw and roll input. We do this by getting the length of the vector composed of yaw and roll (square root of the sum of squares). For readability, like with the same example in the local space section, we're just going to convert them to a 2D vector and get the length, but now we're using the sign of our world space yaw to decide which direction to turn:

GyroCameraPlayer(Vec3 gyro, Vec3 gravNorm) {
   // use world yaw for yaw direction, local combined yaw for magnitude
   float worldYaw = gyro.Y * gravNorm.Y + gyro.Z * gravNorm.Z; // dot product but just yaw and roll
   Camera.Yaw -= sign(worldYaw) * Vec2(gyro.Y, gyro.Z).Length()
      * Settings.GyroSensitivity * DeltaSeconds;
 
   // local pitch:
   Camera.Pitch += gyro.X * Settings.GyroSensitivity * DeltaSeconds;
}

Now no intentional player input is lost. We used gravity to figure out whether yaw and roll should be combined into a left rotation or a right rotation, and we can count on that being correct for any remotely reasonable input. But we're not quite there. Although it's uncommon, some players are already used to unusual ways of mapping controller rotations to camera rotations (we'll get to that in Turn vs. Lean). If players turn their controller in the "lean" axis (world roll) instead of the "turn" axis (world yaw), they'll find their turn direction seemingly working until it suddenly inverts as they cross a threshold they cannot see.

In practice, I think it's good to place some constraints on the input, for two reasons. First, constraints teach players how the input is meant to be used. Second, even if players intuitively aim correctly, as they explore the limits of acceptable input and find these thresholds where the input suddenly inverts, they will perceive this as buggy behaviour (we can't count on players being familiar with garbage in, garbage out).

When players are rotating very far out of axis like that, the magnitude of the world yaw turn will be much smaller than the total combined local yaw and roll. In fact, before inverting, world yaw will approach 0 pretty nicely. If players are turning their controller weirdly and getting little to no response from the game, they'll quickly pick up that it's not how the controller is supposed to be used (just like turning the controller out of axis with a world space or local space gyro solution). So let's allow the constrained world yaw to take over when players start turning too far out of axis:

GyroCameraPlayer(Vec3 gyro, Vec3 gravNorm) {
   // use world yaw for yaw direction, local combined yaw for magnitude
   float worldYaw = gyro.Y * gravNorm.Y + gyro.Z * gravNorm.Z; // dot product but just yaw and roll
   float yawRelaxFactor = 1.41;
   Camera.Yaw -= sign(worldYaw)
      * min(abs(worldYaw) * yawRelaxFactor, Vec2(gyro.Y, gyro.Z).Length())
      * Settings.GyroSensitivity * DeltaSeconds;
 
   // local pitch:
   Camera.Pitch += gyro.X * Settings.GyroSensitivity * DeltaSeconds;
}

The magnitude of world yaw will always be less than or equal to combined yaw and roll. When combined yaw and roll line up perfectly with the world yaw they'll be of equal length, but as they get further out of alignment, world yaw gets smaller and smaller. Left alone, our min function there would always use the worldYaw value. But if we bump it up with our magic yawRelaxFactor number, we create a buffer zone where we will use the full combined yaw and roll as long as they're close enough to being aligned with world yaw. If the axis of rotation gets too far away from our calculated world yaw axis (which is just the gravity direction), the output will start being squeezed down towards zero.

This buffer zone is where we make room for the player to unintentionally be out of axis and/or for the gravity vector to be calculated wrong. The example above uses a value of 1.41 to give us a roughly (90 - asin(1/1.41) =) 45° buffer, which I think is probably enough to give the player error-free world-like gyro aiming. Even if the gravity error + player error gets out of that range, it'll still constrain the player less than a strict world space solution.
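If it helps to see this behaviour outside of pseudocode, here's a rough Python translation of the function above (the function and parameter names are my own, and the axis conventions are assumed to match the pseudocode's):

```python
import math

def player_space_yaw_delta(gyro, grav_norm, sensitivity, delta_seconds,
                           yaw_relax_factor=1.41):
    """One frame of player space yaw, following the pseudocode above.

    gyro is (pitch, yaw, roll) angular velocity in the controller's local
    space; grav_norm is the normalized gravity direction in local space.
    """
    _, gyro_yaw, gyro_roll = gyro
    _, grav_yaw, grav_roll = grav_norm

    # world yaw: dot product of gyro and gravity, ignoring the pitch axis
    world_yaw = gyro_yaw * grav_yaw + gyro_roll * grav_roll

    # combined magnitude of local yaw and roll
    combined = math.hypot(gyro_yaw, gyro_roll)

    # take world yaw's direction, but allow the full combined magnitude
    # while the rotation stays close enough to the world yaw axis
    magnitude = min(abs(world_yaw) * yaw_relax_factor, combined)
    return -math.copysign(magnitude, world_yaw) * sensitivity * delta_seconds
```

A rotation aligned with the world yaw axis passes through at full strength, a rotation 45° out of axis sits right at the edge of the 1.41 buffer, and a rotation 90° out of axis produces no yaw at all.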

But I claimed that players used to local space solutions (like those offered in the vast majority of gyro aiming games) could just pick up and play with a player space solution, as well. As long as players are within that buffer zone, they can operate entirely in the controller's local space, and it'll work great (regardless of whether they favour holding their controller flat and using local yaw or holding it upright and using local roll). ±45° is arguably not much room, though. That's why in JoyShockMapper I have set yawRelaxFactor to 2 to give players (90 - asin(1/2) =) 60° of freedom for local space aiming. Even with a local-only solution, players will try to avoid turning their controller very far out of their comfortable range because it can adversely affect their aim. ±60° should be plenty of room for most. If you're happy to trust players more and give them more freedom of movement, you might prefer to do the same in your game.

It's up to you what your "relax factor" is. Just remember that the more you constrain the player's input to an axis calculated from gravity, the more likely their aim will be affected by errors in the gravity calculation (even if only slightly). The less you constrain it, the more likely players are to think weird inputs are supposed to be accepted. I think the vast majority of players will be served well by a value between 1.15 (which gives about 30°) and 2 (60°), and you can experiment outside of that range, too.
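If you want to see what buffer a given relax factor buys you, the conversion used above (90° minus the arcsine of the factor's reciprocal) is easy to tabulate. A quick sketch:

```python
import math

def buffer_angle_degrees(relax_factor):
    """How far out of the world yaw axis a rotation can get (in degrees)
    before the relax factor starts squeezing the output down."""
    return 90.0 - math.degrees(math.asin(1.0 / relax_factor))

for factor in (1.15, 1.41, 2.0):
    print(f"relax factor {factor}: ~{buffer_angle_degrees(factor):.0f} degree buffer")
```

Anything past that angle starts getting squeezed down towards the constrained world yaw value.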

But there you have it! That's all you need to implement player space gyro controls.

Conclusion

Player space gyro is not a silver bullet. Local is still probably better for handheld, where the player is far less likely to be aligned with gravity and the "controller" is always in the exact same position relative to the screen. And in theory, having unintended roll input add to intentional yaw input might cause stability issues for some users. This was something I was mindful of as I tested and worked on it, but for me, at least, this hasn't been an issue.

Player space gyro's increased freedom of movement has made it hard for me to go back to local space gyro. I have far more room to pitch my controller up and down in highly vertical games like Quake Champions and Fortnite without losing any accuracy with my horizontal aiming. And although my world space solution is robust and offers similar benefits, I can feel my movements are more constrained with it than with player space. Player space gyro aiming lets me perform flick shots more accurately thanks to its error-free handling of gravity. And since it's also much simpler to implement than a robust world space solution, I believe player space to be the uncompromising winner.

JoyShockMapper's users have responded extremely positively to being able to use player space gyro aiming! So I encourage you to give some form of player space gyro a go, and give your players the option to use it. It could be a great default setting when players are playing with a controller.

When exposing this option to users, it may be simpler to call it "world" as opposed to the alternative "local" space. But if you have tooltips or something to offer a more detailed description, gyro power-users will appreciate knowing that they're using a "player space" solution.

In Summary:

  • For handhelds, local space gyro probably remains the best solution, where the screen is always in a fixed position in relation to the "controller", and players may be lying back, forward, sitting up, or really be in any position;
  • For everything else, player space gyro lets the player turn the controller relative to their body while being simpler to implement and far less prone to error than standard world space gyro solutions. This allows better defaults, fewer settings to fit the player's needs, and far more freedom of movement.

But does this fit every player's preferences? How do we find gravity if we don't already have it? Why is it so important to see the gyro as a mouse? Check out the Additional Material below!

Additional Material

For a more complete understanding of the current state of gyro controls and what options are available to you, here's some more info to check out.

Turn vs. Lean

It's not uncommon for games that offer gyro controls to also let players choose their axis of rotation. They'll normally let players choose between yaw and roll. But there are some problems with this:

  1. Players don't intuitively know what yaw and roll are. This isn't helped by the fact that games can sometimes present this incorrectly, too. Overwatch's "yaw" and "roll" make sense in handheld mode, but are incorrect with a detached controller. The player's understanding of what these axes are will depend on which game they first played with this option.
  2. Players will choose "roll" for different reasons. Some choose it because they hold the controller upright, turning it in world space yaw, which player space handles for them. But some prefer to hold their controller flat and lean it from side to side to turn the camera left and right. Again, this is something that some games offer by default (see Overwatch's "Pro Controller" preset, which interprets roll as a flat lean rather than an upright turn).

We have two points of ambiguity here between yaw and roll. Does the player know the correct axis names? Does the player intend to use their controller flat or upright? In order to avoid these issues, I've started calling world space yaw "turn" and world space roll "lean". With a world space or player space solution, we can ignore the issue of whether the player holds their controller flat or upright, and instead just let them decide whether they want to "turn" the controller or "lean" it.

I think "turn" on its own may be vague, but the "lean" option contextualises it: "lean" can only mean one thing, so "turn" must mean the other.

With that out of the way, the player space, world space, and local space solutions described above assume every player prefers to turn their controller to turn the camera. This is definitely the best default. It's most intuitive to most players, and is obviously a more natural mapping of the controller rotation to the camera rotation. Games that don't let players choose between yaw and roll will almost always map turning the controller to turning the camera.

However, there are some players who are more used to leaning their controller to control the camera. In the past, I thought this was relatively common as a few people complained about the default controls in JoyShockMapper. Over time I've realised that those complaints almost always came from the player holding the controller upright (which clashed with JSM's default local yaw to turn) rather than the player preferring to lean. I've found it very difficult to find players that actually prefer lean over turn, but someone owned up to being a convert from lean to turn in the Gyro Gaming Discord server, so there's that.

So if you really want to cover all preferences, you may want to provide "lean" options alongside your "turn" options. These will require more code than "turn" solutions because we'll no longer be using the gravity axis (world yaw) directly. Instead, we need to calculate a new world roll axis from the world yaw and world pitch axes.

GyroCameraPlayerLean(Vec3 gyro, Vec3 gravNorm) {
   // some info about the controller's orientation that we'll use to smooth over boundaries
   float flatness = abs(gravNorm.Y); // 1 when controller is flat
   float upness = abs(gravNorm.Z); // 1 when controller is upright
   float sideReduction = clamp((max(flatness, upness) - 0.125) / 0.125, 0, 1);
 
   // project pitch axis onto gravity plane
   float gravDotPitchAxis = gravNorm.X; // shortcut for (1, 0, 0).Dot(gravNorm)
   Vec3 pitchVector = Vec3(1, 0, 0) - gravNorm * gravDotPitchAxis;
 
   // pitchVector will be zero if pitch and gravity are parallel, which we ignore
   if (!pitchVector.IsZeroVector()) {
      Vec3 rollVector = pitchVector.Cross(gravNorm);
      if (!rollVector.IsZeroVector()) {
         rollVector = rollVector.Normalize();
         float worldRoll = gyro.Y * rollVector.Y + gyro.Z * rollVector.Z;
         float rollRelaxFactor = 1.15;
         Camera.Yaw -= sign(worldRoll) * sideReduction
            * min(abs(worldRoll) * rollRelaxFactor, Vec2(gyro.Y, gyro.Z).Length())
            * Settings.GyroSensitivity * DeltaSeconds;
      }
   }
 
   // local pitch:
   Camera.Pitch += gyro.X * Settings.GyroSensitivity * DeltaSeconds;
}
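As a sanity check, here's a rough Python translation of that lean function (again, the names are my own and the axis conventions are assumed to match; the worldRoll shortcut relies on the world roll axis having no local pitch component, which falls out of the projection):

```python
import math

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def player_space_lean_delta(gyro, grav_norm, sensitivity, delta_seconds,
                            roll_relax_factor=1.15):
    """One frame of player space "lean" yaw, following the pseudocode above.

    gyro is (pitch, yaw, roll) angular velocity in the controller's local
    space; grav_norm is the normalized gravity direction in local space.
    """
    _, gyro_yaw, gyro_roll = gyro
    grav_x, grav_y, grav_z = grav_norm

    # fade the output out when the controller is turned on its side
    flatness = abs(grav_y)  # 1 when controller is flat
    upness = abs(grav_z)    # 1 when controller is upright
    side_reduction = clamp((max(flatness, upness) - 0.125) / 0.125, 0.0, 1.0)

    # project the local pitch axis (1, 0, 0) onto the plane perpendicular to gravity
    pitch_vec = (1.0 - grav_x * grav_x, -grav_x * grav_y, -grav_x * grav_z)

    # world roll axis: cross product of the projected pitch axis and gravity
    rx = pitch_vec[1] * grav_z - pitch_vec[2] * grav_y
    ry = pitch_vec[2] * grav_x - pitch_vec[0] * grav_z
    rz = pitch_vec[0] * grav_y - pitch_vec[1] * grav_x
    length = math.sqrt(rx * rx + ry * ry + rz * rz)
    if length == 0.0:
        return 0.0  # pitch axis and gravity are parallel; ignore this case
    ry, rz = ry / length, rz / length

    world_roll = gyro_yaw * ry + gyro_roll * rz
    magnitude = min(abs(world_roll) * roll_relax_factor,
                    math.hypot(gyro_yaw, gyro_roll))
    return (-math.copysign(magnitude, world_roll) * side_reduction
            * sensitivity * delta_seconds)
```

With the controller held flat, leaning it (local roll) turns the camera at full strength, while local yaw produces nothing.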

Future Work

What about a true one-size-fits-all solution? Can we handle turn players (both flat and upright) and lean players (both flat and upright) at the same time? Maybe. Let's combine both a player turn and a player lean solution, ultimately choosing the outcome that aligns best with the player's input rotation, and smooth over the transition between the two:

GyroCameraPlayerTurnOrLean(Vec3 gyro, Vec3 gravNorm) {
   float turn = 0;
   float lean = 0;
 
   // use world yaw for yaw direction, local combined yaw for magnitude
   float worldYaw = gyro.Y * gravNorm.Y + gyro.Z * gravNorm.Z; // dot product but just yaw and roll
   float yawRelaxFactor = 1.15;
   turn -= sign(worldYaw)
      * min(abs(worldYaw) * yawRelaxFactor, Vec2(gyro.Y, gyro.Z).Length())
      * Settings.GyroSensitivity * DeltaSeconds;
 
   // some info about the controller's orientation that we'll use to smooth over boundaries
   float flatness = abs(gravNorm.Y); // 1 when controller is flat
   float upness = abs(gravNorm.Z); // 1 when controller is upright
   float sideReduction = clamp((max(flatness, upness) - 0.125) / 0.125, 0, 1);
 
   // project pitch axis onto gravity plane
   float gravDotPitchAxis = gravNorm.X; // shortcut for (1, 0, 0).Dot(gravNorm)
   Vec3 pitchVector = Vec3(1, 0, 0) - gravNorm * gravDotPitchAxis;
 
   // pitchVector will be zero if pitch and gravity are parallel, which we ignore
   if (!pitchVector.IsZeroVector()) {
      Vec3 rollVector = pitchVector.Cross(gravNorm);
      if (!rollVector.IsZeroVector()) {
         rollVector = rollVector.Normalize();
         float worldRoll = gyro.Y * rollVector.Y + gyro.Z * rollVector.Z;
         float rollRelaxFactor = 1.15;
         lean -= sign(worldRoll) * sideReduction
            * min(abs(worldRoll) * rollRelaxFactor, Vec2(gyro.Y, gyro.Z).Length())
            * Settings.GyroSensitivity * DeltaSeconds;
      }
   }
 
   // our "weights" here are how important each axis is based on the comparative size of the inputs
   float turnWeight = abs(turn);
   float leanWeight = abs(lean);
   float cutoffFactor = 0.75;
   if (turnWeight != 0 || leanWeight != 0) {
      // this cutoff will reduce the smaller component to 0 if it's not close in size to the larger component
      if (turnWeight < leanWeight) {
         float cutoff = leanWeight * cutoffFactor;
         turnWeight *= clamp((turnWeight - cutoff) / (leanWeight - cutoff), 0, 1);
      } else {
         float cutoff = turnWeight * cutoffFactor;
         leanWeight *= clamp((leanWeight - cutoff) / (turnWeight - cutoff), 0, 1);
      }
 
      // weighted average, but thanks to our cutoff this will usually just mean one or the other
      Camera.Yaw += (turn * turnWeight + lean * leanWeight) / (turnWeight + leanWeight);
   }
 
   // local pitch:
   Camera.Pitch += gyro.X * Settings.GyroSensitivity * DeltaSeconds;
}

This combined solution also narrows the range in which players can turn their controller in local space (whether accidentally or intentionally) and have their rotation fully expressed in game. In practice, I do find that I bump into the boundary between turning and leaning (such as when I happen to turn my controller in its local yaw when it's pitched above the horizon). Hitting this boundary still feels abrupt, even with the smaller yawRelaxFactor and rollRelaxFactor I have here (30° buffer). Reducing the cutoffFactor will soften the boundary some more, but also require me to be more careful about how I'm using the controller.

The code shown here is a very naive solution. It just combines our player turn and player lean functions and uses a weighted average (with some filtering on the weights) for the output. For this to be useful in a game would take more work in my opinion. Even if that work is fruitful, whether such a compromise is worthwhile probably depends a lot on what proportion of players actually prefer to lean their controller to turn their camera. Turning is both more natural and is what most games teach players (notably all Nintendo-made Switch games that offer gyro/motion aiming only offer "turn" controls). So I won't recommend this solution just yet, but there it is if you want to check it out.
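To make the weighting logic easier to poke at, here's just the cutoff-and-blend step pulled out into a standalone Python function (hypothetical name, same cutoffFactor of 0.75):

```python
def blend_turn_and_lean(turn, lean, cutoff_factor=0.75):
    """Weighted average of turn and lean, with the smaller component
    faded to zero unless it's close in size to the larger one."""
    turn_weight = abs(turn)
    lean_weight = abs(lean)
    if turn_weight == 0.0 and lean_weight == 0.0:
        return 0.0
    # reduce the smaller component to 0 if it's not close in size to the larger
    if turn_weight < lean_weight:
        cutoff = lean_weight * cutoff_factor
        turn_weight *= max(0.0, min(1.0, (turn_weight - cutoff) / (lean_weight - cutoff)))
    else:
        cutoff = turn_weight * cutoff_factor
        lean_weight *= max(0.0, min(1.0, (lean_weight - cutoff) / (turn_weight - cutoff)))
    # weighted average; thanks to the cutoff this usually means one or the other
    return (turn * turn_weight + lean * lean_weight) / (turn_weight + lean_weight)
```

With these weights, a clearly dominant component wins outright, and only near-equal components are actually averaged.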

Sensor Fusion: Finding Gravity

The accelerometer will often tell you which way gravity is pointing, but not always. A perfect accelerometer will only detect 0 in all axes when the controller is in freefall. When the controller is held still in the player's hands, those hands are exerting an upward force on the controller to keep it from falling. The accelerometer detects that, and you can flip it and say "gravity is pointing that way." But the accelerometer is also detecting linear acceleration, whether from intentional shakes/waggles, or unintentional movement during regular play. The accelerometer alone cannot reliably tell you the direction of gravity.

Sensor fusion to the rescue! "Sensor fusion" refers to combining data from more than one kind of sensor in a way that gives you a better result than using just one kind of sensor. In this case, when the controller is moving around, we calculate the gravity vector by just rotating our previous gravity vector by our gyro input (inverted). This will do a great job maintaining our gravity vector for the most part, but will gradually accumulate errors (rounding errors, sample rate errors, and noise from the gyro). By the time that error is significant, we hope the controller will have been held still long enough for us to calculate a brand new gravity vector using the accelerometer.

A good breakdown of this idea (with a bit of maths in it) can be found on the Oculus developer blog.

Since knowing which way is up is so useful for world space and player space gyro, I want to make sure you have some way to find the gravity vector. I'm hoping this has readers from a variety of contexts. It could be that:

  • You are working on a platform that already provides a gravity vector, or at least offers a real world orientation (in which case you can get your gravity vector by just reverse-rotating (0, 1, 0) by the reported orientation);
  • You have already got a decent gravity calculation in your software for other reasons (motion steering or lean bindings in input remappers, for example);
  • Or you can use a simple solution here.

With world space gyro, it's very important to have an accurate gravity vector. But player space gyro lets us get away with using a simpler, less accurate gravity calculation with no consequences, as long as it's within the bounds of our chosen buffer zone. If you read the implementation section (In Depth and in Code), you know JoyShockMapper uses a buffer of ±60°, so we could probably get away with a very simple gravity calculation there, too.

So in case you don't have a better option, here's a super simple sensor fusion solution for getting gravity from gyro and accelerometer inputs: the complementary filter. I think, anyway; I've heard the term thrown around for simple solutions like this one, though they're usually trying to convert the result to angles, which we don't care about. We just want the gravity direction.

Let's assume that, on average, the accelerometer input will be in the direction of gravity. We can continually rotate the previous gravity vector by the gyro input and then just nudge it towards the accelerometer vector (or the inverted accelerometer vector, since gravity's in the opposite direction of the acceleration). If those nudges are in proportion to the size of the difference, then that should prevent too much error accumulating. A small nudge towards the accelerometer input means any noise or shaking should be smoothed over. So let's do something like this:

Vec3 GravityVector;
 
Vec3 SensorFusionGravity(Vec3 gyro, Vec3 accel) {
   // convert gyro input to reverse rotation
   Quat rotation = AngleAxis(gyro.Length() * DeltaSeconds, -gyro.X, -gyro.Y, -gyro.Z);
 
   // rotate gravity vector
   GravityVector *= rotation;
 
   // nudge towards gravity according to current acceleration
   Vec3 newGravity = -accel;
   GravityVector += (newGravity - GravityVector) * 0.02;
   return GravityVector;
}

You may be surprised how nicely this behaves. There are fancier alternatives, some of which I only know of by name (such as the Kalman filter), but this simple complementary filter gives great results for our needs and is super simple to understand.
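In case it's useful, here's the same complementary filter written out as runnable Python, using Rodrigues' rotation formula in place of the quaternion (names are mine; gyro is assumed to be in radians per second, and the 0.02 nudge factor is the same magic number as above, tuned for something like 60+ updates per second):

```python
import math

def rotate(v, axis, angle):
    """Rotate vector v about a unit-length axis by angle radians
    (Rodrigues' rotation formula)."""
    ax, ay, az = axis
    vx, vy, vz = v
    c, s = math.cos(angle), math.sin(angle)
    dot = ax * vx + ay * vy + az * vz
    cross_x = ay * vz - az * vy
    cross_y = az * vx - ax * vz
    cross_z = ax * vy - ay * vx
    return (vx * c + cross_x * s + ax * dot * (1.0 - c),
            vy * c + cross_y * s + ay * dot * (1.0 - c),
            vz * c + cross_z * s + az * dot * (1.0 - c))

def sensor_fusion_gravity(gravity, gyro, accel, delta_seconds, nudge=0.02):
    """One step of the complementary filter: rotate the previous gravity
    estimate against the gyro, then nudge it towards the accelerometer."""
    speed = math.sqrt(gyro[0] ** 2 + gyro[1] ** 2 + gyro[2] ** 2)
    if speed > 0.0:
        # reverse rotation: gravity appears to rotate opposite to the controller
        axis = (-gyro[0] / speed, -gyro[1] / speed, -gyro[2] / speed)
        gravity = rotate(gravity, axis, speed * delta_seconds)
    # the accelerometer measures the reaction force holding the controller up,
    # so gravity points the opposite way
    new_gravity = (-accel[0], -accel[1], -accel[2])
    return tuple(g + (n - g) * nudge for g, n in zip(gravity, new_gravity))
```

Even starting from a completely wrong gravity estimate, holding the controller still lets the accelerometer nudges pull the estimate onto the true direction within a few seconds of samples.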

Edit (2021-07-26): I've published a tutorial explaining our simple sensor fusion example here and how to implement a more advanced solution, so check it out. Or if you want to just plug that advanced solution into your project, check out my open source C++ library GamepadMotionHelpers.

The Importance of Gyro as a Mouse

Motion sensors have a wide variety of applications. Using them to track the controller's orientation is an obvious one. Using them to sense shakes and gestures is another, though a lot of people avoid games like that altogether. But gyro as a mouse is, in my opinion, the most important utility of motion sensors.

Camera control has been largely standardised across first- and third-person shooters, RPGs, action games, and more. In games that require the player to aim accurately, players are given two options:

  1. Use a mouse, and get direct control over the camera that is consistent across virtually all games in similar genres;
  2. Or use a controller, and the game will bend over backwards to help you with your precise aiming.

It's not a big deal that the game helps the player aim. Or at least it wouldn't be if all players received the same help. But in recent years, "cross-play" has become more and more expected. PC and console players are playing together, and if the aim assist in these games was developed in isolation from PC players, it usually offers advantages over mouse aiming that can be exploited by savvy players.

Wouldn't it be great if every player had a mouse without having to put down their controller? No extra fingers required. No aim assist required. Just turning the controller to provide a mouse-like input to the game.

The only thing getting in the way of this being standard in all applicable games is the lack of developer knowledge. When games do offer gyro aiming, it's usually only on the Switch version (despite it being possible on every platform except Xbox). They also frequently provide disappointing options that don't let players reach their potential.

Great gyro control options are super simple. You just need to know what's helpful and what's not!

GyroWiki (you are here!) provides a variety of resources to help you make the most of your controller. Developers, check out:

  • Good Gyro Controls Part 1: The Gyro is a Mouse, which details the basic options that every game should provide, common gotchas to avoid, and advanced options for advanced users. It's a little old now, so it doesn't include any info about player space gyro, but everything there still applies, and is still missed by most games today. If that's too long a read for you, please at least check out this shorter summary of the basics on Gamasutra.
  • Good Gyro Controls Part 2: The Flick Stick, which details a new way to use the right stick once your precision aiming is handled by gyro. See it in action in the video below.

Player space gyro doesn't change the fact that good gyro controls are easy to implement. Although there's a lot of code on this page detailing alternatives, you saw that player space gyro itself is only a few simple lines of code. Isn't it great when simple solutions are also the best? So far, that looks to be the case here.

Players and developers on PC can enjoy all of the features described on this site in almost any game using JoyShockMapper, which converts inputs from standard PlayStation (4 & 5) and Switch controllers to mouse, keyboard, and controller input. It features local, world, or player space gyro, flick stick, natural sensitivity, smart smoothing and acceleration options, traditional stick controls, advanced button bindings, and more. You can also enjoy many of these features in other input remappers, like Steam, DS4Windows, and reWASD.

The gyro is a mouse. Every game that plays better with a mouse also plays better with gyro. Gyro controls are easier and more intuitive for new players to pick up, provide more room for mastery by skilled players, and go a long way to closing the gap between playing with a controller and playing with a mouse. Let's embrace the gyro as a mouse on every platform that'll let us. Let's change how games are played.