When it comes to action-filled video games, frame rates matter, and up until recently, traditional “frames per second” wisdom has landed at either 30 fps or 60 fps. Thirty, the rate seen in most standard TV broadcasts, is fine for slower cinematic games, while frantic battles and twitchy fights benefit from a higher rate, since it looks smoother and reduces button-tap latency.
This week, a surprising new number enters the conversation: 40 fps, a rate previously impractical thanks largely to TV refresh standards. It comes courtesy of a new patch to this month’s Ratchet & Clank: Rift Apart on PlayStation 5, which already includes a 60 fps “performance” option. So why would anyone pick 40 fps instead? And how does it work?
HDMI standards, menu picking, and math
Recent titles by Insomniac Games, particularly Marvel’s Spider-Man and the 2016 Ratchet & Clank remake, launched on PS4 with a 30 fps lock, meant to guarantee higher pixel counts and more detailed shadow and level-of-detail (LoD) settings. Both of those games eventually got PS5 versions with 60 fps support, since they could leverage the newer hardware’s power. As a native PS5 game, this month’s R&C:RA launched with both 30 and 60 fps modes on day one. Its menus asked what you preferred: more pixels and higher image quality, or more frames?
Tuesday’s update changes this choice slightly, albeit with a toggle that you might not notice if you skip the fine print. R&C:RA now has a “120 Hz Display Mode” toggle tucked into the graphics option menu. No, the game doesn’t run at 120 fps now (though we’ve certainly played 120 fps games on PC and console, and we’ve played higher frame rates than that). Instead, R&C:RA engages any compatible TV’s 120 Hz mode, which, at this point, becomes a matter of HDMI standards, menu picking, and math.
Both of last year’s biggest new consoles, the Xbox Series X and PlayStation 5, support 120 fps connections at 4K resolution, so long as your TV supports the HDMI 2.1 connection standard. Without HDMI 2.1 bandwidth at the ready, older TVs with 120 Hz support can only get up to 1080p resolution. I tested R&C:RA on LG’s 2020 OLED panel, the CX, and I got its 120 Hz mode working… once I changed another setting in the PS5’s root menus.
Getting 120 Hz, 4K, and HDR metadata into the same signal requires reducing the system’s HDR signal to 4:2:2 chroma subsampling. This reduces the total color information sent to your panel per frame, which is arguably more noticeable when picking through lines of text than when romping through high-speed action. I don’t notice a massive difference in PS5 image quality when changing this setting, but your mileage will vary.
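To see why that tradeoff exists, here’s some back-of-the-envelope math. This is a sketch of raw pixel-data rates only; real HDMI links add blanking intervals and encoding overhead, so actual bandwidth requirements run higher:

```python
def raw_video_gbps(width, height, fps, bit_depth, samples_per_pixel):
    """Uncompressed pixel-data rate in gigabits per second."""
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

# 4K at 120 Hz with 10-bit HDR color.
# 4:4:4 carries three full-resolution samples per pixel (luma + two chroma);
# 4:2:2 halves horizontal chroma resolution, so chroma averages out to one
# sample per pixel, for two samples per pixel total.
full = raw_video_gbps(3840, 2160, 120, 10, 3)  # 4:4:4
sub = raw_video_gbps(3840, 2160, 120, 10, 2)   # 4:2:2

print(f"4:4:4 -> {full:.1f} Gbps, 4:2:2 -> {sub:.1f} Gbps")
# → 4:4:4 -> 29.9 Gbps, 4:2:2 -> 19.9 Gbps
```

That roughly 10 Gbps savings is the kind of headroom that lets a console squeeze 4K, 120 Hz, and HDR down the same cable.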
Why “55 fps” isn’t typically a thing
Once you’ve taken all of these steps, R&C:RA can deliver video to an HDMI 2.1-rated panel with any frame rate that divides evenly into 120. This opens up more fixed frame-rate possibilities, and in traditional video games, that matters. Tons of games have launched with apparent 60 fps refresh rates only to stutter when a console or computer can’t keep up with the game’s rendering burden. This can result in visual artifacts like torn frames (where only half of the screen changes between frames of animation) or bad frame pacing (where the refresh is inconsistent).
In other words, a traditional 60 Hz monitor can’t actually run gameplay at, say, 55 fps. Instead, it will rapidly alternate between 30 fps and 60 fps to reach that 55 fps average in a way that can feel disjointed for anyone controlling the on-screen action.
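The arithmetic behind that even-division rule is simple: on a fixed-refresh panel, a steady frame rate must divide the refresh rate so that each rendered frame is held on screen for a whole number of refresh cycles. A quick sketch of which rates qualify:

```python
def steady_rates(refresh_hz):
    """Frame rates that hold each frame for a whole number of refreshes."""
    return [fps for fps in range(1, refresh_hz + 1) if refresh_hz % fps == 0]

# On a 60 Hz panel, 40 fps is impossible: 60 / 40 = 1.5 refreshes per frame.
print(steady_rates(60))
# → [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]

# On a 120 Hz panel, 40 fps works: each frame is shown for exactly 3 refreshes.
print(steady_rates(120))
# → [1, 2, 3, 4, 5, 6, 8, 10, 12, 15, 20, 24, 30, 40, 60, 120]
```

Note that 55 divides neither 60 nor 120, which is why no fixed-refresh panel can display it evenly.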
Which brings us to R&C:RA’s 40 fps mode. Previous video analysis of the game showed that its 30 fps “quality” mode didn’t break a sweat in sticking to a consistent 30 fps refresh. Teams like Digital Foundry wondered how much more the game could be pushed while otherwise remaining locked to a comfortable, consistent frame rate. Now, that 30 fps mode—and every single bell and whistle relating to its resolution, image quality, and ray-tracing fidelity—runs 33 percent faster.
Great news, even if it’s not VRR
While I haven’t run today’s new 40 fps mode through comprehensive video analysis—I don’t have gear suited for 120 fps analysis—I did tear through an hour of the game’s campaign at this setting, and I can report that it runs smoothly on my 120 Hz panel. If there’s any frame-rate hitching, the panel may simply drop from 40 fps to 30 fps in a single-frame stutter, which I imagine is a lot less noticeable than drops between 60 and 30 or 30 and 15.
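That “less noticeable” hunch comes down to frame-time deltas. When a frame misses its slot, the previous frame stays on screen for extra refresh cycles, and the length of one refresh sets the size of the hiccup. A sketch of the math, assuming a perfectly regular refresh clock:

```python
def hold_ms(refresh_hz, refreshes_held):
    """Time a frame stays on screen when held for N refresh cycles."""
    return refreshes_held * 1000 / refresh_hz

# 40 fps on a 120 Hz panel holds each frame for 3 refreshes; a one-refresh
# hitch stretches it to 4, i.e. a momentary 30 fps:
print(f"{hold_ms(120, 3):.1f} ms -> {hold_ms(120, 4):.1f} ms "
      f"(+{hold_ms(120, 1):.1f} ms)")
# → 25.0 ms -> 33.3 ms (+8.3 ms)

# 60 fps on a 60 Hz panel holds for 1 refresh; the same hitch doubles the
# frame time, a momentary 30 fps with twice the jolt:
print(f"{hold_ms(60, 1):.1f} ms -> {hold_ms(60, 2):.1f} ms "
      f"(+{hold_ms(60, 1):.1f} ms)")
# → 16.7 ms -> 33.3 ms (+16.7 ms)
```

An 8.3 ms hiccup versus a 16.7 ms one: that’s why a 40-to-30 stumble on a 120 Hz panel should read as roughly half as jarring as a 60-to-30 stumble on a 60 Hz one.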
As an uncommon gaming frame rate, 40 fps is interesting in action. There’s a certain sheen to how the game runs at its native near-4K resolution, with all ray-tracing effects maxed out, that feels smoother yet still cinematic at this jump above the standard 30 fps rate. Before this patch went live, I was happy to trade details for extra frames to get to 60 fps. But now, with extra detail unlocked at a smoother combat frame rate, I think I can make 40 fps work.
If you’re the kind of home-theater savant who knows what chroma subsampling is and bristles at a downgrade from 4:4:4 to 4:2:2, 40 fps may not be for you. And this is not the same as variable refresh rate (VRR), a standard that Xbox Series X/S consoles support. HDMI 2.1-rated displays include VRR support, which allows computers and consoles to draw frames on your screen as soon as they’re ready, instead of dividing into a “base” frame rate (so long as the frame-rate minimum exceeds approximately 40-45 fps). But PS5 is not yet compatible with the standard for some reason, and Sony hasn’t indicated that it’ll remedy that issue any time soon—which particularly stinks for something like R&C:RA, whose “performance” modes could very well stretch into frame rates in the 70s and 80s without having to divide evenly into a number like 120.
Until then, 40 fps will have to do, and at least in R&C:RA’s case, it’s a damned good compromise.