ASUS PG259QNR: a 360 Hz display with an integrated latency analyzer?

Offering its partners the first G-Sync IPS panels capable of reaching 360 Hz, NVIDIA had them add a latency analyzer. Although such products are not expected until next year, we were able to test a preview unit of ASUS's ROG Swift PG259QNR. With its GeForce RTX 30 series, NVIDIA announced several technological developments. One of them is Reflex, which also benefits older GPUs, back to the GTX 900 generation.

With esports booming, the manufacturer does not want to settle for maximizing raw performance or reducing the power consumption of its chips. It is also interested in an aspect that increasingly preoccupies its competitive gaming audience and that does not only concern graphics cards: latency.

For several years now, solutions have been implemented to reduce it as much as possible. But there were generally two problems: the measures taken were confined to the drivers, both at AMD and NVIDIA, with all the limitations that implies. Moreover, this delay is difficult to measure precisely.

It is these two aspects that the Reflex initiative intends to tackle. How? That is what we will detail here, with our first test of a display featuring NVIDIA's RLA (Reflex Latency Analyzer): the 360 Hz ASUS PG259QNR (IPS).

What is latency, and why bother?

To put it simply, in video games latency is most often described as the time that separates an action, such as the click of a mouse button, from its appearance on screen, such as a muzzle flash.

In reality, it is a little more complex than that, because there is not one latency but several. Each component has its own reaction time, and these add up: the mouse, the CPU's processing, the GPU, the organization of the render queue, the display. For online gaming, network latency is added on top, and it is far from negligible.
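As a back-of-the-envelope illustration (the figures below are made-up assumptions, not measurements from this review), end-to-end latency can be thought of as the sum of each stage's delay:

```python
# Illustrative sketch: system latency as the sum of per-stage delays.
# Every figure here is an arbitrary example, not a measured value.
stages_ms = {
    "mouse": 1.0,          # click detection and USB report
    "cpu": 4.0,            # game logic and draw-call submission
    "render_queue": 3.0,   # frames waiting before the GPU picks them up
    "gpu": 5.0,            # time to render the frame
    "display": 6.0,        # scan-out and panel response
}

total_ms = sum(stages_ms.values())
print(f"System latency: {total_ms:.1f} ms")  # network latency would add to this online
```

Shaving a millisecond or two off any single stage therefore only matters in proportion to the whole chain.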

Each manufacturer therefore aims to reduce this latency as much as possible at its own level. In recent years, mice and monitors alike have promised the moon. Why? Because, added up, these few milliseconds recovered here and there can become a significant competitive advantage.

A responsive player will thus see an enemy enter their field of view sooner and can decide to shoot earlier. Manufacturers lean heavily on such arguments to justify their investments in this area. NVIDIA Research also publishes rather interesting papers on the subject.

As always, though, be wary of how brand marketing takes over all of this, because reducing latency will not turn you into a professional gamer overnight. What makes professionals strong is above all constant training: acquiring the strategy and reflexes needed to make the right decisions, be in the right place at the right time, and hit the mark.

As in other sports, having the right equipment, in this case low-latency hardware and a high refresh rate, helps champions reach their full potential. An average gamer might see it as a crutch that helps them stand out. But if you are bad, it will not change anything (our tests confirm this).

NVIDIA Reflex: reducing latency beyond the driver

The first component of Reflex is an SDK made available to developers. It lets them integrate latency-reducing mechanisms into their engines or games. The user can then enable a corresponding option, which pays off especially in "GPU-bound" scenarios, where the CPU waits for the GPU.

It reduces the render queue as much as possible, arranging for the CPU to submit frames to the GPU just in time to be rendered, so that mouse actions are taken into account as late as possible. If necessary, the GPU's boost clocks are also raised to push frames to the screen sooner in "CPU-bound" scenarios.
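To make the principle concrete, here is a minimal sketch, in no way NVIDIA's actual SDK, of why a shorter render queue lowers input latency; the timings and the simulate() helper are purely hypothetical:

```python
# Conceptual sketch only: with a deep render queue, a click has to wait behind
# frames that were prepared before it happened; submitting just in time removes
# that backlog. Timings are arbitrary illustrative values.
def simulate(queue_depth, gpu_ms=5.0, cpu_ms=2.0):
    """Rough input-to-display delay for a click picked up by the next CPU frame."""
    backlog_ms = queue_depth * gpu_ms      # frames already queued must be drawn first
    return cpu_ms + backlog_ms + gpu_ms    # our frame's CPU work + backlog + its GPU render

print("queue depth 3:", simulate(queue_depth=3), "ms")   # conventional deep pipeline
print("queue depth 0:", simulate(queue_depth=0), "ms")   # just-in-time submission
```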

This is why you will usually see two levels of Reflex options in the settings of compatible games (On and On + Boost), allowing you to adapt to your setup. NVIDIA confirms that this approach is similar to what was already integrated into its drivers, but far more effective since it is controlled directly from the game engine.

The manufacturer has prepared the ground well enough that several titles are already compatible. This is notably the case for Apex Legends, Fortnite, Call of Duty: Modern Warfare and Warzone, as well as the Cold War beta and Valorant. Others are expected to follow in the coming weeks and months.

RLA: displays and mice to measure latency

Reducing latency is one thing, but you still need to be able to measure the gains. Until now, the required setups were quite expensive: a very high frame-rate camera, a mechanism to correlate the mouse click with what appears on screen, and so on.

Manufacturers have such resources, as do companies working in areas where latency is critical, such as cloud gaming. But that is not necessarily the case for reviewers, and even less so for players. NVIDIA has therefore worked on an improvement to its G-Sync displays that could provide a solution.

Thus, certain 360 Hz displays arriving on the market next year will integrate a feature called Reflex Latency Analyzer (RLA). Used with a compatible mouse (some are already on the market) that sends a signal with each click, they can display the overall latency, but also break it down to show the respective shares of the mouse, the rendering and the display in the total. But again, nothing is simple.

What to do, for example, when a mouse does not send the signal? NVIDIA has built a database referencing the latency of the 30 most popular models so they can be taken into account in its calculation. And which area of the image should be analyzed to detect the effect of the click? Games and displays both have a role to play here.
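As a rough sketch of how such a breakdown can be assembled (the figures, names and fallback logic below are our assumptions, not NVIDIA's published implementation): the monitor measures the click-to-pixel time, and when a mouse does not report its own latency, a typical value from a reference database is substituted.

```python
# Hypothetical illustration of the breakdown, not NVIDIA's implementation.
MOUSE_DB_MS = {"example mouse a": 1.0, "example mouse b": 4.0}  # made-up figures

def breakdown(click_to_pixel_ms, render_ms, mouse_model, reported_mouse_ms=None):
    # Use the mouse's own reported latency if available, else fall back to the database.
    mouse_ms = reported_mouse_ms if reported_mouse_ms is not None \
        else MOUSE_DB_MS.get(mouse_model, 0.0)
    display_ms = click_to_pixel_ms - render_ms   # what remains once PC rendering is subtracted
    return {"mouse": mouse_ms, "render": render_ms, "display": display_ms,
            "total": mouse_ms + click_to_pixel_ms}

print(breakdown(click_to_pixel_ms=20.0, render_ms=14.0, mouse_model="example mouse b"))
```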

ROG Swift PG259QNR: a first RLA model from ASUS

To help us better understand how RLA works, NVIDIA provided us with an ASUS ROG Swift PG259QNR. We will not go into its characteristics in detail here, since it shares everything with the PG259QN, including a matte 1080p IPS panel that can reach 360 Hz via G-Sync.

What is such a refresh rate for? Not much on the desktop. While the jump from 60 to 120 Hz brings a clear gain in fluidity, adding real comfort even for everyday desktop use, the difference beyond that exists but is harder to perceive. It is in games, especially competitive ones, that such a display really makes sense.

Indeed, it keeps the rendering of each image synchronized with its display on screen even at very high frame rates. It is common for such games to exceed 200 or 300 fps at 1080p with a high-end graphics card, and a display like this lets you take full advantage of it.

Is it necessary? That is for each person to judge according to their usage and needs. In any case, we can only recommend, as far as possible, using panels at 120 Hz or more with variable refresh rate.

The display also offers HDR, DisplayPort 1.4, HDMI 2.0, a USB 3.0 hub, a 3.5 mm jack, VESA 100 wall mounting, 178° viewing angles and a brightness of 400 cd/m² (1000:1 contrast). Rather robust in its design, it can pivot up to 90° (its stand swivels from -25° to 25°), tilt from -5° to 20° and be height-adjusted by up to 120 mm.

In short, this is a top-of-the-range model whose only real fault is the somewhat imposing size of its stand. We measured its power draw at the wall at between 25 and 40 watts depending on brightness. Dropping from 360 Hz to 60 Hz reduced the maximum value by 10 watts.

The classic model is priced at around 700 euros. ASUS confirmed to us that the QNR version will be more expensive, without being able to give a price for the moment; it is not expected before early 2021. For our tests we paired it with the manufacturer's Chakram Core mouse, a model it touts for its very low latency, along with refinements such as a programmable joystick, extensive configurability, up to 16,000 DPI, and so on.

So what does the QNR concretely add on the display side? First of all, a dedicated red port for connecting the mouse. Operating in passthrough, it captures the click signal that serves as the starting point for the latency calculation.

The OSD lets you define an analysis zone of whatever size you like, visible on screen or not, as you prefer. A change in the pixels within this zone is treated as the end signal of the latency measurement. Some games, such as Fortnite, offer an option that flashes a white rectangle on screen with each click, allowing sharper measurements by placing the analysis zone over it.
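The detection itself boils down to watching the chosen zone for a sudden change. Here is a minimal sketch of the principle, assuming grayscale pixel values for the zone and an arbitrary threshold; the real monitor does this in hardware on the configured region:

```python
# Minimal sketch of the detection principle, not the monitor's actual firmware.
def first_change(zone_frames, frame_time_ms, threshold=30):
    """Return the elapsed time at which the zone's average brightness first jumps."""
    baseline = sum(zone_frames[0]) / len(zone_frames[0])
    for index, frame in enumerate(zone_frames[1:], start=1):
        level = sum(frame) / len(frame)
        if level - baseline > threshold:      # e.g. the white flash appearing
            return index * frame_time_ms
    return None

# Made-up 4-pixel zone sampled at 360 Hz (about 2.8 ms per frame):
frames = [[10, 12, 11, 10], [11, 12, 10, 11], [240, 250, 245, 248]]
print(first_change(frames, frame_time_ms=1000 / 360))   # ~5.6 ms
```

At 360 Hz each refresh lasts about 2.8 ms, which also sets the granularity of such a measurement.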

The updated GeForce Experience performance overlay

The display can show in real time its refresh rate, constantly adjusted by G-Sync, as well as the system's total latency. In reality, it can only determine the time between the click signal being sent and the change in the detection zone. To break this result down, NVIDIA relies on the GeForce Experience performance overlay.

The overlay retrieves the latency figure calculated by the display and combines it with its own readings. Working with the graphics driver, it can differentiate the rendering latency, the display latency and the portion attributable to the mouse, all shown in the overlay in real time. An average of the values is also calculated.
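As a trivial sketch of that kind of aggregation (the window size and sample values below are arbitrary assumptions):

```python
# Minimal sketch of averaging latency readings over a sliding window.
from collections import deque

class LatencyAverage:
    def __init__(self, window=20):           # 20-sample window is an arbitrary choice
        self.samples = deque(maxlen=window)

    def add(self, latency_ms):
        self.samples.append(latency_ms)
        return sum(self.samples) / len(self.samples)

avg = LatencyAverage()
for sample in (27.3, 25.8, 26.4, 26.9):       # made-up system latency readings, in ms
    current = avg.add(sample)
print(f"average system latency: {current:.1f} ms")
```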

At the time of writing, this version of GeForce Experience (3.20.6.5) is not yet available. Note also that the latency view simply joins the other preset overlay layouts. We continue to hope that NVIDIA will eventually let users choose independently which values to display.

During our measurements, we observed significant gains: in Fortnite, for example, average system latency (display plus PC) dropped from 36.2 to 26.1 ms (DLSS at its middle quality setting, Ultra graphics, RTX enabled). We saw similar reductions in other titles on our test system equipped with a GeForce RTX 3080.

On a game like Apex Legends, which is not very demanding, you should not expect big gains unless you are using an entry-level or mid-range graphics card. In all cases, our readings hovered around 15 ms.

A very interesting technology, aimed at a limited audience

Ironically enough, the first people interested in such a technology will undoubtedly be reviewers. Having a way to analyze end-to-end latency in different scenarios is useful for testing certain components, but also cloud gaming services, for example.

We also expect competitive players to be very fond of RLA-certified products, which will let them tune their game settings to reduce latency as much as possible and make informed choices of accessories. For the general public, RLA, like a 360 Hz display itself, will be of more limited interest, especially given the extra cost of the whole setup.

But it can help a coherent, technical discussion emerge around high refresh-rate gaming and latency, topics that are too often overlooked. Enough to encourage you to finally move past 60 Hz, or to dig a little deeper into the options of your games and drivers? We hope so.

We can also imagine that NVIDIA's initiative will prompt a response from AMD, which could likewise seek to bolster its position on the latency front. We will find out soon enough.
