NVIDIA G-Sync technology overview and ASUS ROG SWIFT PG278Q monitor review

In the good old days, when PC owners were still staring at huge CRT monitors and earning themselves astigmatism, there was no talk of image smoothness. The technology of that time did not really support 3D either, so users simply had to be content with what they had. But time goes on, technology develops, and many are no longer willing to put up with frame tearing in a dynamic game. This is especially true for so-called cyber athletes, for whom a split second decides everything. What can be done?

Progress does not stand still, and what once seemed impossible can now be taken for granted. The same goes for image quality on a computer. Manufacturers of video cards and other PC components are now working hard on the problem of poor image output on monitors, and they have already come quite far. Just a little more, and the picture on the monitor will be perfect. But that is all a lyrical digression; let's get back to our main topic.

A bit of history

Monitor manufacturers have long tried to overcome tearing and improve the image. What didn't they come up with: they raised the monitor's refresh rate, turned on V-Sync. Nothing helped. And then NVIDIA, the well-known graphics card manufacturer, presented its G-Sync technology, which promises "unreally" smooth images without any artifacts. That sounds good, but there is one small yet very serious "but": to use this option, you need a monitor with G-Sync support. Monitor manufacturers had to strain themselves and "throw" a couple of dozen such models onto the market. What next? Let's take a look at the technology and try to figure out whether it is really that good.

What is G-Sync?

G-Sync is NVIDIA's adaptive refresh technology for displays. Its hallmark is smooth frame delivery without any artifacts: no image tearing, no stutters. For this technology to work properly, a fairly powerful computer is required, since processing the digital signal takes considerable horsepower. That is why only newer NVIDIA video card models support the technology. In addition, G-Sync is a proprietary NVIDIA feature, so owners of video cards from other manufacturers are out of luck.

A G-Sync monitor is also required. The point is that such monitors are equipped with a board that converts the digital signal. Owners of conventional monitors will not be able to take advantage of this exciting option. It is unfair, of course, but such is the policy of modern manufacturers: to siphon as much money as possible from the poor user. If your PC is configured to use G-Sync and your monitor miraculously supports it, then you may well appreciate all the delights of this technology.

How G-Sync works

Let's try to explain how G-Sync works in simple terms. An ordinary GPU (video card) simply sends a digital signal to the monitor without taking the monitor's refresh rate into account. That is why the picture on the screen ends up "ragged": frames arrive from the GPU whenever they are ready, the monitor redraws on its own fixed schedule, and the result looks clumsy, even with the V-Sync option enabled.

When G-Sync is used, the GPU itself controls the refresh of the monitor, so frames reach the matrix exactly when they are needed. Thanks to this, image tearing can be avoided and the overall smoothness of the picture improves. Since conventional monitors do not let the GPU control them, a G-Sync monitor was created with an NVIDIA board inside that regulates the refresh rate. That is why ordinary monitors cannot be used.
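
To make the difference concrete, below is a minimal Python sketch (an illustration of the idea only, not NVIDIA's implementation) that compares when hypothetical frames reach the screen with a fixed 60 Hz refresh versus a refresh-on-demand scheme; the per-frame render times are made up for the example.

```python
import math

FIXED_REFRESH_HZ = 60.0
render_times_ms = [14, 22, 18, 30, 16, 25]   # hypothetical per-frame GPU render times

def fixed_refresh_display(render_times_ms, hz=FIXED_REFRESH_HZ):
    """Frames wait for the next refresh tick; at most one frame per tick (double buffering)."""
    tick = 1000.0 / hz
    t_ready, last_tick, shown_at = 0.0, 0.0, []
    for rt in render_times_ms:
        t_ready += rt
        next_tick = tick * math.ceil(t_ready / tick)   # first tick at/after readiness
        next_tick = max(next_tick, last_tick + tick)   # only one frame per refresh
        shown_at.append(next_tick)
        last_tick = next_tick
    return shown_at

def on_demand_display(render_times_ms, max_hz=144.0):
    """A refresh starts as soon as a frame is ready, limited only by the panel's maximum rate."""
    min_interval = 1000.0 / max_hz
    t_ready, last_shown, shown_at = 0.0, -min_interval, []
    for rt in render_times_ms:
        t_ready += rt
        show = max(t_ready, last_shown + min_interval)
        shown_at.append(show)
        last_shown = show
    return shown_at

print("fixed 60 Hz :", [round(t, 1) for t in fixed_refresh_display(render_times_ms)])
print("on demand   :", [round(t, 1) for t in on_demand_display(render_times_ms)])
```

With the fixed schedule, frames queue up for the next tick and some are held back; with refresh on demand, every frame is shown as soon as it is ready, limited only by the panel's maximum rate.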

Monitors supporting this technology

Gone are the days when users killed their eyesight by staring at ancient CRT monitors for hours. The current models are elegant and harmless. So why not add some new technology to them? The first monitor with NVIDIA G-Sync support and 4K resolution was released by Acer. The novelty made a splash.

So far, high-quality monitors with G-Sync are fairly rare, but manufacturers plan to make such devices the standard. Most likely, in five years monitors supporting this technology will become a standard solution even for an office PC. In the meantime, all that remains is to look at these new products and wait for their wide distribution - that is when they will get cheaper.

After that, everyone and their brother started churning out monitors with G-Sync support. Even budget models with this technology have appeared. But what use is this technology on a budget screen with a poor matrix? Be that as it may, such models do exist. The best option is a monitor on which G-Sync can work at full strength.

Best monitors with G-Sync

Monitors with G-Sync technology stand out in a special line of devices. They must have the characteristics required for the full operation of this option. It is clear that not all screens will cope with this task. Several leaders in the production of such monitors have already been identified. Their models turned out to be very successful.

For example, one G-Sync monitor stands out as among the brightest representatives of this line. This device belongs to the premium class. Why? Judge for yourself: a 34-inch screen diagonal, 4K-class resolution, 1000:1 contrast, 100 Hz refresh rate and a matrix response time of 5 ms. Many would like to get their hands on this "monster". It is clear that it will handle G-Sync technology with flying colors. It has no analogues yet; you can safely call it the best in its class and not be mistaken.

In general, ASUS G-Sync monitors now sit at the top of Olympus. No other manufacturer has yet managed to surpass this company, and it is unlikely anyone ever will. ASUS can be called a pioneer in this regard, and its G-Sync-enabled monitors sell like hot cakes.

The future of G-Sync

NVIDIA is now actively trying to bring G-Sync technology to laptops, and some manufacturers have even released a couple of such models. Moreover, these can work without a G-Sync board in the display, which is understandable: a laptop has somewhat different design features, and a video card with support for the technology is quite enough there.

NVIDIA G-Sync will probably occupy a significant place in the computer industry before long. Monitors with this technology should get cheaper, and eventually the option should become widely available - otherwise, what is the point of developing it? In any case, so far things are not so rosy: there are still some problems with the implementation of G-Sync.

In the future, G-Sync technology may become as everyday a thing as the VGA port for connecting a monitor once was for us. All sorts of "vertical synchronization" already look like a blatant anachronism against its background: not only can these outdated techniques not provide satisfactory picture quality, they also eat up a considerable amount of system resources. With the advent of G-Sync, their place is definitely in the dustbin of history.

Testing methodology

The ASUS ROG SWIFT PG278Q monitor has been tested using our new methodology. We decided to ditch the slow and sometimes inaccurate Spyder4 Elite in favor of the faster and more accurate X-Rite i1Display Pro colorimeter. This colorimeter will now be used to measure the main display parameters, in conjunction with the latest version of the Argyll CMS software package. All operations are carried out in Windows 8. During testing, the screen refresh rate is 60 Hz.

In accordance with the new methodology, we will measure the following monitor parameters:

  • White brightness at backlight power from 0 to 100% in 10% steps;
  • Black brightness at backlight power from 0 to 100% in 10% steps;
  • Display contrast at backlight power from 0 to 100% in 10% steps;
  • Color gamut;
  • Color temperature;
  • Gamma curves of the three primary RGB colors;
  • Gray gamma curve;
  • Delta E (according to the CIEDE2000 standard).

For calibration and Delta E analysis we use DispcalGUI, a graphical front-end for Argyll CMS, in the latest version available at the time of writing. All of the measurements described above are taken before and after calibration. During our tests we measure the main monitor profiles - default, sRGB (if available) and Adobe RGB (if available). Calibration is carried out in the default profile, except for special cases that will be discussed separately. For monitors with a wide color gamut we select hardware sRGB emulation, if available. In this case colors are converted according to the monitor's internal LUTs (which can have a bit depth of up to 14 bits per channel) and output to a 10-bit matrix, whereas an attempt to narrow the gamut to the sRGB borders using the OS color correction tools would reduce color coding accuracy. Before all tests begin, the monitor warms up for an hour, and all of its settings are reset to factory defaults.
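
As a side note, the Delta E metric itself is easy to reproduce. Below is a hedged sketch of the CIEDE2000 calculation in Python, assuming the third-party colormath package and made-up Lab values for a few patches; the actual measurements in this review come from the Argyll CMS / DispcalGUI toolchain, so this only illustrates how the average and maximum figures quoted later are aggregated.

```python
from colormath.color_objects import LabColor
from colormath.color_diff import delta_e_cie2000

# Hypothetical reference vs. measured patches in CIE Lab coordinates.
patches = [
    (LabColor(50.0, 2.7, -27.0),  LabColor(50.4, 3.1, -26.2)),
    (LabColor(96.5, -0.5, 1.2),   LabColor(95.8, 0.3, 2.0)),
    (LabColor(30.0, 22.0, -46.0), LabColor(31.2, 20.5, -44.1)),
]

errors = [delta_e_cie2000(ref, meas) for ref, meas in patches]
print("average Delta E:", round(sum(errors) / len(errors), 2))
print("maximum Delta E:", round(max(errors), 2))
```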

We will also continue our old practice of publishing calibration profiles for the monitors we test at the end of the article. At the same time, the 3DNews test lab warns that such a profile cannot correct 100% of the shortcomings of your particular monitor. The fact is that all monitors (even within the same model) inevitably differ from one another in their small color errors; it is physically impossible to make two identical matrices, they are too complex. Therefore, any serious monitor calibration requires a colorimeter or a spectrophotometer. Still, even a "universal" profile created for a specific unit can generally improve the situation on other devices of the same model, especially in the case of cheap displays with pronounced color defects.

Viewing angles, backlight uniformity

The first thing that interested us in the ASUS PG278Q was the viewing angles, because the monitor uses a TN matrix, and TN's biggest problems have always been associated with them. Fortunately, it turned out to be not so bad. Of course, IPS matrices have wider viewing angles, but we did not have to rotate the ASUS PG278Q often to eliminate distortions in contrast and color rendition.

The developers of the ASUS PG278Q could not, however, avoid problems with the screen backlighting. The monitor has slight light bleed in all four corners and at the top. If a game is running on the display, the bleed is hard to notice, but if you play a movie in a dark room (with the usual horizontal black bars above and below the picture), the defect becomes apparent immediately.

Testing without calibration

The maximum brightness of the ASUS PG278Q was 404 cd/m² - even more than the manufacturer promises. Such a high value is justified by 3D support, because when active shutter glasses are used, the perceived brightness of the monitor can drop by half. The maximum brightness of the black field was 0.40 cd/m², which is also quite good. As a result, the static contrast ratio (white luminance divided by black luminance, here 404 / 0.40 ≈ 1010:1) hovers around 1000:1 across the entire backlight range. An excellent result - such high contrast is typical of high-quality IPS matrices. MVA panels, however, remain out of reach.

Our subject's color gamut is as good as it needs to be: the sRGB color space is covered by 107.1%, and the white point sits near the D65 reference.

As far as games are concerned, the color palette of the ASUS PG278Q is in perfect order, but professional photo processing may run into problems with slightly oversaturated colors, since the gamut is wider than sRGB. However, the display in question is designed for games, so you should not pay much attention to this drawback.

The color temperature of ASUS PG278Q during the measurements was kept at the level of 6000 K, which is 500 K below the norm. This means that light colors may show a slight warm tint.

Only the red gamma curve turned out to be close to the reference, while the blue and green curves sagged, although they at least tried to stay close to each other. At the same time, the monitor's gray gamma curve is doing pretty well: in the dark tones it practically does not deviate from the reference, and in the light tones it departs from it, but not by much.

The average value of the Delta E color accuracy index was 2.08 units, and the maximum value was 7.07 units. The results, of course, are not the best, but, firstly, the ASUS PG278Q is still intended for games, not for photo processing, and secondly, for a TN matrix, the results we obtained are quite satisfactory.

Post-calibration testing

Usually the white brightness drops noticeably after calibration - by 10% or more, even for fairly high-quality panels. In the case of the ASUS PG278Q it dropped by only about 3%, to 391 cd/m². Calibration did not affect the brightness of the black field, so the static contrast ratio dropped to 970:1.

Calibration had practically no effect on the color gamut, but the white point returned to its proper place, although it only had to move slightly.

After calibration, the color temperature rose a little, but did not reach the reference one. Now the gap between the measured and the reference value was about 100-200 K instead of 500 K, which, however, is quite tolerable.

The position of the three main gamma curves, unfortunately, hardly changed after calibration, while the gray gamma looked a little better.

Calibration had the best effect on color accuracy. The average value of Delta E dropped to 0.36 units, the maximum - to 1.26 units. Excellent results for any matrix, and for TN + Film - just fantastic.

G-Sync Testing: Technique

NVIDIA's G-Sync guide recommends settings for testing in several games such that the frame rate hovers between 40 and 60 FPS. It is under these conditions, at a 60 Hz refresh rate, that stutters ("freezes") with V-Sync enabled are most frequent. We'll start by comparing three use cases - with V-Sync, without it, and with G-Sync - all at 60 Hz.

But remember that raising the refresh rate from 60 to 120/144 Hz by itself makes tearing less noticeable without vertical synchronization, and with V-Sync it reduces the "freezes" from 13 to 8/7 ms, respectively. Is there any real benefit to G-Sync over 144 Hz V-Sync? We'll check that too.

I would like to emphasize that, if you believe the description, the refresh rate setting loses its meaning with G-Sync. It is therefore not entirely correct to say that we, for example, compared V-Sync and G-Sync at 60 Hz. V-Sync was indeed at 60 Hz, but G-Sync means refreshing the screen on demand rather than periodically. Even with G-Sync enabled, however, we can still select a refresh rate in the driver control panel, and FRAPS shows that with G-Sync active exactly that ceiling on the frame rate is in effect, as if V-Sync were working. It turns out that this setting defines the minimum lifetime of a frame and, accordingly, the minimum screen refresh interval. Roughly speaking, it sets the frequency range in which the monitor operates - from 30 Hz up to 60-144 Hz.
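
A rough model of that behaviour is sketched below in Python; this is our interpretation of the description rather than NVIDIA's documented algorithm. The refresh-rate setting acts as a ceiling on how often the panel may redraw, while the panel's roughly 30 Hz floor limits how long a frame may stay on screen.

```python
def display_interval_ms(frame_time_ms, ceiling_hz=144, floor_hz=30):
    """Clamp the GPU's frame time into the panel's allowed refresh interval range."""
    min_interval = 1000.0 / ceiling_hz   # fastest the panel may refresh
    max_interval = 1000.0 / floor_hz     # longest a frame may stay on screen
    return min(max(frame_time_ms, min_interval), max_interval)

for fps in (200, 144, 90, 60, 40, 25):
    ft = 1000.0 / fps
    shown = display_interval_ms(ft)
    print(f"{fps:>3} FPS rendered -> refresh interval {shown:5.1f} ms "
          f"(~{1000.0 / shown:5.1f} Hz on screen)")
```

The clamp at the lower end is a simplification: in practice the module re-displays the previous frame instead, as discussed later.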

To enable G-Sync, you need to open the NVIDIA control panel, find the corresponding link on the left side of the window and tick the only checkbox there. The technology is supported in drivers for Windows 7 and 8.

Then you need to make sure G-Sync is also enabled in the 3D Settings section - it can be found in the Vertical Sync submenu.

That's all: the G-Sync function is now enabled for all games running in full-screen mode - the feature cannot yet work in windowed mode. For testing, we used a bench with a GeForce GTX TITAN Black graphics card.

The tests were carried out in Assassin's Creed: Black Flag and in Counter-Strike: Global Offensive. We tested the new technology in two ways: we simply played, and then we hunted for tearing using a script that smoothly panned the game camera, i.e. "moved the mouse" horizontally. The first method let us evaluate the feel of G-Sync "in battle", and the second made it easier to see the difference between V-Sync on/off and G-Sync.
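
The exact script used for the article is not published, so below is only a minimal sketch of that kind of camera-pan script, assuming the pyautogui package; the step size, number of passes and pan duration are made up.

```python
import pyautogui

pyautogui.FAILSAFE = True   # abort by slamming the cursor into a screen corner

def pan_camera(step_px=400, passes=10, duration_s=1.0):
    """Smoothly sweep the mouse left and right to force horizontal camera motion."""
    for i in range(passes):
        direction = 1 if i % 2 == 0 else -1
        pyautogui.moveRel(direction * step_px, 0, duration=duration_s)

if __name__ == "__main__":
    pan_camera()
```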

G-Sync in Assassin's Creed: Black Flag, 60 Hz

With V-Sync and G-Sync disabled at 60 Hz, tears were clearly visible with almost any camera movement.

The gap is visible in the upper right part of the frame, near the mast of the ship.

When V-Sync was enabled, tearing disappeared, but freezes appeared, which did not benefit the gameplay.

The doubled ship's mast in the photo is one of the signs of a "freeze"

After enabling G-Sync tears and freezes disappeared completely, the game began to run smoother. Of course, the periodic decrease in the frame rate to 35-40 FPS was noticeable, but thanks to the synchronization of the display and the video card, it did not cause such noticeable brakes as with vertical synchronization.

However, as they say, seeing once is better than hearing a hundred times, so we made a short video showing the new Assassin's Creed with V-Sync on and off, as well as with G-Sync. Of course, the video cannot fully convey the "live" impression, if only because it was shot at 30 frames per second. In addition, the camera "sees" the world differently from the human eye, so artifacts that are invisible in person, such as ghosting, can show up in the video. Nevertheless, we tried to make this video as illustrative as possible: at the very least, the presence or absence of tearing in it is quite noticeable.

Now let's launch Assassin's Creed: Black Flag at minimum settings and see what changes. The frame rate in this mode did not exceed 60 FPS - the set refresh rate. With V-Sync turned off, tears were noticeable on the screen, but as soon as V-Sync was turned on, the tearing disappeared and the picture began to look almost the same as with G-Sync.

At maximum graphics settings the frame rate began to fluctuate around 25-35 FPS. Naturally, tearing immediately returned without V-Sync, and "freezes" with it. Even enabling G-Sync could not fix this situation - at such a low frame rate, the GPU itself becomes the source of stutter.

G-Sync in Assassin's Creed: Black Flag, 144 Hz

With V-Sync and G-Sync disabled, tears could still be found on the screen, but thanks to the 144 Hz refresh rate there were far fewer of them than before. With V-Sync turned on, the tearing disappeared, but "freezes" began to appear more often - almost as with a 60 Hz refresh rate.

Turning on G-Sync, as before, corrected the situation, but the strongest improvement in the picture was noticeable only at high frame rates - from 60 FPS and up. Without lowering the settings or adding a second video card of the GeForce GTX Titan Black level, such a high frame rate could not be achieved.

G-Sync in Counter-Strike: Global Offensive, 60 and 144 Hz

In network games, the gameplay and image quality are affected not only by the video card and monitor, but also by the ping - the higher it is, the greater the delay in the "response" of the game. During our tests, the ping was at the level of 25-50 ms, and the frame rate during the test hovered around 200 FPS.

Image settings used in Counter-Strike: Global Offensive

Without G-Sync and V-Sync there was tearing in CS, just as in Assassin's Creed. With V-Sync enabled at 60 Hz it became harder to play: the frame rate dropped to 60 FPS, and the game character began to move unevenly due to the large number of "freezes".

With G-Sync enabled the frame rate remained at 60 frames per second, but there were far fewer "freezes". It is not that they disappeared altogether, but they stopped spoiling the impression of the game.

Now let's increase the screen refresh rate and see what changes. With G-Sync and V-Sync off at 144 Hz, tearing became much less frequent than at 60 Hz, but it did not disappear completely. With V-Sync turned on all the tears disappeared, and the "freezes" became almost invisible: playing in this mode is very comfortable, and movement speed does not suffer. Turning on G-Sync turned the picture into pure eye candy: the gameplay became so smooth that even a 25 ms ping started to have a noticeable effect on it.

Testing the ULMB mode

Ultra Low Motion Blur is enabled from the monitor menu, but you must first disable G-Sync and set the screen refresh rate to 85, 100 or 120 Hz. Lower or higher frequencies are not supported.

The practical application of this "trick" is obvious: text on websites is less blurred when scrolling, and in RTS and other strategy games moving units look more detailed.

ASUS ROG SWIFT PG278Q in 3D

The ASUS ROG SWIFT PG278Q is the world's first monitor capable of displaying stereoscopic images at 2560x1440 over DisplayPort 1.2, which is in itself quite an achievement. Unfortunately, the monitor does not have a built-in IR transmitter, so we took the transmitter from an NVIDIA 3D Vision kit and the glasses from a 3D Vision 2 kit. This combination worked without problems, and we were able to test stereoscopic 3D properly.

We did not find any ghosting or other artifacts typical of pseudo-3D video. Of course, sometimes in games some objects ended up at the wrong depth, but this cannot be blamed on the monitor. The ASUS PG278Q is fine for watching stereo movies and playing 3D games; the main thing is that the video card can keep up.

Conclusions

Without wishing to diminish NVIDIA's achievements, it should be noted that, at its core, G-Sync is an innovation that boils down to getting rid of a long-standing and harmful atavism: the regular refreshing of LCD panels that do not actually need it. It turned out that all it took was a small change to the DisplayPort protocol, which promptly made its way into the 1.2a specification and, according to AMD's promises, will very soon find its way into display controllers from many manufacturers.

So far, however, only the proprietary version of this solution is available in the form of G-Sync, which we had the pleasure of testing in the ASUS ROG SWIFT PG278Q monitor. The irony is that this is exactly the kind of monitor on which the benefits of G-Sync are not very noticeable. Refreshing the screen at 144 Hz by itself reduces the number of the notorious tears so much that many will be ready to turn a blind eye to the problem. And with vertical sync we get less pronounced "freezes" and input lag than on 60 Hz screens. In such a situation G-Sync can only bring the smoothness of the game to the ideal.

Still, synchronizing the screen refresh with GPU frame rendering is a more graceful and economical solution than constant refreshing at an ultra-high frequency. Also, let's not forget that the use of G-Sync is not limited to 120/144 Hz matrices. First of all, 4K monitors come to mind, which are still limited to 60 Hz by their matrix specifications and the bandwidth of video inputs. Then there are IPS monitors, likewise unable to switch to 120/144 Hz due to the limitations of the technology itself.

At 60Hz refresh rate, the effect of G-Sync cannot be overstated. If the frame rate is consistently above 60 FPS, then simple V-sync eliminates tearing just as well, but only G-Sync can keep the frame smooth when the frame rate drops below the refresh rate. In addition, thanks to G-Sync, the performance range of 30-60 FPS becomes much more playable, which either reduces the GPU performance requirements or allows for more aggressive quality settings. And again the thought comes back to 4K monitors, which require extremely powerful hardware to play with good graphics.

It is also commendable that NVIDIA has adopted the pulsed backlight technology to remove motion blur (ULMB), which we saw earlier with the EIZO Foris FG2421. It's a shame that it cannot work simultaneously with G-Sync yet.

The ASUS ROG SWIFT PG278Q monitor itself is good primarily for its combination of 2560x1440 resolution and a 144 Hz refresh rate. There were no devices with such parameters on the market before, and gaming monitors with such low response times and stereoscopic 3D support have long been due to grow out of Full HD. You should not hold the TN matrix against the PG278Q: this is a really good specimen with top-notch brightness, contrast and excellent color reproduction which, after calibration, would be the envy of IPS displays. Only the limited viewing angles give the technology away. Nor will we withhold praise for the design, which befits a product of this quality. The ASUS ROG SWIFT PG278Q earns a well-deserved Editors' Choice award - it turned out that good.

We can recommend this gaming monitor; only the price of around 30 thousand rubles gets in the way of buying it without hesitation. In addition, at the time of writing, the ASUS ROG SWIFT PG278Q is still not on sale in Russia, so there is nowhere to see it, or G-Sync, with your own eyes. But we hope that ASUS and NVIDIA will fix this in the future, for example by showing G-Sync at gaming exhibitions. And the price will probably come down some day...

You can download the color profile for this monitor, obtained after calibration, from our file server.

The editors of the site are grateful to the "Grafitech" company for providing the X-Rite i1Display Pro colorimeter.

G-Sync Technology Overview | A brief history of fixed refresh rates

Long ago, monitors were bulky and contained cathode ray tubes and electron guns. The electron guns bombard the screen with electrons to light up the colored phosphor dots that we call pixels. They draw each "scan" line from left to right, from top to bottom. Adjusting the speed of the electron gun from one full refresh to the next was not very practical, and there was no particular need for it before the advent of 3D games. Therefore, CRTs and the related analog video standards were designed with a fixed refresh rate.

LCD monitors gradually replaced CRTs, and digital connectors (DVI, HDMI and DisplayPort) replaced analog ones (VGA). But the video standardization bodies (led by VESA) never moved away from fixed refresh rates. Movies and television still rely on an input signal with a constant frame rate. Once again, switching to a variable refresh rate did not seem necessary.

Adjustable frame rates and fixed refresh rates do not match

Prior to the advent of modern 3D graphics, fixed refresh rates were not an issue for displays. But it arose when we first encountered powerful GPUs: the frequency at which the GPU rendered individual frames (what we call the frame rate, usually expressed in FPS or frames per second) is not constant. It changes over time. In heavy graphics scenes, the card can provide 30 FPS, and when looking at an empty sky - 60 FPS.


Disabling sync leads to tearing

It turns out that the variable frame rate of a GPU and the fixed refresh rate of an LCD panel do not work very well together. In this configuration we encounter a graphical artifact called tearing. It appears when two or more incomplete frames are rendered within the same monitor refresh cycle. They are usually offset from each other, which produces a very unpleasant effect in motion.
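
Where exactly the tear appears is simple to reason about. The sketch below (illustrative arithmetic only, with a made-up swap time and panel height, and ignoring the blanking interval) estimates the row at which a mid-scanout buffer swap becomes visible.

```python
def tear_line_row(swap_time_ms, refresh_hz=60, vertical_res=1080):
    """Everything below the scanout position at the moment of the swap comes from the newer frame."""
    period = 1000.0 / refresh_hz
    progress = (swap_time_ms % period) / period   # fraction of the scan completed
    return int(progress * vertical_res)

# A swap 10 ms into a 16.7 ms scan at 60 Hz lands roughly 60% of the way down.
print(tear_line_row(10.0))          # ~647 on a 1080-line panel
print(tear_line_row(10.0, 120))     # the same swap at 120 Hz falls on a different row
```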

The image above shows two well-known artifacts that are common but difficult to capture. Since these are display artifacts, you won't see this in regular game screenshots, however our shots show what you actually see while playing. To shoot them, you need a camera with a high-speed shooting mode. Or if you have a card with video capture support, you can record an uncompressed video stream from the DVI port and clearly see the transition from one frame to another; this is the method we use for the FCAT tests. However, it is best to observe the described effect with your own eyes.

The tearing effect is visible in both images. The top one was taken with a camera, the bottom one with the video capture function. The bottom picture is "cut" horizontally and looks displaced. In the top two photos, the left one was taken on a 60 Hz Sharp display and the right one on a 120 Hz Asus display. The tear on the 120 Hz display is not as pronounced, since the refresh rate is twice as high, but the effect is visible and manifests itself in the same way as in the left image. An artifact of this type is a clear sign that the images were taken with V-sync turned off.


Battlefield 4 on GeForce GTX 770 with V-sync disabled

The second effect, seen in the BioShock: Infinite images, is called ghosting. It is especially visible at the bottom of the left image and is related to the screen refresh delay. In short, individual pixels do not change color fast enough, which leads to this kind of smearing. A single shot cannot convey the effect ghosting has on the game itself. A panel with an 8 ms gray-to-gray response time, such as the Sharp, will produce a blurred image with any movement on the screen. This is why such displays are generally not recommended for first-person shooters.

V-sync: trading one problem for another

Vertical sync, or V-sync, is a very old solution to the tearing problem. When this feature is activated, the graphics card tries to match the screen refresh rate, eliminating tearing completely. The problem is that if your video card cannot keep the frame rate above 60 FPS (on a display with a 60 Hz refresh rate), the effective frame rate will jump between integer fractions of the refresh rate (60, 30, 20, 15 FPS and so on), which in turn leads to noticeable stuttering.
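
That quantization is easy to model. The back-of-the-envelope sketch below assumes classic double-buffered V-sync, where a frame that misses one refresh simply waits for the next; the GPU frame rates fed into it are arbitrary examples.

```python
import math

def vsync_effective_fps(gpu_fps, refresh_hz=60):
    """Effective rate snaps to refresh_hz divided by the number of refreshes each frame occupies."""
    frame_time = 1.0 / gpu_fps
    refresh_period = 1.0 / refresh_hz
    refreshes_per_frame = math.ceil(frame_time / refresh_period)
    return refresh_hz / refreshes_per_frame

for fps in (75, 59, 45, 31, 29, 22):
    print(f"GPU {fps:>2} FPS -> displayed {vsync_effective_fps(fps):.0f} FPS at 60 Hz")
```

The same logic explains the jumps between 144 and 72 FPS mentioned later for a 144 Hz screen.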


When the frame rate drops below the refresh rate while V-sync is active, you will experience stuttering

Moreover, because V-sync forces the graphics card to wait and sometimes relies on an invisible buffer, V-sync can add additional input latency to the render chain. Thus, V-sync can be both a salvation and a curse, solving some problems, but at the same time provoking other disadvantages. An informal survey of our employees found that gamers tend to turn off vertical sync, and only turn it on when tears get unbearable.

Get creative: Nvidia unveils G-Sync

With the launch of the GeForce GTX 680 graphics card, Nvidia introduced a driver mode called Adaptive V-sync, which tries to mitigate V-sync's problems by keeping it enabled while the frame rate is above the monitor's refresh rate and quickly turning it off when performance drops below that rate. While the technology did its job dutifully, it was only a workaround: it did not eliminate tearing once the frame rate fell below the monitor's refresh rate.

The implementation of G-Sync is much more interesting. Generally speaking, Nvidia is showing that instead of forcing graphics cards to run at a fixed display frequency, we can make new monitors run at a variable frequency.


GPU frame rate determines the refresh rate of the monitor, removing artifacts associated with enabling and disabling V-sync

The packet-based data transmission mechanism of DisplayPort opened up new possibilities. By using variable blanking intervals in the DisplayPort video signal and replacing the monitor's scaler with a module that understands variable blanking, the LCD panel can operate at a variable refresh rate tied to the frame rate the graphics card outputs (within the monitor's refresh rate range). In practice, Nvidia got creative with these features of the DisplayPort interface and tried to kill two birds with one stone.
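
The numbers below illustrate the blanking idea; the timing figures are invented for the example rather than taken from any datasheet. A panel's refresh rate is pixel_clock / (h_total x v_total), so stretching the vertical blanking interval with extra idle lines lets the same panel hold the current frame longer while it waits for the GPU.

```python
PIXEL_CLOCK_HZ = 320_000_000   # hypothetical link pixel clock
H_TOTAL = 2_000                # active width plus horizontal blanking
V_ACTIVE = 1_080               # visible lines
V_BLANK_BASE = 31              # minimal vertical blanking

def refresh_hz(extra_blank_lines=0):
    """Refresh rate falls as extra idle (blanking) lines are appended to each frame."""
    v_total = V_ACTIVE + V_BLANK_BASE + extra_blank_lines
    return PIXEL_CLOCK_HZ / (H_TOTAL * v_total)

for extra in (0, 500, 1500, 4200):
    print(f"{extra:>4} extra blanking lines -> ~{refresh_hz(extra):6.1f} Hz")
```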

Before starting the tests, I want to give credit for the creative approach to solving a real problem that affects PC gaming. This is innovation at its finest. But what does G-Sync deliver in practice? Let's find out.

Nvidia sent us an engineering sample of the Asus VG248QE monitor in which the scaler has been replaced by a G-Sync module. We are already familiar with this display: it was the subject of our article "Asus VG248QE Review: 24" 144Hz Gaming Monitor for $400", in which the monitor earned the Tom's Hardware Smart Buy award. Now it's time to see how Nvidia's new technology affects the hottest games.

G-Sync Technology Overview | 3D LightBoost, Onboard Memory, Standards & 4K

As we browsed through Nvidia's press materials, we asked ourselves many questions, both about the place of technology in the present and its role in the future. During a recent trip to the company's headquarters in Santa Clara, our US colleagues received some answers.

G-Sync and 3D LightBoost

The first thing we noticed was that the Asus VG248QE monitor Nvidia sent, modified to support G-Sync, also supports Nvidia's 3D LightBoost technology, which was originally designed to brighten 3D displays but has long been used unofficially in 2D mode, pulsing the panel backlight to reduce ghosting (motion blur). Naturally, we wondered whether this technique is used in G-Sync.

Nvidia answered in the negative. Although using both technologies at the same time would be the ideal solution, today strobing the backlight at a variable refresh rate leads to flickering and brightness problems. Solving them is incredibly difficult, since the brightness has to be adjusted and the pulses tracked. As a result, you currently have to choose between the two technologies, although the company is trying to find a way to use them simultaneously in the future.

Built-in memory of the G-Sync module

As we already know, G-Sync eliminates the incremental input lag associated with V-sync, since there is no longer any need to wait for the panel scan to complete. However, we noticed that the G-Sync module has built-in memory. Can the module buffer frames on its own? If so, how long does it take a frame to pass through the new pipeline?

According to Nvidia, frames are not buffered in the module's memory. As data is received, it is displayed on the screen, and the memory performs other functions. However, the processing time for G-Sync is noticeably less than one millisecond. In fact, we encounter almost the same delay with V-sync off, and it is associated with the peculiarities of the game, video driver, mouse and so on.

Will G-Sync be standardized?

This question came up in a recent interview with AMD, when a reader wanted to know the company's reaction to G-Sync. However, we wanted to ask the developer directly and find out whether Nvidia plans to bring the technology to an industry standard. In theory, the company could offer G-Sync as an upgrade to the DisplayPort standard, which already provides variable refresh rates. After all, Nvidia is a member of the VESA association.

However, no new specifications are planned for DisplayPort, HDMI or DVI. G-Sync already works over DisplayPort 1.2, so the standard does not need to be changed.

As noted, Nvidia is working on G-Sync compatibility with the technology currently called 3D LightBoost (which will soon get a different name). In addition, the company is looking for ways to reduce the cost of the G-Sync modules and make them more accessible.

G-Sync at Ultra HD

Nvidia promises monitors with G-Sync support at resolutions up to 3840x2160 pixels. However, the Asus model we are reviewing today only supports 1920x1080. Ultra HD monitors currently use the STMicro Athena controller, which has two scaling units to create a tiled display. We wondered whether the G-Sync module will support an MST configuration.

In truth, 4K displays with variable refresh rates will have to wait. There is no standalone 4K scaler yet; the nearest one is due in the first quarter of 2014, and monitors equipped with it only in the second quarter. Since the G-Sync module replaces the scaler, compatible panels will start to appear only after that point. Fortunately, the module natively supports Ultra HD.

What happens below 30 Hz?

G-Sync can vary the screen refresh rate down to 30 Hz. The explanation is that at very low refresh rates the image on an LCD screen begins to degrade, leading to visual artifacts. If the source provides fewer than 30 FPS, the module refreshes the panel on its own, avoiding possible problems. This means that a single image may be shown more than once, but the lower threshold is 30 Hz, which ensures the best possible image quality.
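
The arithmetic behind that frame repetition is straightforward; the short sketch below is an illustration of the idea (not NVIDIA's exact algorithm), showing for example that a 25 FPS source would have each frame shown twice, keeping the panel at 50 Hz.

```python
import math

PANEL_FLOOR_HZ = 30

def panel_plan(source_fps):
    """Repeat each frame just enough times to keep the panel at or above its floor rate."""
    repeats = max(1, math.ceil(PANEL_FLOOR_HZ / source_fps))
    return repeats, source_fps * repeats   # repeats per frame, resulting refresh rate

for fps in (45, 30, 25, 18, 10):
    repeats, hz = panel_plan(fps)
    print(f"{fps:>2} FPS source -> each frame shown {repeats}x -> panel at {hz} Hz")
```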

G-Sync Technology Overview | 60Hz Panels, SLI, Surround and Availability

Is the technology limited to only high refresh rate panels?

You will notice that the first G-Sync monitor has a very high refresh rate (higher than the technology requires) and a resolution of 1920x1080 pixels. The Asus display also has limitations of its own, such as its 6-bit TN panel. We were curious: is G-Sync planned only for high refresh rate displays, or will we see it on more common 60 Hz monitors as well? Besides, we want access to 2560x1440 resolution as soon as possible.

Nvidia reiterated that the best G-Sync experience is obtained when your video card keeps the frame rate in the 30-60 FPS range. So the technology can genuinely benefit conventional 60 Hz monitors fitted with a G-Sync module.

But why use a 144 Hz monitor then? It seems that many monitor manufacturers have decided to implement the low motion blur feature (3D LightBoost), which requires a high refresh rate. But those who decide to skip that feature (and why not, since it is not yet compatible with G-Sync) could build a G-Sync panel for much less money.

Speaking of resolutions, things are shaping up like this: QHD screens with refresh rates above 120 Hz may start shipping as early as 2014.

Are there problems with SLI and G-Sync?

What does it take to see G-Sync in Surround mode?

These days, of course, you don't need two graphics adapters to output a 1080p image; even a mid-range Kepler-based graphics card can provide the level of performance needed to play comfortably at this resolution. But there is also no way to run two cards in SLI to drive three G-Sync monitors in Surround mode.

This limitation stems from the display outputs on modern Nvidia cards, which typically have two DVI ports, one HDMI and one DisplayPort. G-Sync requires DisplayPort 1.2, and adapters will not work (nor will an MST hub). The only option is to connect the three monitors in Surround mode to three cards, i.e. a separate card for each monitor. Naturally, we expect Nvidia's partners to start shipping "G-Sync Edition" cards with more DisplayPort connectors.

G-Sync and triple buffering

Triple buffering used to be required to play comfortably with V-sync. Is it needed for G-Sync? The answer is no. Not only does G-Sync not require triple buffering, since the pipeline never stalls, it actually hurts G-Sync by adding an extra frame of latency with no performance gain. Unfortunately, games often force triple buffering on their own, and it cannot be bypassed manually.

What about games that tend to react badly when V-sync is disabled?

Games like Skyrim, which is part of our test suite, are designed to work with V-sync on a 60 Hz panel (although this sometimes makes life difficult for us because of input lag). To test them, you need to modify certain .ini files. So how does G-Sync behave with games based on the V-sync-sensitive Gamebryo and Creation engines? Are they capped at 60 FPS?

Second, you need a monitor with an Nvidia G-Sync module. This module replaces the display's scaler, so it is impossible to add G-Sync to, say, a tiled Ultra HD display. For today's review we are using a prototype with a resolution of 1920x1080 pixels and a refresh rate of up to 144 Hz. But even with it, you can get an idea of the impact G-Sync will have if manufacturers start installing it in cheaper 60 Hz panels.

Third, a DisplayPort 1.2 cable is required; DVI and HDMI are not supported. In the short term, this means that the only way to get G-Sync working on three monitors in Surround mode is to connect them to a triple SLI bundle, since each card has only one DisplayPort connector, and DVI-to-DisplayPort adapters do not work in this case. The same goes for MST hubs.

Finally, don't forget about driver support. The latest package, version 331.93 beta, is already G-Sync compatible, and we expect future WHQL-certified versions to keep that support.

Test bench

Test bench configuration
  • CPU: Intel Core i7-3970X (Sandy Bridge-E), 3.5 GHz base clock, overclocked to 4.3 GHz, LGA 2011, 15 MB shared L3 cache, Hyper-Threading on, power-saving features on
  • Motherboard: MSI X79A-GD45 Plus (LGA 2011), X79 Express chipset, BIOS 17.5
  • RAM: G.Skill 32 GB (8 x 4 GB) DDR3-2133, F3-17000CL9Q-16GBXM x2 @ 9-11-10-28, 1.65 V
  • Storage: Samsung 840 Pro SSD, 256 GB, SATA 6 Gb/s
  • Video cards: Nvidia GeForce GTX 780 Ti 3 GB; Nvidia GeForce GTX 760 2 GB
  • Power supply: Corsair AX860i, 860 W

System software and drivers
  • OS: Windows 8 Professional 64-bit
  • DirectX: DirectX 11
  • Video driver: Nvidia GeForce 331.93 Beta

Now you need to figure out in what cases G-Sync has the greatest impact. Chances are good that you are already using a monitor with a 60Hz refresh rate. Among gamers, 120 and 144 Hz models are more popular, but Nvidia rightly assumes that most of the enthusiasts on the market will still stick to 60 Hz.

With active vertical sync on a 60Hz monitor, the most noticeable artifacts appear when the card cannot deliver 60fps, resulting in annoying jumps between 30 and 60fps. There are noticeable slowdowns here. With V sync off, the tearing effect will be most noticeable in scenes where you need to rotate the camera frequently or in which there is a lot of movement. Some players are so distracted by this that they just turn on V-sync and suffer slowdowns and input delays.

At 120 Hz and 144 Hz and higher frame rates, the display refreshes more frequently, shortening the time a single frame is held across multiple screen scans when performance falls short. However, the problems with vertical sync, whether on or off, persist. That is why we will test the Asus monitor at 60 and 144 Hz, with G-Sync enabled and disabled.

G-Sync Technology Overview | Testing G-Sync with V-Sync Enabled

It's time to start testing G-Sync... All that remains is to install a video capture card, an array of several SSDs and proceed to the tests, right?

No, it’s wrong.

Today we are not measuring performance but quality. In our case the tests can show only one thing: the frame rate at a particular moment in time. They say nothing about the quality and feel of the experience with G-Sync turned on or off. Therefore, you will have to rely on our carefully verified and, we hope, eloquent description, which we will try to bring as close to reality as possible.

Why not just record a video and let readers judge for themselves? The fact is that the camera records at a fixed 60 Hz, and your monitor also plays video back at a constant 60 Hz refresh rate. Since G-Sync works with a variable refresh rate, you would not see the technology in action in such a video.

Given the number of games available, the number of possible test combinations is countless: V-sync on, V-sync off, G-Sync on, G-Sync off, 60 Hz, 120 Hz, 144 Hz... The list goes on. But we'll start with a 60 Hz refresh rate and vertical sync enabled.

Probably the easiest place to start is Nvidia's own demo utility, in which a pendulum swings from side to side. The utility can simulate frame rates of 60, 50 or 40 FPS, or let the frequency fluctuate between 40 and 60 FPS, and then V-sync and G-Sync can be toggled on or off. Although the test is artificial, it demonstrates the technology's capabilities well. You can watch the scene at 50 FPS with vertical sync enabled and think: "Everything is quite good, the visible stutters can be tolerated." But after activating G-Sync, you immediately want to say: "What was I thinking? The difference is obvious, like day and night. How could I have lived with this before?"

But let's not forget that this is a technical demo. I would like evidence based on real games. To do this, you need to run a game with high system requirements, such as Arma III.

We installed Arma III on the test machine with a GeForce GTX 770 and set ultra settings. With vertical sync disabled, the frame rate fluctuates between 40 and 50 FPS. But if you enable V-sync, it drops to 30 FPS. The performance is not high enough to produce constant jumps between 30 and 60 FPS; instead, the video card's frame rate is simply reduced.

Since there was no stuttering, there was no significant difference with G-Sync activated, except that the actual frame rate ran 10-20 FPS higher. Input lag should also be reduced, since the same frame is no longer held across several monitor scans. In our impression Arma is generally less jittery than many other games, so the lag is not felt.

On the other hand, the effect of G-Sync in Metro: Last Light is more pronounced. With a GeForce GTX 770 the game can be run at 1920x1080 with very high detail settings, including 16x AF, normal tessellation and motion blur. In this case you can step the SSAA setting from 1x to 2x to 3x to gradually lower the frame rate.

In addition, the game's environment includes a hallway that is easy to strafe back and forth along. After starting the level with vertical sync active at 60 Hz, we went out into the city. Fraps showed that with triple SSAA the frame rate was 30 FPS, and with anti-aliasing turned off it was 60 FPS. In the first case, stutters and delays are noticeable. With SSAA disabled you get a completely smooth picture at 60 FPS. However, activating 2x SSAA leads to fluctuations between 60 and 30 FPS, where every duplicated frame creates noticeable discomfort. This is one of the games in which we would definitely turn off vertical sync and just ignore the tearing. Many people have developed that habit already.

But G-Sync eliminates all of these negative effects. You no longer have to watch the Fraps counter, waiting for dips below 60 FPS before lowering yet another graphics setting. On the contrary, you can raise some of them, because even if performance drops to 40-50 FPS there will be no obvious stutters. And what if you turn vertical sync off? You will learn more about that later.

G-Sync Technology Overview | Testing G-Sync with V-Sync Disabled

The conclusions in this material are based on a survey of the authors and friends of Tom's Hardware over Skype (in other words, the sample of respondents is small), but almost all of them understand what vertical sync is and what disadvantages users have to put up with because of it. According to them, they resort to vertical sync only when the tearing becomes unbearable due to the large mismatch between the frame rate and the monitor's refresh rate.

As you can imagine, the visual impact of disabling V-sync is hard to mistake, although it depends heavily on the game and its detail settings.

Take Crysis 3, for example. The game can easily bring your graphics subsystem to its knees at the highest settings, and since Crysis 3 is a first-person shooter with very dynamic gameplay, the tearing can be quite palpable. In the example above, the FCAT output was captured between two frames. As you can see, the tree is cut right through.

On the other hand, when we forcibly disable V-sync in Skyrim, the tearing is not as severe. Note that in this case the frame rate is very high and several frames appear on the screen with each scan. For this reason, the amount of movement per frame is relatively small. There are problems with playing Skyrim in this configuration, and it may not be the most optimal one. But it does show that even with vertical sync disabled, the feel of the game can change.

For the third example, we chose a shot of Lara Croft's shoulder from Tomb Raider, which shows a fairly clear break in the image (also look at the hair and the strap of the shirt). Tomb Raider is the only game in our selection that allows you to choose between double and triple buffering when vertical sync is activated.

The last graph shows that Metro: Last Light with G-Sync at 144 Hz generally delivers the same performance as with vertical sync disabled. However, the chart cannot show the absence of tearing. If you use the technology with a 60 Hz screen, the frame rate stops at 60 FPS, but there are no stutters or lags.

In any case, those of you (and we) who have spent countless hours on graphics tests, watching the same benchmark over and over again, have become accustomed to them and can visually judge how good a particular result is. This is how we measure the absolute performance of video cards. Changes in the picture with G-Sync active are immediately noticeable, because the smoothness is the same as with V-sync enabled, but without the tearing inherent in V-sync disabled. It's a shame we cannot show the difference on video here.

G-Sync Technology Overview | Game compatibility: almost perfect

Checking other games

We've tested a few more games. Crysis 3, Tomb Raider, Skyrim, BioShock: Infinite and Battlefield 4 visited the test bench. All of them except Skyrim benefited from G-Sync. The effect varied depending on the specific game, but once you saw it, you immediately admitted that you had been ignoring the shortcomings that were there all along.

Artifacts can still appear. For example, the crawling effect associated with aliasing is more noticeable in smooth motion. Most likely you will want to push anti-aliasing as high as possible to remove the unpleasant jaggies that were not as noticeable before.

Skyrim: special case

Skyrim's Creation engine enables V-sync by default. To test the game at frame rates above 60 FPS, you have to add the line iPresentInterval=0 to one of the game's .ini files.

Thus, Skyrim can be tested in three ways: in its initial state, letting the Nvidia driver "use application settings"; with G-Sync enabled in the driver and the Skyrim settings left intact; and with G-Sync enabled and V-sync disabled in the game's .ini file.

The first configuration, with the monitor set to 60 Hz, produced a stable 60 FPS at ultra settings on a GeForce GTX 770, so we got a smooth and pleasant picture. However, user input still suffers from latency. In addition, strafing from side to side revealed noticeable motion blur. Still, this is how most people play on PC. You can of course buy a 144 Hz screen and it really will eliminate the blur, but since the GeForce GTX 770 delivers a frame rate of about 90-100 frames per second, noticeable stutter appears when the engine fluctuates between 144 and 72 FPS.

At 60 Hz, G-Sync has a negative effect on the picture, probably because of the active vertical sync, whereas the technology is supposed to work with V-sync disabled. Now side strafing (especially close to walls) causes pronounced stutter. This is a potential problem for 60 Hz G-Sync panels, at least in games like Skyrim. Fortunately, in the case of the Asus VG248QE monitor you can switch to 144 Hz mode, and then, despite the active V-sync, G-Sync works flawlessly at that frame rate.

Disabling vertical sync completely in Skyrim results in smoother mouse control. However, tearing then appears in the image (not to mention other artifacts such as shimmering water). Turning on G-Sync still leaves stuttering at 60 Hz, but at 144 Hz the situation improves significantly. Although in our video card reviews we test this game with V-sync disabled, we would not recommend playing it that way.

For Skyrim, perhaps the best solution is to disable G-Sync and play at 60 Hz, which gives a constant 60 frames per second at your chosen graphics settings.

G-Sync Technology Overview | Is G-Sync what you've been waiting for?

Even before we received our test sample of the Asus monitor with G-Sync, we were pleased that Nvidia is working on a very real problem affecting games for which no solution had yet been offered. Until now, you could turn vertical sync on or off to your taste, but either decision came with compromises that hurt the gaming experience. If you prefer to leave V-sync off until the tearing becomes unbearable, you could say you are choosing the lesser of two evils.

G-Sync solves the problem by giving the monitor the ability to refresh the screen at a variable rate. Innovation like this is the only way to keep advancing our industry and maintain the technical advantage of personal computers over gaming consoles and platforms. Nvidia will no doubt face criticism for not developing a standard that competitors could apply. However, the company uses DisplayPort 1.2 for its solution, and as a result, just two months after the technology was announced, G-Sync ended up in our hands.

The question is, is Nvidia doing everything it promised in G-Sync?

Three talented developers touting the qualities of a technology you have never seen in action can inspire anyone. But if your first experience with G-Sync is Nvidia's pendulum demo, you are bound to wonder whether such a huge difference is even possible, or whether the test presents a special scenario that is too good to be true.

Naturally, when testing the technology in real games, the effect is not as clear-cut. On the one hand there were exclamations of "Wow!" and "Unbelievable!"; on the other, "I think I can see the difference." The impact of activating G-Sync is most noticeable when changing the display refresh rate from 60 Hz to 144 Hz. But we also tried running the 60 Hz test with G-Sync to see what you will (hopefully) get with cheaper displays in the future. In some cases, simply going from 60 to 144 Hz will blow you away, especially if your graphics card can deliver high frame rates.

Today we know that Asus plans to implement G-Sync support in the Asus VG248QE, which the company says will sell for $400 next year. The monitor has a native resolution of 1920x1080 pixels and a 144 Hz refresh rate. The version without G-Sync has already won our Smart Buy award for outstanding performance. But for us personally, the 6-bit TN panel is a drawback. We would like to see 2560x1440 pixels on an IPS matrix; we would even settle for a 60 Hz refresh rate if it helped keep the price down.

Although we are expecting a whole pile of announcements at CES, we have not heard official Nvidia comments about other displays with G-Sync modules or their prices. We are also not sure about the company's plans for the upgrade module, which should allow you to install G-Sync into an already purchased Asus VG248QE monitor in about 20 minutes.

For now we can say that the technology is worth the wait. You will see that in some games the influence of the new technology is unmistakable, while in others it is less pronounced. But either way, G-Sync answers the age-old question of whether or not to enable vertical sync.

There is one more interesting thought. Now that we have tested G-Sync, how much longer can AMD avoid commenting? The company teased our readers in its interview, noting that it would soon decide on such a capability. Does it have anything in the works? The end of 2013 and the beginning of 2014 promise us a lot of interesting news to discuss: the Mantle version of Battlefield 4, the upcoming Nvidia Maxwell architecture, G-Sync, the AMD xDMA engine with CrossFire support, and rumors of new dual-GPU video cards. For now, what we lack are graphics cards with more than 3 GB (Nvidia) or 4 GB (AMD) of GDDR5 memory that cost less than $1000...

Instructions

To change this setting, open your game's menu, find the "Options" section and look for the "Vertical Sync" item under the "Video" sub-menu. If the options are text-based, look for the switch position labeled Disabled or "Off". Then click the Apply button to save the setting. The changes take effect after restarting the game.

Another case is if there is no such parameter in the application. Then you will have to configure the synchronization through the video card driver. The setting is different for video cards made by AMD Radeon or nVidia Geforce.

If your graphics card belongs to the GeForce family, right-click on the desktop and select "NVIDIA Control Panel" from the menu. Another option is to open the control panel from the Start menu, where there will be a launch icon with the same name. If you do not find the icon in the control panel or in the desktop menu, look near the clock in the right corner of the screen for a green NVIDIA icon resembling an eye, and double-click it. As a result, the video card's settings menu will open.

The driver control panel window consists of two parts: the left side lists categories of actions, and the right side shows settings and information. On the left, select the line "Manage 3D settings". In the right part of the window, on the "Global Settings" tab, find the "Vertical sync" option at the very top of the list. Next to it, the current setting will be shown: on, off, or use the application setting. Select "Off" from the drop-down list and confirm your choice by clicking the "Apply" button.

Owners of AMD Radeon video cards configure the driver through the Catalyst application. To launch it, right-click on the desktop and select Catalyst Control Center. Alternatively, open the Control Panel and look for the icon with the same name. A third way: in the notification area near the clock, in the lower right corner of the screen, find the round red icon and double-click it. The result of all these actions is the same: the control center for your video card's settings will open.

The principle is the same as in the Nvidia control panel: setting categories on the left, detailed settings and tips on the right. Select "Games" or "Gaming" in the left column, then the "3D Application Settings" submenu. On the right, items for various video card parameters will appear. Scroll down and find "Wait for vertical refresh", with a four-position slider below it. Move the slider to the far left position, labelled "Always off", and click "Apply" in the lower right corner of the window to save the changes.

G-Sync Technology Overview | A brief history of fixed refresh rates

Long ago, monitors were bulky and contained cathode ray tubes and electron guns. The guns bombard the screen with electrons, lighting up the colored phosphor dots we call pixels. They draw each "scan" line from left to right, top to bottom. Varying the gun's speed from one full refresh to the next was never very practical, and before the advent of 3D games there was no particular need for it. So CRTs and the analog video standards built around them were designed with a fixed refresh rate.

LCD monitors gradually replaced CRTs, and digital connectors (DVI, HDMI and DisplayPort) replaced analog VGA. But the video standardization bodies (led by VESA) never moved away from fixed refresh rates. Movies and television still rely on a constant-frame-rate input, so once again, switching to a variable refresh rate did not seem necessary.

Variable frame rates and fixed refresh rates do not match

Before the advent of modern 3D graphics, fixed refresh rates were not a problem for displays. The problem appeared with the first powerful GPUs: the rate at which a GPU renders individual frames (the frame rate, usually expressed in FPS, frames per second) is not constant. It changes over time: in a heavy scene the card may manage 30 FPS, and when looking at an empty sky, 60 FPS.


Disabling V-sync leads to tearing

It turns out that a variable GPU frame rate and the fixed refresh rate of an LCD panel do not work well together. In this configuration we run into a graphical artifact called tearing. It appears when two or more partial frames are displayed within the same monitor refresh cycle. They are usually offset from one another, which looks very unpleasant in motion.

The image above shows two well-known artifacts that are common but hard to capture. Since these are display artifacts, you won't see them in ordinary game screenshots, but our shots show what you actually see while playing. To capture them you need a camera with a high-speed shooting mode, or, if you have a card with video capture support, you can record an uncompressed video stream from the DVI port and clearly see the transition from one frame to the next; this is the method we use for FCAT testing. Still, the described effect is best observed with your own eyes.

Tearing is visible in both images. The top one was taken with a camera, the bottom one via video capture. The bottom picture is "cut" horizontally and looks displaced. In the top two images, the left shot was taken on a 60 Hz Sharp display and the right one on a 120 Hz Asus display. The tear on the 120 Hz display is less pronounced, since the refresh rate is twice as high, but the effect is visible and manifests in the same way as in the left image. An artifact of this type is a clear sign that the shots were taken with V-sync turned off.
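To make the mechanics concrete, here is a minimal sketch of why the tear line lands at a different height on every refresh. It is our own illustration (not anything from Nvidia or FCAT, and the numbers are hypothetical): with V-sync off, a buffer flip that arrives mid-scan becomes visible at whatever scanline the panel happens to be drawing at that instant.

# Illustrative sketch only: a fixed-rate scan-out and the line at which a
# mid-scan buffer flip would become visible.
REFRESH_HZ = 60
SCAN_TIME = 1.0 / REFRESH_HZ   # time to scan one full frame, top to bottom
PANEL_LINES = 1080             # vertical resolution of an assumed 1080p panel

def tear_lines(flip_times):
    """Scanline at which each buffer flip becomes visible (0 = top of screen)."""
    lines = []
    for t in flip_times:
        progress = (t % SCAN_TIME) / SCAN_TIME   # fraction of the current scan completed
        lines.append(int(progress * PANEL_LINES))
    return lines

# A GPU running at an uneven ~45 FPS flips the buffer at irregular moments,
# so the tear shows up at a different height on almost every refresh.
print(tear_lines([0.021, 0.043, 0.068, 0.090, 0.115]))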


Battlefield 4 on GeForce GTX 770 with V-sync disabled

The second effect, seen in the BioShock: Infinite shots, is called ghosting. It is especially visible at the bottom of the left image and is related to panel response delay: individual pixels do not change color fast enough, which produces this kind of trailing glow. A single shot cannot convey how ghosting affects actual play. A panel with an 8 ms gray-to-gray response time, like the Sharp, will blur any movement on the screen, which is why such displays are generally not recommended for first-person shooters.

V-sync: trading one problem for another

Vertical sync, or V-sync, is a very old solution to the tearing problem. When it is enabled, the graphics card ties frame delivery to the screen refresh, eliminating tearing completely. The problem is that if your video card cannot keep the frame rate above 60 FPS (on a 60 Hz display), the effective frame rate jumps between integer fractions of the refresh rate (60, 30, 20, 15 FPS and so on), which in turn causes noticeable stuttering.


When the frame rate drops below the refresh rate while V-sync is active, you will experience stuttering
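The jumps between 60, 30, 20 and 15 FPS follow directly from the fact that, with V-sync on, a frame can only be swapped on a refresh boundary. Below is a minimal sketch of that arithmetic under a simplified double-buffered model (our assumption, not a measurement of any particular driver).

import math

# Simplified model: with V-sync, each rendered frame occupies a whole number of
# refresh cycles, so the delivered rate snaps to refresh_hz / n.
def vsync_effective_fps(render_fps, refresh_hz=60):
    scans_per_frame = math.ceil(refresh_hz / render_fps)
    return refresh_hz / scans_per_frame

for fps in (75, 60, 55, 45, 29):
    print(f"{fps} FPS rendered -> {vsync_effective_fps(fps):.0f} FPS displayed")
# 75 -> 60, 60 -> 60, 55 -> 30, 45 -> 30, 29 -> 20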

Moreover, because V-sync makes the graphics card wait, and sometimes relies on an invisible back buffer, it can add extra input latency to the render chain. So V-sync can be both a salvation and a curse, solving some problems while provoking other drawbacks. An informal survey of our staff found that gamers tend to leave vertical sync off and only turn it on when tearing becomes unbearable.

Get creative: Nvidia unveils G-Sync

With the launch of the GeForce GTX 680, Nvidia introduced a driver mode called Adaptive V-sync, which keeps V-sync on while the frame rate is above the monitor's refresh rate and quickly turns it off when performance drops sharply below it. While the feature did its job faithfully, it was only a workaround: it did not eliminate tearing when the frame rate fell below the monitor's refresh rate.

The G-Sync implementation is much more interesting. Broadly speaking, Nvidia is showing that instead of forcing graphics cards to run at a fixed display frequency, we can make new monitors run at a variable frequency.


GPU frame rate determines the refresh rate of the monitor, removing artifacts associated with enabling and disabling V-sync

The packet-based data transmission of DisplayPort opened up new possibilities. By using variable blanking intervals in the DisplayPort video signal and replacing the monitor's scaler with a module that handles variable blanking, the LCD panel can operate at a variable refresh rate tied to the frame rate the graphics card outputs (within the monitor's refresh rate limits). In practice, Nvidia got creative with these features of the DisplayPort interface and tried to kill two birds with one stone.
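Conceptually, the idea fits in a few lines. The sketch below is our own simplification of the behaviour just described (not the module's actual firmware logic): the panel simply waits in its blanking interval until the GPU delivers the next frame, so the refresh interval tracks the frame time up to the panel's maximum rate.

# Our simplified model of a variable-refresh panel driven by the GPU's frame times.
PANEL_MAX_HZ = 144  # upper limit of the assumed panel

def refresh_interval(frame_time_s, panel_max_hz=PANEL_MAX_HZ):
    fastest = 1.0 / panel_max_hz
    # If the GPU finishes sooner than the panel can scan again, the frame waits;
    # otherwise the panel refreshes the moment the frame is ready.
    return max(frame_time_s, fastest)

for fps in (180, 144, 90, 52, 37):
    print(f"{fps} FPS -> panel refreshes every {refresh_interval(1.0 / fps) * 1000:.1f} ms")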

Before starting the tests, we want to give credit for the creative approach to solving a real problem affecting PC gaming. This is innovation at its finest. But how does G-Sync perform in practice? Let's find out.

Nvidia sent us an engineering sample of the Asus VG248QE monitor, in which the scaler has been replaced with a G-Sync module. We are already familiar with this display: it is covered in "Asus VG248QE Review: 24" 144 Hz Gaming Monitor for $400", where it earned Tom's Hardware's Smart Buy award. Now it's time to see how Nvidia's new technology affects the hottest games.

G-Sync Technology Overview | 3D LightBoost, Onboard Memory, Standards & 4K

As we browsed Nvidia's press materials, we asked ourselves many questions, both about the technology's place in the present and its role in the future. During a recent trip to the company's headquarters in Santa Clara, our US colleagues got some answers.

G-Sync and 3D LightBoost

The first thing we noticed was that Nvidia sent an Asus VG248QE monitor modified to support G-Sync. This monitor also supports Nvidia's 3D LightBoost technology, which was originally designed to brighten 3D displays but has long been used unofficially in 2D mode, pulsing the panel backlight to reduce ghosting (motion blur). Naturally, we wondered whether that technology is also used in G-Sync.

Nvidia said no. While using both technologies at once would be ideal, strobing the backlight at a variable refresh rate currently causes flicker and brightness problems. Solving this is incredibly difficult, since brightness has to be adjusted and the pulses tracked. As a result, for now you have to choose between the two technologies, although the company is trying to find a way to use them simultaneously in the future.

Built-in memory of the G-Sync module

As we already know, G-Sync eliminates the incremental input lag associated with V-sync, since there is no longer any need to wait for the panel scan to complete. However, we noticed that the G-Sync module has built-in memory. Can the module buffer frames on its own? If so, how long does a frame take to travel through the new path?

According to Nvidia, frames are not buffered in the module's memory. As data arrives it is displayed on the screen, and the memory performs other functions. In any case, the processing time for G-Sync is noticeably less than one millisecond. In fact, we encounter almost the same delay with V-sync off, and it comes down to the particulars of the game, the video driver, the mouse and so on.

Will G-Sync be standardized?

This question came up in a recent interview with AMD, when a reader wanted to know the company's reaction to G-Sync. However, we wanted to ask the developer directly and find out whether Nvidia plans to bring the technology to an industry standard. In theory, the company could offer G-Sync as an upgrade to the DisplayPort standard providing variable refresh rates; after all, Nvidia is a member of the VESA association.

However, no new specifications are planned for DisplayPort, HDMI or DVI. G-Sync already works over DisplayPort 1.2, so the standard does not need to change.

As noted, Nvidia is working on making G-Sync compatible with the technology currently called 3D LightBoost (which will soon get a different name). In addition, the company is looking for a way to reduce the cost of G-Sync modules and make them more accessible.

G-Sync at Ultra HD

Nvidia promises G-Sync monitors with resolutions up to 3840x2160 pixels. However, the Asus model we are reviewing today only supports 1920x1080. Current Ultra HD monitors use the STMicro Athena controller, which has two scalers to create a tiled display. We wondered whether the G-Sync module will support an MST configuration.

In truth, 4K variable-refresh-rate displays will have to wait. There is no standalone 4K scaler yet; the nearest one is due in the first quarter of 2014, and monitors equipped with it only in the second quarter. Since the G-Sync module replaces the scaler, compatible panels will start appearing after that point. Fortunately, the module natively supports Ultra HD.

What happens below 30 Hz?

G-Sync can vary the screen refresh rate down to 30 Hz. The reason is that at very low refresh rates the image on an LCD panel starts to degrade, producing visual artifacts. If the source delivers less than 30 FPS, the module refreshes the panel automatically, avoiding the problem. This means a single image can be scanned out more than once, but 30 Hz remains the floor that keeps image quality at its best.
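The repeat behaviour can be sketched as follows. This is our reading of the description above, not documented firmware logic: whenever the next frame takes longer than 1/30 s to arrive, the module re-scans the previous image so the panel never sits idle past its floor.

# Assumed policy for the 30 Hz floor described above (illustration only).
MIN_HZ = 30
MAX_WAIT = 1.0 / MIN_HZ   # longest the panel is allowed to go without a refresh

def scans_for_frame(frame_time_s):
    """How many times the same image is scanned out while waiting for the next frame."""
    scans = 1
    remaining = frame_time_s
    while remaining > MAX_WAIT:
        scans += 1              # repeat the previous image rather than let the panel decay
        remaining -= MAX_WAIT
    return scans

for fps in (45, 30, 24, 12):
    print(f"{fps} FPS source -> each frame shown {scans_for_frame(1.0 / fps)} time(s)")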

G-Sync Technology Overview | 60Hz Panels, SLI, Surround and Availability

Is the technology limited to only high refresh rate panels?

You will notice that the first G-Sync monitor has a very high refresh rate out of the box (higher than the technology requires) and a resolution of 1920x1080 pixels. But the Asus display has its own limitations, such as the 6-bit TN panel. We wanted to know whether G-Sync is planned only for high-refresh-rate displays, or whether we will see it on more common 60 Hz monitors. Besides, we want access to 2560x1440 resolution as soon as possible.

Nvidia reiterated that the best G-Sync experience comes when the video card keeps the frame rate in the 30-60 FPS range. So the technology can genuinely benefit conventional 60 Hz monitors fitted with a G-Sync module.

Why use a 144 Hz monitor, then? It seems many monitor manufacturers chose to implement the low-motion-blur feature (3D LightBoost), which requires a high refresh rate. But those who decide to skip that feature (and why not, since it is not yet compatible with G-Sync) can build a G-Sync panel for far less money.

Speaking of resolutions, things are shaping up like this: QHD screens with refresh rates above 120 Hz may start shipping as early as 2014.

Are there problems with SLI and G-Sync?

What does it take to see G-Sync in Surround mode?

These days, of course, you don't need two graphics adapters to drive a 1080p display. Even a mid-range Kepler-based card can deliver the performance needed to play comfortably at this resolution. But there is also no way to run two cards in SLI to drive three G-Sync monitors in Surround mode.

This limitation stems from the display outputs on current Nvidia cards, which typically have two DVI ports, one HDMI and one DisplayPort. G-Sync requires DisplayPort 1.2, and adapters will not work (nor will an MST hub). The only option is to connect three monitors in Surround mode to three cards, one card per monitor. Naturally, we expect Nvidia's partners to start shipping "G-Sync Edition" cards with more DisplayPort connectors.

G-Sync and triple buffering

Triple buffering used to be required for comfortable play with V-sync. Is it needed for G-Sync? The answer is no. Not only does G-Sync not require triple buffering, since the pipeline never stalls, it actually harms G-Sync, because it adds an extra frame of latency with no performance gain. Unfortunately, games often enable triple buffering on their own, and it cannot be overridden manually.
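For a rough sense of why that extra frame matters, here is a back-of-the-envelope calculation under a simplified render-queue model (our assumption, not a measurement): at 60 Hz, one additional queued frame adds roughly 16.7 ms before the newest image reaches the screen.

# Back-of-the-envelope latency added by one extra buffered frame (simplified model).
refresh_hz = 60
frame_ms = 1000.0 / refresh_hz           # ~16.7 ms per refresh at 60 Hz

extra_queued_frames = 1                  # what triple buffering can add over double buffering
added_latency_ms = extra_queued_frames * frame_ms
print(f"Extra input latency: ~{added_latency_ms:.1f} ms at {refresh_hz} Hz")
# With G-Sync the panel scans a frame as soon as it is ready, so that queued
# frame is pure delay with no smoothness benefit.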

What about games that tend to react badly when V-sync is disabled?

Games like Skyrim, part of our test suite, are designed to run with V-sync on a 60 Hz panel (even though this sometimes complicates our life because of input lag). Testing them requires modifying certain .ini files. So how does G-Sync behave with games built on the V-sync-sensitive Gamebryo and Creation engines? Are they capped at 60 FPS?

Second, you need a monitor with an Nvidia G-Sync module. This module replaces the display's scaler, so adding G-Sync to, say, a tiled Ultra HD display is impossible. For today's review we are using a prototype with a resolution of 1920x1080 pixels and a refresh rate of up to 144 Hz. But even with it, you can get an idea of the impact G-Sync will have if manufacturers start installing it in cheaper 60 Hz panels.

Third, a DisplayPort 1.2 cable is required; DVI and HDMI are not supported. In the short term, this means the only way to run G-Sync on three monitors in Surround mode is a triple-SLI setup, since each card has only one DisplayPort connector, and DVI-to-DisplayPort adapters do not work in this case. The same goes for MST hubs.

Finally, don't forget driver support. The latest package, version 331.93 beta, is already G-Sync compatible, and we expect future WHQL-certified releases to include it as well.

Test bench

Test bench configuration:
CPU: Intel Core i7-3970X (Sandy Bridge-E), 3.5 GHz base clock, overclocked to 4.3 GHz, LGA 2011, 15 MB shared L3 cache, Hyper-Threading on, power-saving features on
Motherboard: MSI X79A-GD45 Plus (LGA 2011), X79 Express chipset, BIOS 17.5
RAM: G.Skill 32 GB (8 x 4 GB) DDR3-2133, F3-17000CL9Q-16GBXM x2 @ 9-11-10-28 and 1.65 V
Storage: Samsung 840 Pro SSD, 256 GB, SATA 6 Gb/s
Video cards: Nvidia GeForce GTX 780 Ti 3 GB; Nvidia GeForce GTX 760 2 GB
Power supply: Corsair AX860i, 860 W

System software and drivers:
OS: Windows 8 Professional 64-bit
DirectX: DirectX 11
Video driver: Nvidia GeForce 331.93 beta

Now we need to figure out in which cases G-Sync has the greatest impact. Chances are you are already using a 60 Hz monitor. 120 and 144 Hz models are more popular among gamers, but Nvidia rightly assumes that most enthusiasts on the market will still stick with 60 Hz.

With vertical sync active on a 60 Hz monitor, the most noticeable artifacts appear when the card cannot hold 60 FPS, resulting in annoying jumps between 30 and 60 FPS and visible stuttering. With V-sync off, tearing is most noticeable in scenes where you rotate the camera frequently or where there is a lot of movement. Some players find this so distracting that they simply turn V-sync on and put up with the stuttering and input lag.

At 120 and 144 Hz, and at higher frame rates, the display refreshes more often, shortening the time a single frame is held across multiple scans when performance falls short. However, the problems of both active and inactive vertical sync remain. That is why we will test the Asus monitor at 60 and 144 Hz, with G-Sync on and off.

G-Sync Technology Overview | Testing G-Sync with V-Sync Enabled

It's time to start testing G-Sync. All that remains is to install a video capture card and an array of SSDs and get on with the tests, right?

No, that's not right.

Today we are not measuring performance but quality. In our case, benchmarks can show only one thing: the frame rate at a given moment. They say nothing about the quality and experience of use with G-Sync on or off. That is why you will have to rely on our carefully verified and, we hope, vivid description, which we will try to keep as close to reality as possible.

Why not just record a video and let readers judge for themselves? Because a camera records video at a fixed 60 Hz, and your monitor also plays video back at a constant 60 Hz refresh rate. Since G-Sync introduces a variable refresh rate, you would not see the technology in action.

Given the number of games available, the number of possible test combinations is endless: V-sync on, V-sync off, G-Sync on, G-Sync off, 60 Hz, 120 Hz, 144 Hz... The list goes on. But we'll start with a 60 Hz refresh rate and vertical sync active.

Probably the easiest place to start is Nvidia's own demo utility, in which a pendulum swings from side to side. The utility can simulate frame rates of 60, 50 or 40 FPS, or let the rate fluctuate between 40 and 60 FPS. You can then enable or disable V-sync and G-Sync. Although the test is artificial, it demonstrates the technology's capabilities well. Watch the scene at 50 FPS with vertical sync enabled and you might think: "It's good enough, and the visible stutter can be tolerated." But after activating G-Sync you immediately want to say: "What was I thinking? The difference is obvious, like day and night. How could I live with this before?"

But let's not forget that this is a technical demo. We would like evidence based on real games. For that, you need a game with high system requirements, such as Arma III.

We installed Arma III on our test machine with a GeForce GTX 770 and set ultra detail. With vertical sync disabled, the frame rate fluctuates between 40 and 50 FPS, but with V-sync on it drops to 30 FPS. Performance is not high enough to produce constant jumps between 30 and 60 FPS; instead, the card's frame rate is simply held down.

Since there was no stuttering, the difference with G-Sync activated is hard to see, except that the actual frame rate runs 10-20 FPS higher. Input lag should also be reduced, since the same frame is no longer held across multiple monitor scans. We feel that Arma is generally less jittery than many other games, so the lag is not noticeable.

Metro: Last Light, on the other hand, shows the influence of G-Sync more clearly. With a GeForce GTX 770 the game can be run at 1920x1080 with very high detail settings, including 16x AF, normal tessellation and motion blur. From there you can step the SSAA setting from 1x to 2x to 3x to gradually lower the frame rate.

In addition, the game's environment includes a hallway where it is easy to strafe back and forth. After starting the level with vertical sync active at 60 Hz, we went out into the city. Fraps showed that with 3x SSAA the frame rate was 30 FPS, and with anti-aliasing off it was 60 FPS. In the first case, stutter and lag are noticeable; with SSAA off you get a perfectly smooth 60 FPS picture. However, 2x SSAA produces fluctuations between 60 and 30 FPS, where every duplicated frame is an annoyance. This is one of the games where we would definitely turn vertical sync off and simply ignore the tearing. Many of us developed that habit long ago.

But G-Sync eliminates all of these negative effects. You no longer have to watch the Fraps counter, waiting for dips below 60 FPS before lowering yet another graphics setting. On the contrary, you can raise some of them, because even if performance drops to 50 or 40 FPS there will be no obvious stutter. What if you turn vertical sync off instead? You will learn more about that later.

G-Sync Technology Overview | Testing G-Sync with V-Sync Disabled

The conclusions in this article are based on a survey of Tom's Hardware authors and friends over Skype (in other words, a small sample of respondents), but almost all of them understand what vertical sync is and what drawbacks users have to put up with. According to them, they only resort to vertical sync when tearing becomes unbearable because of the large mismatch between frame rate and monitor refresh rate.

As you can imagine, the visual impact of disabling V-sync is hard to mistake, although it depends heavily on the specific game and its detail settings.

Take Crysis 3, for example. The game can easily bring your graphics subsystem to its knees at the highest settings, and since Crysis 3 is a first-person shooter with very dynamic gameplay, tearing can be quite noticeable. In the example above, the FCAT output caught the transition between two frames; as you can see, the tree is cut clean in two.

On the other hand, when we forcibly disable V-sync in Skyrim, the tearing is not as pronounced. Note that in this case the frame rate is very high and several frames appear on screen with each scan. In these shots, the amount of movement per frame is relatively small. Playing Skyrim in this configuration has its problems and may not be optimal, but it does show that even with vertical sync disabled, the feel of the game can change.

For the third example we chose a shot over Lara Croft's shoulder in Tomb Raider, which shows a fairly clear tear in the image (also look at the hair and the strap of her shirt). Tomb Raider is the only game in our selection that lets you choose between double and triple buffering when vertical sync is enabled.

The last graph shows that Metro: Last Light with G-Sync at 144 Hz delivers generally the same performance as with vertical sync disabled. What the chart cannot show is the absence of tearing. If you use the technology with a 60 Hz screen, the frame rate will cap at 60 FPS, but there will be no stuttering or lag.

In any case, those of you (and we) who have spent countless hours on graphics benchmarks, watching the same run over and over, get used to them and can visually judge how good a given result is. That is how we measure the absolute performance of video cards. Changes in the picture with G-Sync active catch the eye immediately, because it has the smoothness of V-sync on without the tearing inherent in V-sync off. It's a shame we cannot show the difference in a video right now.

G-Sync Technology Overview | Game compatibility: almost perfect

Checking other games

We tested a few more games. Crysis 3, Tomb Raider, Skyrim, BioShock: Infinite and Battlefield 4 all visited the test bench. All of them except Skyrim benefited from G-Sync. The strength of the effect depended on the particular game, but once you saw it, you immediately realized you had been ignoring shortcomings that were there all along.

Artifacts can still appear. For example, the shimmering associated with aliasing is more noticeable with smooth motion. You will most likely want to set anti-aliasing as high as possible to remove the jagged edges that were less noticeable before.

Skyrim: special case

Skyrim's Creation graphics engine activates V-sync by default. To test the game at frame rates above 60 FPS, add the line iPresentInterval = 0 to one of the game's .ini files.
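For reference, here is a hedged sketch of how that edit could be scripted. The file name, path and [Display] section are assumptions based on common reports (Skyrim.ini in the user's "My Games" folder), so treat them as placeholders and back up the file before trying anything like this.

from pathlib import Path

# Hypothetical location; adjust to your install and back the file up first.
INI = Path.home() / "Documents" / "My Games" / "Skyrim" / "Skyrim.ini"

def disable_engine_vsync(ini_path: Path = INI) -> None:
    lines = ini_path.read_text().splitlines()
    replaced = False
    for i, line in enumerate(lines):
        if line.strip().lower().startswith("ipresentinterval"):
            lines[i] = "iPresentInterval=0"   # overwrite an existing entry
            replaced = True
    if not replaced:
        for i, line in enumerate(lines):
            if line.strip() == "[Display]":
                lines.insert(i + 1, "iPresentInterval=0")  # add it under [Display]
                break
    ini_path.write_text("\n".join(lines) + "\n")

# disable_engine_vsync()  # uncomment to apply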

Skyrim can therefore be tested three ways: in its initial state, letting the Nvidia driver "use the application settings"; with G-Sync enabled in the driver and the Skyrim settings left intact; and with G-Sync enabled and V-sync disabled in the game's .ini file.

The first configuration, with the monitor set to 60 Hz, showed a stable 60 FPS at ultra settings with a GeForce GTX 770, giving a smooth and pleasant picture. However, user input still suffers from latency, and side-to-side strafing revealed noticeable motion blur. Still, this is how most people play on a PC. You can of course buy a 144 Hz screen, and it really will eliminate the blur, but since the GeForce GTX 770 delivers roughly 90-100 frames per second, noticeable stutter appears when the engine oscillates between 144 and 72 FPS.

At 60 Hz, G-Sync has a negative effect on the picture, probably because vertical sync remains active, whereas the technology is meant to work with V-sync disabled. Side strafing (especially close to walls) now leads to pronounced stutter. This is a potential problem for 60 Hz G-Sync panels, at least in games like Skyrim. Fortunately, on the Asus VG248QE you can switch to 144 Hz mode, and then, despite the active V-sync, G-Sync works at that frame rate flawlessly.

Disabling vertical sync completely in Skyrim results in much smoother mouse control. However, tearing appears (not to mention other artifacts such as shimmering water). Turning on G-Sync still leaves stutter at 60 Hz, but at 144 Hz the situation improves significantly. Although we test the game with V-sync disabled in our video card reviews, we would not recommend playing it that way.

For Skyrim, perhaps the best solution is to disable G-Sync and play at 60 Hz, which gives a constant 60 frames per second at your chosen graphics settings.

G-Sync Technology Overview | Is G-Sync what you've been waiting for?

Even before we received our test sample of the Asus monitor with G-Sync, we were pleased that Nvidia is working on a very real problem affecting games, one for which no solution had yet been offered. Until now, you could turn vertical sync on or off to your taste, but either decision came with compromises that hurt the gaming experience. If you prefer to leave V-sync off until tearing becomes unbearable, you could say you are choosing the lesser of two evils.

G-Sync solves the problem by letting the monitor scan the screen at a variable frequency. Innovation like this is the only way to keep advancing our industry and maintain the technical edge of the personal computer over gaming consoles and other platforms. Nvidia will no doubt face criticism for not developing a standard that competitors could adopt. However, the company uses DisplayPort 1.2 for its solution, and as a result, just two months after the technology was announced, G-Sync ended up in our hands.

The question is: does G-Sync deliver everything Nvidia promised?

Three talented developers touting the qualities of a technology you have never seen in action can inspire anyone. But if your first experience with G-Sync is Nvidia's pendulum demo, you are sure to wonder whether such a huge difference is even possible, or whether the demo presents a special scenario that is too good to be true.

Naturally, when testing the technology in real games, the effect is less clear-cut. On one side there were exclamations of "Wow!" and "Incredible!", on the other, "I think I can see the difference." The impact of activating G-Sync is most noticeable when switching the display refresh rate from 60 Hz to 144 Hz. But we also ran the 60 Hz test with G-Sync to see what you will (hopefully) get from cheaper displays in the future. In some cases, simply going from 60 to 144 Hz will blow you away, especially if your graphics card can deliver high frame rates.

Today we know that Asus plans to bring G-Sync support to the Asus VG248QE, which the company says will sell for $400 next year. The monitor has a native resolution of 1920x1080 pixels and a 144 Hz refresh rate. The version without G-Sync has already won our Smart Buy award for outstanding performance. For us personally, though, the 6-bit TN panel is a drawback; we would like to see 2560x1440 pixels on an IPS panel, and would even settle for a 60 Hz refresh rate if that helped keep the price down.

Although we expect a whole raft of announcements at CES, we have heard no official Nvidia comments regarding other displays with G-Sync modules or their prices. We are also not sure of the company's plans for the upgrade kit that should let you install the G-Sync module in an already-purchased Asus VG248QE monitor in about 20 minutes.

For now we can say it was worth the wait. You will find that in some games the impact of the new technology is unmistakable, while in others it is less pronounced. Either way, G-Sync answers the age-old question of whether or not to enable vertical sync.

There is one more interesting thought. Now that we have tested G-Sync, how much longer can AMD avoid commenting? The company teased our readers in its interview (in English), noting that it would soon decide on such a capability; what if it already has something in the works? The end of 2013 and the beginning of 2014 promise plenty of interesting news to discuss: the Mantle version of Battlefield 4, Nvidia's upcoming Maxwell architecture, G-Sync, AMD's xDMA engine with CrossFire support, and rumors of new dual-GPU video cards. Right now we lack graphics cards with more than 3 GB (Nvidia) or 4 GB (AMD) of GDDR5 memory that cost less than $1000...

