AMD has become quite ambitious in the graphics card market lately, making significant strides against NVIDIA. The company aims to strengthen its position with new features in its graphics cards. One of the most prominent features of AMD's recent graphics cards is Variable Refresh Rate (VRR) technology.
AMD’s new technology significantly reduces power consumption
The AMD Radeon RX 6000 and RX 7000 series graphics cards, based on the RDNA 2 and RDNA 3 GPU architectures, were tested for this feature by ComputerBase. These tests focused on power consumption rather than on the cards' performance.
The tests revealed that when Variable Refresh Rate (VRR) is enabled, the power consumption of AMD Radeon cards decreases significantly during idle times. Using a 4K monitor with a refresh rate of 144 Hz, ComputerBase tested the Radeon RX 6800/6700 XT and RX 7900 XT, as well as the Intel Arc A770, NVIDIA GeForce RTX 3060 Ti, RTX 3080, and RTX 4080.
The efficiency tests covered desktop idle power consumption, dual-monitor power consumption, window movement, YouTube playback at 60 FPS with SDR, and YouTube playback at 60 FPS with HDR, all on a 4K 144 Hz monitor.
The results are quite remarkable. With VRR enabled, the Radeon RX 7900 XTX consumes 81 percent less power on a single monitor. In a dual-monitor setup, the same graphics card consumes 71 percent less power.
Meanwhile, the Radeon RX 6700 XT consumes 76 percent less power with a single monitor, and the Radeon RX 6800 XT shows a 79 percent reduction. This clearly demonstrates how well AMD's VRR technology performs.
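For readers wondering how a "percent less power" figure like those above is derived, it is a simple relative reduction between the measured wattage with VRR off and with VRR on. The sketch below illustrates the calculation; the wattage values are hypothetical placeholders, not ComputerBase's actual measurements.

```python
def percent_reduction(before_w: float, after_w: float) -> float:
    """Return the percentage drop in power draw from before_w to after_w."""
    return (before_w - after_w) / before_w * 100

# Hypothetical idle wattages chosen to match an 81% reduction:
print(round(percent_reduction(100.0, 19.0)))  # prints 81
```

So a card that idled at, say, 100 W without VRR would need to drop to roughly 19 W with it enabled to match the RX 7900 XTX's reported 81 percent figure.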
These tests make it evident that AMD graphics cards with VRR technology offer a significant advantage in energy consumption over competing brands. What are your thoughts on AMD's VRR technology and the power consumption of its Radeon graphics cards? Feel free to share your opinions in the comments section.