Features and Motherboard Layout
Features
Courtesy of GIGABYTE
- Supports AMD Ryzen™ 2nd Generation / Ryzen™ 1st Generation
- Dual Channel ECC/ Non-ECC Unbuffered DDR4, 4 DIMMs
- 10+2 Phase IR Digital PWM Design
- Fins-Array Heatsink & Direct Touch Heatpipe
- 2-Way CrossFire/ SLI Graphics Support with Dual Armor and Ultra Durable™ Design
- Intel® 802.11ac Wave2 2T2R WIFI & BT 5
- ALC1220-VB Enhance 114dB(Rear)/ 110dB(Front) SNR in Microphone and Built-in ESS SABRE DAC with WIMA Audio Capacitors
- Dual Ultra-Fast M.2 with NVMe PCIe X4 with Dual Thermal Guard
- RGB FUSION with Multi-zone LED Light Show Design, Supports Digital LED & RGB LED Strips
- Swappable Overlay for Accent LED
- Intel® Ethernet LAN with cFOS Speed Internet Accelerator
- USB DAC-UP 2 with Adjustable Voltage
- Integrated Base Plate & I/O Shield Armor
- Rear Power/Reset/Clear CMOS Button
- CEC 2019 Ready, Save Power with a Single Click
Motherboard Layout
The X470 AORUS Gaming 7 WIFI motherboard features the matte black PCB with silver plastic overlays and black heat sinks common to the latest generation of GIGABYTE AORUS boards. The plastic overlay sits over the rear panel and audio components, sprucing up the board's aesthetics while offering protection for the covered components. The CPU VRMs are cooled by two densely-finned heat sinks connected via a heat pipe. The board's ATX form factor provides more than enough surface area to house the integrated features, as well as giving the board compatibility with most available consumer enclosures.
The board's back is completely free of components, with the exception of small capacitors and power circuitry sitting in the center hole of the CPU backplate. GIGABYTE covered the upper left quadrant of the board's back with a plate, protecting the underside of the rear panel assembly and the VRMs. Further, this plate acts as a secondary heat dissipation path for the VRMs. If your cooler requires replacement of the stock backplate, the center CPU circuitry could become problematic if the replacement backplate does not have a hole in its center to compensate.
GIGABYTE integrated a 12-phase (10+2) digital power system into the X470 AORUS Gaming 7 WIFI board, capable of providing more than enough power to the CPU for whatever torture tests you choose to throw at it. Designed around the GIGABYTE Ultra Durable power system, the board includes IR digital power controllers and PowIRstage ICs, Server Level Chokes, and Durable Black capacitors in its digital power design. The CPU VRMs are cooled by two interconnected, densely-finned aluminum heat sinks to the top and right of the CPU socket. This design allows for optimized cooling of these critical components with minimal airflow.
The board's dual ATX12V power connectors are located to the upper right of the CPU socket, in between the two VRM heat sinks. GIGABYTE included both an 8-pin and 4-pin connector.
GIGABYTE designed the board with two M.2 slots, one in between the primary and secondary PCIe x16 slots and the other in between the secondary and tertiary PCIe x16 slots. Both M.2 slots include integrated port covers with thermal tape along the inside. In previous reviews, such plates were found to reduce M.2 card temperatures by over 10°C in a strictly passive configuration, with optimal cooling obtained when used in conjunction with active airflow over the port plate.
The M.2 slot in between the primary and secondary PCIe x16 slots (M2A_SOCKET) supports cards up to 110mm in length, while the other slot (M2B_SOCKET) supports cards up to 80mm in length. Both M.2 ports are keyed for M-key devices, with port M2A_SOCKET supporting standard SATA, PCIe x2, and PCIe x4 type M.2 drives and port M2B_SOCKET supporting PCIe x2 and PCIe x4 type M.2 drives. However, the 80mm M.2 port will only run M.2 drives with a maximum bandwidth of x2 because it is only PCIe Gen2 compliant. Note that port M2B_SOCKET and the tertiary PCIe x16 slot share bandwidth. The PCIe x16 slot is disabled with an M.2 drive seated in that port.
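For readers who want to verify which link an installed drive actually negotiated, the snippet below is a minimal sketch (assumptions: a Linux system and a drive that enumerates as nvme0, neither of which is specific to this board) that reads the live PCIe link attributes the kernel exposes through sysfs. A drive in the 80mm M2B_SOCKET would be expected to report the Gen2-limited link described above, while one in M2A_SOCKET should report a Gen3 x4 link.

```cpp
// Minimal sketch: read the negotiated PCIe link of an NVMe drive from sysfs.
// Assumptions: Linux, drive enumerated as nvme0 (adjust the path for your system).
#include <fstream>
#include <iostream>
#include <string>

// Read a single-line sysfs attribute, returning "unknown" on failure.
static std::string readAttr(const std::string& path)
{
    std::ifstream file(path);
    std::string value;
    if (!std::getline(file, value))
        return "unknown";
    return value;
}

int main()
{
    // "device" under the NVMe class node is a symlink to the underlying
    // PCIe function, which carries the link attributes.
    const std::string dev = "/sys/class/nvme/nvme0/device/";
    std::cout << "negotiated link speed: "
              << readAttr(dev + "current_link_speed") << '\n'
              << "negotiated link width: x"
              << readAttr(dev + "current_link_width") << '\n';
    return 0;
}
```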
The board has a total of four DIMM slots, with Dual Channel memory mode active with modules in slots 1 / 3 or 2 / 4. The board supports up to 64GB of memory running at a maximum speed of 3600MHz. Note that memory speeds above 2667MHz are considered overclocked speeds (for Ryzen 2nd Generation processors) and are outside of the official AMD stock memory speed specifications. The memory slots are steel-reinforced for extra strength. Further, GIGABYTE integrated RGB LEDs in between the slots, configurable via the UEFI or the RGB Fusion Windows applet.
Directly below the memory port block are a USB 3.1 Gen2 Type-C header, the 24-pin ATX power connector, three 4-pin system fan headers, a two-digit LED diagnostic display, the BIOS Select (BIOS_SW) and DualBIOS (SB) switches, the OC button, a 5-pin RGBW LED header, a 3-pin LED header, and an LED voltage jumper. The unlabeled headers to the right of the OC button are debug headers used for factory board validation testing. To the upper right of the DIMM slots are the CPU and secondary CPU 4-pin fan headers. The acrylic overlay directly below the DIMM slots glows when the board is actively powered, in line with the RGB LED settings from the UEFI or Windows RGB Fusion applet.
The 2-digit diagnostic display can be used for debugging system issues during system initialization. The displayed debug codes can be decoded using the table from the motherboard manual. The OC button enables BIOS-assisted automated overclocking of the board, based on factory configured settings. The 5-pin LED header supports use of either a 4-wire RGB LED strip or a 5-wire RGBW strip with the board. The BIOS Select switch forces the board to boot using the main BIOS (in the 1 position) or the backup BIOS (in the 2 position). The Dual BIOS enable switch enables the board's Dual BIOS functionality in the 1 position, or disables it in the 2 position (forcing the board into single BIOS mode).
The AMD X470 chipset is covered by a low-profile chromed black heat sink with an AORUS overlay. The overlay has an embedded RGB LED that can be configured via the UEFI or Windows RGB Fusion applet. In addition to the two PCIe M.2 slots, the board features six SATA III ports in the port block directly below the chipset heat sink. All SATA ports are usable in conjunction with drives seated in the M.2 ports and do not share bandwidth with other board-integrated devices or ports.
The board contains a total of five PCIe slots – three PCIe x16 slots and two PCIe x1 slots. The PCIe x16 slots can be used in x16 / 0 / x4 or x8 / x8 / x4 modes. Note that the third PCIe x16 slot shares bandwidth with the 80mm M.2 port and is automatically disabled with an M.2 device installed in that port.
The board's integrated audio components are covered by a black plastic overlay just above the PCIe ports, with a tribal-type design stamped into its surface. The audio subsystem lives on an isolated PCB to minimize line noise and distortion caused by other integrated components. There are RGB LEDs integrated into the white-colored insert that runs along the length of the overlay. These LEDs illuminate in accordance with the color scheme configured through the UEFI or the Windows RGB Fusion applet. To ensure balanced audio performance using headphones or external speakers, GIGABYTE designed the audio subsystem with the Realtek ALC1220-VB CODEC, a smart headphone amp, an ESS SABRE DAC, and Nichicon audio capacitors.
GIGABYTE integrated support for connecting both RGBW and single-color LED strips directly into the board using the LED and RGBW LED headers located in the upper left and lower right quadrants of the board. Connecting an RGB LED strip to a header synchronizes the LED strip's color and activity with that of the motherboard's integrated LEDs. The board is shown with an external RGB strip connected to the header in the board's upper left quadrant (by the CMOS battery). Notice that the board has multiple integrated RGB zones (in addition to the RGB headers) – integrated into the rear panel cover, lining PCIe x16 slots 1 and 3, integrated into the chipset heat sink, integrated into the audio cover, in between the DIMM slots, and underneath the acrylic overlay located just under the DIMM slots.
In addition to the 3-pin LED and 5-pin RGBW LED headers, the front panel audio header, S/PDIF audio header, LED Demo header, and a Trusted Platform Module (TPM) header are located in the upper left corner of the board.
Along the mid and lower left side of the board are the USB 2.0 headers, 4-pin system fan and pump headers, multiple USB 3.0 headers, the front panel header, the system status LEDs, and the CMOS reset jumper. The status LEDs give a visual indicator for troubleshooting, illuminating to indicate a malfunctioning device; LEDs show status for the CPU, memory, PCIe x16 slots, and the BIOS. The 4-pin water pump header can be used to power and monitor a standard 12V water pump – an AIO cooler's integrated coolant pump or a pump integrated into a DIY loop.
GIGABYTE chose to integrate an attached rear panel shield into the rear panel assembly. The rear panel plate is a matte black base with white colored port text, making it easy to read under most lighting conditions. The X470 AORUS Gaming 7 WIFI motherboard contains the following integrated ports in its rear panel assembly: two USB 2.0 ports (black), six USB 3.0 ports (blue / yellow), two USB 3.1 10Gbps ports (red) – one Type A and one Type C, an Intel I219-V GigE RJ-45 port (above the USB 2.0 ports), dual antennae ports for the 802.11ac controller, Clear CMOS and system power buttons, an S/PDIF digital audio output port, and five analogue audio output ports. The yellow colored USB 3.0 ports provide low noise power with minimal voltage fluctuation using the DAC-UP 2 functionality, specifically designed for use with VR devices.
“Support for NVIDIA® Quad-GPU SLI™ and 2-Way NVIDIA® SLI™ technologies
Support for AMD Quad-GPU CrossFire™ and 2-Way AMD CrossFire™ technologies”
With only 3 PCIe x16 slots (whatever the electrical configuration), how is Quad-GPU SLI/CF support possible on this MB? Can this board somehow be plugged into the DeLorean and initiate time travel? It sure has enough LED bling to qualify as a prop for a 1980s sci-fi comedy.
No, you Dr. Emmett Brown wannabe. Quad SLI/CrossFire is for cards that have TWO GPUs on each card, like the Titan Z (must say "Titan Z" with a heavy German accent) or the 295X2.
Really, the drivers are mostly going to abstract away ze dual GPUs on ze one PCIe card, so that's not what CF/SLI is about. AMD's CF uses XDMA while Nvidia uses a hardware bridge. But you are still wrong about this MB, as it has only 3 PCIe x16 slots (whatever the electrical configuration), and some folks in the past have had four of those dual-GPU-on-one-card SKUs in a single system. This MB cannot support 4 different cards at the same time, so that's just BS on your part!
CF and SLI are still not very good at multi-GPU load balancing, but maybe with DX12/Vulkan, the explicit GPU Multi-Adapter model managed by these new graphics APIs, and some game programmers that are competent and not whining script kiddies, there can be more progress. It should not be a problem for most GPUs that can work with DX12/Vulkan to have proper APIs developed to hold the script kiddies' hands and automate the process of proper GPU load balancing under DX12/Vulkan or even Apple's Metal. Poor little "programmers", so wedded to OpenGL's complex and software-abstracted state machine design that they cannot deal with any GPU metal. But that's OK, as there will be middleware and game engine SDKs to help.
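As an aside on what "explicit GPU Multi-Adapter" actually means for the programmer, here is a minimal, hedged D3D12 sketch (it assumes a Windows SDK with the d3d12/dxgi headers and is illustrative only, not code from this review or from GIGABYTE): under the explicit model the application itself enumerates every GPU and creates an independent device per adapter, and any load balancing across them is then the engine's job rather than the driver's.

```cpp
// Minimal sketch of D3D12 explicit multi-adapter enumeration.
// Assumptions: Windows 10 SDK headers, linked against d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory))))
        return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;

    // The application, not the driver, walks every adapter the OS exposes.
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        // One independent D3D12 device per physical GPU.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            wprintf(L"GPU %u: %s (%zu MB dedicated VRAM)\n",
                    i, desc.Description,
                    static_cast<size_t>(desc.DedicatedVideoMemory >> 20));
            devices.push_back(device);
        }
    }

    // From here, how work is split across `devices` (alternate-frame rendering,
    // split-frame, asymmetric post-processing, etc.) is entirely up to the
    // game or engine; the API no longer hides the extra GPUs behind SLI/CF.
    wprintf(L"Found %zu D3D12-capable GPUs\n", devices.size());
    return 0;
}
```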
CF/SLI is not so good for games because of all the single-threaded latency issues in dealing with multiple GPUs, but really GPUs are parallel beasts and newer CPUs are getting way more cores and threads on mainstream SKUs. So with proper programmers and DX12/Vulkan/etc. that can be fixed over time. Nvidia sure is not receptive to more than 2 GPUs for SLI, and AMD maybe needs to go back to using bridge connectors instead of XDMA and make use of Infinity Fabric instead. Nvidia has NVLink that it could speak across its bridge connectors, but Nvidia appears to not be as interested in multi-GPU usage for gaming just yet.
The entire gaming/game engine industry mostly is not taking the time to properly hide the latency issues in their games and is relying too much on the CPU and GPU makers to throw ever more powerful hardware their way, so they do not have to worry about optimizing PC games as much as the console game/engine makers have to in order to eke out every last bit of performance on those consoles' relatively weak hardware.
Really, both AMD and Nvidia maybe need to slow down on the new hardware features and spend more time optimizing their GPUs' firmware/driver and API support, but Nvidia makes loads of dosh with its new hardware sales at the expense of its older GPU hardware, while AMD open sourcing most of its Vulkan driver development may see some older AMD hardware (GCN 1.2/later) continue to net performance gains over time.
Poor AMD (at the time) bit off more than they could chew trying to get that implicit primitive shader API layer to work for legacy games that are not written to take advantage of the explicit primitive shader hardware in AMD's Vega GPU micro-arch. But game engine makers are still free to target Vega's explicit hardware primitive shaders, even if that's not going to catch on as soon as AMD had hoped for PC gaming. Maybe the open source community can get around to targeting Vega's explicit primitive hardware shaders, or that Chinese console maker that's using that new AMD semi-custom Zen/Vega APU. Once the console makers switch over to all Zen/Vega based console hardware, you can be damned sure that they will target Vega's explicit hardware primitive shaders and Rapid Packed Math/etc.
The marketing wank is “NVIDIA Quad-GPU SLI”. It is not “NVIDIA® Quad-card SLI”. How do you get Quad-GPU SLI on a system that features 2-way SLI? Get two graphics cards with two GPUs each, and there you have Quad-GPU SLI. Also, from the horse's mouth: http://www.nvidia.com/object/slizone_quadsli.html
So yes, that “Dr. Emmett Brown wannabe” is right, you annoying brat…
Oopsie, the “Anonymousnameisalreadyused” was right, not the “Dr. Emmett Brown wannabe”… Argh…
Thanks for the review Morry. Do you know what the 'EDC %' is at stock and when overclocking? Ryzen Master monitors this metric.
I am running a 2700X on an ASRock Fatal1ty mini-ITX X470 in an In Win 901 case, and at stock 'EDC' is hitting its max, so I am assuming that is why it is stuck at around 3900MHz on all cores when running Cinebench.
It could be temps as well, but the Noctua cooler I am using is excellent, and it was the same with the Cooler Master AIO I tried before the Noctua.
I think the issue is that the VRM is not beefy enough to fully max out the CPU, because I believe 'EDC' is the max current the VRM is able to handle.
Just a slight correction you might want to make in the Features and Motherboard Layout section. I was a bit confused when I read the below, so I double-checked it in the manufacturer's manual.
Note that the port M2A_SOCKET and the tertiary PCIe x16 slot share bandwidth. The PCIe x16 slot is disabled with an M.2 drive seated in that port.
This should read that the “M2B_SOCKET and the tertiary PCIe x16 slot share bandwidth.”
Sourced from the manufacturer's manual, Page 7, Expansion Slots section:
1 x PCI Express x16 slot, running at x4 (PCIEX4)
* The PCIEX4 slot becomes unavailable when a device is installed in the M2B_SOCKET connector.
Hope this clears up any confusion.
Thanks for pointing this out. It has been updated…
Any thoughts on getting around the M.2 80mm slot performance problem by using a PCIe 3.0 compliant adapter card in the second x16 slot? I know this would drop the first two slots to x8 speeds, but most real-world benchmarking seems to suggest only a little performance loss overall if a graphics card is in the first slot?
Anyone think it’s worth the trade off?
Worth it if you need to run two or more M.2 drives in RAID mode. You won't see much if any performance loss between x16 and x8 on the video card unless you are running 4K, most likely…
Thanks for the reply on this one Morry.
One more question I had was around RAM for this board. Given what you noted in the review about memory speeds, is there much point in going above DDR4-3200? I'm planning to overclock my Ryzen 2700X to around 4.2 GHz paired with a GTX 1080 Ti. I had been looking at some Corsair Vengeance DDR4-3600 up until I read through the review. Thoughts?
No, not much point going above stock speeds on memory; you see little improvement performance-wise. Best to try to maximize your core speeds…
Appreciate the quick reply again Morry!