PCIe 4 in a PCIe 3 Slot?
Conclusion: Upgrade to PCIe Gen 4 with Trenton Systems – Only you can make the decision to upgrade to PCIe 4.0, and we’ll help you figure out whether it’s the right choice for you during every step of the decision-making process. To learn more about the ins and outs of PCIe 4.0, check out our post PCIe Gen 4 vs. Gen 3 Slots, Speeds. For more information on PCIe in general, take a look at our introductory PCIe blog post.
The fact of the matter is that if you’re getting the performance you need from PCIe Gen 3 right now, it’s probably not quite time to upgrade to PCIe 4.0. You should start considering it soon, though, as PCIe 4.0 will become an industry standard before you know it.
Trenton Systems is currently testing PCIe 4.0 components and devices in-house for its secure, made-in-USA computing solutions, to ensure that, once available, its PCIe 4.0 motherboards integrate seamlessly with customer platforms with as little friction as possible.
Can you put PCIe 4 in a PCIe 3 slot?
What is PCIe? – At its most basic, PCIe is an interface that lets you connect high-speed components, such as add-on chips, memory, graphics cards and storage, to your motherboard. PCIe cards and slots come in five sizes: x1, x2, x4, x8 and x16.
- These designations indicate the number of lanes available for data to travel to and from the peripheral.
- One side of each lane sends data, and the other side receives it.
- A PCIe 4.0 x8 interface can handle PCIe 1.0, 2.0, 3.0 and 4.0 devices with up to eight lanes.
You’ll be able to do more with less in this case, since a PCIe 4.0 x8 slot offers the same bandwidth as a PCIe 3.0 x16 slot. Another helpful aspect of PCIe devices is that they are backward and forward compatible, so a PCIe 2.0 x2 card will still work in a PCIe 4.0 x8 slot, as the sketch below illustrates.
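To make the lane arithmetic concrete, here is a minimal Python sketch (the helper names are ours, purely for illustration) using the roughly 1 GB/s-per-lane Gen 3 and 2 GB/s-per-lane Gen 4 figures:

```python
# Approximate unidirectional bandwidth per lane, in GB/s, by generation.
PER_LANE_GBPS = {"1.0": 0.25, "2.0": 0.5, "3.0": 1.0, "4.0": 2.0}

def link_bandwidth_gbps(gen: str, lanes: int) -> float:
    """Approximate unidirectional bandwidth of a generation/lane combination."""
    return PER_LANE_GBPS[gen] * lanes

# "Do more with less": a Gen 4 x8 link matches a Gen 3 x16 link.
assert link_bandwidth_gbps("4.0", 8) == link_bandwidth_gbps("3.0", 16)  # 16 GB/s

# Compatibility: a link trains to the highest generation and widest width
# that BOTH sides support. (String comparison is fine here because all
# generation numbers are single digits.)
def negotiated_link(slot: tuple, card: tuple) -> tuple:
    (slot_gen, slot_lanes), (card_gen, card_lanes) = slot, card
    return min(slot_gen, card_gen), min(slot_lanes, card_lanes)

# A Gen 2 x2 card in a Gen 4 x8 slot simply runs as a Gen 2 x2 link:
print(negotiated_link(("4.0", 8), ("2.0", 2)))  # -> ('2.0', 2)
```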
Is PCIe 4 backward compatible with PCIe 3?
Is PCIe Gen 4 backward compatible? – PCIe Gen 4 is backward compatible, so a PCIe Gen 4 device connected to a PCIe Gen 3 system will function normally at PCIe Gen 3 speeds. That means if you purchase a PCIe Gen 4 NVMe SSD today, you can use it immediately in your current system even if it does not support PCIe Gen 4.
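If you want to verify what link a device has actually trained to, Linux exposes this through standard sysfs attributes. A minimal sketch follows; the device address 0000:01:00.0 is a placeholder (find yours with lspci), and the exact value strings vary slightly between kernel versions:

```python
# Read the negotiated vs. maximum PCIe link speed/width of one device
# from Linux sysfs (works for GPUs, NVMe SSDs, NICs, etc.).
from pathlib import Path

def pcie_link_status(bdf: str) -> dict:
    dev = Path("/sys/bus/pci/devices") / bdf
    attrs = ("current_link_speed", "max_link_speed",
             "current_link_width", "max_link_width")
    return {a: (dev / a).read_text().strip() for a in attrs}

# A Gen 4 NVMe SSD sitting in a Gen 3 slot would typically report a
# current_link_speed of '8.0 GT/s PCIe' against a max of '16.0 GT/s PCIe'.
print(pcie_link_status("0000:01:00.0"))  # placeholder address
```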
Can I use PCIe 4.0 in a 3.0 slot? (Reddit)
Yes. It won’t give you PCIe 4.0 speeds, though.
Can I use PCIe 4.0 in 3.0? (Reddit)
Performance loss is negligible and not noticeable in any way; you can go ahead with it. Not at all: PCIe 4.0 cards are backwards compatible with PCIe 3.0.
Is PCIe 3.0 a bottleneck for RTX 3080?
From the EVGA forums thread “Am I bottlenecking running PCIe 3.0 with a 3080 Ti?” (June 2021):

truels2: So I’m looking at CPU-Z and realize my new 3080 Ti supports PCIe 4.0, yet my motherboard, which is only a couple of months old, only supports PCIe 3.0. How much am I bottlenecking my card? Should I be looking to upgrade to a 4.0 board? Seems like the upgrading never stops!

justin_43: A PCIe 3.0 slot will not bottleneck the card. But what CPU are you running? That might be a different story.

EVGA_JacobF (EVGA Product Manager; marked as Best Answer): Nope, PCI-E 3.0 x16 is still plenty of bandwidth.

Jstandaert: The gain in frames from 3.0 to 4.0 isn’t worth it for most.

scott91575: The only gains to be had with 4.0 right now are for NVMe drives. Even then, most people won’t notice the difference in everyday tasks.

truels2: I have an i9-10900KF at 3.7GHz, not overclocked, just factory default settings.

justin_43: You are more than fine.

truels2: Sweet, thanks for the info.

aka_STEVE_b: PCIe 3.0 is still not fully saturated by graphics outputs. You will be good.

(Superclocked Member): PCIe 3 will be enough for years and years from now.

mdb983: Even PCIe 3 x8 should be fine.

Zixinus: It isn’t the age of the mobo but its specifications that tell you whether it is PCIe 4 or not. You may also want to check your BIOS settings (consult your manual!) and configuration too; it is possible that you are running in 3.0 mode for some reason. However, using PCIe 3 isn’t going to be a huge loss. Maybe you’ll lose a percentage point of performance, but if you have a 3080 Ti, you should have plenty to spare.

oletorius: Keep that bad boy. I have a very similar rig. It’s fine. 🙂

Edwin405: Should be OK, no bottleneck. PCIe Gen 4 is for NVMe M.2 drives, since those run at faster speeds. And the card with the CPU you have is a good combo.

Gogod2020: The only things that will bottleneck you are the CPU (huge impact), RAM (some impact), PSU (stability impact) and temperatures (clocks impact). PCIe 3.0 vs 4.0 will not do anything at all at the moment, and for the next few years, give or take.

(SSC Member): I have a 9900KS at 5GHz+ on a Z390 board (PCIe 3.0) and a Ryzen 5 3600XT at 4.5GHz+ on an X570 board (PCIe 4.0), and at 3440x1440 there is virtually no difference in FPS with a 3080 Ti in the games I play.

kevinc313: x16 PCIe Gen 3.0 is fine for any current top-end GPU. However, x8 PCIe Gen 4.0 is also fine, which frees up eight PCIe lanes off the CPU to be used for other things.

kevinc313 (on “even PCIe 3 x8 should be fine”): It will run, but there is a tested hit to performance.

Kokin: It’s been tested so many times by numerous sources on the internet. It’s a 0-3% performance loss, and even then it depends on the application used. That’s a margin-of-error type of difference and is negligible from a normal person’s perspective. PCIe 4.0 can easily be generalized to not make a difference unless it’s NVMe performance, and we’re only talking about a 3080 Ti.
Will PCIe 3.0 bottleneck RTX 3060?
No. The card will work fine in a PCIe 3.0 slot, and PCIe 3.0 won’t be a bottleneck for it: basically no existing mainstream consumer GPU can fully saturate 16 PCIe Gen 3.0 lanes. Newer graphics cards that support PCIe 4.0 simply run at PCIe 3.0 speeds when slotted into a PCIe 3.0 slot, with little to no measurable performance loss.
Will a 4080 work on PCIe 3?
Should you buy and upgrade to the RTX 40-series? – If you’re on a 10-series or 20-series card, yes: now is the time to upgrade. If you’re on a 30-series card, no, unless you absolutely need to remain on the cutting edge. These cards are built for PCIe 4.0 or PCIe 5.0.
Even though 3.0 is forward-compatible, if you’re planning on making the leap to the RTX 4080 or any of its siblings, you’re going to be upgrading the whole platform anyway: motherboard, CPU and so on. It’s time. Let it go. For those on the 30-series, just don’t bother. The Ti variants might be worth it, but the jump at the consumer level (that’s you) is not worth the hassle of having to get, install and pay out the nose for yet another new GPU.
So if you recently upgraded or built a machine and are now kicking yourself, don’t. New graphics cards will not magically make your older card slower.
Is PCIe 3 obsolete?
What is PCI Express? – The Peripheral Component Interconnect Express (PCI Express or PCIe) is a high-speed interface standard for connecting additional graphics cards (GPUs), Local Area Network (LAN) ports, NVME solid-state drives (SSDs), Universal Serial Bus (USB) ports and other hardware to a computer’s motherboard.
- This is accomplished using expansion cards, also known as add-on cards.
- Simply put, the PCI Express interface allows for the expansion of a motherboard beyond its default GPU, network and storage configurations.
- The Peripheral Component Interconnect Special Interest Group (PCI-SIG), comprised of big-name technology companies like Intel, IBM, Dell, HP, AMD and NVIDIA, introduced the first generation of PCI Express, entitled PCIe 1.0, in 2003.
PCIe 2.0 and 3.0 were released in 2007 and 2010, respectively. PCIe 4.0 came out in 2017, and PCI-SIG’s latest generation, PCIe 5.0, debuted in 2019. The PCI Express interface is actualized through PCIe slots, which vary in type depending on a motherboard’s chipset.
Photo: A motherboard showcasing the different PCIe slot configurations, as well as Peripheral Component Interconnect (PCI) slots, which are now obsolete. Credit: CCBoot
For example, PCIe 3.0 x4 refers to a Gen 3 expansion card or slot with a four-lane configuration.
PCI Express: Unidirectional Bandwidth in x1 and x16 Configurations

| Generation | Year of Release | Data Transfer Rate | Bandwidth x1 | Bandwidth x16 |
|---|---|---|---|---|
| PCIe 1.0 | 2003 | 2.5 GT/s | 250 MB/s | 4.0 GB/s |
| PCIe 2.0 | 2007 | 5.0 GT/s | 500 MB/s | 8.0 GB/s |
| PCIe 3.0 | 2010 | 8.0 GT/s | 1 GB/s | 16 GB/s |
| PCIe 4.0 | 2017 | 16 GT/s | 2 GB/s | 32 GB/s |
| PCIe 5.0 | 2019 | 32 GT/s | 4 GB/s | 64 GB/s |
| PCIe 6.0 | 2021 | 64 GT/s | 8 GB/s | 128 GB/s |
Table: PCI-SIG introduced the first generation of PCI Express in 2003. With each new generation comes a doubling of data transfer rate and total bandwidth per lane configuration, the latter of which is expressed in both unidirectional and bidirectional measurements, depending on the source.
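Where the bandwidth columns come from is worth a brief detour: the raw transfer rate (GT/s) is reduced by line encoding. PCIe 1.0 and 2.0 use 8b/10b encoding (10 bits on the wire for every 8 bits of data), while PCIe 3.0 onward use the much leaner 128b/130b scheme. A minimal Python sketch of that conversion (helper names are ours for illustration):

```python
# Per-lane unidirectional bandwidth derived from raw transfer rate and
# line encoding. 1 GT/s moves one gigabit per second across the wire.
RATE_GTS = {"1.0": 2.5, "2.0": 5.0, "3.0": 8.0, "4.0": 16.0, "5.0": 32.0}
ENCODING = {"1.0": (8, 10), "2.0": (8, 10),        # 8b/10b
            "3.0": (128, 130), "4.0": (128, 130),  # 128b/130b
            "5.0": (128, 130)}
# (PCIe 6.0 switches to PAM4 signalling and FLIT-based encoding,
#  so this simple ratio no longer applies.)

def per_lane_mb_per_s(gen: str) -> float:
    payload, wire = ENCODING[gen]
    usable_gbit = RATE_GTS[gen] * payload / wire  # usable gigabits/s
    return usable_gbit * 1000 / 8                 # gigabits/s -> MB/s

print(per_lane_mb_per_s("1.0"))  # 250.0 MB/s, as in the table
print(per_lane_mb_per_s("3.0"))  # ~984.6 MB/s, rounded to 1 GB/s above
print(per_lane_mb_per_s("4.0"))  # ~1969 MB/s, i.e. the ~2 GB/s x1 figure
```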
- To find the total unidirectional bandwidth for each lane configuration, simply multiply the x1 bandwidth listed in the table above by two, four, eight or 16.
- Multiply the result by two to get the total bidirectional bandwidth.
- Source: PCI-SIG
For example, PCIe 1.0 has a 250 MB/s bandwidth in the one-lane configuration, a 500 MB/s bandwidth in the two-lane, a 1 GB/s bandwidth in the four-lane, a 2 GB/s bandwidth in the eight-lane and a 4 GB/s bandwidth in the 16-lane.
It’s important to note as well that these lane-specific bandwidths are often doubled to account for bidirectional travel, or data traveling to and from each lane. Furthermore, each new generation of PCIe typically doubles its predecessor’s data rate and bandwidth for each configuration.
- For example, PCIe 1.0 has a 2.5 GT/s data rate and a 250 MB/s bandwidth in the one-lane configuration, while the one-lane configuration for PCIe 2.0 supports a 5.0 GT/s data rate and a 500 MB/s bandwidth, and so forth.
- But PCIe 1.0 and PCIe 2.0 are outdated.
- Today, PCIe 3.0 is a motherboard standard, at least until the industry universally adopts PCIe 4.0 and eventually PCIe 5.0.
And by that point, PCI-SIG will have rolled out the next generation, PCIe 6.0, which is expected in 2021. As with any new technology, it can take computer hardware manufacturers some time to begin standardizing their motherboards with the latest PCI Express generation.
Does PCIe 4.0 matter for 3080?
If you haven’t already read our RTX 3080 review, I would recommend doing so first. That will give you an idea of how this new GPU compares to the likes of the RTX 2080 Ti, GTX 1080 Ti and more. In this article, we are only looking at the performance of the RTX 3080, tested on different CPU platforms.
Here, we are comparing three sets of data. First of all, the data we used for our review: the RTX 3080 Founders Edition tested with an Intel Core i9-10900K overclocked to 5.1GHz, restricting us to PCIe 3.0 due to the Z490 platform. To give us an idea of PCIe Gen4 performance with this GPU, we turn to AMD’s Ryzen 9 3900XT and the X570 platform.
As a final data set, we also force PCIe 3.0 while testing with the Ryzen 9 3900XT.
| GPU | RTX 3090 | RTX 3080 | RTX 2080 Ti (FE) | RTX 2080 SUPER | RTX 2080 (FE) |
|---|---|---|---|---|---|
| SMs | 82 | 68 | 68 | 48 | 46 |
| CUDA Cores | 10496 | 8704 | 4352 | 3072 | 2944 |
| Tensor Cores | 328 | 272 | 544 | 384 | 368 |
| RT Cores | 82 | 68 | 68 | 48 | 46 |
| Texture Units | 328 | 272 | 272 | 192 | 184 |
| ROPs | 112 | 96 | 88 | 64 | 64 |
| GPU Boost Clock | 1695 MHz | 1710 MHz | 1635 MHz | 1815 MHz | 1800 MHz |
| Memory Data Rate | 19.5 Gbps | 19 Gbps | 14 Gbps | 15.5 Gbps | 14 Gbps |
| Total Video Memory | 24GB GDDR6X | 10GB GDDR6X | 11GB GDDR6 | 8GB GDDR6 | 8GB GDDR6 |
| Memory Interface | 384-bit | 320-bit | 352-bit | 256-bit | 256-bit |
| Memory Bandwidth | 936 GB/sec | 760 GB/sec | 616 GB/sec | 496.1 GB/sec | 448 GB/sec |
| TGP | 350W | 320W | 260W | 250W | 225W |
Driver Notes
- All Nvidia GPUs (except RTX 3000) were benchmarked with the 452.06 driver.
- RTX 3080 was benchmarked with the 456.16 driver supplied to press.
- All AMD GPUs were benchmarked with the Adrenalin 20.8.2 driver.
Test Systems
Our first test system is a custom-built rig from PCSpecialist, based on Intel’s latest Comet Lake-S platform. You can read more about it, and configure your own system, over at PCSpecialist.
| Component | Specification |
|---|---|
| CPU | Intel Core i9-10900K overclocked to 5.1GHz on all cores |
| Motherboard | ASUS ROG Maximus XII Hero Wi-Fi |
| Memory | Corsair Vengeance DDR4 3600MHz (4 x 8GB) CL 18-22-22-42 |
| Graphics Card | Varies |
| System Drive | 500GB Samsung 970 Evo Plus M.2 |
| Games Drive | 2TB Samsung 860 QVO 2.5″ SSD |
| Chassis | Fractal Meshify S2 Blackout Tempered Glass |
| CPU Cooler | Corsair H115i RGB Platinum Hydro Series |
| Power Supply | Corsair 1200W HX Series Modular 80 Plus Platinum |
| Operating System | Windows 10 2004 |
We also tested with a purpose-built AMD Ryzen machine to enable PCIe 4.0 testing. This one uses a Ryzen 9 3900XT on the X570 platform.
| Component | Specification |
|---|---|
| CPU | Ryzen 9 3900XT overclocked to 4.4GHz on all cores |
| Motherboard | ASUS TUF Gaming X570-Plus Wi-Fi |
| Memory | Corsair Vengeance DDR4 3600MHz (4 x 8GB) CL 18-22-22-42 |
| Graphics Card | Varies |
| System Drive | 500GB Samsung 960 Evo M.2 |
| Games Drive | 1TB Kingston KC600 2.5″ SSD |
| Chassis | Fractal Meshify S2 Tempered Glass |
| CPU Cooler | Corsair H115i Pro RGB Hydro Series |
| Power Supply | Corsair 1200W HX Series Modular 80 Plus Platinum |
| Operating System | Windows 10 2004 |
Software and Games List
- 3DMark Fire Strike & Fire Strike Ultra (DX11 Synthetic)
- 3DMark Time Spy (DX12 Synthetic)
- 3DMark Port Royal (DXR Synthetic)
- Control (DX12)
- Death Stranding (DX12)
- The Division 2 (DX12)
- Far Cry New Dawn (DX11)
- Gears 5 (DX12)
- Ghost Recon: Breakpoint (Vulkan)
- Horizon Zero Dawn (DX12)
- Metro: Exodus (DX12)
- Middle Earth: Shadow of War (DX11)
- Red Dead Redemption 2 (Vulkan)
- Shadow of the Tomb Raider (DX12)
- Total War Saga: Troy (DX11)
We run each benchmark/game three times and present the mean averages in our graphs, recording average frame rates as well as 1% low values across the three runs.
3DMark Fire Strike is a showcase DirectX 11 benchmark designed for today’s high-performance gaming PCs.
So far, there’s not much to report with these synthetic tests. The Intel system outperforms the 3900 XT PCIe 4.0 system by 5% in Fire Strike, while remaining 1% faster in the rest of the benchmarks. We see no effective difference between the 3900XT Gen4 and Gen3 results.
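A quick note on these metrics before the game results: both the averages and the 1% lows are derived from raw per-frame render times. Here is a rough sketch of one common way to compute them, assuming a list of frametimes in milliseconds (capture tools differ in the exact definition of the 1% low):

```python
# Average FPS and "1% low" FPS from raw frametimes (milliseconds).
def fps_metrics(frametimes_ms: list) -> tuple:
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    # One common definition: average FPS over the slowest 1% of frames
    # (other tools use the 99th-percentile frametime instead).
    slowest = sorted(frametimes_ms, reverse=True)
    worst_1pct = slowest[:max(1, len(slowest) // 100)]
    low_fps = 1000 * len(worst_1pct) / sum(worst_1pct)
    return avg_fps, low_fps

# 297 smooth ~7 ms frames with three 20 ms hitches:
frames = [7.0] * 297 + [20.0] * 3
avg, low = fps_metrics(frames)
print(f"{avg:.0f} fps average, {low:.0f} fps 1% low")  # ~140 avg, 50 low
```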
Our first game of the day is Control, and here we do see some difference at 1080p. The 3900XT PCIe 4.0 system outperforms its PCIe 3.0 equivalent by 3%, while it has a 1% lead over the Intel system too. These are fine margins though, and as we increase the resolution, the results only get tighter.
At 1440p, there’s just 2FPS separating all three systems, while that shrinks to just 1FPS at 4K.
Death Stranding is an action game developed by Kojima Productions. It is the first game from director Hideo Kojima and Kojima Productions after their split from Konami in 2015. It was released by Sony Interactive Entertainment for the PlayStation 4 in November 2019 and by 505 Games for Windows in July 2020.
(Wikipedia). Engine: Decima. We test using the Very High preset, with TAA, DX12 API.
- Moving onto Death Stranding, here we have a good example of Intel outperforming AMD for 1080p gaming, as our Intel system pulls 10% ahead of the 3900XT Gen4 test data.
That difference is eliminated at 1440p resolution, and there we can see the AMD Gen4 system edging ahead of the AMD Gen3 system by 5FPS, or 3%. At 4K, there’s a 2FPS/2% lead for the Gen4 data as well.
Tom Clancy’s The Division 2 is an online action role-playing video game developed by Massive Entertainment and published by Ubisoft.
Next up is The Division 2. At 1080p, our Intel test system is 6% faster than the Ryzen 3900XT with PCIe 4.0. Moving to PCIe 3.0 on the AMD system does make a small difference though, but only 2%. At 1440p, Intel remains 3% faster than AMD, while that margin remains consistent at 4K.
AMD’s PCIe 4.0 results are 1% better than their PCIe 3.0 results at 1440p, but there’s no difference at 4K.
Far Cry New Dawn is an action-adventure first-person shooter developed by Ubisoft Montreal and published by Ubisoft. The game is a spin-off of the Far Cry series and a narrative sequel to Far Cry 5.
It was released for Microsoft Windows, PlayStation 4 and Xbox One on February 15, 2019. (Wikipedia). Engine: Dunia 2. We test using the Ultra preset, with the HD Textures pack, DX11 API.
Far Cry New Dawn is the perfect example of a game that doesn’t give a toss about PCIe 4.0, instead the CPU is very much the bottleneck here. At 1080p, our Intel system is 21% faster than either AMD system, with PCIe 4.0 making no difference at all. As we step up to 1440p, Intel is still 16% faster than either of the 3900XT test rigs, and even at 4K performance is best on Team Blue to the tune of 5%.
The performance trend in Gears 5 isn’t that different to Far Cry. Here, Intel’s gaming superiority at 1080p comes to the fore again, with our 10900K system outperforming the AMD systems by 14%. There is a single FPS difference between AMD with PCIe Gen4 and Gen3, too, but that’d be well within margin of error.
At 1440p, the Intel test system is still providing better results than our PCIe Gen4 data, though the margin has shrunk to 7%. Lastly, at 4K, there’s just a 3FPS difference between the results, regardless of CPU and PCIe bandwidth.
Tom Clancy’s Ghost Recon Breakpoint is an online tactical shooter video game developed by Ubisoft Paris and published by Ubisoft.
The game was released worldwide on 4 October 2019, for Microsoft Windows, PlayStation 4 and Xbox One, (Wikipedia). Engine: AnvilNext 2.0. We test using the Very High preset, with AA disabled, Vulkan API.
It’s more of the same in Ghost Recon: Breakpoint, albeit on a slightly smaller scale. The Intel system outperforms AMD with PCIe Gen4 across every resolution we tested – there’s certainly not much in it at 1440p and 4K, but the overall performance crown remains with Intel.
We can see AMD’s system doing better with PCIe 4.0 than with PCIe 3.0, but only by a 2% margin at most, resulting in just an extra 3FPS at 1440p.
Horizon Zero Dawn is an action role-playing game developed by Guerrilla Games and published by Sony Interactive Entertainment. The plot follows Aloy, a hunter in a world overrun by machines, who sets out to uncover her past.
It was released for the PlayStation 4 in 2017 and Microsoft Windows in 2020. (Wikipedia). Engine: Decima. We test using the Ultimate Quality preset, DX12 API.
Horizon Zero Dawn is a game we introduced specifically for this testing, as we saw a number of reports suggesting the game benefitted significantly from increased PCIe bandwidth – at least when moving from PCIe 3.0 x8 to x16. We wanted to see if that carried over to PCIe Gen4 with the RTX 3080.
The data certainly suggests some benefit for PCIe 4.0, but not a lot. At 1080p, PCIe 4.0 offers a 4% performance improvement over PCIe 3.0 when both tested on AMD’s X570 platform. Despite that, our Intel system still records the highest frame rates, with an 8% lead over the 3900XT system using PCIe Gen4.
At 1440p, the AMD system with PCIe 4.0 outperforms its PCIe 3.0 equivalent by 3%, though the 10900K system is itself 3% ahead. At 4K, the margins shrink to just 2FPS regardless of the test system used.
Metro Exodus is a first-person shooter video game developed by 4A Games and published by Deep Silver in 2019.
Metro Exodus is another game that cares far more about the CPU than it does PCIe bandwidth. Our Intel system takes the top spot across every resolution tested, including a significant 24% performance advantage at 1080p. There’s no difference between the two AMD systems, except at 4K where the 1FPS advantage for the PCIe 4.0 data is again well within margin of error.
Middle-earth: Shadow of War is an action role-playing video game developed by Monolith Productions and published by Warner Bros. Interactive Entertainment. It is the sequel to 2014’s Middle-earth: Shadow of Mordor, and was released worldwide for Microsoft Windows, PlayStation 4, and Xbox One on October 10, 2017.
(Wikipedia). Engine: LithTech Firebird. We test using the Very High preset, DX11 API.
- Up next is Middle-earth: Shadow of War, and there’s not much to cover here – our Intel system wins across the board once more, with Gen4 vs Gen3 not coming into play at all apart from another 1FPS difference at 4K.
Red Dead Redemption 2 is a 2018 action-adventure game developed and published by Rockstar Games. The game is the third entry in the Red Dead series and is a prequel to the 2010 game Red Dead Redemption. Red Dead Redemption 2 was released for the PlayStation 4 and Xbox One in October 2018, and for Microsoft Windows and Stadia in November 2019.
As for Red Dead Redemption 2, we’re heavily CPU-bottlenecked at 1080p, and in fact the performance numbers from our AMD system didn’t change at all as we increased the resolution to 1440p. The Intel system was 7% faster at that resolution. At 4K, however, we see one of the very few occasions where the AMD system with PCIe 4.0 outperforms our Intel machine. Granted, it’s a difference of just 2FPS, so let’s not get carried away here, but it is a break in the overall trend we have seen today.
Shadow of the Tomb Raider is an action-adventure video game developed by Eidos Montréal in conjunction with Crystal Dynamics and published by Square Enix. It continues the narrative from the 2013 game Tomb Raider and its sequel Rise of the Tomb Raider, and is the twelfth mainline entry in the Tomb Raider series.
The game released worldwide on 14 September 2018 for Microsoft Windows, PlayStation 4 and Xbox One. (Wikipedia). Engine: Foundation Engine. We test using the Highest preset, with TAA, DX12 API.
Apart from a strong performance lead in favour of Intel at 1080p, the results in Shadow of the Tomb Raider are pretty much identical at 1440p and 4K, so this is another game which doesn’t seem to care about PCIe 4.0.
Total War Saga: Troy is a 2020 turn-based strategy video game developed by Creative Assembly Sofia and published by Sega.
Total War Saga: Troy does show some performance gains with PCIe 4.0. Granted, the 10900K system is still faster at 1080p, but the 3900XT with PCIe 4.0 is 3% faster than the same CPU with PCIe 3.0. In fact, at 1440p, that lead actually grows to 4%. Again, not really a game changer, but one of the largest differences we have seen today.
- Here we revisit Control, this time with ray tracing effects set to High.
Turning on ray tracing effects in Control brings us the single biggest gain for PCIe 4.0 vs PCIe 3.0 from across all of our testing today – an improvement of 5% at 1080p. That was when comparing just the AMD test systems, but the 3900XT with PCIe 4.0 is still 2% faster than the 10900K at 1080p.
- Here we revisit Metro Exodus, this time with ray tracing effects set to Ultra.
The performance trends within Metro Exodus don’t really change when turning RT on. The game still prefers Intel hardware across the board, and Gen4 vs Gen3 with the 3900XT doesn’t make any meaningful difference.
- Here we revisit Shadow of the Tomb Raider, this time with ray tracing effects set to Ultra.
At 1080p in Shadow of the Tomb Raider, we do see some benefit to PCIe 4.0, as the 3900XT results with Gen4 are 3% higher than the same CPU running PCIe 3.0. However, that still only puts performance on par with our Intel system, which remains the case at 1440p and 4K resolutions.
There are a couple of key points to make here. First of all, if you’ve gone through our game-by-game breakdown, you will not be surprised to see that PCIe 4.0 makes very little overall difference compared to PCIe 3.0. Almost none, in fact: at 1440p, the average difference between the 3900XT with PCIe 4.0 and the same CPU with PCIe 3.0 is a mere 2%, or 2FPS.
At 4K, the difference is reduced to exactly nothing. The other point to bring up is CPU bottlenecking. At 1080p, the 10900K is simply a faster gaming CPU than the 3900XT, performing 10% better on average despite its PCIe 3.0 limitation. Even at 1440p and 4K, it’s simply averaging higher frame rates than the 3900XT with PCIe 4.0.
Concluding this article is relatively straightforward. As of right now, performance improvements to be had when using the RTX 3080 on a PCIe 4.0 interface are very small at best, and in many cases are simply non-existent. That’s not to say we didn’t observe some differences.
- In a few games we tested – including Control with RTX On, Horizon Zero Dawn and Total War Saga: Troy – PCIe 4.0 can net up to 5% better performance versus the older PCIe 3.0 standard, when tested with a Ryzen 3900XT.
- The fact remains that these are best case scenarios, and even if we saw 5% gains in every single game, it’s still not a big difference.
The reality is many games simply do not care about PCIe bandwidth – at least not yet. Right now, the choice of CPU is far more of a limiting factor for the RTX 3080. At 1080p, we observed the i9-10900K outperforming the 3900XT by 10% on average, while three games we tested had the difference over 20% in Intel’s favour.
That means it’s even harder to pinpoint relative gains from switching to PCIe 4.0. The extra bandwidth seemingly helps most at 1080p, but that’s the same resolution where Intel is able to outperform AMD by a not-insignificant margin, muddying the waters somewhat. It’s possible we will get a clearer idea of the true benefits of PCIe 4.0 if Zen 3 is as fast as we hope it is, but that’s all hypothetical at this point.
Then again, the differences at 1440p and 4K are much less significant. While our 3900XT system with PCIe 4.0 produced on average 2% higher frame rates than the same CPU with PCIe 3.0, the Intel system’s lead was in turn reduced to just 3%. At 4K, the margins of difference grow even smaller.
For KitGuru, we will continue to use our i9-10900K test system for our reviews in the immediate future, but as games progress and we start to see more benefits to PCIe 4.0, we will no doubt have to make the jump sooner rather than later. Whether that is to an AMD Zen3-based test system, or an Intel Alder Lake machine, we will have to wait and see.
RTX 3080 Founders Edition has a UK MSRP of £649 and will be available for purchase directly from Nvidia. Custom cards will be available from Overclockers UK starting at £649.99. KitGuru says: It’s been fascinating to test the RTX 3080 on both Intel and AMD systems these last few days.
Is PCIe 3.0 still good for gaming?
Benchmarks – Starting with F1 2021, we see that limiting the PCIe bandwidth with the 8GB 5500 XT has little to no impact on performance. Then for the 4GB model we are seeing a 9% reduction in 1% low performance at a 6% hit to the average frame rate when comparing the stock PCIe 4.0 x8 configuration of the 5500 XT to PCIe 3.0 x4.
Jumping up to 1440p we see no real performance loss with the 8GB model, whereas the 4GB version drops ~12% of its original performance. This isn’t a significant loss in the grand scheme of things and the game was perfectly playable, but for a card that’s not exactly packing oodles of compute power, a double-digit performance hit will likely raise an eyebrow.
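To put the tested link configurations in perspective, their approximate unidirectional bandwidths, using the roughly 1 GB/s (Gen 3) and 2 GB/s (Gen 4) per-lane figures quoted earlier, work out as follows (a back-of-the-envelope sketch):

```python
# Approximate unidirectional bandwidth of each 5500 XT link configuration.
PER_LANE_GBPS = {"3.0": 0.985, "4.0": 1.969}  # usable GB/s per lane

for gen, lanes in [("4.0", 8), ("4.0", 4), ("3.0", 8), ("3.0", 4)]:
    print(f"PCIe {gen} x{lanes}: {PER_LANE_GBPS[gen] * lanes:4.1f} GB/s")
# PCIe 4.0 x8: 15.8 GB/s  (the 5500 XT's stock configuration)
# PCIe 4.0 x4:  7.9 GB/s  (what an x4 card gets on a Gen 4 board)
# PCIe 3.0 x8:  7.9 GB/s
# PCIe 3.0 x4:  3.9 GB/s  (an x4 card on a Gen 3 board)
# When a 4GB card spills past its VRAM into system memory, this link
# is what every spilled texture fetch has to squeeze through.
```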
Things get much, much worse in Shadow of the Tomb Raider. A couple of things to note here: although we’re using the highest quality preset for this game, it was released back in 2018, and with sufficient PCI Express bandwidth the 5500 XT can easily drive 60 fps on average, resulting in an enjoyable and very playable experience.
- We see that PCIe bandwidth is far less of an issue for the 8GB model and that’s because the game does allocate up to 7 GB of VRAM using these quality settings at 1080p.
- The 4GB 5500 XT plays just fine using its stock PCIe 4.0 x8 configuration, there were no crazy lag spikes, the game was very playable and enjoyable under these conditions.
Even when limited to PCIe 4.0 x4 bandwidth, we did see a 6% drop in performance, though overall the gameplay was similar to the x8 configuration. If we then change to the PCIe 3.0 spec, performance tanks, and while still technically playable, frame stuttering becomes prevalent and the overall experience is quite horrible.
We’re talking about a 43% drop in 1% low performance for the 4GB model when comparing PCIe 4.0 operation to 3.0, which is a shocking performance reduction. You could argue that we’re exceeding the VRAM buffer here, so it’s not a realistic test, but you’ll have a hard time convincing me of that, given how well the game played using PCIe 4.0 x8.
As you’d expect, jumping up to 1440p didn’t help and we’re still looking at a 43% hit to the 1% lows. When using PCI Express 4.0, the 4GB model was still able to deliver playable performance, while PCIe 3.0 crippled performance to the point where the game is simply not playable.
Resident Evil Village only requires 3.4 GB of VRAM in our test, so this is a good example of how these cards perform when kept within the memory buffer. We’re using the heavily dialed down ‘balanced’ quality preset, so those targeting 60 fps on average for these single player games will have some headroom to crank up the quality settings, though as we’ve seen you’ll run into performance related issues much sooner when using PCIe 3.0 with a x4 card.
Rainbow Six Siege is another example of why heavily limiting PCI Express bandwidth of cards with smaller VRAM buffers is a bad idea. The 4GB 5500 XT is already up to 27% slower than the 8GB version, with the only difference between the two models being VRAM capacity.
But we see that limiting the PCIe bandwidth has a seriously negative impact on performance of the 4GB model. Halving the bandwidth from x8 to x4 in the 4.0 mode drops the 1% low by 21%. This is particularly interesting as it could mean even when used in PCIe 4.0 systems, the 6500 XT is still haemorrhaging performance due to the x4 bandwidth.
But it gets much worse for those of you with PCIe 3.0 systems, which at this point in time is most of you, particularly those seeking a budget GPU. Here we’re looking at a 52% drop in performance from the 4.0 x8 configuration to 3.0 x4. Worse still, 1% lows are now below 60 fps, and while this could be solved by reducing the quality settings, the game was perfectly playable even with 4GB of VRAM when using the PCIe 4.0 x8 mode.
Moving on to Cyberpunk 2077, we tested using the medium quality preset with medium quality textures. This game is very demanding even using these settings, but with the full PCIe 4.0 x8 mode the 4GB 5500 XT was able to deliver playable performance with an average of 49 fps at 1080p.
We tested Watch Dogs: Legion using the medium quality preset and although the 4GB model is slower than the 8GB version as the game requires 4.5 GB of memory in our test using the medium quality preset, performance was still decent when using the standard PCIe configuration with 66 fps on average.
Despite the fact that we must be dipping into system memory, the game played just fine. However, reducing the PCIe bandwidth had a significant influence on performance: PCIe 4.0 x4 dropped performance by 24%, with PCIe 3.0 x4 destroying it by a 42% margin. We’ve heard reports that the upcoming 6500 XT is all over the place in terms of performance, and the limited 4GB buffer along with the gimped PCIe 4.0 x4 bandwidth is 100% the reason why; we can see an example of that here at 1080p with the 5500 XT.
The PCIe 3.0 x4 mode actually looks better at 1440p relative to the 4.0 spec as the PCIe bandwidth bottleneck is less severe than the compute bottleneck at this resolution. Still, we’re talking about an unnecessary 36% hit to performance.
Assassin’s Creed Valhalla has been tested using the medium quality preset and we do see an 11% hit to performance for the 8GB model when using PCIe 3.0 x4, so that’s interesting as the game only required up to 4.2 GB in our test at 1080p. That being the case, the 4GB model suffered more, dropping 1% lows by 22% from 51 fps to just 40 fps.
The game was still playable, but that’s a massive performance hit to an already low-end graphics card. The margins continued to grow at 1440p and now the PCIe 3.0 x4 configuration for the 4GB model was 32% slower than what we saw when using PCIe 4.0 x8. Obviously, that’s a huge margin, but it’s more than just numbers on a graph.
The difference between these two was remarkable when playing the game, like we were comparing two very different tiers of product.
Far Cry 6, like Watch Dogs: Legion, is an interesting case study. Here we have a game that uses 7.2 GB of VRAM in our test at 1080p, using a dialed down medium quality preset. But what’s really interesting is that the 4GB and 8GB versions of the 5500 XT delivered virtually the same level of performance when fed at least x8 bandwidth in the PCIe 4.0 mode, which is the default configuration for these models.
Despite exceeding the VRAM buffer (at least, that’s what’s being reported to us), the 4GB 5500 XT makes out just fine in the PCIe 4.0 x8 mode. However, limit it to PCIe 4.0 x4 and performance drops by as much as 26% – and again, remember the 6500 XT uses PCIe 4.0 x4. That means right away the upcoming 6500 XT is likely going to be heavily limited by PCIe bandwidth under these test conditions, even in a PCI Express 4.0 system.
But it gets far worse. If you use PCIe 3.0, we’re looking at a 54% decline in the average frame rate. To put it another way, the 4GB 5500 XT was 118% faster using PCIe 4.0 x8 compared to PCIe 3.0 x4. Yikes. Bizarrely, the 4GB 5500 XT still worked at 1440p with the full PCIe 4.0 x8 bandwidth, but was completely broken when dropping below that.
Using the ‘favor quality’ preset, Horizon Zero Dawn required 6.4 GB of VRAM at 1080p. Interestingly, despite not exceeding the VRAM buffer of the 8GB model we still saw an 11% decline in performance when forcing PCIe 3.0 x4 operation. Then with the 4GB model that margin effectively doubled to 23%.
Doom Eternal is another interesting game to test with, as this one tries to avoid exceeding the memory buffer by limiting the quality settings you can use. Here we’ve used the ultra quality preset for both models, but for the 4GB version we had to reduce texture quality from ultra to medium before the game would allow us to apply the preset.
At 1080p with the ultra quality preset and ultra textures the game uses up to 5.6 GB of VRAM in our test scene. Dropping the texture pool size to ‘medium’ reduced that figure to 4.1 GB. So the 8GB 5500 XT sees VRAM usage hit 5.6 GB in this test, while the 4GB model maxes out, as the game would use 4.1 GB if available.
Despite tweaking the settings, the 4GB 5500 XT is still 29% slower than the 8GB version when using PCIe 4.0 x8. Interestingly, reducing PCIe bandwidth for the 8GB model still heavily reduced performance, dropping 1% lows by as much as 16%. But it was the 4GB version where things went really wrong.
Is PCIe 3.0 good enough for gaming?
Yes. Numerous gamers have tested the difference between PCIe 3.0 and PCIe 4.0 and found no meaningful difference in gaming performance.
Does PCIe 4.0 matter for 3070?
Will an RTX 3070 with PCIe 4.0 work on a GPU riser with PCIe 3.0 and a motherboard with PCIe 3.0? Yes: the RTX 3070 is backwards compatible with the PCIe 3.0 spec, and if you want to put it on a riser card/cable, it should be fine.
Can I use a PCIe 4.0 graphics card in a 5.0 slot?
Is PCIe 5.0 forwards and backwards compatible? – Yes! PCIe 5.0 is both backwards and forwards compatible, as are all generations of PCIe. This means that a PCIe 5.0 card can be connected to a PCIe 4.0 slot, or a PCIe 4.0 card can be connected to a PCIe 5.0 slot.
Can I connect a PCIe 3.0 M.2 SSD to a PCIe 4.0 slot?
Absolutely. The drive will use four PCIe lanes and simply run at PCIe 3.0 speeds. Note that every PCIe generation is full duplex, with data able to flow in both directions simultaneously; what PCIe 4.0 doubles is the per-lane transfer rate, 16 GT/s versus PCIe 3.0’s 8 GT/s.