PCIe 3.0 Graphics Card in a 2.0 Slot?
I have a year-or-so-old Dell XPS 8300 (Sandy Bridge/Intel) and an older Inspiron with an AMD X2 2200 CPU. I’m looking to upgrade the graphics cards, probably to a Radeon HD 7750 or HD 7770. However, I notice the cards I’m looking at are PCIe 3.0. The little info I found indicated some problems with the XPS with the latest and greatest Nvidia card, and the poster with that problem suspected PCIe 3.0 as the culprit.
It’s not an issue, it’ll work fine – just check your PSUs are up to the job.
A PCI-e 3.0 card in a PCI-e 2.0 slot should generally be fine. PCI-e 3.0 in a PCI-e 1.1 slot should technically also be fine, although the risk of incompatibility is probably higher. PCI-e 3.0 in a PCI-e 1.0a slot should technically work too, but I generally would not expect it to, as PCI-e 1.0a boards are damned old and probably not going to handle it well.
continuum said: PCI-e 3.0 card into a PCI-e 2.0 slot should generally be fine. PCI-e 3.0 into a PCI-e 1.1 slot again technically should be fine, although risk of incompatibility is probably higher. PCI-e 3.0 into a PCI-e 1.0a slot again technically should be fine, but I generally would not expect it to work as PCI-e 1.0a boards are damned old and probably not going to handle it as well.
Doh. I just checked the specs on my HTPC box (an old Dell E521 BTX): PCI 2.3, PCI Express 1.0a, SATA 1.0 and 2.0, USB 2.0. I had hoped to standardize on the 77xx series: the 7750 for the HTPC and the 7770 for my main computer. So based on possible PCIe 3.0 issues, for the older HTPC I’m getting a Radeon HD 6570 from Amazon Warehouse for around $40.
I need to cross my fingers on the PCIe 2.1 working; it may have issues, but Amazon has been pretty good to me on shipping/returns. I probably should have just gone with a passively cooled 6450, but I’m a sucker for a little bit of performance for a few extra $$, plus the small fan on the 6570 actually has a better chance of fitting in the wacky BTX case than some large heatsink.
I know the 6570 is a big step down from the 7750, but I figure the savings can be rolled into my main graphics card. Researching my ancient PC made me realize how old the CPU is, giving me a “reality check” on how much I should be spending on upgrading this clunker (a clunker that is our most-used PC in the house since we “cut the cord” and stream everything).
Why even bother then? I’m noticing the streaming is a bit jerky on some shows with the Nvidia 7900GS. It’s not the connection, as they are smooth as silk on my Sandy Bridge box with the built-in Intel graphics. I’m hoping the upgraded decoding hardware picks up the slack and gives me another year or two out of the box.
For my non-HTPC computer that I use for Borderlands etc., I was all set on the 7770 based on its better power usage, but I really don’t game a lot, so I guess as long as the standby/2D power is decent, the high TDP while gaming is not a big deal? Right now I’m getting swayed by Tom’s “best cards for the money” towards the 6850 (or a used 6870) as better price/performance.
Other than lower power, it does not seem like the 7770 is going to give me much except lower frame rates when I do game? All the video features look to be on par? Thanks again for the advice!
You might be SOL, unfortunately. The GeForce 6150LE chipset in that box may not even play well with some PCI-e 2.0 cards. Best to buy from a place you can do a hassle-free return with if needed. There are plenty of reviews on the 6850 vs. 7770 and whatnot; I believe Techreport’s latest one is actually pretty good, as they do a good analysis on minimum framerates and whatnot.
continuum said: PCI-e 3.0 card into a PCI-e 2.0 slot should generally be fine. PCI-e 3.0 into a PCI-e 1.1 slot again technically should be fine, although risk of incompatibility is probably higher. FWIW, I just tested the following combination: GTX 660 Ti (PCI-e 3.0 card) on a PCI-e 2.0 board in both 2.0 and 1.1 mode.
The 1.1 mode kicks in if you get a GPU crash (overclock too high) and stays until reboot, which I tested in GPU-Z. In the PCI-e 2.0 slot, the card works really well. Some of the benchmarks out there like 3dmark 11 and Unigine Heaven 3.0 seem to be able to utilize a little tiny bit of bandwidth/latency/idontknow of a full PCI-e 3.0 end-to-end build, because the 2.0 board is about 5-8% behind on the scores I see around (similar CPU and GPU, similar clocks).
When the board switches to PCI-e 1.1, really everything gets cut in half in terms of benchmark scores. Games suffer fps losses as well.
hansmuff said: FWIW, I just tested the following combination: When the board switches to PCI-e 1.1, really everything gets cut in half in terms of benchmark scores. Games suffer fps losses as well. Thanks for the info! On the PCI-e 1.0a board my main reason for upgrading is improved hardware support for video playback.
My 7900GS is pre-H.264 or anything else modern. I’m hoping the card will be enough to let me stream content smoothly. If not, I have a swell reason to upgrade. The “cut in half” is disappointing. Maybe I should just give up, or I should have just gone with one of the $30 HD 6450s. The machine I use for gaming is a Dell XPS 8300 (Sandy Bridge), which should be at least PCIe 2.0.
The main concern I have is that Dell has not and will not test the 8300 with any 3.0 cards, so if there are issues it is unlikely they will get fixed (at least that is what I gather from the Dell forums). I’m thinking I’ll play it safe and go with a PCIe 2.1 Radeon 6850 or Nvidia 550 Ti (#1 on Amazon when sorted on “best reviewed”, which probably means nothing).
- My general policy is to switch vendors with every card, so it’s probably the AMD, but some of their driver issues have me concerned (no custom resolutions on HDTVs? I have a panel with an odd resolution).
- Here’s the post that originally got me concerned, especially Dell’s response: http://en.community.dell.com/support-fo,88799.aspx “DELL-Chris M replied on 15 Apr 2012 11:04 AM: I know you do not want to hear this, but we never validated the Nvidia GTX680 on the XPS 8300.
In fact, we never validated the Nvidia GTX680 on the Alienware line either. We only validated the following Dell OEM video cards on our XPS 8300 460W power supply:
- KP8GM AMD Radeon HD5450
- HWHRN AMD Radeon HD5670
- GCJ42 AMD Radeon HD5770
- 2XTG4 AMD Radeon HD5870
- HCVMH AMD Radeon HD6450
- 8F60V AMD Radeon HD6670
- 8PJF8 AMD Radeon HD6770
- Y9XH7 AMD Radeon HD6870
- 4VDWW AMD Radeon HD6950
- VH86X Nvidia G405
- X78HM Nvidia GT420
- WGP2G Nvidia GTX560
The XPS 8300 has been replaced by the XPS 8500 so there will not be any further video card validation on the XPS 8300.”
You can probably google around – there should be enough XPS 8300s out there to see if anyone’s tried a PCI-e 3.0 card in one, and honestly it has pretty good odds of working (unlike your old board with PCI-e 1.0a, which is just too damned old).
Contents
- 1 Can you put a 3.0 graphics card in a 2.0 slot?
- 2 Can I put PCIe 3.0 in PCIe 2.0 slot?
- 3 Can I use a PCIe 4.0 graphics card in a 2.0 slot?
- 4 Does it matter which slot I put my graphics card in?
- 5 Is PCIe 3.0 enough for gaming?
- 6 Is PCI-Express 3.0 backwards compatible?
- 7 Are all graphics card slots the same?
- 8 Does a PCIe 3.0 GPU work in a 4.0 slot?
- 9 Can I use a PCIe 4.0 GPU in a 5.0 slot?
- 10 Is PCI Express 3.0 backwards compatible?
- 11 Can I use a PCIe 3.0 graphics card in a 1.0 slot?
Can you put a 3.0 graphics card in a 2.0 slot?
Yes, you can use a PCIe 3.0 graphics card in a PCIe 2.0 slot, and vice versa. Here 2.0 (Gen2) and 3.0 (Gen3) refer to the speed of the PCIe device. As per the PCIe specification, link-up initially always happens at 1.0 (Gen1) speed; the link then negotiates up to the highest speed both ends support.
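The negotiation rule above can be sketched as a toy model (the helper name is hypothetical; real link training happens in hardware):

```python
def negotiated_gen(card_gen: float, slot_gen: float) -> float:
    """A PCIe link trains up to the highest generation BOTH ends support,
    i.e. the minimum of the card's and the slot's version."""
    return min(card_gen, slot_gen)

# A Gen3 card in a Gen2 slot runs at Gen2 speed:
print(negotiated_gen(3.0, 2.0))  # -> 2.0
```

The same rule explains why an old card also works in a new slot: the link simply settles at the older card's speed.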
Can I put PCIe 3.0 in PCIe 2.0 slot?
Currently, PCIe 3.0 is the most common version that you can find on motherboards, but if you have an older PC, it may conform to the older PCIe 2.0 spec. With most new expansion cards conforming to 3.0, it is common to ask: can I use a PCIe 3.0 card in a 2.0 slot? Basically, yes, you can use a PCIe 3.0 card in a 2.0 slot, but with some caveats which I will cover below.
A third-generation (PCIe 3.0) card will work in a second-generation (PCIe 2.0) slot because the PCIe standard is designed to be backward and forward compatible, thus allowing the use of new cards on older hardware and vice versa. However, as mentioned earlier, there are a few caveats that you need to take into consideration, particularly in terms of performance.
In the following text I talk about PCIe backward and forward compatibility.
Can I use a PCIe 4.0 graphics card in a 2.0 slot?
Answered: Is PCIe Backward Compatible? (Kevin Jones / TechReviewer) PCIe versions are backward compatible, meaning that you can use a PCIe 4.0 graphics card or storage device on a PCIe 3.0 or PCIe 2.0 system. However, PCI-Express will use speeds based on the lower of the two versions for communication.
| Version | x1 | x2 | x4 | x8 | x16 |
|---|---|---|---|---|---|
| PCIe 1.0 | 250 MB/s | 500 MB/s | 1 GB/s | 2 GB/s | 4 GB/s |
| PCIe 2.0 | 500 MB/s | 1 GB/s | 2 GB/s | 4 GB/s | 8 GB/s |
| PCIe 3.0 | 1 GB/s | 2 GB/s | 4 GB/s | 8 GB/s | 16 GB/s |
| PCIe 4.0 | 2 GB/s | 4 GB/s | 8 GB/s | 16 GB/s | 32 GB/s |
| PCIe 5.0 | 4 GB/s | 8 GB/s | 16 GB/s | 32 GB/s | 63 GB/s |
| PCIe 6.0 | 8 GB/s | 16 GB/s | 32 GB/s | 63 GB/s | 126 GB/s |
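The per-lane numbers in the table follow from each generation's raw transfer rate and line encoding (8b/10b for Gen1/2, 128b/130b from Gen3 through Gen5). A sketch of the arithmetic, which reproduces the rounded table values for those generations:

```python
# (raw GT/s, encoding efficiency) per PCIe generation
GEN_SPECS = {
    1.0: (2.5, 8 / 10),     # 8b/10b encoding
    2.0: (5.0, 8 / 10),
    3.0: (8.0, 128 / 130),  # 128b/130b encoding
    4.0: (16.0, 128 / 130),
    5.0: (32.0, 128 / 130),
}

def bandwidth_gb_s(gen: float, lanes: int) -> float:
    """Usable bandwidth in GB/s: GT/s x encoding efficiency / 8 bits per byte, per lane."""
    gt_s, eff = GEN_SPECS[gen]
    return gt_s * eff / 8 * lanes

print(round(bandwidth_gb_s(3.0, 16), 1))  # -> 15.8, tabulated as 16 GB/s
```

Gen6 changes the signaling scheme (PAM4 with FLIT-based framing), so its throughput doesn't come from this simple formula.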
PCIe versions are also forward compatible, meaning that you can use a newer PCIe device with an older system. However, PCI-Express will use speeds based on the lower of the two versions for communication.
Check out the table above to determine what the throughput limit would be for a specific configuration. Some CPUs and motherboards provide PCI-Express lanes at multiple PCIe versions; in these cases, you can decide which devices need the most bandwidth and connect those to the highest-version PCIe lanes.
While laying out your system, keep in mind that PCIe speeds will be based on the lowest PCIe version between the slot/port and the device. You can insert a PCIe add-in card (AIC) into a slot that supports a higher number of lanes; in this case, it will use up to the number of PCIe lanes that the card has.
- For example, you could insert an x4 PCIe network card into an x16 PCIe AIC slot, and it would run at full x4 speed.
- In various scenarios, a PCIe device may not use the maximum number of lanes for which the device was designed.
- For example, some motherboards have x8 PCIe ports that are only electrically wired for x4 lanes.
In another case, a system may have limited lanes provided by the CPU, distributed based on availability or configuration. Devices will negotiate the number of lanes to use, based on system availability, and should still perform fine at a reduced overall bandwidth in most cases.
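Both negotiations described above combine: the trained link runs at the lower of the two versions and the lower of the two lane counts, independently. A minimal sketch (hypothetical helper name):

```python
def link_config(card: tuple, slot: tuple) -> tuple:
    """card/slot are (generation, lanes) tuples; the trained link
    uses the minimum of each, negotiated independently."""
    gen = min(card[0], slot[0])
    lanes = min(card[1], slot[1])
    return gen, lanes

# An x4 Gen4 network card in an x16 Gen3 slot trains as Gen3 x4:
print(link_config((4.0, 4), (3.0, 16)))  # -> (3.0, 4)
```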
Can you put graphics card in 2nd slot?
Bad Ports –
PCI Express slots are prone to power surges and mechanical failure like any other part of the computer. In the event that one of the PCI Express slots fails, you can move the graphics card to the secondary slot. The secondary slot will still be influenced by BIOS and jumper settings even if the primary slot is dead. This can be a problem if the primary slot is down and you can’t get into the BIOS to adjust the settings. You may be able to reset the BIOS and get the motherboard to enable the second port in this case by unplugging the computer and removing the battery backup for five minutes before restarting.
Does it matter which slot I put my graphics card in?
Where You Shouldn’t Install Your Graphics Card – So, if you’re supposed to prioritize using the first available PCI Express x16 slot, what happens if you install it somewhere else? Well, it depends on the slot. If you install your graphics card in a PCI Express x8 slot instead of an x16 slot, you should experience only minimal performance loss when compared to using an x16 slot. However, graphics cards become particularly crippled by the use of weaker slots than that, especially x4 slots. You may still be able to get away with using a PCI Express x4 slot with new motherboards and lower-end graphics cards, but this still isn’t recommended.
Some PCIe slots are hooked up to the motherboard’s chipset instead of to the CPU. This can severely impact your graphics card’s performance as well. The GPU performs best if it can exchange data through PCIe lanes directly with the CPU, without routing through your chipset. Routing through the chipset means sharing the chipset’s uplink to the CPU (on Intel platforms, the DMI link), which can become a bottleneck and also throttle any other components (such as storage) that are hooked up to the chipset.
Stick with your fastest x8 and x16 slots that have direct CPU PCIe lanes for the best results! Your motherboard manual will tell you which slots these are.
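On Linux, you can verify which speed and width a card actually trained at with `lspci -vv` (look for the `LnkSta:` line). A small illustrative parser for that line (the sample string mimics lspci output; the 2.5/5/8/16/32 GT/s values correspond to PCIe generations 1 through 5):

```python
import re

# GT/s -> PCIe generation
SPEED_TO_GEN = {"2.5": 1, "5": 2, "8": 3, "16": 4, "32": 5}

def parse_lnksta(line: str):
    """Extract (generation, lane width) from an lspci 'LnkSta:' line."""
    m = re.search(r"Speed ([\d.]+)GT/s.*Width x(\d+)", line)
    if not m:
        return None
    return SPEED_TO_GEN[m.group(1)], int(m.group(2))

sample = "LnkSta: Speed 8GT/s (ok), Width x16 (ok)"
print(parse_lnksta(sample))  # -> (3, 16), i.e. PCIe 3.0 x16
```

Note that some GPUs downclock the link at idle, so check the reported speed under load.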
What happens if you put a PCIe 3.0 in a 4.0 slot?
How does PCIe 4.0 affect my choice of SSD, NVMe, and GPU? – Like PCIe 3.0, PCIe 4.0 is forward and backward compatible. However, if you connect a PCIe 3.0 card to a PCIe 4.0 slot, the card will perform to PCIe 3.0 specs. That said, PCIe 4.0 offers another key advantage in addition to the higher bandwidth outlined above: it lets designers and system integrators increase the number of expansion cards on a platform.
Is PCIe 3.0 enough for gaming?
Benchmarks – Starting with F1 2021, we see that limiting the PCIe bandwidth with the 8GB 5500 XT has little to no impact on performance. Then for the 4GB model we are seeing a 9% reduction in 1% low performance and a 6% hit to the average frame rate when comparing the stock PCIe 4.0 x8 configuration of the 5500 XT to PCIe 3.0 x4.
Jumping up to 1440p we see no real performance loss with the 8GB model, whereas the 4GB version drops ~12% of its original performance. This isn’t a significant loss in the grand scheme of things and the game was perfectly playable, but for a card that’s not exactly packing oodles of compute power, a double-digit performance hit will likely raise an eyebrow.
Things get much, much worse in Shadow of the Tomb Raider. A couple of things to note here: although we’re using the highest quality preset for this game, it was released back in 2018 and, with sufficient PCI Express bandwidth, the 5500 XT can easily drive 60 fps on average, resulting in an enjoyable and very playable experience.
- We see that PCIe bandwidth is far less of an issue for the 8GB model and that’s because the game does allocate up to 7 GB of VRAM using these quality settings at 1080p.
- The 4GB 5500 XT plays just fine using its stock PCIe 4.0 x8 configuration, there were no crazy lag spikes, the game was very playable and enjoyable under these conditions.
When limited to PCIe 4.0 x4 bandwidth, we did see a 6% drop in performance, though overall the gameplay was similar to the x8 configuration. If we then change to the PCIe 3.0 spec, performance tanks and, while still technically playable, frame stuttering becomes prevalent and the overall experience is quite horrible.
- We’re talking about a 43% drop in 1% low performance for the 4GB model when comparing PCIe 4.0 operation to 3.0, which is a shocking performance reduction.
- You could argue that we’re exceeding the VRAM buffer here, so it’s not a realistic test, but you’ll have a hard time convincing me of that, given how well the game played using PCIe 4.0 x8.
As you’d expect, jumping up to 1440p didn’t help and we’re still looking at a 43% hit to the 1% lows. When using PCI Express 4.0, the 4GB model was still able to deliver playable performance, while PCIe 3.0 crippled performance to the point where the game is simply not playable.
Resident Evil Village only requires 3.4 GB of VRAM in our test, so this is a good example of how these cards perform when kept within the memory buffer. We’re using the heavily dialed down ‘balanced’ quality preset, so those targeting 60 fps on average for these single player games will have some headroom to crank up the quality settings, though as we’ve seen you’ll run into performance related issues much sooner when using PCIe 3.0 with a x4 card.
Rainbow Six Siege is another example of why heavily limiting PCI Express bandwidth of cards with smaller VRAM buffers is a bad idea. The 4GB 5500 XT is already up to 27% slower than the 8GB version, with the only difference between the two models being VRAM capacity.
- But we see that limiting the PCIe bandwidth has a seriously negative impact on performance of the 4GB model.
- Halving the bandwidth from x8 to x4 in the 4.0 mode drops the 1% low by 21%.
- This is particularly interesting as it could mean even when used in PCIe 4.0 systems, the 6500 XT is still haemorrhaging performance due to the x4 bandwidth.
But it gets much worse for those of you with PCIe 3.0 systems, which at this point in time is most people, particularly those seeking a budget GPU. Here we’re looking at a 52% drop in performance from the 4.0 x8 configuration to 3.0 x4. Worse still, 1% lows are now below 60 fps and, while this could be solved by reducing the quality settings, the game was perfectly playable even with 4GB of VRAM when using the PCIe 4.0 x8 mode.
Moving on to Cyberpunk 2077, we tested using the medium quality preset with medium quality textures. This game is very demanding even using these settings, but with the full PCIe 4.0 x8 mode the 4GB 5500 XT was able to deliver playable performance with an average of 49 fps at 1080p.
We tested Watch Dogs: Legion using the medium quality preset and although the 4GB model is slower than the 8GB version as the game requires 4.5 GB of memory in our test using the medium quality preset, performance was still decent when using the standard PCIe configuration with 66 fps on average.
- Despite the fact that we must be dipping into system memory, the game played just fine.
- However, reducing the PCIe bandwidth had a significant influence on performance: PCIe 4.0 x4 dropped performance by 24%, with PCIe 3.0 x4 destroying it by a 42% margin.
- We’ve heard reports that the upcoming 6500 XT is all over the place in terms of performance, and the limited 4GB buffer along with the gimped PCIe 4.0 x4 bandwidth is 100% the reason why and we can see an example of that here at 1080p with the 5500 XT.
The PCIe 3.0 x4 mode actually looks better at 1440p relative to the 4.0 spec as the PCIe bandwidth bottleneck is less severe than the compute bottleneck at this resolution. Still, we’re talking about an unnecessary 36% hit to performance.
Assassin’s Creed Valhalla has been tested using the medium quality preset and we do see an 11% hit to performance for the 8GB model when using PCIe 3.0 x4, so that’s interesting as the game only required up to 4.2 GB in our test at 1080p. That being the case, the 4GB model suffered more, dropping 1% lows by 22% from 51 fps to just 40 fps.
The game was still playable, but that’s a massive performance hit to an already low-end graphics card. The margins continued to grow at 1440p and now the PCIe 3.0 x4 configuration for the 4GB model was 32% slower than what we saw when using PCIe 4.0 x8. Obviously, that’s a huge margin, but it’s more than just numbers on a graph.
The difference between these two was remarkable when playing the game, like we were comparing two very different tiers of product.
Far Cry 6, like Watch Dogs: Legion, is an interesting case study. Here we have a game that uses 7.2 GB of VRAM in our test at 1080p, using a dialed down medium quality preset. But what’s really interesting is that the 4GB and 8GB versions of the 5500 XT delivered virtually the same level of performance when fed at least x8 bandwidth in the PCIe 4.0 mode, which is the default configuration for these models.
- Despite exceeding the VRAM buffer (at least, that’s what’s being reported to us), the 4GB 5500 XT makes it out just fine in the PCIe 4.0 x8 mode.
- However, limit it to PCIe 4.0 x4 and performance drops by as much as 26% – and again, remember the 6500 XT uses PCIe 4.0 x4.
- That means right away the upcoming 6500 XT is likely going to be heavily limited by PCIe memory bandwidth under these test conditions, even in a PCI Express 4.0 system.
But it gets far worse. If you use PCIe 3.0, we’re looking at a 54% decline for the average frame rate. Or another way to put it, the 4GB 5500 XT was 118% faster using PCIe 4.0 x8 compared to PCIe 3.0 x4, yikes. Bizarrely, the 4GB 5500 XT still worked at 1440p with the full PCIe 4.0 x8 bandwidth but was completely broken when dropping below that.
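The two figures quoted here (a ~54% decline one way, ~118% faster the other way) are the same gap expressed from opposite baselines. A quick check of the arithmetic (the slight mismatch with the article's 118% comes from rounding the underlying fps numbers):

```python
def pct_faster_from_pct_decline(decline_pct: float) -> float:
    """If config B is `decline_pct` percent slower than config A,
    this returns how much faster A is than B."""
    remaining = 1 - decline_pct / 100
    return (1 / remaining - 1) * 100

# A 54% decline corresponds to the other config being ~117% faster:
print(round(pct_faster_from_pct_decline(54)))  # -> 117
```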
Using the ‘favor quality’ preset, Horizon Zero Dawn required 6.4 GB of VRAM at 1080p. Interestingly, despite not exceeding the VRAM buffer of the 8GB model we still saw an 11% decline in performance when forcing PCIe 3.0 x4 operation. Then with the 4GB model that margin effectively doubled to 23%.
Doom Eternal is another interesting game to test with as this one tries to avoid exceeding the memory buffer by limiting the level of quality settings you can use. Here we’ve used the ultra quality preset for both models, but for the 4GB version we have to reduce texture quality from ultra to medium before the game would allow us to apply the preset.
At 1080p with the ultra quality preset and ultra textures the game uses up to 5.6 GB of VRAM in our test scene. Dropping the texture pool size to ‘medium’ reduced that figure to 4.1 GB. So the 8GB 5500 XT sees VRAM usage hit 5.6 GB in this test, while the 4GB model maxes out, as the game would use 4.1 GB if available.
Despite tweaking the settings, the 4GB 5500 XT is still 29% slower than the 8GB version when using PCIe 4.0 x8. Interestingly, reducing PCIe bandwidth for the 8GB model still heavily reduced performance, dropping 1% lows by as much as 16%. But it was the 4GB version where things went really wrong.
Is PCI-Express 3.0 backwards compatible?
Are PCIe 4.0 and PCIe 3.0 backward and forward compatible? – Both PCIe 4.0 and PCIe 3.0 are backward and forward compatible. Remember those high-speed components (GPUs, NVMe SSDs, etc.) that use PCIe slots to interface with the motherboard and provide additional functionality? Thanks to backward and forward compatibility, the new can be used with the old (backward compatibility), and the old can be used with the new (forward compatibility). Your expansion card doesn't have to be installed in a slot with the same number of lanes, either. For example, a PCIe 4.0 SSD with four lanes can be inserted into a x16 slot.
[Photo: Connectors of different-sized PCIe expansion cards. Credit: How-To Geek]
[Photo: There are a few things to keep in mind when choosing PCIe 4.0 SSDs and PCIe 4.0 GPUs, mainly the population and PCIe generation of your motherboard's PCIe slots.]
Is PCIe 3.0 a bottleneck?
truels2: So I'm looking at CPU-Z and realize my new 3080 Ti supports PCIe 4.0, yet my motherboard, which is only a couple of months old, only supports PCIe 3.0. How much am I bottlenecking my card? Should I be looking to upgrade to a 4.0 board? Seems like the upgrading never stops!

justin_43: A PCIe 3.0 slot will not bottleneck the card. But what CPU are you running? That might be a different story.

EVGA_JacobF (EVGA Product Manager, best answer): Nope, PCI-E 3.0 x16 is still plenty of bandwidth.

Jstandaert: The gain in frames from 3.0 to 4.0 isn't worth it for most.

scott91575: The only gains to be had with 4.0 right now are for NVMe drives. Even then, most people won't notice the difference in everyday tasks.

truels2: I have an i9-10900KF at 3.7 GHz, not overclocked, just factory default settings.

justin_43: You are more than fine.

truels2: Sweet, thanks for the info.

aka_STEVE_b: PCIe 3.0 is still not fully saturated by graphics outputs – you will be good.

(Superclocked Member): PCIe 3 will be enough for years and years from now.

mdb983: Even PCIe 3 x8 should be fine.

Zixinus: It isn't the age of the mobo but its specifications that tell whether it is PCIe 4 or not.
- You may also want to check your BIOS settings (consult your manual!) and configuration too.
- It is possible that you are running in 3.0 mode for some reason.
- However, using PCIe 3 isn't going to be a huge loss.
- Maybe you'll lose a percentage point of performance, but if you have a 3080 Ti, you should have plenty to spare.

oletorius: Keep that bad boy. I have a very similar rig. It's fine. 🙂

Edwin405: Should be OK, no bottleneck. PCIe Gen 4 is for NVMe M.2 drives, since those run at faster speeds. And the CPU you have is a good combo.

Gogod2020: The only things that will bottleneck you are the CPU (huge impact), RAM (some impact), PSU (stability impact), and temperatures (clocks impact). PCIe 3.0 vs 4.0 will not do anything at all at the moment, and for the next few years, give or take.

(SSC Member): I have a 9900KS @ 5 GHz+ on a Z390 board (PCIe 3.0) and a Ryzen 5 3600XT @ 4.5 GHz+ on an X570 board (PCIe 4.0), and at 3440x1440 there is virtually no difference in FPS with a 3080 Ti in the games I play.

kevinc313: Are half the people on this forum stupid now? x16 PCIe Gen 3.0 is fine for any current top-end GPU. HOWEVER, x8 PCIe Gen 4.0 is also fine, which frees up 8 PCIe lanes off the CPU to be used for other things.

dmitri2k: Agreed, we are all learning and teaching each other, easy plz.

kevinc313: It's not the lack of obvious, readily available knowledge that I'm concerned about, it's the number of people repeating "Gen 4 don't matter, Gen 3 fine, HURRR DURRR" to answer the post. Go back to Reddit, morans.

kevinc313 (replying to "even PCIe 3 x8 should be fine"): It will run, but there is a tested hit to performance.

Kokin: It's been tested so many times by numerous sources on the internet. It's a 0-3% performance loss, and even then it depends on the application used. That's margin-of-error territory and negligible from a normal person's perspective. PCIe 4.0 can easily be generalized to not make a difference unless it's NVMe performance, and we're only talking about a 3080 Ti. Anyway, why bother answering if you're just gonna call everyone stupid and morons (not even spelled correctly lol)?
Will GPU work in 2nd PCIe slot?
Yes, but on most motherboards doing so will result in a loss of GPU performance. This is because on most motherboards, only the top PCIe x16 slot is connected directly to the CPU.
Does PCIe 2.0 bottleneck GPU?
Not by much, or at all, since PCIe 2.0 x16 is very close to 3.0 x8. Very few GPUs have reduced performance in real-world scenarios in this regard.
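The "very close" claim checks out on paper: PCIe 2.0 has half the per-lane transfer rate of 3.0 but uses the less efficient 8b/10b encoding, so a 2.0 x16 link actually edges out a 3.0 x8 link slightly. A back-of-envelope check using the spec's theoretical figures:

```python
# Per-lane payload rate = transfer rate (GT/s) x encoding efficiency, in Gb/s.
gen2_lane = 5.0 * 8 / 10      # 4.0 Gb/s per lane (8b/10b encoding)
gen3_lane = 8.0 * 128 / 130   # ~7.88 Gb/s per lane (128b/130b encoding)

print(f"PCIe 2.0 x16: {gen2_lane * 16 / 8:.2f} GB/s")  # 8.00 GB/s
print(f"PCIe 3.0 x8:  {gen3_lane * 8 / 8:.2f} GB/s")   # 7.88 GB/s
```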
What happens if you put 2 GPUs in a PC?
The Pros – There are a few main benefits of running multiple video cards, which include:
- Multiple graphics cards can offer an enhanced 3D gaming experience.
- Two GPUs are ideal for multi-monitor gaming.
- Dual cards can share the workload and provide better frame rates, higher resolutions, and extra filters.
- Additional cards can make it possible to take advantage of newer technologies such as 4K Displays.
- Depending on the make, running two mid-range cards is likely to be slightly cheaper than running one comparable high-end card.
- It can be cheaper to buy a second of your current card than to upgrade to a newer model.
What happens if you plug in 2 graphics cards?
Benefits – The primary benefit of running two graphics cards is increased video game performance. When two or more cards render the same 3D images, PC games run at higher frame rates and at higher resolutions with additional filters. This extra capacity improves the quality of the graphics in games.
Most graphics cards render games up to 1080p resolution. With two graphics cards, games run at higher resolutions, such as on 4K displays that offer four times the resolution. In addition, several graphics cards can drive additional monitors. A benefit of using an SLI or Crossfire-compatible motherboard is that a PC can be upgraded at a later time without replacing the graphics card.
Add a second graphics card later to boost performance without removing the existing graphics card. Manufacturers upgrade graphics cards about every 18 months and a compatible card may be difficult to find after two years.
Can you have 2 graphics cards in 1 PC?
PCI Express Bifurcation – The CPU on the motherboard has a certain number of controllers, and each controller can support only one device. For example, a 9th Gen Intel i9 processor supports 16 PCIe lanes; these can be split into 4 controllers of 4 lanes each, which means you can connect a maximum of 4 PCI Express devices to them.
PCI Express bifurcation comes into play when you want to connect multiple PCIe devices, like M.2 SSDs or graphics cards, to a single PCIe slot. To enable this, you change the PCIe x16 slot's bandwidth from x16 to x4x4x4x4 in the BIOS settings, which lets users install multiple devices behind that one slot.
Although most gamers run only a single graphics card, PCI Express bifurcation lets one run multiple graphics cards from a single x16 slot. Note that splitting the slot in two and running each graphics card at x8 won't make much difference in performance. If you have more than one PCIe slot on your motherboard, the lanes are already divided up; if not, you can use a riser card that splits one PCIe slot into two, allowing two graphics cards to be plugged into a single port via riser cables. The only important thing to note is that the motherboard must support PCI Express bifurcation, enabled via the BIOS.
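As a sketch of the constraint the BIOS enforces: a bifurcation pattern is valid only if its segments are power-of-two lane widths that sum to the slot's total width. The helper below is purely illustrative (not a real BIOS API):

```python
# Hypothetical check: is a bifurcation pattern a valid split of a slot?
# A pattern like [4, 4, 4, 4] corresponds to the BIOS setting "x4x4x4x4".
def valid_bifurcation(pattern, slot_lanes=16):
    return (sum(pattern) == slot_lanes
            and all(width in (1, 2, 4, 8, 16) for width in pattern))

print(valid_bifurcation([4, 4, 4, 4]))  # True  - four x4 devices
print(valid_bifurcation([8, 8]))        # True  - two GPUs at x8 each
print(valid_bifurcation([8, 4, 2]))     # False - only 14 of 16 lanes used
```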
What PCIe is RTX 3060?
GeForce RTX 3060 is connected to the rest of the system using a PCI-Express 4.0 x16 interface.
Can I put a x16 graphics card in a x4 slot and have it still work?
Yes, it will work fine, provided the card physically fits (the x4 slot is open-ended, or it's a x16-length slot wired for x4). It won't run quite as fast as it would in a x16 slot, but the performance penalty isn't 75 percent (4/16); it's usually in the range of 5 to 10 percent.
Are all graphics card slots the same?
Getting a new GPU and simply plugging it in might work, but if you don't make sure your system is compatible, you could seriously endanger it. Why take an unnecessary risk when checking graphics card compatibility is so simple? The good news is that most modern GPUs are compatible with almost any motherboard from the last decade.
Even so, it's better to be safe than sorry. You only need to check graphics card compatibility if you're getting a dedicated GPU. If you're planning to game on your integrated graphics (which is possible and sometimes even decent with newer technology), you can be sure it's already compatible.
Let’s get started!
Does a PCIe 3.0 GPU work in a 4.0 slot?
How Do PCIe 3.0 and 4.0 Affect Your SSD and GPU? – As mentioned above, both PCIe 4.0 and PCIe 3.0 are backward and forward compatible with the existing PCIe configurations. However, because of their bandwidth limitations, you won’t always get the full performance of your PCIe GPUs or SSDs.
- If you connect a PCIe 3.0 GPU to a PCIe 4.0 slot, you will only get the PCIe 3.0 standard performance.
- If you connect a PCIe 4.0 GPU to a PCIe 3.0 slot, you won’t be able to cash in on the increased bandwidth and data transfer speed of your PCIe 4.0 GPU.
- The same thing goes for PCIe SSDs.
- That said, it is not hard to see that a motherboard with PCIe 4.0 ports has an obvious advantage over those with PCIe 3.0 ones.
Using a motherboard with PCIe 4.0 ports, you have more room to increase the number of SSDs and GPUs to support higher bandwidth. For instance, to achieve 16 GB/s of bandwidth, you need only 8 PCIe 4.0 lanes instead of 16 lanes with PCIe 3.0.
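The lane math follows directly from the per-lane rates: one PCIe 4.0 lane carries what two PCIe 3.0 lanes do, so 8 lanes of 4.0 match 16 lanes of 3.0 at roughly the 16 GB/s the text cites. A minimal check with theoretical spec figures:

```python
# Both generations use 128b/130b encoding, so bandwidth scales with the
# raw transfer rate: 16 GT/s (PCIe 4.0) vs 8 GT/s (PCIe 3.0) per lane.
lane_40 = 16.0 * 128 / 130 / 8   # ~1.97 GB/s per PCIe 4.0 lane
lane_30 = 8.0 * 128 / 130 / 8    # ~0.98 GB/s per PCIe 3.0 lane

print(f"PCIe 4.0 x8:  {8 * lane_40:.2f} GB/s")
print(f"PCIe 3.0 x16: {16 * lane_30:.2f} GB/s")
```

Both print the same figure (about 15.75 GB/s, commonly rounded to 16 GB/s).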
Will PCIe 3.0 bottleneck RTX 3060?
It will work, and PCIe 3.0 won't be a bottleneck for that card. Basically, no existing mainstream consumer GPU can fully saturate 16 PCIe Gen 3.0 lanes. A newer PCIe 4.0 card slotted into a PCIe 3.0 slot simply runs at PCIe 3.0 speeds.
Can I use a PCIe 4.0 GPU in a 5.0 slot?
Is PCIe 5.0 forwards and backwards compatible? – Yes! PCIe 5.0 is both backwards and forwards compatible, as are all generations of PCIe. This means that a PCIe 5.0 card can be connected to a PCIe 4.0 slot, or a PCIe 4.0 card can be connected to a PCIe 5.0 slot.
Can I use a PCIe 3.0 graphics card in a 5.0 slot?
What do I need for PCIe 4.0 or 5.0? – CPU. For PCIe 5.0, you'll need a 12th Gen Intel® Core™ CPU, which is built to support gaming from the ground up with up to 16 CPU PCIe 5.0 lanes and up to four CPU PCIe 4.0 lanes. For 4.0, you'll need an 11th Gen Intel® Core™ desktop CPU, which is built to support gaming from the ground up with features like PCIe 4.0 and up to 20 CPU PCIe lanes.
Motherboard. For a 12th Gen CPU, you’ll need a 600 Series chipset with an LGA 1700 socket. For 11th Gen, you’ll want a 500 Series motherboard from the Z590 or B560 lines. PCIe 4.0 and 5.0 devices. Though you might not spring for a PCIe 4.0 SSD or GPU during your initial build or purchase, it’s easy to see why support is useful down the road.
Maybe ports of new console games start relying more heavily on streaming in assets, and a PCIe 4.0 SSD provides a tangibly smoother experience. Or the next generation of GPUs benefits from the doubled throughput of PCIe 4.0 and 5.0 slots. (Note that PCIe 3.0 devices will also work normally on a PCIe 4.0 or 5.0 platform, thanks to backwards compatibility.)
Can I use a PCIe 3.0 graphics card in a 1.0 slot?
PCIe 3.0 cards are backwards compatible with 1.0 and 2.0 slots, but you won't be able to enjoy the full extent of your new PCIe card. It will be restricted to Gen 1 speed (2.5 GT/s per lane), while Gen 3 allows 8 GT/s plus a more efficient encoding scheme, which actually makes it about 4 times faster per lane.
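The "about 4 times" figure comes from both the clock and the encoding: Gen 3 runs at 3.2× the raw transfer rate of Gen 1 and wastes far less of it on encoding overhead (128b/130b vs 8b/10b). A quick sketch with the spec numbers:

```python
# Per-lane payload rate in Gb/s: transfer rate x encoding efficiency.
gen1_lane = 2.5 * 8 / 10      # 2.0 Gb/s per lane (8b/10b encoding)
gen3_lane = 8.0 * 128 / 130   # ~7.88 Gb/s per lane (128b/130b encoding)

print(f"Gen 3 vs Gen 1 speed-up per lane: {gen3_lane / gen1_lane:.2f}x")
```

The result is about 3.94×, hence "about 4 times faster" per lane.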
Will a PCI Express 2.0 work with x16?
Yes. A PCIe 2.0 x16 card will work in a PCIe 3.0 x16 slot, but it can only run at its maximum Gen 2 speed (5.0 GT/s).