Why Dual and Multi-GPU PCs Are Dying
Gaming PCs stacked with multiple graphics cards were once the aspiration of every hardcore gamer. Nvidia's SLI and AMD's CrossFire technologies let gamers link up to four GPUs together - double the graphics power for an unmatched gaming experience.
Yet this multi-GPU setup, once considered the inevitable path to maximum framerates, now faces extinction. Though still beloved by a dedicated few, dual-card configurations have lost their glow for even the most devoted pixel hunter.
What changed to put the iconic multi-GPU desktop on its deathbed? Let's take a step back and see how this trend began.
The promise of dual GPUs
From the mid-2000s through the mid-2010s, PC gaming faced a new reality. Game development was advancing rapidly, with graphics outstripping the capabilities of even the most high-end graphics cards.
GPU manufacturers struggled to keep up as each year brought unprecedented leaps in in-game graphics, effects, and details. New game releases began routinely crushing even the most powerful graphics cards under the weight of their advanced visuals and effects.
The innovative solution for hardcore gamers? Multiple graphics cards. Driving high frame rates at the highest resolutions seemed to demand two or more GPUs. If a single GPU could run games decently, then logically, two or more linked together could unlock unparalleled performance.
Both major GPU makers introduced parallel multi-GPU technologies: NVIDIA pioneered SLI in 2004, and ATI followed with CrossFire in 2005. These technologies let gamers combine two or more identical graphics cards into a unified processing system. It was an imperfect but workable answer to ever-surging GPU requirements.
This thinking heralded a new era in GPU technology.
This approach had moments of glory, like the Radeon HD 6990 in 2011 - a dual-GPU beast that could still power games more than a decade later.
Advantages of dual GPUs
Installing two or more GPUs lets your computer share the graphical workload among the video cards. This multi-GPU system allows your PC to process more data simultaneously, enabling higher resolutions while maintaining high frame rates in games and applications.
When well-implemented, dual cards essentially combine forces - multiplying the processing muscle for next-level performance.
- Increased performance: For 3D and video editing, spreading the workload over two cards can speed things up.
- High-resolution gaming: Dual GPUs meant smoother framerates for 4K gaming or multi-monitor setups.
- Multi-tasking: Using one GPU per monitor allowed for immersive multi-display experiences.
- Cost savings: Pairing two mid-range cards could be cheaper than one high-end GPU.
- Future-proofing: A second GPU acted as a buffer against rapidly advancing game tech.
But while multi-GPU rigs work for specialized tasks, the reality often falls short of expectations. Inconsistent game support, visual glitches, high costs, and noisy cooling systems turned gamers away.
Now, most are content with a single, powerful GPU. It's a case of less being more – smoother performance, less noise, and fewer compatibility headaches. While dual and multi-GPU systems still have their niche, it's clear the tide has turned in favor of simpler, more efficient setups.
Disadvantages of dual GPUs
More power often seems like a no-brainer in the high-octane world of PC gaming. While piling on graphics cards delivers more raw power, multi-GPU configurations come with significant downsides you must weigh:
1. Soaring costs
GPU prices were surging even before the pandemic, and dual cards meant doubling already hefty costs. Today's cutting-edge models like the Nvidia RTX 4090 retail at around $1,599 each; doubling up means shelling out nearly $3,200. That's a hefty price for power.
2. Extreme power and heat
GPUs are notorious for their appetite for electricity. The RTX 4090 can pull 450 watts on its own. Stack two, and you'll need a 1,300+ watt power supply unit (PSU).
Plus, all that power generates heat, turning your gaming rig into a mini furnace. So, you'll require top-tier cooling solutions to prevent literal meltdowns.
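The back-of-the-envelope math here can be sketched in a few lines. This is a rough sizing heuristic, not a vendor tool: the 300 W allowance for the rest of the system and the 20% headroom factor are illustrative assumptions, not measured figures.

```python
# Rough PSU sizing for a multi-GPU build. The rest-of-system wattage
# and headroom factor are illustrative assumptions, not measurements.
def recommended_psu_watts(gpu_watts, gpu_count,
                          rest_of_system_watts=300, headroom=1.2):
    """Total draw plus ~20% headroom, rounded up to the next 50 W."""
    total = gpu_watts * gpu_count + rest_of_system_watts
    target = total * headroom
    return int(-(-target // 50) * 50)  # ceiling to the nearest 50 W

print(recommended_psu_watts(450, 1))  # one RTX 4090-class card
print(recommended_psu_watts(450, 2))  # two of them
```

Under these assumptions, one 450 W card lands comfortably within a standard 900-1,000 W unit, while a second card pushes the recommendation well past 1,300 W - consistent with the figure above.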
3. Diminishing returns
Adding a second GPU doesn't double performance. In the best-case scenarios, you might see about a 50% increase, yet you're paying full price for the second card. Each subsequent card delivers an even smaller boost.
So, while doubling up sounds great, the actual payoff might not justify the extra cost and complexity.
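The diminishing-returns pattern can be sketched with a simple model. The 50% second-card gain matches the best case quoted above; the decay rate for each extra card is purely an assumption for illustration, not a benchmark result.

```python
# Toy model of multi-GPU scaling with diminishing returns. The 50%
# second-card gain follows the best case in the text; the 0.6 decay
# per additional card is an assumption for illustration only.
def effective_performance(cards, gain=0.5, decay=0.6):
    """Each extra card adds a shrinking fraction of one card's power."""
    perf, extra = 1.0, gain
    for _ in range(cards - 1):
        perf += extra
        extra *= decay
    return perf

for n in range(1, 5):
    print(f"{n} card(s): {effective_performance(n):.2f}x performance "
          f"for {n}x the cost")
```

Under these assumptions, four cards cost 4x as much yet stay below 2x the performance of one - the cost-per-frame math quickly stops making sense.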
4. Limited compatibility
You can't freely mix and match graphics cards. You'll need identical cards for optimal multi-GPU performance - mixing GPU generations or models is largely ineffective.
The less powerful GPU can hold back the performance of the more powerful one. This mismatch can lead to the underutilization of your top card.
5. Game compatibility: A hit or miss
Does your favorite game even support multi-GPU setups? Games without support waste your second GPU.
Games like Grand Theft Auto V might be smooth sailing, but others like Forza Horizon 5 don't play ball with multi-GPU systems. And if you're in the DirectX 12 realm, you depend entirely on the game's native support for multiple GPUs.
Even games that support multi-GPU can be a headache, as they often require complex setups. And then you might still face performance issues like frame drops. SLI and CrossFire can also cause a glitch called micro-stuttering, which makes gameplay feel choppy.
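Micro-stuttering comes down to frame pacing: two setups can report the same average FPS while one delivers frames at wildly uneven intervals. The frame times below are made-up numbers chosen to illustrate the effect, not measurements from any real system.

```python
# Why micro-stutter feels choppy despite a high average FPS: two
# frame-time sequences (in milliseconds) with the same average but
# very different pacing. All numbers are illustrative, not measured.
from statistics import mean, pstdev

smooth = [16.7] * 8                         # even ~60 FPS pacing
stutter = [8, 25, 8, 25, 8, 25, 8, 26.6]    # alternating fast/slow frames

for name, times in (("smooth", smooth), ("stutter", stutter)):
    avg_fps = 1000 / mean(times)
    print(f"{name}: avg {avg_fps:.0f} FPS, "
          f"frame-time jitter {pstdev(times):.1f} ms")
```

Both sequences average roughly 60 FPS, but the second alternates between fast and slow frames, and it's that jitter - not the average - that the eye perceives as stutter.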
6. Manufacturer support waning
As game developers slowly abandoned dual GPU support, AMD and NVIDIA, once multi-GPU champions, shifted their focus. AMD shut down CrossFire in 2017, and NVIDIA dropped SLI support in 2021.
While doubling or tripling up on GPUs has its allure, the reality is often less glamorous. With hefty costs, power demands, compatibility issues, and the need for specific hardware combinations, multi-GPU setups are becoming less of a go-to solution and more of a niche choice for those willing to navigate their complexities. As single GPUs grow more powerful, they're proving that sometimes, less is more.
The death of multi-GPU
Ultimately, the direction of gaming hardware and software trends rang the death knell for multi-GPU setups.
As games grew exponentially more complex, developers had little incentive to optimize for the shrinking multi-GPU market. The extra work for a minimal payoff wasn't worth it.
DirectX 12 also shifted support duties from AMD/Nvidia directly onto game coders - further crushing optimization hopes. And even when new titles claimed multi-GPU support, gamers faced endless problems in reality - lower FPS, visual bugs, and crashes. The touted power scaling collapsed into frustration.
By 2016, new single-card GPUs like NVIDIA's GTX 1080 delivered strong enough performance for most. Multi-GPU struggled to compete. It soon became a specialist-only territory.
Developers and manufacturers have aligned on single GPUs, marching forward with each generation. Bereft of software and hardware support, yesterday's multi-GPU dream builds have become relics left behind.
Ultimately, dual graphics card setups aren't practical or cost-effective for most gamers anymore. They rack up big bills yet face endless compatibility issues and a lack of support. With steep pricing, limited scaling, and hardware and software roadblocks galore, multi-GPU's pitfalls have eclipsed its promise.
AMD, NVIDIA, and game developers have all shifted focus to ever-advancing single-card solutions rather than continuing to prop up the niche world of multi-GPU builds. Drivers and optimizations faded away, leaving multi-GPU to wither. Perhaps one day, an entirely new use case will emerge to resurrect its fortunes. But for now, yesterday's multi-GPU dream remains largely out of reach - a relic of the computing past rather than our present or future.
Robert is a Taiwan-based writer and digital marketer at iamrobert design. He has a passion for helping people simplify their lives through tech.