A game of graphics cards

Your non-gamer-friendly guide to today’s GPU market. (Photo: Wccftech)

Even though I’d describe myself as a gamer through and through, I only got my own gaming PC in 2020. I still don’t understand much about computer parts, but thankfully I have tech-savvy friends who offer to build and upgrade my gaming PC for me (hence my writer’s bio).

While I could probably never build a PC on my own, I acknowledge that I should, at the very least, have a grasp of today’s market for computer parts. In the same way that a cook needs to understand where their ingredients come from, I think the same should apply to your average gamer. So today I’m taking a deep dive into probably the most important and priciest part of the computer—the GPU.

A GP—what now?

A graphics processing unit (GPU)—also commonly referred to as a graphics card or video card—is the computer component that renders images, scenes, and animations. So your phone, laptop, and tablet all have GPUs of their own.

Not all GPUs are created equal. The more powerful your GPU, the faster it can render 3D graphics. GPUs are mostly associated with gaming, since powerful GPUs are needed to run games with complex animations smoothly. They also come in handy for creatives who render 3D models or edit videos, which is why many digital artists opt for MacBook Pros as their laptop of choice.

GPUs also enable cloud gaming, video encoding, video streaming, and—to the disappointment of many gamers—cryptocurrency mining. Essentially, the cryptocurrency industry requires very powerful GPUs for efficient blockchain mining. Not only does crypto mining consume excessive amounts of electricity, but it also pulls loads of GPUs off the market. That, along with supply chain disruptions caused by the pandemic, led to the 2020 GPU shortage.

But the good news for gamers is that Ethereum—the world's second-largest cryptocurrency—announced its migration to proof of stake, which does not require mining. No mining means no need for GPUs and about 99% less electricity consumed. I call that a win for gamers and environmentalists alike.

This series of GPU market shifts leads us to the GPU battle royale of the decade, with long-time defenders and a new challenger entering the ring.

Age-old rivalry: Nvidia and AMD

The GPU market has long been governed by a duopoly, Nvidia and Advanced Micro Devices (AMD), à la Apple vs. Samsung. Since Ethereum announced its new validation process, Nvidia and AMD have been making up for lost time with big tech moves—whether or not those moves are lucrative depends on who you ask.

Ethereum’s shift left miners with useless mining rigs, and as they scrambled to sell off their GPUs to cut their losses, the market became saturated with used RTX 3000 series cards. Despite that, Nvidia pressed ahead and launched its RTX 4000 series, headlined by the RTX 4090, which boasts heightened efficiency and AI-powered graphics. Even at a $1,599 price point, it’s already sold out at most if not all retailers.

Literally hours before Nvidia’s RTX 4000 series announcement, AMD confirmed the launch of its Radeon RX 7000 graphics cards in early November. The series is said to offer a major uplift in gaming performance and GPU clock speeds approaching 4 GHz—potentially making it the fastest GPU on the market. It’s expected to be priced close to its predecessor, which launched at $999. But we can only wait and see.

It seems that both Nvidia and AMD are playing their (graphics) cards close to the chest to one-up each other. But they would be naive to count out the new challenger bound to shake up this rivalry.

Intel, the new challenger

For the first time in what seems like a lifetime, gamers have a third choice of GPU for their PC builds. Intel is best known for making microprocessors and semiconductor chips—the same kind of silicon at the heart of a GPU. Now, Intel is turning heads with the release of its Arc A750 and A770 GPUs.

Intel’s well-timed entry, following the crypto shakeup and supply chain constraints, served as the perfect segue into the GPU market. Intel took a page from AMD’s book and adopted a strategy of undercutting its rivals on price to performance—what you get for every peso spent. Intel’s price tag of $349 isn’t as daunting as GPUs that cost over $1,000.

And this shows in early reviews. So far, the cards run DirectX 12 (newer) games just fine but struggle with DirectX 9 and DirectX 11 (older) games like Counter-Strike: Global Offensive. The consensus at the moment: they perform worse than some GPUs in the same price bracket.
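For the curious, price to performance is just simple arithmetic: divide what a card costs by the frame rate it delivers, and the lower the number, the better the deal. Here’s a minimal sketch—note that the card names, prices, and FPS figures below are made-up illustrations, not real benchmark results.

```python
# Rough sketch of "price to performance": dollars spent per frame
# per second. All numbers here are hypothetical examples.

def dollars_per_fps(price_usd: float, avg_fps: float) -> float:
    """Lower is better: fewer dollars paid for each frame per second."""
    return price_usd / avg_fps

# Hypothetical cards with hypothetical prices and average FPS.
cards = {
    "Budget card": (349, 90),
    "Flagship card": (1599, 140),
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${dollars_per_fps(price, fps):.2f} per FPS")
```

By this (toy) math, a cheap card that runs games well enough can be a far better value than a flagship that costs four times as much for less than double the frame rate.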

The wild cards

The shifts in the GPU market seem to be only beginning. Don’t even get me started on President Biden’s move to cut China off from certain semiconductor chips made anywhere in the world with US equipment, or the massive decline in PC demand. Despite all these externalities, I think Intel taking this leap of faith is a good thing.

The diversification of the GPU market keeps GPU manufacturers more accountable.

“If one of these major players misses a beat, customers in all markets—consumer, data center, HPC—are left with but one other choice in the highly competitive business of big, honking-fast GPUs,” said tech analyst and journalist Dave Altavilla.

Intel still needs to deliver when it comes to GPU silicon and software execution, but as long as it provides a healthy mix of products in the market, it will be a welcome player for consumers and a disruptor in the competitive GPU market.

We can’t really say for sure how accessible GPUs will be across the spectrum of price points and performance levels, but variety in the market is a good start. All we can do is watch how these GPU manufacturers deal their (graphics) cards.

Sam Wong

Sam asked a friend to build her a gaming PC, and now she thinks she’s qualified to write about tech. Her dad once tried to get her to switch to Ubuntu, and failed. (Sorry, dad).
