What We Know About Nvidia’s GeForce RTX 30 Series

As much of the console world waits with bated breath for Sony and Microsoft’s new systems to drop this fall, Nvidia was kind enough to show us all what the future of PC gaming – and consoles – is going to look like.

The GeForce RTX 30 Series promises to bring more of everything to users and, as Nvidia’s history has proven, it isn’t all marketing hype.

From Nvidia’s own marketing hype: “GeForce RTX 30 Series GPUs also feature several world firsts: they’re the first gaming-class graphics cards with up to 24GB of new, blazing-fast GDDR6X VRAM; they’re the first GPUs with HDMI 2.1, for 4K high refresh rate and 8K gaming; they’re the first discrete GPUs with support for the AV1 codec, enabling you to watch high-resolution streams using significantly less bandwidth; and our Founders Edition cards are the first with innovative dual axial flow through cooling solutions.

“And of course, GeForce RTX 30 Series GPUs come packed with new technologies for esports competitors, livestreamers, creators, and gamers alike – NVIDIA Reflex reduces system latency, providing split-second PC gaming responsiveness; NVIDIA Broadcast turns any room into a home broadcast studio with AI-enhanced video and voice comms; updates to NVIDIA DLSS make 8K HDR gaming a reality on the GeForce RTX 3090 graphics card; and new NVIDIA Studio enhancements make creative applications run faster than ever before.”

NVIDIA also showed off its so-called Reflex tech that will help “improve esports reaction time,” among other things. As anyone who follows esports knows, that’s no small thing: lag can often mean the difference between winning and losing a match. The company also showed off a host of new streaming features to make it easier for gamers using NVIDIA tech to put their gameplay online, and it demonstrated NVIDIA Omniverse Machinima with Mount & Blade II: Bannerlord working the crowd.

NVIDIA Omniverse Machinima “gives you the power to remix, recreate, and redefine animated video game storytelling,” the company claims. NVIDIA also showed off its next-gen ray tracing and DLSS, and discussed the RTX Ampere GPU. Of course, the marquee moment of the presentation was the demonstration of next-gen games using this tech, including Cyberpunk 2077.

All of this was capped off with a thorough introduction to the GeForce RTX 3080:

“Powered by the NVIDIA Ampere architecture, our second generation of RTX, GeForce RTX 30 Series graphics cards deliver staggering rasterized, ray-traced, and AI-enhanced horsepower, giving you the performance to make your games more beautiful and realistic with maxed out settings.

“GeForce RTX 30 Series GPUs power the next wave of amazing game titles, including Cyberpunk 2077, Fortnite with RTX, Call of Duty: Black Ops Cold War, Watch Dogs: Legion, and Minecraft with RTX for Windows 10. And they give competitive gamers the lightning response times needed to win in games like Valorant and Fortnite.”

What do you think about the GeForce RTX 30 Series? Did you watch the presentation? Let us know your thoughts in the comments.

And don’t forget to check out our other gaming-related content by clicking here.

Could AMD Big Navi Challenge Nvidia’s GeForce RTX 3080 in the Marketplace?

AMD is having a moment and the company looks set to extend it if tech rumors are to be believed.

The graphics card arms race continues, and it looks like AMD is about to shake up the current world order with its high-end Big Navi option. Some even think that Big Navi’s stats put it in line to challenge Nvidia’s own GeForce RTX 3080 but, with only paper specs to back up that claim, we’ll take a wait-and-see approach.

So what makes Big Navi special?

While reports prior to an official launch are necessarily vague, what we can tell you with some certainty is that the Big Navi graphics card will be a pretty big leap over rival Nvidia’s GeForce RTX 2080 Ti; a lot of Big Navi’s specs are referenced in comparison to that card. The YouTube channel Moore’s Law is Dead said, for example, that Big Navi’s GPU is 40 to 60 percent more powerful than the RTX 2080 Ti. The source for this claim is “someone who hasn’t been wrong in the past,” for whatever that is worth.

For reference, the (Founders Edition) GeForce RTX 2080 Ti’s specs are as follows: GPU Architecture: Turing; RTX-OPS: 78T; Boost Clock: 1635 MHz (OC); Frame Buffer: 11GB GDDR6; Memory Speed: 14 Gbps.
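As a quick sanity check on those numbers, the listed memory speed translates into peak bandwidth with simple arithmetic. Here’s a minimal sketch in Python, assuming the 2080 Ti’s published 352-bit memory interface (a figure that isn’t in the spec line above):

```python
# Rough memory-bandwidth arithmetic for the RTX 2080 Ti's listed specs.
# Assumption: the 352-bit memory bus width is the card's published spec,
# though it isn't quoted in the spec list above.
memory_speed_gbps = 14   # per-pin data rate, in Gbps
bus_width_bits = 352     # RTX 2080 Ti memory interface width

# Peak bandwidth = per-pin rate x bus width, converted from gigabits to gigabytes
bandwidth_gb_per_s = memory_speed_gbps * bus_width_bits / 8
print(f"Peak memory bandwidth: {bandwidth_gb_per_s:.0f} GB/s")  # ~616 GB/s
```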

And the entire basis for the claim is that Big Navi’s performance eclipses the RTX 2080 Ti by enough to position it just behind, or on par with, Nvidia’s next-gen graphics card.
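In concrete terms, here’s the arithmetic the rumor implies, with the RTX 2080 Ti normalized to 1.0x; these are the rumor’s figures, not confirmed benchmarks:

```python
# Back-of-the-envelope math on the rumored uplift, treating the
# RTX 2080 Ti as a 1.0x performance baseline.
baseline = 1.0                        # RTX 2080 Ti, normalized
uplift_low, uplift_high = 0.40, 0.60  # the rumored 40-60 percent gain

low = baseline * (1 + uplift_low)
high = baseline * (1 + uplift_high)
print(f"Implied Big Navi performance: {low:.1f}x to {high:.1f}x a 2080 Ti")
```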

Multiple reports indicate that Big Navi will use AMD’s next-gen RDNA 2 graphics architecture and that we can expect to see it, along with Nvidia’s RTX 3080, this September. For what it’s worth, the Nvidia graphics card will use the company’s new Ampere graphics technology.

Much of this caps off a year of pretty big wins for AMD. Console titans Sony and Microsoft both announced that their next-gen consoles would use RDNA 2 architecture. Naturally, that doesn’t mean much for the PC graphics card race, but given that Big Navi and the RTX 3080 will be debuting around the same time, it will be interesting to see which company comes out on top.

What do you think? Does AMD’s Big Navi sound like something that could bring Nvidia down a peg or two, or is it just more boastful marketing? Does it even matter? Which graphics card would you choose if you had the option? Let us know your thoughts in the comments section.

And be sure to check out our other video game articles by clicking here.

Razer’s New 300 Hz Blade Pro

Are 300 Hz displays and 10th gen Intel processors enough to get consumers to pick up the Razer Blade Pro 17?

That’s what the company hopes, and it’s throwing everything but the kitchen sink at the project to woo the often fickle PC consumer back to Razer.

A notch above the Blade Stealth and Blade, the Blade Pro 17 is the one with the big 17-inch screen, and it comes in three hot configurations that are all ready to burn a hole through your wallet.

These are the $2,599.99, $3,199.99, and $3,799.99 versions, with the cheapest variant rocking the Nvidia GeForce RTX 2070 Max-Q while the middle-tier and top-tier iterations use the Nvidia GeForce RTX 2080 Super Max-Q.

All share the same CPU, an Intel Core i7-10875H, and 16GB of DDR4 RAM. The models also share the same 70.5 WHr battery and the same wireless hardware: an Intel AX201 with Wi-Fi 6 and Bluetooth 5.0.

The cheapest and middle versions of the Blade Pro 17 use a 512GB PCIe NVMe SSD for storage, while the top-of-the-line model uses a 1TB PCIe NVMe SSD.

When it comes to the display, the Blade Pro 17 truly distinguishes its models from one another, with variants geared toward gamers who care deeply about refresh rate and resolution.

The cheapest and middle versions of the computer come with a 17.3-inch, 1920 x 1080, 300 Hz display, while the most expensive version has a 17.3-inch, 4K, 120 Hz panel.

Basically, the display is pretty amazing at any level. But that’s also where we had our biggest issues with the product. More on that later.

To round out the Blade Pro 17’s list of features, you get an SD card reader, three USB 3.2 Gen 2 Type-A ports, RJ45 Ethernet, HDMI, Thunderbolt 3, and two USB 3.2 Gen 2 Type-C ports.

The Razer Blade Pro 17 should start shipping to consumers and retail outlets later this month. In our opinion, it’s a pretty solid offering, and we think the specs justify the asking price.

The prices are also not uncommon for this kind of laptop, and we didn’t really expect Razer to engage in intense price competition. What we did expect were the specs, and it’s the display in particular that concerns us.

Can a gaming laptop consistently drive a 300 Hz display over its lifetime?

We doubt it. Thermal throttling should become an issue: keeping that panel fed means rendering up to 300 frames per second, a sustained load on the CPU and GPU that a thin 17-inch chassis can only cool for so long, even before you consider the rest of the system’s power-hungry components.
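For a sense of scale, here’s the frame-time budget at a few refresh rates (plain arithmetic, not a benchmark): at 300 Hz, the system gets barely a third of the rendering time per frame that it gets at 120 Hz.

```python
# Frame-time budget at each refresh rate: to keep the panel fully fed,
# the system must finish rendering each frame within this window.
for refresh_hz in (60, 120, 300):
    frame_budget_ms = 1000 / refresh_hz  # milliseconds per frame
    print(f"{refresh_hz:>3} Hz -> {frame_budget_ms:5.2f} ms per frame")
```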

Outside of that, though, we’re impressed. If the system holds up once it’s subjected to real-world conditions across many users, we’ll drop our reservations, but for now, we’re cautiously optimistic.

Some people might read a wait-and-see approach as a verdict not to buy the product but, really, we just want to make sure it’s capable of delivering on the promises it is making.