June 26, 2025


Nvidia Announces GeForce RTX 3050, RTX 3090 Ti and Laptop 3070 Ti and 3080 Ti

Nvidia has finally announced a “budget” RTX card with the GeForce RTX 3050. Not to be confused with the laptop RTX 3050 and 3050 Ti, the desktop variant is a slightly different beast, doubling the VRAM. Along with the 3050 desktop card, Nvidia also revealed details on the already-leaked RTX 3090 Ti and announced RTX 3070 Ti and RTX 3080 Ti laptop GPUs. These will all compete to join our list of the best graphics cards. Here’s what you need to know.

The RTX 3050 desktop card will use the same trimmed-down GA107 GPU found in the mobile 3050/3050 Ti. That’s not great news, since GA107 tops out at a 128-bit memory interface and 2560 CUDA cores. The desktop 3050 will use the maximum 20 SMs (streaming multiprocessors), giving it the full 2560 CUDA cores, and of course it also features RT cores for ray tracing and tensor cores for DLSS and other applications.
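As a quick sanity check on those core counts: Nvidia’s consumer Ampere (GA10x) chips pack 128 FP32 CUDA cores per SM, so the totals follow directly from the SM counts. A minimal Python sketch, assuming that 128-cores-per-SM figure:

```python
# Consumer Ampere (GA10x) GPUs have 128 FP32 CUDA cores per SM.
CORES_PER_SM = 128

def cuda_cores(sms: int) -> int:
    """Total CUDA cores for a given streaming multiprocessor count."""
    return sms * CORES_PER_SM

print(cuda_cores(20))  # desktop RTX 3050: 20 SMs -> 2560 cores
print(cuda_cores(84))  # RTX 3090 Ti: fully enabled GA102 -> 10752 cores
print(cuda_cores(82))  # RTX 3090: 82 SMs -> 10496 cores
```

The same multiplier explains why the 3090 Ti’s two extra SMs only add 256 cores, a roughly 2.4% bump.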

There is some good news in that the 3050 desktop card will at least feature 8GB of VRAM — much better than the 4GB used on the mobile 3050 Ti and 3050. Nvidia hasn’t revealed whether that’s 16Gbps GDDR6 or something else (18Gbps would be nice…), but we’ll find out more in the next few weeks. Also on the good news front is that the TGP (total graphics power) is just 130W for the reference specs, though it sounds as though we’ll see plenty of AIB partner cards still equipped with an 8-pin power connector — that’s sufficient for 225W of total power, including the PCIe x16 slot.
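That 225W figure comes straight from the PCIe power budgets: the x16 slot itself can supply up to 75W, and a single 8-pin PEG connector is rated for 150W. A quick sketch of the arithmetic:

```python
PCIE_SLOT_W = 75    # PCIe x16 slot power budget (per the PCIe CEM spec)
EIGHT_PIN_W = 150   # one 8-pin PEG auxiliary connector

board_ceiling = PCIE_SLOT_W + EIGHT_PIN_W
print(board_ceiling)  # 225 -- max board power with slot + one 8-pin

RTX_3050_TGP = 130  # reference TGP from Nvidia's announcement
print(board_ceiling - RTX_3050_TGP)  # 95W of headroom for factory overclocks
```

In other words, even overclocked AIB cards have plenty of headroom before a second power connector would be needed.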

The GeForce RTX 3050 will go on sale on January 27, with a suggested starting price of $249. It should also feature Nvidia’s LHR anti-mining hardware, though as we’ve seen already, that’s unlikely to truly stop miners from trying to grab the cards. Maybe Ethereum’s long-awaited switch to proof of stake will help when that finally arrives sometime this year (fingers crossed). The RTX 3050 should also help to fill the gap between the RTX 3060 and the previous generation RTX 2060, though we’ll have to wait and see how it performs once we can get cards in for testing.

(Image credit: Nvidia)

Next up, the RTX 3090 Ti was previously leaked and will basically give a minor bump in GPU core counts at the top of Nvidia’s GeForce product stack. Given we now have Ti cards for the 3090, 3080, 3070, and 3060, it’s a bit odd that there isn’t a desktop RTX 3050 Ti, though that model does exist on laptops — except with lower performance than the desktop 3050. Anyway, the RTX 3090 Ti will utilize a fully enabled GA102 GPU, giving it 84 SMs and 10752 CUDA cores, compared to the RTX 3090’s 82 SMs and 10496 CUDA cores.

Nvidia will likely also bump up the maximum GPU clocks on the 3090 Ti (it hasn’t revealed exact specs yet), along with running higher GDDR6X memory speeds. The 3090 used 24GB of 19.5Gbps memory spread across twenty-four 8Gb chips, but the RTX 3090 Ti will feature 24GB of 21Gbps GDDR6X, presumably using twelve 16Gb chips. If correct, that means all of the memory will reside on one side of the PCB, which should help with cooling the hot, power-hungry GDDR6X.
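The memory math checks out either way: chip count times chip density gives capacity, and per-pin speed times bus width gives bandwidth. A rough sketch, assuming GA102’s known 384-bit memory interface:

```python
def capacity_gb(chips: int, density_gbit: int) -> float:
    """Total VRAM in GB: chip count x chip density (Gb) / 8 bits per byte."""
    return chips * density_gbit / 8

def bandwidth_gbs(pin_speed_gbps: float, bus_bits: int) -> float:
    """Memory bandwidth in GB/s: per-pin speed x bus width / 8."""
    return pin_speed_gbps * bus_bits / 8

print(capacity_gb(24, 8))         # RTX 3090: 24 x 8Gb chips = 24.0 GB (both PCB sides)
print(capacity_gb(12, 16))        # RTX 3090 Ti: 12 x 16Gb chips = 24.0 GB (one side)
print(bandwidth_gbs(19.5, 384))   # RTX 3090: 936.0 GB/s
print(bandwidth_gbs(21.0, 384))   # RTX 3090 Ti: 1008.0 GB/s
```

So the faster 21Gbps memory would push the 3090 Ti past the 1TB/s bandwidth mark.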

Whatever the core specs, the RTX 3090 Ti Founders Edition will keep the same massive 3-slot design of the RTX 3090 Founders Edition. Nvidia’s partners will naturally experiment with their own custom designs, which will include factory overclocks. Get ready for a new halo GPU… at least until Lovelace and the RTX 40-series launch, which we still expect to happen before the end of 2022.

Nvidia didn’t announce a launch price for the RTX 3090 Ti, but considering the RTX 3090 routinely sells for over $2,000 already, we suspect it will start at $1,999. We also expect that, like the 3090, the 3090 Ti won’t implement Nvidia’s LHR technology, meaning it will offer full mining performance and should be capable of over 120 MH/s in Ethereum (at least until Ethereum 2.0 kills off mining). Nvidia will provide additional details on the 3090 Ti later this month.

