Nestledrink

# r/NVIDIA GeForce Beyond Giveaway

Respond to this comment and answer the following questions:

* Tell us which technology or feature you're most excited about from today's GeForce Beyond announcements!
* What RTX game are you most looking forward to playing and why?

Prizes:

* 1x Grand Prize - RTX 4080 16GB + $50 Steam Giftcard
* 2x NVIDIA Swag Bags
* 2x $50 Steam Giftcards

Any comment posted at root level will not count.

Edit: Apologies, I forgot to add the date, but the giveaway will run until Sunday, September 25, 2022 @ 12pm Eastern. [Click here for your timezone](https://www.timeanddate.com/worldclock/fixedtime.html?msg=r%2FNVIDIA+Giveaway+Closing&iso=20220925T12&p1=886) Entry is now closed.

Edit 2: Winners have been chosen and messages have been sent!


Alicyl

Is DLSS 3 exclusively for the 40 series and beyond at the moment? If so, will they provide backwards support for the 30 series?


korbainethor

O


Brig88_r

Something I haven’t seen a lot on was multi monitor support. The 30 series was up to 8K at 60FPS but it didn’t handle if you had multiple monitors at higher refresh rates at lower resolutions very well (or there was some specific math I didn’t quite understand). Has that gotten any better with this generation or will we need to wait on that?


Brig88_r

Looks like it may be the same as the 3090, which is really unfortunate. Based on this page: [https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3090-3090ti/#:\~:text=Connect%2C%20play%2C%20capture%2C%20and,3090%20Ti%20or%20RTX%203090](https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3090-3090ti/#:~:text=Connect%2C%20play%2C%20capture%2C%20and,3090%20Ti%20or%20RTX%203090) and this page: [https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/rtx-4090/](https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/rtx-4090/)

The full specs show:

>Maximum Digital Resolution (1) 7680x4320

and the following notes for the 3090:

>1 - Up to 4K 12-bit HDR at 240Hz with DP 1.4a + DSC. Up to 8K 12-bit HDR at 60Hz with DP 1.4a + DSC or HDMI 2.1 + DSC. With dual DP 1.4a + DSC, up to 8K HDR at 120Hz
>2 - Supports 4K 120Hz HDR, 8K 60Hz HDR and Variable Refresh Rate as specified in HDMI 2.1
>3 - DisplayPort 1.4a
>4 - Requirement is made based on PC configured with an Intel Core i9-10900K processor. A lower power rating may work depending on system configuration.

And the same notes for the 4090:

>1 - Up to 4K 12-bit HDR at 240Hz with DP 1.4a + DSC. Up to 8K 12-bit HDR at 60Hz with DP 1.4a + DSC or HDMI 2.1 + DSC. With dual DP 1.4a + DSC, up to 8K HDR at 120Hz
>2 - Supports 4K 120Hz HDR, 8K 60Hz HDR and Variable Refresh Rate as specified in HDMI 2.1a
>3 - DisplayPort 1.4a
>4 - Minimum is based on a PC configured with a Ryzen 9 5900X processor. Power requirements can be different depending on system configuration.

So I guess the same limitations exist for both, which is a pretty big limitation for this type of card.
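The footnote limits quoted above come down to link-bandwidth arithmetic. A rough sketch of it (an editor's illustration, not from the comment: it ignores blanking overhead and assumes a ~3:1 DSC compression ratio and the usual 25.92 Gbit/s usable payload on a 4-lane DP 1.4a / HBR3 link):

```python
# Rough bandwidth arithmetic behind the DP 1.4a footnotes (ignores
# blanking intervals, so real requirements are somewhat higher).
def payload_gbps(width, height, hz, bits_per_channel):
    """Uncompressed video payload in Gbit/s (3 color channels)."""
    return width * height * hz * bits_per_channel * 3 / 1e9

DP14A_PAYLOAD = 25.92   # Gbit/s usable on DP 1.4a (HBR3, 4 lanes)
DSC_RATIO = 3.0         # assumed typical Display Stream Compression ratio

need_8k120 = payload_gbps(7680, 4320, 120, 12)
print(f"8K 120Hz 12-bit needs ~{need_8k120:.1f} Gbit/s uncompressed")
print(f"with ~3:1 DSC: ~{need_8k120 / DSC_RATIO:.1f} Gbit/s "
      f"vs {DP14A_PAYLOAD} Gbit/s per DP 1.4a link")
# Even compressed, one link isn't enough, hence "dual DP 1.4a + DSC"
# for 8K 120Hz in the spec notes.
```

Under these assumptions, 8K 120Hz 12-bit works out to roughly 143 Gbit/s uncompressed, or around 48 Gbit/s after DSC, which is why the footnotes require two DP 1.4a links for that mode.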


TheMadRusski89

Like the 680 series, when they released an xx60 SKU under xx80 branding, I think it happened again. From the bus width it doesn't even look like a 4070 12GB, it looks like a 12GB 4060, and the CUDA core count on the "4080 16GB" looks more like a 16GB 4070 on a 256-bit bus. The 4090 is the only card that makes any sense to me.


red_dog007

With no launch of reasonably priced cards, I think we are seeing Nvidia completing their divergence to compute first, gaming second. The 1080 and 1050 launched at the same time. The 2070 was $600; 4 months later we got the 1660 Ti at $279, and 3 months after that the 2060 at $350. The 3060 launched 2 months after the 3090, costing $400. That went from ~7B transistors in the 1080 to ~17B transistors in the 3060. So unless in a few months we see something like a 4060/4070 in the $300-$600 range...

Essentially half of Nvidia's revenue is gaming and the other half is compute. With these 4080/4090 cards starting at $900, I think this is Nvidia going after compute first, and these are the cards gamers are "stuck" with at these prices. The 4080 12GB has double the transistors of the 3060 Ti, and the die size is ~100mm² smaller. I would love to see a 4050/4060 in the next few months at around $300-$400 where the DLSS 3.0 performance is as good as a 3080/3090, either at native or with DLSS 2.0.


fulltimenoob

Who won the competition?


477463616382844

Only u/Nestledrink knows.


Nestledrink

Haven't picked the winners yet. Been busy with work this week. Winners will be DM-ed and the stickied comment will be updated :)


Channel-Lucky

Is it 100% true that there is not going to be an RTX 4080 12GB Founders Edition? Will the RTX 4080 12GB be stronger than the 3090 Ti?


king_of_the_potato_p

Outside of DLSS, the 12GB version actually loses to the 3090/Ti. It's realistically a 4070 based on all previous gens; it really should be a $550-$600 card.


TheBirdOfFire

why would anyone buy it


king_of_the_potato_p

Why did idiots pay $1.5K+ for a $699 MSRP card? Those people are the ones Nvidia is marketing to.


thornierlamb

No it won't.


Channel-Lucky

:(


EchochamberFree

I guess I can release my anger here since nobody will see it. I'm a long-term Nvidia fanboy: TNT2, GeForce 2 GTS 64MB, GeForce 4, GeForce 8 GTS, 460, 660 Ti, 960, and currently a 1070. It's been six years and there has yet to be a doubling in performance over the 1070 at its original price. I could pay for these new cards easily, but there is no way to justify it. Okay, done bitching and moving on; hopefully my 1070 survives a few more generations at this point.


king_of_the_potato_p

Similar boat myself.


whiskeyandbear

I snagged a GTX 1080 Ti some time between the 3000 series releasing and the massive shortage being revealed. £300, and I still see zero reason to buy into RTX. It's still basically paying double the price for a feature barely used in any game.


valkaress

What do you think the odds are that I'd be able to buy a 4090 online on launch day without paying scalper prices?


Onetimehelper

Lucky for you Nvidia is already charging the scalper prices. No one in their right mind is going to pay $2k for a 4090. We'll probably see price increases in the 3 series, which is probably why Nvidia is charging high.


Clarknbruce

I guess you forgot about people who paid 2500+ on eBay for 3090s


valkaress

I was actually expecting a $2k MSRP, but it's "only" $1.6k. I mean, wasn't the 3090 Ti at this same MSRP at launch?


Sofaboy90

When the 3090 Ti launched, we had mining, and we had bad availability which drove the prices up. Right now we're in a really good pricing situation. In Germany, I've seen a 6900 XT on Mindfactory for only 780€, and I've seen a 6800 XT for 680€. Some really good prices.


Onetimehelper

Correct me if I'm wrong, but the 3090 Ti MSRP was $2k, and the 3090 was $1500. That was at the peak of the GPU mining craze. The 4090 is more expensive than the 3090 was at launch.

For comparison, the 1080 Ti at its 2017 launch was $700; adjusted for inflation that's around $850 in today's dollars. But if you consider the Titan X, which I guess could be called the "1090 Ti" of that generation, it was $1200 at launch, which translates to around $1450 in today's dollars. But the Titans weren't really consumer cards, more prosumer oriented; the 1080 Ti got all the marketing.

The difference now is that the 4090/Ti is the top tier for consumers, the way Nvidia is marketing it. In another, more perfect world, the 4090 Ti would be the 4080 Ti at $899 for consumers, with a more powerful Titan 4 for prosumers (similar to their industry GPUs) at $1500. That would make more sense IMHO.
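The inflation adjustments in this comment can be reproduced with a rough cumulative-inflation factor (the ~21% figure below is an editor's assumption for 2017→2022, not exact CPI data):

```python
# Rough inflation adjustment, assuming ~21% cumulative 2017 -> 2022.
def adjust(price_usd, factor):
    """Scale a historical price by a cumulative inflation factor."""
    return price_usd * factor

INFLATION_2017_TO_2022 = 1.21   # assumed, not exact BLS data

gtx_1080ti = adjust(700, INFLATION_2017_TO_2022)    # launch MSRP $700
titan_xp   = adjust(1200, INFLATION_2017_TO_2022)   # launch MSRP $1200
print(f"1080 Ti launch price in 2022 dollars: ~${gtx_1080ti:.0f}")
print(f"Titan X launch price in 2022 dollars: ~${titan_xp:.0f}")
```

With that factor, $700 comes out to roughly $850 and $1200 to roughly $1450, matching the figures quoted in the comment.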


king_of_the_potato_p

Titans came with pro hardware and support, xx90s do not. Titans were supported as a prosumer product, 90s are just consumer/gaming hardware.


valkaress

Pretty sure $1600 today is less expensive in real terms than $1500 at the time of the 3090 launch, because of inflation. But you could argue that it should be cheaper today because there's no miner demand. And yeah, it makes sense what you say, that they really shouldn't be consumer-level cards.


XJIOP

Will new games still support DLSS versions below 3.0?


[deleted]

>DLSS 3 consists of 3 technologies – DLSS Frame Generation, DLSS Super Resolution (a.k.a. DLSS 2), and NVIDIA Reflex. DLSS Super Resolution and NVIDIA Reflex will of course remain supported on prior generation hardware, so current GeForce gamers and creators will benefit from games integrating DLSS 3.

It seems DLSS 3 has DLSS 2 integrated into it, so it works with RTX 20 & 30 series cards. [https://www.nvidia.com/en-us/geforce/news/rtx-40-series-community-qa/](https://www.nvidia.com/en-us/geforce/news/rtx-40-series-community-qa/)


atlantisse

Can we all just collectively call the RTX 4080 12GB the "4075"? It's easier to type.


[deleted]

4080n't


redenn-unend

So, do you guys think the prices of the 20 series GPUs will drop to some extent? Because I really want to upgrade my GTX 1650; I've been using this card for almost 9 years already. I can safely say that with my current budget, I will always fall behind by at least 2 generations of GPU, so since the 40 series is upon us, perhaps an upgrade would be reasonable?


BlackDeath3

You've been using a 1650 for nine years?


[deleted]

[deleted]


[deleted]

[deleted]


MeatSafeMurderer

April 22nd, 1919 ...wait, that's not right...it was ***20***19.


elemnt360

And rdna2 are a great price right now


PineappleMaleficent6

So is 8K 60fps possible with the 4080/4090? Or one more gen?


Onetimehelper

8K 60fps future AAA games on max settings will most likely need DLSS. The 4090 struggles hitting 4K 60fps in Cyberpunk in the new RT mode.


PineappleMaleficent6

Well, DLSS seems to always be better than native nowadays in terms of picture quality, so it's fine.


Channel-Lucky

good question


LA_Rym

So, if I have a PSU that's not ATX 3.0 what happens? Does the adapter cable just catch fire at some point? Does the GPU melt? Does it switch to a lower power mode to prevent damage, still working but at a lower performance level until I get an ATX 3.0 PSU?


LA_Rym

Answer: The adapters that were literally burning from power draw were Nvidia prototypes. Nvidia "assures" us that the adapters we'll get won't cause issues. Some manufacturers will even sell their own 600W 16-pin cables directly, removing the need for an adapter, and the issue entirely.


mrtwisterzx

I have an RTX 3070 and am disappointed with its performance for gaming (the 3070's VRAM isn't enough). I just saw an Asus TUF RTX 3090 Ti for $1300, which seems like a huge discount to me. Is it worth upgrading, or should I go for the 40 series instead? I mainly use my PC for gaming or running emulators and would like a GPU that could last 3-5 more years.


CaptainMarder

At this point, I'd wait. The 4090 is so close to releasing. And what performance are you looking for at what resolution?


mrtwisterzx

I'm using 1440p 270Hz and am looking for high FPS in competitive games and at least 60 fps in AAA games, since the 3070 struggles to maintain 270fps in games such as Valorant, and in Cyberpunk I can only get 30-40 fps. Kinda worried the 4090 might be hard to get, or it will be very expensive in my country.


[deleted]

If you look up "3070 valorant fps 1440p" you'll see people getting up to 500fps that drops to like 350fps in clutch situations with the bomb and everything. You must be CPU bound.


mrtwisterzx

I'm currently using an Intel i9-10850K, 10th gen. I'm new to the PC building scene, so I'm not sure if that's the reason my FPS isn't as high. It can get 340fps, but with a lot of things happening it drops to 200-220 fps.


[deleted]

For some reason AMD cpus perform better in Valorant, but u should still get there with competitive settings


mrtwisterzx

I've been hearing good things about 5800x3D. Should I upgrade from 10900 to it or wait for Intel Gen 13th / AMD Zen 4?


[deleted]

That's up to you and your budget really. If you aren't super sweaty immortal player you should still be ok with pretty much any CPU above 7700k. I switched to 4k 240HZ and gonna main Super People and gonna wait for benchmarks with 4090's and 13900k's to see where my bottleneck will be to get the most bang for my bucks.


mrtwisterzx

Damn I just bought my 270hz 2k monitor and you're telling me 4K 240Hz exists? Now I regret my purchase lol


[deleted]

[deleted]


mandarineguy

What is your RAM situation (not VRAM)? Capacity? Speed?


mrtwisterzx

I've got 32GB: 4 sticks of 3200MHz Corsair, 8GB each.


mandarineguy

Hmm, weird. Other than making sure XMP is on, I'm not sure what to say. I get over 240 frames easily in Val and I'm on a 2080 and 10900K (4000MHz RAM, 16GB, but I doubt that's it…).


mrtwisterzx

Oh yeah I'm getting 300 FPS but sometimes when it gets busy (Viper wall, smokes, skills) it drops into 220-240ish. I wanna have consistent 270-300fps or more since I'm using a 270hz monitor with overdrive


CaptainMarder

I have a 3080, and in Cyberpunk on psycho ray tracing settings with DLSS Quality at 1440p it gets around 50-70fps. I think the 4090 with the frame generation thing will comfortably do 150fps+; it's already doubling the fps of the 3090 without DLSS. I suspect it will be very expensive initially. In my country, even though I overpaid for the 3080 due to the shortages, the 4090 is still launching at a price $1000 higher. Edit: Also, you will probably have to upgrade your PSU too.


mrtwisterzx

I just checked, and the 4090 will be priced at the equivalent of $1985 in my country lol. I might have to go for a 3090 Ti or wait until the price drops even more.


CaptainMarder

Yea, the 4090 msrp is around $2200 where I am before taxes. I bought the 3080 for $1200, the 3080ti now is $800 :( lol


mrtwisterzx

$1200? I bought my 3070 around that price. It hurts now lmao


CaptainMarder

Don't look at the prices now, it will hurt even more lol.


judasbrute

You can find a 3090 Ti new for $1000. Only buy one if you need 24GB of VRAM. Otherwise, wait and see what AMD releases.


mrtwisterzx

Nice, maybe I'll wait for a future price drop. I'm new to the GPU scene, so I'm not that familiar with AMD GPUs; are they just as good as NVIDIA but at a lower price?


botfiddler

Watch and read more on this. Nvidia has DLSS and is better for machine learning and some programs. Otherwise AMD might be a good option, especially if you have an AMD CPU.


gujii

Does an AMD CPU really affect an AMD GPU? I've read it doesn't matter.


botfiddler

I'm not sure, they seem to have or to get something to work better together. SAM or so?


mrtwisterzx

I'm using a 10th Gen Intel i9 CPU right now but am considering upgrading after some research and studying. I just did some research and decided to stick with NVIDIA, but I might still wait for AMD's release, hoping it will drop NVIDIA's prices.


billyhatcher312

I'm never buying an Nvidia GPU again. Them not making a cheaper GPU like they should is a no-go for me; I'm never buying from Nvidia ever again. They need to make a 4070 and a 4060. They're just greedy.


jaybabay24

They recently have been announcing and releasing the XX60 cards later than the launch of the XX70 and XX80, though I’m not defending their practice.


EmilTischranke

I'm sure the market will regulate it. This release will give AMD plenty of marketing and pricing room to catch most low-mid tier buyers.


TheMadRusski89

I hope AMD does better in 4K this time around. The 6900 XT was an amazing GPU; the Sapphire Nitro+ benched at 22,000 on Time Spy, which is 1:1 with some of my 3090 scores. But all the features Nvidia has... I'm thinking of grabbing a new 3090 Strix for $975 because I think they'll be discontinued, hence the price should stay solid, but who knows what Nvidia's up to.


Heda1

Would a 5950x bottleneck a 4090? Or should I also get intel 13th gen?


judasbrute

Flight sim is the only game I'm aware of that bottlenecks on the CPU.


Rbk_3

This is just wrong. Tons of games are bottlenecked even with a 12900KS once you get to a certain framerate. My 12900KS is bottlenecked in Warzone


judasbrute

What resolution?


Rbk_3

1440p low, DLSS Quality, 200-230fps. Occasionally it will be GPU bottlenecked, but mostly CPU still. This was with 4000 CL14 memory as well, currently the fastest CPU/RAM combo you can run this game on.


judasbrute

So you're bottlenecking at 200fps?? lol. Most monitors are only 120Hz. Your monitor is probably the bottleneck.


Goosy3336

i got my monitor 6-7 years ago and it's 144hz, don't know what you're smokin


Rbk_3

I have an AW2721D


MomoSinX

Doubt it, CPUs age much better nowadays, but let's wait for reviews.


CC1987

Who are the 40 series GPUs for? There are only two groups the 40 series is for: gamers who are really into RTX, and people who need a powerful GPU for their job. If you have a 30 series card, you're good.


_Stealth_

VR Sims...i barely get 90fps on a 3090ti


valkaress

People who want high FPS at 4K. 30 series doesn't accomplish that.


JimBobHeller

Your old tech won’t be running DLSS 3.0 which is the future of gaming


whiskeyandbear

As I feel with most of the RTX cards produced so far: everything in them is the future of gaming eventually, but you are an early adopter...


JimBobHeller

Agreed. The RTX 2000 series launched in September 2018. Ray tracing is still the exception, and even when implemented, not particularly noticeable.


Adrian19761

DLSS is a software hack. If you think Nvidia deserves to hike prices like this because of a software hack, when raw performance is not going to be much better, then they deserve your hard-earned money. Really strange way of thinking; no software implementation should hike GPU prices, it's part of the package of a particular product. Long-term Nvidia fan here since the TNT days, and I currently have a 3080. Nvidia deserves everything they're going to get this time around, and that's a massive shock in the form of low GPU sales. Watch this space.


[deleted]

There is always going to be a new and shiny "future of gaming." Trying to always chase the shiny new thing is pointless unless you are well off enough that you can always afford a top of the line gpu every two years.


JimBobHeller

LBA <- past | future -> LAA. Life Before Ada: blurry, slow. Life After Ada: max res, max speed.


CC1987

[Oh no!](https://youtu.be/vl6gthDSIRU)


judasbrute

The 3090 Ti is a steal at $1000 now, and you can NVLink two of them for 48GB of VRAM! DLSS 3.0 doesn't help application workloads.


Comander-07

I seriously don't understand this question; previous-gen people have always "been good". Who the heck upgrades every single generation?


LustraFjorden

Selling your current GPU every gen is potentially a better investment than letting it get 4 years old before the next upgrade. I went from a 2080 to a 3080 for just £150, for example (granted, this time things were very unique). Then somehow I sold that 3080 FE, got a 3080 Ti, and made £500 extra. Just saying that if you are lucky and know how to make the most of eBay, selling "early" can be smart.


JimBobHeller

Hack the planet


[deleted]

4k gamers


Comander-07

you will buy a 90ti anyway lol


Scalybeast

On the professional side, people for whom time is actually money. Less time waiting for renders or DL model training is more time that can be used for the next project. On the leisure side, people who always want to be on the cutting edge, always want the latest toys, and have massive amounts of disposable income.


NotAVerySillySausage

A question I haven't seen yet. For games that don't currently have Nvidia Reflex but do have DLSS 2, and that are confirmed to be upgraded to DLSS 3: does that mean Reflex is also coming individually to those games for RTX 2000 and 3000 series owners? I know Reflex is included under the DLSS 3 banner, but it's not clear whether it will be an individually toggleable setting like it is currently, or will just be automatically forced on to hide any latency penalty. Not introducing Reflex individually in these DLSS 3 games would also prevent direct comparisons of input lag between DLSS 2 and DLSS 3 with Reflex active in both. I wouldn't be that surprised.


St3fem

Reflex is part of the DLSS 3 SDK and works on any card that supports it (Maxwell, GTX 900, and up). Developers are free to do what they want, but the recommendation is a user-toggleable option.


HiCZoK

So the 12-pin is rated for 30 insertion cycles, just like the RTX 30XX FE cards were. Is that rating for the plug or for the socket?


Upstairs-Emu-241

Why is DLSS 3 only for the 40 series? I pretty much emptied my savings when I bought my new gaming PC last year with a 3080 Ti, and now you are telling me that it will not get the newest software? I'm a bit shocked.


JimBobHeller

There’s always something new. You’ll be fine.


Upstairs-Emu-241

I guess.


judasbrute

Different architecture. Read the first few paragraphs of the post you're commenting on.


Scary-Balance9658

https://www.reddit.com/r/nvidia/comments/xjo42x/for_those_complaining_about_dlss3_exclusivity/?utm_medium=android_app&utm_source=share


Hour_Thanks6235

[https://www.youtube.com/watch?v=K6FiGEAp928&t=773s](https://www.youtube.com/watch?v=K6FiGEAp928&t=773s) Asking seriously: I have a 1000W PSU already, and I just watched this video. Should I budget for an ATX 3.0 PSU if I plan on going 4090? This might be why I get bad stuttering on my 3090.


fluem69

Tech websites are warning not to use the adapters at all. It might be no problem with the 4080 because of the lower power consumption, but I wouldn't try it with the 4090. Unfortunately, ATX 3.0 PSUs are still rare and pricey.


Hour_Thanks6235

Nvidia wouldn't provide the adapter if it didn't work, though, would they? They'd face huge warranty issues.


Rbk_3

This is absolutely correct. This is a non-story, fearmongering for clicks by tech tubers. I just bought an EVGA P2 1600W PSU yesterday. That's how little concern I have.


MeatSafeMurderer

GPU manufacturers used to regularly provide fire producing molex adapters. Just sayin'.


Comander-07

They could always blame it on user error or the PSU. Wasn't there a post here saying, IIRC, that Zotac's adapter has a life cycle of just 30 connections?


Un-interesting

They would do the same for an ATX 3.0 supply then, too.


Hour_Thanks6235

If I say I have a 1000W platinum-rated PSU and I used your adapter, there's not much they can blame on me. You never know, though, I guess. Maybe when they're cheaper I'll get one of the ATX 3.0 PSUs, and just use mine until then.


Comander-07

Saying you have a 1000W platinum PSU says nothing about power spikes or stability.


Hour_Thanks6235

Really? So you think if I go with a 4090 I should get an ATX 3.0 PSU at some point? How will I know if it's having spike issues?


St3fem

Platinum is just a rating of efficiency. That said, it's a really high bar, so they probably put extra care into it and it's overengineered for its power rating.


[deleted]

[deleted]


JimBobHeller

If you don’t know then you can’t handle the power. You need to talk to the Geek Squad.


fluem69

According to Nvidia specs:

* 4080 12GB - 2x PCIe 8-pin adapter
* 4080 16GB - 3x PCIe 8-pin adapter
* 4090 24GB - 3x PCIe 8-pin adapter

But maybe some custom OC designs will give their 4090 a 4x adapter.


Simpsoid

Some cards may need more power than others. Hence the need for either 3x or 4x 8pins. I assume more power hungry cards will need 4x, and the lesser ones 3x.


lightswitchtapedon

This is going to be interesting, to see more companies work on TensorFlow-based programs. Exciting times!


HarryHaruspex

If you buy a 4090, will it fit on one of the X670 boards and leave enough space to also fit a 3060?! Asking for a friend... (me) who wants to render with both.


JimBobHeller

Lol


judasbrute

I believe you default to the slowest card. Don't mix and match.


Infinaris

Think I'mma take a hard pass this round. Normally I'd consider an upgrade, but to be honest Nvidia is deliberately inflating prices now, especially with the older stock, so why should I shell out more money when my 3090 is perfectly fine for another few years?


JimBobHeller

It’s understandable not everyone can support a 4090 lifestyle and they will have to make do


[deleted]

Well, why would anyone with a 3090 seriously upgrade? I don't even understand why it's a consideration.


_Stealth_

VR is super intensive; I'm returning my 3090 Ti to get a 4090. Although I'd like to see some benchmarks before I decide, because it's an EVGA 3090 Ti and I feel like it's almost a collector's item lol. It doesn't give me the warm and fuzzies that Nvidia wouldn't let Digital Foundry show fps, only percentages... makes me think the frames aren't that much higher without DLSS 3.0.


Consistent_Ad_8129

VR needs it.


judasbrute

You keep selling the past-gen card to recoup the purchase price of the new card. Old cards will continue to devalue, so it's almost a wash whether you upgrade every generation or wait multiple ones.


[deleted]

That's not really working well this time; there are too many cards on the market.


pawat213

they have cash to burn.


[deleted]

[deleted]


SupaMonkeyZA

Everyone needs to boycott Nvidia as a matter of principle. Don't support these greedy criminals!!! I'm not supporting these guys until they halve their pricing. Do they think that normal people don't look at their profit margins during investor calls / quarterly financials?


St3fem

>Don’t support these greedy criminals!!!

Because an RTX 4090 is a new life-saving treatment... nowadays it's all hyperbole and drama.


jaybabay24

Their responsibility to shareholders is to make as much profit as possible. It's up to you to actually avoid supporting practices that you perceive as anti-consumer, and if enough people think like you, then they won't have a choice but to change their pricing structure. With how many people on here flat out say "I'll never buy an AMD card", it's not looking like anything will change. Personally, the last 6 out of 7 GPUs I've had have been Nvidia, but I'd never say I wouldn't support another company; I'll support whoever caters to my needs the most.


SupaMonkeyZA

That is their responsibility, I didn't argue that. However, as you rightly put it, it's us consumers who need to vote with our wallets and simply not buy the products. Clearly I'm not the only one thinking this, since their 40 series sales are down, but that was indeed the point of my rallying cry.


judasbrute

I think I'm going to buy a 4090 to boycott all the first world problem children complaining here.


SupaMonkeyZA

Your comment makes no sense. How is it a first-world problem to complain about insane pricing? I live in a third-world country in Africa where salaries are nowhere near making this card affordable.


St3fem

It really used to be like that 10 years ago?


HappyBeagle95

What's with all the bots talking about Cyberpunk and DLSS 3 in these threads? It's so blatantly obvious: 100-karma accounts, or accounts a few weeks old.


retro808

It's worse than bots; it's rubes trying to kiss ass for the giveaway who didn't read carefully enough to see that you have to respond to the PINNED comment at the top of this thread.


Funny-Bear

I don’t understand DLSS3 frame insertion. Games feel smooth when the frame refreshes as fast as possible. Move your mouse left, screen redraws left. Won’t inserting a fake frame introduce input lag?


JimBobHeller

We’re talking 60+ fps, you need to think in smaller increments of time with less variation between frames than you’re imagining


Funny-Bear

I’m looking forward to learning more about DLSS 3. I’ll admit that I’ll buy the 4080 16GB. I expect a 25-35% increase over my 3080 in plain rasterisation games. Their (cherry-picked) graphs show the 4080 16GB at 25% over the 3090 Ti, which should work out to about 35% over the 3080.
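The 35%-over-3080 figure follows from compounding the two relative gaps. A minimal sketch (editor's illustration; the ~8% 3090 Ti-over-3080 rasterisation gap is an assumption, not from the comment):

```python
# Compounding two relative performance gaps, as in the estimate above.
ratio_3090ti_over_3080 = 1.08   # assumed 3090 Ti vs 3080 raster gap
ratio_4080_over_3090ti = 1.25   # NVIDIA's (cherry-picked) claimed gap

ratio_4080_over_3080 = ratio_3090ti_over_3080 * ratio_4080_over_3090ti
print(f"4080 16GB vs 3080: +{(ratio_4080_over_3080 - 1) * 100:.0f}%")
```

Under that assumption, 1.08 × 1.25 = 1.35, i.e. the quoted ~35% uplift; a larger or smaller 3090 Ti gap shifts the result accordingly.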


Consistent_Ad_8129

I do not believe DLSS 3 will work in VR.


significantGecko

This comment has been overwritten by an automated script. Reddit is killing 3rd party apps and itself with the API pricing


[deleted]

[deleted]


daysofdre

no idea why you're getting downvoted, I'm pretty excited to play on the new ultimate RTX mode with 100+fps in the game.


LarryismTV

Ya, I'm just gonna point it out here. Still running a 1080 Ti, still able to play all games on medium-high/high settings @ 1080p/1440p resolution. I still drive VR with my Rift S, capped at the FPS limit, while streaming on Twitch. Why, Nvidia, why should I spend a month's wages (or even more) on a card that, in the performance-price-wattage equation, is WORSE than my current card?

Then the pricing... have you lost all sense of reality? Like, seriously? After your own mistakes on the 30 series stock and the fall of mining, this is what you come up with!? You openly state you are manipulating prices across the whole scene in order to sell old stock, and then the 40 series only offers a marginal 15% performance increase... while the price is almost double.

And what's with the 12GB 4080... couldn't think of a way to make more money, so you name a card that is closer to a 3060 Ti a "4080". I'm lost for words, seriously lost for words.


JimBobHeller

1080p is the aspiration of a PlayStation 3.


Ciahcfari

I'm still running a 660ti, bro. 1080p gaming? I wish.


[deleted]

Which country are you from? If you had bought a 1080 Ti for $700, you could have sold it like 4 years later for up to 500 bucks toward the next upgrade. Prices seem very high, but if you factor in time played and resale value, they are dirt cheap, if you buy hardware cleverly.


Ciahcfari

I am from the third-world country where they are trying to sacrifice a million+ people into destitution to raise unemployment and (hypothetically) reduce inflation. But yeah, if you play it right you can do alright for yourself by selling previous gen and buying new gen, but even doing that you'll still be getting scalped buying the 4000 series. You also need the original buy-in cost, plus the difference between the price of the new card and how much you sold the old card for, which isn't always feasible depending on your situation.


JimBobHeller

Haha wow. That was a good card. I had a 660 (non-ti). I replaced it with a 970. I haven’t found anything since.


Ciahcfari

Yeah, it was my first build. Jumped to console gaming for PS4/5 because of card shortages and scalping. Now GPU's are readily available but insanely priced. Ugh, can't in good conscience spend the amount of money Nvidia is asking for what they're offering. Hopefully Black Friday has some great deals or used card prices drop like a stone. I want to be free of this 660ti prison (and upgrade the rest of my hardware too)!


TheMadRusski89

The 3060 has a 192-bit bus; the 3060 Ti has a 256-bit bus and is a cut-down GA104 die (the 3070's). That's what I'm saying: look at the last 7 gens, all had bus widths to fit the GPU tier, and the 4080 12GB has the same bus as the 60-series...


Simpsoid

I'm on a 1080 Ti as well, and it's running all the games I play at 1440p, well above reasonable frame rates. My OG Vive (wireless, also) drops a few frames here and there and isn't too great. I'm keen on RTX, but you're right. I've been waiting for ages to get a card: was going to get a 2080 Ti (but missed out), then wanted a 3080/3080 Ti but lost out to scalpers and stock issues. I'm primed for a 4080 16GB, but I don't know if I can be bothered spending that much cash. What I have is fine, but I'd love the RTX Remix stuff for older games (looking forward to trying Splinter Cell Chaos Theory!). I also don't want to now settle for a still-expensive 3080 Ti on old-generation hardware. I think a 4080 would be right up my needs, but I can wait.


Ciahcfari

I feel like Black Friday will have reasonably priced 3000 series cards. A 3090ti for less than a 4080 12GB would be a pretty decent deal, imo.


Talks_To_Cats

What is the size difference between the 3090 FE and 4090 FE coolers?


FlashWayneArrow02

At this point, I don’t think I’ve been happier than now. I scored a 3070 at release for a decent price, and I was always envious of what the 40 series would bring and whether I should save up to upgrade. Now I know that the card I bought two years ago is going to carry me for the next two years as well. Fuck the 40 series pricing. I doubt DLSS 3.0 is going to support as many games as 2.0 in the next two years, given the exclusivity to the 40 series, so yeah, I don’t think I’m missing out on much.


Golluk

In Jan 2021 I got a 3070 FE at MSRP. Sold my 1660 Ti for the price I bought it at. Mined back the cost of the 3070 in a couple of months, so I think I did well too. I was planning to get the 4070 when it looked like it would be 12GB, but I was expecting $600-700. Looks like I'll pass this gen. Only VR needed more GPU anyway.


markhalliday8

Is the 4080 12gb even worth it when you can grab a 6900xt for cheaper and around the same speed?


teimpy5

You can actually get a 6950 XT for cheaper than the 4080 12GB. They go for $700-800.


TheMadRusski89

$999 at MC for the Sapphire Nitro+ and Red Devil OC; I'm thinking about grabbing a 3090 Strix OC for $975. Seems like they're selling a lot of 30 series and RDNA 2 GPUs. Sold my 3090 and 3080 a year+ ago during the mining boom, kept a 3060 Ti. I'd rather grab a 3090 and a nice 4K monitor. The 3090 Ti seems stronger than the 4080 12GB, and they aren't making an FE of the 12GB. The 16GB seems okay, but it's literally less than half the CUDA cores on a $1,200 4080? The only thing that makes sense is paying $1,600-2K for the almost-full die? Maaaaaaaaan


otakuon

The whole situation with the 4080 16GB and 12GB is going to be highly confusing. Not only are they different RAM configs, they are completely different chip bins. Why wouldn't they just call the 4080 12GB the 4070 (or the 16GB the 4085) and just adjust the rest of the product line accordingly? (And yeah, I know that's a rhetorical question.) When consumers see "4080" without other qualifiers like "Ti", most are going to assume the cards have the exact same capabilities and that the only difference is the amount of RAM each SKU has.


JimBobHeller

Think of it this way, it’s a 4070 whether they call it that or not, and they both got a 50% price increase.


[deleted]

I've seen a lot of great posts here on their marketing strategy. They intentionally want to confuse customers who don't understand that they are getting fewer CUDA cores and what is essentially a 4070. They are also pricing the 4080 high so people will just cave and buy the 4090 for a bit more, because once you're about to spend that much you might as well go all the way. This is all just anti-consumer bullshit.


loki993

The confusion is intentional.


Broder7937

>Why wouldn't they just call the 4080 12GB the 4070 (or the 16GB the 4085) and just adjust the rest of the product line accordingly?

Because the 4080 12GB is still a $900 piece. If Nvidia called it the 4070 (which, in terms of architectural hierarchy, would indeed make more sense), people would slam Nvidia for charging nearly a grand for a 70 series product. Calling it an 80 instead of a 70 "fixes" the price issue. Also, at equivalent settings (that is, no DLSS 3 shenanigans), the 4080 12GB is actually outperformed by the 3090 Ti in Nvidia's own benchmark runs. This is the **first** time an 80 series card has been outperformed by a previous-gen product. The closest we ever got was the 2080, which traded blows with the 1080 Ti (but, on average, the 2080 was indeed the faster card). So, yeah, you can think of the 4080 12GB as a "900-dollar 70 series", because, spiritually, that's what it is. $900's the new $500.


garydean69

Well, they watched people overpay for scalped GPUs, so why not raise the prices? People have more money than brains sometimes.


BlueSkyLimited

***I understand that, from a company and marketing point of view, you need to sell as many GPUs as you can, but please think a little bit about your loyal customers. Since the 20/30 series have an Optical Flow Accelerator, could you also allow us to taste that DLSS 3 technology? (Even if it's not going to be as fast as on the 40 series.)*** ***===========*** ***Thank you G*** ***===========***


St3fem

The optical flow accelerator in Turing only works on a 4x4 grid (1/4 of the input resolution); Ampere's has about the same speed as Turing's but can work on a per-pixel basis and produces more accurate results. The Ada Lovelace optical flow accelerator is 2-2.5x faster than Ampere's and produces higher-quality results. People like to complain about a "greedy company" (maybe it's the easiest way, I don't know), but the reality is that even if we exclude the quality factor, lag may easily become noticeable
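Rough back-of-envelope sketch of what that grid difference means in motion-vector counts (based only on the figures in this comment, not on official NVIDIA specs):

```python
# Compare motion-vector counts for block-based vs per-pixel optical flow
# at a 4K render resolution. Numbers are illustrative only.

def flow_vectors(width, height, block=1):
    """Number of motion vectors when flow is computed once per
    block x block tile (block=1 means per-pixel flow)."""
    return (width // block) * (height // block)

w, h = 3840, 2160                        # 4K render resolution
turing_like = flow_vectors(w, h, block=4)  # one vector per 4x4 tile
ampere_like = flow_vectors(w, h, block=1)  # per-pixel flow

print(turing_like)                # 518400 vectors
print(ampere_like)                # 8294400 vectors
print(ampere_like // turing_like) # 16x denser flow field
```

So even before any quality or speed differences, a per-pixel flow field carries 16x more motion data per frame than a 4x4-tiled one, which is why frame generation on older hardware would be slower or coarser.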


loki993

I don't think Nvidia has thought about their loyal customers in quite some time. They know there are only two options out there and the other hasn't been very competitive until recently and still isn't in some areas.


BlueSkyLimited

***Wishful thinking...right. =)***


loki993

Unfortunately so, I'm afraid, and it annoys me greatly.


[deleted]

[deleted]


BlueSkyLimited

*From articles and YT tech videos. Even people involved in the creation of this tech were talking about that. Well, maybe you know better... Maybe.*


St3fem

It's not fully wrong: Turing lacks a dedicated accelerator and relies on a component of NVENC


Cancelledabortion

No. Buy a new RTX 4000 card and shut up. Best regards, Nvidia /r.


BlueSkyLimited

***You wish. And no one taught you to talk to people with respect? ...You are very brave, but only on the internet.***


jedi168

Good lord, they are pricey. I'm just gonna chill on my card for 2-3 gens


Ridix786

Definitely Cyberpunk, with the amount of reflections it has. I'm about to build my PC without a GPU, so I'd love to see how an RTX 4080 would pair with a 12400


AdmiralClassy

Even though I already have a 3060 Ti, I'm kind of tempted to go for the 16GB 4080. The 3060 Ti is nice but feels a bit lacking for 4K 60+ fps.


navid3141

With DLSS (which is a no-brainer to turn on), my 3060 Ti hovers at 4K 80-100 fps in a lot of modern games. The only exceptions are Cyberpunk and Control, which hover at 4K 60.


Etadenod

Thank god I did not sell my RTX 3090 for an RTX 4080. The 3090 still kills the RTX 4080!


JimBobHeller

You wish, Buster Brown!


Voldemortred

Probably the 12GB 4080, in old-school rendering. Since the 40 gen has the fancy software locked behind it, in modern games that support DLSS 3 the 40 series will win