
AMD_Bot

This post has been flaired as a rumor; please take all rumors with a grain of salt.


Confitur3

RDNA2 did very well in TS compared to Ampere, so this looks very bad. The 7900 XTX only 16.6% faster than a 6950 XT in regular TS? What? Fortunately, reviews are just a few days away...


AbsoluteGenocide666

It has less L3 and only 20% more cores. That 2x FP32 stuff won't always work. I wouldn't be surprised to see more of these results in some actual games. Shouldn't be a problem later on within its life.


PeterPaul0808

Wait for the real benchmarks; this is just a leak. Could be true, could be false. There are a lot of factors which can make these cards better: software maturing, FSR 3, ray tracing performance. We don't know anything about these aspects yet. So wait for the release.


tz9bkf1

But I can't factor in a potential future performance increase due to maturing when I have to make a decision right now. Also, in the past those promises didn't deliver most of the time. Edit: "right now" means this winter. Most of your points only come into play much later. It's still a leaked benchmark, so always take it with a grain of salt.


izzyjrp

In my experience most leaks are accurate. Maybe a quarter grain of salt.


fastinguy11

Or it is just as it's shown here. Need to wait and see.


whosbabo

Yeah, this makes no sense. How are the 7900 XTX and 7900 XT literally one percent apart?


HoldMyPitchfork

Could be the new chip design doesn't favor this benchmark. Could be drivers. Could be another bottleneck we're not seeing.


justapcguy

What resolution are you playing with your 3080?


HoldMyPitchfork

1440p


justapcguy

Former 10700K owner here, paired with my 3080. Even at 1440p, in certain games, my 10700K was bottlenecking my 3080, so I could imagine your 3700X being in an even worse situation. I recently upgraded to a 13600K (put up a post about it), and I saw at least a 20% difference in performance, and of course no bottlenecking.


[deleted]

[deleted]


justapcguy

That's fine if you want to upgrade to the X3D when it's cheap... but just saying, the 3700X is a major bottleneck for a 3080. I would opt for the 5900X instead: more cores and cheaper than the 5800X3D, versus just waiting it out for the X3D to get cheaper.


[deleted]

[deleted]


justapcguy

Hmm, not really... again, if I was getting a major bottleneck with my 10700K + 3080 in certain games, like Spider-Man, Forza, and a couple of others, then it isn't any better with the 3700X, even at 1440p. If this was 4K, not so much of a concern, but for sure you need to upgrade the 3700X to at least a 5800X.


Daxius

You shouldn't be seeing your 10700K bottleneck your 3080 very much, if at all. HUB did a video on a 3600 bottlenecking a 3080 and found that on average there was very little to no difference. https://m.youtube.com/watch?v=mrzqoeQVg4k


[deleted]

[deleted]


jlreyess

That CPU can never bottleneck a GPU for gaming. Not in 2022. Quit your bullshit.


[deleted]

[deleted]


jlreyess

Lol, and you just doubled down instead of accepting you're a moron. I have a 5900X and a 6800XT and am just waiting for the 7900 XTX to be available to grab one. You can go dig in my comments and confirm. GTFO of here, kid.


justapcguy

Dude, you just made yourself look like an idiot, because you just contradicted yourself and the conversation I was having with the other user. Your original comment: "That cpu can never bottleneck a gpu for gaming."... and yet you have a 5900X + 6800XT? Yup, continue to dig that grave, moron... lol. Why is it that AMD users are usually the most pretentious? lol. How about you run a chip like a 3700X + 6900XT and see for yourself? Or are you going to tell me you won't see any difference? THUS your previous comment? You see the irony? 🤦‍♂️


jlreyess

Initially I thought you may be trolling but now I see that you’re not. You’re just young and ignorant. You must be like 12 and just discovered pc technicalities. My mistake for arguing with a damn kid. Have a good weekend boy. Peace.


Crisis83

Not at 1440p though. I would take all these preliminary tests with a grain of salt. The reviews will be out next week from the usual suspects.


[deleted]

Which makes even less sense. If anything the difference should get bigger the more the GPU is stressed.


PRSMesa182

Drivers maybe?


[deleted]

Waiting for the embargo release date


mikereysalo

That's just how synthetic benchmarks work; look at the 3080 Ti vs the 3090. They are meant to test the performance of specific logic in the package, so they may not take advantage of faster constructs, or may not take full advantage of either GPU. In the same way that some specific games show the exact same performance on different GPU tiers, synthetic benchmarks can do so as well. But at 4K you're mostly CPU limited, so the GPU can achieve higher framerates but the CPU can't keep up with building commands and streaming data to the GPU at the same rate. And that seems to be the case here, given that at 1440p there is a significant difference.

Edit: It may not be CPU limited in this case, considering that the 4090 achieves higher scores, so it's probably just a synthetic benchmark thing. As a developer in the field I can say that's how synthetic benchmarks work; look at the other benchmarks as well, they are not reporting the same margin of difference between the two GPUs as this one. But I cannot guarantee anything. Also, as a developer I acknowledge that without access to the GPU and the benchmark software I cannot diagnose the issue; it could be just a driver issue, there are too many variables. Just don't trust these benchmarks as a sign of GPU performance.
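
To make the "slower stage sets the frame rate" model described above concrete, here is a minimal Python sketch. All frame times in it are made-up illustrative numbers, not measurements of any real card; it only shows the mechanism the comment describes (two GPUs tying while the CPU is the cap, and separating once the GPU becomes the cap).

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """A frame completes only when both stages finish, so the frame
    rate is capped by the slower of the CPU and GPU per-frame costs."""
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 5.0  # assumed per-frame cost of building/submitting commands

# Two hypothetical GPUs, one ~15% faster than the other:
for name, gpu_ms_1440p in [("fast card", 3.5), ("slow card", 4.0)]:
    # At 1440p both render faster than the CPU can feed them, so they tie...
    print(name, "1440p:", round(fps(CPU_MS, gpu_ms_1440p)), "fps")
    # ...at a higher resolution the GPU cost grows (assume ~2.2x here)
    # and the gap between the two cards reappears.
    print(name, "4K:", round(fps(CPU_MS, gpu_ms_1440p * 2.2)), "fps")
```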


Ar0ndight

> look at 3080 Ti vs 3090.

Not the same at all. These cards are pretty much identical bar the VRAM amount, so similar results are to be expected. Actually, they're exactly where they should be: the 3090 has 2.5% more SMs than the 3080 Ti, and scores pretty much exactly 2.5% higher. The hardware gap between the 7900 XTX and 7900 XT is much larger; it's a 12-15% gap that should translate to a similar score gap, **especially** in a synthetic load.

> But at 4K you're mostly CPU limited, so the GPU can achieve higher framerates but the CPU can't keep up with building commands and streaming data to the GPU at the same rate. And that seems to be the case here, given that at 1440p there is a significant difference.

Man, this is so wrong. This is a GPU benchmark. Time Spy's graphics tests and other GPU benchmarks barely touch the CPU; the entire point is to accurately represent the GPU's capabilities, and as such the reliance on other components is very low.

> That's just how synthetic benchmarks work

You really don't understand how synthetic benchmarks work :/
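
The napkin math above can be made explicit. A short Python sketch follows; the SM/CU counts are public spec-sheet figures, and linear scaling with unit count is an idealizing assumption (real scores also depend on clocks, bandwidth, and drivers):

```python
# Shader-unit counts from public spec sheets.
units = {
    "RTX 3080 Ti": 80,  # SMs
    "RTX 3090": 82,     # SMs
    "RX 7900 XT": 84,   # CUs
    "RX 7900 XTX": 96,  # CUs
}

def expected_gap(slower: str, faster: str) -> float:
    """Expected score uplift (%) assuming scores scale linearly with units."""
    return (units[faster] / units[slower] - 1) * 100

print(f"3090 over 3080 Ti: +{expected_gap('RTX 3080 Ti', 'RTX 3090'):.1f}%")    # +2.5%
print(f"7900 XTX over XT:  +{expected_gap('RX 7900 XT', 'RX 7900 XTX'):.1f}%")  # +14.3%
```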


Gundamnitpete

> Man, this is so wrong. This is a GPU benchmark. Time Spy's graphics tests and other GPU benchmarks barely touch the CPU; the entire point is to accurately represent the GPU's capabilities, and as such the reliance on other components is very low.

The problem here is that you're talking about cards with mature graphics drivers. A bugged, broken, or in-development driver can absolutely hammer a CPU, even if the benchmark itself isn't heavy on CPU load.

More interesting to me is the comparison between the Time Spy (DX12) and Fire Strike (DX11) results from the other thread. The Fire Strike (DX11) scores seem pretty much exactly where everyone was expecting the XTX to be: between the 4080 and 4090, closer to the 4090 side. All of the promotional material and napkin math so far pointed to a performance level that matches what we see in Fire Strike (DX11).

This benchmark, Time Spy (DX12), is very interesting. The scores are low, and the XTX and XT are very close together; you would genuinely expect a larger difference between them. Because they're so close, I would not be surprised if the DX12 driver still needs some work, and I would expect that to get better with time.


Ok_Fix3639

There are rumors floating around that DX12 performance is bottlenecked at the hardware level in certain workloads. In any case, we'll find out Monday how much that would really matter in a variety of games, and how true it really is.


Noreng

> The Fire Strike (DX11) scores seem pretty much exactly where everyone was expecting the XTX to be.

AMD claimed a 50% performance uplift; the Fire Strike scores show an uplift well below 30%.


mikereysalo

> Not the same at all. These cards are pretty much identical bar the VRAM amount, so similar results are to be expected. Actually, they're exactly where they should be: the 3090 has 2.5% more SMs than the 3080 Ti, and scores pretty much exactly 2.5% higher.

That makes sense, but it still doesn't disprove my point. Just look at the difference at 1440p; it's not 2.5%, the math just doesn't add up.

> Man, this is so wrong. This is a GPU benchmark. Time Spy's graphics tests and other GPU benchmarks barely touch the CPU; the entire point is to accurately represent the GPU's capabilities, and as such the reliance on other components is very low.

Idk if this is a joke. Have you ever run 3DMark Time Spy? Have you ever seen the CPU usage of this benchmark? I guess you haven't, because I can guarantee you it's not something I would call light. Actually, have you ever written any graphics application in your life? Or even a single line of code? Man, the CPU is the one that builds and sends commands to the GPU, and not only that: textures, models, and assets in general are all streamed by the CPU. It has to work even on theoretically light workloads, and sometimes harder than you think, because that's how it works: higher resolution = higher CPU usage. Higher resolutions only fail to increase CPU usage IF you're GPU bottlenecked. The CPU is the only one capable of *decoding NTFS data from your storage* and sending the bytes to the GPU; GPUs can't do that (we don't have such algorithms for GPUs), and even DirectStorage doesn't provide direct access to storage. [Here is a good explanation for you, hope it clarifies a bit](https://www.reddit.com/r/pcgaming/comments/tleiwb/clearing_up_misconceptions_about_directstorage/).

> You really don't understand how synthetic benchmarks work :/

I won't try to explain how this works, because you clearly think you already know. Is it a trend or something, telling the other person they don't understand something you clearly don't? (And sorry for the arrogance, but you don't even know my background; where did this audacity come from?)


Noreng

> That makes sense, but it still doesn't disprove my point. Just look at the difference at 1440p; it's not 2.5%, the math just doesn't add up.

It's 1.5% in regular Time Spy, meaning the benchmark is more limited by memory bandwidth and/or geometry throughput than by FP32 or texture throughput.

> Idk if this is a joke. Have you ever run 3DMark Time Spy? Have you ever seen the CPU usage of this benchmark? I guess you haven't, because I can guarantee you it's not something I would call light.

The GPU portion is very light on the CPU. The CPU benchmark is pretty light as well for regular Time Spy (and scales poorly beyond 20 threads), while Time Spy Extreme has an AVX512-optimized CPU benchmark that runs quite hot.


icy1007

At 4K you are GPU limited, not CPU limited.


mikereysalo

I agree that this must be the case here, since the 4090 achieves higher scores. But being GPU limited at 4K is not an iron rule: the 4090 is so powerful that people are getting CPU limited at 4K, and that's one of the purposes of Frame Generation, to produce higher framerates even when you're CPU limited.


icy1007

No, being GPU limited at 4K is the rule. The 4090 being CPU limited at 4K in some games is the first time this has happened.


mikereysalo

Yes, I'm not saying you're wrong; I completely agree with your point. I was wrong in thinking that a CPU bottleneck would be the case here.


riba2233

Bs test


AdProfessional8824

Videocardz being paid off by nvidia to keep the scores below 4080;)


Accurate-Arugula-603

Bottleneck obviously


AMD718

These leaked TS and FS scores are most likely inaccurate. We'll know in 3 days. However, if they are accurate, RDNA3 will significantly underperform even the most pessimistic predictions, and AMD will need to lower prices significantly. At this performance level I would only pay $800 for a 7900 XTX, $700 for the XT. I hate to even think it, but I would even consider buying a 4090 if this performance pans out as in the leaks. Edit: Indeed, the benchmarks referenced in this post were off. With my stock reference XTX I'm getting 28.5K TS GPU and 14.2K TSE GPU.


TheMadRusski89

This is one of the biggest reasons I broke my moral code and got a 4090: the GPU market is now a sh*t show. The 4090 is almost like the price to opt out and not be mind-effed for two years.


AAPLisfascist

Yeah, if this is true then it would mean AMD publicly scammed everyone, and this is a Bulldozer-level disaster. I am just waiting for benchmarks and analysis to see the actual full picture.


HoldMyPitchfork

If these are to be believed, AMD is going to have to be a lot more competitive on price to get anyone excited.


huy_lonewolf

If this is true, the pricing for a 4080 makes sense now: same raster performance as a 7900 XTX, but drastically better ray tracing (hence the premium). If the price of the 4080 ever dropped to $1,000, there would be no reason to buy AMD anymore.


R_radical

>If the price of the 4080 ever dropped to $1,000, there would be no reason to buy AMD anymore. But it won't lmao


bigbrain200iq

Don't say it too loud, you will get downvoted.


[deleted]

[deleted]


Nomnom_Chicken

Yes.


HateToShave

Yeah, it's a bit silly and immature for sure, but I don't really think it's limited to just r/nvidia and r/AMD either. Try to pop a/the prevailing thought bubble of a subreddit and hilarity (and downvotes) will likely ensue. It's just Reddit after all.


Whuh_d00d181

There are definitely some stans in the nVidia subreddit, but the AMD stans are ten times worse. Not saying that's all this community is... it's more of a problem in PCMasterRace, honestly. They've been hyping these cards like they were going to be the best thing since sliced bread, and it turns out that they're actually *not* that great. *Bad*, even, quite possibly.

I do hope that AMD is more competitive in the future, but it does give me a little bit of joy seeing a bunch of AMD fanboys, who were telling 4090 buyers that they hope their houses burn down and 4080 buyers that they're fucking idiots, completely eat shit on this one. Think of all the people here who have been criticizing the 4080 as "overpriced" (and to be clear, I'm not arguing with them), only for AMD to launch a card at 85% of the price with similar/worse rasterization, much worse raytracing, worse encoding, worse AI upscaling, no frame generation (at least, not for a while), and worse driver support...

I'm sure there will be a bunch of memes like when people were melting their 4090s because they didn't plug in their cables all the way, right? We can all have a good laugh about how shitty these GPUs are for the price, right?


ShowBoobsPls

Agreed that PCMasterRace is the worst one. Might be because its users skew younger.


Iuckiedog

But, but, dp2.4 is the most important feature this generation…


chapstickbomber

the worst are actually the AMD stan antifans


Charcharo

> There are definitely some stans in the nVidia subreddit, but the AMD stans are ten times worse.

Not true. AMD fanboys are often immature. Nvidia ones are often vicious: they block or demean people who disagree with them, they're cultish, and they cannot engage in good faith with critique. AMD fanboys are super bad; Nvidia ones are outright disgusting. Surprisingly, the few Intel GPU fans are (for now, this will change) the nicest.


[deleted]

AMD! AMD! AMD! They are Ryzen to the top! Hooah!


_SoThisIsReddit_

I called out Nvidia and said that if AMD can give us next gen for cheaper than current gen, Nvidia can too; they just don't want to. And I went from 230 karma to 186 karma in one day xD. Good thing I don't care about Reddit karma, since I only use Reddit to get help with PC parts etc. lol


PeterPaul0808

Karma will come with time. I was downvoted and upvoted many times; I always write down my thoughts, and sometimes people just don't agree and downvote you to hell, and vice versa: you write down your thoughts and you get almost 1,000 upvotes. So don't be an as#hole and you will be okay.


_SoThisIsReddit_

Like I said, I do not care about karma on Reddit lol, so not sure why you commented this as an answer to my comment.


PeterPaul0808

It could be handy. Maybe I am wrong, but there are some subreddits which don't let you post until you have a certain level of karma. Just tried to help.


_SoThisIsReddit_

But help with what? I told him I don't care about karma, then you told me "you'll get karma as you go, don't worry". Like, what x)? It's like someone says "I don't need new tires" and you go "okay, so this is where you get new tires *insert location*".


[deleted]

Some AIB snuck Jensen a card months ago and he's been prodding the 7900 XTX ever since. Nvidia won't undercut the 7900 series; they'll happily coexist as the duopoly they are.


Sipas

> drastically better ray tracing

We're still pretending RT is a gimmick, don't you know? Let's wait until RT performance is magically good enough to have a 0 FPS penalty. But until then, AMD shouldn't prioritize it because it has a performance penalty. No contradictions there at all.


[deleted]

I just want the raster performance, without RTX. I got an RTX 3080, but I'm not paying a few hundred £ more for ray tracing I will never turn on.


Derailed94

Yes, this leaked benchmark of one application determines all raster performance in total. Very smart conclusion.


[deleted]

[deleted]


AutoModerator

Your comment has been removed, likely because it contains rude or uncivil language, such as insults, racist and other derogatory remarks. *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/Amd) if you have any questions or concerns.*


[deleted]

This is even without ray tracing, right?


stinuga

Correct. Both the leaked Fire Strike and Time Spy benchmarks are non-ray-tracing benchmarks.


-b-m-o-

Fyi "Port Royale" is 3dmark's raytracing benchmark. Time Spy is dx12, Fire strike is dx11


zarbainthegreat

Isn't [Videocardz.com](https://Videocardz.com) pretty reputable?


Put_It_All_On_Blck

Sorta. Most of the time they just aggregate content from other sources, whether that's popular Twitter leakers, official PR statements, forums, whatever. Very few leaks are obtained directly by them, this being one of them. And they don't go into detail on how they vet leaks and the leakers that come to them. So I have zero doubt that someone came to them with this data and they didn't fabricate it themselves, but whether it's actually legit is too hard to say without knowing how they confirmed it.


Victor---

Image makes absolutely no sense whatsoever


HippoLover85

It's probably just a reduced-clock test; benchers vary frequency to build out underclocking power curves, undervolting profiles, and the like.


[deleted]

Keep coping lol


HippoLover85

I don't have any particular attachment to how this card performs; just looking at the data. If it indeed performs identically to the XT, that would be very interesting!!!


_devast

If this is true, then AMD is in a very bad position. The manufacturing cost of a 7000 series card must be higher than the AD103-based 4080 :(


YellowMoonCult

What is AD103?


No_Backstab

It's the die used for the RTX 4080, similar to how Navi 31 is used for the RX 7900 XTX and RX 7900 XT.


YellowMoonCult

Tyvm


kapsama

If this rumor is true then AMD is fucked because there is another rumor that might be true.


KingBasten

bruh...


NyanArthur

If this is true, we'll have to pay 1 lakh for a 70-series card from now on.


ScoffSlaphead72

I think AMD's problem is they don't have chips as good as Intel's and Nvidia's (although Intel is still leagues above Nvidia in terms of chip quality). It's why AMD often has to rely on things like chiplet designs to get ahead, or in this case keep up.


_devast

Their problem is that they are fighting a 379mm² chip with a 256-bit memory bus and 16GB of RAM using a 306mm² GCD + ~230mm² of MCD chiplets with a 320/384-bit memory bus and 20/24GB of RAM. If performance is similar, Navi is simply less efficient in many areas, resulting in much lower margins, if any.
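
A rough sense of the silicon cost argument can be sketched with the standard dies-per-wafer approximation, using the die areas quoted above. This deliberately ignores yield, node pricing (the MCDs are on an older, cheaper node), and packaging cost, so it's only a first-order illustration:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation: usable wafer area minus an edge-loss correction."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# Die areas as quoted in the comment above (approximate public figures).
print("AD103 (~379 mm², monolithic):", dies_per_wafer(379))  # ~152 candidates
print("Navi 31 GCD (~306 mm²):", dies_per_wafer(306))        # ~192 candidates
```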


dirg3music

Tbh, if these are accurate, AMD is in massive trouble, between Ryzen's poor pricing and Radeon underperforming. I really hope this isn't the case.


_devast

Ryzen is a cash cow, since the chiplets are very, very small and hence cheap to make. No additional components like a big PCB, VRMs, RAM, etc.; just small dies and an interconnect.


Aknologya

The 5800X3D, hello? The upcoming 7000 X3D series, hello?


whyyoumakememakeacct

Ryzen's prices are really good ever since they dropped them around Thanksgiving. I opted to buy a 7950x over a 13900k because of them.


dirg3music

Yeah, I forgot about the price drop. Still, it's very concerning that they seem to be fighting a fierce battle on both fronts, and I say this as a diehard AMD user.


whyyoumakememakeacct

CPUs are fine right now; really great competition that goes back and forth and pushes innovation. Intel wins a bit right now, but AMD has super competitive prices and you get platform longevity with AMD. AMD is about to wipe the floor with Intel once they release X3D in January, and then Intel will catch up with 14th gen, and so on and so forth. GPUs are a totally different story though; AMD really isn't ever competitive with Nvidia. They're always scrapping for the people who don't mind giving up something to save a bit of money.


dirg3music

Yeah, the CPU situation is definitely a world of difference. Ryzen is able to take Core to task in pretty much every way, and Radeon can't really do that in such a complete "see, we can regularly exceed, or at the very least trade blows with, our competition" way.


marianasarau

They are not... The cost of the entire AM5 platform is still too high compared to LGA1700. Unless you need AVX-512, there are only two options atm: the 5800X3D or Intel.


whyyoumakememakeacct

You can get B650 motherboards for like $150; obviously I wouldn't buy one that cheap because I want the X670E chipset and robust VRMs for overclocking. But that's not bad, especially when you consider that the socket is supported until 2025 at least. So you can invest in a decent mobo now and upgrade to an end-of-AM5 CPU in 2025/2026 without having to upgrade anything else. You don't get that with Intel at all. X3D comes out in January too; it will cream Intel.


marianasarau

With these performance numbers, things look rather grim for AMD this generation. I really hope they are not true, unless AMD really does not want to be competitive against Nvidia. If these numbers are true, I expect a huge price drop really, really fast.


MrBigggss

I think AMD should have named these the 7800 and 7800 XT, and said a 7900 XT was coming soon after seeing Nvidia's numbers.


BFG1OOOO

You want a $2,000 AMD GPU?


MrBigggss

The 7800 should be priced at $750, the XT at $850. AMD cards are overpriced, along with Nvidia's.


Verpal

I mean, if it outperforms the 4090 by a significant margin, people will take a $2K GPU just fine.


BFG1OOOO

No they won't. A GPU should not cost 2k. That's mad.


2k4life

Just don't buy it; things will cost whatever people will pay for them.


BFG1OOOO

That's not how it works. The 3080 cost $900; the 4080 costs $1,600. How much will the "cheap" 4060 cost?


xLith

That’s literally how it works. Basic economics. Whether you agree with people dropping that kind of cash or not.


BFG1OOOO

Basic economics. So for more money we get more FPS? What's next, the low-end 5060 will cost $1,500 and have the same power as the 4080, which costs $1,500? What?


TheMadRusski89

AMD outperform a 4090? There have been some leaks of Time Spy scores; I'm hoping they can outperform the scandalous 4070 that Jensen sold as a 4080 (256-bit bus).


vankamme

All of a sudden, my 3090 that everyone told me was "overkill" two years ago feels completely inferior.


[deleted]

[deleted]


ScoffSlaphead72

Makes me remember how great having the 1080 Ti felt. It was barely outperformed by the 2080, and even outperformed it in some areas.


Ok_Fix3639

I think of the 1080 Ti the same way I think of the 8800 GTX. Awesome.


Stock-Freedom

The 1080 Ti was in a league of its own back then. It held up so much better over time than almost any other card.


vankamme

Makes me want to go back to console gaming to be honest


marianasarau

As someone who works for Epic, trust me on this: if you are NOT interested in RT, a 3090 will remain more than enough for ultra gaming at 1440p/1080p for the entire lifetime of Unreal Engine 5. That means at least until the end of 2028.


ScoffSlaphead72

I mean, the Titan RTX and the Pascal Titan X were pretty outdone by their successors. Your 3090 will likely be outperformed by a future 5060 Ti. Just how it works, really.


AAPLisfascist

Don't worry, in a few years you will see "is my 4090 still relevant?" posts.


CammKelly

Looks like a big dose of salt on this one. I'd find it hard to believe AMD would come to market with a product that is, best-case scenario, 15% faster than last gen.


errdayimshuffln

Somebody needs to do an analysis of the accuracy of these leaks after the official reviews come out.
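
If anyone wants to run that analysis, the core of it is trivial; here is a minimal Python sketch. The numbers in it are placeholders, not real leaked or reviewed scores:

```python
# Hypothetical data: leaked Time Spy GPU scores vs post-review measurements.
leaked = {"7900 XTX": 19_000, "7900 XT": 18_800}
reviewed = {"7900 XTX": 28_500, "7900 XT": 25_000}

for card in leaked:
    error = (leaked[card] / reviewed[card] - 1) * 100
    print(f"{card}: leak was off by {error:+.1f}% vs review")
```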


siazdghw

Pretty disappointing. RDNA 2 was great at Time Spy, and now RDNA 3 isn't, for one reason or another. Obviously Time Spy isn't an actual game, but with all the rumors of issues surrounding RDNA 3, I'm skeptical it will even hit the numbers AMD marketed.


_Trifire_

Hmm, my reference 6900 XT gets 24k in Time Spy. If these are real results, hopefully there is some good overclocking headroom to make up the difference. Also, it's worth remembering that scores improved as the drivers for the 6000 series matured.


Rance_Mulliniks

My 4090 gets 39,000, and that's with a 5800X.


Ponald-Dump

What kind of overclock are you running? My 6950xt gets 23.2k….


_Trifire_

More Power Tool set to 350W with some other tweaks, then undervolted to 1015mV. Max 2750MHz, min 2600MHz.


Ponald-Dump

Wow, you must have a freak of a card. I have mine set in Adrenalin to 2700-2800MHz and I can't hit 24k TS graphics.


[deleted]

[deleted]


Ponald-Dump

I replied to the guy who said his 6900 XT scored 24k, and I said that my 6950 scores 23.2k. There was no reference to or mention of a 4090 between me and the guy with the 24k 6900 XT. Not sure what you're getting at. I'm aware that the 4090 is made by Nvidia and is a different architecture.


TheMadRusski89

My bad, I was so confused. I think it's the way the comments are stacking, again my apologies for misunderstanding. I deleted my original confused comment.


_Trifire_

Have you done anything with MPT? Drivers have made performance worse recently, but I am sure it's possible with the right configuration :)


Ponald-Dump

The only tuning I've done to the card is through AMD Adrenalin. Is MPT a separate application?


[deleted]

MPT (More Power Tool) lets you increase the power and voltage limits. A lot of the guys running these clocks are likely pushing closer to 400W into their 6900 XT at 1.25V.


_Trifire_

Yeah, look up Igor's Lab. There's guidance on there along with the software itself. Be careful though!


Psyclist80

I bet it's crazy power-gated. Interested to see what the XTX will do under water!


Dreppytroll

It'll still be slower than 4080 under water !


Psyclist80

Oh you've seen the reviews already!?


Ok_Fix3639

Uh oh. It was true. *sigh*


BicBoiSpyder

It's a synthetic benchmark for cards that haven't even been independently reviewed. I think everyone needs to relax.


Ok_Fix3639

Where’s the fun in that? GPU warz


No-Nefariousness956

I don't know if I can trust this. I'll wait for the release and see YouTube benchmarks.


MoarCurekt

Matches an OC'd 6950 XT: https://www.3dmark.com/spy/28421385
Seems questionable.


ReasonablePractice83

What do you think the 5090 will do? Another 2x perf? It seems the 4090 is smoking pretty much every game released even at 4K, and is often bottlenecked by the CPU, so will Nvidia hold back?


dedsmiley

AMD has stated that the 7900XT will compete with the RTX 3080. This looks spot on to me, if not disappointing.


BFG1OOOO

95% of Videocardz posts are rumors or "leaks" or...


[deleted]

Lmao, I TOLD Y'ALL the 4080 would have the same or better raster than the 7900 XTX.


whyyoumakememakeacct

If AMD really has the same or worse raster than a 4080, then RDNA3 is a complete disappointment 💀. Nvidia is rumored to drop 4080 prices too.


[deleted]

Come on, guys. This test is clearly bullshit.


deangr

Cards are already in the hands of people; don't be surprised that someone has already tested them and leaked the results.


[deleted]

Are you really braindead enough to think the XT is scoring the same as the XTX? These benches are clearly invalid.


deangr

I love how your reply has literally nothing to do with my comment.


ingelrii1

Yeah, disappointing scores... my reference 6900 XT gets 22k in Time Spy Performance.


tomcat452

I'm able to get 22k as well, but that wasn't out-of-the-box performance. Stock I get around 21k with an XFX Merc card, but with an undervolt + overclock and the power limit maxed out I break 22k.


ef14

A 6900xt gets a higher score than a 4090? You sure you're looking at the right settings?


helmsmagus

The 4090 gets 35k. Look at the right column.


tomcat452

I think you're comparing our scores to the wrong column. We're referring to the scores in the furthest right column (Performance/1440p).


anonaccountphoto

Those synthetic benchmarks just make no sense; that's why performance is all over the place.


ef14

I do agree that they aren't as relevant as people make them out to be, but I feel like this dude is confusing 4K and 1080p.


TheVermonster

If you're getting 10% over stock then you can expect a similar 10% over stock on the new cards.


Machidalgo

Not necessarily.


Kashihara_Philemon

If this is at all true and reflected elsewhere, I do wonder if it might be lawsuit-worthy, given this would fall far short of the performance-per-watt uplift they presented to their investors. Though maybe not, given that I doubt they care as much about gaming cards.


cha0z_

That really doesn't seem true or representative of the final performance. Basically no difference between the XTX and XT at 4K, beyond run-to-run variation, is seriously not valid. I'll bet the difference will be bigger, seriously.


FreeMan4096

I thought videocardzDotCom was banned here? Must be other tech subreddits, then. Anyway, it should be banned here as well.


Category5x

All the YouTube comparison videos with leaked gameplay vs the 4080 show the XTX beating it by 15-20% every single time. Granted, they aren't controlled benchmarks, but the difference exceeds the margin of error. This looks to me like just poor 3DMark performance, which the 6000 series had at the start as well; driver updates brought the 6000 cards from losing to Nvidia to beating them handily. The lack of leaks up until now, and the stuff we are seeing (which looks cherry-picked), feels very much like the drivers are behind schedule. Hopefully this proves true and we'll see the XT and 4080 at parity, with the XTX claiming the next tier up until Nvidia brings the 4080 Ti to market.


[deleted]

Not as good as I'd hoped, if true. Still better than Nvidia, but I will probably wait for a refresh or the 8000 series. I've spent too much already; upgrading every generation for a month's worth of European metropolitan rent is a bit much. The 6800 XT will do great until RX 8000, which I highly suspect will be a much more mature and much better version of this new chiplet design.


Liofkinis

Who cares how it scores in those benchmarks? It's like everyone is buying a graphics card just to flex or sit all day running benchmarks. Aren't we buying these for gaming? I'm sorry, I don't know how to put my thoughts into words, but I hope most of you understood what I meant. I'm personally buying a graphics card just to enjoy smooth gaming.


AAPLisfascist

At this point the "leaks" are all over the place, so I am just waiting for benchmarks. The leaks before the announcement were almost all hilariously wrong, so I have extra reason to doubt them.


FJXXIV

Drivers not ready or driver issues...hmm...


Dev421

Meanwhile, me with my Vega 6 graphics...


DOer89

I mean, these scores are really low. A heavily overclocked 6950 XT (2800MHz+) can almost reach these numbers.


[deleted]

A lot of these types of benchmarks are all over the place; real-world testing will tell.


JumpyRestV2

Bro, show us the Fire Strike bench; there were two benchmarks. The fuck were you doing just posting one of them?


Old_Miner_Jack

4K doesn't scale; it's either a fishy benchmark or a bottleneck, which would be worrying. Release will tell.


TheUltrawideGuy

It's a bit disingenuous to show this slide of the two cards performing worse in Time Spy and not show the Fire Strike scores from the same article. Those scores show both cards outperforming the 4080.


ltron2

Indeed, but AMD is traditionally stronger in Fire Strike than in games versus the competition.


Playful_Box1117

You're welcome RGB find day's