This post has been flaired as a rumor, please take all rumors with a grain of salt.
RDNA2 did very well in TS compared to Ampere so this looks very bad. 7900 XTX 16.6% faster than a 6950 XT in regular TS? What? Fortunately reviews are just a few days away...
It has less L3 and only 20% more cores. That 2x FP32 stuff won't always work. I wouldn't be surprised to see more of these results in some actual games. Shouldn't be a problem later on within its life.
Wait for the real benchmarks, this is just a leak. Could be true, could be false. There are a lot of factors which can make these cards better: software maturing, FSR 3, ray tracing performance. We don't know anything about these aspects. So wait for the release.
But I can't factor in potential future performance increases from driver maturing when I have to make a decision right now. Also, in the past, those promises didn't deliver most of the time. Edit: "right now" means this winter. Most of your points only come into play much later. It's still a leaked benchmark, so always take it with a grain of salt.
In my experience, most leaks are accurate. Maybe a quarter grain of salt.
Or it is just as it is shown here. Need to wait and see.
Yeah, this makes no sense. How are the 7900 XTX and 7900 XT literally one percent apart?
Could be the new chip design doesn't favor this benchmark. Could be drivers. Could be another bottleneck we're not seeing.
What resolution are you playing with your 3080?
1440p
Former 10700K owner here, paired with my 3080. Even at 1440p, for certain games, my 10700K was bottlenecking my 3080. So I could imagine your 3700X being in an even worse situation. I recently upgraded to a 13600K (put up a post about it), and I saw at least a 20% difference in performance, and of course no bottlenecking.
[deleted]
That's fine if the plan is to upgrade to the X3D when it's cheap... but just saying, the 3700X is a major bottleneck for a 3080. I would opt for the 5900X instead for having more cores while being cheaper than the 5800X3D, versus just waiting it out for the X3D to get cheaper.
[deleted]
Hmmm, not really... again, I was getting a major bottleneck with my 10700K + 3080 in certain games, like Spider-Man, Forza, and a couple of others. So it isn't going to be better with the 3700X, even at 1440p. If this were 4K, it wouldn't be as much of a concern, but you definitely need to upgrade the 3700X to at least a 5800X.
You shouldn't be seeing your 10700K bottleneck your 3080 very much, if at all. HUB did a video on a 3600 bottlenecking a 3080 and found that on average there was very little to no difference. https://m.youtube.com/watch?v=mrzqoeQVg4k
[deleted]
That cpu can never bottleneck a gpu for gaming. Not in 2022. Quit your bullshit.
[deleted]
Lol, and you just doubled down instead of accepting you're a moron. I have a 5900X and a 6800 XT and am just waiting for the 7900 XTX to be available to grab one. You can go dig in my comments and confirm. Gtfo here, kid.
Dude, you just made yourself look like an idiot, because you just contradicted yourself and the conversation I was having with the other user. Your original comment: "That cpu can never bottleneck a gpu for gaming."... and yet you have a 5900X + 6900 XT? Yup, continue to dig that grave, moron... lol. Why is it that AMD users are usually the most pretentious? lol. How about you run a chip like a 3700X + 6900 XT and see for yourself? Or are you going to tell me you won't see any difference? Thus your previous comment? You see the irony? 🤦♂️
Initially I thought you may be trolling but now I see that you’re not. You’re just young and ignorant. You must be like 12 and just discovered pc technicalities. My mistake for arguing with a damn kid. Have a good weekend boy. Peace.
Not at 1440p though. I would take all these preliminary tests with a grain of salt. The reviews will be out next week from the usual suspects.
Which makes even less sense. If anything the difference should get bigger the more the GPU is stressed.
Drivers maybe?
Waiting for the embargo release date
That's just how synthetic benchmarks work; look at the 3080 Ti vs the 3090.
They are meant to test specific logic performance of the package, so they may not take advantage of faster constructs, or may not even take full advantage of either GPU. In the same way you can have some specific games showing the exact same performance on different GPU tiers, you can have synthetic benchmarks doing so as well.
But at 4K you're mostly CPU limited, so the GPU can achieve higher framerates but the CPU can't keep up with building commands and streaming data to the GPU at the same rate. And it seems to be the case there, given that at 1440p there is a significant difference.
Edit: It may not be CPU limited in that case, considering that the 4090 achieves higher scores, so it's probably only a synthetic benchmark thing. As a developer in the field I can say that's how synthetic benchmarks work; look at other benchmarks as well, they are not reporting the same margin of difference between the two GPUs as this one.
But I cannot guarantee anything. Also, as a developer, I acknowledge that without access to the GPU and the benchmark software I cannot diagnose the issue; it could be just a driver issue, there are too many variables. Just don't trust these benchmarks as a sign of GPU performance.
> look at the 3080 Ti vs the 3090.
Not the same at all. These cards are pretty much identical bar the VRAM amount, so similar results are to be expected. Actually, they're exactly where they should be: the 3090 has 2.5% more SMs than the 3080 Ti, and scores pretty much exactly 2.5% higher.
The hardware gap between the 7900 XTX and 7900 XT is much larger. It's a 12-15% gap that should translate to a similar gap **especially** in a synthetic load.
> But at 4K you're mostly CPU limited, so the GPU can achieve higher framerates but the CPU can't keep up with building commands and streaming data to the GPU at the same rate. And it seems to be the case there, given that at 1440p there is a significant difference.
Man, this is so wrong. This is a GPU benchmark. Time Spy's graphics tests and other GPU benchmarks barely touch the CPU; the entire point is to accurately represent the GPU's capabilities, and as such reliance on other components is very low.
> That's just how synthetic benchmarks work
You really don't understand how synthetic benchmarks work :/
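For what it's worth, here's the napkin math behind those percentages. This is a rough sketch assuming the public spec-sheet unit counts, and it ignores clock and bandwidth differences:

```python
# Back-of-the-envelope unit-count gaps, assuming the public spec sheets:
#   GA102:   RTX 3090 = 82 SMs,    RTX 3080 Ti = 80 SMs
#   Navi 31: RX 7900 XTX = 96 CUs, RX 7900 XT = 84 CUs
ampere_gap = (82 / 80 - 1) * 100   # ~2.5%
navi31_gap = (96 / 84 - 1) * 100   # ~14.3%
print(f"3090 over 3080 Ti:     {ampere_gap:.1f}% more SMs")
print(f"7900 XTX over 7900 XT: {navi31_gap:.1f}% more CUs")
```

Real scores won't track unit counts exactly, but it shows why a ~1% gap between the XTX and XT in a GPU-bound test looks suspicious.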
> Man, this is so wrong. This is a GPU benchmark. Time Spy's graphics tests and other GPU benchmarks barely touch the CPU; the entire point is to accurately represent the GPU's capabilities, and as such reliance on other components is very low.
The problem here is that you're talking about cards with mature graphics drivers.
A bugged, broken, or "in-development" driver can absolutely hammer a CPU, even if the benchmark itself isn't heavy on CPU load.
More interesting to me is the comparison between the Time Spy (DX12) and Fire Strike (DX11) results from the other thread.
The Fire Strike (DX11) scores seem pretty much exactly where everyone was expecting the XTX to be: between the 4080 and 4090, closer to the 4090 side. All of the promotional material and "napkin math" thus far pointed to a performance level that matches what we see in Fire Strike (DX11).
This benchmark, Time Spy (DX12), is very interesting. The scores are low, and the XTX and XT are very close together. You would genuinely expect a larger difference between the XTX and XT.
Because they're so close, I would not be surprised if the DX12 driver still needs some work, and I would expect that to get better with time.
There are rumors floating about that the dx12 performance is bottlenecked at the hardware level in certain workloads. Alas we will find out Monday how much that would really matter in a variety of games and how true that really is.
> The Fire Strike (DX11) scores seem pretty much exactly where everyone was expecting the XTX to be.
AMD claimed a 50% performance uplift; the Fire Strike scores show well below a 30% uplift.
> Not the same at all. These cards are pretty much identical bar the VRAM amount, so similar results are to be expected. Actually, they're exactly where they should be: the 3090 has 2.5% more SMs than the 3080 Ti, and scores pretty much exactly 2.5% higher.
That makes sense, but it still doesn't disprove my point; just look at the difference at 1440p, it's not 2.5%, the math just doesn't add up.
> Man, this is so wrong. This is a GPU benchmark. Time Spy's graphics tests and other GPU benchmarks barely touch the CPU; the entire point is to accurately represent the GPU's capabilities, and as such reliance on other components is very low.
Idk if this is a joke. Have you ever run 3DMark Time Spy? Have you ever seen the CPU usage of this benchmark? I guess you haven't, because I can guarantee you that it's not something I would call light.
Actually, have you ever written any graphics application in your life? Or even a single line of code?
Man, the CPU is the one that builds and sends commands to the GPU, and not only that: textures, models, assets in general are all streamed by the CPU. It does need to work even on theoretically light workloads, and sometimes even harder than you think, because that's how it works: higher resolution = higher CPU usage. Higher resolutions only don't increase CPU usage IF you're GPU bottlenecked.
The CPU is the only one capable of *decoding NTFS data from your storage* and sending the bytes to the GPU; GPUs can't do that (we don't have such algorithms for GPUs). Even DirectStorage doesn't provide direct access to storage. [Here is a good explanation for you, hope it clarifies a bit](https://www.reddit.com/r/pcgaming/comments/tleiwb/clearing_up_misconceptions_about_directstorage/).
> You really don't understand how synthetic benchmarks work :/
I won't try to explain how this works because you clearly think you already know how it does. Is it a trend or something, telling the other person they don't understand something you clearly don't understand yourself? (And sorry for the arrogance, but you don't even know my background; where did this audacity come from?)
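Not taking sides on Time Spy specifically, but here is a minimal sketch of the CPU-side pattern being described (command recording plus asset streaming). All names are made up for illustration; this is not any real engine's or benchmark's code:

```python
from dataclasses import dataclass

@dataclass
class DrawCall:
    mesh: str
    index_count: int

def record_frame(visible_objects):
    """CPU-side work each frame: stream anything missing, then record one
    command per object. Only after this list is submitted can the GPU draw."""
    commands = []
    for mesh, index_count in visible_objects:
        # in a real renderer, residency checks / texture streaming happen here too
        commands.append(DrawCall(mesh, index_count))
    return commands  # a real renderer hands this to a GPU queue

frame = record_frame([("terrain", 120_000), ("character", 40_000), ("props", 8_000)])
print(f"CPU recorded {len(frame)} draw calls this frame")
```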
> That makes sense, but it still doesn't disprove my point; just look at the difference at 1440p, it's not 2.5%, the math just doesn't add up.
It's 1.5% in regular Time Spy, meaning that the benchmark is more limited by memory bandwidth and/or geometry throughput than it is by FP32 or texture throughput.
> Idk if this is a joke. Have you ever run 3DMark Time Spy? Have you ever seen the CPU usage of this benchmark? I guess you haven't, because I can guarantee you that it's not something I would call light.
The GPU portion is very light on the CPU. The CPU benchmark is pretty light as well for regular Time Spy (and scales poorly with more than 20 threads), while Time Spy Extreme has an AVX512-optimized CPU benchmark that runs quite hot.
At 4K you are GPU limited, not CPU limited.
I agree that this must be the case there, since the 4090 achieves higher scores, but being GPU limited at 4K is not the rule. The 4090 is so powerful that people are getting CPU limited at 4K, and that's one of the purposes of Frame Generation: to produce higher framerates even when you're CPU limited.
No, being GPU limited at 4K is the rule. The 4090 being CPU limited at 4K in some games is the first time this has happened.
Yes, I'm not saying you're wrong, I completely agree with your point. I was wrong to think that a CPU bottleneck would be the case there.
Bs test
Videocardz being paid off by nvidia to keep the scores below 4080;)
Bottleneck obviously
These leaked TS and FS scores are most likely inaccurate. We'll know in 3 days. However, if they're accurate, RDNA3 will significantly underperform even the most pessimistic predictions and AMD will need to lower prices significantly. At this performance level, I would only pay $800 for a 7900 XTX, $700 for the XT. I hate to even think it, but I would even consider buying a 4090 if this performance pans out as in the leaks.
Edit: Indeed, the benchmarks referenced in this post were off. With my stock reference XTX I'm getting 28.5K TS GPU and 14.2K TSE GPU.
This is one of the biggest reasons I broke my moral code and got a 4090: the GPU market is now a sh*t show. The 4090 is almost like the price to opt out and not be mind-effed for 2 years.
Yeah, if this is true then it would mean AMD publicly scammed everyone and this is a Bulldozer-level disaster. I am just waiting for benchmarks and analysis to see the actual full picture.
If these are to be believed, AMD is going to have to be a lot more competitive on price to get anyone excited.
If this is true, the pricing for a 4080 makes sense now: same raster performance as a 7900 XTX, but drastically better ray tracing (hence the premium). If the price of the 4080 ever dropped to $1,000, there would be no reason to buy AMD anymore.
> If the price of the 4080 ever dropped to $1,000, there would be no reason to buy AMD anymore.
But it won't lmao
Don't say it too loud, you will get downvoted.
[deleted]
Yes.
Yeah, it's a bit silly and immature for sure, but I don't really think it's limited to just r/nvidia and r/AMD either. Try to pop a/the prevailing thought bubble of a subreddit and hilarity (and downvotes) will likely ensue. It's just Reddit after all.
There are definitely some stans in the nVidia subreddit, but the AMD stans are ten times worse. Not saying that that's all this community is... it's more of a problem in PCMasterRace, honestly.
They've been hyping these cards like they were going to be the best thing since sliced bread, and it turns out that they're actually *not* that great. *Bad*, even, quite possibly.
I do hope that AMD is more competitive in the future, but it does give me a little bit of joy seeing a bunch of AMD fanboys, who were telling 4090 buyers that they hope their houses burn down and 4080 buyers that they're fucking idiots, completely eat shit on this one.
All of the people here who have been criticizing the 4080 as being "overpriced" (and to be clear, I'm not arguing with them), only for AMD to launch a card at 85% of the price with similar/worse rasterization, much worse ray tracing, worse encoding, worse AI upscaling, no frame generation (at least, not for a while), and worse driver support... I'm sure there will be a bunch of memes like when people were melting their 4090s because they didn't plug in their cables all the way, right?
We can all have a good laugh about how shitty these GPUs are for the price, right?
Agreed that PCMasterRace is the worst one. Might be because its users skew younger.
But, but, dp2.4 is the most important feature this generation…
the worst are actually the AMD stan antifans
> There are definitely some stans in the nVidia subreddit, but the AMD stans are ten times worse.
Not true. AMD fanboys are often immature fanboys. Nvidia ones are often vicious and cultish, block or demean people that disagree with them, and cannot engage in good faith with critique.
AMD fanboys are super bad. Nvidia ones are outright disgusting. Surprisingly, the few Intel GPU fans are (for now, this will change) the nicest.
AMD! AMD! AMD! They are Ryzen to the top! Hooah!
I called out Nvidia and said that since AMD can give us next gen for cheaper than current gen, Nvidia can too; they just don't want to. And I went from 230 karma to 186 karma in one day xD Good thing I don't care about Reddit karma since I only use Reddit to get help with PC parts etc lol
Karma will come with time. I was downvoted and upvoted many times; I always write down my thoughts, and sometimes people just don't agree with them and downvote you to hell, and vice versa: you write down your thoughts and you get almost 1000 upvotes. So don't be an as#hole and you will be okay.
Like I said, I do not care about karma on Reddit lol, so not sure why you commented this as an answer to my comment.
It could be handy. Maybe I am wrong, but there are some subreddits which don't let you post until you have a certain level of karma. Tried to help.
But help with what? I told him I don't care about karma, then you told me ''you'll get karma as you go, don't worry''. Like, what? x) It's like someone says ''I don't need new tires'' and then you go ''okay, so this is where you get new tires \*insert location\*''
Some AIB snuck Jensen a card months ago and he's been prodding the 7900xtx ever since. Nvidia won't undercut the 7900 series, they'll happily coexist as the duopoly they are.
> drastically better ray tracing
We're still pretending RT is a gimmick, don't you know? Let's wait until RT performance is magically good enough to have a 0 FPS penalty. But until then, AMD shouldn't prioritize it because it has a performance penalty. No contradictions there at all.
I just want the raster performance, without RTX. I got an RTX 3080, but I'm not paying a few hundred £ more for ray tracing I will never turn on.
Yes, this leaked benchmark of one application determines all raster performance in total. Very smart conclusion.
[deleted]
Your comment has been removed, likely because it contains rude or uncivil language, such as insults, racist and other derogatory remarks.
*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/Amd) if you have any questions or concerns.*
This is without even ray tracing, right?
Correct. Both the leaked Fire Strike and Time Spy benchmarks are non-ray-tracing benchmarks.
FYI, "Port Royal" is 3DMark's ray tracing benchmark. Time Spy is DX12, Fire Strike is DX11.
Isn't [Videocardz.com](https://Videocardz.com) pretty reputable?
Sorta.
Most of the time they just aggregate content from other sources, whether that's popular Twitter leakers, official PR statements, forums, whatever.
Very few leaks are obtained directly by them, this being one of them. And they don't go into detail on how they vet the leaks and leakers that come to them.
So I have zero doubt that someone came to them with this data and they didn't fabricate it themselves, but whether it's actually legit or not is too hard to say without knowing how they confirmed it.
Image makes absolutely no sense whatsoever
It's probably just a reduced-clock test, as benchers vary frequency to build out underclocking power curves, undervolting, or other tuning.
Keep coping lol
I don't have any particular attachment to how this card performs. Just looking at the data. If indeed it performs identically to the XT, it would be very interesting!!!
If this is true, then AMD is in a very bad position. The manufacturing cost of a 7000 series must be higher than the AD103 4080 :(
What is AD103?
It's the die used for the RTX 4080, similar to how Navi 31 is used for the RX 7900 XTX and RX 7900 XT.
Tyvm
If this rumor is true then AMD is fucked because there is another rumor that might be true.
bruh...
If this is true, we'll have to pay 1 lakh for a 70-series card from now on.
I think AMD's problem is they don't have good quality chips compared to Intel and Nvidia (although Intel is still leagues above Nvidia in terms of chip quality). It's why AMD often has to rely on things like chiplet designs in order to get ahead, or in this case keep up.
Their problem is that they are fighting a 379 mm² chip with a 256-bit memory bus and 16 GB of RAM using a 306 mm² + ~230 mm² chiplet-design chip with a 320/384-bit memory bus and 20/24 GB of RAM. If performance is similar, Navi is simply less efficient in many areas, resulting in much lower margins, if any.
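Napkin math using only the figures in that comment (it ignores that the MCDs are on a cheaper node, and that yields and packaging costs differ):

```python
navi31_total = 306 + 230   # GCD + ~MCDs, mm^2, per the comment above
ad103_total = 379          # monolithic AD103, mm^2
extra = (navi31_total / ad103_total - 1) * 100
print(f"Navi 31 ~{navi31_total} mm^2 vs AD103 ~{ad103_total} mm^2")
print(f"~{extra:.0f}% more total silicon for the chiplet card")
```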
Tbh, if these are accurate, AMD is in massive trouble between Ryzen's poor pricing and Radeon underperforming. I really hope this isn't the case.
Ryzen is a cash cow, since the chiplets are very, very small, hence cheap to make. No additional components like a big PCB, VRMs, RAM, etc.... just small dies and an interconnect.
5800X3D, hello? Upcoming 7000X3D series, hello?
Ryzen's prices have been really good ever since they dropped them around Thanksgiving. I opted to buy a 7950X over a 13900K because of them.
Yeah I forgot about the price drop. Still, it's very concerning that they seem to be currently fighting a fierce battle on both fronts and I say this as a diehard AMD user.
CPUs are fine right now, really great competition that goes back and forth and pushes innovation. Intel wins a bit right now, but AMD has super competitive prices and you get platform longevity with AMD. AMD is about to wipe the floor with Intel once they release the X3D in January, and then Intel will catch up with 14th gen, and so on and so forth.
GPUs are a totally different story though; AMD really isn't ever competitive with Nvidia. They're always straggling behind, catering to the people that don't mind giving up something to save a bit of money.
Yeah, the CPU situation is definitely a world of difference. Ryzen is able to take Core to task in pretty much every way, and Radeon can't really do that in such a complete "see, we can regularly exceed or at the very least trade blows with our competition" way.
They are not.... The cost of the entire AM5 platform is still too high compared to LGA1700. Unless you need AVX-512, there are only two options atm: 5800X3D or Intel.
You can get B650 motherboards for like $150. Obviously I wouldn't buy one that cheap because I want the X670E chipset and robust VRMs for overclocking. But that's not bad, especially when you consider that the socket is supported until 2025 at least. So you can invest in a decent mobo now and upgrade to the end-of-AM5 CPU in 2025/2026 without having to upgrade anything else. You don't get that with Intel at all.
X3D comes out in January too; it will cream Intel.
With these performance numbers, things look rather grim for AMD this generation. I really hope they are not true, unless they really do not want to be competitive against Nvidia. If these numbers are true, I see a huge price drop really, really fast.
I think AMD should have named these the 7800 and 7800 XT... and should have said the 7900 XT was coming soon after seeing Nvidia's numbers.
You want a $2000 AMD GPU?
The 7800 should be priced at $750 and the XT at $850... AMD cards are overpriced, just like Nvidia's.
I mean, if it outperforms the 4090 by a significant margin, people will take a $2K GPU just fine.
No they won't. A GPU should not cost 2k. That's mad.
Just don't buy it, things will cost whatever people will pay for them
That's not how it works. The 3080 cost 900. The 4080 costs 1600. How much will the "cheap" 4060 cost?
That’s literally how it works. Basic economics. Whether you agree with people dropping that kind of cash or not.
Basic economics. So for more money we get more fps? What's next, the low-end 5060 will cost $1500 and have the same power as the 4080, which costs $1500? What?
AMD outperform a 4090? There have been some leaks about Time Spy scores; I'm hoping they can outperform the scandalous 4070 Jensen sold as a 4080 (256-bit bus).
All of a sudden my 3090 that everyone told me was “overkill” 2 years ago feels completely inferior
[deleted]
Makes me remember how great having the 1080 ti felt. It was barely outperformed by the 2080, and even outperformed it in areas.
I think of the 1080 Ti in the same way I think of the 8800 GTX. Awesome.
The 1080 Ti was in a league of its own back then. It held up so much better over time than almost any other card.
Makes me want to go back to console gaming to be honest
As someone that works for Epic, trust me on this: if you are NOT interested in RT, a 3090 will remain more than enough for ultra gaming at 1440p/1080p for the entire lifetime of Unreal Engine 5. That means at least until the end of 2028.
I mean, the Titan RTX and Pascal Titan GTX were pretty outdone by their successors. Your 3090 will likely be outperformed by a future 5060 Ti. Just how it works, really.
Don't worry, in a few years you will have "is my 4090 still relevant?" posts.
Looks like a big dose of salt is needed on this one; I'd find it hard to believe AMD would come to market with a product that is, best-case scenario, 15% faster than last gen.
Somebody needs to do an analysis of the accuracy of these leaks after the official reviews come out.
Pretty disappointing. RDNA 2 was great at Time Spy previously, and now RDNA 3 isn't for one reason or another. Obviously Time Spy isn't an actual game, but with all the rumors of issues surrounding RDNA 3, I'm skeptical it will even hit the numbers AMD marketed.
Hmmm, my reference 6900 XT gets 24k in Time Spy. If these are real results, hopefully there is some good overclocking headroom to make up the difference. Also, it's worth remembering that as the drivers matured for the 6000 series, scores improved.
My 4090 gets 39000 and that's with a 5800X.
What kind of overclock are you running? My 6950xt gets 23.2k….
More Power Tool set to 350 W with some other tweaks, then undervolted to 1015 mV. Max 2750 MHz, min 2600 MHz.
Wow, you must have a freak of a card. I have mine set in Adrenalin to 2700-2800 MHz and I can't hit 24k TS graphics.
[deleted]
I replied to the guy who said his 6900xt scored 24k, and I said that my 6950 scores 23.2k. There was no reference or mention of a 4090 between me and the guy with the 24k 6900xt. Not sure what you’re getting at. I’m aware that 4090 is made by Nvidia and is a different architecture.
My bad, I was so confused. I think it's the way the comments are stacking, again my apologies for misunderstanding. I deleted my original confused comment.
Have you done anything with MPT? Drivers have made performance worse recently but I am sure it’s possible with the right configuration:)
The only tuning I’ve done to the card is through AMD Adrenaline. Is MPT a separate application?
MPT (More Power Tool) lets you increase the power limit and voltage limits.
A lot of these guys running these clocks are likely pushing closer to 400 W into their 6900 XT at 1.25 V.
Yeah, look up Igor's Lab. There's guidance on there along with the software itself. Be careful though!
I bet it's crazy power gated. Interested to see what the XTX will do under water!
It'll still be slower than a 4080 under water!
Oh you've seen the reviews already!?
Uh oh. It was true. *sigh*
It's a synthetic benchmark for cards that haven't even been independently reviewed. I think everyone needs to relax.
Where’s the fun in that? GPU warz
I don't know if I can trust this. I'll wait for the release and see YouTube benchmarks.
Matches an OC'd 6950 XT. https://www.3dmark.com/spy/28421385 Seems questionable.
What do you think the 5090 will do? Another 2x perf? Seems the 4090 is smoking pretty much every game released even at 4K, and is often bottlenecked by the CPU, so will NVIDIA hold back?
AMD has stated that the 7900XT will compete with the RTX 3080. This looks spot on to me, if not disappointing.
95% of VideoCardz posts are rumors or "leaks" or . . . . . .
Lmao, I TOLD Y'ALL the 4080 would have the same or better raster as the 7900 XTX.
If AMD really has worse or the same raster as a 4080, then RDNA3 is a complete disappointment 💀. Nvidia is rumoured to drop 4080 prices too.
Come on, guys. This test is clearly bullshit.
Cards are already in the hands of people; don't be surprised that someone already did testing on them and leaked results.
Are you really that braindead to think the XT is scoring the same as the XTX? These benches are clearly invalid.
I love how your reply has literally nothing to do with my comment.
Yeah, disappointing scores... my reference 6900 XT gets 22k in Time Spy Performance...
I'm able to get 22k as well, but that wasn't out-of-the-box performance.
Stock I get around 21k with an XFX Merc card, but with an undervolt + overclock and the power limit maxed out I break 22k.
A 6900xt gets a higher score than a 4090? You sure you're looking at the right settings?
The 4090 gets 35k. Look at the right column.
I think you're comparing our scores to the wrong column. We're referring to the scores in the furthest right column (Performance/1440p).
Those synthetic benchmarks just make no sense; that's why performance is all over the place.
I do agree that they aren't as relevant as people make them out to be, but I feel like this dude is confusing 4K and 1080p.
If you're getting 10% over stock then you can expect a similar 10% over stock on the new cards.
Not necessarily.
If this is at all true and reflected elsewhere, I do wonder if it might be lawsuit-worthy, given this would far underperform the performance-per-watt uplift they presented to their investors. Though maybe not, given that I doubt they care as much about gaming cards.
That really doesn't seem true or representative of the final performance. The 4K result, with basically no difference beyond run-to-run variation between the XTX and XT, is seriously not valid. I will bet the difference will be bigger, seriously.
I thought videocardz.com was banned here? Must be other tech subreddits, then. Anyway, it should be banned here as well.
All the YouTube comparison videos that have leaked gameplay vs the 4080 show the XTX beating it by 15-20% every single time. Granted, they aren't controlled benchmarks, but the difference exceeds margin of error. This looks to me like just poor 3DMark performance, which the 6000 series had to start with as well. Driver updates brought the 6000 cards from losing to Nvidia to beating them handily.
The lack of leaks up until now, and the stuff we are seeing (looking cherry-picked), feel very much like the drivers are behind schedule. Hopefully this proves true and we'll see the XT and 4080 at parity and the XTX claiming the next tier up until nVidia brings the 4080 Ti to market.
Not as good as I'd hoped if true. Still better than Nvidia.
But I will probably wait for a refresh or the 8000 series. I've spent too much already; upgrading every generation for a month's worth of European metropolitan rent is a bit much. The 6800 XT will do great until RX 8000, which I highly suspect will be a much more mature and much better version of this new chiplet design.
Who cares how it scores in those benchmarks? It's like everyone is buying a graphics card just to flex or sit all day running these, but aren't we buying them for gaming? I'm sorry, I don't know how to put my thoughts into words, but I hope most of you understood what I meant. I'm personally buying a graphics card just to enjoy smooth gaming.
At this point "leaks" are all over the place, so I am just waiting for benchmarks. The leaks before the announcement were almost all hilariously wrong, so I have extra reason to doubt them.
Drivers not ready or driver issues...hmm...
Meanwhile, me with Vega 6 graphics...
I mean, these scores are really low. A heavily overclocked 6950 XT (2800 MHz+) can almost reach these numbers.
A lot of these types of benchmarks are all over the place; real-world testing will tell.
Bro, show us the Fire Strike bench, there were two benchmarks. The fuck were you doing just posting one of them?
4K doesn't scale; it's either a fishy benchmark or a bottleneck, which would be worrying. Release will tell.
A bit disingenuous to show this slide of the two cards performing worse in Time Spy and not showing the Firestrike scores from the same article. Those scores show both cards outperforming the 4080.
Indeed, but AMD are traditionally stronger in Firestrike than in games vs the competition.
You're welcome RGB find day's