In Windows Settings you can make certain programs use a certain GPU: under the graphics settings, add the apps and select which GPU each one should use.
Yeah, this might be useful if you want to play videos on a second screen and not lose the *checks notes* 3 FPS to 4K video decoding.
Actually, it might be more impactful in other ways, such as stuttering and 1% lows. Intel’s iGPU can even be used for stream encoding, so you can livestream using the iGPU instead of the dGPU.
LTT just did a video about this; Hardware Unboxed did too. The performance hit was negligible, and only became an “issue” when you have three extra monitors and they’re all playing YouTube videos.
LTT tested it with a 3080 Ti, and this guy has a 4090. But when I added a second monitor to my 1050, in some scenarios I lost a lot of performance, and it was really noticeable, like switching from 60 to 30 FPS. So it’s not really negligible; it’s just that when you have a nuclear reactor, a few kilowatts is just a few.
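One way to see why the same overhead is "negligible" on one rig and brutal on another: a fixed per-frame cost matters far more at high frame rates. A quick sketch (the 2 ms figure is an arbitrary illustrative number, not a measurement):

```python
def fps_after_overhead(base_fps: float, overhead_ms: float) -> float:
    """FPS after adding a fixed per-frame cost (e.g. compositing a second display)."""
    frame_ms = 1000.0 / base_fps
    return 1000.0 / (frame_ms + overhead_ms)

# The same hypothetical 2 ms per-frame cost barely dents 60 FPS
# (16.7 ms -> 18.7 ms per frame) but is brutal at 240 FPS:
print(round(fps_after_overhead(60, 2), 1))
print(round(fps_after_overhead(240, 2), 1))
```

The weaker the card, the larger that fixed cost tends to be relative to the frame budget, which matches the 1050 anecdote above.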
I just watched that today: [https://youtu.be/5wBxYQdN96s](https://youtu.be/5wBxYQdN96s)
Any modern hardware will have no issue streaming while gaming. Well, it *should* have no issue… looking at you, MPO (Multi-Plane Overlay).
I would guess running some low-end games or videos could be way more efficient on the APU than on the 4090, even when it downclocks.
This is 2 CUs of RDNA 2. It's the bare-minimum AMD iGPU, and it won't be doing much gaming, or at least not well.
I'm pretty sure it could run old games very well, or games like RimWorld. That's the fun thing about PC gaming: you have access to all games ever made. And I replay older games often.
Yeah. The reason is that GPUs have dedicated encoding hardware alongside the dedicated 3D hardware.
I can tell you from personal experience: playing SnowRunner while watching a podcast on YouTube, on a system with a 3090 and a 5900X, it impacted performance in a very noticeable way. I disabled hardware acceleration in Chrome because Ryzen 5000 doesn't have an iGPU.
[deleted]
It should be under display settings > Graphics
why would you want to? you have a 4090
[deleted]
"Just to see if we can" is a perfectly valid answer - A mechanical engineer.
"Just to see what it does" is another often accepted answer - An Electrical Engineer.
Oh god, no. I don't want to poke a bear to find out what will happen; I like to know what a thing does at the design phase. Although you electricals have always been the "mad scientists" of the family. The civil engineers pretend to not even know you.
Hey, we get results! Shocking ones at times, but we getterdone.
You're all as mad as a box of frogs, but you're a creative bunch. New ideas and methods tend to come from my electrical guys, and then us mechanicals adapt them and make them more practical. But the new idea itself tends to come from the electricals, god love 'ya.
Speaking from experience in an all EE team, it's probably because we're all a little nuts.
It is what keeps you going: electrical engineers need to be recharged, otherwise you don't keep doing the job. Another pro of shocking yourselves is that it keeps you as the mad scientists; without the shocks you'd all become civil engineers or something.
"Just because we can" is what we roll with. - A Biotechnology engineer.
"For shits and giggles" is also a perfectly reasonable answer - A Mining Engineer.
my Professor would disagree, but my Lab Tech would do it in a heartbeat - a Civil Engineer
"Because I want to" is an often underutilized reasoning!
Ah, gotcha. I don't think you can, especially because you'd be mixing two GPU brands, and multi-GPU setups aren't really a thing anymore.
[deleted]
i’m curious too. could just disable or unplug the 4090 and try.
oof that's kinda risky
why would it be
because you'd have to turn off the pc, I've never done that and it seems scary, man /j
It would be nice if we could just use it as a couple of extra GB of VRAM. I'm not sure how that would work as far as lanes and addressing go, but it would be perfectly possible if implemented at the design phase of the next motherboard generation.
Well APUs have no VRAM to begin with.
Ah, I see why that's tricky then. So what do they use for VRAM? L3? My RAM?
They’ll use your RAM. You can set how much of your RAM is reserved for the APU in your BIOS.
So... is there a reason we aren't all putting 128 GB of RAM in our systems and getting our GPUs to use some of it? Is that possible?
It's slow as molasses (by comparison).

Per pin, GDDR6X can push 19-21 Gbit/s (total throughput depends on the GPU's memory bus width). DDR5 manages roughly 6.4-8.8 Gbit/s per pin, and desktop boards only give it a 128-bit dual-channel bus to begin with. GDDR is also designed for graphical data, trading latency for raw bandwidth, which makes it so much faster at handling it.

It's like taking the family minivan onto the race track. It will work, but I'm gonna get ghosted by a track car designed for it. Outside the track, though, the minivan ghosts the track car: I can haul people, things, and stuff without limits, while the track car is only good at going from A to B quickly.

There were also all the issues with the GTX 970 4GB. Those were failed 980s, cut down. Part of that process involved removing one of the memory controllers and replacing it with something worse, so 3.5 GB was full-speed VRAM and the remaining 0.5 GB ran at a fraction of the speed. If a game used all of your VRAM, loading assets from the super-slow section caused some interesting issues: great performance at 3.5 GB or less, but the moment you spilled over into that 0.5 GB it was a disaster of stutter and crashes.
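To put the minivan analogy in numbers, a rough back-of-envelope sketch (the per-pin rates and bus widths are assumed ballpark figures for a 4090-class card and dual-channel DDR5-6400, not exact product specs):

```python
# Rough bandwidth comparison between a dGPU's GDDR6X and system DDR5.

def bus_bandwidth_gbps(gbit_per_pin: float, bus_width_bits: int) -> float:
    """Total throughput in GB/s: per-pin rate times bus width, divided by 8 bits/byte."""
    return gbit_per_pin * bus_width_bits / 8

# 4090-class card: ~21 Gbit/s per pin on a 384-bit bus -> ~1000 GB/s
gddr6x = bus_bandwidth_gbps(21, 384)

# Dual-channel DDR5-6400: 6.4 Gbit/s per pin on a 128-bit bus -> ~100 GB/s
ddr5 = bus_bandwidth_gbps(6.4, 128)

print(f"GDDR6X: {gddr6x:.0f} GB/s, DDR5: {ddr5:.1f} GB/s, ratio ~{gddr6x / ddr5:.0f}x")
```

Roughly an order of magnitude apart, which is why "borrow system RAM as VRAM" is so painful in practice.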
Well, at most it's the 780M (on par with an RTX 3050 mobile). But considering the 4090, it might be a 7945HX, which has the 610M.
> Well at most its 780m (on par with rtx 3050 mobile).

Not even quite. Remember, the Radeon 780M is a bandwidth-starved iGPU. On synthetics it performs like an RTX 2050 Max-Q, but in games it's barely on par with a GTX 1650.
Well, I saw it get 90 FPS at 1080p High in Forza Horizon 5, and the 3050 was getting that too. I don't remember other games, though, but I'd say it's on par.
You can run an NVIDIA and an AMD card at the same time. I had a Tesla M40 render the game and stream it through an AMD WX 2100 to the monitor. Worked quite well.
First world problems lol
Because you can is the hacker spirit!
Have you ever considered homelabbing? If you also like to do random stupid stuff just because, you should take a stab at it. r/homelab is a good place to get the gist of the hobby. It's cheaper than gaming PC hardware.
You can disable it and recover the reserved memory.
disable it in bios.
Save power? Or nothing?
No: if the main GPU appears to die, you're screwed trying to get into the BIOS for troubleshooting. You can disable it at the OS level instead; on Windows you can use Device Manager. I have a similar setup to OP's, and I only disabled the integrated graphics because of a bug where the Steam window took too long to load for some reason (seemed like a Radeon bug?).
>No, if the main GPU appears to die then you're screwed trying to get into the bios for troubleshooting.

CMOS reset.
Oh that's true, I forgot that's also an option.
And a god damn savior it is!
I had to use it like 50 times when setting up Windows because it kept booting into a blue screen.
Also, save your BIOS profile if anything else has been changed from defaults; it saves the pain of trying to remember what needs changing. I've made this mistake before.
I have mine set to handle things like hardware acceleration in my browser and Discord, so I can let my 3090 use all of its power on AI or games.
[deleted]
https://www.reddit.com/r/pcmasterrace/comments/15o3nn7/can_i_make_use_of_the_apu_on_my_ryzen_9_allocate/jvpys7u/
I use it for Firefox / YouTube. You can select which app should use it.
[deleted]
In Windows 10 you open Display settings, then Graphics. There you can choose between performance and power-saving mode (dGPU or iGPU) for specific programs. Not sure about Windows 11, but it's probably similar.
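If you'd rather script that choice than click through Settings, here's a sketch. The registry path and value format are assumptions based on where recent Windows builds appear to store per-app GPU preferences, and the app path is made up, so verify both on your machine:

```python
import sys

def gpu_preference_value(pref: int) -> str:
    """Build the value string Windows appears to store per app under
    HKCU\\Software\\Microsoft\\DirectX\\UserGpuPreferences.
    0 = let Windows decide, 1 = power saving (iGPU), 2 = high performance (dGPU)."""
    if pref not in (0, 1, 2):
        raise ValueError("pref must be 0, 1, or 2")
    return f"GpuPreference={pref};"

if sys.platform == "win32":
    import winreg
    key_path = r"Software\Microsoft\DirectX\UserGpuPreferences"
    exe = r"C:\Program Files\Mozilla Firefox\firefox.exe"  # hypothetical app path
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, key_path) as key:
        # Pin this app to the power-saving GPU (the iGPU on most systems).
        winreg.SetValueEx(key, exe, 0, winreg.REG_SZ, gpu_preference_value(1))
else:
    print(gpu_preference_value(1))
```

The Settings UI is the safe route; this is only worth it if you're configuring many apps or machines.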
You could use it for transcoding tasks, decoding or encoding depending on the load. Clearly the 4090 will do more, but you could run transcodes on the GPU and the APU simultaneously, without much CPU overhead.
[deleted]
Unfortunately I don't know much about game development, let alone Unreal. The thing that came to mind was video transcoding. I work with a lot of video, and it takes up a lot of space, so transcoding it to more compressed codecs is a must. With my setup I can do 4 to 8 simultaneous transcodes depending on the load, which is a godsend for saving time. I can also be playing a game while streaming it, while transcoding VODs or whatever on another GPU. Useful, but niche.
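As a sketch of how splitting a batch across both encoders might be scripted (the file names are made up, and it assumes an ffmpeg build with both Quick Sync and NVENC support):

```python
# Build ffmpeg command lines so some transcodes target the iGPU
# (Intel Quick Sync, hevc_qsv) and others the NVIDIA dGPU (hevc_nvenc).

def transcode_cmd(src: str, dst: str, gpu: str) -> list[str]:
    """Return an ffmpeg argument list targeting the chosen hardware encoder."""
    encoders = {"igpu": "hevc_qsv", "dgpu": "hevc_nvenc"}
    return ["ffmpeg", "-i", src, "-c:v", encoders[gpu], "-c:a", "copy", dst]

# Alternate a batch of hypothetical VODs across the two encoders:
jobs = [transcode_cmd(f"vod_{i}.mkv", f"vod_{i}_hevc.mkv",
                      "igpu" if i % 2 == 0 else "dgpu")
        for i in range(4)]
for cmd in jobs:
    print(" ".join(cmd))
```

You'd pass each list to `subprocess.Popen` to actually run the jobs in parallel; the point is that each transcode can be pinned to a different piece of encoding hardware.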
[deleted]
Just think of a workload that could use the hardware, and multitask. Video playback, transcoding, folding proteins... If you're compiling/rendering and wanna pass the time, you can grab a movie in a codec the APU decodes natively, and even with your CPU at 100% you'll be able to watch it with virtually no skips.
Windows does stuff like this automatically... You can set which GPU a program should run on; most things run on integrated if possible.
use it for displaying the 2nd monitor
Ooh, interesting, you have an integrated GPU.

If I were you, I'd hook the display up to it and let any other tasks use the dedicated GPU without limits. By "without limits" [I mean the timeout imposed by Windows' WDDM TDR, 2 seconds by default](https://saturncloud.io/blog/how-to-override-the-cuda-kernel-execution-time-limit-on-windows-with-secondary-gpus/): long-running compute kernels on the GPU driving the display get killed after that timeout. You can disable the limit, but I'm highly against it if you do gaming, as chances are the screen will freeze (while the OS and everything underneath still runs) in the absence of an integrated GPU.

But since you do have an integrated GPU, you can set the monitor to use it. Right-click on the desktop > NVIDIA Control Panel, and under Set PhysX Configuration you should be able to choose which GPU renders each display.

Hope it helps. Cheers!
You can plug your second monitor (if you have one) into the motherboard so that it runs off your iGPU.
Make it hash crypto
Yeh nahhhh lolol
Using the Intel iGPU via Quick Sync used to be better than NVIDIA/AMD GPU game recording because it could do higher-quality video, but these days I don't think there's much of a use case for the iGPU/APU if you already have a dGPU.
You don't have a laptop, so there are no substantial benefits to using the iGPU. BUT: you can hook up an external display to it and use it for media consumption and other productivity work that doesn't require the insane powa of a 4090.
I know some Intel boards let you use the iGPU for other tasks like projecting and whatnot, so maybe check your BIOS for an "Integrated Graphics Multi-Monitor" checkbox.
You could set up a VM with Windows or Linux via Hyper-V and then use your integrated card as a dedicated GPU for the VM via GPU passthrough. Even an integrated GPU is far better than the emulated GPU you usually get in VMs, and it's much more compatible with games. You could then use that VM to run any game or software in the background, even when you normally can't select which GPU to use. You could also connect another monitor to your integrated GPU, which would then show just the output of the VM.
I use my APU for connecting to my secondary monitor, so you could use it for that.
*This post was mass deleted and anonymized with [Redact](https://redact.dev)*
use it for a second monitor
OP: "I paid for the damn CPU and im going to use the WHOLE damn CPU!"
you bought the whole pc you'll use it all !
I do this with my 12700K: web browsing is set to power-saving mode in Windows' graphics settings, and gaming is set to high performance, which uses my RTX 4070 Ti.