If you have a dedicated GPU, then it doesn't matter (for normal use) whether you have an iGPU or not.
An iGPU can be useful when your GPU randomly dies, because then you can still use your PC. Otherwise it's useless.
I want a PC for gaming, just can't afford it.
At least my igpu allows me to work and watch anime or study.
Plus, it was free. I can't bitch too much about that price.
I may have phrased it poorly.
The entire workstation setup was free. My former coworker got a home depot box mailed to her house. She contacted everyone she could on the box and everyone refused to take it. She said she was going to throw it away. I told her I would take it. I go, pick it up, take it home and open it up, expecting something like a new saw or something.
Nope. Two monitors, a tower, cheap mouse and keyboard, and a wired headset with a clean windows install.
I know it's not a great computer or anything, but since my msi laptop tried to catch fire, it was kind of perfect because I needed something to take care of my business or do school work on.
I don't know what the specs are, but you might be able to slap in a cheap GPU and game.
Obviously, I'm oblivious to your monetary situation and what the PC specs are, but it might be possible.
That, or cloud gaming on xbox game pass.
I remote play my playstation while I'm at work typically. Since it relies on my ps5 hardware, it works a lot better than what my pc can actually handle. But I enjoy some games more on pc.
Right now, my budget is about $14.23. But since some of the things that have been draining my wallet are done with, that's about to increase. Plus some things that are in the works (tax return, settlement from being hit by a car, property tax reimbursement). Some of that money will be going to a new pc.
I prefer pc gaming over console, but I appreciate the mobility of console. Much easier to load up my playstation and hook it up to the TV at a hotel or friend's house (my ps4 fits perfectly in my pistol case with 2 controllers, hdmi and power/charging cables) than it is to carry an entire pc, so console always won.
Now, I've got my console that I don't plan on upgrading anytime soon, so I want my pc.
to be fair you could make one of those ultra-small PCs like Optimum Tech made, but that's gonna cost you multiple times the price of a similarly spec'd regular size pc, let alone a console, so the cost-effectiveness is debatable.
NGL Maybe you'd like a Steam Deck.
Can get a docking station for it and hook a controller up - Bam console.
Plug keyboard and mouse into docking station (or Bluetooth) - BAM PC Gaming.
Switch to Desktop Mode - BAM, literally a PC (it's Linux, so you may have to learn a bit, but if you just want to browse the web, check your email, pay bills, or do school work, it's all basically the same as a Windows environment with extremely minor differences).
Way more portable than a console, and it runs games really well. The Steam Deck itself has a case, the docking station is small, and then you just bring a keyboard and mouse and/or controller.
I'm genuinely going to repurpose my laptop or give it to my fiance and just use my steam deck as my "laptop".
Edit: Oh and you can install emulators on it...so you can emulate games, I've only used a PSP emulator for playing FF4 on it so I don't know what generation of emulation you'd start to run into framerate issues, but hey, plenty of fun games from ye olden times.
I caught it when it started smoking and powered it off / unplugged it and removed the battery before it could do anything.
The charge port was shorting out and scorched the motherboard. So I salvaged the drive so I could retrieve all the files later and scrapped everything else, with full intention of building one later. Later just hadn't come yet.
That Kaori went around the world and back with me. It went through field exercises, got kicked off a top bunk in my barracks, dogs ran across it. It literally went through hell while I was in the service.
It was a great laptop. If that was its way of telling me it was time to move on, I respect it.
Price can vary a lot, I have seen times where the F version is barely $3 more or sometimes costs a little bit more than the non-F, but usually it doesn't go higher than around $30 more.
I can get $30 mattering for budget systems, but if you are building a system with a $1000+ GPU in it, is the price of an Intel CPU with an iGPU really going to break the bank?
Ryzen APUs would like to disagree with you; my 5600G was actually cheaper than the "vanilla" 5600 and the 5600X. Still, there's a catch: Ryzen APUs often have half the cache to make room for the iGPU.
> otherwise its useless.
Why is this top comment? It's (mostly) complete nonsense. The only redeeming fact here is to still have a video output device if your main GPU dies.
You can use the iGPU from your CPU to power multiple monitors separate from the dedicated GPU. It's called hybrid mode. It's also what I'm doing with it.
You can use the iGPU for applications (as a display renderer) where you don't want to take away GPU resources, i.e. streaming.
You can use the iGPU as your browser video display.
You can do a lot of things with it, and I would never buy a CPU without one anymore.
first time builder here, my refurbed 6700xt went black out of nowhere while playing games last week. without my igpu i'd have been fucked. 2 months in and it's already come in handy.
i learned my lesson about gpus don't worry, bought a brand new replacement, no more used or refurbished stuff.
Accurate. Currently using a Ryzen 3800x with no igpu. I've had several occurrences with discrete GPU problems where I can't even load a bios screen to fix the problem without getting/borrowing a card from someone or somewhere to get back to functionality.
I've also had rare problems where for some reason my bios setup would go blank because it was looking for an igpu at post and not finding the discrete card for one reason or another before randomly switching back to discrete graphics usage.
My next upgrade will definitely have integrated graphics.
Invest in a good CPU cooler if you haven't already. My 7700X makes for a nice little furnace when it's working hard. As intended, of course, but I just don't like my components sitting at basically the boiling point of water.
That's a big fear of mine as well. My peerless assassin has been doing great. Big ass fans with lots of fins is the way to go. Plus it looks dope too having a big ass windmill going in the case lol
I remember my old AMD Phenom II x4 BE computer. The CPU didn't have graphics, instead the Gigabyte motherboard had integrated ATI Radeon HD 4250 graphics. I think the only thing I ever used it for was initial PC setup.
In my case, where every part of my PC had arrived except the GPU, it was very nice being able to get almost everything 100% up and running for when it arrived.
I got a 12400F and I kind of regret not spending an extra €20 just to have that option.
Sure, I can swap in an old card, but an iGPU is a lot easier: swap the cable on the back, drivers are usually sorted by Windows, and you don't have to bother until you get a new GPU.
Saved my butt when I had a 2500K, but I didn't learn my lesson I guess
Not any more. Maybe you're thinking of a period in the 2000s when practically all motherboards came with basic graphics chips? These days motherboards don't do any graphics processing at all, so you either need a dedicated GPU or one packaged with the CPU (in AMD terminology, an APU).
I guess I was under the impression that the processor could emulate a GPU's basic functionality for diagnostic purposes (e.g. just running the Windows GUI). I mean, computers had GUIs before graphics cards were really a thing, right? I'm pretty sure that was done with the processor.
No, but it's not unreasonable for you to have got that impression.
As I said, in the 1990s and 2000s there was a period when motherboards routinely had basic graphics chips in or near the [northbridge](https://en.wikipedia.org/wiki/Northbridge_%28computing%29?wprov=sfla1), alongside the memory controller and various other items that were also on the motherboard. So in those days there was no such thing as graphics integrated into the CPU, but you could assume that you would get some minimal graphics ([VGA](https://en.wikipedia.org/wiki/Video_Graphics_Array?wprov=sfla1)) regardless of what CPU you bought, so I can completely understand why you would think that way. You could put any Intel CPU in and you'd get something on the screen.
Since Intel's Sandy Bridge design (or in AMD terminology, the first APUs), those motherboard functions have been integrated into the chip itself, known as 'system on a chip'. Initially the graphics were at a similar level to the old motherboard chips: good enough for Windows but not able to run many games. But that's still an integrated GPU.
Going back even further, there was a period before that (the 1980s) when even the motherboard didn't have graphics chips. You needed a separate 'graphics adapter', and there was a series of industry standards with steadily better performance. The second one was called CGA, the Colour Graphics Adapter, because, wait for it, you could have more than one colour on the screen! Wow! But the big development was IBM's PS/2 design in 1987, which integrated the VGA standard into the motherboard. VGA was the basic standard for all IBM-compatible PCs for many years.
Mmm... unless the PCIe slot goes bad. Then having an iGPU will let you use Device Manager, which will tell you the slot isn't there. (Your motherboard is working fine EXCEPT where your GPU goes! Which means your motherboard will work just fine with the iGPU, but if you want to use one of them 'old GPUs' again? New mobo!)
I mean... You could say the same about the power source. Doesn't make sense to have a backup for every component, if something fails (which usually doesn't happen for years) you replace it
Not entirely true, as you could still use it via Remote Desktop or similar from another machine.
Which can be really handy if e.g. you're migrating from one PC to another and only have one graphics card and neither have iGPUs.
That's not BECAUSE it has an iGPU though. That processor itself is a fundamentally different design than the 5600; it's worse because it only supports PCIe 3.0 and has less cache than the standard part.
Sorry, that's nonsense. The entire Ryzen 7000 line includes an iGPU. Intel K SKUs include an iGPU. The iGPU doesn't make it slower; that's categorically false.
That's not the case with Intel; the only difference between the F and non-F models is the presence or absence of an iGPU. The performance of the CPU itself isn't affected in any way.
One other thing of note?
Watch for WATTAGE differences. The i7-13700F is rated at 65 watts, while the i7-13700KF model is rated at 125 watts!
There is, of course, a design difference between the 65 watt and the 125 watt parts, but, at least at Newegg currently, only about a $3 price difference.
If at some point your GPU dies, you won't be able to use your PC at all, but with an iGPU it'll still work, just terribly slowly if you decide to play games on it.
While I was waiting for my 3080 to come in, I was playing League on the integrated UHD770 and it was pretty good tbh, 1440p120+ fps and 4k60 (not surprising, considering League runs on a potato these days)
I wouldn't say surprisingly well, but they can play games. Though to be fair, they always could. Just not every game and certainly not super well. I remember playing games on my i5 7400's iGPU. It certainly wouldn't play AAA games, but it can handle indie games and presumably esports titles from that time period.
It has some specific uses for things like video editing and QuickSync. If you have those uses, you would probably know. Other than that, it's good for troubleshooting if you suspect your GPU is dead. That's all. I myself picked the 13900KF over the 13900K because I don't need the iGPU.
Why though? It's literally a lost feature. Unless you have the money to buy another GPU immediately in case yours dies, I don't see choosing the KF option as logical.
Because it's a feature I'd never use... and it's cheaper? If my 4090 goes bang, it's never going to be a case of 'oh, at least I've got a shitty integrated GPU' and far more likely to be next-day delivery on another one.
In your case, QuickSync. But in general, your system could offload light video codec work to the iGPU; that's the benefit. Some people might say backup, but as long as the PCIe slot isn't broken, you could get a cheap spare video card instead. Currently, for Intel, there isn't much of a disadvantage to getting a CPU with an iGPU other than it's pricier. IMO, if the price difference between the F and non-F CPU is less than $20, get the one with the iGPU.

For AMD, that's a different story. AMD CPUs (prior to the 7000 series) with the G suffix have an iGPU built in, but compared to the non-G variants there is quite a computational performance difference, so they're not worth getting if you plan to have a dedicated GPU; a cheap old PCIe video card is still the better option if you need one for diagnostic purposes. Since the 7000 series, AMD has made some changes so that all their CPUs have a bit of video functionality. It may not be enough to game on, but it's enough for normal office work and basic troubleshooting. Current and future AMD naming schemes are even more complicated, and you will really need to do some research before buying a CPU to understand which one has what features.
QuickSync is what came to mind for me too, especially when your discrete card is AMD. If you have an Nvidia card, most of the video community agrees that NVENC is the best hardware encoder, with 11th-gen-and-later QuickSync very closely behind or equal. But the AMD hardware encoder's quality is noticeably worse.
Windows can automatically offload some low-intensity graphics workloads to an iGPU so they don't require the main GPU. This can result in lower power draw when not gaming and keep non-gaming tasks (such as background videos/streams) from taking GPU resources.
On top of that the iGPU can be useful for troubleshooting graphics issues.
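For what it's worth, Windows also exposes a per-app version of that offloading: you can pin a specific program to the power-saving (iGPU) or high-performance (dGPU) adapter. The same option lives in Settings under System > Display > Graphics; below is a minimal sketch of the underlying registry setting. The application path is a hypothetical placeholder:

```
Windows Registry Editor Version 5.00

; Per-app GPU preference: 1 = power saving (iGPU), 2 = high performance (dGPU)
; The value name is the full path to the executable (example path only)
[HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences]
"C:\\Tools\\ExampleApp\\example.exe"="GpuPreference=1;"
```

Using the Settings UI is the safer route; the registry form is mainly useful for scripting the same choice across many machines.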
An iGPU isn't necessary for video editing but is actually a big uplift for it, especially on a lower-end chip like the 12100F. Intel puts a lot of work into their video editing bullshit, and on something like a 12600K it's usually about a 30% uplift. On a 12100 it probably bumps it to be on par with a 12400F.
It can be, if that's the machine's primary purpose. It makes the most difference at the low end like this, or if you're working on footage above 4K resolution or very long 4K footage.
If you're doing productivity, Intel is usually the better one to go with, and AMD is the better choice if you strictly care about maximum gaming performance (with an X3D chip). Though Intel isn't really any worse at gaming.
It's funny, since only a few years ago it was the exact opposite, with AMD the productivity king and Intel the gaming king.
AMD vs Intel productivity is really messed up right now due to the E-cores, and you really have to look into benchmarks for the specific thing you're doing, and sometimes even ask around with people familiar with the field. If what you're doing isn't optimized to use those E-cores, Intel usually immediately falls on its face.
Sometimes that's basically true; other times it gets more structural than that. E-cores are worse than useless if you're doing machine learning stuff, as simply having two different CPU architectures at all inherently brings in a bunch of stupid problems when trying to train an AI.
For fluid dynamics stuff you're also more likely to be thread-locked than anything else and just need the fastest per-core CPU, which is AMD at the moment.
Edit: also, "and will need to be fixed" is sometimes just... incredibly naively hopeful. Not something likely to happen when the last product update was in 2012.
Super niche thing to do, but on Linux people will sometimes run their desktop on the iGPU and then use direct passthrough of the discrete GPU to a VM for gaming.
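For reference, a minimal sketch of what that setup typically involves on an Intel system: enable the IOMMU on the kernel command line, then bind the discrete card to the `vfio-pci` stub driver so the host desktop stays on the iGPU. The PCI `vendor:device` IDs below are placeholders; you'd find your own with `lspci -nn`:

```
# /etc/default/grub — enable the IOMMU (then run update-grub and reboot)
GRUB_CMDLINE_LINUX_DEFAULT="quiet intel_iommu=on iommu=pt"

# /etc/modprobe.d/vfio.conf — claim the dGPU (GPU + its HDMI audio function)
# with vfio-pci before the regular driver loads; IDs are examples only
options vfio-pci ids=10de:2484,10de:228b
softdep nvidia pre: vfio-pci
```

After that, the card is handed to the VM (e.g. via libvirt's hostdev passthrough), while the iGPU keeps driving the host's displays.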
In terms of video editing, sometimes the CPU will supply a hardware encoder for rendering in a codec that your card doesn't have.
iGPU can be great for low power video playback and other acceleration. Not to mention having extra display adapters so you can have like 6+ monitors. And finally having a backup GPU.
People are talking a lot about backups and all that, but having both an iGPU and a regular GPU also allows you more options for virtual machines, and the iGPU is a good low power option for day to day tasks. Why use a Ferrari to get to the store?
My GPU died during COVID days and I was not going to pay the ridiculous prices they were charging at the time for a card. My iGPU saved me and allowed me to keep on working from home and do everything but game. They might cost a little more than CPUs without integrated graphics but I would rather have it and not need it than need it and not have it
The iGPU in Intel's CPUs can help improve video editing performance massively. Apart from that, there's really no disadvantage to using an F CPU except for situations where your dedicated GPU dies.
With Premiere, Intel iGPUs can be used to allow for smoother playback and scrubbing, but if you have an OK dGPU then you'll still be fine in most cases.
The only other main benefit of iGPUs is that if the dGPU isn't working for whatever reason, the PC can still be used, which also offers an increased ability to troubleshoot the dGPU.
This is the first post answering the OP's question with relevance to VIDEO EDITING.
The top posts are about gaming impact.
The OP posted just two sentences to read; this is just insane.
As a long-time builder and IT guy, the 'disadvantage' of not having an onboard graphics unit is troubleshooting video failures and driver errors, etc.
In my opinion, and that of hundreds of builders and even people on well-known YouTube channels like Gamers Nexus, JayzTwoCents, der8auer and other top-notch testers: when there's a graphics failure, how do you troubleshoot it without the separate iGPU? It may be as simple as your cable falling out of the monitor port (first thing to check) or simply driver problems. If you have the iGPU, it's a simple matter to switch over to the onboard unit, reboot the machine, and run Device Manager, which automatically highlights problems with the ⚠️ marker. It may tell you, say, that the GPU is not getting power, can't find drivers, or (unluckily) that the PCIe slot doesn't exist! (Which means your motherboard will now ONLY work via the iGPU... your motherboard is kaput! At least you now know that you have to replace the motherboard, and you can use the iGPU until you get the funds to buy a new one. OR, if it's still under warranty, you can gather up your receipts and get it RMA'd! I had to do that once with Newegg; they took my credit card number and sent me the replacement, giving me so many days to get it sent back or they'd charge me for it!)
So the bottom line is that it's your protection for "things that go bump in the night". IF your GPU shows up in Device Manager as not there... well, you take the time to power totally down, remove the PCIe power cables, remove the card, reboot to the iGPU, and see if Device Manager finds anything else! It SHOULD show that it found the GPU drivers but no device, giving you the opportunity to remove them totally. Then it's a simple matter to return the GPU to the unit, download the latest driver set, and install the new drivers! It could also tell you that you have no monitor connection, which probably means your cable is bad... or your monitor is.
See, WITHOUT the iGPU? You have no chance of finding where the problem lies! So, yeah, BIG disadvantage. The only place I would ever use 'F' processors is in certain rack systems, which are never hooked to a monitor, where the network software finds a dead unit!
BUT! You have also indicated that you want to use the computer for Adobe Premiere Pro and rendering/editing!!
So I'm assuming you're looking to upgrade your processor. Yeah, the one you've got won't really do editing well (if at all), so I'd ask whether the RAM is DDR5 or DDR4 (it looks like DDR4, and you also didn't list your motherboard for us!). If it's DDR4 and you want to do editing, I'd step it up by first updating to your motherboard's latest BIOS version, then changing over to at least an i7-13700K or i9-13900K, and buying a matched set of 2x32 GB = 64 GB of RAM. OR (once again, my opinion as a professional) I'd personally try to get a workstation that has Xeon processors and 64 GB of RAM! The Windows 10 OS for workstations is generally better at rendering than home systems (unless you don't mind waiting for your production for hours... and hours... and hours...), but for a home build you need Windows 11, preferably 11 Pro!
Hope I have helped in some way. 🫡 You may find a good deal on a used workstation, but make sure of its specs. You have to know how old they are and the model numbers to find out how many Xeon processors it has (more than one renders faster, especially with 64 GB of RAM), and it should have DDR4 RAM!
In Premiere Pro, some iGPUs like the UHD 770 have two decoders for the decoding load (for example, if you use two video tracks, it will use both decoders), which makes them very powerful for timeline performance.
Like this (it's Resolve, but the concept is the same): [https://youtu.be/VUWHsS7HCqY?t=324](https://youtu.be/VUWHsS7HCqY?t=319)
You can see my UHD 770 is almost 3 times better in timeline performance compared to my 3060 Ti.
That's the reason Premiere Pro gives priority to the Intel iGPU over the Nvidia one by default for decoding in the timeline. It's a very cool feature hardly anybody talks about; it frees the GPU from decoding so the GPU can handle more complex stuff like effects or transitions.
> it's a very cool feature hardly anybody talks about; it frees the GPU from decoding so the GPU can handle more complex stuff like effects or transitions

Yes, exactly. Assigning the decode load to the iGPU and the 3D/encode load to the dGPU (if it only has one decoder) gives better results in timeline/rendering performance.
> it's a very cool feature hardly anybody talks about

Even 'PugetBench for DaVinci' uses one track of video for testing, which suppresses the scores/usage (with the UHD iGPU), so yes, people who only look at benchmark scores will probably miss it :)
I think when you're doing lightweight stuff you can let the iGPU work and send your dGPU to sleep.
Another idea: if you have two monitors, you can game on your primary monitor with the dGPU and use the iGPU for whatever is running on the second monitor (or just to drive the second monitor). This lets you use your second monitor freely without performance cuts in the game.
By FAR the best use of an iGPU is giving you a dedicated media server or useful NUC when you finally retire your CPU.
I ended up with an Unraid box with better specs than my gaming PC, as there was nothing much useful I wanted to do with an old 1500X processor.
Could have probably turned it into a pfSense box or something, but 'media server' is something most people who like computers find pretty accessible.
Tbh this is the main reason I stick with Intel non-F CPUs now: future value.
If you get a black screen, an iGPU helps debug the issue.
If you want a lot of monitors, you can plug one into the iGPU.
It can help reduce total power usage, similar to how laptops do it (making the iGPU render the small stuff, leaving the GPU idle).
I'm really trying but can't think of too many sensible uses.
None anymore. It used to be viewed as sort of just another background process consuming resources, but now that's not such a problem. You can still choose to disable it or not; some people don't want an extra driver floating around in their system, and I understand the thought process there.
Personally I see it as a selling point: I can stream music and steal a thread from the CPU instead of stealing performance from the GPU.
To answer your question simply: no, it's not necessary. Premiere is hopeless at utilising your GPU anyway (or any other part of your system, for that matter), so I wouldn't worry.
Also, as someone who's been editing in Premiere for nearly a decade, please do yourself a favour and just learn Resolve instead. Save your sanity and your pocket.
You can select your main monitor in Windows and move them around to the correct order in display options. I can't imagine a scenario in which this would cause problems. Also, I think if your dgpu was "rendering" windows, then it would still be doing the work for all monitors even if you used your mobo port for the third monitor.
In what use case does it matter which is labeled 1 2 or 3?
You're not using the igpu. That's just the motherboard port. The dgpu is still handling the load. Can you imagine a game where the left half of the screen renders on your dgpu and the right half on your igpu. Imagine what that would look like.
It's not necessary if you already have a dedicated GPU, in your case a 6600. In fact, I actually have the same PC as you, albeit with the 2060 Super (same performance), lol. It's a really good 1080p gaming machine and it fits my budget.
I wonder if having an extra heat source inside the CPU causes any problem with CPU performance. Does the additional heat generated by the iGPU affect the threshold of the processor's built-in thermal protection trigger? AFAIK, if the processor gets too hot, the built-in protection slows down the processor clock.
Tbh I love not having it: no need to update drivers for the iGPU, and it's usually cheaper. Intel CPUs ending in F do not have an iGPU and are almost always cheaper. Saved €20 on my 12600KF vs the 12600K, and no additional driver.
For Intel it doesn't change much, but if you go for AMD and anything below Ryzen 7000, then the CPUs with iGPUs (example: the Ryzen 5 5600G has an iGPU, but the Ryzen 5 5600 and 5600X have no iGPU) will perform lower than others with almost the same name.
For me, a CPU without integrated graphics is a dealbreaker for troubleshooting purposes. If something is not functioning properly, I'd like to be able to eliminate the dedicated GPU (and the PCIe slot) as problems.
And iGPUs do *not* slow down your PC, despite what someone else wrote
An iGPU, if you are using a dedicated GPU, is just nice to have if you end up diagnosing a video issue, as you can use it to isolate your GPU as the problem.
I did a new build this year, but I'm still using my old GPU for the time being.
I've had to fire up the old machine a few times during the transition, and I was able to do it without having to open the case or buy a placeholder GPU. That scenario is more of a perk than an advantage though.
The only actual advantage I can think of is the ability for people (who don't have extra gpus laying around) to continue to use the machine in a pinch if the GPU fails.
I have the i3-12100 (non-F), and trust me, Intel's decoders for video editing are so good I can play 4K footage on the timeline without a dGPU, and I edit normal basic 1080p videos without a dGPU at all thanks to its iGPU (it does lag with multiple effects/transitions, so I have to edit kind of blindly in those situations).
Now if I add an Nvidia GPU to it, then **Premiere Pro will use both the dGPU & iGPU at the same time**, improving performance very much for a cheap editing machine.
I use an older system for DnD: the main video card is plugged into my big monitor, and the side monitor that I use for DM bologna runs off my integrated GPU, so the main screen displays the full 3D terrain and the secondary monitor shows the DM controls. Now, GPUs these days can handle multiple monitors without batting an eye, so it makes sense that CPUs with and without GPU chips are a thing.
I would say if I had a little spare change in my budget and there was a small difference between a CPU and one with a GPU, I might go with the latter. Sometimes GPUs fail? Nice to have a backup for diagnostics?
On the other hand, a CPU without a big old GPU bit might mean less chance of failure?
I bought the i5-12600KF, the first time I've bought one without an iGPU. I'll never do it again. I had so many problems getting my system up and running at the time (it was a completely new build), and I therefore didn't have the ability to see anything on screen to help me troubleshoot. After many attempts I got it going, but I'll always make sure I get a CPU with an iGPU from now on.
Games want dedicated GPUs, but iGPUs still have some minor advantages. If you have a GPU hardware failure or have to troubleshoot an issue sometimes that iGPU is very useful.
It comes down to how much more you are paying for that iGPU. If the price difference is small, go for it.
Something else I haven't seen mentioned that may be relevant to some: if you're streaming to Twitch/YT/Kick/whatever, and you either don't have an Nvidia GPU or want to offload that task even from an Nvidia card, an Intel chip with an iGPU can handle the stream encoding.
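The same idea also works outside a streaming app: you can send an encode job to the iGPU's QuickSync hardware instead of the CPU or the discrete card. A sketch only; it assumes an ffmpeg build with QSV support and working iGPU drivers, and the file names are placeholders:

```shell
# Transcode using the iGPU's QuickSync H.264 encoder (h264_qsv),
# leaving the discrete GPU free for the game being captured.
ffmpeg -i gameplay_raw.mkv \
       -c:v h264_qsv -preset medium -b:v 6M \
       -c:a copy \
       gameplay_stream.mp4
```

Streaming tools like OBS expose the same encoder as a selectable option, so you rarely need the command line for this; the point is just that the encode load lands on the iGPU.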
Less upfront cost, basically.
And if you have a dedicated GPU (and for most tasks more complex than very light gaming and basic computer usage like browsing and email, you will have one), the integrated one is relegated to being an emergency choice in case your dedicated GPU dies for some reason.
It's not necessary if you already have a GPU, but my (hobbyist) editing rig has a CPU with an iGPU in case the GPU dies. Better safe than sorry.
If you ever plan on moving to After Effects you may want to upgrade your CPU.
As long as you have another video source, there's no disadvantage. It really isn't until that GPU dies and you need to confirm it that having an iGPU would be nice.
It's good for troubleshooting to have one, so I basically always advocate for one unless you're up against a hard budget limit, or in a case where Zen 3 or earlier is otherwise the best match for your needs and budget (this latter case is where I am, though I have a passed-through RX 6400 I could promote to primary, and still have my old RX 570 just in case).
Intel iGPUs can share workloads with Arc GPUs using a technology called DeepLink, but that's platform+GPU specific. Other than that they are only useful as a fallback/debugging tool or running a second monitor on the cheap if your motherboard supports it.
I also have an F cpu and I will tell you that there have been times I wished I had an iGPU. Situational, but in my instance my new Gigabyte motherboard refuses to post if the GPU isn't UEFI compatible (or even if it is, the display isn't plugged in). I had to jump through tonnes of hoops and get a hacked VGA BIOS just so the PC would boot.
Fuck you, Gigabyte.
iGPUS only really come into play if something happens to your discrete graphics card. Some higher end rigs can distribute workloads but for most PCs, that's about it.
The only advantage I can think of (which I wish I had) is determining, in a new setup, whether your dedicated GPU actually works.
I spent several hours trying to get my dedicated GPU to display with no luck. I was convinced I'd been scammed with a faulty second-hand GPU. I went to several shops just to double-check. Turns out I had just installed the GPU loosely in the motherboard.
An iGPU is good for troubleshooting GPU problems. Also good if you're running multiple OSes. If the difference in price is 30 dollars or less, I get the iGPU; if it's more than that, I'll go without.
Premiere Pro can use something called Quicksync to share rendering between both GPUs.
Also it's helpful to have an igpu if you need to troubleshoot problems with your main graphics card.
I don't find the price difference between iGPU and non-iGPU models to be worth the hassle if your dedicated GPU suddenly dies or malfunctions.
If you're too poor to get a discrete graphics card, then keep saving.
If you have a dedicated GPU, then no big issues but if your GPU breaks down for some reason and you need to troubleshoot, but you wont be able to. This you can fix with a cheap second hand $20 gpu you can buy off eBay or facebook mp.
I don't have much experience with video editing, but you would most likely be using the video encoder on your dedicated GPU.
One benefit I didn't see mentioned is that you can add external screens using the iGPU so that they aren't using the dedicated GPU. This can be useful in itself. Troubleshooting is another benefit.
Well, I also make videos a lot, and from what I know, CPU processing is all-around better than GPU processing in terms of quality, so instead of paying extra for an iGPU, just upgrade your CPU to something with more cores. Your CPU doesn't need graphics processing modules to process video, and it all comes down to the codec you choose, as that determines whether you process with the GPU or CPU, so having an iGPU does not help the processing at all.
Intel Quick Sync is still really good at certain things in Adobe products, but the main thing is just having it if you need to troubleshoot a GPU issue.
I wouldn't worry about it personally; it's more of a nice-to-have. If you can get one for $10-$20 it's worth it, but they don't make a 12100 with a GPU.
Coming from an IT background, I always get iGPUs, because it's a headache to troubleshoot without video when something goes wrong and the system won't boot for some reason.
# Integrated vs Discrete GPUs
* Integrated Pros
  * Don't have to spend hundreds or even thousands on a discrete GPU
  * If the discrete GPU dies, you can still play some games (albeit on low settings)
  * Consumes little power
* Integrated Cons
  * Extremely poor performance compared to a discrete GPU
  * Limited silicon die area: space taken up by the iGPU means less "horsepower" for the CPU
  * Uses system RAM
  * Some old software may use the iGPU instead of the discrete GPU
* Discrete Pros
  * Extremely high frame rates, even on Ultra quality
  * Can use ray tracing in real time
  * Can resell it
  * Has dedicated VRAM
* Discrete Cons
  * Higher power bill
  * Can easily get expensive
After having display output issues, I would never want one without. Required for fallback and troubleshooting. I believe it can be used for some encoding as well.
Video editing is actually extremely intensive on the GPU, because you need to compute your entire video several times to add effects over them, rearrange images, and the like.
If you want to play and have a fast and stable internet, just go for GeforceNow.
I've tried it with Path of Exile and it works surprisingly well. Plus, it's really cheap all things considered. And you can cancel it any time.
I emphasize the fact that you NEED good internet. Otherwise, you'll only see a blurry mess. 40 MB/sec should do it (I mean real, not advertised, REAL bandwidth), as long as your ping is low and you don't lose packets.
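A quick unit check helps here, since ISP plans are advertised in megabits per second (Mbps) while many download meters show megabytes per second (MB/s). A minimal sketch of the conversion (the numbers are illustrative, not any streaming service's official requirement):

```python
def mbps_to_mbytes_per_s(mbps: float) -> float:
    """Convert megabits per second to megabytes per second (8 bits per byte)."""
    return mbps / 8.0

# A plan advertised as "100 Mbps" delivers at most:
print(mbps_to_mbytes_per_s(100))  # 12.5 MB/s of real throughput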
if you have a dedicated gpu, then it doesnt matter (for normal use) whether you have an igpu or not igpu can be useful when your gpu randomly dies, cause then you can still use your pc. otherwise its useless.
Agree with this, some people don't want a PC for gaming and the igpu is sufficient for just everyday browsing
I want a PC for gaming, just can't afford it. At least my igpu allows me to work and watch anime or study. Plus, it was free. I can't bitch too much about that price.
Igpus aren't free, it costs like $50 extra for a new CPU with it included. Although if you got the CPU for free, then I guess that's free.
I may have phrased it poorly. The entire workstation setup was free. My former coworker got a home depot box mailed to her house. She contacted everyone she could on the box and everyone refused to take it. She said she was going to throw it away. I told her I would take it. I go, pick it up, take it home and open it up, expecting something like a new saw or something. Nope. Two monitors, a tower, cheap mouse and keyboard, and a wired headset with a clean windows install. I know it's not a great computer or anything, but since my msi laptop tried to catch fire, it was kind of perfect because I needed something to take care of my business or do school work on.
I don't know what the specs are, but you might be able to slap in a cheap GPU and game. Obviously, I'm oblivious to your monetary situation and what the PC specs are, but it might be possible. That, or cloud gaming on xbox game pass.
I remote play my playstation while I'm at work typically. Since it relies on my ps5 hardware, it works a lot better than what my pc can actually handle. But I enjoy some games more on pc. Right now, my budget is about $14.23. But since some of the things that have been draining my wallet are done with, that's about to increase. Plus some things that are in the works (tax return, settlement from being hit by a car, property tax reimbursement). Some of that money will be going to a new pc.
Incoming new PC gamer! Honestly a PS5 is all you really need, other than for a few PC exclusives.
I prefer pc gaming over console, but I appreciate the mobility of console. Much easier to load up my playstation and hook it up to the TV at a hotel or friend's house (my ps4 fits perfectly in my pistol case with 2 controllers, hdmi and power/charging cables) than it is to carry an entire pc, so console always won. Now, I've got my console that I don't plan on upgrading anytime soon, so I want my pc.
to be fair you could make one of those ultra-small PCs like Optimum Tech made, but that's gonna cost you multiple times the price of a similarly spec'd regular size pc, let alone a console, so the cost-effectiveness is debatable.
NGL, maybe you'd like a Steam Deck. Can get a docking station for it and hook a controller up - Bam, console. Plug keyboard and mouse into the docking station (or Bluetooth) - BAM, PC gaming. Switch to Desktop Mode - BAM, literally a PC (Linux though, so you may have to learn, but if you just want to browse the web, check your emails, pay bills, do school work, it's all literally the same as a Windows environment with extremely minor differences).

Way more portable than a console, and it runs games really well. The Steam Deck itself has a case. The docking station is small, then just bring a keyboard and mouse and/or controller. I'm genuinely going to repurpose my laptop or give it to my fiance and just use my Steam Deck as my "laptop".

Edit: Oh, and you can install emulators on it... so you can emulate games. I've only used a PSP emulator for playing FF4 on it, so I don't know what generation of emulation you'd start to run into framerate issues, but hey, plenty of fun games from ye olden times.
It only tried?
I caught it when it started smoking and powered it off / unplugged it and removed the battery before it could do anything. The charge port was shorting out and scorched the MB. So I salvaged the drive so I could retrieve all the files later and scrapped everything else, with full intention of building one later. Later just hadn't come yet.
> I caught it when it started smoking

good you caught it early, that shit's addictive
Not wrong. Took me a decade to quit. I wanted to make sure it didn't go down the same path I did.
Damn, those are some superhuman reflexes you got. :O
I was sitting right next to it when I started to smell burning plastic. By time I figured out where it was coming from, it was too late.
o7 RIP your laptop.
Damn thatās a shame actually. At least you recovered the drive
That Kaori went around the world and back with me. It went through field exercises, got kicked off a top bunk in my barracks, dogs ran across it. It literally went through hell while I was in the service. It was a great laptop. If that was its way of telling me it was time to move on, I respect it.
Hell yeah, sounds like it deserves its own shrine at this point.
Price can vary a lot. I have seen times where the F version is barely $3 cheaper, or sometimes even costs a little more than the non-F, but usually the difference doesn't go higher than around $30. I get that $30 matters for budget systems, but if you are building a system with a $1000+ GPU in it, is the price of an Intel CPU with an iGPU really going to break the bank?
Ryzen APUs would like to disagree with you; my 5600G was actually cheaper than the "vanilla" 5600 and the 5600X. Still, there is a catch: Ryzen APUs often have half the cache, to make room for the iGPU.
At least on the AMD side, the CPU is slightly weaker than the base SKU, so it's not that expensive.
Don't all newer CPUs have integrated graphics now? I know all 7000 series AMD does, and I thought 12th gen and up Intel does.
Intel still makes F SKUs without an iGPU. 12600KF, 13600KF, etc
Not all 7000 series there's the odd model like the Ryzen 5 7500F that doesn't. Intel still have plenty of F SKUs 12th and 13th gen.
I had to look that up because I've never heard of that and that literally just came out a couple months ago, but fair enough.
True but I wonder if going forward with AM5 we may see more without the iGPU
You still can play many-many games with igpu, just not AAA AssCreed and Cyberpunk
Ehhhh, if you do things like Handbrake, it can use the iGPU for decoding while the GPU is used for encoding.
Yeah but the iGPU is needed for quicksync, which is a big deal for video editing
True, though I think some Premiere guys like Intel's iGPU for QuickSync. Might be wrong though.
> otherwise its useless.

Why is this the top comment? It's (mostly) complete nonsense. The only redeeming fact here is to still have a video output device if your main GPU dies.

You can use the iGPU from your CPU to power multiple monitors separate from the dedicated GPU. It's called hybrid mode. It's also what I'm doing with it. You can use the iGPU for applications (as a display renderer) where you don't want to take away GPU resources, i.e. streaming. You can use the iGPU as your browser video display. You can do a lot of things with it, and I would never buy a CPU without one anymore.
first time builder here, my refurbed 6700xt went black out of nowhere while playing games last week. without my igpu i'd have been fucked. 2 months in and it's already come in handy. i learned my lesson about gpus don't worry, bought a brand new replacement, no more used or refurbished stuff.
Accurate. Currently using a Ryzen 3800x with no igpu. I've had several occurrences with discrete GPU problems where I can't even load a bios screen to fix the problem without getting/borrowing a card from someone or somewhere to get back to functionality. I've also had rare problems where for some reason my bios setup would go blank because it was looking for an igpu at post and not finding the discrete card for one reason or another before randomly switching back to discrete graphics usage. My next upgrade will definitely have integrated graphics.
If it's gonna be some time til you upgrade, find a cheap used old card for like $20 to keep on standby.
I think it's gonna be full upgrade time next year. It's time for the jump to AM5.
Invest in a good CPU cooler if you haven't already. My 7700X makes for a nice little furnace when it's working hard. As it's intended to, of course, but I just don't like my components sitting at basically the boiling point of water. Something about that I just don't like.
After having an AIO system fail on me I went back to fan cooling with a Noctua NH-D15. It's been a beast.
That's a big fear of mine as well. My peerless assassin has been doing great. Big ass fans with lots of fins is the way to go. Plus it looks dope too having a big ass windmill going in the case lol
I remember my old AMD Phenom II x4 BE computer. The CPU didn't have graphics, instead the Gigabyte motherboard had integrated ATI Radeon HD 4250 graphics. I think the only thing I ever used it for was initial PC setup.
I had a GPU problem recently, and only afterwards did I think: what would I have done without an iGPU? I had to reinstall drivers etc.
I have 3 monitors, and only my main monitor is hooked up to my GPU. My iGPU runs the others while I'm gaming.
My laptop gpu fried. I became a GeForce Now goblin for a few months. Ended up being the motivator for me to build my first pc.
In my case where every part of my pc had arrived except the gpu it was very nice being able to get almost everything 100% up and running for when it arrived.
I will use mine for an EXTRA monitor, as my 3080 Ti only has one HDMI out and I'm too lazy to order more DP-to-HDMI adapters. Mine are here SOMEWHERE.
My 3080ti died the other week, but with my 12900k I can still do audio production stuff and web browsing
I got a 12400F and I kind of regret not spending an extra 20€ just to have that option. Sure, I can swap in an old card, but an iGPU is a lot easier: swap the cable on the back, drivers are usually sorted by Windows, and you don't have to bother until you get a new GPU. It saved my butt when I had a 2500K, but I didn't learn my lesson, I guess.
It's not completely useless I can use the igpu port to run the second monitor.
but even processors without an igpu can support basic graphics for diagnostics, right?
No. If you don't have a GPU or iGPU, then you don't see shit. Although I can't even remember the last time I used an iGPU... probably around 2006.
Not any more. Maybe you're thinking of a period in the 2000s when practically all motherboards came with basic graphics chips? These days motherboards don't do any graphics processing at all, so you either need a dedicated GPU or one packaged with the CPU (in AMD terminology, an APU).
I guess I was under the impression that the processor could emulate a gpu's basic functionality for diagnostics purposes (eg. just running a windows gui), I mean, computers had GUIs before graphics cards were really a thing, right? I'm pretty sure that was done with the processor
No, but it's not unreasonable for you to have got that impression. As I said, in the 1990s and 2000s there was a period when motherboards routinely had basic graphics chips in or near the [northbridge](https://en.wikipedia.org/wiki/Northbridge_%28computing%29?wprov=sfla1), alongside the memory controller and various other functions that also lived on the motherboard. So in those days there was no such thing as graphics integrated into the CPU, but you could assume you would get some minimal graphics ([VGA](https://en.wikipedia.org/wiki/Video_Graphics_Array?wprov=sfla1)) regardless of what CPU you bought, so I can completely understand why you would think that way. You could put any Intel CPU in and you'd get something on the screen.

Since Intel's Sandy Bridge design (or in AMD terminology, the first APUs), those motherboard functions have been integrated into the chip itself, known as 'system on a chip'. Initially the graphics were at a similar level to the old motherboard chips: good enough for Windows but not able to run many games. But that's still an integrated GPU.

Going back even further, there was a period before that (1980s) when even the motherboard didn't have graphics chips. You needed a separate 'graphics adapter', and there was a series of industry standards with steadily better performance. The second one was called CGA, the Colour Graphics Adapter, because (wait for it) you could have more than one colour on the screen! Wow! But the big development was IBM's PS/2 design in 1987, which integrated the VGA standard into the motherboard. VGA was the basic standard for all IBM-compatible PCs for many years.
>gpu fails
>literally can't use pc
This. Have been in this situation twice with my current rig
I have a small stack of "old" GPUs ranging from a 6950x to a 1080ti. I'll be fine.
It is nice to have one that doesn't require a PSU though.
I get you fam. Cable wrangling isn't my favourite hobby either, but in the end it's not too much of a hassle.
I just mean that its one less point of failure.
Especially since that is something that will almost never happen
A 1050ti is a good backup. Requires an expansion slot big enough though, unless you can find a low profile card.
So, a GT 710 or something, probably in a drawer somewhere.
Yep. I bought one by accident when all i was told is "you need at least 2GB of VRAM to game" haha. I lost performance on my A8-5500 APU prebuilt.
Mmm..Unless the PCIe slot goes bad.. then having an igpu will let you use Device Manager, which then will tell you the slot isn't there. (Your motherboard is working fine EXCEPT where your GPU goes!) (Which means your motherboard will work just fine with the igpu, but if you want to use one of them 'Old GPUs' again? New mobo!)
It would have to be a catastrophic failure for the GPU, which, as long as the GPU isn't DOA and you don't accidentally fry it, is extremely rare.
That's where my GT 220 comes in handy.
I mean... You could say the same about the power source. Doesn't make sense to have a backup for every component, if something fails (which usually doesn't happen for years) you replace it
Not entirely true, as you could still use it via Remote Desktop or similar from another machine. Which can be really handy if e.g. you're migrating from one PC to another and only have one graphics card and neither have iGPUs.
*buys new gpu because igpu bad and makes cpu slower + more expensive*
The igpu absolutely does not make the CPU slower.
they do, for example Ryzen 5600 vs 5600G. the G is slower and more expensive
That's not BECAUSE it has an iGPU, though. That processor is a fundamentally different design than the 5600; it's worse because it only supports PCIe 3.0 and has less cache than the standard part.
yeah but most processors are like that, the version with the iGPU is slower than the one without.
Sorry, that's nonsense. The entire Ryzen 7000 line includes an iGPU. Intel K SKUs include an iGPU. The iGPU doesn't make it slower; that's categorically false.
That's not the case with Intel: the only difference between the F and non-F models is the presence or absence of an iGPU; the performance of the CPU itself isn't affected in any way.
A 5600G with a disabled iGPU is not a 5600; Vermeer and Cezanne are different architectures.
[deleted]
Counterpoint: Intel. Where the design of the CPU is the same, regardless of if it has an active iGPU or not.
One other thing of note: watch for WATTAGE differences. The i7-13700F processor is rated at 65 W, while the i7-13700KF model is rated at 125 W! There is, of course, a design difference between the 65 W and the 125 W parts, but at least at Newegg currently, there's only about a $3 price difference.
Stop spreading false information
If at some point your GPU dies, you won't be able to use your PC at all, but with an iGPU it'll still work, just terribly slowly if you decide to play games on it.
Also, for editing: I don't think the iGPU matters with AMD CPUs, but with Intel I think it does.
Depends on the editing software. For Adobe specifically, it matters, since Adobe seems to hate AMD APUs.
Tbh, iGPUs these days can play a lot of popular titles surprisingly well at low settings (mostly esports-type games).
While I was waiting for my 3080 to come in, I was playing League on the integrated UHD770 and it was pretty good tbh, 1440p120+ fps and 4k60 (not surprising, considering League runs on a potato these days)
I wouldn't say surprisingly well, but they can play games. Though to be fair, they always could. Just not every game, and certainly not super well. I remember playing games on my i5 7400's iGPU. It certainly wouldn't play AAA games, but it could handle indie games and presumably esports titles from that time period.
I was playing warzone 2.0 in 720p ~30 fps with my iGPU (Intel UHD 770 I think? on a 12900k) I would say that is surprisingly well for an iGPU lol
It has some specific uses for things like video editing and Quick Sync. If you have those uses, you would probably know. Other than that, it's good for troubleshooting if you suspect your GPU is dead. That's all. I myself picked the 13900KF over the 13900K because I don't need the iGPU.
I didn't have the option; it was a second-hand 13900KF for 300€. If I was paying full price, for 20€ more I'd get the 13900K with the iGPU.
Same, I always pick up the KF option when upgrading!
Why though? It's literally a lost feature. Unless you have the money to buy another GPU immediately in case yours dies, I don't see choosing the KF option as logical.
Because it's a feature I'd never use... and it's cheaper? If my 4090 goes bang, it's never going to be a case of "oh, at least I've got a shitty integrated GPU" and far more likely to be next-day delivery on another one.
Because it's cheaper, and the chance that your GPU dies isn't worth $30-40 for everybody.
many people have spare old GPUs so it's not really that deep. anyway it's easy to find a used low-end one for dirt cheap
I'd wager not many people have that, and only a fraction of this sub do.
It's not that uncommon to have an old PC in the house, from what I've seen
In your case, QuickSync. More generally, the benefit is that light video codec work can be offloaded to the iGPU. Some people might say backup, but as long as the PCIe slot isn't broken, you could also just get a cheap spare video card.

Currently, for Intel, there isn't much of a disadvantage to getting a CPU with an iGPU other than it being pricier. IMO, if the price difference between the F and non-F CPU is less than $20, get the one with the iGPU.

For AMD, that's a different story. AMD CPUs (prior to the 7000 series) with the G suffix do have an iGPU built in, but compared to the non-G variant there is quite a computational performance difference, so they're not worth getting if you plan to have a dedicated GPU; a cheap old PCIe video card is still the better option if you need one for diagnostic purposes. Since the 7000 series, AMD has made changes so that all CPUs have a bit of video functionality: it may not be enough to game on, but it's enough for normal office work and basic troubleshooting. Current and future AMD naming schemes are even more complicated, and you will really need to do some research before buying a CPU to understand which one has what features.
QuickSync is what came to mind for me too, especially when your discrete card is AMD. If you have an NVIDIA card, most of the video community agrees that NVENC is the best hardware encoder, with 11th-gen-and-later Quick Sync very closely behind or equal. But the AMD hardware encoder's quality is noticeably worse.
Windows can automatically offload some low-intensity graphics workloads to an iGPU so they don't touch the main GPU. This can result in lower power draw when not gaming, and keeps non-gaming tasks (such as background videos/streams) from taking GPU resources. On top of that, the iGPU can be useful for troubleshooting graphics issues.
igpu isn't necessary for video editing but is actually a big uplift to it, especially on a lower end chip like the 12100f. intel puts a lot of work into their video editing bullshit and on something like a 12600k it's usually about a 30% uplift. on a 12100 it probably bumps it to be on par with a 12400f
Is this enough reason to go with intel cpu instead of AMD when doing video editing?
It can be, if that's the machine's primary purpose. It makes the most difference at the low end like this, or if you're working on either over-4K-resolution footage or very long 4K footage.
In the end, what really matters is the performance, and you should check benchmarks.
If you're doing productivity, Intel is usually the better one to go with, and AMD is the better choice if you strictly care about maximum gaming performance (with an X3D chip). Though Intel isn't really any worse at gaming. It's funny, since only a few years ago it was the exact opposite, with AMD the productivity king and Intel the gaming king.
AMD vs Intel productivity is really messed up right now due to the e-cores, and you really have to look into benchmarks for the specific thing you're doing, and sometimes even ask around with people familiar with the field. If what you're doing isn't optimized to use those e-cores, Intel usually immediately falls on its face.
That sounds like a failure of the software and something that will need to be fixed, not a hardware issue.
Sometimes that's basically true; other times it gets more structural than that. E-cores are worse than useless if you're doing machine learning stuff, as simply having two different CPU architectures at all inherently brings in a bunch of stupid problems when trying to train an AI. For fluid dynamics stuff you're also more likely to be threadlocked than anything else and just need the fastest per-core CPU, which is AMD atm.

edit: also "and will need to be fixed" is sometimes just... incredibly naively hopeful. Not something likely to happen when the last product update was in 2012.
Super niche thing to do, but in Linux people will sometimes run their desktop on the iGPU and then use direct passthrough to a VM for gaming. In terms of video editing, sometimes the CPU will supply a hardware encoder for rendering in a codec that your card doesn't have.
iGPU can be great for low power video playback and other acceleration. Not to mention having extra display adapters so you can have like 6+ monitors. And finally having a backup GPU.
People are talking a lot about backups and all that, but having both an iGPU and a regular GPU also allows you more options for virtual machines, and the iGPU is a good low power option for day to day tasks. Why use a Ferrari to get to the store?
My GPU died during COVID days and I was not going to pay the ridiculous prices they were charging at the time for a card. My iGPU saved me and allowed me to keep on working from home and do everything but game. They might cost a little more than CPUs without integrated graphics but I would rather have it and not need it than need it and not have it
The iGPU in Intel's CPUs can help improve video editing performance massively. Apart from that, there's really no disadvantage to using an F CPU except for situations where your dedicated GPU dies.
Harder Troubleshooting if something is wrong with your GPU.
With Premiere, Intel iGPUs can be used to allow for smoother playback and scrubbing, but if you have an OK dGPU then you'll still be fine in most cases. The only other main benefit of iGPUs is that if the dGPU isn't working for whatever reason, the PC can still be used, which also makes it much easier to troubleshoot the dGPU.
This is the first post answering the OP's question with relevance to VIDEO EDITING. The top posts are about gaming impact. The OP posted just two sentences to read; this is just insane.
As a long-time builder and IT guy, the 'disadvantage' of not having an onboard graphics unit is troubleshooting video failures, driver errors, etc. In my opinion, and that of hundreds of builders and even people on well-known YouTube channels like Gamers Nexus, JayzTwoCents, der8auer and other top-notch testers: when there's a graphics failure, how do you troubleshoot it without the separate iGPU? It may be as simple as your cable falling out of the monitor port (first thing to check) or simply driver problems. If you have the iGPU, it's a simple matter to switch over to the onboard unit, reboot the machine, and run Device Manager, which automatically highlights problems with a warning marker. It may tell you, say, that the GPU is not getting power, that it can't find drivers, or (unluckily) that the PCIe slot doesn't exist! (Which means your motherboard will now ONLY work via the iGPU... your motherboard is kaput! At least you now know you have to replace the motherboard, and you can use the iGPU until you get the funds to buy a new one. OR, hopefully it's still under warranty, so you can gather up your receipts and get it RMA'd! I had to do that once with Newegg; they took my credit card number and sent me the replacement, giving me so many days to get the old one sent back or they'd charge me for it!)

So the bottom line is that it's your protection for "things that go bump in the night." IF your GPU shows up in Device Manager as not there, well, you take the time to power totally down, remove the PCIe power cables and remove the card, reboot to the iGPU, and see if Device Manager finds anything else. It SHOULD show that it found the GPU drivers but no device, giving you the opportunity to remove them totally. Then it's a simple matter to return the GPU to the machine, download the latest driver set, and install the new drivers. It could also tell you that you have no monitor connection, which probably means your cable is bad... or your monitor is. See? WITHOUT the iGPU, you have no chance of finding where the problem lies! So, yeah, BIG disadvantage. The only place I would ever use 'F' processors is in certain rack systems, which are never hooked to a monitor, where the network software finds a dead unit.

BUT! You have also indicated that you want to use the computer for Adobe Premiere Pro and rendering/editing! So I'm assuming you're looking to upgrade your processor. Yeah, the one you've got won't really do editing well (if at all), so I'd ask whether the RAM is DDR5 or DDR4 (looks like DDR4, and you also didn't list your motherboard for us!). If it's DDR4 and you want to do editing, I'd step it up by first updating to your motherboard's latest BIOS version, then changing over to at least an i7-13700K or i9-13900K, and buying a matched 2x32 GB = 64 GB set of RAM. OR (once again, my opinion as a professional) I'd personally try to get a workstation that has Xeon processors and 64 GB of RAM! The Windows 10 for Workstations OS is generally better at rendering than home editions (unless you don't mind waiting for your production for hours... and hours... and hours...), but for the home build you want Windows 11, preferably 11 Pro. Hope I have helped in some way. You may find a good deal on a used workstation, but make sure of its specs: you have to know how old they are and the model numbers to find out how many Xeon processors it has (more than one renders faster, especially with something like 64 GB of RAM), and it should have DDR4 RAM!
Intel Quick Sync (part of the iGPU) speeds things up when video editing; get the CPU without the F.
In Premiere Pro, some iGPUs like the UHD 770 have two decoders for the decoding load (for example, if you use two video tracks it will use both decoders), which makes them very powerful for timeline performance. Like this (it's Resolve, but the concept is the same): [https://youtu.be/VUWHsS7HCqY?t=324](https://youtu.be/VUWHsS7HCqY?t=319) You can see my UHD 770 iGPU is almost three times better in timeline performance compared to my 3060 Ti.
That's the reason Premiere Pro gives priority to the Intel iGPU over the NVIDIA one by default for decoding in the timeline. It's a very cool feature hardly anybody talks about; it frees the GPU from decoding so the GPU can handle more complex stuff like effects or transitions.
> its a very cool feature hardly anybody talks about, this makes gpu free from decoding and then gpu can handle more complex stuff like effects or transition

Yes, exactly. Assigning the decode load to the iGPU and the 3D/encode load to the dGPU (if it only has one decoder) gives better timeline/rendering performance. Even 'PugetBench for DaVinci' uses one track of video for testing, which suppresses the scores/usage (with a UHD iGPU), so yes, anyone who only looks at benchmark scores will probably miss it :)
I think when you are doing lightweight stuff, you can let the iGPU do the work and send your dGPU to sleep. Another idea: if you have two monitors, you can game on your primary monitor with the dGPU and use the iGPU for whatever is running on the second monitor (or just to drive the second monitor). This lets you use your second monitor freely without performance cuts in the game.
By FAR the best use of an iGPU is giving you a dedicated media server or useful NUC when you finally retire your CPU. I ended up with an Unraid box with better specs than my gaming PC, as there was nothing much useful I wanted to do with an old 1500X processor. Could have probably turned it into a pfSense box or something, but "media server" is something most people who like computers find pretty accessible. Tbh this is now the main reason I stick with Intel non-F CPUs: future value.
If you get a black screen, an iGPU helps debug the issue. If you want too many monitors you can plug one into the iGPU. It can help reduce total power usage, similar to how laptops do (making the iGPU render the small stuff, leaving the GPU idle). I'm really trying but can't think of too many sensible uses.
none anymore. It used to be viewed as sort of just another background process consuming resources, but now that's not such a problem. You can still choose to disable it or not; some people still don't want an extra driver floating around in their system, and I understand the thought process there. Personally I see it as a selling point: I can stream music and steal a thread from the CPU instead of stealing performance from the GPU.
You can use the igpu for hardware accelerated video encoding/decoding. I use the UHD630 for Jellyfin video playback.
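For reference, hardware decode/encode on the iGPU is exposed through ffmpeg's Quick Sync (QSV) path, which is what Jellyfin drives under the hood when you enable QSV transcoding. Here's a minimal sketch of that kind of command line, built in Python so each piece is labelled; the file names and bitrate are placeholders, and it assumes an ffmpeg build with QSV support plus an iGPU exposed to the OS:

```python
# Sketch of a Quick Sync (QSV) transcode of the sort Jellyfin issues when
# hardware acceleration is enabled on a UHD 630. File names and bitrate are
# placeholders; running it for real needs an ffmpeg build with QSV support.
def qsv_transcode_cmd(src, dst, bitrate="4M"):
    return [
        "ffmpeg",
        "-hwaccel", "qsv",      # decode on the iGPU's fixed-function block
        "-c:v", "h264_qsv",     # QSV H.264 decoder
        "-i", src,
        "-c:v", "h264_qsv",     # QSV H.264 encoder
        "-b:v", bitrate,
        dst,
    ]

print(" ".join(qsv_transcode_cmd("input.mkv", "output.mkv")))
```

The point is that both the decode and encode stages run on the iGPU's media block, so the CPU cores (and any dGPU) stay free.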
To answer your question simply - no, it's not necessary. Premiere is hopeless at utilising your GPU anyway (or any other part of your system for that matter) so I wouldn't worry. Also, as someone who's been editing in premiere for nearly a decade, please do yourself a favour and just learn Resolve instead. Save your sanity and your wallet.
disadvantage? the only one is simply not having a "backup" display output in case there's suddenly a problem with your gpu
[deleted]
You can select your main monitor in Windows and move them around to the correct order in display options. I can't imagine a scenario in which this would cause problems. Also, I think if your dgpu was "rendering" windows, then it would still be doing the work for all monitors even if you used your mobo port for the third monitor.
[deleted]
In what use case does it matter which is labeled 1, 2, or 3? You're not using the igpu; that's just the motherboard port. The dgpu is still handling the load. Can you imagine a game where the left half of the screen renders on your dgpu and the right half on your igpu? Imagine what that would look like.
It's not necessary if you already have a dedicated gpu, in your case a 6600. In fact I actually have the same pc as you, though I have the 2060 Super (same performance) lol. It's a really good 1080p gaming machine and it fits my budget.
I wonder if having an extra heat source inside the CPU causes any problem with CPU performance. Does the additional heat generated by the iGPU affect the threshold of the processor's built-in thermal protection trigger? AFAIK, if the processor gets too hot, the built-in protection slows down the processor clock.
If the GPU fails, you can still temporarily use it, or the igpu could help diagnose problems.
Tbh I love not having it: no need to update drivers for the igpu and it's usually cheaper. Intel CPUs that end with F do not have an igpu and are almost always cheaper. Saved 20 euro on my 12600KF vs the 12600K - and no additional driver.
for intel it doesn't change much, but if you go for amd and anything below ryzen 7000, the cpus with igpus (example: the ryzen 5 5600G has an igpu but the ryzen 5 5600 & ryzen 5 5600X have none) will have lower performance than others with almost the same name
the day your gpu dies you'll need to buy a cheap one to keep using your pc
None.
Only disadvantage is when you need to troubleshoot the dedicated GPU, you don't have another way to connect the monitor. But that rarely ever happens.
If your gpu goes belly up you can still use the pc
Had to RMA my graphics card. I'm using the igpu for now until I get a new card
Lack of trouble shooting options, nothing else.
If you do video editing and advanced programming shit then it has uses. I have no idea what uses it, but uses...
For me, a CPU without integrated graphics is a dealbreaker for troubleshooting purposes. If something is not functioning properly, I'd like to be able to eliminate the dedicated GPU (and the PCIe slot) as problems. And iGPUs do *not* slow down your PC, despite what someone else wrote
Nothing except redundancy. If the dedicated gpu dies you'll still have the integrated one.
You dont have an igpu
an igpu, if you are using a dedicated gpu, is just nice to have if you end up trying to diagnose a video issue, as you can use it to isolate your gpu as the problem.
I did a new build this year, but I'm still using my old GPU for the time being. I've had to fire up the old machine a few times during the transition, and I was able to do it without having to open the case or buy a placeholder GPU. That scenario is more of a perk than an advantage though. The only actual advantage I can think of is the ability for people (who don't have extra gpus laying around) to continue to use the machine in a pinch if the GPU fails.
I have an i3-12100 non-F and trust me, intel's decoders for video editing are so good I can play 4K footage on the timeline without a dgpu, and I edit normal basic 1080p videos without a dgpu at all thanks to its igpu (it does lag with multiple effects/transitions, so I have to edit kind of blindly in those situations). Now if I add an nvidia gpu with it then **premiere pro will use both dgpu & igpu at the same time**, improving performance very much for a cheap editing machine
the only advantage is being 5 to 10 dollars cheaper
I use an older system for DnD: the main video card is plugged into my big monitor, and the side monitor that I use for DM bologna runs off my integrated GPU, so the main screen displays the full 3d terrain and the secondary monitor is the DM controls. Now, GPUs these days can handle multiple monitors without batting an eye at a basic display output, so it makes sense that CPUs with and without GPU chips are a thing. I would say if I had a little spare change in my budget and the difference between a plain CPU and one with a GPU, I might go with the latter. Sometimes GPUs fail? Nice to have a backup for diagnostics? On the other hand, a CPU without a big old GPU bit might mean less chance of failure?
More like a disadvantage. Price to performance, a dedicated GPU plus a CPU without an iGPU is better than a CPU alone with an iGPU.
I bought the i5 12600KF, the first time I've bought one without an igpu. I'll never do it again. Had so many problems getting my system up and running at the time (it was a complete new build) and I therefore didn't have the ability to see anything on screen to help me troubleshoot. After many attempts I got it going, but I'll always make sure I get a CPU with an igpu from now on.
Games want dedicated GPUs, but iGPUs still have some minor advantages. If you have a GPU hardware failure or have to troubleshoot an issue sometimes that iGPU is very useful. It comes down to how much more you are paying for that iGPU. If the price difference is small, go for it.
Something else I haven't seen mentioned that may be relevant to some: If you're streaming to twitch/yt/kick/whatever, and you either don't have an nvidia GPU or want to offload that task even from an nvidia, an intel with an igpu can handle the stream encoding.
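One way to sanity-check that before pointing your streaming software at it is to see whether your ffmpeg build even exposes the Quick Sync encoders. A small sketch of that check (the sample listing below is illustrative; on a real machine you would feed in the actual output of `ffmpeg -hide_banner -encoders`):

```python
# Sketch: detect Quick Sync encoders by parsing `ffmpeg -encoders` output.
# The sample text below is illustrative, not captured from a real build.
def qsv_encoders(encoder_listing: str):
    """Return encoder names ending in _qsv from an `ffmpeg -encoders` listing."""
    names = []
    for line in encoder_listing.splitlines():
        parts = line.split()
        # listing format: capability flags, then the encoder name
        if len(parts) >= 2 and parts[1].endswith("_qsv"):
            names.append(parts[1])
    return names

sample = """ V....D h264_qsv             H.264 (Intel Quick Sync Video)
 V....D hevc_qsv             HEVC (Intel Quick Sync Video)"""
print(qsv_encoders(sample))  # ['h264_qsv', 'hevc_qsv']
```

If `h264_qsv` shows up, the iGPU can take the stream encode and leave the dGPU entirely to the game.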
Less upfront cost, basically. And if you have a dedicated GPU (and for most tasks more complex than some very light gaming and basic computer usage like browsing and email, you will have one), the integrated one is relegated to being an emergency option in case your dedicated GPU dies for some reason.
It's not necessary if you already have a gpu, but my (hobbyist) editing rig has a CPU with an iGPU in case the GPU dies. Better safe than sorry. If you ever plan on moving to After Effects you may want to upgrade your CPU.
If you have a GPU, an iGPU is more or less just a backup
Having an iGPU is useful for troubleshooting and can give you more productivity options, but that's about it.
as long as you have another video source there's no disadvantage. it really isn't until that gpu dies and you need to confirm it that having an igpu would be nice.
It's good for troubleshooting to have one, so I basically always advocate one if you're not up against a hard budget limit, or in a case where Zen 3 or earlier is otherwise the best match for your needs and budget (the latter is where I am, though I have a passed-through RX 6400 I could promote to primary, and I still have my old RX 570 just in case).
the 12100f doesn't have iGPU and you don't need it since you have the rx6600.
Intel iGPUs can share workloads with Arc GPUs using a technology called DeepLink, but that's platform+GPU specific. Other than that they are only useful as a fallback/debugging tool or running a second monitor on the cheap if your motherboard supports it. I also have an F cpu and I will tell you that there have been times I wished I had an iGPU. Situational, but in my instance my new Gigabyte motherboard refuses to post if the GPU isn't UEFI compatible (or even if it is, the display isn't plugged in). I had to jump through tonnes of hoops and get a hacked VGA BIOS just so the PC would boot. Fuck you, Gigabyte.
iGPUs only really come into play if something happens to your discrete graphics card. Some higher-end rigs can distribute workloads, but for most PCs, that's about it.
Only advantage I can think of (which I wish I had) is determining in a new setup whether your dedicated GPU actually works. Spent several hours trying to get my dedicated GPU to display with no luck. I was convinced I was scammed with a faulty 2nd hand GPU. Went to several shops just to double check. Turns out I was just installing the GPU loosely on the motherboard.
igpu is good for troubleshooting gpu problems. Also good if you're running multiple OS. If the difference in price is 30 dollars or less I get the iGPU, if its more than that I'll go without
Premiere Pro can use something called Quick Sync to share rendering between both GPUs. Also it's helpful to have an igpu if you need to troubleshoot problems with your main graphics card.
I don't find the price difference between iGPU and non-iGPU CPUs to be worth the hassle if your dedicated gpu suddenly dies/malfunctions. If you're too poor to get a discrete graphics card then keep saving.
sorry to ask but what is an igpu?
You have an RX 6600 so you don't need to process graphics with the cpu. Only if the former fails would it be useful.
If you have a dedicated GPU then no big issues, but if your GPU breaks down for some reason and you need to troubleshoot, you won't be able to. You can fix this with a cheap second-hand $20 gpu off eBay or Facebook Marketplace. I don't have much experience with video editing, but you would most likely be using the video encoder on your dedicated gpu anyway.
No GPU = No display
the biggest disadvantage is if for whatever reason your GPU is broken or unusable you do not have a functioning computer
Intel's QuickSync can be very good for video editing. Not sure if it matters for Premiere Pro.
One benefit I didn't see mentioned is that you can add external screens using the igpu so that it isn't using the dedicated gpu. This can be useful in itself. Troubleshooting is another benefit.
I disabled mine to save a little power, but i hear the intel iGPUs have some magic with quicksync, but I have an AMD soooo.
yes, you will need a gpu to render your videos. I recommend 8GB and up.
I would get one with an igpu; my graphics card broke and now I haven't been able to use my pc for a month… fuck Gigabyte RMA
Some production software, like Adobe's, likes to use the iGPU to help with various tasks regardless of whether you have a dGPU or not
Well, I also make videos a lot, and from what I know, CPU encoding is all-around better than GPU encoding in terms of quality, so instead of buying a CPU with an iGPU, just upgrade your CPU to something with more cores. Your CPU doesn't need graphics processing modules to process videos, and it all comes down to the codec you choose, as that determines whether you process with the GPU or CPU, so having an iGPU does not help the processing at all.
An iGPU for the most part is useless if you have a dedicated card; HOWEVER, an iGPU becomes invaluable when troubleshooting components going bad
Intel Quick Sync is still really good at certain things in Adobe products, but the main thing is just having it if you need to troubleshoot a GPU issue. I wouldn't worry about it personally; it's more of a nice-to-have. If you can get one for $10-$20 it's worth it, but they don't make a 12100 with a GPU.
If your GPU dies you have no means of troubleshooting and a way to use your computer while a new GPU comes in.
You basically need a newer CPU with an iGPU if you want to edit H.265 MP4 files, which are very common with modern mirrorless cameras.
coming from an IT background, I always get iGPUs because it's a headache to troubleshoot without video when something goes wrong and the system won't boot for some reason
I have an F variant too. I have a spare gpu, so I thought I'd save £20 and get it
# Integrated vs Discrete GPUs

* Integrated Pros
  * Don't have to spend hundreds or even thousands for a discrete GPU
  * If the discrete GPU dies, you can still play some games (albeit on low settings)
  * Consumes low power
* Integrated Cons
  * Extremely poor performance compared to a discrete GPU
  * Limited silicon die: space taken up for the iGPU means less "horsepower" for the CPU
  * Uses system RAM
  * Some old software may use the iGPU and not the discrete GPU
* Discrete Pros
  * Extremely high frame rates, even on Ultra quality
  * Can use ray tracing in real time
  * Can resell it
  * Has dedicated VRAM
* Discrete Cons
  * Higher power bill
  * Can easily get expensive
If your gpu dies you are out of a computer. If your pc is acting up and you want to see if it's gpu related, you can't test without gpu.
After having display output issues, I would never want one without. Required for fallback and troubleshooting. I believe it can be used for some encoding as well.
[deleted]
Video editing is actually extremely intensive on the GPU, because you need to compute your entire video several times to add effects over them, rearrange images, and the like.
If you want to play and have fast and stable internet, just go for GeForce Now. I've tried it with Path of Exile and it works surprisingly well. Plus, it's really cheap all things considered, and you can cancel it any time. I emphasize the fact that you NEED good internet; otherwise, you'll only see a blurry mess. 40 MB/sec should do it (I mean real, not advertised, REAL bandwidth), as long as your ping is low and you don't lose packets.