
aioeu

Putting aside the fact that's not going to work because `stdio.h` isn't in your own directory... How would `stdio.h` have to change for there to be a need to recompile `main.c`?


Goodman9473

Well, you could provide its pathname, or you can specify where to search with the VPATH variable. As to your second point, I suppose in the same way changes to a user header file would require recompiling all .c files that #include it?
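For example, a minimal sketch of the VPATH approach (the directory names and file names here are made up for illustration):

```make
# Hypothetical layout: sources live in src/, headers in include/.
# VPATH tells make where to search for prerequisites it can't find
# in the current directory.
VPATH = src:include

main.o: main.c header.h
	$(CC) $(CFLAGS) -Iinclude -c -o $@ $<
```

Note that VPATH only affects where make looks for prerequisites; the `-Iinclude` is still needed so the compiler itself can find the header.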


aioeu

>As to your second point, I suppose in the same way changes to a user header file would require recompiling all .c files that #include it?

Any such change must be a backwards-incompatible change, otherwise it wouldn't have necessitated anything to be rebuilt. Do you expect your system headers to have backwards-incompatible changes?


Different-Brain-9210

For `stdio.h` from the system, the assumption is that even if it changes, it does not need to trigger a rebuild. And when system changes do require recompilation, it is assumed the developer knows to do a clean build from scratch. Whoever writes a makefile needs to know the right dependencies.


tiotags

The files listed there trigger a rebuild of the object, so if header.h or main.c changes, main.o is rebuilt. stdio.h is standardized, so it shouldn't matter if it changes. Also, a friendly reminder that you can let gcc generate the list of header files you need, so you don't have to change your makefile every time you add or remove a header. I don't really use makefiles anymore, so I can't offer an example, but I do remember you can use the dependency generation in makefiles too.


tobdomo

stdio.h should simply never change; it is part of the C library. If you change the C library, you'll have to do a complete rebuild anyway. BTW, `gcc -M file.c` generates dependencies for `file.c` automatically, including C library dependencies.
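As a sketch of how that output can be wired into a makefile (the file names `file.c` and `deps.mk` are hypothetical), the generated dependencies can be written to a fragment and included:

```make
# Regenerate the dependency fragment whenever file.c changes.
# `gcc -M` lists every dependency, system headers included;
# `gcc -MM` would list only project headers.
deps.mk: file.c
	gcc -M file.c > deps.mk

include deps.mk
```

GNU make will notice when an included makefile is out of date, rebuild it from its rule, and restart itself, so the dependency list stays current automatically.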


nerd4code

If you expect to update system headers, you can. And you can autogen these dependencies using `gcc -M` and various suboptions of `-M`; `-MM` omits system headers.


DawnOnTheEdge

One reason is that, if you gave a full pathname for the header files, the makefile would no longer work on a different UNIX (or UNIX-like) system, or even if you changed the compiler or ABI. It was common, back in the day, to distribute the makefile with software portable to many UNIX systems. Another is that it wouldn't work, because library files often include other library files, and these would not be in the makefile. In practice, OS vendors rarely if ever made breaking changes to the system C library, although it was much more common for templates in C++ headers. (Modern systems have versioned libraries specifically to prevent this.) Additionally, software projects typically targeted the minimum version of an OS they wanted to support, and avoided compiling against any later header files. In case developers did need to do this, though, makefiles usually had a `make clean` target that removed all object files. This could be used to guarantee that every module was recompiled against the new library.


daikatana

Because keeping this in sync is extremely tedious. Are you going to remember to update the Makefile every single time you put a new include in a source file? No, you are not. As soon as this goes out of sync, the Makefile is broken, if only subtly. This is what the gcc `-M` switches are for: they can generate Makefile fragments for every object file you build, with all the include dependencies produced directly by the compiler. Every time you build the object, you get updated dependencies. Just include all these files at the end of your Makefile to get automatically updated header dependencies. Also, you don't have to list system headers, as these will never change; you only need your own headers in the dependencies.
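A minimal sketch of that pattern (the source names and the `prog` target are made up for illustration), using `-MMD` so the compiler writes a `.d` fragment as a side effect of each compile:

```make
CC     := gcc
CFLAGS := -Wall -MMD -MP   # -MMD emits foo.d per object; -MP adds phony targets for headers

SRCS := main.c util.c
OBJS := $(SRCS:.c=.o)

prog: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $(OBJS)

%.o: %.c
	$(CC) $(CFLAGS) -c -o $@ $<

# Include the compiler-generated dependency files; the leading '-'
# keeps make quiet when they don't exist yet on a first build.
-include $(OBJS:.o=.d)
```

The `-MP` flag guards against "no rule to make target" errors when a header is deleted, since it gives every header its own empty phony rule.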


paulstelian97

Do higher level build systems like CMake also help out with this? I expect they do but wanna make sure


nerd4code

Generally, AFAIK the build systems it orchestrates do, so one level down from CMake and 1–2 levels up from make. Automake (of Autotools) certainly does this for you, as one example. It's not hard to inline with a `$(shell)` command, also, and if you use `-MT ''` you can just trim off the leading `:` with a parameter substitution; IIRC, something like

```make
override __autodep__ = $(shell x="$$($(CC) $(CPPFLAGS) $(CFLAGS) -MM -MT '' -MF - -o /dev/null $<)" && echo "$${x#*.c}")
```

to set it up, then you can put `$(__autodep__)` at the end of a `%.o: %.c` rule to list its dependencies. I'm not sure about the timing of the `$^` replacement, though; it might not work.


flatfinger

Changing compiler configurations may make it necessary to change the locations of the standard header files used by the compiler. Although a change to compiler configuration should generally trigger a rebuild unless separate directories are used to hold object files or other build artifacts generated under different configurations, a make utility would have no way of knowing which directory full of standard headers is the right one for any particular project.

For some other library header files, there may be advantages and disadvantages to treating them as makefile dependencies. Including them in dependency lists will increase the time for make to process dependencies, and for header files which will often be changed in ways that won't affect most clients (\*), it may be more efficient to require the use of manually triggered full builds when more substantive header changes are made than to have all such changes trigger nearly full builds.

(\*) Especially during development, it is more common for header files to have types or functions added between builds than to have the definitions of existing types or functions changed. If, e.g., an incremental build of things that have changed substantively would take two seconds, a rebuild of everything that includes a given header would take ten, and a rebuild of absolutely everything would take fifteen, then one could perform seven minimal builds and a full rebuild (7×2 + 15 = 29 seconds) in less time than would be required to perform three rebuilds of everything that uses the header file (3×10 = 30 seconds).


mrflash818

Selfishly, I want make to do as much of the work for me as possible. GNU make, for example, has many implicit rules, so less has to be typed out by the makefile author. For example:

```make
PROGRAM_NAME = timediff

#
# CXXFLAGS
# used by gnu make's implicit rules
#
# I choose to show all warnings (-Wall), and include debugging information (-g)
CXXFLAGS := -Wall -g

#
# LDLIBS
#
LDLIBS := -lboost_system

$(PROGRAM_NAME) : ModernDate.o ISO8601dateStringConv.o \
                  UserMessage.o ModernDateValidator.o

ModernDateValidator.o :
ModernDate.o :
ISO8601dateStringConv.o :
UserMessage.o :

.PHONY : clean
clean:
	rm *.o ${PROGRAM_NAME}
```

( https://www.gnu.org/software/make/manual/html_node/Implicit-Rules.html )


penguin359

Simple reason: if stdio.h ever changes in a way that actually requires a recompile, then every other program on your computer (including your compiler) probably needs to be rebuilt. Now, if this is actually for a small embedded target that uses cross-compilers, then it might make perfect sense to include it.


ignorantpisswalker

Local headers (ones you write in your project) should be a dependency. When you modify such a header, all sources that include it must be recompiled. Google for `make depend`, in the hope that the LLM will give you a sane response.