
Why are makefiles still used?

 So I have been coming across many comments/posts/etc. regarding writing makefiles directly, and how it is a silly thing to do in 2015. I am aware of tools such as CMake, and I actually use CMake quite often. The thing is, CMake is just creating the Makefile for you and helping to remove the tedium of doing it yourself. Of course it adds a lot of other great features... but it's still a Makefile in the end.

So my question is, is the 'obsolete' talk regarding make referring to the entire Make utility, or just the idea of manually writing your own Makefiles? I do not use an IDE for C/C++ development at all (just emacs), so I have always written Makefiles.

If Make is considered outdated, what should a C/C++ dev be using to build small, personal projects?
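
For concreteness, a hand-written Makefile for a small personal project can be as short as this sketch (file names are illustrative; recipe lines must be indented with a tab, and the `%.o: %.c` pattern rule is GNU make syntax):

```make
# Minimal hand-written Makefile for a small C project.
# main.c and util.c are hypothetical source files.
CC     = cc
CFLAGS = -g -Wall -Wextra
OBJS   = main.o util.o

app: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $(OBJS)

%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@

clean:
	rm -f app $(OBJS)

.PHONY: clean
```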

  • 8
    For small, personal projects a make without a Makefile is sufficient. Why the hassle? – ott-- Apr 11 '15 at 0:42
  • 8
    Every piece of available software on my FreeBSD systems, both desktop and servers, has a Makefile that's started by 'make'. No. 'make' is not obsolete. – Rob Apr 13 '15 at 21:06 
  • 4
    People are free to use IDEs, but if their IDEs can't support projects using Makefiles nearly 40 years after make was first written, then they shouldn't expect people to work around that. – Miles Rout Apr 14 '15 at 22:45
  • 1
    The "obsolete talk regarding make" is just a symptom or a pain point, not a proclamation of deprecation. Especially when a superior total replacement does not yet exist, and all approaches to reducing the pain still uses make in some ways, just hidden from the users who are most prone to the pain. Although a lot of good answers have been given, the question itself is borderline opinion-based. – rwong Apr 16 '15 at 5:48
  • 1
    CMake is a layer of abstraction. This abstraction is needed to tame the wild variety of consumer IDE products that won't comply with open-source Makefiles. It is not only an abstraction, though: it also provides configuration features (like autoconf). Like other abstractions it allows alternatives, and it paves the way for experimental new alternatives to Makefiles to become easily available to the users of CMake. – shuva Mar 14 '18 at 21:39
36

The big difference is that CMake is a cross-platform meta-build system. A single CMake project can produce the usual Unix/Linux makefile, a Visual Studio project for Windows, an XCode project for Mac, and almost any other non-meta build system you might want to use or support.

I wouldn't say using make directly or even manually editing makefiles is "obsolete", but those are things you probably shouldn't do unless you are working on Unix/Linux stuff that won't need porting to Windows, much like how you should not be editing Visual Studio project files directly if you ever want to port them to not-Windows. If you have any interest in portability, it's worth learning a meta-build system like SCons or CMake.
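
To make the meta-build idea concrete, a minimal CMake project (names are illustrative) is just a few lines, and the same file can drive any of the generators mentioned above:

```cmake
# CMakeLists.txt -- one description, many build systems.
cmake_minimum_required(VERSION 3.10)
project(hello C)
add_executable(hello main.c)
```

Running `cmake -G "Unix Makefiles" .` emits a Makefile from this description, while `cmake -G Xcode .` emits an Xcode project, with no change to the CMakeLists.txt.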

  • 12
    POSIX make is also cross-platform. – Blrfl Apr 10 '15 at 23:31
  • 7
    @Blrfl this may be true, but generally you'll find the command lines you need to use to compile and/or link your project are platform-specific, even if all of the platforms you're targeting are POSIX-compliant and even if you're using the same compiler on all of them (there'll still be pesky problems with locations of include files, whether you need -lm, -lnsl and so on or whether those facilities are included in the C library, etc.). – Jules Apr 10 '15 at 23:43
  • 6
    @Jules There is much more (or less) to CMake, it's basically tailored towards the creation of your standard, modular application for systems with ELF loaders, and provides abstractions for all the tools required for that specific tool chain. If you depart from that toolchain (e.g. custom linker script for bare metal), CMake suddenly provides no support or portability at all, you end up hacking it just as you used to hack fully custom build scripts. – Ext3h Apr 11 '15 at 13:26
  • Same goes for Qmake (despite not being the badly documented, inconsistently behaving **ap CMake is), btw ;) – mlvljr Apr 13 '15 at 23:52
13

Is make really outdated?

I don't think so. In the end, make is still powerful enough to provide all the functionality desired, such as recompiling only the sources that have changed. Saying make is outdated would be like saying writing custom linker scripts is outdated.

But what raw make doesn't provide are extended functionality and stock libraries for comfort, such as parametric header generation, an integrated test framework, or even just conditional libraries in general.

On the other hand, CMake is directly tailored towards generating generic ELF libraries and executables. Whenever you depart from those predefined procedures, you have to start hacking CMake just as you used to hack your own Makefiles.

Considering that you can create far more in C++ than just your average application for a system with an ELF loader, CMake surely isn't suited for all scenarios. However, if you are working in such a standardized environment, CMake or any other of these modern script-generator frameworks is certainly your best friend.

It always depends on how specialized your application is, and how well the structure of the CMake framework fits your work flow.

  • 1
    While useful, make is a horrible system with braindead, arcane syntax and tons of unnecessary limitations that make for lots of oops-moments. – antred Nov 18 '17 at 15:35
7

Once upon a time high level languages were just an idea. People tried to implement compilers. Back then there were severe hardware limitations - there were no graphical tools so "plain text" ended up being used for the input file format; computers typically had an extremely tiny amount of RAM so the source code had to be broken into pieces and the compiler got split into separate utilities (pre-processor, compiler, assembler, linker); CPUs were slow and expensive so you couldn't do expensive optimisations, etc.

Of course with many source files and many utilities being used to create many object files, it's a mess to build anything manually. As a work-around for the design flaws in the tools (caused by severely limited hardware) people naturally started writing scripts to remove some of the hassle.

Sadly, scripts were awkward and messy. To work around the problems with scripts (which were a work-around for the design flaws in the tools caused by limited hardware) eventually people invented utilities to make things easier, like make.

However, makefiles are awkward and messy. To work around the problems with makefiles (which were a work-around for a work-around for the design flaws in the tools caused by limited hardware) people started experimenting with auto-generated makefiles: starting with things like getting the compiler to generate dependencies, and leading up to tools like autoconf and CMake.

This is where we are now: work-arounds for work-arounds for a work-around for the design flaws in the tools caused by limited hardware.

It's my expectation that by the end of the century there'll be a few more layers of "work-arounds for work-arounds" on top of the existing pile. Ironically, the severe hardware limitations that caused all of this disappeared many decades ago and the only reason we're still using such an archaic mess now is "this is how it's always been done".

  • 1
    So what's the alternative? Is the alternative totally changing the concept of builds as we know it? – JParrilla Apr 11 '15 at 14:49
  • 1
    @Darkslash: Once you start considering (hypothetical) alternatives it's like opening floodgates - you end up questioning everything (and end up with ideas like separation of semantics and syntax, and redesigning IDEs and tools and delivery as a set rather than individual pieces of the larger puzzle). More practical is to realise that tools like cmake only treat symptoms and can't cure the root cause/s; which makes it easier to accept the fact that you've got no choice other than to use tools that will probably never be close to "ideal". – Brendan Apr 11 '15 at 15:18
  • 6
    Makefiles aren't in any way 'awkward and messy'. They're incredibly elegant: make is at its heart an engine for traversing an acyclic dependency graph. Nothing about 'scripts' or the command line is 'archaic'. – Miles Rout Apr 14 '15 at 22:43
  • 1
    @MilesRout: The awkward and messy part is the graph construction. The traversal is elegant enough. But just take the single most common example of the messy part: C header files. Each #include is an edge, but make cannot parse C files. Still not a problem, because make could store these edges with the next node (the *.o file), but it doesn't do that either. – MSalters Apr 15 '15 at 12:55
  • 5
    I don't agree that a better design is possible. make is not just for compiling C! You want a program that takes .c and .h files and gives Makefiles. But that's not make, and that's not what make should do. Again, make isn't all about C, and a C-specific solution built into make is a bad idea (and incidentally a violation of the UNIX philosophy). – Miles Rout Apr 16 '15 at 11:48
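
The compiler-generated dependencies mentioned in this answer (and debated in the comments) are usually wired up with GCC/Clang's -MMD/-MP flags; a sketch, with illustrative file names:

```make
# Each compile also writes a .d fragment listing the headers the
# file actually included; make re-reads those fragments on the next
# run, so editing a header rebuilds exactly the objects that use it.
SRCS = main.c util.c
OBJS = $(SRCS:.c=.o)
DEPS = $(OBJS:.o=.d)

app: $(OBJS)
	$(CC) -o $@ $(OBJS)

%.o: %.c
	$(CC) -MMD -MP -c $< -o $@

-include $(DEPS)
```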
6

make (the tool or direct use of it via a Makefile) is not outdated, particularly for "small, personal projects" as you use it for.

Of course, you can also use it for larger projects, including those targeted for multiple platforms. With target-specific variables you can easily customize how you build for different platforms. Nowadays, Linux distributions come with cross-compiling toolchains (e.g. mingw-w64) so you can build a complete Windows software package (with installer if you like) from Linux, all driven from your Makefile.
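
A sketch of that target-specific-variable technique (the mingw-w64 triplet is the usual one shipped by Linux distributions; everything else is illustrative):

```make
CC = cc

app: main.c
	$(CC) -g -o $@ $<

# GNU make target-specific variable: prerequisites built for
# 'windows' inherit the cross-compiler setting.
windows: CC = x86_64-w64-mingw32-gcc
windows: app.exe

app.exe: main.c
	$(CC) -g -o $@ $<

.PHONY: windows
```

Typing `make` builds the native binary; `make windows` builds app.exe with the cross toolchain, from the same rules.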

Tools like cmake and qmake can be useful, but they are not without their problems. They are usually fine for building an application itself along with any libraries (though I always had problems with qmake doing proper dependency checking between libraries and programs that use them), but I always struggle with the constraints of those tools when doing the rest of the job (creating installers, generating/installing documentation or translation files, doing unusual stuff like turning an archive into a shared library, etc.). All of this can be done in make, though things like dependency tracking can require a little effort.

IDEs like Qt Creator and Eclipse can also import Makefile-based projects, so you can share with IDE-using developers. I think the Qt Creator IDE is excellent as a C++ IDE, but after spending some time to become more effective with Emacs, and since I'm doing more Ada development, I'm finding myself preferring Emacs for everything. To relate back to the question about make, as a post-link step in my Makefile, I update my TAGS file (for symbol navigation in Emacs), loaded with M-x visit-tags-table:

find $(SRC_DIR) $(TEST_DIR) -regex ".*\.[ch]\(pp\)?" -print | etags -

Or for Ada development:

find $(SRC_DIR) $(TEST_DIR) -name "*.ad?" -print | etags -
2

This answer complements @lxrec's answer.

Makefiles can be used for many things, not just creating a program/library from source code. Build systems such as CMake or autotools are designed to take code, and build it in such a way as to fit into the user's platform (i.e. find libraries or specify correct compile options). You could for example have a makefile which helps automate some release tasks, such as: build a zip file containing code with the version derived from git; run tests against your code; upload said zip file to some hosting service. Some build systems (such as automake) might provide an easy way to do this, others may not.
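
A hedged sketch of such a release-task Makefile (target names, the test script, and the archive name are all hypothetical):

```make
# Version string derived from git, as described above.
VERSION := $(shell git describe --tags --always)

test:
	./run_tests.sh        # hypothetical test runner

# Build a zip of the tracked sources, stamped with the version.
dist: test
	git archive --format=zip -o myproject-$(VERSION).zip HEAD

.PHONY: test dist
```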

That's not to say you need to use makefiles for this, you could use a scripting language (shell, python etc.) for such tasks.

1

I don't think that hand-written Makefiles are obsolete, especially when:

  1. using POSIX make, which gives you a portable Makefile;
  2. or using GNU make 4, which gives you many very interesting features, in particular GUILE scriptability, which lets you efficiently code the fancy features provided by Makefile generators (I believe that the features of autotools or CMake could easily be written as Guile customizations of GNU make). Of course, the price to pay is requiring GNU make 4 (not a big deal, IMHO).
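
As a small illustration of that Guile scriptability (it requires a GNU make 4 built with Guile support; in builds without it, $(guile ...) simply expands to nothing):

```make
# The $(guile ...) function evaluates a Scheme expression
# inside the makefile and splices in its result.
answer := $(guile (number->string (* 6 7)))

show:
	@echo "guile says $(answer)"
```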
0

Makefiles are not obsolete, in the same way that text files are not obsolete. Storing all data in plain text is not always the right way of doing things, but if all you want is a Todo List then a plain text file is fine. For something more complicated you might want a more complicated format like Markdown or XML or a custom binary format or anything in between, but for simple cases plain text works fine.

Similarly, if all you want is a way to avoid writing out g++ -g src/*.c -o blah -W -Wall -Werror && ./blah all the time, a hand-written Makefile is perfect!
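
That hand-written Makefile would be little more than the command itself (assuming the same source layout as the quoted command; the wildcard prerequisite is GNU make behaviour):

```make
# 'make run' relinks only when something under src/ has changed.
blah: src/*.c
	g++ -g src/*.c -o blah -W -Wall -Werror

run: blah
	./blah

.PHONY: run
```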

If you want to build something that is highly portable, where you don't have to manage that portability yourself, you probably want something like Autotools to generate the right Makefile for you. Autotools will detect the features supported by various platforms. A program written in standard C89 built with Autotools will compile virtually everywhere.

  • "will compile virtually everywhere" -> "will compile on most recent Unix-like systems". I don't think autotools works to any relevant degree on a Windows system, for example (discounting Cygwin/MingW, which is really a Unix-like system on top of Windows). – sleske Aug 20 '15 at 12:52
  • It has nothing to do with being recent. The advantage of autotools is that it works on every remotely POSIX-like system. Windows is somewhat POSIX compliant. – Miles Rout Aug 23 '15 at 21:05
  • Yes, i stand corrected. Actually, I remember one criticism of autotools is precisely that it contains code even for very old systems, so scratch the "recent". I still stand by the Windows part, though. – sleske Aug 24 '15 at 4:44
  • And the blame for that lies entirely with Microsoft. If they cared about producing a quality product then Windows would have proper support for international operating systems standards (like POSIX). POSIX isn't just a "Unix thing", it's the portable operating system standard for all operating systems. Windows is just a noncompliant pile of junk. – Miles Rout Aug 25 '15 at 4:24
  • @MilesRout still the bottom line is that "virtually everywhere" excludes 90% desktop computers. – el.pescado Oct 9 '15 at 7:05
-1

If your project is simple and contains very few files, then no makefile is needed.

However, when the project is complex, uses numerous memory areas, and has many files, then it is necessary to place each memory area in the right spot in addressable memory, and it is highly desirable not to recompile every file every time a trivial change is made to one file.

If you want to erase old files, compile, link, and install with the minimum amount of hassle and chance of keypress mistakes, a makefile is a real boon.

When you get into the 'real world', projects are rarely just 1 or 2 files, but rather hundreds of files. A makefile will always perform the right actions and no unnecessary actions (after it is debugged), so you only type one simple command and the makefile does all the work.

cmake tries to do the whole build process, while making it as simple as possible for the novice user.

Make enables the user to retain control over the build process.

  • 1
    That didn't answer why to prefer make over cmake or vice versa. – Ext3h Apr 11 '15 at 13:08
  • Using makefiles doesn't rebuild all files every time. (else CMake or Auto-tools couldn't use it) – ideasman42
