KDE Switches To CMake

The KDE4 build system is now centered around CMake. If you are a developer, CMake will be much easier to learn, handle and maintain than what you are used to so far. Alexander Neundorf, who took upon himself a big share of the actual work required for the switch, has published "Why the KDE project switched to CMake -- and how" on the LWN.net development pages. He outlines the considerations that led to choosing CMake over competing tools, shows why CMake is a better fit than the autotools used in KDE 1, 2 & 3, and provides a short introduction to CMake file syntax. He even allows some insights into the current state of KDE4 development. Hot on the heels of KDE, Scribus is switching as well.

Comments

by Martin (not verified)

I'm abusing this forum a bit to ask a question that has always puzzled me ;-)

I'm a long-time Pascal programmer, and what I could never figure out about C/C++ is why those Makefiles are needed at all.
This has always confused me.
In Pascal I can include other files by saying "uses blabla". Now, when I click on compile, the compiler figures out where everything is and generates an exe file. And this goes relatively fast, BTW.
Can some expert please enlighten me: what is it that C++ can do that Pascal can't, that justifies the extremely long compile times (in contrast to e.g. FreePascal) and those complicated Makefiles? Would it theoretically be possible to write KDE in a language like FreePascal without any Makefiles, or are such languages missing important things?

by Chase Venters (not verified)

Makefiles are useful for managing dependencies and targets. They're actually very important for big projects like KDE, because if you make a change to a file, you want to recompile all of the affected code, but you /don't/ want to have to recompile the entire application or suite.

So, make can be taught that util.h is used by util.c and superutil.c. If I change util.h and run make, make simply compiles util.c into util.o and superutil.c into superutil.o, and then re-links my_big_application by combining util.o, superutil.o and perhaps thirty more .o files that it did not have to rebuild (since they did not change).
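
A minimal sketch of such a Makefile, using the file names from the example above (my_big_application, main.c and the compiler flags are made up for illustration):

    # Link step: runs only when some .o file is newer than the binary.
    my_big_application: util.o superutil.o main.o
            gcc -o my_big_application util.o superutil.o main.o

    # Both of these objects depend on util.h, so editing util.h
    # triggers exactly these two recompiles plus the re-link above.
    util.o: util.c util.h
            gcc -c util.c

    superutil.o: superutil.c util.h
            gcc -c superutil.c

    # main.o does not depend on util.h and is left untouched.
    main.o: main.c
            gcc -c main.c

    # (Recipe lines in a real Makefile must start with a TAB character.)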

by Matej Cepl (not verified)

You can do the same with C/C++ (gcc -o result *.c), but make is able to find out which files need to be recompiled (because a *.o is older than its *.c) and stuff like that.

by a.c. (not verified)

Make is about deciding when to compile something. For a simple project (several source files with perhaps several headers) it is not a big deal. Even if a compile takes 1-2 minutes, that is no big deal. But a large project (ALL of Linux or ALL of KDE) can take hours if you compile everything. With a Makefile, it will compile only the files that have changed (and any dependencies). After all, if you change just one file, should it take 3 hours to find out that it works?

by Amand Tihon (not verified)

IIRC, in Pascal, your whole "blabla" is contained in a single file. When you write "uses blabla", the compiler knows that only blabla.pas (and its "uses") is involved. If there are functions or procedures declared as "external", the .pas that contains them must be "used".

In C(++), you usually have a separation between .c(pp) files, which contain the actual code, and .h files, which only declare things as available. Those .h files are like a bunch of procedures/functions declared as "external", plus some other declarations like data structures (type in Pascal). The trick is, [nearly] ALL the actual code is somewhere else, in one or more .cpp files.

In the end, you don't have a strict relationship between your "#include <blabla.h>" and the code, as the functions might as well be dispersed in blabla.cpp, foo.c, bar.cpp and blabla-compat.cpp.
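
A tiny sketch of that split (file and function names made up for illustration):

    // blabla.h -- declarations only, like a Pascal interface section
    #ifndef BLABLA_H
    #define BLABLA_H
    int frobnicate(int x);   // no body here: the code lives elsewhere
    #endif

    // blabla.cpp -- one possible home for the actual code
    #include "blabla.h"
    int frobnicate(int x) { return 2 * x; }

    // main.cpp -- the compiler only sees the declaration;
    // the *linker* later finds frobnicate in blabla.o
    #include "blabla.h"
    int main() { return frobnicate(21); }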

Now there are tools like automake or cmake to help you discover the dependencies. They do it just like your Pascal compiler, by parsing the sources, and they may be sufficient. On complex projects, however, real people have to give some hints, because blabla.h, for instance, may have no corresponding blabla.cpp.

These tools also allow you to do so-called conditional compilation, where different portions of the code are compiled only when a specific library is present on the system: that's one of the things the nearly ubiquitous "./configure" does.
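
A sketch of what conditional compilation looks like on the source side (HAVE_LIBPNG, config.h and the comments are made-up examples of the usual convention):

    /* config.h is generated by the build system: "./configure" (or a
       cmake check) defines HAVE_LIBPNG in it only if libpng was found
       on the system. */
    #include "config.h"

    #ifdef HAVE_LIBPNG
    #include <png.h>
    /* real PNG support is compiled only on systems with libpng... */
    #else
    /* ...otherwise this fallback stub is compiled instead */
    #endif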

by Martin (not verified)

Thanks a lot for all the answers. Really an interesting read.
The most important things I took out of all this are:

1)
In C++, the relationship between the "interface" and "implementation" parts (like the interface/implementation split in Pascal) can be very complex and spread across various files. This is not possible in Pascal, of course. A completely missing "implementation" can be realized in Pascal as well, by using the "external" statement in the interface section. This means: this will be linked in from somewhere else.

2)
You can flexibly call external tools, like a .ui compiler, during the compile process. You can do things like that in Pascal with {$R ...} statements.

So, all this boils down to: the Pascal way is to put all the things that C++ puts in a Makefile and headers into one single file. Perhaps it is just me, but for my smaller applications I find the Pascal way much easier, because I only have a single file for my small application. What has always frightened me about writing a KDE app is that when I start KDevelop and select "Simple KDE app", the wizard creates lots of folders with many files that are not even explained anywhere. Ugh, guess that's what they mean by "thrown in at the deep end". I usually like it, though, when I understand all of my source code from head to toe...

So, to finally be back on topic ;-) perhaps CMake will make life a bit better for all of us novice programmers too, because more often than not there are strange errors when compiling KDE apps from source. Did anyone ever look into an automake .m4 file? I can hardly believe that some people can find their way around those...

by Shash (not verified)

Easier, yes. More flexible, no. Like so many have pointed out above, you don't want to compile the whole of kdelibs for one measly change to one line in (say) kpushbutton.cpp. In fact, that wouldn't even necessarily affect anything else that uses the KPushButton class, if it doesn't change the actual interface of that class. What make will do is recompile kpushbutton.cpp to create kpushbutton.o, and re-link kpushbutton.o with the rest of the object files that depend on it. That takes a fraction of the time needed to compile the entire code. From your description, recompiling in Pascal would either need a really advanced make-like system, or just won't be able to handle this.

Makefiles, btw, are mainly a Unix standard - others use other methods. For example, Turbo C++ (aka the devil) used a project file format which accomplished the same thing in a different way.

BTW, you can safely forget about the m4 files for smaller projects, unless you're trying to include external (to KDE) libraries. If you want to learn more, go read the autobook: http://sourceware.org/autobook/autobook/autobook_toc.html - it's a manual to autoconf, automake and libtool.

by panzi (not verified)

Well, C/C++ are not high-level languages by modern standards. C is a better assembler. In a high-level language like e.g. Java, an import loads symbols from a .class file. In C/C++, an include actually _includes_ the source of the included file (the header) into the source of the .c(pp) file! It's like a copy & paste.
In this process there is no higher logic which could derive dependencies between the sources from the includes. Therefore Makefiles or similar things are necessary to declare those dependencies. There is no concept of packages or libraries in C/C++ which can be seen in the source. Packages/libraries are defined/built by the build system, not by the source files/directory hierarchy (like in Java).
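
You can watch that copy & paste happen by running only the preprocessor (file names made up; -E is a standard gcc/g++ option):

    // point.h
    struct Point { int x; int y; };

    // main.cpp
    #include "point.h"   // the preprocessor pastes point.h in right here
    int main() { Point p = {1, 2}; return p.x + p.y; }

    // Running "g++ -E main.cpp" prints main.cpp after preprocessing:
    // the full text of point.h appears inline, exactly as if it had
    // been typed into main.cpp by hand.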

I dream of a C++-like language which uses imports, not includes. A language which is like Java, but with operator overloading and native compilation and, on demand, manual memory handling (on demand only! the default should be refcounting).

by ita (not verified)

> I dream of a C++-like language which uses imports, not includes. A language which is like Java, but with operator overloading and native compilation and, on demand, manual memory handling (on demand only! the default should be refcounting).

Something like the D language? http://www.digitalmars.com/d/overview.html (compatible with C, but not C++.)

by boemer (not verified)

Yeh... me too...

C# is getting a whole lot closer: as easy as Delphi, mostly C++ syntax (more so than Java). No Swing.

Native compilation is possible; I believe Mono has it too. But alas, the default is the IL language (probably better written ILL, because that is what it is...)

by EP (not verified)

<< I dream of a C++ like language which uses imports, not includes. >>

Fine. There are such mechanisms proposed for C++. There is a chance that they will be incorporated into the next standard, but there is also a respectable chance that it is too late for the next version. You can help fulfill your dream by working on a sample implementation of the relevant proposals.

Here is a classification of proposals regarding their readiness for C++0x:
[http://www.open-std.org/JTC1/SC22/WG21/docs/papers/2006/n2011.htm]
I presume you are able to seek out yourself which ones are relevant to your dream. As you can see, these items are termed "still actively being developed, [...]", with a "[...] clear intention to incorporate each one in the next standard." But the latter won't happen magically. Qualified input or a prototype implementation would help.

If you are going to implement parts, maybe Pedro Lamarão can give you hints how to start
[http://gcc.gnu.org/ml/gcc/2006-06/msg00361.html]

by Quintesse (not verified)

Others suggest changes to C++ to make it more like what you want; I'll suggest you could change Java to make it more like what you want :-)

There are RFEs for operator overloading (like this http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=6349553)

Java with native compilation? Try gcc (or better: gcj, the Java front end for gcc).

And refcounting is about the worst possible way of doing managed memory handling, so I would stick with the Java way of doing things.

by Erik (not verified)

Ada is what you are dreaming about. It has "use" statements, not includes. Operator overloading works like a charm. Manual memory handling works with Ada.Unchecked_Deallocation. It can be compiled natively and also to Java bytecode. And Ada95 has been around since before C++ was even standardized.

by suy (not verified)

I want to add to what others have already posted that KDE is not just a C++ project. If it were that simple, KDE would be built with qmake (from Qt).

In KDE there are very complicated build requirements, like KParts, KConfigXT, modules/plugins, etc., and lots of other "static" files that require a very specific installation path to work, like desktop files, icons, and so on. See the threads on kde-core-devel about the requirements for a new build system, and scare yourself a bit. :)

by Maarten ter Huurne (not verified)

First of all, let's separate long compilation times from the need for Makefiles. The main reason for long compilation times in C++ is the preprocessor; I experimented with precompiled headers in a project much smaller than KDE and it cut compilation time in half. C++ has a preprocessor because compatibility with C is the main design goal. Why C was given a preprocessor, I don't know. The preprocessor is very powerful but also causes a lot of problems. Besides the preprocessor, template expansion is also slow. I'm not sure if that is unavoidable, or whether the compiler programmers are reluctant to optimise code that is already very complex due to the complex language definition.
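
For the curious, precompiled headers with g++ look roughly like this (file names made up; g++ picks up the .gch file automatically when the header is included):

    # Parse the big common header once, producing common.h.gch:
    g++ -x c++-header common.h -o common.h.gch
    # Later compiles that #include "common.h" load common.h.gch
    # instead of re-parsing the header text every time:
    g++ -c main.cpp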

Makefiles are useful for partial compiles, as other posters have explained. But they are also useful for doing automated tasks other than compilation. For example, if you design a user interface with Qt Designer, you have created a .ui file, which is in an XML format. To use the user interface in your program, a tool called "uic" takes the .ui file as input and generates C++ code. Then that C++ code has to be compiled. With a Makefile, you can automate the code generation and incremental recompilation, so if you change your UI design, with a single command the required files are regenerated and recompiled.
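
A hand-written Makefile rule for this might look as follows (file names made up; in practice the build system generates such rules for you):

    # Regenerate the C++ header whenever the Designer file changes.
    ui_mydialog.h: mydialog.ui
            uic mydialog.ui -o ui_mydialog.h

    # mydialog.cpp includes ui_mydialog.h, so it is recompiled too.
    mydialog.o: mydialog.cpp ui_mydialog.h
            g++ -c mydialog.cpp

    # (Recipe lines in a real Makefile must start with a TAB character.)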

Other examples of automated tasks are extracting API documentation, converting the user documentation from DocBook to HTML, creating screenshots for the user documentation (so the look always matches the latest release) and distributing the version number to every piece of code and documentation that needs it (to avoid forgetting to manually update it somewhere in the hectic time before a release).

I think any project over a certain size will need a Makefile or some other kind of automated build and package procedure (Ant, SCons, etc.). With C/C++ you need a Makefile as soon as you have more than one source file; with other languages you can delay it a bit longer, but there is always a point at which a project becomes too complex to manage by hand.

Could KDE be written in another language? In theory it could. I wouldn't be surprised if Qt and kdelibs stay C++ while an increasing number of applications get written in Python and Ruby, since for many applications runtime performance is not that critical, and the ease of development in those languages is a big plus.

by ita (not verified)

The preprocessor is not the main problem for the speed (I have written a preprocessor in Python). The C/C++ languages are certainly difficult to compile (no separation between the lexer and the parser, multi-step transformations including preprocessing and template processing), and there are fast and slow compilers (gcc vs tcc). For gcc/g++ the main problems probably lie in portability and strict adherence to the standards.

As for the build system, it is enough to say that KDE programs are not simple C++ programs but the result of several compilers (kconfig_compiler, dcop, moc, uic): the build system must not only track the changes between the source files but also apply these rules as appropriate.

by Fuman (not verified)

I think there was a time when Pascal was more popular than C++, though not on Unix. Unfortunately, Pascal compilers generated slower code, and we had a strong C++ lock-in via the libraries.

Flexibility is a source of errors and complexity traps, even though it is sometimes useful for tinkering. You can reduce the problems with coding styles, but those are more difficult to enforce than compiler-implemented standards. Even worse are the entrance barriers to programming: a complex environment is more difficult to get to know.

So what is really needed is a safe subset language as a default, with the option to go deeper. Like your car: it offers certain options to you as a user, and as a mechanic you can open it and change what is hidden.

The approach followed by .NET will lead to a standardisation of the runtime machine and its interfaces, and you can generate the code from multiple languages. This will certainly simplify multi-language environments. The same goes for GCC backends. Mixed-language environments will provide you the flexibility you need for each task.

For standard programming work it is always better to use a simple language which keeps away the bloat. But it has to be ensured that you can use or interoperate with advanced code which goes beyond what the restricted language is able to offer. And you want to access and interoperate with libraries which offer you the features you need, and here C++ offers more than Pascal or other languages.

So what do you do when you need the features offered by Makefiles in complex environments? And sure, Makefiles are possible for Pascal as well. Take any language, let it evolve, and, driven by the tasks you perform, features get added. Many have tried to abolish Makefiles and then came up with solutions which reintroduced make's complexity on another level.

by Axl (not verified)

So with kdelibs4 and Qt4, all KDE apps can be compiled and used under Windows? And developed/modified with Visual Studio?

Very interesting, there's a huge user-base waiting there..

by Kurt Pfeifle (not verified)

Well, I don't think the user base over there is exactly *waiting*. To a large degree, they do not yet know about us at all...

It remains to be seen just exactly...

...how much of KDE's goodness can really be ported to the Windows platform....

...how well it works (will all KIO Slaves work?, will IPC with DBUS work? etc.)

...and how many new *developers* we will be able to attract.

Did you notice I said *new* developers? Only if we gain more developers who were previously creating their apps exclusively for Windows will the reach-out to that new platform have "paid" for itself in the end.

by Alexander Neundorf (not verified)

It does already, right now in this early stage, attract new developers: apart from Ralf Habacker and Christian Ehrlicher (who, I think, already worked on KDE 3 for Cygwin), there are right now at least two new developers coming from the Windows world (Peter Kuemmel and Jorge) and at least one new developer on OS X (Tanner Lovelace).

Alex

by Sean (not verified)

I've been wondering, is the KDE 4 port to Windows being done as a shell replacement along the lines of LiteStep?

Am I actually going to be able to use the one desktop I like while running any application I choose?

by Jakob Petsovits (not verified)

> I've been wondering, is the KDE 4 port to Windows being done as a shell
> replacement along the lines of LiteStep?

As far as I know, no, this is not planned.
The goal is to port kdelibs and most of the applications, but leave out the desktop part of KDE. Plasma and KWin depend on X11, and it will stay this way. If you want KDE as a platform, you can now get it for Windows too; if you want it as a desktop, better go for the real thing using Linux or some Unix system.

by Ben Morris (not verified)

No, not all KDE apps. Many still depend on non-KDE Unix stuff. For example, many media players depend on Unix-only media libraries like Xine. The amaroK team have stated that they will not even try to port to Windows in the foreseeable future, because even with KDE working perfectly, none of the media libraries it currently supports are available there (can you imagine amaroK using WMP? Gah).

by Bobby (not verified)

Do you have a reference link? I have heard differently in a couple of discussions.

Bobby

by Frédéric L. W. ... (not verified)

What happens to all the configure options? Are configure --help and such preserved?

by Carewolf (not verified)

configure died...

So now, instead of writing ./configure --prefix=/opt/kde
you write cmake . -DCMAKE_INSTALL_PREFIX=/opt/kde

And all options have obscure CMAKE_FOO_BAR variables you need to set.

Seriously it's a much bigger pain in the ass for a developer than autotools was.

by Morty (not verified)

That's plainly braindamaged, changing to something obscure instead of something "everybody" knows and is familiar with.

But as I see it, this has a really simple solution: make a small wrapper script called configure which sets all those obscure variables using the old configure syntax. The only difference for users would be an amazingly fast ./configure step, as it does not do all the checking. Then let cmake do the rest.

The amazing thing is that no one has made it yet.
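
A minimal sketch of such a wrapper, handling just --prefix and silently ignoring everything else (entirely hypothetical, not an existing script):

    #!/bin/sh
    # configure: translate old autotools-style options into a cmake call.
    prefix=/usr/local
    for arg in "$@"; do
        case "$arg" in
            --prefix=*) prefix=${arg#--prefix=} ;;
        esac
    done
    exec cmake . -DCMAKE_INSTALL_PREFIX="$prefix"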

by Joe Kerian (not verified)

As someone who has recently learned both the automake system and the cmake system, for me there was no contest. The cmake system is much simpler, clearer, and less error-prone than .am syntax for the exact same project.

I use both auto* and cmake in separate projects at the moment, and cmake has yet to give me a completely unhelpful, ungoogleable, apparently unique error message. Automake may be something that "everybody" knows... but my experience has been that "nobody" knows how to fix it when it breaks.

I found both automake and cmake to be somewhat under-documented, which is somewhat more excusable for the newer cmake project. One request, though: would it be too much to ask that the devs of the core systems keep a wiki entry that describes the CMAKE_* macros and values they have defined?

by Alex (not verified)

It's not complete, but a good start:
http://www.cmake.org/Wiki/CMake_Useful_Variables

In cmake you can also call get_cmake_property() to get a list of all currently defined variables (see man cmake for more information).
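
For example, a few lines in a CMakeLists.txt can dump every variable that is currently set (standard CMake commands; the _variableNames/_name names are made up):

    get_cmake_property(_variableNames VARIABLES)
    foreach(_name ${_variableNames})
        message(STATUS "${_name} = ${${_name}}")
    endforeach(_name)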

Alex

by ac (not verified)

> Seriously it's a much bigger pain in the ass for a developer than autotools was.
man ccmake

by Mark Hannessen (not verified)

Mmm...
Does this method also allow you to install your docs, or man pages, or whatever, in a different location than /opt/kde, as you would do with ./configure --bindir=/adir --libexecdir=/anotherdir?

by Alexander Neundorf (not verified)

Just replying to some of the comments:
In which way are there security problems? I'd like to know.

About writing a script which pretends to be configure: maybe an idea for users, but really not for developers. Developers should learn the basics of using cmake.

About different installation directories:
currently everything is installed relative to the install prefix, but this can be changed without problems if required.

Alex

by riddle (not verified)

> Seriously it's a much bigger pain in the ass for a developer than autotools was.

Please enlighten me: what (besides the CMAKE_* variables) makes it more difficult for developers?

by Pupeno (not verified)

What happened with SCons? I thought it was going to be used for KDE 4; was it dropped?
Thanks.

by anonymous (not verified)

Have you actually read the linked article...? :P

"However, various hurdles showed up unexpectedly. The KDE individuals who tried to bring SCons into a shape that made it fit for building such a huge project felt they didn't have any support from the upstream SCons developers. There were major problems building KDE on non-Linux platforms with SCons (e.g. on OS X); in general they felt it did not yet have a mature configuration system. The only option down that road was to create major SCons fixes and patches on their own. Since these changes would not likely be included in the upstream sources, it would require permanent maintenance of the fixes in a separate repository. In effect, this would have amounted to a fork of SCons. KDE developers would have had to maintain the new build system entirely on their own. So the rosy SCons/bksys image paled again...."