Akademy Redux: Release Team Members Propose New Development Process

At Akademy 2008, KDE Release Team members Sebastian Kügler and Dirk Müller discussed the future of KDE's development process. Describing the challenges KDE faces and proposing some solutions, they spawned a lot of discussion. Read on for a summary of what has been said and done around this topic at Akademy.

Our current development model has served us for over 10 years now. We did a transition to Subversion some years ago, and we now use CMake, but basically we still work like we did a long time ago: only some tools have changed slightly. But times are changing. Just have a look at the numbers:

  • KDE 0.0 to 3.5 took 420,000 new revisions in 8 years
  • KDE 3.5 to 4.0 took 300,000 new revisions in 2 years

Also, this year's Akademy was the largest KDE event ever held, with more than 350 visitors from every continent of the world.

This enormous growth creates issues for the wider community, for developers, and for the release team. Patches have to be reviewed and their status tracked - these things become progressively harder as the size of the project balloons the way it currently does. The centralized development model of Subversion's trunk doesn't support team-based development very well, and our 6-month release cycle - while theoretically allowing 4 months of development and 2 months of stabilizing - often leaves barely 50% of the available time suitable for new feature development.

KDE's current revision control system doesn't allow for offline commits, making life harder for people without a stable internet connection. Furthermore, we're still looking for more contributors, so lowering the barrier to entry is another important concern.

Changing requirements

We will have to allow for more diversity and we must be able to accommodate individual workflows. Not everyone is happy with a 6-month schedule, not everyone prefers Subversion. Companies have their schedules and obligations, and what is stable for one user or developer is unsuitable for another. Meanwhile, new development tools have surfaced, such as the much-praised distributed revision control tool Git. Together with new tools for collaborating, new development models are emerging. KDE is in the process of adopting a much wider range of hardware devices, operating systems (OpenSolaris, Windows, Mac OS) and mobile platforms such as Maemo. And we have an increased need for flexible and efficient collaboration with third parties and other Free Software projects.
Sebastian and Dirk believe it is time for a new way of working. In their view, KDE's development process should be agile and distributed, and trunk freezes should be avoided where possible. While there are still many open issues in their proposal, KDE needs to get ready for the future and further growth.

Agile Development

The most fundamental idea behind Agile Development is "power to the people". Policies are there to avoid chaos, and to guide - but not force - people.

What is Agile Development supposed to offer us?

  • Shorter time-to-market, in other words, less time between the development of a feature and the time users can actually use it
  • More cooperation and shortened feedback cycles between users and developers
  • Faster and more efficient development by eliminating some current limitations in team-based development processes.
  • Simplicity. Not only good in its own right, but it also makes it easier to understand and thus contribute to KDE development.

How can we do this?

To achieve this, we have to reflect upon our experiences as developers and share our thoughts. We should be consciously aware of our process. Sebastian and Dirk talked about a specific lesson they have learned: plans rarely work out. As a Free Software project, we don't have fixed resources, and even if we did, the world changes too fast to allow us to reliably predict and plan anything. We have to let go. We should set up a process aimed at adaptation and flexibility, a process optimized for unplanned change.

This needs to be done in one area in particular: our release cycle. Currently, our release cycle is limiting, up to the point of almost strangling our development cycle. So Dirk and Sebastian propose a solution:

"Always Summer in Trunk"

Our current release process, depicted in the graphic below, can be described as using technical limitations to fix what is essentially a social issue: getting people into "release mode". Over 4 months, we develop features, then enter a 2-month freeze period in which increasingly strict rules apply to what can be committed to trunk. This essentially forces developers to work on stabilizing trunk before a release. Furthermore, developers need to keep track of trunk's current status, which changes depending on where in the release cycle KDE currently is, not taking into account the diverse time schedules of both upstream and downstream entities. At the same time, many developers complain that Subversion makes it hard to maintain "work branches" (branches of the code used to develop and stabilize new features or larger changes), and that subsequent code merges are a time-consuming and error-prone process.

The proposal would essentially remove these limitations, relying instead on discipline in the community to get everyone on the same page and focused on stability. To facilitate this change, we need the users to help us: a testing team establishing a feedback cycle to the developers about quality and bugs. A more distributed development model would allow more flexibility in working in branches until they are stable enough to be merged back into trunk. Trunk, therefore, has to become more stable and predictable, to allow branching at essentially any point in time. A set of rules and a common understanding of the new role of trunk are needed. Also, the switch to a distributed version control system (which is pretty much mandatory in this development model) is not as trivial as our previous change of revision control systems, from CVS to Subversion: good documentation, best-practice guides, and the right infrastructure are needed. The need for better support for tools such as Git in KDE's development process does not only come from the ideas for a new development model, though. Developers are already moving towards these tools, and ignoring that trend would mean that KDE's development process fragments and ultimately becomes harder to control.

In Sebastian and Dirk's vision, KDE's current system of alpha, beta and release candidate releases will be replaced by a system which has three milestones:

The Publish Milestone

This is the moment we ask all developers to publish the branches they want to get merged into trunk before the release. Of course, it is important to have a good overview of the different branches at all times to prevent people from duplicating work and to allow testers to help stabilize things. But the "Publish Milestone" is the moment to have a final look at what will be merged, solve issues, give feedback and finally decide what will go in and what will not. The publish milestone is essentially the cut-off date for new features that are planned for the next release.

The Branch Milestone

This is the moment we branch from trunk, creating a tree which will be stabilized over the next couple of months until it is ready for release. Developers will be responsible for their own code, just like they used to be, but one might continue using trunk for development of new features. To facilitate those developers who do not want to switch between branches, we could have a tree which replicates the classic development model. Developers are encouraged and expected to help testing and stabilizing the next-release branch.

The Tested Milestone

The "tested" milestone represents the cut-off date. Features that do not meet the criteria at this point will be excluded from the release. The resulting codebase will be released as KDE 4.x.0 and subsequently updated with 4.x.1, 4.x.2, etc. It might be a good idea to appoint someone who will be the maintainer for this release, ensuring timely regular bugfix releases and coordinating backports of fixes that go into trunk.
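The three-milestone flow maps naturally onto branching in a DVCS. Below is a minimal sketch with git in a throwaway repository; the branch and tag names (4.2-branch, v4.2.0) are invented for illustration, not part of the proposal itself:

```shell
set -e
# Throwaway repository standing in for KDE trunk (all names hypothetical).
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email kde@example.org; git config user.name "KDE Dev"
echo "feature A" > a.txt; git add a.txt; git commit -qm "feature A lands in trunk"

# Branch Milestone: fork a stabilization tree; trunk stays open.
git branch 4.2-branch

# Development of the next release continues on trunk...
echo "feature B" > b.txt; git add b.txt; git commit -qm "feature B (for next release)"

# ...while only fixes go to the release branch.
git checkout -q 4.2-branch
echo "fix" >> a.txt; git commit -qam "bugfix for 4.2"

# Tested Milestone: tag what passed the criteria as the release.
git tag v4.2.0
git log --oneline v4.2.0   # shows feature A and the bugfix, but not feature B
```

The point of the sketch is that trunk never freezes: the cut happens at the Branch Milestone, and stabilization is confined to the release branch.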


A prerequisite for this new development model would be a proper distributed source code management system. Git has already stolen the hearts of many KDE developers, but there are other options out there which should be seriously assessed. Furthermore we need tools to support easy working with the branches and infrastructure for publishing them. Getting fellow developers to review code has always been a challenge, and we should make this as easy as possible. We also need to make it easy for testers to contribute, so having regularly updated packages for specific branches would be an additional bonus. Trunk always needs to be stable and compilable, so it might be a good idea to use some automated testing framework.

Under discussion are ideas like having some kind of "KDE-next" tree containing the branches which will be merged into trunk soon; or maybe having such trees for each sub-project in KDE. Another question is which criteria branches have to meet to get merged into the "new" trunk. Especially in kdelibs, we want to ensure the code is already stable to keep trunk usable. Criteria for merges into various modules have to be made clear. What happens if bad code ends up in trunk? We need clear rules of engagement here. How can we make it as easy as possible to merge and unmerge (in case code that has been merged is not ready in time for a release)?

Having a page on TechBase advertising the different branches (including a short explanation of their purpose and information about who's responsible for the work) will go a long way in ensuring discoverability of the now-distributed source trees. A solution also needs to be found for the workload around managing trunk. Especially if we have tight, time-based releases, a whole team of release managers needs to take responsibility. KDE's current release team has come a long way in finding module coordinators for various parts shipped with KDE, but currently not every module has a maintainer.

While there are still a lot of questions open, we'd like to work them out in collaboration with the KDE community. KDE's future revision control system is discussed on the scm-interest mailing list. Discussion on a higher level can be held on the Release Team's mailing list, and naturally KDE's main developer forum, kde-core-devel.

With the release of KDE 4.0, the KDE community has entered the future technologically. Though timescales for the above changes have not yet been decided upon, Dirk and Sebastian took the talk as an opportunity to start discussing and refining these ideas: it's time that KDE's primary processes are made more future-proof and ready for the new phase of growth we have entered.



by boudewijn rempt (not verified)

Well, this frightens me: "relying on discipline in the community to get everyone on the same page and focus on stability", since it's given as the single solution for the problem that quite a lot of people prefer to hack on new stuff they never finish but want to commit anyway. And looking at the disaster that befell KOffice last week, when suddenly KDE trunk started using an entirely different way of linking that completely broke all of KOffice, I doubt there is enough discipline to go around.

And then git... If I look at what happens in the linux kernel development community, I see the best developers being permanently busy with merging, merging, merging and merging and very little else. And, actually, a scheme where we only seldom integrate, where every developer at one, infrequent, moment publishes their branch for merging sounds like a recipe for integration hell, no matter how developer-friendly and good at branching and merging a tool might be.

by sebas (not verified)

Certainly true. The "discipline" one is also one of my biggest issues. On the one hand, that would be alleviated by an allegedly less broken trunk, and on the other hand, with better tools, it should be easier to switch to that branch (I'm assuming here that developer reluctance to test release branches has to do with SVN making it hard to switch to a different branch quickly, not from unwillingness). Usually, I see developers taking great pride in their code and want to make sure it looks good when released. Making the rules to merge it stricter would then use the "you want it merged, then first make it work well" mechanism, rather than the "you're on that branch now, either feel the pain of switching branch or fix bugs" mechanism. The first feels much stronger to me.

I guess the real question is "Why do we have bugs in trunk at all?", and we should work on reducing that.

The Linux kernel development process is not very similar to KDE's, of course. The overhead of what's going in is higher, and it's quite formal and bound to specific persons who decide it all. KDE is quite different in that it's team-based, so the potential is there to make it easier to share code in those teams and merge it upstream as it becomes interesting for others, until finally bugs have been shaken out and it's ready to go into trunk.

by Derek Kite (not verified)

>I'm assuming here that developer reluctance to test release branches has to do with SVN making it hard to switch to a different branch quickly, not from unwillingness

I wouldn't come to that conclusion. There are substantial benefits that come automatically from being in trunk that will not happen if branched.

There are a few things in free software that are hard and rare. Development isn't one of them. Testing and code review are hard and rare. Trunk provides automatic access to the rare resources. Limited, but as good as it's going to get. Branches force the developer to build those resources, skills that some may have, some may not. Or more likely, do without.

That doesn't solve the issues at hand however.

What about release in retrospect? Currently development is halted at freeze, which makes out-of-tree development very enticing, or even necessary. What if each module decided the release point in retrospect? New features or major reworks that break trunk could be done out of tree (and having tools that support this would help), and merged at the point where testing and review are desired.

The kernel is different, but testing and review haven't happened in the spun-off branches. One of the problems they face is untested code showing up in merges. Testing and review happen in trunk.

Different targets require their own tree, and having a tool that supports that well would solve the problem.


by sebas (not verified)

Yup, that's one of the points. We need better tools to support more diverse needs and practices. A more staged development process (such as we tried to describe) would be one step towards making this possible.

by Cyrille Berger (not verified)

> >I'm assuming here that developer reluctance to test release branches has to do with SVN making it hard to switch to a different branch quickly, not from unwillingness

Well honestly, at least for me, it's not switching branches which I find takes time, or going back to a previous revision, since I often hear "ah with git it is fast" - for me it's often 99% build time for 1% checkout... With git, darcs, cvs, svn, it will always be the build time that makes switching branches, or going back to a revision, a pain. Not saying that optimizing the 1% isn't good ;) just that this is not the main reason for me.

by SadEagle (not verified)

I just end up with having multiple entire source + build trees of KDE for this very reason, and for the stable version I don't update outside my work area much, so I can actually build & test a backport in a sane amount of time. A DVCS or such would not do anything to help.

Where things like git do help (from my experiments with git-svn) is in managing in-progress work that's not quite ready for commit, so I don't end up with 50 different patches lying around. Although even there, while branches have low computational load, they still have a high cognitive one...

by sebas (not verified)

True. I think the build times are affected quite differently actually.

Also, using git is only part of the picture here. A way to tackle this "switching is expensive" problem within our possibilities is to make it easier to switch to one of those branches. That means documentation, knowing which branch your colleagues are working on, and having a good back- and forward-port tool and workflow. Along with that, we need the right tools to make this cumbersome process as easy as possible.

Switching to a branch at the right point in time actually saves you compilation time, as those branches have fewer and smaller changes compared to the freeze state. Staging the development process also has this effect: less experimenting means fewer and shorter recompiles.
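As an illustration of the back-porting mentioned above, with a DVCS a backport can be a single cherry-pick. A toy sketch in a throwaway repository (branch names and commit messages invented, not KDE's actual workflow):

```shell
set -e
# Toy repo: a stable branch plus trunk carrying a fix to backport.
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email dev@example.org; git config user.name "Some Dev"
echo base > app.c; git add app.c; git commit -qm "initial code"
git branch stable                                   # the released branch
echo fixed > app.c; git commit -qam "fix crash in app.c"   # fix lands in trunk
fix=$(git rev-parse HEAD)

# Backport: switch to the stable branch and replay just that one commit.
git checkout -q stable
git cherry-pick "$fix"
cat app.c    # the fix is now on the stable branch too
```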

by Git (not verified)

Maybe it's a silly thing, but what about setting up an infrastructure for distributed compilation such as Icecream for KDE developers? At least for the more common source trees.

by richlv (not verified)

distcc instances that community could donate to developers ? :)

by Thiago Macieira (not verified)

Distributed compilation doesn't work over the Internet.

You need a local farm, in your LAN, for that to work. The preprocessed C++ source file is often in the order of several megabytes in size. Uploading that to a machine on the Internet would take too long for most people.

Not to mention, of course, that allowing someone to upload a compiler to your machine and run it is an enormous security risk. I'd never place my machine in a compile farm whose users I didn't trust.

by Thomas Zander (not verified)

Using git you can make the tool do what you want without too much effort; it's very flexible.
I set up a workflow for myself using git-svn for KOffice where I always work on a branch, and thus my workflow is geared towards making the compile time as easy on me as possible.
The workflow I have seems to be successful: I compile much less than I did when I was using svn only, and I only have one tree I'm actually compiling in.

I can explain the workflow in another forum, if you want, but I feel confident in saying that in general your worries can to a large extent be managed quite successfully with git.

by SadEagle (not verified)

Regarding "less broken trunk": in my experience, starting from the early 3.0 alpha cycle, trunk has always been stable enough for everyday use, except during pre-4.0 development. So I am not sure it's a real problem.

At first I thought this would be like kernel development, then I thought it was different, but it really isn't.

Linus tries to have his tree in good shape all the time. But his tree would correspond to the KDE release trees. And linux-next = kde-next.
And the various parts of the kernel have maintainers too.

Only question is who is going to be KDE DVCS benevolent dictator for a period of time (KDEDVCSBDFAPOT) ?? Will he/she be elected?

Damn, it would really help if you would release the Akademy videos - or did the dog eat them? Maybe watching Dirk and Sebas would answer my questions.

No benevolent dictator, just stricter rules and better review to get your code into trunk. Review by the community and by people who care (but you should try harder to find someone to review it, not just hope that no one finds issues within two weeks - which would also mean "make it hard to review and it's more likely to go in" - a bad idea).

Very interesting WRT review and vcs tools is Ben's blog about it: http://benjamin-meyer.blogspot.com/2008/08/code-review-should-be-free.html

It confirms that our current review process is about as inefficient as it could possibly be.

We may be using a DVCS: I for sure look forward to easily being able to pull in changes from another Amarok developer that they don't think is ready for trunk.

However KDE is essentially sticking to the centralized model.

by Ed Cates (not verified)

Who was the Antarctic contingent? ;-)

by sebas (not verified)

We're working on that one, actually. :-)

by Liz (not verified)

Not a dev, but a KDE user was in Antarctica for the 2005 season :-)

http://www.fsf1532.com/image/kde_everywhere.png (435Kb)

by zonk (not verified)

Probably the penguins.

by Michael "Antarc... (not verified)

LIARS!!!! LIESLIESLIES!!!! LIES!!!! LIES! Lies! lies! LIES. Lies. lies.

I don't think there was a developer from Antarctica.

by Peppe Bergqvist (not verified)

This is just concerning tools, so it's very basic. And it's just for information, not to start a flame war of any kind.

This is a comparison of git, mercurial and bazaar, and there are some nice pie charts there that show how many languages are being used and how much.

Mercurial uses 4% Lisp (!) for example.

So from a multi-platform point of view I guess Bazaar is the stuff to use, but that's just my guess.

by Oscar (not verified)

I've tried Darcs. It's not bad. Could be something to have a look at.

by kajak (not verified)

bzr is too slow!

by Peppe Bergqvist (not verified)

Of course, a troll with no arguments at all...
And what is "too slow"? Slow at creating repos? Committing? Diffing? Branching?
http://laserjock.wordpress.com/2008/05/08/git-and-bzr-historical-perform... give you some numbers, but that is just for that particular case.

by Ian Monroe (not verified)

bzr svn is hellishly slow and memory hogging. Because of that, despite trying a couple of times, I could never really use it.

So even though it shouldn't be this way, the 'svn' compatibility is very important. KDE devs use Git because they can.

Git is dangerous - I lost part of my work the other day with a failed git rebase interactive command. Mercurial or Bzr might be better, but I don't know, never had the opportunity to use them.

by Shawn Bohrer (not verified)

"Git is dangerous - I lost part of my work the other day with a failed git rebase interactive command."

This is a myth. You have to be fairly skilled to actually lose your work in git. In almost all cases, including yours, you can use the reflog to recover your work.

Check out "git help reflog".
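For the curious, here is a throwaway demonstration of that recovery (repository and file names invented): after a hard reset throws a commit away, the reflog still records it, so you can reset back to it.

```shell
set -e
# Throwaway repo with two commits.
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email you@example.org; git config user.name "You"
echo one > file.txt; git add file.txt; git commit -qm "first commit"
echo two >> file.txt; git commit -qam "second commit"

# Simulate "losing" the latest commit, e.g. after a botched rebase:
git reset -q --hard HEAD~1

# The commit is not gone: the reflog still knows its hash.
lost=$(git reflog | grep "second commit" | head -n1 | cut -d' ' -f1)
git reset -q --hard "$lost"   # recover the "lost" commit
git log -1 --format=%s        # prints: second commit
```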

by Ian Monroe (not verified)

A myth? It happened to me on Monday.

by Ian Monroe (not verified)

Ah ok, so if I was more familiar with the 150 commands I could have recovered my work. Right. ;)

by S.F. (not verified)

For such problems, using git-reflog would have saved you.

by Ian Monroe (not verified)

The fact that Git commands aren't atomic seems like a major problem though. git-rebase gave an error, and rather than resetting to the previous state it sent my latest commits to purgatory.

When people talk about Git being atomic, I suppose they just mean its commits are atomic. There are so many git commands, though, it's no wonder that there's a lack of quality control.

by Evgeniy Ivanov (not verified)

"There are so many git commands though, its no wonder that there's a lack of quality control."
There are a lot of hg and bzr commands too (maybe a bit fewer, but enough). It's not a problem: to work, you don't need all of them.

by Tim (not verified)

I bet it really isn't that slow. Besides, as someone else said, build time is the real annoyance when developing.

I'd rather have a slightly slow but understandable VCS any day.

by Al (not verified)

> I'd rather have a slightly slow but understandable VCS any day.

Why not have both? Try Mercurial and judge for yourself.


by Aaron Seigo (not verified)

... funny you should mention Lisp. it's a great language that has relatively few day to day users. bzr may well be a good tool, but most people using a DVCS are using git. seeing as it is rather debatable as to which is better between git and bzr, current usage is important. a lot of KDE devs are already using git (i'm a casual user myself these days), so the ramp up effort for git should be considerably less than with bzr for that reason alone.

and my main concern is that if/when we switch that it interferes with development as people get used to the switch. any change in the toolset we use will cause discomfort and transitional costs at first. keeping those to a minimum is a priority from my perspective.

by Ian Monroe (not verified)

Which is actually a good reason to find a DVCS system that's easier to use than Git. I've been using Git with git-svn for about a year and a half now, and I still come across confusing situations.

Git has the best SVN support, so there are KDE developers using Git, which can't really be said for any of the other DVCSes. That's a major advantage, no doubt. But it just seems like no one sat down and thought about the UI for Git.
ian@gomashio:~> git-
Display all 133 possibilities? (y or n)

From the little I've seen of Mercurial and Bzr, they seem to be created with more of a 'vision' of how the user will work with them. And I know Bzr is designed from the start to work with centralized repos (e.g. Launchpad), so it seems like it might be a closer fit. Dunno though!

KDE switching to Git would make my life a lot easier, half of the problems I have are related to Git and SVN not playing well together. And being able to actually share branches and work directly with others will be great indeed (if git-svn allowed this, we probably wouldn't even need to switch). So I'm not really against the plan at all, just a little worried.

by Aaron Seigo (not verified)

> Which is actually a good reason to find a DVCS system
> thats easier to use then Git

git is pretty easy to use these days; i found git annoying to use a couple years back but it's quite straightforward now. so the main issue i see is existing familiarity, tools that can help transitions and availability of people who will do the work.

> Bzr is designed from the start to work with centralized repos

git does just fine with this as well; i'm using this model for some side projects with gitosis on the server side to make that all a bit easier as well.
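For illustration, the centralized pattern with plain git might look like the sketch below (paths and names invented; a tool like gitosis would add access control on top). A bare repository plays the role of the central server:

```shell
set -e
work=$(mktemp -d); cd "$work"
# The "server": a bare repository everyone pushes to and pulls from.
git init -q --bare central.git

# A contributor clones it, commits locally, and pushes when ready.
git clone -q central.git checkout1
cd checkout1
git config user.email dev@example.org; git config user.name "Some Dev"
echo hello > README; git add README; git commit -qm "first commit"
git push -q origin HEAD

# A second checkout sees the pushed work, just as with a centralized VCS.
cd ..
git clone -q central.git checkout2
cat checkout2/README   # hello
```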

by Peter Plys (not verified)

I see your point, but if we had to make our decisions based on current usage, we would never have chosen CMake, which we now know was the best option.

Besides, Launchpad will be released as free software within a year[0]. As apparently it is much better than other hosting services such as Sourceforge or Google Code, we can expect the number of Bazaar users to increase dramatically.

MySQL recently migrated to Bazaar[1]; some other big projects are using Mercurial.

IMHO, we should choose which DVCS to use based on actual features, ease of use, documentation, future prospects, etc., and not on the current number of users.

This page seems to be a good starting point:

[0] http://arstechnica.com/journals/linux.ars/2008/07/23/mark-shuttleworth-l...
[1] http://blogs.mysql.com/kaj/2008/06/19/version-control-thanks-bitkeeper-w...

by Thiago Macieira (not verified)

We will choose the tool based on having developers with expertise to execute the conversion and then maintain the repository. That also includes teaching and helping others.

If you inspect the traffic for the kde-scm-interest mailing list, you'll see there's only one contender.

If anyone is interested in a different DVCS tool, you had better catch up with the last 6 months of work I have put into Git.


by jerry (not verified)

You've shown the *technical* merits of git, and based on the post above there is *emotional* weight behind this direction, but the decision should be made from a *global* perspective.
Quoting the original article:
KDE 0.0 to 3.5 took 420,000 new revisions in 8 years
KDE 3.5 to 4.0 took 300,000 new revisions in 2 years
What allowed such a progression, the technology of the source code control system, or the socially viable concept of "wanting to contribute"?
The same concept applies right up the life cycle.
If changes can be tested by 10 people, that's great, but if changes can be tested by 10,000 people that's... well, that's open source.

by Thomas Zander (not verified)

No, I think you misread the "emotional" part. The point Thiago made is simply that if you think something other than git should be a contender, start doing the legwork. KDE will end up with the tool that actually has a conversion of the >800,000 revisions to its revision model.

So, to back your favorite horse, get on it and make it move.

by Aaron Seigo (not verified)

let's assume that Launchpad does get released as Free software, that this does result in an increase of bzr users ... how does that event happening sometime in the next year help change either the situation today or give us a reasonable guarantee that it will even catch up with git usage? it doesn't.

moreover, this is not simply about the current number of users globally, it's about the current number of users within the KDE contributor community. which matters to lower the associated costs of transitioning and to ensure we actually have the manpower to do the transition in the first place.

as for features, git has everything i need and then some.

by Ian Monroe (not verified)

I actually don't see the relevance of Launchpad at all. It's not like KDE is going to switch to it, or that it'd be impossible for Launchpad to have an "insert-SCM-here" backend.

by Peppe Bergqvist (not verified)

Lisp is a nice language; I have used it myself in my education. The possibility to alter a running program is very nice and of great importance when dealing with parsing natural language and building phrase trees (for example). So Lisp is a great language, but with few users, as you say; maybe this can hinder some development of git (I don't know, since I haven't checked where the code occurs).
Maybe my point is: the less mixture of different languages, the easier to maintain, or something.

by Thomas Capricelli (not verified)

Mercurial only uses lisp for the emacs integration. Which, of course, makes sense.

by Hannes Hauswedell (not verified)

Hm... I didn't really understand the new system completely, but the people in charge sound like they know what they are doing.

I hope that things will work out as planned, though. Changing the development model completely could also scare people away and cause breakage. The transition period will definitely slow KDE 4 development down for a while.

Maybe such a big change should not be attempted before KDE 4 gets "more done" and more widely accepted. I guess that a year from now KDE 4 will have succeeded KDE 3 completely, also (3rd-party-)application-wise. That might be a good point in time to do it...

by Ian Monroe (not verified)

Arguably SVN is slowing down development now, as explained in the article.

There will be a transition cost of course. Just keep in mind the current "branching sucks" cost that we are paying daily now.

by Paul Gideon Dann (not verified)

Yay! This has me really excited. I've always been a great fan of the "stable-trunk" model. It's so much cleaner. The Linux and Git projects both have a policy of keeping master more stable than the most recent release (anything unstable remains in "next"). This provides great confidence in the master branch, and would certainly give me confidence to start hacking away, unlike KDE's current trunk, which I've been unfortunate enough to find broken on more than one occasion :(

I've been an avid Git lover ever since I first tried it about a year ago. Before someone claims Git isn't cross-platform, I'll jump in to say that I use Git with Linux, MacOS X, and Windows on a daily basis, and I can testify to the fact that Git works just fine in Windows now :) Oh, and Git is really not hard to learn at all; it just takes a little time to get started. Everyone should be taking time to learn the tools they use anyway if they want to be efficient. Learn to touch-type and you'll be more efficient; learn Git and your code will love you :p

by Morty (not verified)

From what I see, both as a user getting updates and from reading comments by developers, a problem seems to be the development schedule. A 6-month schedule sounds like it's neither here nor there: it's both too short and too long, depending on what you work on and how it fits with the current development phase. I think a more adaptable schedule would work better, using a policy of dual release schedules, but still retaining the freeze periods, making it rely less on developer discipline.

Something like a 4-month schedule, but with the option for selected modules to skip to an 8-month schedule, instead releasing the latest from the previous stable branch. This would give developers a shorter time to market when the development process is at a stage where small incremental changes make the most sense, and still the ability to calmly work on bigger things when the need is present.

by markc (not verified)

Well if the "always summer in trunk" principle is adopted then, in theory, one could take a snapshot of trunk any old time and release binaries that would most likely at least build and run. The necessity of a release schedule diminishes in importance. I'd say a 3 month official release schedule could be feasible where the last 2 weeks is focused on bug-fixes-only and intense testing of trunk builds. If the released tarballs prove to be a disaster then it would only take a week or two to release a fixed followup release.

Part of the problem with longer release cycles is that the buildup of future changes during the stabilizing phase become so intense that the impedance mismatch between the previous and next versions, particularly at the crossover point just after a release, is in itself a hugely destabilizing factor. With more frequent release there is not such a great leap between the stabilized code and the pent up changes about to rush into the next release cycle leading to less time spent patching things up after each release and more time developing code against a well known and highly usable trunk up until the next bugs-only-freeze stage.

by Morty (not verified)

As it is, KDE trunk nearly always at least builds and runs, so that would not change much. After all, it's the simplest form of developer discipline: don't commit things that don't build. The major problem with "always summer in trunk" is that it relies on more developer discipline when it comes to stopping and concentrating on bug fixing. A freeze period makes this less of an issue, as it to some extent forces it. And it's basically what you suggest anyway, just without calling it what it is.

Part of the problem with short release cycles is that there are features you don't have time to finish and stabilize to a satisfying degree, either leading to rushing them in before release, creating lots of problems, or leaving them in a state where they never get committed, as there is never time to sync and stabilize them with head in a sane way.

Since KDE is a very large project, there will always be parts that are in a state where it's feasible to do small incremental changes fitting a short release cycle, and other parts where larger, more time-consuming changes are needed. With a dual release cycle, developers of the different modules can decide and plan their development efforts accordingly.

To make up an example: the Plasma developers may have lots of small features and fixes that are possible to stabilize quickly, and they want to push those out to users as soon as possible, while the PIM developers may want to port the whole module to Akonadi and perhaps do some refactoring, needing more time to implement it correctly and stabilize it compared to the Plasma developers. A dual release schedule would accommodate this, taking into account the different needs of both cases.