Planet KDE

Snapping KDE Applications

Fri, 2016/12/02 - 2:44pm

This is largely based on a presentation I gave a couple of weeks ago. If you are too lazy to read, go watch it instead😉

For 20 years KDE has been building free software for the world. As part of this endeavor, we created a collection of libraries to assist in high-quality C++ software development as well as building highly integrated graphic applications on any operating system. We call them the KDE Frameworks.

With the recent advance of software bundling systems such as Snapcraft and Flatpak, KDE software maintainers are, however, a bit on the spot. As our software builds on such a vast collection of frameworks and supporting technology, the individual size of a distributable application can be quite substantial.

When we tried to package our calculator KCalc as a snap bundle, we found that even a relatively simple application like this makes for a good 70 MiB snap in a working state (most of this is the graphical stack required by our underlying C++ framework, Qt).
Since then a lot of effort has gone into devising a system that lets us deal with this more efficiently. We now have a reasonably suitable solution on the table.

The KDE Frameworks 5 content snap.

A content snap is a special bundle meant to be mounted into other bundles for the purpose of sharing its content. This allows us to share a common core of libraries and other content across all applications, making the individual applications just as big as they need to be. KCalc is only 312 KiB without translations.

The best thing is that besides some boilerplate definitions, the snapcraft.yaml file defining how to snap the application is like a regular snapcraft file.

Let’s look at how this works by example of KAlgebra, a calculator and mathematical function plotter:

Any snapcraft.yaml has some global attributes we’ll want to set for the snap

name: kalgebra
version: 16.08.2
summary: ((TBD))
description: ((TBD))
confinement: strict
grade: devel

We’ll want to define an application as well. This essentially allows snapd to expose and invoke our application properly. For the purpose of content sharing we will use a special start wrapper called kf5-launch that allows us to use the content shared Qt and KDE Frameworks. Except for the actual application/binary name this is fairly boilerplate stuff you can use for pretty much all KDE applications.

apps:
  kalgebra:
    command: kf5-launch kalgebra
    plugs:
      - kde-frameworks-5-plug # content share itself
      - home # give us a dir in the user home
      - x11 # we run with xcb Qt platform for now
      - opengl # Qt/QML uses opengl
      - network # gethotnewstuff needs network IO
      - network-bind # gethotnewstuff needs network IO
      - unity7 # notifications
      - pulseaudio # sound notifications

To access the KDE Frameworks 5 content share we’ll then want to define a plug our application can use to access the content. This is always the same for all applications.

plugs:
  kde-frameworks-5-plug:
    interface: content
    content: kde-frameworks-5-all
    default-provider: kde-frameworks-5
    target: kf5

Once we have all that out of the way we can move on to actually defining the parts that make up our snap. Parts are, for the most part, build instructions for the application and its dependencies. With content shares there are two boilerplate parts you want to define.

The development tarball is essentially a fully built KDE Frameworks tree including development headers and CMake configs. The tarball is packed by the same technology that builds the actual content share, so this allows you to build against the correct versions of the latest share.

kde-frameworks-5-dev:
  plugin: dump
  snap: [-*]
  source:

The environment rigging provides the kf5-launch script we previously saw in the application’s definition; we’ll use it to execute the application within a suitable environment. It also gives us the directory for the content share mount point.

kde-frameworks-5-env:
  plugin: dump
  snap: [kf5-launch, kf5]
  source:

Lastly, we’ll need the actual application part, which simply declares that the dev part must be staged first, and then builds the tarball with boilerplate CMake config flags.

kalgebra:
  after: [kde-frameworks-5-dev]
  plugin: cmake
  source:
  configflags:
    - "-DKDE_INSTALL_USE_QT_SYS_PATHS=ON"
    - "-DCMAKE_INSTALL_PREFIX=/usr"
    - "-DCMAKE_BUILD_TYPE=Release"
    - "-DENABLE_TESTING=OFF"
    - "-DBUILD_TESTING=OFF"
    - "-DKDE_SKIP_TEST_SETTINGS=ON"

Putting it all together we get a fairly standard snapcraft.yaml with some additional boilerplate definitions to wire it up with the content share. Please note that the content share is using KDE neon’s Qt and KDE Frameworks builds, so, if you want to try this and need additional build-packages or stage-packages to build a part, you’ll want to make sure that KDE neon’s User Edition archive is present in the build environment’s sources.list: deb xenial main. This is going to get a more accessible centralized solution for all of KDE soon™.

name: kalgebra
version: 16.08.2
summary: ((TBD))
description: ((TBD))
confinement: strict
grade: devel
apps:
  kalgebra:
    command: kf5-launch kalgebra
    plugs:
      - kde-frameworks-5-plug # content share itself
      - home # give us a dir in the user home
      - x11 # we run with xcb Qt platform for now
      - opengl # Qt/QML uses opengl
      - network # gethotnewstuff needs network IO
      - network-bind # gethotnewstuff needs network IO
      - unity7 # notifications
      - pulseaudio # sound notifications
plugs:
  kde-frameworks-5-plug:
    interface: content
    content: kde-frameworks-5-all
    default-provider: kde-frameworks-5
    target: kf5
parts:
  kde-frameworks-5-dev:
    plugin: dump
    snap: [-*]
    source:
  kde-frameworks-5-env:
    plugin: dump
    snap: [kf5-launch, kf5]
    source:
  kalgebra:
    after: [kde-frameworks-5-dev]
    plugin: cmake
    source:
    configflags:
      - "-DKDE_INSTALL_USE_QT_SYS_PATHS=ON"
      - "-DCMAKE_INSTALL_PREFIX=/usr"
      - "-DCMAKE_BUILD_TYPE=Release"
      - "-DENABLE_TESTING=OFF"
      - "-DBUILD_TESTING=OFF"
      - "-DKDE_SKIP_TEST_SETTINGS=ON"

Now, to install this, we’ll need the content snap itself. Here is the content snap. To install it, a command like sudo snap install --force-dangerous kde-frameworks-5_*_amd64.snap should get you going. Once that is done, you can install the kalgebra snap. If you are a KDE developer and want to publish your snap on the store, get in touch with me so we can get you set up.

The kde-frameworks-5 content snap is also available in the edge channel of the Ubuntu store. You can try the games kblocks and ktuberling like so:

sudo snap install --edge kde-frameworks-5
sudo snap install --edge --devmode kblocks
sudo snap install --edge --devmode ktuberling

If you want to be part of making the world a better place, or would like a KDE-themed postcard, please consider donating a penny or two to KDE.


Wiki, what’s going on? (Part 18-Making it real)

Thu, 2016/12/01 - 10:15pm



WikiToLearn1.0 action plan is getting real


Release the new version and start working to improve it: done.

Ok, done! Now let’s start talking about it, spam it, find new users and grow more and more!

Yes, more or less this is the work we are doing in these weeks with our team.

Unimib is funding posters, which we are using to start a new promotional campaign for WikiToLearn! The promo team is working on these new info-graphics and you are going to love them. Unimib students, stay tuned and get ready to spot our posters all around you.

We are also working hard on both institutional and more informal contacts: new collaborators are coming. The team is organizing and taking part in new events in the near future; stay tuned, more people are going to talk about us and you’ll appreciate our efforts! We are also planning a series of new talks to present the new release and to get more and more people involved in our project.

We are also working on agreements with different universities and institutional centers such as GARR, Imperial College and UCL.

Christmas is coming, if you have ideas to celebrate it with our community contact us! WikiToLearn1.0 is going to celebrate its first XMas 😉

C’mon, new year with the new WikiToLearn is coming: the moment is now!

Share your knowledge, share freedom!



The article Wiki, what’s going on? (Part 18-Making it real) first appeared on Blogs from WikiToLearn.

KDevelop 5.0.3 released

Thu, 2016/12/01 - 9:00pm


Today, we are happy to announce the release of KDevelop 5.0.3, the third bugfix and stabilization release for KDevelop 5.0. An upgrade to 5.0.3 is strongly recommended to all users of 5.0.0, 5.0.1 or 5.0.2.

Together with the source code, we again provide a prebuilt one-file-executable for 64-bit Linux, as well as binary installers for 32- and 64-bit Microsoft Windows. You can find them on our download page.

List of notable fixes and improvements since version 5.0.2:

  • Fix a performance issue which would lead to the UI becoming unresponsive when lots of parse jobs were created (BUG: 369374)
  • Fix some behaviour quirks in the documentation view
  • Fix a possible crash on exit (BUG: 369374)
  • Fix tab order in problems view
  • Make the "Forward declare" problem solution assistant only pop up when it makes sense
  • Fix GitHub authentication handling (BUG: 372144)
  • Fix Qt help jumping to the wrong function sometimes
  • Windows: Fix the MSVC startup script not working in some environments
  • kdev-python: fix some small issues in the standard library info

The 5.0.3 source code and signatures can be downloaded from here.

sbrauch Thu, 12/01/2016 - 22:00

Long time no write

Thu, 2016/12/01 - 12:01pm

My new job

I’m pretty bad at blogging, so as usual there’s a ton of stuff that has happened since last time I blogged.

In KDE-land I’ve mostly been helping out with porting things to KDE Frameworks and finishing up ports in progress. A lot of smaller stuff like krename and kregexpeditor, but also helping to finish the porting of e.g. okular, ktorrent and konsole. And of course also Filelight.

I also worked a bit on new features, like URL hints in Konsole. Press a configurable key combo, numbers show up over recognized links, and press a number to open the link.

I also discovered an old patch from Adam Treat to make Konsole recognize local files in addition to just URLs, which I cleaned up and integrated.

I also tried to improve the look of search results, but making something that looks good with all color schemes is really hard, so I’m kind of stuck for now. If anyone has any ideas on how to do this properly it would be appreciated. But the last iteration looks like this:

Also a ton of smaller stuff in Konsole, like supporting the OSC 7 escape sequence instead of just polling /proc to figure out the current path, which should improve power saving a tiny bit. And a lot of stuff in various other KDE applications (and other applications, like the thermald Qt interface) and libraries that I don’t remember.


Unfortunately I haven’t had as much time for KDE stuff as usual, as I got a new job last year at a pretty cool company, which takes up a lot of my time. We weren’t public until yesterday, so I couldn’t really write much about it, but now I can. We’re making a digital notebook: a tablet device with an e-paper display from E Ink and a digitizer from Wacom. The use-cases we’re targeting are reading, writing and sketching, so it is a pretty specialized device without a web browser or “social” integration and Facebook support and whatnot.

As we’re using Linux and Qt for pretty much everything above the kernel I’m really thankful for the KDE Frameworks, having such a nice collection of high-quality extra libraries makes my life much easier. We’re even using Qt for the HW testing application used in the factory (pictured at the top of the page).

And before anyone asks the obvious question; we don’t have the resources to officially support third-party development, so no official SDK. But I do plan on releasing the toolchain and there will be a way to enable SSH access over USB, so people can play with their own device. This should also allow us to use (L)GPL3 code on the device, using a non-ancient version of bash is nice.

The official page with more information is at


KDevelop: Seeking maintainer for Ruby language support

Wed, 2016/11/30 - 11:39pm


just a short heads-up that KDevelop is seeking a new maintainer for the Ruby language support. Miquel Sabaté did an amazing job maintaining the plugin in recent years, but would like to step down as maintainer because he lacks the time to continue looking after it.

Here's an excerpt from a mail Miquel kindly provided, to make it easier for newcomers to follow-up on his work in kdev-ruby:

As you might know, the development of kdev-ruby has stalled and the KDevelop team is looking for developers who want to work on it. The plugin is still considered experimental, and that's because there is still plenty of work to be done. What has been done so far:

  • The parser is based on the one that can be found on MRI. That being said, it's based on an old version of it so you might want to update it.
  • The DUChain code is mostly done but it's not stable yet, so there's quite some work to be done on this front too.
  • Code completion mostly works but it's quite basic.
  • Ruby on Rails navigation is done and works.

There is a lot of work to be done and I'm honestly skeptical whether this approach will end up working anyways. Because of this skepticism and the fact that I was using another editor, I ended up abandoning the project and thus kdev-ruby was no longer maintained by anyone.

If you feel that you can take the challenge and you want to contribute to kdev-ruby, please reach out to the KDevelop team. They are extremely friendly and will guide you on the process of developing this plugin.

Again, thanks for all your work Miquel, you will be missed!

If you're interested in that kind of KDevelop plugin development, please get in touch with us!

More information about kdev-ruby here:

Finding a valid build order for KDE repositories

Wed, 2016/11/30 - 11:13pm

KDE has lately been growing quite a bit in repositories, and it's not always easy to tell what needs to be built first: do I build kdepim-apps-libs or pimcommon first?

A few days ago I was puzzled by the same question and realized we have the answer in the dependency-data-* files from the kde-build-metadata repository.

They define what depends on what, so all we need to do is build a graph from those dependencies and get a valid build order out of it.

Thankfully Python already has a module for working with graphs, so it was not that hard to write.
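The core of such a script is just a topological sort. Here is a minimal sketch using the standard library's graphlib module (Python 3.9+, so not necessarily what the author used); the "dependent: dependency" line format in the example data is an assumption — adjust the parser to the real dependency-data-* files:

```python
# Sketch: derive *a* valid build order from dependency lines via a
# topological sort. The "dependent: dependency" line format here is
# an assumption, not the verified kde-build-metadata syntax.
from graphlib import TopologicalSorter

def build_order(lines):
    graph = {}  # repo -> set of repos it depends on
    for line in lines:
        line = line.split("#", 1)[0].strip()  # drop comments and blanks
        if not line or ":" not in line:
            continue
        dependent, dependency = (part.strip() for part in line.split(":", 1))
        graph.setdefault(dependent, set()).add(dependency)
        graph.setdefault(dependency, set())
    # static_order() yields each repo only after all of its dependencies
    return list(TopologicalSorter(graph).static_order())

if __name__ == "__main__":
    data = [
        "kdepim-apps-libs: pimcommon",
        "pimcommon: akonadi",
    ]
    print(build_order(data))  # dependencies come before their dependents
```

graphlib raises CycleError if the dependency data contains a cycle, which is a useful sanity check on the metadata itself.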

So say you want to know a valid build order for the stable repositories based on kf5-qt5

Here it is

Note I've been saying *a* valid build order, not *the* valid build order, since there are various valid orders, because not every repo depends on every other repo.

Now I wonder, does anyone else find this useful? And if so, to which repository do you think I should commit such a script?

Qt Creator 4.2 RC1 released

Wed, 2016/11/30 - 1:55pm

We are happy to announce the release of Qt Creator 4.2 RC1.

Since the release of the Beta, we’ve been busy with polishing things and fixing bugs. Just to name a few:

  • We fixed the run button spuriously staying disabled after parsing QMake projects.
  • Qt Creator is no longer blocked while the iOS Simulator is starting up.
  • We added preliminary support for MSVC2017 (based on its RC).

For an overview of the new features in 4.2 please head over to the Beta release blog post. See our change log for a more detailed view on what has changed.

Get Qt Creator 4.2 RC1

The opensource version is available on the Qt download page, and you find commercially licensed packages on the Qt Account Portal. Please post issues in our bug tracker. You can also find us on IRC on #qt-creator on, and on the Qt Creator mailing list.

The post Qt Creator 4.2 RC1 released appeared first on Qt Blog.

Krita 3.1 Release Candidate

Wed, 2016/11/30 - 10:12am

A week later than planned due to illness, we are nevertheless happy to release the first release candidate for Krita 3.1 today. There are a number of important bug fixes, and we intend to fix a number of other bugs in time for the final release.

  • Fix a crash when saving a document that has a vector layer to anything but the native format (regression in beta 3)
  • Fix exporting images using the commandline on Linux
  • Update the OSX QuickLook plugin to use the right thumbnail sizes
  • Improved zoom menu icons
  • Unify colors on all svg icons
  • Fix tilt-elevation brushes to work properly on a rotated or mirrored canvas
  • Improve drawing with the stabilizer enabled
  • Fix isotropic spacing when painting on a mirrored canvas
  • Fix a race condition when saving
  • Fix multi-window usage: the tool options palette would only be available in the last opened window; now it’s available everywhere
  • Fix a number of memory leaks
  • Fix selecting the saving location for rendering animations (there are still several bugs in that plugin, though — we’re on it!)
  • Improve rendering speed of the popup color selector

You can find out more about what is going to be new in Krita 3.1 in the release notes. The release notes aren’t finished yet, but take a sneak peek all the same!


Note for Windows users: if you encounter crashes, please follow these instructions to use the debug symbols so we can figure out where Krita crashes.


A snap image for the Ubuntu App Store is available in the beta channel.

OSX Source code

Fuzzing Qt for fun and profit

Tue, 2016/11/29 - 12:50pm

Many KDAB engineers are part of the Qt Security Team. The purpose of this team is to get notified of security-related issues, and then decide the best course of action for the Qt project.

Most of the time, this implies identifying the problem, creating and submitting a patch through the usual Qt contribution process, waiting for it to be merged in all the relevant branches, and then releasing a notice to the users about the extent of the security issue. We also work together with downstreams, such as our customers, Linux distributions and so on, in order to minimize the risks for Qt users of exposing the security vulnerability.

However, that’s only part of the story. As part of the security team, we can’t simply wait for reports to fall in our laps; we also need to have a proactive approach and constantly review our code base and poke it in order to find problems. For that, we use a variety of tools: the excellent Coverity Scan service; the sanitizers available in GCC and Clang; clazy, maintained by KDAB’s engineer Sérgio Martins; and so on.

Note that all these tools help catch any sorts of bugs, not only the security-related ones. For instance, take a look at the issues found and fixed by looking at the Undefined Behavior Sanitizer’s reports, and the issues fixed by looking at Coverity Scan’s reports.

Today I want to tell you a little more about one of the tools used to test Qt’s code: the American Fuzzy Lop, or AFL to friends.


What is AFL? It’s a fuzzer: a program that keeps changing the input to a test in order to make it crash (or, in general, misbehave). This “mutation” of the input goes on forever — AFL never ends, just keeps finding more stuff, and optimizes its own searching process.

AFL gained a lot of popularity because:

  • it is very fast (it instruments your binaries);
  • it uses state-of-the-art algorithms to mutate the input in ways that maximize the effect on the target program;
  • the setup is immediate;
  • it has a very nice text-based UI.

The results speak for themselves: AFL has found security issues in all major libraries out there. Therefore, I decided to give it a try on Qt.

The setup

Setting up AFL is straightforward: just download it from its website and run make. That’s it — this will produce a series of executables that will act as a proxy for your compiler, instrumenting the generated binaries with information that AFL will need. So, after this step, we will end up with afl-gcc, afl-g++ and so on.

You can go ahead and build an instrumented Qt. If you’ve never built Qt from source, here’s the relevant documentation. On Unix systems it’s really a matter of running configure with some options, followed by make and optionally make install. The problem at this step is making Qt use AFL’s compilers, not the system ones. This turns out to be very simple, however: just export a few environment variables, pointing them to AFL’s binaries:

export CC=/path/to/afl-gcc
export CXX=/path/to/afl-g++
./configure ...
make

And that’s it, this will build an instrumented Qt. (A more thorough solution would involve creating a custom mkspec for qmake; this would have the advantage of making the final testcase application also use AFL automatically. For this task, however, I felt it was not worth it.)

Creating a testcase

What you need here is to create a very simple application that takes an input file from the command line (or stdin) and uses it to stress the code paths you want to test.

Now, when looking at a big library like Qt, there are many places where Qt reads untrusted input from the user and tries to parse it: image loading, QML parsing, (binary) JSON parsing, and so on. I decided to give a shot at binary JSON parsing, feeding it with AFL’s mutated input. The testcase I built was straightforward:

#include <QtCore>

int main(int argc, char **argv)
{
    QCoreApplication app(argc, argv);
    QFile file(app.arguments().at(1));
    if (!file.open(QIODevice::ReadOnly))
        return 1;
    QJsonDocument jd = QJsonDocument::fromBinaryData(file.readAll());
    return 0;
}

Together with the testcase, you will also need a few test files to bootstrap AFL’s finding process. These files should be extremely small (ideally, 1-2KB at maximum) to let the fuzzer do its magic. For this, just dump a few interesting files somewhere next to your testcase. I’ve taken random JSON documents, converted them to binary JSON and put the results in a directory.

Running the fuzzer

Once the testcase is ready, you can run it into the fuzzer like this:

afl-fuzz -m memorylimit \
         -t timeoutlimit \
         [master/slave options] \
         -i testcases/ \
         -o findings/ \
         -- ./test @@

A few explanatory remarks:

  • The testcases directory contains your reference input files, while the findings of the fuzzers will be written in findings.
  • To avoid blowing up your system, AFL sets very strict limits for execution of your test: it is allowed to allocate at most memorylimit megabytes of virtual memory and it is allowed to run for at most timeoutlimit milliseconds. You will typically want to raise the memory limit from its default (50MB) to something bigger, depending on your system and on the test.
  • One instance of afl-fuzz is single threaded; in order to maximize the search throughput on a machine with multiple cores/CPUs, you must manually launch it multiple times with the same -i and -o arguments. You should also give each instance a unique name and, if you want, elect one instance to do a deterministic search rather than a random one. This is all expressed through the master/slave options: pass to one instance the -M fuzzername option, and to all the others pass the -S fuzzername option. (All the fuzzernames must be unique).
  • Last but not least, @@ gets replaced by the name of a file generated by AFL, containing the mutated input.

For reference, I’ve launched my master like this:

afl-fuzz -m 512 -t 20 -i testcases -o findings-json -M fuzzer00 -- ./afl-qjson @@

The output is a nice colored summary of what’s going on, updated in real time:

AFL running over a testcase.


Now: go do something else. This is supposed to run for days! So remember to launch it in a screen session, and maybe launch it via nice so that it runs with a lower priority.


After running for a while, the first findings started to appear: inputs that crashed the test program or made it run for too long. Once AFL sees such inputs, it will save them for later inspection; you will find them under the findings/fuzzername subdirectories:

findings-json/fuzzer00/crashes/id:000000,sig:06,src:000445,op:arith8,pos:168,val:+6
findings-json/fuzzer00/crashes/id:000001,sig:11,src:000445,op:arith8,pos:168,val:+7
findings-json/fuzzer00/crashes/id:000002,sig:11,src:000449,op:arith8,pos:196,val:+6
findings-json/fuzzer00/crashes/id:000003,sig:11,src:000489,op:flip1,pos:435
findings-json/fuzzer01/crashes/id:000000,sig:06,src:000526,op:havoc,rep:2
findings-json/fuzzer01/crashes/id:000001,sig:11,src:000532,op:havoc,rep:2
findings-json/fuzzer01/crashes/id:000002,sig:06,src:000533,op:havoc,rep:4

If you’re lucky (well, I guess it depends how you look at it…), you will end up with inputs that indeed crash your testcase. Time to fix something!

You may also get false positives, in the form of crashes caused by the testcase running out of memory. Remember that AFL imposes a strict memory limit on your executable, so if your testcase allocates too much memory and does not know how to recover from OOM, it will crash. If you see many inputs crashing under AFL but not crashing when run normally, your testcase may be behaving properly but simply running out of memory, and increasing the memory limit passed to AFL will fix this.

The sig part in the name of each saved input should give you a hint, telling you which Unix signal caused the crash. In the listing above, signal number 11 is a SIGSEGV, which is indeed a problem. The signal 06 is SIGABRT (that is, an abort), which was generated due to running out of memory.

To reproduce this last case, just manually run the test over that input, and check that it doesn’t misbehave; then rerun it, but this time limiting its available memory via ulimit -v memory_available_in_kilobytes. If the testcase works normally but crashes under a stricter ulimit, it’s likely that you’re in an out-of-memory scenario. This may or may not require a fix in your code; it really depends whether it makes sense for your application/library to recover from an OOM.
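This rerun-under-a-cap check can also be scripted. A minimal sketch, assuming Linux (RLIMIT_AS semantics vary on other platforms); the command and input path you pass in would be your own testcase binary and one of the saved crash inputs:

```python
# Sketch: rerun a testcase under a virtual-memory cap, mimicking the
# limit afl-fuzz applies with -m. Assumes Linux; the command you pass
# (e.g. ["./afl-qjson", "findings-json/fuzzer00/crashes/id:..."]) is
# a placeholder for your own testcase and saved input.
import resource
import subprocess

def run_with_memory_limit(cmd, mebibytes):
    """Run cmd with RLIMIT_AS capped to `mebibytes`; return its exit status."""
    def cap():
        limit = mebibytes * 1024 * 1024
        resource.setrlimit(resource.RLIMIT_AS, (limit, limit))
    return subprocess.run(cmd, preexec_fn=cap).returncode
```

If the command succeeds uncapped but fails with the cap in place, you are likely looking at an out-of-memory case rather than a genuine memory-safety bug.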

Fixing upstream

After reporting the findings to the Security Team, it was a matter of a few days before a fix was produced, tested and merged into Qt. You can find the patches here and here.

Tips and tricks

If you want to play with AFL, I would recommend you to do a couple of things:

  • Set your CPU scaling governor to “performance”. This is for a couple of reasons: it makes no sense for the kernel to try to throttle down your CPUs if AFL is running; and it is actually a bad thing because it interferes with AFL measurements. AFL complains about this, so keep it happy and disable “powersave” or “ondemand” or similar governors.
  • Use a ramdisk for the tests. AFL needs to write a new input file every time it runs your application; for the JSON testcase above, AFL was achieving about 1000 executions/second/core. Each of these runs needs a new test file as input; in addition to that, AFL needs to write stuff for its own bookkeeping. This will put your disk under very considerable stress, possibly even wear it out. Now, any modern filesystem will still flush data to disk only a few times every second (at most), but still, why hit the disk at all? One can simply create a ramdisk, and run AFL in there:
    $ mkdir afl
    # mount -t tmpfs -o size=1024M tmpfs afl/
    $ cd afl/
    $ afl-fuzz -i inputs -o findings ...
  • Do not let this run on a laptop or some other computer which may overheat. AFL is tremendously resource intensive and runs for days. If you want to get liquid cooling for your workstation, this is the perfect excuse.

Fuzzing is an excellent technique for testing code that needs to accept untrusted inputs. It is straightforward to set up and run, requires no modifications to the tested code, and can find issues in a relatively short timespan. If your application features parsers (especially of binary data), consider keeping AFL running over it for a while, as it may discover some serious problems. Happy fuzzing!

About KDAB

KDAB is a consulting company dedicated to Qt and offering a wide variety of services and providing training courses in:

KDAB believes that it is critical for our business to invest in Qt3D and Qt in general, to keep pushing the technology forward and to ensure it remains competitive.

The post Fuzzing Qt for fun and profit appeared first on KDAB.

Kdenlive’s first bug squashing day

Mon, 2016/11/28 - 11:59pm

Kdenlive 16.12 will be released very soon, and we are trying to fix as many issues as possible. This is why we are organizing a bug squashing day this Friday, 2nd of December 2016, between 9 am and 5 pm (Central European Time – CET).

Kdenlive needs you

There are several ways you can help us improve this release, depending on your skills or interests. During the bug squashing day, Kdenlive developers will be reachable on IRC at, channel #kdenlive to answer your questions. A collaborative notepad has also been created to coordinate the efforts.

If you have some interest / knowledge in coding:
You can download Kdenlive’s source code and find instructions on our wiki. We will also be available on Friday on IRC to help you set up your development environment. You can then select an ‘easy bug‘ from the notepad list and look at the code to try to fix it. Feel free to ask your questions on IRC; the developers will guide you through the process, so that you can get familiar with the parts of the code you will be looking at.

If you are a user and encounter a bug:
You can help us by testing the Kdenlive 16.12 RC version. Our easy-to-install AppImage and snap packages will be updated on the 1st of December with the latest code (Ubuntu users can also use our PPA). This will allow you to install the latest version without messing with your system. You can then check whether a bug is still there in the latest version, or let us know if it is fixed.

So feel free to join us this Friday; this is your chance to help the world of free software video editing!

For the Kdenlive team,
Jean-Baptiste Mardelle

Mentoring for Google Code-in – WikiToLearn

Mon, 2016/11/28 - 5:35pm

Google Code-in

Google Code-in has just begun. I’ll be mentoring this time.🙂

If you know any pre-university students who are interested in computers or open source, please do inform them about this. Tasks range across coding, documentation, training, outreach, research, quality assurance and user interface. Also, students earn prizes for successful completion of tasks.

What is Google Code-in ?

Google Code-in is a contest by Google to introduce pre-university students (ages 13-17) to open source software development. Since 2010, over 3200 students from 99 countries have completed work in the contest.

What I’ll be doing ?

I’ll be mentoring for tasks under WikiToLearn, KDE organization.
I have published a task related to the WikiToLearn community: What can I do for WikiToLearn

I’ll be helping students with code and design for this task.

I have few other tasks in my mind. I may publish them as we move on (based on our progress).

Why I’m doing this ?

Well, I just love open source and like helping others get into FOSS. And WikiToLearn, KDE is a great community to work with.
I strongly believe in its philosophy – “Knowledge only grows if shared”. It feels good to help the younger generation get into the community so that our community grows big.

Join WikiToLearn now and contribute however you can.🙂

KDAB and Meiller – Tipper Truck App

Mon, 2016/11/28 - 4:54pm

Design, Technical Excellence and Superb User Experience

Why does a tipper truck need an app? Meiller is the leading manufacturer of tippers in Europe. KDAB software developers and UI/UX designers worked with Meiller to create a mobile app that interacts with embedded hardware on the truck, allowing drivers to diagnose and fix problems – even when on the road. KDAB shows us how technical excellence and stunning user experience go hand in hand.

The post KDAB and Meiller – Tipper Truck App appeared first on KDAB.

Qt World Summit 2016 Webinar Series – Christmas Sessions

Mon, 2016/11/28 - 1:13pm


‘Tis the season to be jolly and as always we are just trying to be Qt /kjuːt/. We just keep on giving and giving and here is another present for you. We are hosting webinars based on the breakout sessions from The Qt World Summit 2016. So, grab a cup of cocoa and sign up for our December Tuesday webinars where you can join our R&D developers online for technical sessions that will keep your computer warm throughout 2017. The best thing of all – even if you can’t make it online – by signing up Santa will bring the recorded session to you.

Introducing Qt Visual Studio Tools

December 6th at 5 pm CET, by Maurice Kalinowski

New possibilities with Qt WebEngine

December 13th at 5 pm CET, by Allan Sandfeld Jensen

Qt Quick Scene Graph Advancements in Qt 5.8 and Beyond 

December 20th at 10 am CET (Rescheduled from November 15th), by Laszlo Agocs


Also, stay tuned for details on upcoming webinars in January!

Make sure to check our events calendar for the full list of Qt-related events delivered by us and our partners.

The post Qt World Summit 2016 Webinar Series – Christmas Sessions appeared first on Qt Blog.

Introducing new KCM for network configuration

Mon, 2016/11/28 - 12:45pm

After several attempts at writing a new KCM for network configuration, none of which I actually finished, I decided to start one more time. This time my goal was simply to transform the old editor into a somewhat nicer KCM and place it in System Settings, where it had been missing for a very long time. You can see my current result below.

This is still the same editor that previously existed as a standalone application, except the list of connections is now written in QML and looks similar to the applet we have in the systray. I also had to rewrite the editor widget a bit, because it was implemented as a dialog with a tab widget inside, where each tab is represented by one setting widget (e.g. Ipv4SettingWidget). For the new KCM we now have a ConnectionEditorBase widget that does all the logic behind the scenes, such as creating the specific setting widgets based on the connection type. This widget alone doesn’t display anything; you have to subclass it and reimplement the method taking care of layout. This allows me to have e.g. a ConnectionEditorTabWidget, which subclasses ConnectionEditorBase and reimplements the addWidget() method to place the setting widgets into a QTabWidget. In the future we can also simply write a new UI/layout on top of the ConnectionEditorBase widget and get rid of the tab layout.

Regarding functionality, it should already be almost on par with the standalone editor. There are still some missing features (like import/export of VPN connections), but besides that I think everything else is going well. The new KCM also brings some minor improvements; for example, you can now discard unsaved changes you made to a connection. My plan is to get this into Plasma 5.9, which is supposed to be released in January, so I still have plenty of time to finish the missing features, address issues I introduced during this transition, and of course take your comments into account to make this KCM as usable as I can for everyone :).

Desktops DevRoom @ FOSDEM 2017: you are still on time to submit a talk

Mon, 2016/11/28 - 12:24am

FOSDEM 2017 is going to be great (again!) and you still have the chance to be one of the stars.

Have you submitted your talk to the Desktops DevRoom yet?


Remember: we will only accept proposals until December 5th. After that, the Organization Team will get busy voting on and choosing the talks.

Here is the full Call for Participation, in case you need to check the details on how to submit:

FOSDEM Desktops DevRoom 2017 Call for Participation

Topics include anything related to the Desktop: desktop environments, software development for desktop/cross-platform, applications, UI, etc.

KDE Developer Guide needs a new home and some fresh content

Sun, 2016/11/27 - 11:54pm

As I just posted in the Mission Forum, our KDE Developer Guide needs a new home. Currently it is "not found" where it is supposed to be.

We had great luck using markdown files in git for the chapters of the Frameworks Cookbook, so the Developer Guide should be stored and developed in a similar manner. I've been reading about Sphinx lately as a way to write documentation, which is another possibility. Kubuntu uses Sphinx for docs.

In any case, I do not have the time or skills to get, restructure and re-place this handy guide for our GSoC students and other new KDE contributors.

This is perhaps suitable for a Google Code-in task, but I would need a mentor who knows markdown or Sphinx to oversee. Contact me if interested! #kde-books or #kde-soc

Plasma 5.8.4, Applications 16.08.3 and Frameworks 5.28.0 available in Chakra

Sun, 2016/11/27 - 9:43pm

This announcement is also available in Italian, Spanish and Taiwanese Mandarin.

As you have probably noticed, this move took a while to reach stable due to the issues with our main server, which resulted in a downtime of 2 days for our website and all the related services. There was nothing we could do, since our hosting provider experienced a major subsystem malfunction. The website might be a bit unstable or slow in the following days until the issue is properly fixed. We can only apologize for any inconvenience.

But the latest updates for KDE's Plasma, Applications and Frameworks series are now available to all Chakra users.

Plasma 5.8.4 includes three weeks’ worth of bugfixes and new translations, with changes mostly in the breeze theme, kwin and plasma-workspace packages.

Applications 16.08.3 includes more than 20 recorded bugfixes and improvements to kdepim, ark, okteta, umbrello and kmines, among others. kdelibs was also updated to version 4.14.26.

Frameworks 5.28.0 include a new syntax-highlighting package, in addition to the usual bugfixes and improvements, mostly found in kio, plasma-framework, kwidgetsaddons and ktexteditor.

Other notable package upgrades and changes:


  • kirigami 1.1.0, a QtQuick-based component set, has been added to the repos
  • openjdk 8.u112
  • cpupower 4.8.6
  • curl 7.51.0
  • dkms 2.3+git161025
  • eclipse-ecj 4.6.1
  • graphicsmagick 1.3.25
  • inetutils 1.9.4
  • libxi 1.7.8
  • ndiswrapper 1.61
  • net-tools 1.60.20160710git
  • pypy 5.6.0
  • rust 1.13.0
  • scons 2.5.1
  • sddm 0.14.0
  • tzdata 2016i

  • choqok 1.6.0
  • kdevelop 5.0.2
  • qtcreator 4.1.0

  • hugin 2016.2.0

  • wine 1.9.24
  • winetricks 20161107

It should be safe to answer yes to any replacement question by Pacman. If in doubt or if you face another issue in relation to this update, please ask or report it on the related forum section.

Most of our mirrors take 12-24h to synchronize, after which it should be safe to upgrade. To be sure, please use the mirror status page to check that your mirror has synchronized with our main server after this announcement.

Marble Maps 1.0 has been released

Sun, 2016/11/27 - 7:20pm

It’s finally done! I’m happy to tell you that Marble Maps version 1.0 has just landed in the Google Play Store. We hope you like it as much as we do 🙂

Many thanks to all contributors who made this possible. Thanks to a multitude of performance improvements all over the place, vector rendering has become very fast. And thanks to the ever-improving vector tile creation toolchain we are able to provide a lot more data than I anticipated some weeks ago. For the first version there are Germany and 200 cities world-wide in full detail, as well as most European countries and the USA in high detail (up to tile level 13 or 15). For the rest of the world we provide at least medium detail (up to tile level 9). The plan, of course, is to provide full vector data for the whole world in the near future.




Testing the untestable

Sun, 2016/11/27 - 8:27am
Treading on thin ice

Admit it: how many times have you seen “software from this branch is completely untested, use it at your own risk” when checking out the latest code of a FOSS project? I bet you have, many times. For any reasonably modern project, this is not entirely true: Continuous Integration and automated testing are a huge help in ensuring that the code builds and at least does what it is supposed to do. KDE is no exception, thanks to its continuous integration infrastructure and a growing number of unit tests.

Is it enough?

This however does not cover functional testing, i.e. checking whether the software actually does what it should. You wouldn’t want KMail to send kitten pictures as a reply to a meeting invitation from your boss, for example, and you might want to test that your office suite starts and is able to actually save documents without crashing. This is something you can’t test with traditional unit testing frameworks.

Why does this matter to KDE? Nowadays, the dream of “always summer in trunk” as proposed 8 years ago is getting closer, and there are several ways to run KDE software directly from git. However, beyond the above strategy, there is no additional testing done.

Or, should I rather say, there wasn’t.

Our savior, openQA

Those who use openSUSE Tumbleweed know that even though it is technically a “rolling release” distribution, it is extensively tested. That is made possible by openQA, which runs a full series of automated functional tests, from installation to actual use of the desktops shipped by the distribution. The recently released openSUSE Leap has also benefited from this testing during its development phase.

“But, Luca,” you would say, “we already know about all this stuff.”

Indeed, this is not news. The big news is that, thanks mainly to the efforts of Fabian Vogt and Oliver Kurz, openQA is now also testing KDE software from git! This works by feeding the Argon (Leap based) and Krypton (Tumbleweed based) live media, which are built roughly daily, to openQA, and running a series of specific tests.

You can see here an example for Argon and an example for Krypton. openQA tests both the distro-level stuff (the console test) and KDE-specific operations (the X11 test). In the latter case, it tests the ability to launch a terminal, runs a number of programs (Kate, Kontact, and a few others) and does some very basic tests with Plasma as well.

Is it enough to test the full experience of KDE software? No, but it is a solid foundation for more automated testing to spot functional regressions: during the openSUSE Leap 42.2 development cycle, openQA found several upstream issues in Plasma which were then communicated to the developers and promptly fixed.

Is this enough for everything?

Of course not. Automated testing only gets you so far, so this is not an excuse for being lazy and not filing those bug reports. Also, since the tests run in a VM, they won’t be able to catch some issues that only occur on real hardware (multiscreen, compositing). But it is surely a good start to ensure that at least obvious regressions are found before the code is shipped to distributions and then to end users.

What needs to be done? More tests, of course. In particular, Plasma regression tests (handling applets, etc.) are likely needed. But as they say, every journey starts with the first step.

New features in Ark 16.12

Sat, 2016/11/26 - 6:25pm
Ark, the file archiver and compressor developed by KDE, has seen a lot of development for the upcoming 16.12 release. This blog post provides a summary of the most important changes.
Advanced archive editing
Thanks to the excellent GSoC work done by Vladyslav Batyrenko (mvlabat) this summer, it’s now possible to perform advanced editing operations on an archive. This means that files and folders can be moved and copied within an archive. This functionality is available either from the context menu or with the well-known keyboard shortcuts (CTRL+C, CTRL+X, CTRL+V).

Additionally, files and folders can now be added to any subfolder of an archive; in the past, files could only be added to the root of an archive. This is done by selecting a subfolder and then activating the “Add Files…” item from either the “Archive” menu or the context menu.

Finally, files and folders can be renamed, by selecting the entry and pressing F2 or choosing Rename from the context or “File” menu.


See mvlabat’s blog post for more info on these features.

Choose compression method
Ark now allows setting the compression method for supported archives, currently Zip and 7z. For instance, LZMA compression may be selected for Zip archives to improve the compression ratio (this requires 7z to be installed). Note that Zip archives using newer compression methods may not be supported by older unarchivers (e.g. the unzip utility), but should be supported by modern software such as WinZip, WinRAR and 7-Zip for Windows. The compression method can be set in the Compression section when creating a new archive.
AES encryption for Zip archives

Strong AES encryption is now used by default for Zip archives when 7z is installed. Three AES key lengths are available (128, 192 and 256 bit). The classic Zip encryption method (ZipCrypto), which is now known to be vulnerable but is more widely supported, can also be selected. Again, note that e.g. unzip doesn’t support extracting AES-encrypted archives.

Support for AR archives
We added support for opening AR archives. This old Unix format is nowadays mostly used for static libraries (*.a) on Linux systems, so static libraries can now be opened in Ark to view the contained object files.


Performance improvements

Opening large archives should be much faster with Ark 16.12. Previously, the model containing all the archive entries wasn’t created until the archive was completely loaded from disk; now the model is populated right away as the archive is read, which greatly reduces the time needed to open large archives.


Progress information

Ark now shows percentage progress for more operations (e.g. open, extract, add) than before, so it’s possible to know approximately how long an operation will take. Additionally, progress is now always shown in Plasma’s system tray, where operations can also be aborted. When percentage progress is available it is also shown in the task manager item (thanks to KBroulik’s work).

Bugfixes and under-the-hood changes
A ton of bugs were fixed and the code architecture was further modernized.
Testing and feedback
The 16.12 beta is now out, the release candidate should be out on December 1st, and the final release on December 15th. Please test the new features and provide feedback, either as comments on this blog post or as bugs on KDE’s bugzilla.
What’s next?
For Ark 17.04 we hope to add a graphical interface for configuring the plugins Ark uses to handle different archive formats. We are also investigating whether we can use libzip to handle Zip archives.
If there are features you are missing in Ark, please let us know.
Thanks to Elvis Angelaccio and Vladyslav Batyrenko (mvlabat) for their development work on Ark.