Planet KDE

State of the KF5 Android CI

Tue, 2016/06/28 - 8:23pm


I would have liked to say, “Yeah, the Android CI runs!” – but we are not there yet. Pretty close, actually; close enough that it already makes sense to talk about it, yet a few last Jenkins settings remain to be configured, and real-life issues mean this will take a few more days. So, I will give a short primer on what we prepared in Randa.

After tonight’s run of the KApiDox generation, all KF5 libraries will correctly show whether they support Android or not. The current count of supported frameworks is 16, and since last year some important ones like KI18n, KCoreAddons and ThreadWeaver were added to the list. For all of them I can say that the build is tested in my Android cross-building Docker image together with Qt 5.6 and the Android CMake toolchain from Extra-CMake-Modules.

Obviously, the most pressing issue currently is to get a proper CI system for KF5 on Android into place. The very same Docker image, as named above, was extended to serve as the container for our future Docker-based CI system. During the Randa week the container was already integrated, and now just waits for the last infrastructure bits to be set correctly. The next step after build coverage will be the integration of automatic unit tests. For Linux systems, for example, this is a boring topic, since one can just run tools like CTest. On Android the story is slightly more complicated. Our goal is to run unit tests in a scenario as close to real life as possible. That means we want unit tests to be packaged as APK files, then installed and run in an Android emulator, and we also need to get the results back. This means we need some wrapper code around the generated CTest files that handles the packaging, installing, result downloads, etc. The main parts of these scripts were finished during Randa, and as a proof of concept the unit tests of the first tested framework, Attica, already pass on my devel machine \o/ Once we have the build coverage in place, getting the unit tests checked on the CI will be the next step.
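
To give an idea of what such a wrapper has to do, here is an illustrative sketch of the adb command sequence it would issue for one packaged test. This is not KDE's actual script; the function name, APK name, package name and result path are all hypothetical, and only the general install/run/pull/uninstall flow is the point:

```python
# Illustrative sketch: the adb invocations a CTest-on-Android wrapper
# would need to run for one packaged test APK.  All names below
# (function, APK, package, paths) are made up for this example.
def adb_test_commands(apk, package, results_dir):
    """Return the adb invocations to install, run and collect one test APK."""
    return [
        ["adb", "install", "-r", apk],                    # (re)install the test APK
        ["adb", "shell", "am", "start", "-W",             # launch the test activity
         "-n", package + "/.TestActivity"],
        ["adb", "pull", "/sdcard/" + package + ".log",    # fetch the results
         results_dir],
        ["adb", "uninstall", package],                    # clean up the emulator
    ]

cmds = adb_test_commands("attica_test.apk", "org.kde.attica.test", "./results")
for c in cmds:
    print(" ".join(c))
```

A real wrapper would additionally parse the pulled log back into a CTest-style pass/fail result.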

I hope that all of these existing bits will come together soon. However, with QtCon already approaching in two months, and me giving a talk about KF5 on Android there, it sounds like there is a clear deadline for this integration to conclude :)

PS: the fundraiser campaign for Randa is still ongoing for 12 more days!

The Road of Trials

Tue, 2016/06/28 - 4:20pm

Hello all!

In my last blog post I said that I would work on extending support for paint operations like 'fill'. I have done so, albeit more out of necessity while fixing the assistant code. Moreover, I have fixed a number of other paint operations which are vital for painting the various assistants Krita currently offers.

Tool Outline

Before I talk about the assistant fixes I would like to talk about my fix of the tool outline code. This code is responsible for drawing the brush outline which follows the cursor, and some of the selection previews while the user is dragging a selection over the canvas. Prior to my starting work on the porting of our decoration code to OpenGL 3.2, some work had already been done by another member of the Krita community (beelzy). She changed several deprecated drawing functions to make use of a vertex array object and multiple vertex buffer objects. The idea was that we bind a single VAO and, instead of uploading the data directly to the GPU, the data is now first uploaded to a vertex buffer object which can then be used for drawing the shape.

This approach works fine for drawing the canvas on the screen and for drawing the chequerboard texture you see if your layer is transparent; however, it broke down for drawing the tool outline. The reason for this is that the tool outline is.. well.. a line! So as opposed to drawing a quadrilateral polygon (rectangle) for the canvas, we now want to render a line. This isn't a problem on its own, but the data required to draw a 'quad' is significantly different from that required for a line. Besides, the data required to draw a selection as it is being dragged over the canvas changes significantly in a short period of time, whereas the data required to draw the canvas shape remains the same for longer periods of time.
Wouldn't it be nice if we could have one buffer for storing the data required to draw the canvas shape, and another buffer for storing the data required to draw the volatile lines?

To address this issue I split our original vertex array object (VAO) into two of these objects: one specifically meant for drawing quadrilaterals, and the other specifically for drawing lines. On top of that, I hinted that the quad data doesn't change very much by creating its buffer with GL_STATIC_DRAW, and that the line data does change a lot by creating its buffer with GL_STREAM_DRAW. The OpenGL documentation explains these constants better than I can, so I will just post them here.

GL_STATIC_DRAW: The data store contents will be modified once and used many times.
GL_STREAM_DRAW: The data store contents will be modified once and used at most a few times.

So that is just what we want! We define the data required to draw a rectangle once and render it many times. In contrast, we keep redefining the data required to draw the line at that particular moment and draw it at most a few times.

Now the astute reader might note that the canvas size and shape might change as well, during zooming or rotating. And that's true! Here we have a little dilemma to resolve. The traditional way of handling these transformations is by using a matrix. The matrix contains all the transformations necessary so that, when it is applied to every vertex of the polygon in the vertex shader, the whole polygon is transformed to its new location and shape. This doesn't change the vertex positions stored on the graphics card, but rather puts them in the correct position every frame. A more naive approach might be to just re-upload the already transformed vertices to the graphics card. The reason why I say naive is that this approach would obviously cause a huge slowdown when we are working with a lot of vertices (like a game model). Uploading tens of thousands of vertices, as opposed to a matrix consisting of 16 floating point numbers, every time a model changes position would be stupid. However, here we are only working with 4 vertices. In this case we would only have to re-upload 8 floating point numbers to the graphics card in order to update the shape. Moreover, maybe we don't even have to re-upload the data on every frame but only when the user changes the size or position of the canvas. Turns out this approach is not so naive and may be the best option.
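
As a back-of-the-envelope illustration of that trade-off (plain Python, no OpenGL involved), here is the same quad transformed on the CPU. The point is only the data sizes: re-uploading the transformed quad means sending 8 floats, versus a full 4x4 matrix's 16 floats for the shader route:

```python
# Sketch of the trade-off described above: transforming a 4-vertex quad
# on the CPU and re-uploading it costs 8 floats, versus sending a
# 16-float matrix and letting the vertex shader do the work per frame.
import math

def apply_transform(matrix, verts):
    """Apply a 3x3 homogeneous 2D transform to a list of (x, y) vertices."""
    out = []
    for x, y in verts:
        nx = matrix[0][0] * x + matrix[0][1] * y + matrix[0][2]
        ny = matrix[1][0] * x + matrix[1][1] * y + matrix[1][2]
        out.append((nx, ny))
    return out

quad = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
a = math.radians(90)                       # rotate the canvas quad by 90 degrees
rotate = [[math.cos(a), -math.sin(a), 0.0],
          [math.sin(a),  math.cos(a), 0.0],
          [0.0,          0.0,         1.0]]
transformed = apply_transform(rotate, quad)
floats_reuploaded = len(transformed) * 2   # 8 floats for the whole quad
print(transformed, floats_reuploaded)
```

With only 4 vertices the CPU route is cheap, which is exactly why it stops being "naive" here.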

The metric 12 fiasco

For the past few weeks my terminal has been spammed by a single warning over and over again: "QOpenGLWidget::metric(): unknown metric 12". So this week I decided to go out and investigate what was causing this, so that I could debug in peace. After tedious grepping and note-taking I traced metric 12 back to somewhere inside QWidget (it is QPaintDevice::PdmDevicePixelRatioScaled, added in Qt 5.6). So now that I knew where metric 12 was coming from and what it was supposed to do, imagine my surprise when I went into QOpenGLWidget and found that the handling of metric 12 had simply been commented out. D'oh! Turns out that during the beta phase of Qt 5.6.0 they commented out this handling, and our forked paint engine was based on this beta version. So I updated my Qt to the new version 5.6.1 and copied its handling of metric 12 back into our paint engine. Let's hope there aren't more surprise changes between our forked paint engine and Qt 5.6.1's paint engine.

Perspective assistant

In my previous post I noted that many assistants crashed upon the first click on the canvas. For some profane reason I decided to tackle the perspective assistant first, and boy did I regret that.

Here is a partial compendium of all the different things the perspective assistant consists of:

  • Corner nodes
  • Nodes between corner nodes
  • Handles for nodes between corner nodes
  • Grid between nodes
  • Shapes behind the icons of the assistant widget
  • Icons on the assistant widget
  • Lines while moving nodes
  • Lines drawn in hidden mode
  • Crosshair where perspective lines meet
So where to even begin? Well, the way I started was putting a return statement at the start of every perspective function I could find (and there are quite a few). Then, by removing these statements one by one, I uncovered the crashes one at a time. Each of these crashes related to a path ultimately leading to a function in the paint engine that was still using legacy code.

One such path led to a new branch in the code that is responsible for painting strokes (aptly named stroke()). This new branch involved code for painting non-opaque strokes. Qt does this by using a stencil buffer (for reasons which are unclear to me). This branch seems to be called for the drawing of the grid lines, which are (barely) transparent.

Another path was responsible for drawing the assistant widget which consists of a circle and a rounded rectangle combined to form the background and 3 icons pasted on top of it. The icons are drawn using QPainter::drawPixmap, which calls drawImage, which calls drawTexture, which calls.. wait what's another word for image?

And then the circle.. ultimately it gets drawn using a function called drawVertexArrays() in the paint engine. This function receives all the vertices of the circle. But then how do you fill that circle? Well, there are many ways; however, since I saw that it was being drawn using GL_TRIANGLE_FAN, which makes a fan of triangles, I assumed the following shape:

The first vertex in the array would be taken as the central hub, and every subsequent vertex would form a triangle with the hub and the previous vertex. But there's a problem: the data I received in the function didn't contain a central vertex. How does this work? Well, it turns out it works pretty cleverly by just taking one of the circle vertices as the central hub. It results in a circle that looks like this in wire-frame:
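
The trick is easy to sketch without any OpenGL at all. The helper below (illustrative Python, not Krita code) generates circle vertices and then lists the triangles a GL_TRIANGLE_FAN would produce when the first circle vertex doubles as the hub:

```python
# Sketch of the trick above: filling a circle as a GL_TRIANGLE_FAN
# with no dedicated centre vertex -- the first circle vertex is reused
# as the fan's hub.  Plain Python, purely to show the geometry.
import math

def circle_vertices(n, radius=1.0):
    """n points evenly spaced on a circle."""
    return [(radius * math.cos(2 * math.pi * i / n),
             radius * math.sin(2 * math.pi * i / n)) for i in range(n)]

def fan_triangles(verts):
    """Triangles a GL_TRIANGLE_FAN would emit: hub is verts[0]."""
    return [(verts[0], verts[i], verts[i + 1])
            for i in range(1, len(verts) - 1)]

verts = circle_vertices(8)
tris = fan_triangles(verts)
print(len(tris))   # n vertices yield n - 2 triangles
```

Every triangle shares verts[0], which is exactly the lopsided wire-frame the screenshot shows.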

I should note at this point that the assistant code is very old and quite messy, which resulted in a lot of confusion over which code was responsible for what. I will probably visit it again at some point to clean it up and hopefully make the assistants faster.

Some more bugs caused by my own stupidity kept me busy for the remainder of the time this week, but I won't bother you with the details (I'm ashamed).

Previously I made a graph to keep track of which decorations were broken and which were fixed. Well... it turns out I got less out of it than I expected. The perspective assistant was hoarding all the methods required to draw every assistant. And since I fixed the perspective assistant, by proxy I fixed every other assistant that we offer. In addition, I fixed the tool outline painting, which was also used by many selections, so this is the status now:

I expected some assistants to be fixed, but I certainly didn't expect this. In any case there are still a lot of code paths in the paint engine which aren't called by Krita but still need to be ported to OpenGL 3.2.

Finally, the way that I currently fix the paint engine code is by uploading the data to a VBO instead of directly passing it to the graphics card (which isn't allowed anymore). One can imagine, however, that adding this extra step in between doesn't make the code faster by itself. The way in which OpenGL 3.2 could be faster is by uploading our data once and then drawing it multiple times. So over the coming weeks I will investigate whether it is possible to cache certain drawing operations, so that drawing the same thing over and over again doesn't need any uploading of data to the graphics card. That is where a speed-up will come from, and that is what Qt would be happy with.

Oh, and here's a little picture of all the assistants in action:

KStars on Windows – Midterm evaluation

Tue, 2016/06/28 - 11:59am

Hello everyone!

Midterm evaluation has passed and now it’s time for a new blog post! It has been a couple of weeks since I last talked about my progress on my Google Summer of Code project.

In my last post I presented the alpha version of KStars for Windows, which was still missing some very important KStars functionality, such as:

  • FITS Viewer Tool: it is integrated with the INDI framework for seamless display and manipulation of captured FITS images. FITS stands for Flexible Image Transport System and is the standard file format used to store most astronomical data files.
  • INDI and Ekos

In order to have KStars’ FITS Viewer Tool available on Windows, I needed to build the following two packages:

  • CFITSIO: it is a library of C and Fortran subroutines for reading and writing data files in FITS data format.
  • WCS: the FITS “World Coordinate System” (WCS) standard defines keywords and usage that provide for the description of astronomical coordinate systems in a FITS image header.

Thus, I used my old friend, the emerge tool, again: first I had to write a CMakeLists.txt for compiling the sources and creating a static library, and then I created the tar archive required by emerge. For downloading, I used my personal website to host the tar archive.
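
For a library like this, the CMakeLists.txt in question can stay very small. The sketch below is only illustrative (the library name and source glob are assumptions, not the actual file used for CFITSIO):

```cmake
# Illustrative sketch only: build all C sources in this directory
# into a single static library that emerge can then package.
cmake_minimum_required(VERSION 2.8.12)
project(cfitsio C)

file(GLOB CFITSIO_SOURCES ${CMAKE_CURRENT_SOURCE_DIR}/*.c)

add_library(cfitsio STATIC ${CFITSIO_SOURCES})

install(TARGETS cfitsio ARCHIVE DESTINATION lib)
```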

After I successfully built the sources I ran ‘emerge --update kstars’ again, to be sure that KStars found the needed libraries. Actually, I ran it several times to keep my Windows version of KStars up to date with origin/master.

Regarding INDI and Ekos: after discussing it with my mentor, and due to the difficulty of the INDI port, I shifted my focus to QA & documentation, which had not been updated for a long time. Thus, I have already created a QA list for the current KStars version on the KDE wiki. I will modify and update it as necessary, removing and adding tests for the KStars tools:

In conclusion, the KStars on Windows project is going in the right direction, and now it’s time to make sure that KStars on Windows meets our users’ expectations. I want to thank my mentor, Jasem, once more; he gave valuable help and supported me every time I needed it.

Best regards,

Debian: Reproducible builds update

Mon, 2016/06/27 - 5:58pm

A quick update to note that I did complete extra-cmake-modules and was given the green light to push the changes upstream and into Debian, which I will do ASAP.
Due to circumstances out of my control, I am moving a few states over and will have to continue my efforts when I arrive at
my new place of residence in a few days. Thanks
for understanding.


Brexit Is Not Happening

Sun, 2016/06/26 - 6:50pm

David Cameron is an astounding genius. Of course, you'd expect nothing less from an alumnus of Eton and Oxford, but to see his genius in action on such a grand public scale makes it no less incredible.

This meme was doing the rounds on Facebook yesterday. But I beg to differ - this wasn't a rage-quit. This was a chess move so calculated and so devastating that the full extent of its consequences will take a long time to become apparent.

Biggest Rage Quit of 2016

With his resignation, David Cameron ensured that the person who succeeds him is going to commit political suicide. And someone will have to succeed him, because someone will have to become the next Prime Minister of the United Kingdom. And that person will have to die. Whether or not that person is Boris Johnson remains to be seen.

Here's the deal. The referendum was set up to be advisory, not binding. The British Parliament is under no legal obligation to follow through on the results of the referendum, but if the government does not follow up, the entire premise of democracy falls apart. Therefore, Britain has to leave the EU, and to do that, someone will have to inform the European Commission by sending them a notice under Article 50 of the Lisbon Treaty. That someone was going to be David Cameron, until he decided to resign and let his successor do it.

On the face of it, that would be an honour for his pro-Leave successor. But of course, things aren't so simple:

  • If the successor follows through and invokes Article 50, Scotland and Northern Ireland break away and join the EU. The United Kingdom is no longer united. A mountain of laws and regulations needs to be torn up and new ones written in their place. The English economy collapses. The public quickly loses patience, and the blood is now on the successor's hands. His career is over, as is the United Kingdom as we know it.

  • If the successor does not follow through and fails to invoke Article 50, the premise of democratic government falls apart. The governance of the UK is no longer democratic, and the entire establishment is a farce. The successor's career is over, as is probably the entire government's. It doesn't end there, however. The next person in the chair finds himself in the same conundrum, as does the next. Until the will of the British people changes and they decide to stay – and make it known in another referendum – this cycle continues.

It appears I'm not alone in this theory. Someone commented exactly along these lines on a Guardian article, which is what led me to think twice - hey, I thought this line of thought was an effect of too much House of Cards, but it appears I might not be completely crazy after all - and take this out of my mind and put it to paper. Well, the Internet.

The next few months are going to be very interesting.

My hot summer starts

Sun, 2016/06/26 - 5:14pm

Hi again

It has started being really hot in Italy (too hot, to be clear), and summer has also started annoying me; the exam I took last Tuesday went well, and next Tuesday I will take another exam, which I hate. This exam period, by the way, is full of contacts and heat.

As I said in the last post, we decided to write down every future project on meta; I wrote down 3 main projects there, one of which, I can proudly say, is already done. The other 2 are collaborations which still have to be confirmed, but today I can say that one of them will be set up in the coming months: in collaboration with an undergraduate engineering student, we will write a course on fluid dynamics.

I already wrote a fluid dynamics course, which you can find in the macro text of mechanics, but it was a simplified course, for a first-year student. So, the course we are going to write is a more in-depth course for the following years, which needs advanced calculus knowledge to be completely understood. This means that we will have two different versions of a course, which became possible after the sprint we had in early May, and which we have never tested before: we created templates and a new site structure to allow this, but we still do not have multiple versions on the site. So, this could be a really good test for us, to see the feedback from the students. Let’s see how it goes; we will start in late July or early August.

That’s all from my side. From the editing side, let me say that we are calm and at peace, studying hard for our exams, so we will get back to work in the coming weeks.

If you want to read more about the participation we had at Wikimania @Esino Lario follow our facebook page.

Thank you, Daniele

Wiki, what’s going on? (Part 6-WikiMania2016 Esino Lario)

Sun, 2016/06/26 - 3:34pm



Take a little town of 700 souls. Take more than one thousand wikipedians from Wikipedia and its sister projects, gathered for the most important conference on themes such as free education, open software and textbooks, and free knowledge. OK, now put these two elements together: what do you get? WikiMania 2016, held in Esino Lario: the annual conference celebrating Wikipedia and its sister free knowledge projects with conferences, discussions, meetups, training and a hackathon.


Themes such as free education and open textbooks, which are fundamental to our philosophy, were explored in depth: we had the chance to meet various members of the WikiEdu and WikiMedia communities and talked with them about our project. It’s amazing to see that there are so many people willing to create strong projects to spread knowledge all around the world! Workshops, talks and discussions with members of these communities made these days simply amazing!


We made great contacts with lots of contributors and found many people interested in what we are doing: stay tuned! Great news is coming very soon!






The article Wiki, what’s going on? (Part 6 – WikiMania2016 Esino Lario) first appeared on Blogs from WikiToLearn.

[GSoC] KDev-Embedded, workflow integration

Sun, 2016/06/26 - 12:45pm

After some work on the plugin development, the project now has a strong focus on better integration with the KDevelop workflow. Until now the Board Configuration window has had some simple features to help beginner users perform the upload process; it is invoked from the embedded submenu in the KDevelop toolbar.

(Screenshots of the arduino window: welcome message, error message, success message.)

The problem is that the …

Mid-term Results GSoC “LabPlot Theme Manager”

Sun, 2016/06/26 - 12:21am

Hi folks, now is the time to show the achievements which have kept me busy for a month during the coding period of my ‘Google Summer of Code’ project, LabPlot Theme Manager, and which taught me so many things about an open-source community (KDE). This has been a wonderful experience for me to brush up my coding skills and grow a little bit as a programmer.

As I mentioned in my last blog, LabPlot is an open-source tool to analyze and visualize scientific data, for which we needed to create a theme manager that can provide users with different themes for their plots, as per their taste. Before creating the theme manager, understanding the many existing functionalities in LabPlot was challenging for me; deciding which components the theme manager would affect was also an important decision to take.

So, I would like to start from the beginning of the coding period. At the start of the development phase, diving deep into the existing code was the first step. I set up the development environment for KDE so that the code could run on my local machine. Being a fresh programmer in an open-source project, everything from getting all the dependencies to building an existing project and contributing to it was quite new to me, but it was made an easy task with the help of my mentor Alexander.

As I have some experience working with the Qt framework, I started to understand snippets of the code and to make changes to it. I analyzed the current implementation of LabPlot's dock widgets and their features, and defined which modules will interact and which values must be preserved while implementing the theme manager.

For creating themes, I first had to decide on different color palettes/schemes (the most fundamental need!). For this I went through a lot of websites and some literature to learn how color schemes should be put together. I decided on defining five colors in each theme palette. After this step, I defined the properties of a theme, e.g. the filling colors and styles of the different components of a plot. I created one file per theme containing these properties. I have created 3 themes so far: Classic, Creme and GraySlate…😉

Now, the next step was to be able to read and apply these properties to a plot. For this functionality, I defined various overloaded functions, loadTheme(), in the component classes, which set the properties individually (component-wise) by reading them from the theme files. Since all these components are children of the Worksheet class, I simply used this (overloaded) function for all of Worksheet’s child classes to apply the theme.
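
The mechanism can be sketched in a few lines. This is illustrative Python, not LabPlot's actual C++ code, and the section and key names are made up; the point is just "one config-style file per theme, and each component reads only its own section":

```python
# Illustrative sketch of per-component theme loading.  The theme text,
# section names and keys below are hypothetical, not LabPlot's format.
import configparser

THEME_CLASSIC = """
[Worksheet]
background = #ffffff

[CartesianPlot]
fill_color = #e8e8e8
border_style = solid
"""

def load_theme(theme_text, component):
    """Return the properties a given component should apply from a theme."""
    cfg = configparser.ConfigParser()
    cfg.read_string(theme_text)
    return dict(cfg[component]) if component in cfg else {}

props = load_theme(THEME_CLASSIC, "CartesianPlot")
print(props["fill_color"])
```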

I also wrote an algorithm to generate a total of 35 shades and tints from each 5-color palette, as we need a range of different colors to be applied on different parts of the plot and its curves.
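
The general idea of such an expansion is easy to show: for each of the 5 base colours, generate 3 shades (darker), the base itself, and 3 tints (lighter), giving 7 variants each and 35 colours total. The sketch below is only an illustration of that idea; the exact factors and colour space LabPlot uses may differ:

```python
# Illustrative sketch: expand a 5-colour palette into 35 colours by
# shifting each base colour's lightness down (shades) and up (tints).
# The step size 0.5/(steps+1) is an assumption, not LabPlot's value.
import colorsys

def variants(rgb, steps=3):
    """Return `steps` shades, the base colour, and `steps` tints (7 total)."""
    h, l, s = colorsys.rgb_to_hls(*rgb)
    out = []
    for i in range(-steps, steps + 1):
        # shift lightness towards 0 (shade) or 1 (tint), keeping hue/saturation
        nl = min(1.0, max(0.0, l + i * (0.5 / (steps + 1))))
        out.append(colorsys.hls_to_rgb(h, nl, s))
    return out

palette = [(0.8, 0.1, 0.1), (0.1, 0.6, 0.2), (0.2, 0.3, 0.8),
           (0.9, 0.7, 0.1), (0.5, 0.5, 0.5)]
all_colours = [c for base in palette for c in variants(base)]
print(len(all_colours))   # 5 base colours x 7 variants = 35
```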

Additionally, I defined a class ThemeHandler which controls the primary functionality of the theme manager. I created an object of this class in CartesianPlotDock, which as a result added a button to its dock widget window for choosing a theme. Currently, when clicked, this button lets the user choose a theme by clicking on its name in a list of existing themes. The next step of this project will be to create a preview panel for these themes.

All in all, this concludes my mid-term results :) Don’t forget to give me your feedback!

GNU/Linux bundled application ramblings

Sat, 2016/06/25 - 7:41pm

It’s impressive how much the discussion around bundled applications for the GNU/Linux desktop has flared up in the last few months (and especially the last few weeks).

It’s especially interesting because:

  • The problem is not new.
  • The solutions that have attempted to tackle the problem in the past have been ignored (both by us developers and by distributions).

First, let me try to subjectively summarize the problem: historically, the resources we get in GNU/Linux come from the distributions. Anything: executables, libraries, icons, wallpapers, etc. There have been alternatives to all of those, but none has flourished into a globally adopted solution.

This guarantees that everyone using a distribution will have access to the resources the distribution can offer. The more powerful the distribution is, the more we get. There’s limitations nevertheless, so some restrictions have to get in place. The ensemble of limitations and technologies adopted will effectively define the user’s experience.

This works. It has worked for years and, given that the technology is in place, it could easily keep working. Like in most engineering solutions, there are drawbacks, and properly addressing them can bring some benefits. It seems like now is the moment to review this situation. Let’s enumerate some of the problems we have nowadays:

  • We have users using really old versions of our software with issues we’ve solved in versions they can’t use.
  • It’s really hard for us to get GNU/Linux users to test unstable versions of our software.
  • We have users who want to use fresh versions of some software but not in the whole system.

There’s been many solutions to fix those, some easily come to mind: ArchLinux’s AUR (with yaourt), Ubuntu’s PPAs, big-tar application packages, OpenSuse’s OBS, and possibly others.

Far from showing the maturity of the Linux desktop, what this depicts is the deep fragmentation we’re in: we have come up with different solutions that break the established distribution paradigm by lowering the restrictions and considering the resources offered as unsupported (often tainting the whole system).

What has appeared recently is sandboxing. It’s especially interesting because, by letting users execute any binaries, we’re increasing the exposure of their systems – hence jumping from our distributions’ nest into the lion’s den. As always, sandboxing creates new challenges: it requires changes in applications (or frameworks) to adapt, often creating a user interaction fence (e.g. a popup asking whether you let Kamoso access the webcam). For what it’s worth, that’s not new: Android does it, OS X does it, Windows does it (from the Store), Chrome OS does it, etc.

Now where are we?

We need to decide about GNU/Linux’s future. Or at least, we need to understand what Plasma users will have available. So far, most of the noise comes from the big players in the business trying to differentiate their products, meaning incompatible versions.

Without an agreed unified solution, we’ll have to assume we’ll end up having installed snappies, flatpaks, AppImages as well as applications from the distribution. Then it’s just a matter of:

  • Presenting it properly, so that the user knows the risks taken by executing an application (!)
  • Making sure we don’t lose many features by sandboxing.

Still, one of the good things of this new approach is that it shouldn’t be necessary any more to have several people dedicated to building every single application and component. If the solution is to add 3 more systems that each need dedicated people, we’re not really moving forward.


As soon as we’ve decided how we want to work, then the interesting stuff needs to appear. If this is properly engineered, it can bring really interesting possibilities that now we hardly ever find:

  • Newer versions of applications on administered systems (e.g. universities).
  • Enabling stable distributions on professional environments.
  • Beta channels.
  • Binary application 3rd party extensions.
  • Provision of debug symbols (some distros don’t offer them).

To finish the fantastic post, a note for the dreamers:
How much easier would all that be in a microkernel architecture?

We need you!

Of course, this will be a long journey and we need your collaboration. This year in Randa we started working on all these problems from several different angles. It’s important for the KDE Community to have your support, so we can keep providing quality software. Consider donating; it doesn’t need to be a lot, everything counts.

DBUS dropped for digiKam under Linux

Sat, 2016/06/25 - 5:51pm


You might all be familiar with the popular message bus system i.e. DBus. It is an inter-process communication (IPC) and remote procedure call (RPC) mechanism that allows communication between multiple computer programs concurrently running on the same machine.

digiKam earlier used DBus under Linux, but supporting it under Windows and OS X made digiKam unstable. The database core implementation based on DBus was only used with the old KIOSlave code, which has now been removed.

In the current version, DBus is optional for Linux and completely removed for Windows and OS X. The database core now runs as a thread, not as a separate process.

After more than 1 year of development, the digiKam 5.0.0 release plan has been updated and finalized… Do take a look!

Bonne Journée!

Two in one

Sat, 2016/06/25 - 12:11pm

As you may know (unless you’ve been living in Alpha Centauri for the past century) the openSUSE community KDE team publishes LiveCD images for those willing to test the latest state of KDE software from the git master branches without having to break machines, causing a zombie apocalypse and so on. This post highlights the most recent developments in the area.

Up to now, we had 3 different media, depending on the base distribution (stable Leap, ever-rolling Tumbleweed) and whether you wanted to go with the safe road (X11 session) or the dangerous path (Wayland):

  • Argon (Leap based)
  • Krypton (Tumbleweed based, X11)
  • Krypton Wayland (Tumbleweed based, Wayland)

So far we’ve been trying to build new images in sync with the updates to the Unstable KDE software repositories. With the recent switch to being Qt 5.7 based, they broke. That’s when Fabian Vogt stepped up and fixed a number of outstanding issues with the images as well.

But that wasn’t enough. It was clear that perhaps a separate image for Wayland wasn’t required (after all, you can always start a session from SDDM). So perhaps it was time to merge the two…

Therefore, from today, the Krypton image will contain both the X11 session and the Wayland session. You can select which session to use from the SDDM screen. Bear in mind that if you use a virtual machine like QEMU, you may not be able to start Wayland from SDDM due to this bug.

Download links:

(The i686 image is a bit behind compared to the x86_64 - we’re working on that)

Should you want to use these live images, remember where to report distro bugs and where to report issues in the software. Have a lot of fun!

Post-Kickstarter News

Sat, 2016/06/25 - 8:24am

The campaign season is over, and we’re slowly recovering and getting back into a productive groove of coding, coding, and more coding. Kickstarter has transferred 34,594.37 to our bank account, and we’ve started planning the next releases. Time for an update!

Kickstarter Surveys

The survey replies have been streaming in! We’ve already contacted a dozen artists and commissioned artwork for the rewards. For Kickstarter rewards, we’re paying for the work, as promised! The backers who wanted character sketches have been put in touch with the artists who wanted to do those! Only the art book will feature work done for the exposure. Check out the call for submissions! (We’re working on getting an ICC proofing profile from the printer because… see below!)

With the majority of surveys returned, we can be pretty sure of the final stretch goal ranking. Python is No. 1 and SVG import/export is No. 2! And with those goals reached, the other stretch goals have to wait until next year (or be implemented by a fearless volunteer, like 16. Numerical Input Widget, which looks like it might get into 3.1 after all!).

Here’s the full vote breakdown:

  1. Goal 24. Python Scripting Plugin: 587 votes
  2. Goal 8. SVG Import/Export: 544
  3. Goal 1. Transform from pivot point: 258
  4. Goal 21. Flipbook/Sketchbook: 251
  5. Goal 2. Composition Guides: 239
  6. Goal 7. Vector Layers as Mask: 182
  7. Goal 15. Smoother Gradients: 175
  8. Goal 13. Arrange Layers: 167
  9. Goal 23. Audio Import: 164
  10. Goal 19. Improve Calligraphy Tool Drastically: 152
  11. Goal 6. Reference Images Docker: 134
  12. Goal 5. Export a tag as a bundle: 123
  13. Goal 17. Stroke Paths with Brushes: 113
  14. Goal 3. Global Texture for Texture Brush: 110
  15. Goal 4. Make bundles smarter to get a more usable interface: 101
  16. Goal 11. Convert Height Map to Normal Map: 100
  17. Goal 20. Stop-based gradient editor: 87
  18. Goal 22. Rotatable, Scalable Patterns: 71
  19. Goal 18. Objects Outliner: 62
  20. Goal 16. Numerical Input Widget: 45
  21. Goal 12. LUT Baking: 34
  22. Goal 9. Move Assistants to a Separate Layer Type: 32
  23. Goal 14. On-Canvas Layer Tooltips with Layer Selection Tools: 31
  24. Goal 10. Convert Vector Shape to Assistant: 28

With Python clearly in the lead, we have already started on implementing a Python scripting plugin, in the hope that we can use it to implement some of the remaining 2015 stretch goals, like the improved palette/color swatches docker. It already somewhat works: you can create Krita plugins in Python that add dockers or menu items, and there’s a Python plugin that makes it possible to run Python scripts from Krita with access to the Krita Python API, which is currently limited to counting the number of open windows, but still!


Google Summer of Code

It’s mid-term time already for our Google Summer of Code students! All of them passed, fortunately, because all of them are doing great work. Wolthera is nearly finished with implementing soft-proofing. She has even already written a manual page for the feature, and it’s currently being tested extensively. If all is fine, and it’s looking good, soft-proofing will land in Krita 3.0.1, which would make Krita once again one of the first projects participating in Google Summer of Code to release the results to the world! Soft-proofing includes configurable gamut alarms and a white-point adaptation slider that can be used to check the effect of paper on your image. Krita will save the proofing profile with your image, instead of relying on a global setup, which means you can create templates for particular printers, for instance. And you can have a proofed and an unproofed view of your image at the same time.


Julian Thijsen is hard at work on fixing the OpenGL QPainter engine. Check out his series of blog posts, detailing the paths his adventure takes him on. There’s a lot of work to do here, but progress is brisk! And in the end, his work will be submitted for inclusion in Qt, which means everyone will benefit. Right now, the goal is to fix Qt’s OpenGL backend for the QPainter class so it can render all of Krita’s tools — that’s things like cursors, or the assistants, or the line that the gradient tool draws.


Jouni Pentikainen is busy working on interpolation curves so we can automatically and smoothly animate changes in, for example, opacity. He’s also working on making it possible to animate transformation masks between frames, and that’s something that’ll give animators enormous freedom! Check out his blog for more information. This is a mockup, created during the discussions about the flow of interaction and user experience design:



We’ve made a new release schedule: 3.0.1 will be released July 15th. Apart from a lot of bug fixes (for instance for tablets with broken drivers, onion skinning, and drag & drop of layers on Windows), there will be a host of goodies. Michael Abrahams managed to restore full-screen mode on Windows, and Eugene Ingerman has improved the look and feel of the histogram dialog (and is working on a histogram docker, improved thumbnails for the channel and layer dockers, and improved rendering for the overview docker). If all goes well with what is a big refactoring, you’ll be able to render an animation to animated GIF and several other video formats directly from Krita (a 2015 stretch goal). The 2015 Fuzzy Strokes stretch goal is in! And we might get soft-proofing, too. For a zero-dot-one release, this is going to be pretty exciting!

We intend to release a new version in the 3.0 series every month until 3.1 is released with all 2015 stretch goals included, and probably the first version of the Python scripting plugin, if it turns out we can use it to implement some more stretch goals. That should happen somewhere between the end of August and the end of October. It’s too early to have a hard-and-fast date for it!


Every other year or so, Krita developers and artists get together in sunny Deventer, the Netherlands, to discuss the project’s direction and goals, to hack together, and basically to touch base with each other. These sprints are really productive. The last one was in 2014, so it’s time for another sprint! This will happen at the end of August. Travel is sponsored by our umbrella community, KDE, which is running a fundraiser right now to fund sprints like ours, so don’t hesitate to click here and chip in! The Krita Foundation is sponsoring accommodation; we try to fit as many people as possible into the Foundation HQ, but this time we’re going to need more beds!

Akademy! and fundraising

Fri, 2016/06/24 - 11:07pm

Akademy is approaching! And I can hardly wait. This spring has been personally difficult, and meeting friends and colleagues is the perfect way to end the summer. This year will be special because it's in Berlin, and because it is part of QtCon, with many of our freedom-loving friends, such as Qt, VideoLAN, the Free Software Foundation Europe and KDAB. As usual, Kubuntu will also be having our annual meetup there.

Events are expensive! KDE needs money to run Akademy itself, to subsidize travel and lodging for those who need it, and to support other events such as our Randa Meetings, which just ended successfully. We're still raising money to support the sprints:

Of course that money supports Akademy too, which is our largest annual meeting.

Ubuntu helps here too! The Ubuntu Community Fund sends many of the Kubuntu team, and often funds a shared meal as well. Please support the Ubuntu Community Fund too if you can!

I'm going!

I can't seem to make the image a link, so go to for more information.

Plasma 5.6 – Clean installation impression

Fri, 2016/06/24 - 9:56pm


I was wondering if I should just stay silent, since this is a negative post about Plasma. On the other hand, we should not be afraid of negative criticism; we should learn from it, improve, and make a better product. With that in mind, I decided to write this post anyway, in the hope that it will ultimately improve the situation where improvements would be nice.

Today I sat down to put some new life into my notebook: a fresh install of Arch Linux with Plasma 5 to see what my experience would be. I hadn’t reinstalled it on any system in quite a while, so I had no real clue how good or bad the experience would be. In fact, the last time I did this was when there were more posts about Plasma’s not-so-good clean-install user experience, a couple of years ago. Around that time the paper cut project (aimed at improving small but notable annoyances for a better user experience) was launched, and lots of issues were fixed as part of it.

One disclaimer about my setup. I’m using Arch Linux, which means I have to set up much of it myself, so my user experience is undoubtedly quite different from installing Kubuntu or a Plasma Fedora spin. In fact, I’m betting you won’t even see some of the issues I’m going to describe when you use those distributions. Also, I’m trying to be constructive here: pointing at issues, explaining how I worked around them (if I did), and what my suggestion would be to fix them. Don’t read this as if I’m a Plasma hater; I’m not.

I literally always have issues with KWallet when it’s running. No exceptions. If it’s enabled and not automatically opened using PAM (meaning the wallet is unlocked after you log in, so you basically won’t notice it anymore), then KWallet comes across as a very annoying application that keeps spamming you. It’s so annoyingly persistent in asking for your password again and again that you’re tempted to either remove it or give in and use it. I, on the other hand, have never used KWallet, and I certainly wasn’t about to start now. I was trying to connect to a password-protected wifi network and hit cancel on every KWallet request. It simply doesn’t let you log in then. You *must* enter a password to unlock your wifi, but it gives you no clue which password it wants. Since I had set none, my options were leaving it blank or closing the request; neither worked. It simply refuses to connect. Next I tried disabling KWallet. Yes, that is (still) possible in the System Settings, thankfully. But since KWallet had already popped up once, it had apparently done *something* (I honestly don’t know what), because disabling it didn’t help one bit in getting my wifi access to work. KWallet tries to outsmart me. Spoiler: ultimately I win!

I really wanted my wifi connection, but I certainly wasn’t about to give in to KWallet’s relentless persistence. No way! I would try patching the NetworkManager applet before giving in. Thankfully I didn’t have to. I logged out of my user, removed all config folders to start with a clean slate (I didn’t know which file to delete…), then started Plasma and tried getting on my wifi again. By this time I was really getting frustrated by KWallet. What I did this time, instead of closing KWallet immediately, was pick the Blowfish option, click next, and then close it as soon as possible. After that I disabled KWallet in the System Settings. I’m happy to say that it finally understood my persistence: it hasn’t complained since, and I can finally use my wifi connection without KWallet spamming me.

I have the following suggestions.

  1. Stop forcing the user into using something if the user clearly indicates they don’t want it. I would implicitly disable KWallet if the user cancels its first window, where you have to choose an encryption method.
  2. Alternatively, just add a third option to the first KWallet dialog: “I don’t want to use KWallet”. Upon clicking that, disable KWallet and let the user do their thing.
  3. If you choose an encryption method (not Blowfish, but the other option), then you must tell the user how to set up a key! Right now it only tells you that no key has been set and you need to install one, but it gives you zero hints on how to do that. That is very user-unfriendly and counter-intuitive, and it will greatly frustrate your users. Another point I’d like to make here is that this has been a publicly known issue for years! Why has nobody looked at it yet?

Note: I will remove any comments that start whining about “but then your passwords are in plaintext”. I know, and I accept it. I’m the only user on this computer and don’t care.
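For the record, KWallet can also be disabled without touching the GUI, via its config file; a minimal sketch, assuming Plasma 5’s default file location (the exact path may vary per distribution):

```ini
# ~/.config/kwalletrc
[Wallet]
Enabled=false
```

This is the same switch the System Settings checkbox flips.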

NetworkManager (applet)
Overall this works like a charm. It looks OK and is mostly intuitive. However, while I had the KWallet issue I found an issue here as well. If you have typed in your wifi credentials and KWallet blocks you from actually getting a connection, then your wifi network disappears from the list! I scrolled through it a dozen times thinking: “where is it… where…”. After a couple of minutes I opened “Configure network connections” in the network manager, which lists the connections that you have configured. There it was, found it! Deleting it from there made it forget the configuration, and the network appeared in the applet popup again. This is probably just a minor bug, a corner case, but quite an annoying one when you encounter it.

Another issue I found with the applet, once I had a wifi connection, is that it keeps trying to make a wired connection as well, even with a working wifi connection and no wired connection plugged in. Mind you, just for clarity: no cable was plugged in or out; the port was left alone with nothing in it! The surprising thing is that NetworkManager keeps trying to make a wired connection for about 10 minutes or so before finally giving up. I have no clue why it does this, and during that time you keep being spammed with popups notifying you that wifi is active and wired isn’t… My solution was to just disable wired for the moment. Not a real solution, but it fixed the notification spam.

Inverse mouse scrolling
If I use a notebook, it’s often a MacBook Air. However, this new install was done on a Samsung notebook that is a couple of years old. I immediately noticed the scroll direction was “reversed” compared to the Mac behavior I had started to get used to, so I wanted the same setting as on the Mac. That’s fine and possible in Plasma: just tick “reverse scroll direction” in your mouse settings (in the System Settings under “Input Devices”). That did the trick, but not completely: it only seems to be applied to Qt applications! Yay, we have a good old setting per UI toolkit again… But that’s wrong: we have libinput now, and we can enable “natural scrolling” there. Why isn’t the settings page doing that under the hood if libinput is being used? Anyway, the fix here (which comes from this link) was to do:

xinput set-prop 11 282 1

You probably have to set something else specific to your hardware, so be sure to look at the link above. That fixed inverse scrolling for me; it now works across GTK and Qt apps. Great, another problem solved.

Mount points not appearing in Dolphin
This one is weird. I know it works on my desktop: when I mount anything in the /mnt folder, it shows up in Dolphin. I added an entry in my fstab file for an FTP location. It is automounted by systemd, and the folder exists (as an empty one) in /mnt. That should make it appear in the Devices section of Dolphin, but somehow it doesn’t. I have no clue what might be wrong: my notebook, my settings, or perhaps the Plasma side. I also tried putting it in /media, but no joy either. I can access the mount just fine when I go to the folder, so I doubt it’s a permission issue. This issue needs further investigation.

PulseAudio and Plasma’s device manager
This one I just don’t understand: why does it still exist in Plasma these days? Plasma has a device manager for sound. Fine, that was useful in the plain ALSA days when there was no alternative, but we live in different times now: PulseAudio does a much better job at it. An example is a USB headset.
** Assumption: in both cases the PulseAudio module “module-switch-on-connect” is loaded, which it isn’t by default **
Under Unity, this is what happens when you plug one in while playing your favorite song with PulseAudio managing your audio: your audio is redirected to the headset you just plugged in. That is, in my opinion, how it should work, and it is intuitive.
Under Plasma this doesn’t happen. Under Plasma a “special” rule kicks in when PulseAudio starts (literally an if statement in the PulseAudio startup script). It loads the Plasma-specific module “module-device-manager” (which has existed for years, perhaps even a decade, now); that one basically disables/breaks “module-switch-on-connect” if it was loaded, and lets Plasma handle new connections. In my opinion this is a remnant of past Plasma code and should either be removed or reworked to work with the current “module-switch-on-connect”. I requested this Plasma-specific module to be kicked out of the PulseAudio startup script and also asked the current Plasma maintainer to share his opinion on why that Plasma-specific module is still required. Well, he is apparently of the opinion that the module should stay. I really wonder which would be better: well-working hot switching of audio output devices (PulseAudio’s “module-switch-on-connect”), or the, what I think is rarely used, fine-grained control of Plasma’s “module-device-manager”, which doesn’t do automatic switching. I think the latter is used by very few people, while the former is what users probably expect when they plug in a USB headset; it’s what they get on Unity. The user-friendly way would be to go for “module-switch-on-connect”.
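For comparison, the Unity-style behavior comes down to a single line of PulseAudio configuration; a minimal sketch, assuming a stock PulseAudio setup (system-wide in /etc/pulse/default.pa, or a per-user copy in ~/.config/pulse/default.pa):

```
# Switch to a newly connected sink/source (e.g. a USB headset) automatically
load-module module-switch-on-connect
```

Note that on Plasma the startup script may still load module-device-manager, which, as described above, gets in the way of this module.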

Konsole (font)
This is obviously a “user preference”, but I think the default is wrong here. By default the Konsole font is set to (Oxygen) Mono. It’s a monospace font, that’s fine, but it just doesn’t look good or sharp. You either have to increase the font size by 1 or 2 pixels to make it look better, or pick a different font. In the Konsole case I’m a big fan of the “Fixed [Misc]” font. There is no hinting in that font; it’s a fixed (bitmap?) font with very sharp characters. It reads very pleasantly and should be the default. Other fonts also look much better than Oxygen Mono, so if you don’t like fixed fonts then perhaps Noto Mono works better; it certainly looks better than Oxygen Mono. In all fairness, I guess this is a bug on the Konsole side. Noto is (much to my disliking) the font that Plasma forces upon you when installing, and many defaults were changed to Noto; I guess they forgot Konsole. You can freely remove Noto after installing Plasma, but updates will pull it in again. The font is OK, but you should not pull in the “Arial, Courier New, and Times New Roman” versions of the Noto fonts: they are meant as “drop-in replacements” for those styles, but instead they severely break font rendering. The regular Noto fonts are fine though.


These are the most prominent and user-noticeable issues I have seen when running a fresh Plasma. In conclusion, Plasma is in much better shape than it was a couple of years ago. The developers have done an outstanding job at making it good by default and powerful when needed. The issues, compared to a couple of years ago, are minor. Keep up the great work :)


Remote searching [KRunner/Blade]

Fri, 2016/06/24 - 7:26pm

Just a screenshot this time.

The setup goes as follows:

  • GUI (in the screenshot) runs on my main computer
  • The runner that searches for applications, along with a few less important ones, is in a separate process on the same system
  • Baloo runner is on another computer (since I have Baloo running only on that system)
  • And the Recoll runner runs on yet another separate system

The result is in the screenshot:

Remote search

I have a lot of issues to tackle, and to make it all usable by normal people, but I had to share this milestone since it is quite cool. :)

p.s. Mind that the Recoll runner is even able to return the section of the file where the search term appears

p.p.s. Don’t mind the ugly UI, it is just for testing purposes.


ownCloud Client 2.2.x

Fri, 2016/06/24 - 1:06pm

A couple of weeks ago we released another significant milestone of the ownCloud Client, version 2.2.0, followed by two small maintenance releases (download). I’d like to highlight some of the new features and the changes that we have made to improve the user experience:

Overlay Icons

Overlay icons for the various file managers on our three platforms have existed for quite some time, but it turned out that performance was not up to the mark for big sync folders. The reason was mainly that too much communication was happening between the file manager plugin and the client. When asked about the sync state of a single file, the client had to jump through quite some hoops in order to retrieve the required information. That involved not only database access to the sqlite-based sync journal, but also file system interaction to gather file information. Not a big deal if it’s only a few files, but if the user syncs huge amounts, these efforts do sum up.

This becomes especially tricky for the propagation of changes up the file tree. Imagine a sync error happening in foo/bar/baz/myfile. What should happen is that a warning icon appears on the icon for foo in the file manager, signalling that a problem exists within this directory. The complexity of the existing implementation was already high, and adding this extra functionality would have reduced the reliability of the code even further.

Jocelyn was keen enough to refactor the underlying code, which we call the SocketApi. By starting from the basic assumption that all files are in sync, so that the code only has to care about files that are new, changed, erroneous, ignored or similar, the amount of data to keep is very much reduced, which makes processing way faster.

Server Notifications

On the ownCloud server, there are situations where notifications are created to make the user aware of things that happened.

An example are federated shares:

If somebody shares a folder with you, you previously had to acknowledge it through the web interface. This explicit step is a safety net to prevent people from sharing tons of gigabytes of content and filling up your disk.


With 2.2.x, you can acknowledge the share right from the client, saving you the round trip to the web interface to check for new shares.

Keeping an Eye on Word & Friends

Microsoft Word and other office tools are rather hard to deal with when syncing, because they lock the files being worked on very strictly. So strictly that the sync app is not even allowed to open the file, not even for reading, which would be required to be able to sync it.

As a result, the sync client needs to wait until Word unlocks the file, and then continue syncing.

For previous versions of the client, this was hard to detect and worked only if other changes happened in the same directory as the file in question.

With 2.2.0 we added a special watcher that keeps an eye on the office documents that Word and friends are blocking. Once the files are unlocked, the watcher starts a sync run to get the files up to the server, or down from it.

Advances on Desktop Sharing

Sharing has been further integrated and received several UX and bug fixes. There is more feedback when performing actions, so you know when your client is waiting for a response from the server. The client now also respects more data returned from the server if you have apps enabled on the server that, for example, limit the expiration date.

Furthermore, we now better respect the share permissions granted. This means that if somebody shared a folder with you without create permissions, and you want to re-share this folder from the client, you won’t get the option to share with delete permissions. This avoids errors when sharing and is more in line with how the whole ownCloud platform handles re-sharing. We also adjusted the behavior for federated re-shares with the server.

Please note that to take full advantage of all improvements, you will need to run at least server version 9.0.

Have fun!

Gsoc 2016 Neverland #4 #5

Fri, 2016/06/24 - 6:50am

These past two weeks were quite stressful. My plan for the mid-term is that Neverland can build a WordPress theme.

As I mentioned in Gsoc 2016 Neverland #3, I had to choose between Sage and Underscore. Sage is DRY, but that also means designers must separate their theme into many smaller parts, which means more work for us to do. So I chose Underscore: it’s not DRY, just a skeleton theme, so I could adapt quickly. And one more reason I chose it is that it is from .

There is a pitfall: Underscore (_s) requires you to rename all _s tags to your theme name. I searched for an Underscore theme generator, but there weren’t any promising packages.

Now comes the hardest part. I thought a lot about how a WP theme is built from static HTML. Should I wrap the HTML with WP code, or inject the HTML into WP code? You might think there is no difference, but it matters a lot when you actually code it.
After trying and failing, I came up with a solution: I used mustache tags for both the _s and HTML files. In Sublime, I replaced all the _s tags with the mustache tag {{themeName}}. After that I deleted most of the code in _s and replaced it with tags like {{{ body }}}, {{{content}}}… In the blueprint files, I defined tags like {{{article__author}}}, {{{article__author--link}}}… (I follow the BEM methodology). So the rendering steps are basically:

  1. Neverland replaces {{themeName}} with the theme name and copies all files from the workers folder to the buildings folder.
  2. Neverland renders {{{ body }}}, {{{content}}}… with the corresponding files body.mustache, content.mustache…
  3. Finally, Neverland renders {{{article__author}}}, {{{article__author--link}}}… with WP tags like <?= get_the_author() ?>, <?= get_the_author_link() ?>…

With this approach the designers still have to care about tags like {{{article__author}}}… In the future, maybe I will use a DOM parser to identify the content automatically so the designers don’t have to know the tags.
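The three rendering passes above can be sketched as successive tag substitutions; a minimal Python sketch (the function and variable names are hypothetical, and the real Neverland presumably uses a proper Mustache renderer rather than naive string replacement):

```python
# Hypothetical sketch of Neverland's three rendering passes.

def render_theme(template, theme_name, partials, wp_tags):
    # Pass 1: replace every {{themeName}} tag with the chosen theme name.
    out = template.replace("{{themeName}}", theme_name)
    # Pass 2: expand structural tags like {{{ body }}} from their
    # corresponding .mustache partial files.
    for tag, content in partials.items():
        out = out.replace(tag, content)
    # Pass 3: replace the BEM-style content tags with WordPress tags.
    for tag, php in wp_tags.items():
        out = out.replace(tag, php)
    return out

page = render_theme(
    "<!-- {{themeName}} -->{{{ body }}}",
    "mytheme",
    {"{{{ body }}}": "<p>{{{article__author}}}</p>"},
    {"{{{article__author}}}": "<?= get_the_author() ?>"},
)
print(page)  # <!-- mytheme --><p><?= get_the_author() ?></p>
```

Note that the pass order matters: partials expanded in pass 2 may themselves contain content tags that only get resolved in pass 3.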

And here is the results:

Screen Shot 2016-06-24 at 13.31.24

HTML theme

Screen Shot 2016-06-24 at 13.41.14

WP theme – Homepage

Screen Shot 2016-06-24 at 13.41.55

WP theme – Single post page

As you can see, it is working. But there is still a lot of work to do.

Thanks for reading.

CentOS 6.8 image with Qt5.7, Python 3.5, LLVM 3.8

Thu, 2016/06/23 - 10:05pm

While trying to bring my setup for packaging KDevelop standalone for Linux into a shape where there is a nonzero probability of me picking it up again in half a year and actually understanding how to use it, I created a docker base image which I think might be useful to other people trying to package Linux software as well. It is based on CentOS 6.8 and includes Qt 5.7 (including QtWebKit), Python 3.5 and LLVM 3.8, all built against the old CentOS libs (and thus e.g. compatible with most glibc versions out there). If you want to use it, simply install docker and run:

systemctl start docker
docker pull scummos/centos6.8-qt5.7
docker run -i -t scummos/centos6.8-qt5.7

Good luck with it!

If docker fails to pull the image, you might have to lower the MTU of your network device to 900 (sic).

Marble Maps in KDE Randa Meetings 2016

Thu, 2016/06/23 - 9:42pm

One more year of fun and intense productivity in Randa came to an end just a few days ago, and I feel so good to have been a part of it. Much progress was made this year by the Marble team: Dennis, Torsten, David and me. I mostly worked on the Marble Maps Android app’s navigation feature, and would like to briefly mention the changes here:

  • First of all, CMake was picking up the wrong target for the aprs plugin, so I made CMake skip the aprs plugin on Android, so that it does not lead to an unnecessary crash during the build.
  • There was a bug in the Route Simulation position provider plugin: it was not working in Navigation mode. I made a fix for that.
  • Replaced the older QImage-based current-position pointer with a QML-based one in the Android app. Now we can apply our own custom animations to it.
  • The current-position pointer now stays on points on the route itself when the user is very close to the route (not deviating too much from it). With the new QML-based pointer, we made sure that the radius of accuracy stays the same with respect to the Earth regardless of the height from which the map is viewed. Plus, we got rid of the accuracy indicator when the position pointer is already on the route, showing it only otherwise.
  • There was a minor bug with this in the desktop version as well. Imagine that while you’re simulating navigation along a route between New York and Boston, you zoom out until the whole Earth is visible, then rotate the Earth so that the other half of the globe, for example India, is visible instead of America. In that case the position marker was showing up in the top-left corner of the map (somewhere in the sky), when it should not have been visible at all, since it is theoretically on the other, not-visible side of the globe. This bug was fixed as well.
  • Used some new maneuver direction pixmaps for navigation in the Android app, getting rid of the old ones, and also made them look sharper in the navigation info bar shown during turn-by-turn navigation, by making some tweaks in the QML files that use them.
  • Finally, the border height of the distance text shown at the top in Navigation mode has been changed to match the height of the maneuver pixmaps shown to its left, so that the view looks much more uniform. Plus, a similar panel has been added at the bottom, showing the speed and distance information during navigation, one at each side of the panel.

That’s all about work. Adding to the yummy food, which turns out to be truly delicious every single year, this time we had some more fun activities as well. We were fortunate enough to board a cable car and ride as high as possible into the mountains, so high that we found ourselves in snowfall (my first time being in a snowfall :) ), which was amazing! The special red tea we were greeted with at a restaurant stop during the hike was pretty unique. We then visited a museum, and that was a lot of fun as well.

Such a lively and eventful week it was; thanks a lot to Mario and Simon for organizing the Randa Meetings one more year. Let’s keep the Marble rolling, and I hope to be a part of this again in the years to come! :)