Do they exist? Especially as free software? I don’t actually know, but I’ve never seen a free software project use something like what I’ve got in mind.
That would be: a website where we could add any number of test scenarios.
People who wanted to help would get an account and make a profile listing their hardware and OS. Then, a couple of weeks before we make a release, we'd publish a beta, and the beta testers would log in and randomly get one of the test scenarios to test and report on. We'd match the tests to OS and hardware, and for some tests probably try to get the test executed by multiple testers.
Frequent participation would lead to badges or something playful like that; testers would be able to browse tests, add comments and interact — and we, as developers, would get feedback: so many tests executed, so many reported failures or regressions, and we'd be able to improve before the release.
It would be a crowd-sourced alternative to something like Squish (which I’ve never found to be very useful, not for Krita, not at the companies where it was used), it would make beta testers not just feel useful, it would make beta testing useful. Of course, maintaining the test scripts would also take time.
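The matching part of this idea is simple enough to sketch. Here is a minimal, hypothetical version in Python (all field names are my own invention, not any existing system's): pick, for a given tester, a random scenario that fits their OS and hardware, preferring scenarios that have the fewest testers so far, so that each test ends up being executed by multiple people.

```python
import random

def assign_scenario(tester, scenarios, rng=random):
    """Pick a random test scenario matching the tester's OS and hardware.

    Prefers scenarios with the fewest testers assigned so far, so that
    every scenario eventually gets covered by multiple testers.
    """
    matching = [s for s in scenarios
                if tester["os"] in s["os"]
                and (not s["hardware"] or tester["hardware"] in s["hardware"])]
    if not matching:
        return None
    # Prefer the least-covered scenarios.
    fewest = min(len(s["assigned"]) for s in matching)
    candidates = [s for s in matching if len(s["assigned"]) == fewest]
    scenario = rng.choice(candidates)
    scenario["assigned"].append(tester["name"])
    return scenario

# Example data: one Wacom-specific scenario, one generic one.
scenarios = [
    {"name": "tablet-pressure", "os": {"linux"},
     "hardware": {"wacom"}, "assigned": []},
    {"name": "file-dialog", "os": {"linux", "windows"},
     "hardware": set(), "assigned": []},
]
```

A real implementation would live server-side with persistent accounts, but the core assignment logic would not be much bigger than this.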
It sounds simple to me, and kind of logical and useful, but given that nobody is doing this — does such a thing exist?
These are my notes from the onboarding sprint. I had to miss the evenings, because I’m not very fit at the moment, so this is just my impression from the days I was able to be around, and from what I’ve been trying to do myself.
The KDE Onboarding Sprint happened in Nuremberg on 22 and 23 July. The goal of the sprint was to make it easier to get started working on existing projects in the KDE community: more specifically, this sprint was held to work on the technical side of the developer story. Of course, onboarding in the wider sense also means having excellent documentation (that is easy to find) and a place for newcomers to ask questions (that is easy to find).
Ideally, an interested newcomer would be able to start work without having to bother building (many) dependencies, without needing the terminal at first, would be able to start improving libraries like KDE Frameworks as a next step, and be able to create a working and installable release of their work, to use or to share.
Other platforms have this story down pat, with the proviso that these platforms promote greenfield development, not extending existing projects, as well as working within the existing platform framework, without additional dependencies:
* Apple: download Xcode, and you can get started.
* Windows: download Visual Studio, and you are all set.
* Qt: download the Qt installer with Qt Creator, and documentation, examples and project templates are all there, no matter which platform you develop for: macOS, Windows, Linux, Android or iOS.
GNOME Builder is also a one-download, get-started offering. But Builder adds additional features to the three above: it can download and build extra dependencies (with atrocious user-feedback, it has to be said), and it offers a list of existing GNOME projects to start hacking on. (Note: I do not know what happens when getting Builder on a system that lacks git, cmake or meson.)
KDE has nothing like this at the moment. Impressive as the kdesrc-build scripts are technically (thanks go to Michael Pyne for giving an in-depth presentation), they are not newcomer-friendly, with a complicated syntax, configuration files and dependency on working from the terminal. KDE also has much more diversity in its projects than GNOME:
* Unlike GNOME, KDE software is cross-platform — though note that not everyone present at the sprint was convinced of that; some even dismissed KDE applications ported to Windows as “not used in the real world”.
* Part of KDE is Frameworks, a set of additional Qt libraries that are used in many KDE projects.
* Some KDE projects, like Krita, build from a single git repository; some build from dozens of repositories, where adding one feature means working in half a dozen repositories at the same time; and Plasma replaces the entire desktop on Linux.
* Some KDE projects are also deployed to mobile systems (iOS, Android, Plasma Mobile).
Ideally, no matter the project the newcomer selects, the getting-started story should be the same!
When the sprint team started evaluating technologies that are currently used in the KDE community to build KDE software, the discussion got quite confused. Some of the technologies discussed were oriented towards power users, some towards making binary releases. It is necessary to first determine which components need to be delivered to make a seamless newcomer experience possible:
* Prerequisite tools: cmake, git, compiler
* A way to fetch the repository or repositories the newcomer wants to work on
* A way to fetch all the dependencies the project needs, where some of those dependencies might need to transition from dependency to project-being-worked-on
* A way to build, run and debug the project
* A way to generate a release from the project
* A way to submit the changes made to the original project
* An IDE that integrates all of this
The sprint spent most of its time on the dependencies problem, which is particularly difficult on Linux. An inventory of the ways KDE projects currently “solve” the problem of providing the dependencies for a given project:
* Using distribution-provided dependencies: this is unmaintainable, because there are too many distributions with too much variation in the names of their packages to keep full and up-to-date lists per project — and newcomers cannot find the deps from the names given in the cmake find modules.
* Building the dependencies as CMake external projects, per project: this is only feasible for projects with a manageable number of dependencies and enough manpower to maintain it.
* Building the dependencies as CMake external projects on the binary factory, and developing the project in a docker image identical to the one used on the binary factory plus these builds: same problem.
* Building the (KDE, though now also some non-KDE) dependencies using kdesrc-build, and getting the rest of the dependencies as distribution packages: this combines the first problem with fragility and a big learning curve.
* Using the KDE flatpak SDK to provide the KDE dependencies, and building non-KDE dependencies manually, or fetching them from other flatpak SDKs. (Query: is this the right terminology?) This suffers from inter- and intra-community politicking problems.
ETA: I completely forgot to mention Craft here. Craft is a Python-based system, similar to emerge, that has been around for ages. We used it initially for our Krita port to Windows; back then it was exclusively Windows-oriented. These days it also works on Linux and macOS. It can build all the KDE and non-KDE dependencies that KDE applications need. But then why did I forget to mention it in my write-up? Was it because there was nobody at the sprint from the Craft team? Or because nobody had tried it on Linux, and there was a huge Linux bias in any case? I don’t know… It was discussed during the meeting, though.
As an aside, much time was spent discussing docker, but when it was discussed, it was discussed as part of the dependency problem. However, it is properly a solution for running a build without affecting the rest of the developer’s system. (Apparently, there are people who install their builds into their system folders.) Confusingly, part of this discussion was also about setting environment variables to make it possible to run builds installed outside the system folders, or not installed at all. Note: the XDG environment variables that featured in this discussion are irrelevant for Windows and macOS.
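For the record, the kind of environment setup being discussed looks roughly like this. The prefix path is an assumption; the exact set of variables depends on the project, and, as noted, the XDG ones mean nothing on Windows or macOS.

```shell
# Assumed custom install prefix; adjust to wherever you install your builds.
PREFIX="$HOME/kde/usr"

export PATH="$PREFIX/bin:$PATH"
# Tiered resource lookup: our prefix first, then the system defaults.
export XDG_DATA_DIRS="$PREFIX/share:${XDG_DATA_DIRS:-/usr/share}"
export XDG_CONFIG_DIRS="$PREFIX/etc/xdg:${XDG_CONFIG_DIRS:-/etc/xdg}"
# Shared libraries and Qt plugins from the prefix.
export LD_LIBRARY_PATH="$PREFIX/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export QT_PLUGIN_PATH="$PREFIX/lib/plugins${QT_PLUGIN_PATH:+:$QT_PLUGIN_PATH}"
```

Expecting a newcomer to know, write and source something like this is exactly the problem.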
As a future solution, Ovidiu Bogdan presented Conan, a cross-platform binary package manager for C++ libraries. This could solve the dependency problem, and only the dependency problem, but at the expense of making the run problem much harder, because each library ends up in its own location. See https://conan.io/.
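To give an idea of what the consumer side of Conan looks like: a project declares its dependencies in a `conanfile.txt` roughly like the one below. The package names and versions here are made up for illustration; the actual references would depend on what gets packaged on https://bintray.com/kde.

```ini
[requires]
# Hypothetical package references (name/version@user/channel);
# the real names depend on what the KDE Conan packaging ends up using.
qt/5.11.2@kde/testing
kcoreaddons/5.50.0@kde/testing

[generators]
cmake
```

Running `conan install` on such a file fetches (or builds) the binaries and generates the information CMake needs to find them, which is the appealing part; the unappealing part is that somebody has to write and maintain the recipes for every dependency.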
The attendees decided to tackle the dependency problem first. There was broad agreement that this is a big problem, so it was discussed in depth. Note that the focus was, again, on Linux, relegating the cross-platform story to second place. Dmitry noted that when he tries to recruit students for Krita, only one in ten is familiar with Linux, pointing out that we’re really limiting ourselves with this attitude.
A KDE application, kruler, was selected as a prototype for building with dependencies provided either by flatpak or conan.
Dmitry and Ovidiu dug into Conan. From what I observed, laying the groundwork is a lot of work, and by the end of the evening Dmitry and Ovidiu had packaged about half of the Qt and KDE dependencies for kruler. Though the Qt developers are considering moving to Conan for Qt’s 3rd-party deps, Qt in particular turned out to be a problem: it needs to be modularized in Conan, instead of being a big, fat monolith. See https://bintray.com/kde.
Aleix Pol had already made a start on integrating flatpak and docker support into KDevelop, as well as on providing a flatpak runtime for KDE applications (https://community.kde.org/Flatpak).
This made it relatively easy to package kruler, okular and krita using flatpak. There are now maintained nightly stable and unstable flatpak builds for Krita.
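For reference, a flatpak-builder manifest for an application like kruler has roughly the following shape. The runtime version and repository URL below are my assumptions, not the actual manifest:

```json
{
    "id": "org.kde.kruler",
    "runtime": "org.kde.Platform",
    "runtime-version": "5.11",
    "sdk": "org.kde.Sdk",
    "command": "kruler",
    "modules": [
        {
            "name": "kruler",
            "buildsystem": "cmake-ninja",
            "sources": [
                {
                    "type": "git",
                    "url": "https://anongit.kde.org/kruler.git"
                }
            ]
        }
    ]
}
```

The KDE runtime and SDK provide the Qt and KDE Frameworks dependencies; anything else has to be added as extra modules in the manifest.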
The problems with flatpak, apart from the politicking: two differing opinions on what an appstream file should contain, checks that go beyond what the freedesktop.org standards demand, weird errors in general (you cannot have a default git branch tag that contains a slash…), an opaque build system, and an appetite for memory that goes beyond healthy: my laptop overheated and hung when trying to build a Krita flatpak locally.
Note also that the flatpak (and docker) integration in KDevelop is not done yet, and didn’t work properly when I tested it. I am also worried that KDevelop is too complex and intimidating to use as the IDE that binds everything together for new developers. I’d almost suggest we repurpose/fork Builder for KDE…
We’re not done with the onboarding sprint goal, not by a country mile. It’s as hard to get started hacking on a KDE project, or starting a new KDE project, as it has ever been. Flatpak might be closer to ready than Conan for solving the dependency problem, and it solves more problems than just the dependency problem, but it is Linux-only. Using Conan to solve the dependency problem will be very high-maintenance.
I do have a feeling we’ve been looking at this problem at a much too low level, but I don’t feel confident about what we should be doing instead. My questions are:
* Were we right on focusing first on the dependency problem and nothing but the dependency problem?
* Apart from flatpak and conan, what solutions exist to deliver prepared build environments to new developers?
* Is KDevelop the right IDE to give new developers?
* How can we make sure our documentation is up to date and findable?
* What communication channels do we want to make visible?
* How much effort can we afford to put into this?
So, two years ago I thought porting Krita to iOS or Android might make a dandy research project. A bit of context: if I spend 500 hours a year on an approved R&D project, I get a tax break. Plus, I like doing new stuff now and then. My 2018/2019 R&D project is Resource Management for the 21st Century, a previous one was Python Scripting.
In 2016, there wasn’t a decent Android tablet with a pen available anymore. The Wacom Cintiq Companion Hybrid was stuck on an ancient version of Android and wasn’t being made anymore, and Samsung’s Note tablet was an older model. The iPad Pro was new, so I decided to experiment with that. I got myself an iPad Pro, a Pencil and…
I tried to put a simple little example application on the iPad. I found something that demonstrated using the Pencil, and then discovered that Apple wouldn’t allow me to put code I had built myself on my iPad. I needed a developer account and keys and everything.
I told myself I would investigate that, but never had time to dig in.
Then in 2017, I gave the Cupertino Shylock the 99 ducats it wanted, and got the account. Mainly so we could sign our macOS builds and disk images — Apple making it deuced hard for people to run unsigned software. Now they’re going to make it even harder: they want applications distributed outside the macOS App Store to be notarized. But I digress…
So, now, at the end of 2018, in the week off I usually allow myself between Christmas and New Year’s Eve, I finally sat down to experiment a bit.
First, I loaded the test application I had selected in Xcode. I plugged my iPad into my MacBook Pro — for the first time since I had bought the hardware! Stuff happened, I had to click through various dialogs, and then the device popped up in Xcode.
It was quite difficult to find where to put my Apple ID as the “Team” — it didn’t work to simply tell Xcode what to sign the application with; it needed something it chose to call a “Team”.
But then everything worked! Yay!
Okay, next step. Get a Qt application running on the iPad. I downloaded Qt again — usually I build it myself with a bunch of patches, but I didn’t want to try to build Qt for iOS myself, nor mess with the development tree I use for Krita.
Qt’s documentation was excellent, and pretty soon I had the Tablet example application running on the iPad. It looks a bit weird, because that’s a QWidget-based application, but that’s fine. ClipStudio Pro on iOS also is a compleat Desktop Application, with popup dialogs and menus and everything, so I am sure Apple wouldn’t mind… And the Pencil was supported really well, so that’s very hopeful.
Now I only had to make one more experiment before starting to tackle maybe porting Krita: port the Tablet example to CMake, load that in Qt Creator and use Qt Creator to build it and deploy it to my iPad.
Well, that was my Waterloo. CMake doesn’t officially support iOS yet. GCompris, which does support iOS, manages by providing a qmake file and some manual instructions. Google turns up a ton of conflicting advice, some old and outdated, some newer and more hopeful. I have tried to make a start on it, but no dice yet. If you know how to make CMake build and deploy to an iPad, answers on a postcard, please!
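For the record, the rough shape of the approach that comes up most often is pointing CMake at the iOS SDK by hand and letting the Xcode generator do the signing and deployment. This is an untested sketch, and every flag in it is an assumption rather than a working recipe:

```shell
# Untested sketch: cross-compiling for iOS by hand, in the absence of
# official CMake support. Paths and attribute values are assumptions.
cmake -G Xcode \
    -DCMAKE_SYSTEM_NAME=Darwin \
    -DCMAKE_OSX_SYSROOT=iphoneos \
    -DCMAKE_OSX_ARCHITECTURES=arm64 \
    -DCMAKE_XCODE_ATTRIBUTE_DEVELOPMENT_TEAM=YOUR_TEAM_ID \
    ..
```

Then opening the generated Xcode project and deploying from there, rather than from Qt Creator. Whether that actually produces a deployable app is exactly the open question.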
We left really early this morning on the trains to Würzburg, Hannover, Deventer. I was pretty smart, if I may say so, because I gave ourselves half an hour or more time to change trains. Deutsche Bahn is a wonderful institution, but experience has taught me that especially in summertime, 6 minutes are not enough of a safety margin. So we made all our changes, and even had time to lunch in Würzburg.
So, yesterday really was the last day of Akademy for Irina and me. And all of that day was taken up with the fundraising training by Florian Engel. And it was worth it! Oh, gosh, wake me up in the middle of the night and ask me whether it was worth it!
Practical, to-the-point, flexible, engaging, going deep where we needed that, giving examples from outside of free software so we were all getting new ideas just from those examples. I need to prepare my notes for a discussion during Krita’s Monday meeting, about our September campaign, software platform, and donation page…
For the rest, it was great to meet so many people I hadn’t seen for way too long, including Inge Wallin, with whom I, back in the Nokia days, had founded KO GmbH. Or people I work with every day, but had never met, like Ben Cooksley. Productive discussions about things as diverse as debug symbols in appimages or ways to attract and retain new contributors. Meeting and sitting down with Eliakin, my 2017 student, was awesome as well; pity KDE is so busy that we couldn’t spend more time together!
I went to Akademy feeling that the relationship between Krita and KDE is kind of difficult. Krita is part of KDE, but at the same time, Krita is getting really big. We’re using up quite a chunk of bandwidth: after plasmashell, we’re the project with the second-most bugs reported per year. And still, people working on Krita don’t have much of a tie to KDE, and people working on KDE seldom have much of an idea what’s going on in Krita — other than nodding and telling me Krita is one of KDE’s flagship projects. Sure it is, and I was very much reassured that we’re not using too large a chunk of KDE’s resources, and could even use more. I’m not sure how to “fix” this, if a fix is possible. If we’d have our Krita sprint during Akademy, I’m sure that would help — but it would also be a pretty unproductive sprint for Krita.
Tomorrow, there’s the fund raiser training session. Given that we’ve been raising funds for Krita since time immemorial (our first fund raiser was for two Wacom tablets and art pens so we could implement support for them, the second to let Lukas Tvrdy work on Krita for a couple of months and after that, we’ve had the kickstarters), that might seem superfluous. But I’m still hoping to learn lots. After all, it’s not like we’re exactly awash in money.
But today, we, me and Irina, we went all-out for a day in Vienna. Just took the day off, had a lazy morning with breakfast in the hotel room (tea and croissants…), then took the underground to the Karlsplatz. From there, it was an easy walk to the KHM. Vienna is quite compact.
One thing I love about Vienna is the ubiquitous availability of non-sugary soft drinks. That is, Soda Zitrone: sparkling water with lemon juice. Half a litre of that in the museum cafe rehydrated us sufficiently to go out and see the parts that we hadn’t seen before. The French/Italian/Spanish parts of the museum are not as paralyzing as the Flemish/German/Dutch parts, but there was plenty! In particular, the three portraits of the Infanta of Spain, at ages 4, 6 and 8 (or thereabouts), were touching. Gramps, being the Holy Roman Emperor of the German Nation, had asked his son-in-law for regular updates on his little darling grandchild, and got them, painted by Velázquez.
The Roman/Greek/Egyptian part was curious more than impressive: quantity over quality, perhaps, but still, interesting. It’s also the most unreconstructed part of the museum, with the exhibits often being labeled only in type-written German, on yellowing paper.
Having gone through that section, we were conveniently close to the museum cafe again, where they do serve excellent food. So we lunched there, then went back to our favourites in the dutch/flemish/german paintings sections. I spent half an hour with Rogier van der Weyden again, and if there wouldn’t be that fundraising workshop tomorrow, I would spend an hour in that room again, tomorrow. But we’ve got a year pass and we will return. I like the KHM better than the Bodemuseum in Berlin… There were other paintings I have stared at, trying to remember all of it, like the Reynolds in a little side-room. I was going all squiggly-eyed, so I decided to try and find Irina.
As I was staggering towards the exit, I suddenly became aware of being spoken at by a clean-shaven person, in what I thought was Danish or Swedish or some other language I don’t speak. It turned out to be one of the other Akademy attendees, a Dutchman. I had so much trouble coming down to earth and realizing that he was speaking a language that I could understand! Afterwards, I felt like a loon.
From there, we went out in search of beer. It was, by now, afternoon, and a warm one. We failed though! First we reached the Treasury. Our year pass is valid there as well, and we had been told the Treasury museum is in the medieval part of the Hofburg. And since the Hofburg is, sorry…, weird, it’s like an ordinary, rather plain, apartment building like you find them all over Vienna, we were like, let’s see what the medieval parts look like!
Well, there wasn’t much of that visible. But the presentation was really pretty good: excellent explanations, impressive exhibits, lots of ancient costumes, too. What I really want to know, though, is: how can textile dating back to the Norman kingdom in Sicily, C12, be as smooth and hale as the socks and tunics and orarion are that are shown? Those 1000-year old swords: how can the steel look like it was forged last year? I’m sure it’s that old, but how has it been conserved and preserved like that?
From there we went on, and found a Kurkonditorei — I guess it’s Kur, because you can only get beer in 0.3 and not 0.5 measures, which must have a slimming effect. Still, the beer was cool, my sandwich was good, Irina’s topfenknodel were good too, or so I have been told, and there were so many interesting people to watch… We had another beer.
And then it was time to go back to the hotel, shower, read mail, go out again to the venue area, find that the Bep Viet restaurant was packed, have a pizza at the pizza place, go back again, and realize that this has been one of the nicest Akademys I’ve attended, and that Vienna is one of the nicest places I’ve visited.
On our arrival, Valorie went to the A&O, where most KDE people stayed, and Irina and I went to our “apartment”. We like to dine out, but we also like to cook, especially if we’re somewhere where supermarkets carry things that we cannot get in the Netherlands. So, an apartment. Well, City Center Hoher Markt is not really that kind of apartment: three sparsely furnished rooms (ours with torn and dirty sheets on the bed and filthy rugs on the floor) shared one sparsely equipped kitchen where it would be impossible to cook.
We actually left the apartment four days early to move into a modern hotel near the hauptbahnhof, so we could get clean sheets, clean towels, and bearable room temperatures.
Pity about the great little bakery at the ground level of the block, where they provide the most amazing coffee and really fresh Viennoiseries.
The next day, Friday, we woke up around six in the morning, or rather got up, having had no sleep: too warm, the air fetid with tobacco smoke and burnt frying fat. We decided to get out and walk about until the bakery opened. Like I said, that coffee was amazing.
From there we wandered around a bit, came across the Donau Canal, thought it was the Donau, and that the Donau was rather overrated, realized our mistake, walked around some more, saw some picturesque sights and discovered that Vienna, unlike, say, London, is really quite walkable.
It is a lovely city! If I were a resident, I would perhaps advocate a name change to Chantilly, because everything looks like it’s been smothered in architectural whipped cream and meringues. (After all, what’s a name change or two? Been there, done that, got the kimageshop mailing list.)
People are friendly, and they do not insist on speaking English! So we could exercise our German, compare what we’re used to with what is usual here, and, which is useful, get some excellent coffee while having the totally erroneous feeling that we’re blending in.
Vienna, in fact, is so walkable that we arrived at the KHM at nine o’clock in the morning. That’s an hour early, so we went to sit in the Hofburg Garden, cool down a bit, and watch some people in grey suits sit on whitish horses.
The KHM is awesome. First we got a huge glass of sparkling water with fruit juice in the Museum cafe, then saw the most amazing works of art, then had a great lunch, then went on to see more works of art, until my eyes were bubbling and we just had to leave.
Fortunately, the Akademy pre-reg event was imminent.
My first impression was one of shock: had I grown old and forgotten all those familiar people? Had those people grown so old, or rather, young, that I could no longer recognize them?
Realization soon dawned: this is not only a spectacularly well-attended Akademy, but we have a host of first-time attendees! A show of hands in the main auditorium on the first day suggested that about half of us were here for the first time. That’s just so awesome…
The food at the pre-reg was excellent: dainty, portable, tasty, varied, filling. The beer was nice, the wine generously measured, the meetings with people, some of whom I hadn’t seen for years, heartening.
Saturday and Sunday are the conference days proper, with talks and keynotes, while the rest of the week is hacking and birds-of-a-feather sessions. Keynotes are, at Akademy at least, meant to broaden the attendants’ horizon, enlarge their frame of thinking and make them consider the wide, wide world. This year’s keynotes did that for sure.
Saturday’s was all about how a small band of brave people have the foresight to start collecting information now to support the transition of North Korea to a country under the rule of law. The country led by a man who was so warmly met by the current president of the United States is a place where atrocities are so normal that it’s almost impossible to feel shocked, instead of just soul-weary. Dan Bielefeld, in a very understated, collected and impressive way gathered the threads for us, and made it clear to everyone that this just cannot and will not endure.
Sunday’s keynote by Claudia Garad was, in a way, closer to home, but also really inspirational. In “W for Welcome” she explained how the Wikipedia community works to make contributors welcome. This ties in quite neatly, of course, with one of the three Goals of KDE: privacy, usability, onboarding — goals that adorn all our lanyards! (Those lanyards were designed by Kenny, and are awesome.)
For me, Akademy isn’t so much about presentations, though there were some very cool ones, like Paul Brown demonstrating KDEnlive in a very engaging way — and why don’t we have more presentations showing off how to use this or that KDE application? It’s not like even a majority of KDE people present here have any clue about, say, Krita…
The VVave presentation was unique in that it was one person promoting their own project — and I think we need more of that! That sort of confidence was also expressed in Nate’s talk: the first time in many years that I’ve heard someone in the free software community urge us to disdain the Moon, or Mars, and reach for the stars.
But, for all of that, for me, Akademy is about the conversations, the meetings, figuring out how to share knowledge and making sure we all go home a bit smarter, deeper, wiser — and more engaged than we arrived. That’s even more important than the presentations.
I’ve had wonderful talks with many people, I’ve been able to sit down twice with Eliakin, who did a successful Summer of Code project with Krita last year, and I have met Inge again, my one-time business partner and KOffice/Calligra compatriot — and much-missed friend.
We’ve just had two days of Birds of a Feather sessions. And the KDE e.V. AGM, of course. KDE e.V. is the backbone of the KDE project and community. The yearly general meeting, however, is usually characterized by unbearable tedium. This year’s proceedings were different: all the interesting bits, that is the reports, were given in public, during Akademy, and only the most boring bits, the legally mandated bits, were to be gotten through during the AGM. The goal was twenty minutes. The goal was not met, not by a country mile.
For me, tomorrow is KHM day again, with maybe a side dish of Belvedere, or some strolling about town. Thursday, we’ll have a training in fund raising that will take all day. Timely too, because we want to do another Krita fundraiser in September! And on Friday, we’ll take the train to Würzburg, where we’ll take the train to Arnhem, where we’ll take the train to Deventer, where we’ll discover whether our house still has a roof. It has been said that there have been storms and rains in the Netherlands…
We’ve already got Valorie here in Deventer, and next week we’ll take the slow train, the international train and then the ICE to Vienna, to attend Akademy. Last time Irina and I attended Akademy was in A Coruña, to present the work I had been doing on Plasma Mobile.
This time, I’m not going to present anything. I’ll be around, I want to listen to people, I want to meet people, I want to forget the past two years, which have been really tough, and I want to…
See this thing in real life. Let’s hope the Kunsthistorisches Museum is open, and hasn’t closed the room with the Cellini salt cellar, or loaned it out…
I like fixing bugs… It makes the people whose bugs are fixed happy, it makes Krita better, and it can be done in relatively short stretches of time. And it gives one a sense of having been usefully productive to go to the weekly bug summary and see oneself in the top five of bug resolvers. Not that I’m there right now, though I was last week, because sometimes one has to dig deeper.
These weeks I’m working on refactoring Krita’s resource system. Resources, in graphics app parlance, are things like brushes, gradients and patterns — mostly small files that are stored somewhere on disk and loaded on start-up. This code dates back to 2000 or so and was originally designed for a world where people would have a few dozen of each resource installed, and where brushes and patterns wouldn’t be bigger than 64×64 pixels.
These days, people want libraries containing hundreds of resources, and many resources are huge, like 5000×5000 pixel images. Krita cannot simply load all of that into memory like we’re doing now: it takes too much memory and too much start-up time, and it makes organizing resources too hard for the user. And because it uses the ancient KDE system for finding resources (a tiered lookup through the installation folder, the local installation folder and the user’s folder), some resources cannot be edited, and, as with kxmlgui customization files, any application update can spell disaster.
The whole system will have to be scrapped. We’ll have to have a buffer between the actual resources on disk and the application — a caching database. I kinda feel like I’m jumping down an akonadi-type rabbit hole!
And then there’s tagging and organizing, and all the bugs that 18 years of accretion have fixed, added and papered over. The codebase is the most amazing mix of simple-minded, fiendishly over-complicated and sometimes downright misguided patterns and anti-patterns.
So, I’m coding, for the first time since the export filter warning project a couple of years ago, lots and lots and lots of new code. It’s fun! It’ll take at least two months of solid work, probably more, especially since most of it is actual research…
Still, going so deep and losing oneself in the high of concentrated coding means that bug fixing falls by the wayside — even though the result should be scores of bugs closed — and I feel pangs of guilt. I know that this or that thing is broken, and my fingers itch! But I find it impossible to carry all that’s needed for this refactoring in my head and dig into problems in other systems at the same time.
Some time ago, I compared 2:1 devices, which was a new form factor back then. This time, triggered by an experiment with a Wacom Mobile Studio Pro during the last Krita sprint, I want to look into the various drawing devices I’ve used over the years, and which ones worked well, or not.
Lenovo Thinkpad X61T

This was the first device where I could draw with a pen on the screen. I got it in 2007. The pen technology was Wacom, and it worked with Linux out of the box. The pen was a bit tiny, but could be stored in the laptop itself. The screen only had a 1024×768 resolution, which sounds incredible these days, but it was fine: nobody was creating 4k images back then. The pen was quite accurate, except at the borders of the screen, a familiar Wacom issue. Palm rejection was fine, and it was a very usable little thing. The hinge mechanism was its weak point, though: it turned only one way, and one day someone forced it the other way…
Lenovo Thinkpad Helix
When we were developing Krita Sketch and Krita Gemini, Intel sent us two devices: a Lenovo Helix and a Dell XPS 12. The Dell had a touch screen, but was not pen compatible, the Helix came with a pen, but didn’t have the 2:1 drivers that would switch the device from laptop into tablet mode when ripping it out of the keyboard.
The pen was as tiny as the one in the X61T, and also used Wacom technology. It sort of worked fine, but the device itself always felt cramped when using it for art. Part of that was because the screen was only 11″, part because, when connected to the keyboard, it didn’t bend back far enough, and part maybe because it always felt a little slow. It was fairly heavy, too. It ran Linux perfectly well, but not in tablet mode: I never figured out how to make Linux switch automatically between landscape and portrait mode.
The idea was fun, but it was far from an ideal art device, or even a good device for someone developing an art application.
Microsoft Surface Pro 3
It was going at a reduced price, and I wanted something with an N-Trig pen to test Krita with. Plus, I was thinking of, you know, picking up drawing again, and maybe learning how to use Krita. I got the lowest-end model.
When I got it, it was running Windows 8, which was a good fit for the device. Better than Windows 10, to be honest. I liked the PDF reader that came with Windows 8, which has since been replaced by a web browser. I liked to use the device to read comics, too, using the Comix reader. But…
For using it as an art device, there were some big problems: there’s a tiny but noticeable bit of latency between pen and device. It’s even noticeable when clicking on a menu or a button, and very noticeable when trying to draw. I thought that was the N-Trig pen, or the Bluetooth connection, but later on I learned that this might well be Windows. I never even tried to put Linux on it: this device was for testing Krita on Windows with N-Trig/Windows Ink, and the 64GB SSD was too small to partition.
The pen is thinnish and not too comfortable to hold: more a Bic feeling than a Waterman feeling. Palm rejection while drawing is pretty bad as well, and there seem to be a ton of things that need to be disabled in Windows before things get to a tolerable state, like all the flicks and things.
In sum, it was slow, laggy and burdened by Windows 10 and all its fancy features that only get in the way.
Wacom Cintiq Hybrid Companion
Wacom contacted us in 2013 and offered to donate some devices to the Krita project so we could improve support for them. One of those devices was the Wacom Cintiq Hybrid Companion.
The Cintiq Hybrid Companion was one of the first attempts by Wacom at creating an untethered art device. It felt (and feels) very luxurious: a very nice sleeve, a nice pen case with lots of replacement nibs and a flimsy but pretty stand came with the device. The pen feels great, too.
The device can run independently, and then it runs Android. The Android version never got updated, though, so it’s still stuck at 4. There were a number of interesting art applications included on the Android side, like Manga Canvas. The application I liked best on Android was ArtFlow. I even considered porting Krita to the device, but never got started. Not a big problem… Wacom never made another Android tablet, and Android tablets with pen support are pretty rare these days.
There were, and are, a couple of issues, though. There is a strong parallax effect near the screen edges. The screen is smallish and the resolution isn’t very high. It works best when coupled with a big monitor, two windows and two views on the same image. It’s also rather too heavy to keep on your lap, but as a desk-bound thing it’s fine.
When using it with Windows, touch is unreliable, and when using it with Linux, it is pretty hard to calibrate. Also, probably because of a driver issue, after a couple of strokes it messes up the pointer state, and suddenly every mouse move selects or moves windows, even when nothing is clicked and no modifier key is held.
I keep the Companion around, and I often play with it and use it as a test tablet, but until the Linux Wacom driver bugs get ironed out, I won’t be using it for real stuff.
iPad Pro 12.9″
Last year, I was thinking of porting Krita to iOS and Android, both because there’s demand for it and because we might be able to generate some extra income to fund development by having Krita in more app stores. I decided to start with iOS because there are just very few Android tablets with a real pen. I got an iPad Pro, created a dev account with Apple and put the dev environment on the krita-for-macOS-build-macbook-pro. I played with some demo scribble application, but by that time I had begun to really, really, really dislike iOS.
I dislike its flat ugliness, its lack of consistency, its invasiveness, the ubiquity of advertisements in the “free” apps. I dislike how indiscoverable features can be. I actually bought Procreate for iOS to check out the competition, and this was the first time in years when I had to read the manual to figure things out.
Hardware-wise, it’s a beautiful, if a bit big, tablet. The screen is great. It’s quite fast. The pen is fine as well, if a bit top-heavy, and doesn’t have an eraser end. Charging the pen is ridiculous.
The available software is weird. I tried OpenCanvas, Medibang, Procreate and Autodesk Sketch. OpenCanvas actually has menus, popup dialogs and everything a desktop application has! Medibang looks quite normal by comparison, but looks and feels more like an Android application than an iOS application. Procreate looks and feels native. It’s all quite usable, and all not quite what I want to use, though.
I’m currently using it to read books on C++ in PDF format (I haven’t found a good CBR or ePub reader for iOS yet…). I still intend to try again to port Krita to iOS, but maybe I’d better sell the thing.
Wacom Mobile Studio Pro
A webshop had a barely-used but seriously reduced-price offer for this device. It’s the 16″ model, pretty much Wacom’s flagship pen tablet, running Windows 10. It comes with a dildo^Wcigar tube^W^Wnew style pen holder, and that’s it. No sleeve, no stand, no USB-C-to-something-useful converter. Even if you want to use it Cintiq-like, you have to buy the Wacom Link converter. For a device at this price level (new, it’s more than 3000 euros), that’s a bit mean.
The device itself has its good and its less good points.
Good is the screen: it’s big, bright, high-res and has very good color coverage. Good is the pen: it has a nice weight and with the felt tips feels great when painting. Lots of disk space, choice between Intel and NVidia GPU, also good. Lots of express keys, great.
The Intel Realsense 3D camera never worked, though: it crashes when starting the calibration app. The screen has a yellow splotch in the bottom-left corner (or top-right, depending on how you’re holding it).
It’s heavy, of course, but, well, that’s normal for a 16″ device, and I’ve found various strategies to work with it held on my lap nonetheless. It does get warm, though, especially when we’ve made some little mistake that makes Krita use more CPU than needed.
I’ve used it with Windows 10 for about a year: this works, but I noticed it changed my drawing style. More blobby, rendery, less line work. And I’m now guessing that that’s because this device with Windows 10 has sort of the same problem as the Surface Pro 3: a little bit of latency between pen and device.
Which is weird, because it’s Wacom, so the pen and the screen are directly connected; the pen doesn’t talk to the OS over Bluetooth. So when we had the house full of artists for the sprint, and David, Raghu and Timothee were playing with the device, they all declared they couldn’t work with it like this. We first tried to find out whether we could improve things under Windows. Disabling the Windows window compositor made a bit of difference, but David was still disgusted with the feel of the device.
Then we tried to run Linux on it. A year ago, that was still a big problem, and when Aryeom of Zemarmot got an MSP, Jehan had quite a bit of work to make things run. However, we just plugged an Ubuntu 18.04 USB stick in the USB hub, rebooted, added a second USB stick with Krita and everything worked.
And the latency was gone! The next weekend, I put Kubuntu on it (still waiting for the 18.04 based release of KDE Neon), and that works much nicer. Pity there isn’t a good HiDPI virtual keyboard for Linux/X11 — but I can just keep my normal keyboard connected to it when I’m drawing and sketching at my desk. There are some problems still: the touch screen doesn’t work in Krita (where the touch screen of the hybrid companion works perfectly), and synchronizing the rotation of the screen and the tablet doesn’t work yet.
Lenovo Yoga 920
I’ve also gotten a Lenovo Yoga 920 at a discount, but I haven’t done much with it yet. Now that the battery life of the Surface Pro 3 is gone, the Dell XPS 12’s keyboard is broken and the Helix is out of commission, I wanted something I could take with me (when I go to Akademy, for instance) that I could draw on. But I’ve had very good reports: good Linux compatibility, no latency between pen and computer, and it’s very portable. I’ll be spending some more time today getting Krita up and running on it.
So… My blog originally started out as a book review blog, and to celebrate its return (we moved from a home-hosted server to something cloudy), let’s talk about two gosh-darned awful books.
The thing is this: I’ve been so busy with actually maintaining a 600 kloc project that I’ve neglected keeping up with the changes in the language the project uses. Yet C++ has changed a lot, even if our codebase hasn’t. I did buy Stroustrup’s C++11 Programming Language book, but never had time to read it.
And now we’re at C++17. So I thought I’d get a couple of books with C++17 in the title to help me figure out what has changed, why it has changed, how it has changed and what the changes are good for. I got two books from Apress, which is a Springer imprint.
So let’s go for a quick syntax overview: C++17 Quick Syntax Reference. The author, Mikael Olsson, is a Finn, and weirdly enough his bio is shorter than that of the technical reviewer, Massimo Nardone. Massimo also gets his picture printed; Mikael doesn’t. Judging from the layout, the book is obviously meant either for pre-school children or for people with vision problems even worse than mine: the typeface is enormous. All that would not be a problem, but…
A quick syntax reference has no business explaining how to choose an IDE or how to create a Hello World application. A C++17 syntax reference should also teach modern, 2017-level C++, not 1972-level C. The final chapter, Chapter 27, explains what headers are and why to use them, with this gem of an observation: “C++ requires everything to be declared before it can be used.” Then it goes on to show how to #include a header — heady stuff! After some more kindergarten material, it finishes up explaining include guards.
The book isn’t actually written in broken English, but it is unreadable all the same. Just look at this quote from page 58:
“In addition to passing variables by value, reference, or address, a variable may also be returned in one of these ways. Most commonly, a function returns by value, in which case a copy of the value is returned to the caller.”
Okay, writing this book was a waste of time for the author, unless he’s getting rich from it, which I doubt. It’s a waste of time for the reader, and spending more time on it would be a waste of time for both me and you, my reader. The book will be pulped and recycled.
Next: “Clean C++: Sustainable Software Development Patterns and Best Practices with C++17” by Stephan Roth.
Same publisher, same awful print quality, but since Stephan produced a lot of text, the font size is very small. This time the author gets his mugshot printed, and the technical reviewer, Marc Gregoire, does not. Marc is Belgian, Stephan is German, and I suspect that the copy editor was Martian. The very first sentence is already broken:
“It is still a sad reality that many software development projects are in bad conditions, and some might even be in a serious crisis.”
Some things are nice: the author uses actual code samples from actual projects, like Apache OpenOffice, to show problems. Some chapters have promising titles, like “The Basics of Clean C++”, but then start exhorting the reader that “Names Should Be Self-Explanatory”. I thought I was reading a book on clean C++, not Java for high school students. Apparently, removing the license header from your source files will also make your code more Clean!
“Advanced Concepts of Modern C++” is more interesting, though either I am dumb, or the author was in need of a good editor to help him explain what he means, because much of the text I just cannot follow. I would also have liked a clear explanation of why automatic type deduction is good while, at the same time, we’re exhorted to do Type-Rich Programming. The rest of the book rehashes in an abbreviated way what was explained much better elsewhere: Object Orientation, Functional Programming, Test Driven Development, Design Patterns and UML.
The book promised to show how to use C++17 to write clean code. Instead it regurgitates every bromide from Code Complete and similar books published in the past two decades without adding anything interesting or even talking about C++17 much.
Maybe I’m hypercritical these days… But this book will also be pulped. In any case, any suggestions for something that will teach me to read and write real modern C++17 are very welcome!