
Federico Mena-Quintero

Madrid GNOME+Rust Hackfest, part 3 (conclusion)

The last code I wrote during the hackfest was the start of code generation for GObject interfaces. This is so that you can do

gobject_gen! {
    interface Foo {
        virtual fn frob(&self);
    }
}

and it will generate the appropriate FooIface, just as one would expect from the C way of defining interfaces.
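For reference, this is roughly the shape one would expect: a #[repr(C)] struct holding one function pointer per virtual method. The sketch below is purely illustrative (the placeholder parent type and field names are mine, not gnome-class's actual output):

// Illustrative sketch only, not gnome-class's generated code.
// In real bindings the parent field would be gobject_sys::GTypeInterface
// and the instance pointer would be the proper FFI type for Foo.
#[repr(C)]
pub struct GTypeInterfacePlaceholder {
    g_type: usize,          // stand-ins for the real GTypeInterface fields
    g_instance_type: usize,
}

#[repr(C)]
pub struct FooIface {
    pub parent_iface: GTypeInterfacePlaceholder,
    // one function pointer slot per virtual method declared in the interface
    pub frob: Option<unsafe extern "C" fn(this: *mut std::os::raw::c_void)>,
}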

It turns out that this can share a lot of code with the existing code generator for classes: both classes and interfaces are "just virtual method tables", plus signals and properties, while classes can additionally have per-instance fields and such. I started refactoring the code generator to allow this.

I also took a second look at how to present good error messages when the syn crate encounters a parse error. I need to sit down at home and experiment with this carefully.

Back home

I'm back home now, jetlagged but very happy that gnome-class is in a much more advanced state than it was before the hackfest. I'm very thankful that practically everyone worked on it!

Also, thanks to Alberto and Natalia for hosting me at their apartment and showing me around Madrid, all while wrangling their adorable baby Mario. We had a lovely time on Saturday, and ate excellent food downtown.

Sponsored by the GNOME Foundation

Hosted by OpenShine


From novice to Community Manager

On March 28th, 2018, my contract as Community Manager for LOPSA started. Indeed, of all the adventures I've had in openSUSE, this is the biggest yet. Many readers are probably wondering how my position at LOPSA can be considered an openSUSE adventure. In short, without openSUSE it almost certainly wouldn't have been possible.

The purpose of this article is twofold: first, to thank the openSUSE Community, and second, to invite the openSUSE Community to join me!

To join, sign up at https://lopsa.org/Join-or-Renew and apply the coupon-code 'OPENSUSE' at checkout. You can also email me directly at community@lopsa.org with any questions, comments, or suggestions. 

Thank you openSUSE Community!

I could go on at great length about the numerous valuable things I learned over the years of my participation in the openSUSE project. To do so, however, is beyond the scope of this article and would make it entirely too verbose. That story is for a later time.

The openSUSE Community is full of so many wonderful people, full of energy, passion, skill, and good-will. Volunteering with openSUSE has afforded me some of my best and most memorable experiences in my adult life. It's been a privilege to work with such exceptional people, and my life has certainly been enriched by it.

Giving full credit would be a long crawl, but there are a couple of pivotal figures I feel must be mentioned.
  • Drew Adams is a long-time friend whom I first met in meat-space at a friend's birthday. He got me involved with the openSUSE Project, showing me I didn't need to be a programmer to contribute. As a current LOPSA board member, he alerted me to the Community Manager position, a position I was qualified for thanks to the involvement he drew me into so many years ago.
  • Bryen Yunashko was a member of the openSUSE board when I began participating in the Project. He encouraged me to represent openSUSE at the 2011 Novell Brainshare Expo in Salt Lake City and helped me arrange the means to get there. At the time, I remember arguing that I lacked the experience and knowledge for it; his vehement encouragement pushed me to go and showed me I had not been giving myself enough credit.

Join me and the LOPSA Community!

LOPSA is a great community and organization already. My appointment as their first Community Manager signals an upshift to more dynamic expansion. Besides expanding our Membership and Chapters, we are working on deals and alliances to afford more benefits to our Members. Currently, I'm negotiating a number of initiatives with the Linux Professional Institute (LPI). In conjunction with our newly revamped and relaunched Mentorship Program, we are angling to provide learning opportunities to advance our Members' careers.

I want to invite you, my friends and peers, to join me at LOPSA. Now is a very exciting time to get involved, with huge developments on the horizon as we move toward becoming a much more dynamic organization. I hope we can inject a healthy dose of the openSUSE spirit into LOPSA and have a lot of fun!

To join, sign up at https://lopsa.org/Join-or-Renew and apply the coupon-code 'OPENSUSE' at checkout. You can also email me directly at community@lopsa.org with any questions, comments, or suggestions. 

Federico Mena-Quintero

Madrid GNOME+Rust Hackfest, part 2

Hacking on gnome-class continues apace!

Philippe updated our dependencies.

Alberto made the syntax for per-instance private structs more ergonomic, and then made that code nice and compact.

Martin improved our conversion from CamelCase to snake_case for code generation.
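Roughly the kind of conversion involved looks like this; a simplified sketch for illustration, not gnome-class's actual code:

// Simplified sketch of a CamelCase -> snake_case conversion; not the
// actual gnome-class implementation.
fn camel_to_snake(name: &str) -> String {
    let mut out = String::new();
    for (i, ch) in name.chars().enumerate() {
        if ch.is_uppercase() {
            if i > 0 {
                out.push('_');
            }
            out.extend(ch.to_lowercase());
        } else {
            out.push(ch);
        }
    }
    out
}

// camel_to_snake("MyDerivedWidget") yields "my_derived_widget"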

Daniel added initial support for GObject properties. This is not finished yet, but the initial parser and code generation is done.

Guillaume turned gir, the binding generator in gtk-rs, from a binary into a library crate. This will let us have all the GObject Introspection information for parent classes at compilation time.

Antoni has been working on a tricky problem. GTK+ structs that have bitfields do not get reconstructed correctly from the GObject Introspection information — Rust does not handle C bitfields yet. This has two implications. First, we lose some of the original struct fields in the generated bindings. Second, the sizes of the generated structs are not the same as the original C structs, so g_type_register_static() complains that one is trying to register an invalid class.

Yesterday we got as far as reading the amd64 and ARM ABI manuals to see what the hell C compilers are supposed to do for laying out structs with bitfields. Most likely, we will have a temporary fix in gir's code generator so that it generates structs with the same layout as the C ones, with padding in place of the space for bitfields. Later we can remove this when rustc gets support for C bitfields.
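To make the idea concrete, here is a hand-written sketch of that padding approach (not gir's actual output; the struct and field names are made up):

// Hand-written sketch of the padding workaround; not gir's actual output.
// Suppose the C header declares:
//
//     typedef struct {
//         void *parent;
//         guint in_destruction : 1;   /* bitfields Rust cannot express yet */
//         guint has_focus      : 1;
//     } ExampleWidget;
//
// Both 1-bit fields share a single unsigned int storage unit, so reserving
// those 4 bytes keeps size_of::<ExampleWidget>() equal to the C struct's
// size and keeps g_type_register_static() happy.
#[repr(C)]
pub struct ExampleWidget {
    pub parent: *mut std::os::raw::c_void,
    _bitfield_padding: u32,
}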

I've been working on support for GObject interfaces. The basic parsing is done; I'm about to refactor the code generation so I can reuse the parts that fill vtables from classes.

Yesterday we went to the Madrid Rust Meetup, a regular meeting of rustaceans here. Martin talked about WebRender; I talked about refactoring C to port it to Rust, and then Alex talked about Rust's plans for 2018. Fun times.

Sponsored by the GNOME Foundation

Hosted by OpenShine


Federico Mena-Quintero

Madrid GNOME+Rust Hackfest, part 1

I've been in Madrid since Monday, at the third GNOME+Rust hackfest! The OpenShine folks are kindly letting us use their offices, on the seventh floor of a building by the Cuatro Caminos roundabout.

I am very, very thankful that this time everyone seems to be working on developing gnome-class. It's a difficult project for me, and more brainpower is definitely welcome — all the indirection, type conversion, GObject obscurity, and procedural macro shenanigans take a toll on oneself.

Gnome-class internals

Gnome-class internals on the whiteboard

I explained how gnome-class works to the rest of the hackfest attendees. I've been writing a document on gnome-class's internals, so the whiteboard was a whirlwind tour through it.

Error messages from the compiler

Antoni Boucher, the author of relm (a Rust crate to write GTK+ asynchronous widgets with an Elm-like model), explained to me how relm manages to present good error messages from the Rust compiler when the user's code has mistakes. Right now this is in a very bad state in gnome-class: user errors within the invocation of the procedural macro get shown by the compiler as errors at the macro call, so you don't get meaningful line number information.
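The general technique, sketched below with recent proc_macro2/quote/syn APIs that are not necessarily what relm used, is to keep the span of the offending user tokens and emit a compile_error! pointed at that span, so rustc reports the user's own line:

// Sketch of span-preserving error reporting; not relm's or gnome-class's
// actual code.  Uses syn 1.x item types and the quote_spanned! macro.
use proc_macro2::TokenStream;
use quote::quote_spanned;

fn validate_method(method: &syn::TraitItemMethod) -> Result<(), TokenStream> {
    if method.sig.ident == "new" {
        // Point the error at the user's `fn new` tokens instead of at the
        // whole macro invocation.
        let span = method.sig.ident.span();
        return Err(quote_spanned! { span =>
            compile_error!("`new` cannot be declared as a virtual method");
        });
    }
    Ok(())
}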

For a large part of the day we tried to refactor bits of gnome-class to do something similar. It is very slightly better now, but this really requires me to sit down calmly, at home, and to fully understand how relm does it and what changes are needed in the syn parser crate to make it easy to present good errors.

I think I'll continue this work at home, as there is a lot of source code to understand: the combinator parsers in syn, the error handling scheme in relm, and the peculiarities of gnome-class.

Further work during the hackfest

Other people working on gnome-class are adding support for GObject properties, inheritance from non-Rust classes, and improving the ergonomics of class-private structures.

I think I'll stop working on error messages for now, and focus instead on either supporting GTypeInterfaces, or completing support for type conversions for methods and signals.

Other happenings in Rust

Paolo Borelli has been porting RsvgState to Rust in librsvg. This is the big structure that holds all the CSS state for SVG elements. This is very meticulous work, and I'm thankful that Paolo is paying good attention to it. Soon we will have all the style machinery for librsvg in Rust, which will make it easier to use the selectors crate from Servo instead of libcroco, as the latter is unmaintained.

Food

Food in Madrid

Ah, Spanish food. We have been enjoying cheese, jamón, tortilla, pimientos, oxtail stews, natillas, café con leche...

Thanks

Thanks to OpenShine for hosting the hackfest, and to the GNOME Foundation for sponsoring my travel. And thanks to Alberto Ruiz for putting me up in his house!

Sponsored by the GNOME Foundation


Robert Riemann

First Gem: jekyll-onebox

Initially, I wanted to blog about my travels. In the end, I refactored old code on my computer to eventually publish my first Ruby gem in the official repository at RubyGems. Welcome, jekyll-onebox, now on GitHub and RubyGems! :tada: :clap:

So if you use Jekyll for blogging, you can install this plugin and add HTML previews for links to popular websites very easily.

{% onebox https://github.com/rriemann/jekyll-onebox/blob/master/README.md %}

The previews are rendered using the gem onebox that powers link previews for Discourse forums.

Have fun with it and let me know if you encounter problems!

Klaas Freitag

Kraft Version 0.80 Released

I am happy to announce the release of the stable Kraft version 0.80 (Changelog).

Kraft is desktop software to manage documents like quotes and invoices in small businesses. It focuses on ease of use through an intuitive GUI and a well-chosen feature set, and it ensures privacy by keeping data local.

After more than a dozen years of lifetime, Kraft is now reaching a new level: it is completely ported to Qt5 / KDE Frameworks 5, and with that it is compatible with all modern Linux distributions again.

KDE Frameworks 5 and Qt5 are the best base for modern desktop software and Kraft integrates seamlessly into all Linux desktops. Kraft makes use of the great KDE PIM infrastructure with KAddressbook and Akonadi.

In addition to the port, which unexpectedly took over 12 months, Kraft v. 0.80 got a whole bunch of improvements. To name some examples:

More Flexible Addressbook Integration

As Akonadi is optional now, Kraft can be built without it. Even if it was built with Akonadi but Akonadi is not working properly for whatever reason, Kraft still runs smoothly; in that case it only lacks the convenience of address book integration.

The address book access was also nicely abstracted so that other address book backends can be implemented more easily.

GUI Improvements

Even though the functionality and GUI of Kraft were not changed dramatically compared to the last stable KDE 4 version, there were a few interesting changes in the user interface.

  • A new, bigger side bar simplifies navigation.
  • In the timeline view, clicking on a year or month in the tree view shows summaries of the selected time span, i.e. the number of documents with financial summaries per month or year.
  • A filter allows limiting the view to the current week or month.

Reduction of dependencies

Kraft makes broad use of the core Qt5 libraries. The required KDE dependencies were reduced to a bare minimum. The Akonadi libraries, which enable KDE PIM integration, are now optional. The former dependency on heavyweight web browser components was completely removed and replaced by Qt's far simpler rich text component.

These changes not only make it easier and more transparent to build Kraft, but also make a future port to other platforms like MacOSX easier.

Under the Hood

Countless bugfixes and small improvements went in. Updates to newer C++ concepts, where applicable, also make the rather mature code base more modern and more maintainable.

For example, the Reportlab-based PDF document creation script was updated and merged with a later version.

Deployment

Installing Kraft is still a bit complicated for inexperienced users, and in the past distributions haven't always done a good job of providing the latest version of Kraft.

To make it easier to test, there is an AppImage of Kraft 0.80 available that should be runnable on most modern distributions. Just download a single file, add executable permissions, and it can be started right away.

Linux packages are already built for openSUSE (various versions) or Gentoo.

Kraft’s website will contain a lot more information.

Klaas Freitag

Kraft Version 0.80 is here!

Today Kraft version 0.80 was released.

After more than twelve years of lifetime, a new era begins for Kraft today: Kraft is now ported to Qt5/KDE Frameworks 5 and thus runs without problems on modern Linux distributions again.

KDE Frameworks 5 and Qt5 are the best base for a modern desktop, but Kraft integrates well with all available Linux desktop systems. With KDE, Kraft can use the Akonadi-based address book for address management.

In addition to the port, which took up more than the last 12 months, Kraft v. 0.80 contains a large number of improvements, for example:

Fewer Dependencies

Kraft mainly uses the Qt5 libraries. The KDE Frameworks dependencies were deliberately reduced to the necessary minimum. The Akonadi libraries, which enable the KDE address book integration, are now optional. The previous use of a heavyweight web browser component was completely removed and replaced by a much simpler rich text component built into Qt.

All these changes not only make it easier to build Kraft, but also simplify a possible port to other platforms such as MacOSX or Windows.

Address Book Integration

Since Akonadi is now optional, Kraft can be built without it. Even if it was built with Akonadi but Akonadi does not work for whatever reason, Kraft keeps running without problems. Only the convenience of the address book integration is missing then, and addresses have to be entered manually.

The recommendation is still to use the good integration with the Akonadi-based address book.

Improved User Interface

Even though the functionality of Kraft was not substantially changed in this release compared to the last KDE4-based release, some interesting changes did happen.

  • A new, bigger and simplified sidebar makes navigation easier.
  • In the timeline view, you can now click on a year or a month to see an overview of the documents sent in that time span.
  • A filter allows limiting the displayed documents to the current week or the last month.

Under the Hood

In addition, a large number of further bugfixes and improvements went in. The move to more modern C++ standards also makes further development and maintenance of the software easier.

Installation

Installing Kraft is not easy, and Linux distributions have not always done a good job in the past of shipping the current version of Kraft.

To make it easier to try Kraft, a so-called AppImage of Kraft is now offered, which should run on most modern Linux systems. Only a single file needs to be downloaded for that.

Federico Mena-Quintero

Refactoring some repetitive code to a Rust macro

I have started porting the code in librsvg that parses SVG's CSS properties from C to Rust. Many properties have symbolic values:

stroke-linejoin: miter | round | bevel | inherit

stroke-linecap: butt | round | square | inherit

fill-rule: nonzero | evenodd | inherit

StrokeLinejoin is the first property that I ported. First I had to write a bit of machinery to allow CSS properties to be kept in Rust-space instead of the main C structure that holds them (upcoming blog post about that). But for now, I just want to show how this boiled down to a macro after refactoring.

First cut at the code

The stroke-linejoin property can have the values miter, round, bevel, or inherit. Here is an enum definition for those values, and the conventional machinery which librsvg uses to parse property values:

#[derive(Debug, Copy, Clone)]
pub enum StrokeLinejoin {
    Miter,
    Round,
    Bevel,
    Inherit,
}

impl Parse for StrokeLinejoin {
    type Data = ();
    type Err = AttributeError;

    fn parse(s: &str, _: Self::Data) -> Result<StrokeLinejoin, AttributeError> {
        match s.trim() {
            "miter" => Ok(StrokeLinejoin::Miter),
            "round" => Ok(StrokeLinejoin::Round),
            "bevel" => Ok(StrokeLinejoin::Bevel),
            "inherit" => Ok(StrokeLinejoin::Inherit),
            _ => Err(AttributeError::from(ParseError::new("invalid value"))),
        }
    }
}

We match the allowed string values and map them to enum values. No big deal, right?

Properties also have a default value. For example, the SVG spec says that if a shape doesn't have a stroke-linejoin property specified, it will use miter by default. Let's implement that:

impl Default for StrokeLinejoin {
    fn default() -> StrokeLinejoin {
        StrokeLinejoin::Miter
    }
}

So far, we have three things:

  • An enum definition for the property's possible values.
  • impl Parse so we can parse the property from a string.
  • impl Default so the property knows its default value.
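Put together, a hypothetical caller (not librsvg's actual call site) could use these pieces like this:

// Hypothetical caller; librsvg's real property-handling code looks different.
fn stroke_linejoin_from_attr(value: Option<&str>) -> StrokeLinejoin {
    value
        .and_then(|s| StrokeLinejoin::parse(s, ()).ok()) // "round" -> Round, etc.
        .unwrap_or_default()                             // missing value -> Miter
}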

Where things got repetitive

The next property I ported was stroke-linecap, which can take the following values:

#[derive(Debug, Copy, Clone)]
pub enum StrokeLinecap {
    Butt,
    Round,
    Square,
    Inherit,
}

This is similar in shape to the StrokeLinejoin enum above; only the names differ.

The parsing has exactly the same shape, and just different values:

impl Parse for StrokeLinecap {
    type Data = ();
    type Err = AttributeError;

    fn parse(s: &str, _: Self::Data) -> Result<StrokeLinecap, AttributeError> {
        match s.trim() {
            "butt" => Ok(StrokeLinecap::Butt),
            "round" => Ok(StrokeLinecap::Round),
            "square" => Ok(StrokeLinecap::Square),
            "inherit" => Ok(StrokeLinecap::Inherit),

            _ => Err(AttributeError::from(ParseError::new("invalid value"))),
        }
    }
}

Same thing with the default:

impl Default for StrokeLinecap {
    fn default() -> StrokeLinecap {
        StrokeLinecap::Butt
    }
}

Yes, the SVG spec has

default: butt

somewhere in it, much to the delight of the 12-year old in me.

Refactoring to a macro

Here I wanted to define a make_ident_property!() macro that would get invoked like this:

make_ident_property!(
    StrokeLinejoin,
    default: Miter,

    "miter" => Miter,
    "round" => Round,
    "bevel" => Bevel,
    "inherit" => Inherit,
);

It's called make_ident_property because it makes a property definition from simple string identifiers. It has the name of the property (StrokeLinejoin), a default value, and a few repeating elements, one for each possible value.

In Rust-speak, the macro's basic pattern is like this:

macro_rules! make_ident_property {
    ($name: ident,
     default: $default: ident,
     $($str_prop: expr => $variant: ident,)+
    ) => {
        ... macro body will go here ...
    };
}

Let's dissect that pattern:

macro_rules! make_ident_property {
    ($name: ident,
//   ^^^^^^^^^^^^ will match an identifier and put it in $name

     default: $default: ident,
//            ^^^^^^^^^^^^^^^ will match an identifier and put it in $default
//   ^^^^^^^^ arbitrary text

     $($str_prop: expr => $variant: ident,)+
//                     ^^ arbitrary text
//   ^^ start of repetition               ^^ end of repetition, repeats one or more times

    ) => {
        ...
    };
}

For example, saying "$foo: ident" in a macro's pattern means that the compiler will expect an identifier, and bind it to $foo within the macro's definition.

Similarly, an expr means that the compiler will look for an expression — in this case, we want one of the string values.

In a macro pattern, anything that is not a binding is just arbitrary text which must appear in the macro's invocation. This is how we can create a little syntax of our own within the macro: the "default:" part, and the "=>" inside each string/symbol pair.

Finally, macro patterns allow repetition. Anything within $(...) indicates repetition. Here, $(...)+ indicates that the compiler must match one or more of the repeating elements.

I pasted the duplicated code, and substituted the macro's bindings for the actual symbol names:

macro_rules! make_ident_property {
    ($name: ident,
     default: $default: ident,
     $($str_prop: expr => $variant: ident,)+
    ) => {
        #[derive(Debug, Copy, Clone)]
        pub enum $name {
            $($variant),+
//          ^^^^^^^^^^^^^ this is how we invoke a repeated element

        }

        impl Default for $name {
            fn default() -> $name {
                $name::$default
//              ^^^^^^^^^^^^^^^ construct an enum::variant

            }
        }

        impl Parse for $name {
            type Data = ();
            type Err = AttributeError;

            fn parse(s: &str, _: Self::Data) -> Result<$name, AttributeError> {
                match s.trim() {
                    $($str_prop => Ok($name::$variant),)+
//                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ expand repeated elements

                    _ => Err(AttributeError::from(ParseError::new("invalid value"))),
                }
            }
        }
    };
}

Getting rid of duplicated code

Now we have a macro that we can call to define new properties. Librsvg now has this, which is much more readable than all the code written by hand:

make_ident_property!(
    StrokeLinejoin,
    default: Miter,

    "miter" => Miter,
    "round" => Round,
    "bevel" => Bevel,
    "inherit" => Inherit,
);

make_ident_property!(
    StrokeLinecap,
    default: Butt,   // :)

    "butt" => Butt,
    "round" => Round,
    "square" => Square,
    "inherit" => Inherit,
);

make_ident_property!(
    FillRule,
    default: NonZero,

    "nonzero" => NonZero,
    "evenodd" => EvenOdd,
    "inherit" => Inherit,
);

Etcetera. It's now easy to port similar symbol-based properties from C to Rust.
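Each invocation expands to a full enum plus its Parse and Default impls, so the generated types can be exercised directly. A hypothetical sanity test (not from librsvg's actual test suite) would look like this:

#[cfg(test)]
mod tests {
    use super::*;

    // Hypothetical checks against the macro-generated properties.
    #[test]
    fn generated_symbol_properties_work() {
        match StrokeLinecap::parse("square", ()) {
            Ok(StrokeLinecap::Square) => (),
            _ => panic!("expected StrokeLinecap::Square"),
        }

        match FillRule::default() {
            FillRule::NonZero => (),
            _ => panic!("expected FillRule::NonZero as the default"),
        }

        assert!(StrokeLinejoin::parse("dotted", ()).is_err());
    }
}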

Eventually I'll need to refactor all the crap that deals with inheritable properties, but that's for another time.

Conclusion and references

Rust macros are very powerful for refactoring repetitive code like this.

The Rust book has an introductory appendix to macros, and The Little Book of Rust Macros is a fantastic resource that really dives into what you can do.

Klaas Freitag

Kraft out of KDE

Following my last blog post about Kraft's upcoming release 0.80, I got a lot of positive reactions.

There was one reaction, however, that puzzles me a bit, and I want to share my thoughts here. It is about a comment on my announcement that I prefer to continue developing Kraft on Github. The commenter reminded me in a friendly way that there is still Kraft code on KDE infrastructure, and that switching to a different repository might waste people's time when they work with the KDE repo.

That is a fair statement; of course I don't want to waste people's time. What sounds a bit strange to me is the second paragraph, which says that if I decide to stay with Github, I should let KDE people know that I wish Kraft not to be a KDE project anymore.

But … I never felt that Kraft should not be a KDE project any more.

A little History

Kraft has come a long way together with KDE. I started Kraft in (probably) 2004, gave a talk about Kraft at Akademy 2006 in Dublin, and have maintained it with the best effort I could contribute until today. There is a small but loyal community around Kraft.

During all that time I got little substantial contribution to the code directly, with the exception of one cool developer who got interested for some time and made some very interesting contributions.

When I asked for the subdomain http://kraft.kde.org a long time ago, I got the reply that it is not in the interest of KDE to give every little project a subdomain. As a result I reserved http://volle-kraft-voraus.de and have run it since then, happily showing a "Part of the KDE family" logo on it.

Besides the indirect contributions to libraries that Kraft uses, I shipped Kraft with the translations made by the KDE i18n team, for which I was always very grateful. Otherwise I got no other services from KDE.

Why Github?

Github's workflow serves me well in my day job, and since I have only a little time for Kraft, I like to use the tools that I know best and that give me the most efficiency.

I know that Github is not free software, and I am sceptical about that. But Github also does not lock us in, as we are still on git. We all know the arguments that usually come to the table at this point, so I am not elaborating here. One thing I want to mention, though, is that since I moved to Github publicly, I have already gotten two little pull requests with code contributions. That is a lot compared to what came in over the last twelve years of living on KDE infrastructure only.

Summary

Kraft is a small project, driven by me alone. My development turnaround is good with Github, as I am used to it. Even if no KDE developer ever looked at Github (which I know is not true), I have to say with a heavy heart that Kraft would not take big harm from leaving KDE's infrastructure, based on the experience of the last 12 years.

If the KDE translation teams do not want to work with Github, I am fine with accepting that, and I wonder if there could be a solution other than switching to Transifex.

One point, however, I would like to make very clear: I did not wish to leave KDE, nor did I aim to move Kraft out. I still have friends in the KDE community, I am still very interested in free software on the desktop and elsewhere, and my opinion is still that KDE is the best around.

If the KDE community feels that Kraft must not be a KDE project any longer because it is on Github, ok. I asked KDE Sysadmins to remove Kraft from the KDE git, and it is already done.

Kraft now lives on, on Github.