Touching on Touchscreen Support

Touchscreens are no longer just for tablets and phones. Touchscreen laptops and desktops are becoming increasingly common in the computer market. Much of this has been spurred on by Microsoft and Windows 8, whose “Modern” interface is about as touchscreen-friendly as you can get. In fact, it is what is driving the laptop market to include capacitive touchscreens.

In April of this year, I was gifted a touchscreen laptop preloaded with Windows 8.1. I had used Windows 8 before, but was not impressed with it until I tried it on a touchscreen. With the touchscreen (but only with the touchscreen), it is an extremely friendly interface. As long as you know how to swipe from the sides, top, or bottom, you’re good to go.

However, I’m a Linux user. At the time, I was using openSUSE 13.1 as my daily driver. I managed to move my installations from my old machine to the new one using Clonezilla. However, they wouldn’t boot until I had converted them to boot with UEFI (a post for another time, perhaps).

I had two openSUSE installations: one GNOME, one KDE. Being the experimenter I am, I tried the touchscreen experience with both. I was unimpressed. Knowing of Ubuntu’s vision for touchscreens, I decided to install Ubuntu 14.04. I was blown away by how good the touchscreen support was in Unity.

Since then, I have evaluated the main Linux desktop environments for their touchscreen gesture support. Here is my evaluation:

LXDE/LXQt

Nonexistent. In this case, I can understand why: the developers have no desire to support touchscreen gestures. These desktops target older computers that need as little overhead as possible while still providing a complete desktop, and very few (if any) of those machines have touchscreens.

Xfce

Nonexistent. Again, a situation where the developers have no desire to support touchscreens. In fact, given the “sloth on Benadryl stuck in molasses in January” development pace of this desktop, I can’t imagine it will ever support touchscreen gestures.

MATE

Again, nonexistent. Since the MATE desktop is a fork of the GNOME 2.x desktop, and is targeted at users who miss GNOME 2.x, I doubt touchscreen gesture support will come to fruition. However, knowing at least one of the developers, I wouldn’t put it past them to add it in the distant future.

Cinnamon

Though this desktop is forked from the newer GNOME 3.x desktop, touchscreen gesture support is entirely nonexistent. That is understandable, since the target users for this desktop are similar to those of the MATE desktop: Cinnamon is intended to deliver the GNOME 2.x experience on GNOME 3.x technology.

KDE Plasma (4 and 5)

This one is tricky. The Homerun Launcher supports one-finger touchscreen gestures amazingly well, but the rest of the Plasma experience is lackluster at best. KDE did have Plasma Active going for a while, but it was targeted squarely at tablets, with no vision for touchscreen laptops or desktops. As such, it has no mouse or keyboard support that I could find. If KDE could merge the features of Plasma Active into Plasma sometime in the 5.x releases, they might have something going. I do know the vision is there, and Aaron Seigo did at least touch on this.

GNOME 3.12

GNOME 3.12 showed promise, though the Activities Overview and certain GNOME apps were the only real areas with touch gesture support. Dragging windows by the titlebar was fairly straightforward, but trying that on Qt apps would crash GNOME Shell. This wouldn’t be fixed until…

GNOME 3.14

As of this writing, GNOME 3.14 has yet to go stable in any distro. I’ve been trying the beta (3.13.92) on my laptop in Fedora 21 Alpha, which probably has a few more bugs than the stable release will. Gone is the Qt app dragging bug, and the new gesture support is really coming along. Gesture support is also now native to GTK 3.14, but until developers of third-party (read: non-GNOME) apps and distros are willing to support it, it’s mostly going to be confined to the GNOME apps.

Ubuntu/Unity

Since Ubuntu 14.04 was released back in April, Unity’s support for touchscreen gestures has been absolutely amazing and second to none. None of this should be surprising, considering Canonical’s commitment to mobile devices such as tablets and smartphones. The touchscreen gestures in Unity are, in my opinion, much more intuitive than the GNOME 3.14 offerings. For example, in GNOME 3.14, to drag a window you must use a single finger on its titlebar. In Unity, you can drag a window from any point by moving three fingers. Additionally, simply tapping the window with three fingers will not only let you drag it with one finger from a handle in the center, but also resize it with eight other touch points shown on the window (which fade after one second of inactivity). Touchscreen support is so integrated into the Unity desktop that Ubuntu has enabled it in Chromium by default! I can’t get touchscreen support in any other browser or desktop without add-ons.

Touchscreen computers are not going away anytime soon. In my opinion, with the exception of Unity, this is an area where the Linux desktop is severely lacking and behind. As these touchscreen laptops and desktops become more common, we, as a Linux community, need to become proactive (rather than reactive) about supporting and integrating these technologies. Otherwise, we’re going to continue to appear to the rest of the world like we’re computing in the dark ages.