On touchscreen devices, require only one finger to perform navigation gestures
The new GNOME desktop has a lot of cool features aimed at making touchpads and touchscreens more welcome in the Linux ecosystem than ever before. That said, I believe the current implementation of the touchscreen navigation gestures is obtuse at best.
Abstract
When using a modern touchpad, there are 20 years' worth of learned behavior to draw on. You move your finger along the surface to move the mouse cursor. You tap, click, or otherwise apply pressure to click as you would with a mouse. You use two fingers to scroll and zoom, and three or more fingers for other mundane UI tasks, from switching desktops to opening an overview.
The touchscreen is fundamentally different. Much like the pointer functionality on the Nintendo Wii, interacting with things by touching them directly feels "natural". You don't need to move a mouse cursor to click things. You don't need to think twice about scrolling and zooming, since back in 2007 we learned to move the content itself with a finger. A single finger has multiple intuitive uses in this paradigm, which is why every touchscreen gesture implementation I would consider "good" uses a maximum of two fingers (and even then, two fingers only for zooming in and out of content). By applying the touchpad philosophy to the touchscreen, GNOME breaks this convention in a way I would not consider natural.
Implementation
I have skimmed the source code, specifically here, and I'm confident most of the changes can be made solely within this file. On line 482, set the finger count specified by GESTURE_FINGER_COUNT to a separate constant equal to 1, and it's about ready to go.
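A minimal sketch of what I mean, assuming the constant split happens in the same file; the surrounding names are paraphrased from my reading of the code rather than an exact diff:

```js
// Existing constant, used for the multi-finger touchpad swipe.
const GESTURE_FINGER_COUNT = 4;
// Proposed touchscreen-specific constant.
const TOUCHSCREEN_FINGER_COUNT = 1;

// Where the touchscreen gesture is constructed (around line 482),
// pass the new constant instead of GESTURE_FINGER_COUNT:
let touchGesture = new TouchSwipeGesture(allowedModes,
    TOUCHSCREEN_FINGER_COUNT,
    Clutter.GestureTriggerEdge.AFTER);
```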
Of course, it's not as easy as that. Touchscreen scroll events also use one finger, so with a one-finger threshold, every attempt to scroll would trigger the navigation gesture instead. Zoom gestures break for the same reason. I see two potential solutions to this problem:
- Detect whether a touchscreen gesture starts at the edge of the screen before firing it,
- Use the shell UI as a "grabbable" anchor to start a gesture, so that gestures don't fire when interacting with an active window.
A combination of both would probably be ideal, since the Overview page also exposes a number of one-finger actions of its own. Rough sketches of both ideas follow.
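For the edge check, a hypothetical sketch, assuming it would live in the touch gesture's vfunc_gesture_prepare() override; EDGE_THRESHOLD and the exact hook point are my assumptions, while get_press_coords() is existing Clutter.GestureAction API:

```js
const EDGE_THRESHOLD = 32; // px from a screen edge that counts as "edge" (my guess)

vfunc_gesture_prepare(actor) {
    if (!super.vfunc_gesture_prepare(actor))
        return false;

    // Point 0 is the first (and, for a one-finger gesture, only) touch.
    let [x, y] = this.get_press_coords(0);

    // Only arm the navigation gesture when the touch begins near a
    // screen edge; otherwise let ordinary scrolling and zooming happen.
    return x < EDGE_THRESHOLD ||
           x > global.screen_width - EDGE_THRESHOLD ||
           y < EDGE_THRESHOLD ||
           y > global.screen_height - EDGE_THRESHOLD;
}
```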
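For the "grabbable" anchor, one possible approach (again only a sketch, with Main.panel chosen arbitrarily as the anchor) is to attach the touch gesture to a piece of shell chrome instead of the whole stage, so touches that begin inside an application window never reach it:

```js
const Main = imports.ui.main;

// Instead of listening on the whole stage...
// global.stage.add_action(touchGesture);

// ...listen only on shell chrome, e.g. the top bar:
Main.panel.add_action(touchGesture);
```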
Desirability
I recently put together a quick proof of concept and posted it on Reddit, to a very positive response. I believe that, with the correct implementation, people would use this feature.