[Proposal] Optional system-wide gestures

TLDR: This post proposes a range of additional system-wide gestures which can be optionally and individually enabled and configured by every user to make the workflows within Sailfish OS and its apps more accessible and fluent.

Foreword

  • This proposal was already posted on TJC two years ago. However, since I doubt that the content on TJC will ever be migrated to this forum, I am posting it here again with some amendments.
  • This proposal does not replace any already existing ways of using Sailfish OS and its apps. Rather, it introduces some additional ways which - if the user individually enables them - increase efficiency and accessibility, and which might also help to “declutter” the screen of some of the permanently visible UI elements that have grown more numerous with every new release of Sailfish OS.
  • In other words: This proposal tries to give those users, who want to have it, more of the “traditional” Sailfish OS feeling (as still described on the official website), without cutting down on features or possibilities.
  • The best place for configuring these additional gestures is the gestures page in the Settings app. To make the users acquainted with them, every applicable option on this page could have an info button next to it, that will invoke a respective mini tutorial (similar to what the Tutorial app does for explaining the core basics of using Sailfish OS).
  • The descriptions of the already existing features (native features, not patched features) relate to what exists in Sailfish OS as of version 3.4.0.24 (Pallas-Yllästunturi). I haven’t tried version 4.0.1.48 (Koli) yet.

What’s already existing: Swipes FROM the screen edge

In Sailfish OS, we currently have “Swipe from the screen edge towards the screen center” gestures:

  • A swipe from the right screen edge takes you to “Home”, with the covers of all your currently running apps.
  • A swipe from the left screen edge takes you to “Events” (if the “Quick Events access” option is enabled), where you can see your current notifications.
  • A swipe from the bottom screen edge takes you to the app grid, from where you can launch your installed apps.
  • A swipe from the top screen edge closes the currently used app (when done from either corner) or takes you to the top menu (when done from the middle), where you can switch ambiences, trigger quick actions and toggle settings.

What can be additionally enabled: Swipes AT the screen edge

In addition to those “Swipe FROM the screen edge” gestures, this proposal introduces “Swipe AT the screen edge” gestures. They are performed by simply placing one finger at the desired screen edge (still inside the screen area, of course :wink:) and moving it along this edge instead of towards the screen center. Here is what they’ll do (a rough recognition sketch follows after this list):

  • Right screen edge:

    • Zoom in and out (e.g. in the Gallery app):
      No need to use your second hand or place your device on a surface first. Of course, you can still use the well known 2-finger pinch gesture, if you like.

    • Aimed scrolling (e.g. in the People app or in the Gallery app):
      Actually, the People app already implements this concept with a permanently visible letter bar. While you move your finger over this letter bar, an additional hint banner appears, telling you where you currently are in your contacts list. In a similar fashion, users could quickly scroll to a desired capture date in the Gallery app. The difference, however, would be that no date bar is permanently visible in this case; the hint banner only appears while you are scrolling, to keep the screen uncluttered. Of course, you can still reach that specific photo right in the middle of a list of thousands of photos by flicking your finger over the screen more than a dozen times.

  • Left screen edge: Access top/bottom pulley menu
    No need to scroll up/down the entire page in order to access a pulley menu (if one is available). With this gesture, it’s immediately accessible no matter where you currently are on the page. Of course, you can still scroll to the top/bottom of the page first, which is no problem, but how do you conveniently get back to where you initially were?
    For Android apps, which AFAIK do not have pulley menus, this gesture could instead move the entire app down and back up on the screen, so that UI elements at the top can be reached more easily, especially on large devices. A similar idea, which however proposes a different gesture, can be found here.

  • Bottom screen edge: Go to next/previous element
    No need to bring up any navigation bar first. Immediately switch to the next/previous webpage, document page, picture, video, audio track, e-mail, and so on. Of course, you can still use the already existing navigation methods, if you like.

  • Top screen edge: Control a user-selectable “secondary” app
    No need to distract your workflow by having to temporarily switch apps or use cover buttons for small actions, such as skipping to the next audio track in the media player. Of course, you can still do it by manually switching apps or using cover buttons.
    Furthermore, instead of selecting a secondary app, the currently used app could also be automatically selected to receive this gesture.

For maximum freedom of choice, users should not only be able to select which of these gestures they want to use, but also be able to assign their favorite screen edge to them.
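
To make the distinction between the existing “FROM the edge” gestures and the proposed “AT the edge” gestures more concrete, here is a minimal, purely illustrative QML sketch. All element names, signals and thresholds are my own assumptions for this post, not an existing Sailfish OS API; a real implementation would live in the compositor. The core idea is simply to compare the movement components parallel and orthogonal to the edge:

```qml
import QtQuick 2.0

// Illustrative only: a thin touch strip just inside the right screen edge.
// "edgeSwipe" and "edgeFlick" are invented signal names for this sketch.
MouseArea {
    id: rightEdgeStrip
    width: 20
    anchors { top: parent.top; bottom: parent.bottom; right: parent.right }

    property real startX
    property real startY

    signal edgeSwipe(real delta)   // finger moves along the edge (proposed)
    signal edgeFlick(real delta)   // finger moves towards the center (existing)

    onPressed: { startX = mouse.x; startY = mouse.y }
    onPositionChanged: {
        var dx = mouse.x - startX
        var dy = mouse.y - startY
        if (Math.abs(dy) > Math.abs(dx))
            edgeSwipe(dy)          // e.g. zooming or aimed scrolling
        else if (dx < 0)
            edgeFlick(dx)          // hand over to the existing edge swipe
    }
}
```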

More optional gestures: Double-taps AT the screen edge

Instead of swiping at the screen edges, users can also double-tap on these regions to trigger additional actions related to those described above:

  • Right screen edge:

    • Reset zoom level (e.g. in the Gallery app)
    • Open search function (e.g. in the People app or in the Gallery app)
  • Left screen edge: Switch between the 2 last-used apps (similar to what Alt+Tab does on a desktop computer, and a different approach to this gesture)

  • Bottom screen edge:

    • Browser: Go to the tab selection screen
    • Gallery: Close the current picture/video
    • Email: Close the current e-mail
    • Media: Pause/resume playback
    • Documents: Close the current document
    • and so on…
  • Top screen edge: Action depends on which app has been selected as “secondary” app

Again, users should be able to configure and also disable these gestures, since it surely takes some practice to use them effortlessly, or they might be a hindrance in certain use cases.
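
For illustration, the double-tap variant would be even simpler to detect than the swipe variant. Again, the signal name below is invented for this sketch, and a real implementation would belong in the compositor:

```qml
import QtQuick 2.0

// Illustrative only: a thin strip along the bottom screen edge that reports
// double-taps. "bottomEdgeDoubleTap" is an invented signal name.
MouseArea {
    id: bottomEdgeStrip
    height: 20
    anchors { left: parent.left; right: parent.right; bottom: parent.bottom }

    signal bottomEdgeDoubleTap()

    onDoubleClicked: bottomEdgeDoubleTap()
}
```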

Assisting the user by providing tooltips

To assist the user in discovering/remembering which gesture does what, tooltips can be enabled. If the user places one finger at any of the screen edges without swiping or tapping, icons or similar visual hints will appear, telling what will happen if the user swipes in a certain direction from the current position or double-taps at this position. If the user lifts the finger without having moved it, the tooltips automatically disappear and no action is triggered.

These tooltips could also include more specific information. In case of the Media Player for example, this additional information could be the title, elapsed time, remaining time, or even the cover art of the currently playing audio track.

Home screen: What’s already existing

Currently, we have the following possibilities on “Home”:

  • Tapping on an app cover switches to that app.
  • Tapping on a cover button of an app triggers the respective cover button action.
  • Tapping on a free spot on the screen briefly brings up the topmost row of the app grid to conveniently launch your most often used apps.
  • Tapping on the top screen edge briefly brings up the 2 topmost rows of the top menu.
  • Long-pressing anywhere on the screen switches to “Close/Rearrange apps” (housekeeping) mode.
  • Swiping left or right from anywhere inside the screen takes you to “Events”.
  • Swiping up from anywhere inside the screen takes you to the app grid.
  • Swiping down from anywhere inside the screen takes you to the top menu.

Home screen: How it can be enhanced

If enabled by the user, it now makes a difference whether you swipe up/down from a free spot inside the screen or from an app cover:

  • Swiping up from a free spot inside the screen still takes you to the app grid.
  • Swiping up from an app cover reveals additional cover buttons (if the app provides any). Depending on the current size of the app cover, up to 6 such additional buttons seem well realizable IMHO. This would greatly improve the usefulness of the cover button concept, because apps would no longer be limited to just 2 cover buttons (see the sketch after this list). The presence of such additional buttons could be indicated to the user with a glowing line at the bottom of the app cover, similar to the glowing line that indicates the presence of a pulley menu.
  • Swiping down from a free spot inside the screen still takes you to the top menu.
  • Swiping down from an app cover will close that app. Of course, you can still enter housekeeping mode in the known way to batch-close your apps.
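
For reference, this is roughly what a present-day Silica cover with its maximum of two cover actions looks like; the “player” object is a placeholder for an app’s actual media backend. The proposal above would let a swipe on the cover reveal further CoverAction entries beyond that limit:

```qml
import QtQuick 2.0
import Sailfish.Silica 1.0

// A typical Silica cover as it exists today, limited to two cover actions.
// "player" is a placeholder for the app's actual media backend.
CoverBackground {
    Label {
        anchors.centerIn: parent
        text: qsTr("Media")
    }

    CoverActionList {
        CoverAction {
            iconSource: "image://theme/icon-cover-previous-song"
            onTriggered: player.previous()
        }
        CoverAction {
            iconSource: "image://theme/icon-cover-next-song"
            onTriggered: player.next()
        }
    }
}
```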

Tapping on a free spot anywhere inside the screen could also be enhanced natively. Instead of just briefly displaying a part of the app grid, a part of the top menu could be displayed at the same time. The number of app grid and top menu rows should be individually configurable, as well as the time delay after which the menus automatically disappear again. Also see this question on TJC.

The long edge swipe that does app switching: I would like to use that to close the app instead.

I would like to have all the gestures listed and, just like keyboard shortcuts, be able to configure which actions I want them to perform:
minimize app
close app

Personally, I am against overcomplicating the gestures in SFOS. Even the “edge” gesture from the top to close the app, which sits right next to the center gesture from the top that shows the lock menu, is kind of pushing it.

It should be no more than one edge gesture, one action. And I would be totally against gestures that incorporate moving your finger in multiple directions.

I can only agree with @ApB. But for closing apps, the gesture was there first, and I don’t need the top menu inside apps; it was perfectly fine to have it just on the home screen.

Anyway, back to the concrete suggestions here.
The edges are what you risk touching by mistake, and making them more magic will mean more unintended actions happening. Here is some general feedback, and I apologize in advance if it sounds mean; I’m just trying to be to-the-point.

Swipe at right screen edge:
I can’t see how you could get around having to hook up the zoom individually in each app to the particular element you want, which makes this not so global. At least not without really dirty tricks that risk breaking things.
How is this mode of scrolling better than the one you already acknowledge exists? It sounds like you just want that in more apps. And similarly, each app would still have to “publish” the property to be displayed here, so this too is not something that can easily be made into a global system gesture.

Swipe at left screen edge
(Long) scroll plus pulley is just bad design in the first place, and thankfully it seems to be a somewhat rare construct. That holds at least for the use case you mentioned, where you end up having to get back down; there are other places with unrelated use cases. I’m not a fan of toolbars, but that is the solution for this case.

Swipe at bottom screen edge
Here some of the hooking-up might be able to be done by the OS, but not all of it, so it would still not be that much of a system gesture if only participating apps have it. Some of the things you ask for have full-page swipes already, and where they can, they definitely should. The browser and email definitely need more gestures, but isn’t that really mostly it?

Swipe at top screen edge:
(Who can even reach this nowadays?)
Seems very preliminary in its use cases… I don’t know what to say… covers are pretty quick already.
And with it being user-selectable, you’d be limited to one target app, which might not even be the one you are using at the moment.

Double taps
Now, these actions you mention here basically all exist in-app already.
Zoom is reset by double-tapping anywhere on an image.
Going back to the gallery from viewing an image is done by swiping up or down.
Closing an email is the same as going back, with the normal swipe. The same goes for documents.

That doesn’t leave much on the list, just a few apps needing improvements, not something to warrant system-wide gestures.

Tooltips
If something needs a tooltip or an instruction, it is too complicated. Good design doesn’t need explaining, and if you feel it will even need re-explaining, you are on really thin ice.

Home screen:
A lot of your suggestions seem to assume there is some free spot to swipe on. I often wouldn’t have one available. Making normal functionality conditional on that doesn’t sound like a good idea. I too miss swiping on covers, but with the current paradigms, and in order not to clutter things up, the home screen is about as action-packed as it can get already.

@pawel.spoon, @ApB, @attah: Thanks for your comments and criticism.

When I wrote “system-wide”, I meant respective libraries / interfaces / signals / etc. that are provided by the operating system to all apps, just like there are libraries with Sailfish OS-specific GUI elements that all developers can use to shape the visual appearance of their apps.
Apps can hook themselves to those interfaces (or part of them) and react through them to the gestures in whatever way their developer sees fit, or they can completely ignore them. There shouldn’t be any obligations, except for some design best-practices defined by Jolla maybe.

Regarding all the examples that I posted: I just tried to point out how all these proposed additional gestures could be used in the apps already shipped with Sailfish OS. Third-party apps may use the proposed gestures for completely different purposes. Hence also the proposed tooltips, which of course would then have to be populated with the correct information for the user by the app itself.

If Sailfish OS provides the respective interfaces in a standardized way (see above), this shouldn’t be a problem IMHO.

That’s exactly what I proposed (the hybrid “Close app” / “Open top menu” thing wasn’t my invention).

If you mean that “Swipe FROM the edge” gestures and “Swipe AT the edge” gestures might be too confusing for some users to understand: these are orthogonal directions at every screen edge. It’s the same as how you use your apps with swipes inside the screen today: swiping up/down is one action (scrolling content), swiping in the orthogonal direction (left/right) is another action (going forward/backward in the app).

I didn’t propose anything like this here.

I am aware of this, but I don’t think it is an unsolvable problem.
Besides, I pointed out that these additional gestures should be optional, not mandatory. Their intention is to make workflows more fluent (e.g. pulley menus accessible from anywhere) and features more comfortable (e.g. easier one-handed device usage). Hence the references to the already existing usage patterns in my examples, which should remain available in parallel.

It’s not about forcefully sticking a new gesture to an old app that does not support this gesture. It’s about providing a greater variety of different gestures for new and updated apps to act upon if they like.

Exactly! :grinning:

So where is the problem?
Again, old apps will not benefit from these additional gestures, but updated and new apps will.

I see it this way: a permanently visible toolbar takes away screen space that the user’s content could occupy instead. A pulley menu, if it is accessible without having to scroll first, does not take away screen space when not in use (except for the glowing line at the screen edge), and is therefore a much more elegant solution. If you now say “But a pulley menu is more difficult to use”: just flick your finger to open it entirely and take your time making your selection.

You used the correct keyword: “Participating” apps. :wink:
See also my comments above.

An example for the Gallery app: If you are taking several photos of the same scene in order to make sure that you get a really good one, and want to quickly sort out the not-good ones afterwards, you would probably want to do it as follows:

  1. Open the first photo and zoom into the interesting area, such as a particular face.
  2. Switch to the next photo while keeping the set zoom level and viewport, to compare it with the first one. How do you do this? That’s what the “Swipe at the bottom screen edge” gesture could provide here.

Indeed, because you can also quite quickly do it by using the respective cover buttons, I have put these actions at the top screen edge.

Housekeeping mode on the home screen could be used to quickly select your secondary app. An additional “Select as secondary app” button could be added to every app cover in addition to the “Close” button.

I know, but I’m thinking of more sophisticated apps here, such as image editors. The developer of such an app might want to assign the double-tap on the image to an entirely different function, but still give the user the possibility to also reset the zoom level without having to switch to a different operation mode first.

If the image is currently zoomed, you have to scroll to the upper or lower end first (or reset the zoom level), before this gesture works.

I agree with you on this one.

What about all the apps in the Jolla Store and in Openrepos? :wink:

This is true for experienced users. But new users, and especially users who feel unconfident with technology, might be happy to have tooltips, even if they have already seen them more than 100 times. That’s why I proposed that all users should be able to disable these tooltips if they don’t need them (anymore).

Your mileage differs from mine here. But you have a point.
That’s why, like with all other proposed features, this too should be only an option and not mandatory.

BH, you have a couple of valid points here. I, too, am missing some features and ideas that would make Sailfish stand out in all that Android and iOS swamp. Your idea sounds neat and interesting.

I remember BlackBerry OS having this up-and-right (or left, for left-handed people) gesture to open all notifications and messages. This gesture, too, could be a replacement for Alt+Tab.

All in all, I like your proposal!

I’m sorry, but the proposals are just not “done” enough to interest me. Not as a user, and not as an app developer. Don’t get me wrong, I too would like more use cases for gestures (nb. not necessarily more gestures). So this may seem like I’m whining, but I’m trying to see if we can’t shake out some gold nuggets here somewhere.

If your go-to defence of these gestures is that they would be optional, so that you don’t have to solve their conceptual problems, that just proves to me that they need more work (and probably simplification). I actually don’t think mistaken touches are easily solved.

But pulley menus should not need to be accessible from anywhere; that is bad design, so there is no need to cater for it. Let’s come up with an even better concept than toolbars and poorly placed pulley menus instead.

So we agree: fixing up apps to use more of the currently available constructs is maybe even more important.

Well, if the use cases don’t come basically for free, or they aren’t absolutely “killer”, not many apps will adopt them, and calling anything a convention stands or falls on adoption.

I agree, but pulley menus combined with long scroll are at least as terrible. So make it a long-press menu if you need to remain in-context. Maybe I’m an Apple user in disguise, but there should be one canonical way to do things, and it should work well. Combining pulley menus with (long) scroll should not be encouraged with this optional workaround (some app makers may assume everyone uses it).

…and this will be pretty terrible for discoverability, and worse, consistency.

This is a niche use case that I can’t see translating to basically any other app. Therefore it is better solved inside the app itself, say, by adding a long-press gesture-based menu (long-press and swipe to the side to go back or forward).

Aren’t they better off making their own controls? If only a few apps participate, and to varying extents, with varying functions, it makes for really bad consistency.

I was dismissing use cases, not apps. And you need use cases that are consistent enough across many apps to make them a convention. So, indeed, what about those apps, what would they need?

I think they will be even happier with super simple and consistent actions. I have someone close to me who has Alzheimer’s, and things just popping in and demanding attention are definitely not popular. I feel quite the same way when I get stressed.

And back to the first thing: if this is your defence, the concepts need more work (and simplification). Having 4 or 6 apps open is quite common; it’s a multitasking OS after all.

Thanks for staying with me.

I want to draw a comparison with desktop PCs and laptops:
Some people prefer to operate the GUI of their desktop PC apps mainly with the mouse; other people mainly use keyboard shortcuts for the same tasks. Even nowadays, many desktop PC apps with a GUI can be operated entirely with the keyboard alone. The possibility to also use a mouse is just optional. It caters to a different style of usage, but it does not necessarily intend to solve any conceptual problems that the keyboard-only style of usage might have.

On touchscreen-only devices, the same can be said for e.g. the on-screen keyboard. Some users prefer to enter their words by tapping on every single letter; others do it by swiping over every letter with a single finger stroke and let a word-recognition algorithm do the rest. The swipe method is just optional (you even need an alternative keyboard app as of now). It does not intend to solve a conceptual problem of the tap method (there is no such problem IMHO).

Of course, this doesn’t mean that any nowadays optional method of interacting with apps will stay optional forever. If there are enough clever implementations across many different programs, it might also become a standard in the future.

I think it’s mainly a matter of proper processing (calibration, filtering, etc.) of the data from the touch sensors. If the operating system does all this, apps don’t have to fiddle around with it themselves.
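
To illustrate the kind of filtering I mean, here is a trivial exponential-smoothing sketch. The smoothing factor is an assumed value, and a real implementation would of course be more elaborate and would sit in the OS input stack, not in an app:

```qml
import QtQuick 2.0

// Illustrative only: exponential smoothing of raw touch coordinates before
// any gesture decision is made. alpha = 0.3 is an assumed value.
QtObject {
    property real alpha: 0.3     // 0..1; lower = stronger smoothing
    property real filteredY: 0

    function smooth(rawY) {
        filteredY = alpha * rawY + (1 - alpha) * filteredY
        return filteredY
    }
}
```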

Maybe you misunderstood me here. I never proposed that every app should provide a pulley menu on every one of its pages. I just proposed that if a pulley menu is already available on a certain app page, it should also be accessible without having to scroll to the top/bottom first.

While pulley menus can be a great addition to, or even replacement for, a toolbar, they are of course not a universal solution. The same goes for context menus or any other such GUI element.

Such as?

Yes. But this shouldn’t keep us from thinking about new constructs and how they could benefit the current UX.

If Sailfish OS provides e.g. the proposed additional gestures via APIs that can be used in the same way as the already existing ones, app developers shouldn’t have much trouble integrating them.
At the end of the day, the users are IMHO the ones who decide what a “killer” feature is, not the developers (unless they have a very effective marketing department :wink:).

My proposed gesture for the pulley menu entirely removes any need to scroll first. Being forced to scroll first, as it is currently necessary, is the conceptual problem here.

Regarding discoverability: That’s why tooltips can be displayed to the user.
Regarding consistency: That’s why design guideline documents from Jolla (should) exist.

Anything like this would always have to be solved inside the app itself. The only thing the operating system should do in this case is distinguish between normal swipes and edge swipes and communicate this information to the app. Apps, however, shouldn’t make this distinction themselves, for consistency reasons.

Think of the Silica library. It offers a set of standardized GUI elements to Sailfish apps to provide system-wide visual consistency. Just like standardized GUI elements, standardized gestures could be provided to Sailfish apps too. As said above, the operating system can just tell the app if and which standard gesture just happened. It cannot (and should not) tell the app how to react to it. Things stay consistent, however, if the respective design guideline documents include recommendations for using these standard gestures, and app developers follow these recommendations.
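
To illustrate (and only illustrate) what such a standardized gesture interface could look like from an app’s point of view, here is a sketch modelled on existing QML attached properties such as Keys. “SystemGestures” and everything attached to it is invented for this example, as are the app-side handlers:

```qml
import QtQuick 2.0
import Sailfish.Silica 1.0

// Purely hypothetical: "SystemGestures" does not exist in Silica today.
// The OS reports which optional edge gesture occurred; the app decides
// how, or whether, to react. showNextImage()/resetZoom() are placeholders.
Page {
    id: galleryPage

    SystemGestures.onEdgeSwipe: {
        if (edge === SystemGestures.BottomEdge)
            showNextImage()
    }

    SystemGestures.onEdgeDoubleTap: {
        if (edge === SystemGestures.RightEdge)
            resetZoom()
    }
}
```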

IMHO, a versatile and feature-rich, but still consistent, toolbox of APIs and libraries to choose from, plus well-written documentation explaining their recommended usage.
Not that this doesn’t already exist in Sailfish OS and isn’t already good enough for a variety of apps, but there is always room for improvement.

Indeed, if things pop in without the user having asked for them, this can be very annoying. But in cases where users explicitly want to know whether something really is what they think it is, such guidance is a helpful feature.
Of course, the more intuitive a system is (read: the less additional guidance is needed), the better.

You seem really reluctant to improve on or rethink your proposals… Are you expecting them to be picked up as-is and refined by Jolla, or what are you actually asking? I (personally) think you are asking way too much. One or two really killer new gestures would probably have a much better chance.

As a user, I am so far uninterested. As a developer, I’m actually slightly terrified.

Reasonable, but these are paradigms within the same input method. As such, adding multiple ways to do things and catering to niche use cases has a much higher burden of evidence, imo.

Also, there is a very important distinction here. Those commands are for the most part built into the OS and various toolkits, and come for free for developers, whereas your suggestions mostly don’t. These actions on desktop OSes are also much more clearly defined than your “do whatever you want, let’s define it later” suggestions.

No, you are not hearing me. It is garbage design to combine pulleys with long scroll, especially when needing to remain in-context. Thus, there is no need to cater for it and encourage it.

Long-press menus, long-press-and-hold gestures.

Yes, this. The use case mentioned above needs to stay away from them.

No, but it is quite a big ask when there is low-hanging fruit like this.

So you are putting it on Jolla to define what these actions actually do? You are not doing a great job selling this.

If there is basically no implementation support and no hookup to existing components, I think selling these as OS-level features is just too much.
But what makes exactly these 8 things the ones that need adding first?
Why are they better than e.g. a longpress-and-swipe-with-hints UI element? You need to sell it better!

@attah: I suggest that, for the time being, let’s see what other forum members (including Jolla staff) have to say on this topic.

And by the way: Jolla already seems to be thinking about the pulley and scrolling topic. Specifically:

I have handed in my proposal, you have handed in yours, other forum members can hand in theirs. And Jolla will eventually make a selection.

I need the settings gesture very very hard.

@motomeizu2017: Sorry for my late reply. I don’t understand what you mean by “settings gesture”. Can you please explain?

I’m all for keeping things as simple and as uncluttered as possible - ‘less is more’ is so often the approach to achieving elegant design, and the simplest designs are often the best and easiest to use. On the long scroll and pulley menu issue - yes, this is a complete pain and needs looking at. I was reading this (long) thread in the Sailfish Forum Viewer app - but since it’s a viewer, you can’t post - the solution? Scroll all the way to the top to access the pulley menu and use the ‘open external browser’ option. There are other Sailfish apps I use all the time where this is equally a pain - e.g. email, messages, Quickddit, Piepmatz, etc. I know you have the ‘fast scroll’ arrows that appear when you start scrolling, but even these are sometimes a bit hit-and-miss to activate.

Simple fix in this case: add a pulley menu at the bottom that is only enabled if and when the view is longer than one page (contentHeight > page.height+Theme.horizontalMargin).
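
A minimal sketch of that fix with standard Silica elements; the exact length condition is a judgment call, and I have simplified the one quoted above:

```qml
import QtQuick 2.0
import Sailfish.Silica 1.0

Page {
    id: page

    SilicaFlickable {
        id: flickable
        anchors.fill: parent
        contentHeight: contentColumn.height

        // Bottom pulley menu, only offered when the content is longer than
        // one screen, so short pages stay uncluttered.
        PushUpMenu {
            visible: flickable.contentHeight > page.height
            MenuItem {
                text: qsTr("Scroll to top")
                onClicked: flickable.scrollToTop()
            }
        }

        Column {
            id: contentColumn
            width: page.width
            // ... page content goes here ...
        }
    }
}
```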

I lean more towards minimalism, but I will say that I would like a last-app gesture. It’s one of my most appreciated functions on Android… Also, Sony has a feature called Side Sense, which is basically what the OP talked about with the ideas of sliding up and down, or tapping/double-tapping the screen edges, and I found it useful sometimes, after getting used to it. It’s highly configurable, can easily be turned on or off, and has a function for showing a mini app launcher overlaid on your current screen, which is a very good idea…

This is actually already implemented, but disabled by default (with no GUI option to enable it):

I agree. This is the feature that should be worked on first IMHO.

These arrows are useful. They quickly take you to the top or bottom of the current page. Unfortunately, they won’t take you back to the scroll position (viewport) you came from.

That’s a possibility, but not the most elegant solution IMHO:

  1. It creates unnecessary redundancy.
  2. It doesn’t solve the problem mentioned above (being able to access the pulley menus from the middle of a very long page without having to change the viewport first).

Interesting. So if such an interaction concept seems to work for Android users, it will surely also work for Sailfish users.

  1. Yes and no: for the developer, yes; for the user, no, because you’ll never see both at once (if implemented correctly).
  2. True, a new gesture/way for that would be awesome.