TLDR: This post proposes a range of additional system-wide gestures that can be individually enabled and configured by each user to make the workflows within Sailfish OS and its apps more accessible and fluid.
- This proposal was already posted two years ago on TJC. However, since I doubt that the content on TJC will ever be migrated to this forum, I am posting it here again with some amendments.
- This proposal does not replace any already existing ways of using Sailfish OS and its apps. It rather introduces some additional ways which - if the user individually enables them - increase efficiency and accessibility and might also help to “declutter” the screen of some of the permanently visible UI elements, which have grown more numerous with every new release of Sailfish OS.
- In other words: This proposal tries to give those users, who want to have it, more of the “traditional” Sailfish OS feeling (as still described on the official website), without cutting down on features or possibilities.
- The best place for configuring these additional gestures is the gestures page in the Settings app. To acquaint users with them, every applicable option on this page could have an info button next to it that invokes a respective mini tutorial (similar to what the Tutorial app does for explaining the core basics of using Sailfish OS).
- The descriptions of the already existing features (native features, not patched features) relate to what exists in Sailfish OS as of version 3.4.0 (Pallas-Yllästunturi). I haven’t tried version 4.0.1 (Koli) yet.
What’s already existing: Swipes FROM the screen edge
In Sailfish OS, we currently have “Swipe from the screen edge towards the screen center” gestures:
- A swipe from the right screen edge takes you to “Home”, with the covers of all your currently running apps.
- A swipe from the left screen edge takes you to “Events” (if the “Quick Events access” option is enabled), where you can see your current notifications.
- A swipe from the bottom screen edge takes you to the app grid, from where you can launch your installed apps.
- A swipe from the top screen edge closes the currently used app (when done from either corner) or takes you to the top menu (when done from the middle), where you can switch ambiences, trigger quick actions and toggle settings.
What can be additionally enabled: Swipes AT the screen edge
In addition to those “Swipe FROM the screen edge” gestures, this proposal introduces “Swipe AT the screen edge” gestures. They are performed by simply placing one finger at the desired screen edge (still inside the screen area, of course) and moving it along this edge, instead of moving it towards the screen center. Here is what they would do:
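To illustrate how the two gesture families could be told apart, here is a minimal sketch in Python: a swipe starting in an edge zone is classified as “FROM the edge” when it mostly moves towards the screen center, and as “AT the edge” when it mostly moves along the edge. All names, the 40 px edge-zone width, and the classification rule are my own assumptions for illustration, not actual Sailfish OS code.

```python
# Sketch: distinguishing "swipe FROM the edge" (towards the screen center)
# from the proposed "swipe AT the edge" (along the edge).
# Thresholds and names are illustrative assumptions only.

EDGE_ZONE_PX = 40  # how close to an edge a touch must start (assumed)

def nearest_edge(x, y, width, height):
    """Return the edge zone containing the start point, or None."""
    if x < EDGE_ZONE_PX:
        return "left"
    if x > width - EDGE_ZONE_PX:
        return "right"
    if y < EDGE_ZONE_PX:
        return "top"
    if y > height - EDGE_ZONE_PX:
        return "bottom"
    return None

def classify_swipe(start, end, width, height):
    """Classify a swipe as ('from', edge), ('at', edge), or None."""
    edge = nearest_edge(start[0], start[1], width, height)
    if edge is None:
        return None  # started away from every edge: not an edge gesture
    dx, dy = end[0] - start[0], end[1] - start[1]
    # Compare movement along the edge axis with movement away from it.
    if edge in ("left", "right"):
        along, inward = abs(dy), abs(dx)
    else:
        along, inward = abs(dx), abs(dy)
    if inward > along:
        return ("from", edge)   # existing Sailfish OS gesture
    return ("at", edge)         # proposed new gesture
```

Since both gesture families start in the same edge zone, only the movement direction decides which one fires, so the existing gestures keep working unchanged.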
Right screen edge:
Zoom in and out (e.g. in the Gallery app):
No need to use your second hand or place your device on a surface first. Of course, you can still use the well-known 2-finger pinch gesture if you like.
Aimed scrolling (e.g. in the People app or in the Gallery app):
Actually, the People app already implements this concept with a permanently visible letterbar. While you move your finger over this letterbar, an additional hint banner appears, telling you where you currently are in your contacts list. In a similar fashion, users could quickly scroll to a desired capture date in the Gallery app. The difference, however, would be that no datebar is permanently visible in this case; only the hint banner appears while you are scrolling, keeping the screen uncluttered. Of course, you can still scroll to that specific photo right in the middle of a list of thousands of photos by flicking your finger over the screen more than a dozen times.
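The core of aimed scrolling is a simple mapping from the finger’s relative position along the edge to an index in the list being browsed, which then drives both the scroll position and the hint banner. A minimal sketch, with all names and values assumed for illustration:

```python
# Sketch of "aimed scrolling": the finger's relative position along the
# screen edge is mapped to a list index, and a hint (a letter in People,
# a capture date in Gallery) is shown for that index.
# Function names are assumptions, not an actual Sailfish OS API.

def aimed_scroll_index(finger_y, edge_top, edge_bottom, item_count):
    """Map a finger position on the edge to an index in [0, item_count)."""
    span = edge_bottom - edge_top
    fraction = min(max((finger_y - edge_top) / span, 0.0), 1.0)
    return min(int(fraction * item_count), item_count - 1)

# Example: a finger halfway down a 1920 px edge over 4 contacts
contacts = ["Adams", "Baker", "Miller", "Young"]
idx = aimed_scroll_index(finger_y=960, edge_top=0, edge_bottom=1920,
                         item_count=len(contacts))
hint_banner_text = contacts[idx]  # the banner would show "Miller"
```

Clamping the fraction keeps the mapping stable even if the finger drifts slightly past the edge ends.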
Left screen edge: Access top/bottom pulley menu
No need to scroll up/down the entire page in order to access a pulley menu (if one is available). With this gesture, it’s immediately accessible no matter where you currently are on the page. Of course, you can still scroll to the top/bottom of the page first, which is no problem, but how do you conveniently get back to where you initially were?
For Android apps, since they do not have pulley menus AFAIK, this gesture could instead move the entire app down and back up on the screen, so that UI elements at the top can be reached more easily, especially on large devices. A similar idea, which proposes a different gesture however, can be found here.
Bottom screen edge: Go to next/previous element
No need to bring up any navigation bar first. Immediately switch to the next/previous webpage, document page, picture, video, audio track, e-mail, and so on. Of course, you can still use the already existing navigation methods, if you like.
Top screen edge: Control a user-selectable “secondary” app
No need to distract your workflow by having to temporarily switch apps or use cover buttons for small actions, such as skipping to the next audio track in the media player. Of course, you can still do it by manually switching apps or using cover buttons.
Furthermore, instead of selecting a secondary app, the currently used app could also be automatically selected to receive this gesture.
For maximum freedom of choice, users should not only be able to select which of these gestures they want to use, but also be able to assign their favorite screen edge to them.
More optional gestures: Double-taps AT the screen edge
Instead of swiping at the screen edges, users can also double-tap on these regions to trigger additional actions related to those described above:
Right screen edge:
- Reset zoom level (e.g. in the Gallery app)
- Open search function (e.g. in the People app or in the Gallery app)
Left screen edge: Switch between the two last used apps (similar to what Alt+Tab does on a desktop computer, and a different approach to this gesture)
Bottom screen edge:
- Browser: Go to the tab selection screen
- Gallery: Close the current picture/video
- Email: Close the current e-mail
- Media: Pause/resume playback
- Documents: Close the current document
- and so on…
Top screen edge: Action depends on which app has been selected as “secondary” app
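Recognising these edge double-taps only requires remembering the edge and timestamp of the previous tap. A minimal sketch, where the 300 ms window and all names are my own assumptions rather than documented Sailfish OS values:

```python
# Sketch of edge double-tap detection: two taps at the same screen edge
# within a short interval trigger the configured action.
# The 300 ms window is an assumed, illustrative value.

DOUBLE_TAP_WINDOW_S = 0.3

class EdgeDoubleTap:
    def __init__(self):
        self.last_edge = None
        self.last_time = None

    def tap(self, edge, timestamp):
        """Feed one tap; return the edge if this completes a double-tap."""
        if (self.last_edge == edge and self.last_time is not None
                and timestamp - self.last_time <= DOUBLE_TAP_WINDOW_S):
            # Recognised: reset state so a third tap starts a new sequence.
            self.last_edge = self.last_time = None
            return edge
        self.last_edge, self.last_time = edge, timestamp
        return None
```

A per-edge, user-configurable window would also let users who find double-taps fiddly widen the interval instead of disabling the feature.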
Again, users should be able to configure and also disable these gestures, since it surely takes some practice to effortlessly use them, or they might be a hindrance in certain use cases.
Assisting the user by providing tooltips
To assist the user in discovering and remembering which gesture does what, tooltips can be enabled. If the user places one finger at any of the screen edges without swiping or tapping, icons or similar visual hints will appear that indicate what will happen if the user swipes in a certain direction from the current position or double-taps at this position. If the user lifts the finger without having moved it, the tooltips automatically disappear and no action is triggered.
These tooltips could also include more specific information. In case of the Media Player for example, this additional information could be the title, elapsed time, remaining time, or even the cover art of the currently playing audio track.
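The tooltip behaviour described above boils down to a small state machine: press shows the hints, noticeable movement hands control over to the gesture recogniser, and lifting a resting finger dismisses the hints without triggering anything. A sketch, with the movement threshold and all names assumed for illustration:

```python
# Sketch of the tooltip state machine: resting a finger at a screen edge
# shows hints; lifting it without movement dismisses them and triggers
# no action. The 15 px threshold is an assumed, illustrative value.

MOVE_THRESHOLD_PX = 15   # movement below this still counts as "resting"

class EdgeTooltip:
    def __init__(self):
        self.visible = False
        self.start = None

    def finger_down(self, x, y):
        self.start = (x, y)
        self.visible = True          # show icons for the possible gestures

    def finger_move(self, x, y):
        dx = abs(x - self.start[0])
        dy = abs(y - self.start[1])
        if dx > MOVE_THRESHOLD_PX or dy > MOVE_THRESHOLD_PX:
            self.visible = False     # a real swipe has started
            return "gesture"
        return "resting"

    def finger_up(self):
        was_resting = self.visible
        self.visible = False
        # Lifting without movement: just hide the tooltips, do nothing.
        return "no_action" if was_resting else "action"
```

The same `visible` flag could also gate the app-specific extra information (track title, elapsed time, cover art) mentioned above.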
Home screen: What’s already existing
Currently, we have the following possibilities on “Home”:
- Tapping on an app cover switches to that app.
- Tapping on a cover button of an app triggers the respective cover button action.
- Tapping on a free spot on the screen briefly brings up the topmost row of the app grid to conveniently launch your most often used apps.
- Tapping on the top screen edge briefly brings up the 2 topmost rows of the top menu.
- Long-pressing anywhere on the screen switches to “Close/Rearrange apps” (housekeeping) mode.
- Swiping left or right from anywhere inside the screen takes you to “Events”.
- Swiping up from anywhere inside the screen takes you to the app grid.
- Swiping down from anywhere inside the screen takes you to the top menu.
Home screen: How it can be enhanced
If enabled by the user, it now makes a difference whether you swipe up/down from a free spot inside the screen or from an app cover:
- Swiping up from a free spot inside the screen still takes you to the app grid.
- Swiping up from an app cover reveals additional cover buttons (if the app provides any). Depending on the current size of the app cover, up to 6 such additional buttons should be well realizable IMHO. This would greatly improve the usefulness of the cover button concept, because apps would no longer be limited to just 2 cover buttons. The presence of such additional buttons could be indicated to the user with a glowing line at the bottom of the app cover, similar to the glowing line that indicates the presence of a pulley menu.
- Swiping down from a free spot inside the screen still takes you to the top menu.
- Swiping down from an app cover will close that app. Of course, you can still enter housekeeping mode in the known way to batch-close your apps.
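The proposed Home screen behaviour is essentially a hit test on the swipe’s start point: the same vertical swipe is routed differently depending on whether it begins on an app cover or on a free spot. A minimal sketch, where the cover geometry, names, and return values are assumptions for illustration:

```python
# Sketch of the proposed Home screen routing: a vertical swipe goes to a
# different target depending on whether it starts on an app cover.
# Cover rectangles and action names are illustrative assumptions.

def cover_hit_test(x, y, covers):
    """Return the cover rect (x, y, w, h) under the point, or None."""
    for cover in covers:
        cx, cy, cw, ch = cover
        if cx <= x < cx + cw and cy <= y < cy + ch:
            return cover
    return None

def route_vertical_swipe(start, direction, covers):
    """direction is 'up' or 'down'; returns the resulting action."""
    on_cover = cover_hit_test(start[0], start[1], covers) is not None
    if direction == "up":
        # On a cover: reveal extra cover buttons; otherwise: app grid.
        return "extra_cover_buttons" if on_cover else "app_grid"
    # On a cover: close that app; otherwise: top menu.
    return "close_app" if on_cover else "top_menu"

covers = [(40, 200, 300, 400)]   # one app cover on Home, assumed geometry
route_vertical_swipe((100, 300), "up", covers)   # starts on the cover
route_vertical_swipe((600, 300), "up", covers)   # starts on a free spot
```

Because swipes from a free spot keep their current meaning, the existing muscle memory stays intact when the option is enabled.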
Tapping on a free spot anywhere inside the screen could also be enhanced natively. Instead of just briefly displaying a part of the app grid, a part of the top menu could be displayed at the same time. The number of app grid and top menu rows should be made individually configurable, as well as the time delay after which the menus automatically disappear again. Also see this question on TJC.