I’m porting my game engine to Sailfish OS (third time’s the charm!) and glad to see the SDL2 API is considered stable and the library gets updated pretty often. That means there isn’t much left to do: apart from manual screen rotation (unfortunately, years later the compositor still can’t do this on its own), which I have now implemented, the only essential feature still broken is the keyboard.
In SDL, calling SDL_StartTextInput enables IMEs, including the on-screen keyboard on Android and iOS, but it doesn’t seem to do anything on Sailfish. Could somebody give me some pointers here?
As far as I understand, Sailfish uses the Maliit keyboard, which has some D-Bus APIs, but I’m not sure what to do with them.
EDIT: @mal @pvuorela I had a look for SDL examples, since I had done some testing, but found nothing keyboard related. Is there any test code using the keyboard?
Some info on interacting with the keyboard would still be useful. Maybe I should look at which D-Bus methods are available and try to play with them. The Sailfish documentation also mentions both com.jolla.keyboard and org.maliit.server: D-Bus APIs | Sailfish OS Documentation. I’m not sure what the difference between them is.
I never got around to doing anything for SDL text input support, so indeed it shouldn’t work at the moment.
If anyone feels like they need that badly enough, patches for our SDL [1] are welcome. It needs something like what ibus has there [2], implementing the Maliit D-Bus interface for at least some basic parts [3].
Another long-pending item is switching our Maliit to use Wayland interfaces, which I think SDL even supports.
I have experimental builds of SDL3 in the OBS repo below.
Note that all the packages build and install, and they are partially patched for Sailfish OS like SDL2 was (mainly for the old wl_shell Wayland protocol).
However, I haven’t been able to get the default examples or test programs to display anything.
This is most likely because setting up the screens and surfaces must be done in a particular way, and I’m simply not a good enough programmer (in C, SDL, or otherwise) to get this done.
So if anyone is interested in playing with this, please do and I’d be happy to hear about any fixes or other changes which improve the state of things.
One exciting thing mentioned in the SDL3 Wayland README is the ability to combine Qt(6) and SDL3 surfaces in the same app.
This would mean we could have “native” (Qt) UI wrappers around SDL applications.
In fact, this could solve the original issue in theory: Make a small Qt6 app using the maliit-inputcontext plugin, and pass the keypresses over to SDL.
I have dabbled in adapting that example with Sailfishapp (so, Qt5/Silica + SDL2/3) and partially succeeded. Someone who actually knows what they’re doing may build upon that:
Wow, literally messing around with the same stuff here (gave up for now): keyboard input for some silly SDL terminal (not useful, just for fun). Maybe you’ll find a way to open the keyboard; I ended up adding a crude SDL-based keyboard of my own, but that is not the way to go.
I am not really a good programmer, and I don’t know if this is actually useful, but the following very short snippet does activate the SFOS VKB from SDL.
I decided to try glib instead of Qt for no particular reason:
#include <stdio.h>

#include "maliit-2/maliit-glib/maliitinputmethod.h"

// TouchUI is my engine's own base class; only the Maliit bits matter here.
class TouchUI_SailfishOS : public TouchUI {
public:
    TouchUI_SailfishOS();
    ~TouchUI_SailfishOS();
    void getKeyboardString(const char* string);
private:
    // glib
    MaliitInputMethod *m_im;
};

TouchUI_SailfishOS::TouchUI_SailfishOS() {
    m_im = maliit_input_method_new();
    fprintf(stderr, "Maliit interface loaded.\n");
}

TouchUI_SailfishOS::~TouchUI_SailfishOS() {
    g_object_unref(m_im);  // release the GObject created in the constructor
}

void TouchUI_SailfishOS::getKeyboardString(const char* string) {
    // trigger the virtual keyboard
    maliit_input_method_show(m_im);
    ... // now make something useful with it.
}
Sure, yeah, I was just copying from my prototype which happens to be cpp-style.
Cool - but what’s next?
I think one also has to set up a “MaliitContext” (??), register some event handlers, and run a proper event loop to call them.
And then somehow integrate that with the SDL event loop.
So I tried to implement Maliit as an IME in libsdl itself instead of on the application side.
It is somewhat working, but I’m facing some questions.
I’d appreciate it if someone could look over the implementation (which is very crude at the moment), and/or answer some of the following:
Most other implementations and examples seem to listen for signals on D-Bus on com.meego.inputmethod.inputcontext1. However, logging what comes over the connection, I only ever see method calls and replies, never signals. I am initializing the connection like this. Is that correct?
Maliit seems to have this winId property, to be set via updateWidgetInformation on the com.meego.inputmethod.uiserver1 interface. However, the example code I saw always uses an X11 window ID for this field, and we don’t have that with Wayland. Which ID should be used there? Or is it not needed at all?
In my one test SDL app (exult), the keyboard shows up immediately after starting, even though no SDL_StartTextInput has been issued as far as I can see. Also, after closing the keyboard it cannot be opened again, yet SDL_IsTextInputActive() still reports true.
Any hints on the interplay between SDL’s tracking of whether text input is active and the right sequence of calls to activateContext, showInputMethod and hideInputMethod? Am I supposed to call SDL_StopTextInput from the IME plugin when the user hides the keyboard?
SDL itself has a preedit concept, so existing text could be sent to Maliit as “preedit” text; however, the SDL IME interface offers no way to send or initialize this existing text. How is that supposed to work?
To run an SDL app with this enabled, set SDL_IM_MODULE=maliit in the environment.
Oh, and forgive any gross confusion and stupidity in both the questions and my code; I don’t actually speak C, and I haven’t really grokked Maliit either.
(Also just because it deserves to be said always: I hate DBus!!)