Feature/input emulation #40
base: vnc
Conversation
Oh thanks! I was actually thinking of adding keystrokes to be able to flip pages...
Definitely keep me in the loop! I am now an expert on the RM2 touch and digitizer key codes :D so maybe I can help from the technical side here as well. I plan to write up my findings in the form of technical documentation, but I have to find time for it and many details are still not understood. Please let me know if you are interested in integrating my work in the next weeks; then I would clean up my part and rebase it, so that I can make a proper PR. Maybe one disclaimer: I only own an RM2, hence my input codes might not work properly on the RM1, but it seems that they don't differ much (from reading around in the other repositories). Also let me know if you can confirm that everything is working for you.
Re testing: I will test it on my RM1, maybe this weekend.
Frankly I don't know a lot about evdev, even though I came across it when looking at other code. Perhaps there is a much easier way to deal with it than emulating the touch events. Emulating touch to trigger the page flip definitely makes sense, but I still like the enhanced functionality of also being able to do other stuff. If you are right about evdev, I think it would indeed make sense to add it as a dependency, as using an existing library is certainly beneficial for robustness.
Let me clarify: Python-evdev cannot be used as-is because it only manipulates local devices, afaict.
I am not 100% sure it can only be used on local device files and not on ssh-mounted files. I have been looking at the stuff that has been documented so far in other projects; however, the individual commands for the stylus are still quite complicated and I had to try around a lot to get the device working. Touch was quite trivial, actually. You can see how elaborate the codes are if you look at my code. The RM2 has no physical keys! Afaik the only way to do page flipping is to emulate the touch gestures.
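For anyone who wants to poke at those codes themselves, here is a minimal python-evdev sketch. It has to run on the tablet itself, since python-evdev only talks to local /dev/input nodes; the event node path below is an assumption and differs between devices.

```python
# Minimal sketch: inspect the touch/digitizer events with python-evdev.
# Must run locally on the tablet; check list_devices() to find the right node.
from evdev import InputDevice, categorize, list_devices

print(list_devices())                     # all /dev/input/event* nodes
dev = InputDevice('/dev/input/event1')    # hypothetical: the touchscreen
print(dev.name, dev.capabilities(verbose=True))

for event in dev.read_loop():             # dump every raw event as it arrives
    print(categorize(event))
```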
Have you had time to look at it in the meantime, and are you interested in integrating it into the project?
Just to clarify---the vnc server supports digitizer, touch, and key events in the computer->tablet direction, independently of the other direction. I use this occasionally myself; feel free to ping me if any changes in the support would be useful.
(One advantage of using that is that it takes care of setting up a uinput device for the keyboard input on the rm2 where there is no native keyboard.)
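As an aside, the uinput approach looks roughly like the sketch below with python-evdev. This is an illustration, not the vnc server's actual code, and whether xochitl maps KEY_LEFT/KEY_RIGHT to page turns is an assumption here.

```python
# Sketch: create a virtual keyboard via uinput and press one key.
# Runs on the tablet; consumers may need a moment to pick up the new device.
from evdev import UInput, ecodes as e

ui = UInput({e.EV_KEY: [e.KEY_LEFT, e.KEY_RIGHT]}, name='virtual-keyboard')
ui.write(e.EV_KEY, e.KEY_RIGHT, 1)   # key press
ui.syn()                             # flush the report
ui.write(e.EV_KEY, e.KEY_RIGHT, 0)   # key release
ui.syn()
ui.close()
```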
Ah @pl-semiotics that's amazing thanks, I did miss that!
@stheid it is indeed a nice-to-have feature, but I would implement it by sending the events to the VNC server as @pl-semiotics suggested. Feel free to make an attempt at it if you feel like it; the rfb.py module should support this (I have to check).
@bordaigorl Ah, wonderful, glad I could clear that up! Feel free to ping me if you run into any trouble with the interface. @stheid By the way, in case you were looking for the documentation around all of this, the kernel input system is documented nicely here. The evdev event interface (providing
The code that the VNC server uses to handle interacting with input devices is here.
Definitely way better to implement it properly! I am also interested in helping, depending on my time.
I implemented this in the "pointer-events" branch.
@pl-semiotics I am noticing that sometimes buttons become unresponsive on the tablet (until the next restart of xochitl) after using it with rmview (with the "pointer-events" branch).
@stheid @pl-semiotics could you test the "pointer-events" branch?
Sorry, didn't read. I will check out and test your code within this week.
Drawing and clicking work great :) Somehow the gestures don't seem to work, but maybe I can figure out what's going on. Very nice. I think this PR can be closed and the details are perhaps better discussed in the other PR? Tested on RM2.
The thing with the gestures seems to be a little more severe:
Workarounds

Source of the error: connecting to the VNC server via Remmina results in the same behaviour (a click on the screen via the VNC client breaks the touch gestures), hence I assume it is related to the VNC server below and not an issue with rmview itself.

Button events: despite the fact that the rM2 does not have physical keys, your shortcuts for paging (Ctrl+Left, ...) work perfectly, basically recreating the gestures. Nice; however, I want to point out again that the gestures are also broken when physically touching the device, unless you reset the events with the stylus first.

Pausing and resuming: also seems to work like a charm ❤️
Oh dear, I apologize for not seeing this earlier. This is an interesting failure mode that I have not seen in my own testing; probably there is some slight error in the multitouch-protocol-B state machine that makes xochitl think that there is an extra finger down or some such, causing issues. I haven't been able to easily reproduce this, so I think I need to look at input logs. I cross-compiled

@bordaigorl there shouldn't be anything terribly interesting w.r.t. termination (well, maybe don't kill it while it's in the middle of writing an event, but that shouldn't be too likely)---I also usually just kill it when done.

By buttons, do you mean the ones on the touchscreen or the hardware ones? The touchscreen I can imagine (although not reproduce; I will try to debug it if I can get event logs), but I really don't see a reason why the hardware keys ought to be affected.
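For reference, a well-formed single-finger tap in multitouch protocol B looks like the sketch below (python-evdev, against a hypothetical uinput device `ui` with the appropriate ABS_MT_* capabilities; this is the textbook sequence from the kernel documentation, not the server's actual code). A tracking ID that is never set back to -1 is exactly the kind of stale state that would leave xochitl believing a finger is still down.

```python
from evdev import ecodes as e

def tap(ui, x, y, slot=0, tracking_id=1):
    # Frame 1: finger down at (x, y) in the given slot.
    ui.write(e.EV_ABS, e.ABS_MT_SLOT, slot)                # select the contact slot
    ui.write(e.EV_ABS, e.ABS_MT_TRACKING_ID, tracking_id)  # new contact in this slot
    ui.write(e.EV_ABS, e.ABS_MT_POSITION_X, x)
    ui.write(e.EV_ABS, e.ABS_MT_POSITION_Y, y)
    ui.write(e.EV_SYN, e.SYN_REPORT, 0)                    # close the "down" frame
    # Frame 2: finger up (the part that must not be forgotten).
    ui.write(e.EV_ABS, e.ABS_MT_SLOT, slot)
    ui.write(e.EV_ABS, e.ABS_MT_TRACKING_ID, -1)           # contact released
    ui.write(e.EV_SYN, e.SYN_REPORT, 0)                    # close the "up" frame
```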
When I was talking about "button emulation" I was talking about the following: the RM1 has physical buttons to go to the next/previous page and to close the current file (3 in total). That said, it is very nice and not a bug of any sort; I was merely surprised that this would work on the RM2, as I would have expected it to ignore the events.
To be clear:
or should I record anything when triggering events through rmview?
Yes, I know. I was asking about the phrasing that @bordaigorl identified in #40 (comment). I am glad the keyboard emulation is useful. Luckily, enough code is shared between xochitl on RM1 and RM2 that it still uses any keyboard devices that appear, so a uinput virtual keyboard can work.
Sorry, I should have been more clear. Ideally:
So that I can see the event sequence that rM-vnc-server is sending that causes the issue, in order to debug it. (If it requires multiple attempts to reproduce the issue, please send the file containing just the events from the attempt that did cause the issue.)
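If it helps, the raw bytes copied out of /dev/input/eventN can be turned into something readable with a few lines of Python. This is a sketch, assuming the 32-bit ARM layout of struct input_event (16 bytes: two 32-bit timeval fields, u16 type, u16 code, s32 value):

```python
# Decode a captured evdev log file into one line per event.
import struct, sys

EVENT_FMT = '<IIHHi'                       # sec, usec, type, code, value
EVENT_SIZE = struct.calcsize(EVENT_FMT)    # 16 bytes on 32-bit ARM

with open(sys.argv[1], 'rb') as f:
    while chunk := f.read(EVENT_SIZE):
        if len(chunk) < EVENT_SIZE:
            break                          # ignore a trailing partial record
        sec, usec, etype, code, value = struct.unpack(EVENT_FMT, chunk)
        print(f'{sec}.{usec:06d} type={etype} code={code} value={value}')
```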
Haha, fortunately the bug is very consistent ^^. I will do it in the next couple of days. Thanks for your cool work :)
I think the requested logs don't reflect anything about the bug :( As said before, I can trigger the bug by only triggering the emulated stylus. As seen in the log, this actually causes no events at all on the touch device. Those are the two logs for playing out the gesture on the tablet.
As I said already, the bug can be triggered by emulating the stylus and fixed by using the physical stylus. Hence the bug can be triggered and resolved without triggering anything on the touch device.

Emulation vs the real thing

Touch: for further narrowing down, I looked at what "emulating touch" triggers and how it differs from actually touching the device.

Stylus: finally, I checked how the emulated stylus and the real stylus events look.

My best guess: I think the bug is perhaps triggered by xochitl waiting for something to finish. As said before, triggering a stylus event clears all problems, whereas triggering a touch event does not. When looking at the bug-inducing logs, I have the impression there are two distinct bugs, one for each device, that both make xochitl unhappy.
@stheid Thank you for the clarifications! Sorry, I didn't realize that you were triggering this via the stylus---since you were discussing gestures, I assumed that "click once on the screen" meant emulated touch input. Thank you for adding the stylus logs, then---that's certainly necessary.

I think I see the issue. On the RM1 I tested touch mostly by using the menu items, since I do use the buttons for navigation, and the menu items continue working fine even when xochitl believes the stylus to be in proximity. The gestures, however, must be disabled for palm-rejection reasons.

So, the problem is that the vnc server sends stylus hover events when you move the mouse cursor around, in case some future application on the tablet wants to take advantage of the stylus position even when it is not touching the display. But there is no way for it to know when the (virtual) stylus is in/out of proximity, so it ends up thinking the stylus is in proximity forever. When you use the real stylus and then move it away from the display, the out-of-prox event is sent, which makes everything happy. I can think of a few ways to solve it:
Any thoughts as to which you would prefer, @stheid @bordaigorl? I can do any of them.
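For illustration, the report a real pen emits when it is lifted away from the display is exactly what the emulation never sent. A python-evdev sketch of that frame (assuming a uinput pen device `ui`; this is a reconstruction from the standard Linux pen/tablet protocol, not the server's code):

```python
from evdev import ecodes as e

def pen_out_of_proximity(ui):
    """Emit the frame a real pen produces when moved away from the display."""
    ui.write(e.EV_KEY, e.BTN_TOUCH, 0)      # tip no longer touching
    ui.write(e.EV_KEY, e.BTN_TOOL_PEN, 0)   # pen leaves proximity
    ui.write(e.EV_SYN, e.SYN_REPORT, 0)     # close the report
```

Until such a frame is seen, xochitl keeps treating the pen as hovering and disables the touch gestures for palm rejection, which matches the behaviour described above.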
Wow thank you so much @stheid for the accurate testing! First of all: rmview does not send pointer move events unless a button is pressed; this could simplify a solution because
For move events with button-up state a timeout makes sense (although I don't think they would be useful from rmview, so it's a feature that we wouldn't be using). An explicit proximity event by extending the protocol could give the highest flexibility, I guess (the client is in full control). Certainly one should keep track of whether out-of-proximity has been fired, so that it can be fired if needed when the server is killed.
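For context, the stock RFB PointerEvent that a client such as rmview sends is just a button mask plus a position (RFC 6143, section 7.5.5); any explicit proximity signalling would have to be negotiated on top of it. A small sketch of the standard message:

```python
import struct

def pointer_event(x: int, y: int, button_mask: int) -> bytes:
    """Encode a standard RFB PointerEvent (message type 5).

    button_mask bit 0 is button 1 (in this discussion, the button that
    produces pen contact) and bit 1 is button 2; a mask of 0 means
    "no buttons pressed".
    """
    return struct.pack('>BBHH', 5, button_mask, x, y)
```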
Btw I can confirm @pl-semiotics's diagnosis: the issues I experienced with the RM1 can be reliably triggered by sending stylus or touch events, and then resolved by getting the physical stylus close and then far again.
@pl-semiotics nice analysis of the bug, happy to help 👍 Is there any disadvantage to just sending an "out-of-proximity" after each mouse move on the vnc client? I don't understand why waiting for a timeout is useful. I also see the fix for rmview as quite easy, but I think it would be cool to solve it once and for all on the vnc side as well. I find it really cool to be able to connect to the reMarkable with any standard vnc client, although I will probably always use rmview as it automatically starts the server on the reMarkable. That's a huge selling point. ;D
Yes, I as well did not think about the lack of out-of-prox. I apologize for the obvious bug!
Oh, good, I'm glad that we're hopefully only dealing with the one issue here! Thanks for confirming :)
Yes, if you are not sending move events with no buttons then it's certainly reasonable to combine the prox events with the button events. I don't know whether or not I'll write something that uses the hover events in the future, but it seems plausible, so I'd like to hang on to the protocol-level ability to send them, though (vs. moving the "only send events while buttons are pressed" into the vnc server).
Yes, I suppose the only real issue there is what to do with clients that do not support it. As to the specific mechanism to use to convey the information, I shall try to hack something together in the next day or two.
Most certainly!
Basically, right now it is not significant, but I am generally of the opinion that it's nice to preserve flexibility in case other software comes along later and decides to do something interesting with the stylus-hovering events. I don't know if I will or not, but I might use them in some of my own software in the future as well. Sending in/out events on every mouse motion (that doesn't have buttons pressed) would probably mess with such things, although it is also a reasonable option.
I also use a number of VNC clients, so I'm also concerned about what to do with clients that don't support any particular extension :). However I end up passing the information around, I'll make sure that there's a handshake and that we do something reasonable (like fall back to the in/out-on-every-movement workaround) when clients don't indicate that they want to be smart.
I wish this feature were available. I tried to check out the branch, but it didn't let me.
@heikeOnScale that is weird, you should be able to access the branch just fine (it is public).
@pl-semiotics any news on implementing the proximity-events fix? Let me know if I can help.
To be honest, if I were you @bordaigorl I would merge the current version anyway, as the bug is very minor (it only affects the gestures) and the feature is large. But obviously it would be nice if it were fixed eventually :D
@bordaigorl Whatever I try, I always get the same error message: ssh: Could not resolve hostname stheid: Name or service not known
@heikeOnScale your question seems unrelated to this discussion. If so, can you open a new issue and post more details (like your configuration, with the password omitted)? If instead it is on topic, can you clarify how?
It is related to my previous comment. I am not able to check out this branch.
@heikeOnScale ah sorry, I forgot about that. Looks like your config isn't right or you are issuing the wrong command.
@bordaigorl I was able to check out the code; I hadn't realised that the branch name was "pointer-events". And it is such cool functionality. I love it!
I think the new test binaries I uploaded for rm2fb/2.6+ support should include the fix for this---if you do want to send prox-but-not-contact events, button 2 can be used.
@pl-semiotics thanks so much for this. I tested the new standalone version on RM1. Everything works flawlessly,
@bordaigorl Strange, I tested this on rM1 as well and I thought that the bug was fixed. The issue before was that there was no way to send out-of-prox events, since every call to

Oh, actually, perhaps rmview is only sending events with button 1 down? If so, that would explain why it worked when I tested it but not when you did. If so, could you send an event with no buttons down at the end of the interaction? The component on the tablet needs some way to know when the user stops the click-and-drag interaction. (Sending an out-of-contact+out-of-prox after every received event messes up drawing---we need to only send out-of-contact when the interaction is actually ended).

If that hypothesis is right, you should be able to see that sending a touch event from rmview also restores the touch on the tablet, since receiving any input event without button 1 or 2 down will send the out-of-prox.
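On the rmview side, the suggested fix would look something like the sketch below, using a hypothetical send_pointer_event(x, y, mask) helper (rmview's actual method name may differ): keep button 1 down while dragging, then send one final event with an empty mask so the tablet side knows the interaction has ended and can emit the out-of-proximity report.

```python
def drag(send_pointer_event, path):
    for x, y in path:
        send_pointer_event(x, y, 0b001)   # button 1 down: pen in contact
    x, y = path[-1]
    send_pointer_event(x, y, 0b000)       # no buttons: interaction finished
```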
@pl-semiotics sorry my fault completely, I messed up a step and I was still using the old executable!
@bordaigorl Glad to hear it is actually working! The only thing I'm waiting on to make a release is confirmation from people who use rm2 that it works (both with xochitl natively and with rm2fb). I don't have any way to test it on rm2 myself, so it's mostly just "well, there was at least one report of it working before the 2.6 upgrade" at the moment. You may also want to expose some setting for the QSG_RM2FB_PID environment variable, which with the current autodetection code must be set if rm2fb is running not-via-systemd. (If it is running via systemd/toltec, everything ought to work---unless the unit name or the systemd api has been changed.)
would that mean finding the pid of rm2fb and invoking the server with QSG_RM2FB_PID set to it?
If the rm2fb process is running via systemd as in the packaged versions (which is the easiest way of "finding" it), the backend-qsg will find it for you. If it's running some other way, then yes, that environment variable needs to be set. I don't think it's terribly easy to figure out whether a random fb-owning process is rm2fb or not, so I figured that it'd be easiest to just give a user who wants to run it in their own fashion the ability to specify the pid via whatever means they want. (I was imagining for rmview perhaps something like a config file setting for a script that returns the pid or some such.)
Got it. I think the easiest way would be to allow customizing the path of the server (and disabling automatic upgrade) so that advanced users can use their custom scripts to launch it.
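Just to make that option concrete, a rough sketch of how a launcher could prepend the variable when rm2fb is not managed by systemd (the server binary name and the way the pid is obtained are assumptions for illustration; the real paths depend on the user's setup):

```python
# Hypothetical helper: build the remote command rmview would run over ssh.
def build_server_command(rm2fb_pid=None, server='./rM-vnc-server-standalone'):
    if rm2fb_pid is not None:
        # rm2fb started by hand: tell the backend where it lives.
        return f'QSG_RM2FB_PID={rm2fb_pid} {server}'
    # rm2fb under systemd/toltec: the autodetection should find it.
    return server
```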
I have created a proof-of-concept implementation to add mouse pointer emulation to your tool. This is related to an issue I initially opened at reStream (rien/reStream#37), but it is much better suited to your project.
The code is terrible and I would definitely not recommend merging it :D
I wanted to show the feature and maybe discuss how to cleanly reimplement it. Reverse engineering the touch and stylus was challenging enough for the first step :D
Please let me know what you think.
(Also sorry for all the diffs, but PyCharm seems to disagree with your file formatting ^^)