Touchscreen issues?

Hi, guys!

I just discovered the game a few days ago and am working on getting my bridge together. My kids and I played a couple of limited-setup games last night, and it was great fun :)

However, I've had a rather annoying issue with the 22" touchscreen PCs that I'm using for stations: a button can't simply be touched to activate it. It requires a bit of a slide or a twist of the finger, which makes some things very difficult, such as selecting targets. Touch outside of the game works fine, and I have tried tweaking every Windows 7 touch setting that I can find. Any ideas?

I'm willing to offer up one of my songs for use in the game, if it will help to bribe the devs to fix this :)

Here's the song, on SoundCloud: https://soundcloud.com/byteshift/06-journeyarrival

Thanks!

Comments

  • I've been having the same issue with our all-in-one Windows 10 touchscreens. Sliding or clicking repeatedly on buttons seems to be the only way to toggle them. (By the way, nice song, Byteshift!)
  • Thanks!

    Another gotcha I encountered on every PC was a 'Failed to open audio device' message on the console, and subsequently no sound. I discovered that removing
    openal32.dll from the EE folder and then downloading and installing OpenAL 1.1 fixed it on all my PCs. I'm sure all you guys have figured this out already, though.
  • I'm aware of the OpenAL issue.
    I've also had reports on the touchscreen problems before. The issue is that the touchscreens I have do not show this problem, so I don't know the root cause.
  • Is there any info that I can collect or anything I can do that might help you? I have six of these 22" touchscreen All-in-one PCs that I'm using. They're two-finger multitouch, if that makes any difference.
  • All the ones I have are single touch, so that could be one of the differences.

    The only thing I can think of that could cause this is if the "click" happens before the mouse cursor moves. In the end, it's using the mouse interface, not a true touchscreen interface.

    Related code for clicking is actually here:
    https://github.com/daid/SeriousProton/blob/master/src/input.cpp#L79
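    The hypothesis above (the press arriving before the cursor move, via mouse emulation) can be illustrated in a standalone sketch. This is NOT the actual SeriousProton code, just a model of the suspected failure mode, with hypothetical names:

```cpp
#include <cassert>
#include <queue>

// A real mouse moves the cursor in earlier frames, so the polled position is
// up to date by the time the press event arrives. Windows touch emulation may
// deliver the press before the move, so a per-frame, state-driven UI can
// attribute the press to a stale position and miss the button under the finger.

struct Event { enum Type { Move, Press } type; int x; int y; };

struct Ui {
    int cursor_x = 0;
    int cursor_y = 0;

    // Process one frame's worth of events; return true if a press landed on
    // a button spanning x in [bx0, bx1).
    bool frame(std::queue<Event>& events, int bx0, int bx1) {
        bool hit = false;
        while (!events.empty()) {
            Event e = events.front();
            events.pop();
            if (e.type == Event::Move) { cursor_x = e.x; cursor_y = e.y; }
            else if (cursor_x >= bx0 && cursor_x < bx1) hit = true;
        }
        return hit;
    }
};
```

    With mouse-style ordering (move in an earlier frame, press later) the press hits the button; with press-before-move ordering the press is attributed to the old position and misses, which matches the "touch once, nothing happens" symptom.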
  • Hmm... maybe a crazy idea, but what if the touchscreen code were changed so that it detects mouse movement instead of button presses? As long as that code is only executed if touchscreen mode is active in options.ini, it shouldn't interfere with mouse setups.

    But I have to admit, I have no idea how mouse emulation on touchscreen devices is usually done. If a touch input simulates a moving path between last and current position, that obviously won't work. But if the mouse position just "jumps", it might.

    Anyway, the downside of this solution would be that you couldn't switch between mouse and touchscreen seamlessly. Even a mouse that is plugged in at the same time would probably cause serious problems.
    So maybe, instead of using the already existing touchscreen option, an extra option could be added that handles mouse movement as a click?

    Of course that would be rather a workaround instead of a real solution.
  • The screens I have just jump the cursor to the new location, and then fire a mouse button event.

    Potentially, the cause can be found if some logging is added here:
    https://github.com/daid/SeriousProton/blob/master/src/engine.cpp#L204
    if the "elapsedTime" is logged together with the mouse down/up and motion events.

    (I don't have a lot of time & energy right now to set this up. We have a fresh new crew member in our house, 2.5 weeks old, who uses up my spare time and sleep.)
  • Congratulations! I know how that is, even though my crew has been on board for a while :)
  • I had the same issue (also multi-touch), but I thought it was a "feature", as double-clicking works fine.
    I did look it up and it seems that this may be a common problem https://forums.adobe.com/thread/595473
    Is there a reason that the code uses the mouse button release event instead of a click event?
  • Is this fixed for you? Because it is still a problem on my Surface Pro.

    I think your issue is that on modern Windows touchscreens, when emulating mouse events, the mouse click happens before the mouse position is moved (the click position is embedded in the click event, which you do not reference). Double-clicking works because Windows makes a click, then the move, and then the second click. And the reason the click does not happen at the last mouse position is that you internally reset the mouse position (input.cpp lines 93-97). Dragging works because the old position is not reset while the click is still active.

  • But I'm not responding to the event other than setting a flag. And I handle all the events before handling the flags later on, as the code is state driven instead of event driven.

    Lines 93-97 shouldn't be the problem, as by default touch_screen is false; it's only true if you set it in the options. And it is used to work around the "focus" problem that the touchscreens we have are showing.
  • daid said:

    But I'm not responding to the event other than setting a flag. And I handle all the events before handling the flags later on, as the code is state driven instead of event driven.

    Lines 93-97 shouldn't be the problem, as by default touch_screen is false; it's only true if you set it in the options. And it is used to work around the "focus" problem that the touchscreens we have are showing.

    Hum, okay.

    Interestingly, a click event created via AutoHotkey gets ignored as well.

    I'll try playing with the code once I figure out how to successfully compile it. I kept running into errors before I found that you have a page with build instructions (no CMake instructions yet :( ), and I haven't tried again since.


  • Greetings all in the new year and (learning from this post) belated congrats to you Daid on your new crew member back in June!

    I'm working with 5 dedicated high school students on Empty Epsilon.
    Our skill level is at the scripting and modding level. We've even convinced the programming teacher here to give us 4 days of his class for "Intro to Lua Scripting via Empty Epsilon." (Imagine a whole lab of kids writing and playing their code with your software!) I'll get pictures! One engineering student is working on a solid-metal 'swing' STNG-style console design. Digging into and compiling source code is a lofty, distant goal we hope to someday reach.
    I'm so inspired by their work that I'd like to invest in hardware for them... hence this post in this thread.
    I've purchased one beta test machine:
    Dell - Inspiron 21.5" Touch-Screen All-In-One - AMD E2-Series - 4GB Memory - 1TB Hard Drive - Black Model: I3265-A067BLK-PUS

    EE works perfectly except for the Windows OS touchscreen double tap button issue.
    Does anyone know where this stands? I've searched and...

    - Was hopeful that the new 2017_12_25 build (based on a newer SFML version) might fix this... (alas, no). [However, the OpenAL audio issue was fixed... awesome!!]

    - Found a reference to (Touchscreen=ON) in an options.ini? But other comments pointed at this as obsolete, and I can find no such options.ini file or template file. (Dead end?)

    - Found a reference in the source code to the click position being grabbed separately from the mouse-down event as a possible culprit?

    - Also found a reference that Daid's touchscreens do not exhibit this behavior? Windows?
    Linux? Hardware brand?

    - I've even searched for an alternative Windows mouse driver/interpreter to turn a single click into a double click... no luck. Probably a good thing, as that might have introduced a whole new world of problems.

    ------- Final question on status (I promise) for the master programmer folks: At present, does the double-tap touchscreen issue look like something that can only be fixed in the source code itself, or is there a simpler fix that someone has found and I'm just missing? Is this high-priority enough to warrant a source code fix sooner rather than later? i.e., (No plans -> deal with it, buddy!) or (I've almost got it).

    This answer will help me determine hardware investment here for my students.

    Gratitude!
  • The touchscreen issue is also reported on github: https://github.com/daid/EmptyEpsilon/issues/464

    The main problem is that I don't have hardware to test this issue on, so I cannot debug a solution/workaround. I'll ask around at the office if anyone has a device like this, but we have very few Windows users there (I'm surrounded by Linux nerds).
    So the issue is at a stall for me, unless someone else figures out a fix.


    The "touchscreen=1" option in options.ini only hides the mouse cursor and mouse hovers; it doesn't change anything else. So it works around issues created by a touchscreen emulating a mouse. It also enables in-game touchscreen calibration (which one of my screens needs, and it was easier to add touchscreen calibration to the game than to figure out how to do it in the driver).

    The issue most likely stems from the fact that I grab the click from the events, while I grab the position from the "always available" API to get mouse positions.
    Related code:
    https://github.com/daid/SeriousProton/blob/dc232f90c755fe001310409d4a7556e0f1d2f3b8/src/engine.cpp#L205
    https://github.com/daid/SeriousProton/blob/dc232f90c755fe001310409d4a7556e0f1d2f3b8/src/input.cpp#L75
    Possibly the issue is fixable by only updating the mouse position based on events.
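    A rough sketch of that event-only approach (the event shape just mimics SFML's, which embeds coordinates in its mouse button events; this is an illustration, not the actual input.cpp):

```cpp
#include <cassert>

// Sketch: derive the cursor position purely from events instead of polling,
// so a click can never be attributed to a position older than the event.
// Type and field names are illustrative, modeled loosely on SFML's events.

struct MouseEvent {
    enum Type { Moved, ButtonPressed, ButtonReleased } type;
    int x;  // SFML embeds coordinates in button events too
    int y;
};

struct EventDrivenMouse {
    int x = 0;
    int y = 0;
    bool down = false;

    void handle(const MouseEvent& e) {
        // Every event carries a position, so update it unconditionally: even
        // if ButtonPressed arrives before any Moved event (as Windows touch
        // emulation appears to do), the press lands at the touch point.
        x = e.x;
        y = e.y;
        if (e.type == MouseEvent::ButtonPressed)  down = true;
        if (e.type == MouseEvent::ButtonReleased) down = false;
    }
};
```

    Because each event carries its own coordinates, a press without any preceding Moved event still lands at the touch point.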


    Our setup looks like this:
    http://daid.eu/~daid/20150723_184520.jpg (old photo; we now have bezels on the screens to hide the gap between the top panel and the screen). I can share the designs if you want. This design is made to be transportable, as you can take it apart without screws. And we took some careful measurements to make sure they are the proper height to stand behind (an error we made with our arcade cabinet, which now has 15cm legs to fix the height).
    We are using second-hand ELO touchscreens, exactly this one: https://www.elotouch.com/1537l.html
    They were 80 euros apiece second-hand, but are very costly new. They are, however, indestructible. We run network-boot Linux, which makes it really easy to run a lot of hardware, as you don't need to install the OS everywhere, and there are no extra hard drives that can fail. However, netbooting isn't something you should try without some Linux experience.
    But these screens also show no problem on Windows.


    I hope this answers all your questions (and raises new ones :) )
  • Oh, and it's cool to use EE as a Lua teaching method. Should I invest some time to improve the API documentation, which was a bit quickly mashed together?
  • What information do you need to debug the problem? @daid
  • You are always so responsive. It's uncanny and welcome!

    It sounds like you've got the touchscreen problem narrowed down in the source code for some surgical testing. We stand ready to invest many hours and some finances into this project, but we have to be aware of our limits, and debugging and building from source is tantamount to deep-sea diving at this point. Yes, we can, but it's going to take many months to get there... if ever. (The "Build from Source" wiki guide makes the dive possible, at least.)
    To that end, your gifts are far better suited to code engineering than to documentation.
    We would LOVE to help build the documentation/scripting guide based on what the students here need as they learn, using them as a beta-test edu-shed.
    Depending on how you want to get this done...
    My initial vision of a useful doc source for everyone would be a chapter in your wiki,
    https://github.com/daid/EmptyEpsilon/wiki
    called "Mission Scripting."

    SubChapter -> "How to Guide"
    You've already written the first pages of this from your guide on the main web page.
    http://daid.github.io/EmptyEpsilon/#tabs=4

    SubChapter -> "Script Reference"
    To start, we could use script_reference.html as a skeleton. Under each script reference, we can add functional descriptions, variable meanings/tolerances, sample code, etc.
    You can watch what's being set up and fill in the cracks and unknown classes/functions, etc.

    After you've given this some thought and highlighted your documentation vision, we can get started immediately once you give the green light.

    Finally, coming back to the thread at hand... If hardware is the stall point...
    To fully maximize your effect: I noticed that on the main page we can donate funds, and that you currently don't have anything indicated to buy.
    It seems very logical that you should have a hardware test bed for each platform the software runs on, for testing things just like this. I'd love to just buy the project (you) a Windows touchscreen PC, but as a school, with me personally giving, our donation would only cover a portion of a hardware device. But please consider it and let me know if this is something you want to do, and we will donate earlier than we had planned.

    You have an epic and elegant setup there!
    Our aim is super-quick setup and teardown, i.e. all-in-one PCs or Androids. Our vision here was "The Expanse" controls, but less articulated and with a slightly larger screen. Kind of a cross between that and the STNG swivel console...

  • The script_reference.html is generated from the source code, so there are no missing classes or functions in there :wink:
    Example:
    https://github.com/daid/EmptyEpsilon/blob/master/src/spaceObjects/artifact.cpp#L9
    The lines that start with three slashes are used to document whatever is below them.
    So expanded documentation is best put there; then the end result is auto-generated, with less manual update work to ensure it's complete and up-to-date.
    The template of this file is located in the code at https://github.com/daid/EmptyEpsilon/blob/master/compile_script_docs.py#L187
    Which was done quick&dirty.
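    To make the convention concrete, a documented registration looks roughly like this. The macro below is a stub so the snippet compiles on its own (the real registration macros live in the SeriousProton/EmptyEpsilon sources and do considerably more), and spawnExampleObject is a made-up name:

```cpp
#include <cassert>

// Stand-in so this sketch is self-contained; NOT the real registration macro.
#define REGISTER_SCRIPT_FUNCTION(name) bool name()

/// Lines starting with three slashes, placed directly above a registration,
/// become that entry's text in the generated script_reference.html.
/// Expanded parameter descriptions, value ranges, and sample Lua calls
/// would go in lines like these. (spawnExampleObject is a hypothetical name.)
REGISTER_SCRIPT_FUNCTION(spawnExampleObject) { return true; }
```

    Additions made this way stay in sync with the code, since compile_script_docs.py regenerates the reference from these comments.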
    I could also set up the build system to push an updated script reference to the website whenever a release is made.

    Also know that the website's source code is available for patches on GitHub: https://github.com/daid/EmptyEpsilon/tree/gh-pages
    So if you want to add/improve scripting tutorials, you could also contribute there.

    Compiling the sources is a bit of a pain, I know. It's not as bad as some other projects I've worked on, but it's not click & go either. I don't even know if the wiki documentation is up-to-date.

    The wiki itself is also a bit of a mess... maybe we should organize it a bit, with sections about scripting, playing, custom hardware, and compiling/source code, instead of everything in one pile. (And delete some of the old outdated things in there.)


    As for donations for hardware: I don't think that's the best way forward.
    I've built a version with extra logging on clicks to try to debug this issue:
    http://daid.eu/~daid/EmptyEpsilonTouchDebug.exe
    Place it in the latest release. Run this exe, do a few touch presses (on the main menu), exit, and send a copy of "EmptyEpsilon.log".
    It will log the mouse presses that I get and the related screen positions around the time of each press. Maybe this will give some clue as to what is happening with these touchscreens, and maybe point to a fix/workaround.
  • Thank you for spending time on this!
    If this is solved, we can move forward with hardware here with no reservations.

    I will read your last post more carefully, as we are starting to write curriculum, and the computer teacher here is toying with the idea of making Empty Epsilon the main software of his second-semester "Game Design" programming class. Last year it was Unity.

    ... but... I want to post the requested log file to you ASAP.

    For better debugging (if it matters).. The button press sequence I did was:
    { Main Menu}
    Single press (Options) Fail
    Double press (Options) Success
    Single press (Thrust Sequence) Fail
    Double Press (Thrust Sequence) Success- Music Plays
    Slide Music Volume Up () Success - Volume increases
    Double press (Galactic Temple) Success - Music Plays
    Double press (Back) Success
    Double press (Quit) Success

    The log file is located here: (Shared to the entire world)

    https://drive.google.com/file/d/14rRMgoPQ2umh8iJwjdUQ5FzJSGTWOEtj/view?usp=sharing
  • That provides some useful information :-)

    Improved the logging a bit and implemented a possible fix for the touch issue:
    http://daid.eu/~daid/EmptyEpsilonTouchDebug2.exe


    Note that Unity and EmptyEpsilon are two wholly different beasts. Unity is an evolution of many people over many years, while EE is a one-man project, which has some pitfalls and deep-rooted mistakes.
    On the other hand, if Unity is too complex (as it's quite a beast), you can move from mission scripting in EE to https://love2d.org/ which is simpler at its core than Unity, and also builds on top of Lua.
  • I think you might need to give your code... "the finger." :-)
    You're not laughing now but read on and hopefully you will later.

    Initial testing was negative (same behavior), but I DO see that the mouse position is correct now.
    So I ran the same sequence of button presses, this time with both the touchscreen AND the mouse, to compare.

    Touchscreen Log Here:
    https://drive.google.com/file/d/1wlkWnVx6mvLr-fSXqGt1jre7ZTWDGyuz/view?usp=sharing

    Mouse Log Here:
    https://drive.google.com/file/d/1OzTOSOg7UqE_F0Y-0WBvntocAJDODTss/view?usp=sharing

    Findings ----------------------------------
    Define:
    P/R = Mouse Press/Release
    P/R delay = Time between Mouse Pressed and Released

    (OPTIONS) Button
    @ ~ 236, 917
    Mouse-> 0.05s P/R delay-> SUCCESS
    Touch 1st press-> 0.00s P/R delay -> FAIL
    Touch 2nd press-> 0.085s P/R delay -> SUCCESS

    (THRUST SEQUENCE) Button
    @ ~ 1503, 632
    Mouse-> 0.05s P/R delay-> SUCCESS
    Touch 1st press-> 0.00s P/R delay -> FAIL
    Touch 2nd press-> 0.09s P/R delay -> SUCCESS

    (MUSIC VOLUME) Slider
    @ ~ 276, 414
    Mouse-> Pressed, MouseMoved events (no Release) -> SUCCESS
    Touch -> Pressed, MouseMoved events (no Release) -> SUCCESS

    (GALACTIC TEMPLE) Button
    @ ~ 1488, 163
    Mouse-> 0.07s P/R delay-> SUCCESS
    Touch 1st press-> 0.00s P/R delay -> FAIL
    Touch 2nd press-> 0.09s P/R delay -> SUCCESS

    (BACK) Button
    @ ~ 237, 993
    Mouse-> 0.07s P/R delay-> SUCCESS
    Touch 1st press-> 0.00s P/R delay -> FAIL
    Touch 2nd press-> 0.07s P/R delay -> SUCCESS

    It looks like the lack of any delay between Press and Release is causing the Windows touchscreen failures.
    A delay of ~0.05 seconds, literally the time it takes a finger to click a mouse down and up, is not built into the touchscreen emulation, and the code appears to depend on it.
    Is there a way to add a "finger" delay (~0.07 seconds) between receipt of the hardware MouseButtonReleased and the execution of the associated code?
    This minor delay is already inherent in mouse play anyway, and would be an imperceptible "slowing down" of your code, so no major "Apple/old battery" issues ahead. :-)
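    One way to build in that "finger" delay on the receiving side: if Pressed and Released arrive with zero time between them, keep reporting the button as down for at least one extra frame so per-frame, state-driven UI logic still sees a complete click. A rough standalone sketch of the idea (illustrative, not daid's actual patch; all names are made up):

```cpp
#include <cassert>

// Stretches a zero-length tap (press + release in the same frame) across an
// extra frame before exposing the release, so state-driven UI code that reads
// 'down' once per frame observes press -> held -> released, i.e. a full click.
// Protocol: feed events, let the UI read 'down', then call endFrame().
struct ClickStretcher {
    bool down = false;             // the state the per-frame UI code reads
    bool seen_by_ui = false;       // has at least one frame observed the press?
    bool deferred_release = false; // release arrived before any frame saw it

    void onPressed() { down = true; seen_by_ui = false; }

    void onReleased() {
        if (down && !seen_by_ui) deferred_release = true;  // instant tap: hold it
        else down = false;                                 // normal release
    }

    // Call once per frame, after the UI has read 'down' for that frame.
    void endFrame() {
        if (!down) return;
        if (seen_by_ui && deferred_release) {
            down = false;          // now the release becomes visible
            deferred_release = false;
            return;
        }
        seen_by_ui = true;
    }
};
```

    A zero-length tap then reads as "down" for a frame or two followed by a release, which is what a press-then-release state machine needs; a normal mouse click, whose release arrives frames later anyway, is unaffected.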

  • AHHH, so that's the problem. The touch screen isn't keeping the mouse down. Yes, I can fix this :-)
  • And exe nr3:
    http://daid.eu/~daid/EmptyEpsilonTouchDebug3.exe

    I think this will fix the bug. The logging is also no longer there, so if this works, I'll push a new full build based on this patch.
  • You are correct, sir! (With one small exception.) The Windows touchscreen is now completely functional in game with single clicks.
    It's really pretty cool to experience!!
    Actual in-game touchscreen play seems to be totally fixed!

    Exception: (Pre-game intro menus, Touchscreen only)
    *This behavior is NOT present in game.
    *All normal mouse behavior seems unchanged. (completely functional)

    When the pre-game menus change out, a touch press registers in that same position on the next menu being created.
    Examples:
    {Main Menu}, Press (OPTIONS) -> {Options Menu} Loads
    Press (BACK)
    {Main Menu} loads and (QUIT) is pressed immediately (as these buttons are in the same screen position). Program exits.

    OR

    {Main Menu}, Press (START SERVER) -> {Server Menu} loads with (BEAM/SHIELD FREQ) being pressed immediately.

    Given that 99% of the time you're in actual game play, I would consider this bug fixed even if the main menus weren't. In my opinion it's far better to simply use the mouse until the game is running. If, however, you know what's going on and want to bring this to 100%... working with you is really cool!
    Thanks for dedicating your time to this fix!
  • edited January 2018
    Ah, I can explain that bug. The menus are directly replaced when the click on the button is released. As the press and the release are now happening "at the same time", the new menu also sees this press & release, and thus also responds as if its button was clicked. (A button click is a press and a release on the same button.)
    I kinda feared a bug like that could happen :-) but I think I might know an easy fix.
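    That carry-over, and a consume-on-swap fix like the one hinted at, can be sketched in miniature (the menu names and structure here are hypothetical, not the actual EmptyEpsilon fix):

```cpp
#include <cassert>
#include <string>

// A "click" is a completed press+release evaluated per frame. When a click
// replaces the whole menu mid-frame, the new menu sees the same click and its
// button at the same screen position fires too. Clearing (consuming) the
// click state whenever the menu is swapped prevents the ghost activation.

struct InputState {
    bool clicked = false;               // press+release completed this frame
    void consume() { clicked = false; } // call right after swapping menus
};

// Simulates one frame: both menus have a button at the same spot; returns
// which menu is active at the end of the frame.
std::string runFrame(std::string menu, InputState& input, bool clear_on_swap) {
    if (menu == "main" && input.clicked) {
        menu = "options";               // the click replaces the menu...
        if (clear_on_swap) input.consume();
    }
    if (menu == "options" && input.clicked) {
        menu = "main";                  // ...and the new menu sees it again
    }
    return menu;
}
```

    Without consuming, the single tap activates a button on both the old and the new menu in the same frame; with consuming, the swap sticks.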
  • 100% solved!
    Now... we need more people with Surface Pros and other Windows touchscreens to try it out to verify this fix is universal.

    THANK YOU DAID!!!!!!
    I'd like to continue the dialog about EE documentation and how we can help out, if you want the help. Is there a thread for that or should we start a new one to continue that?

    It's just so refreshing to work on a great project like this!
    Kudos for having it at all!
    I'll send pics of the class next week doing a 5-day mini Lua course with Empty Epsilon. I started a conversation about Love2D (the link you sent), but at this point these students really want to do EE. (I think I can convince them to do both eventually.)

    Best wishes and cheers!
  • Best start a new thread about the documentation.

    Love2D would be something some students might want to move next to after EE.
    Note that if your students create a cool mission script, I have no problem including it in the official releases :-)