start of adding touch input #1898
base: main
Conversation
Where?
Thanks
/// <summary>
/// The last known normalized position of the finger.
/// </summary>
public Vector2 NormalizedPosition { get; }

/// <summary>
/// The last known speed of the finger.
/// </summary>
public Vector2 Speed { get; }

/// <summary>
/// The last known normalized speed of the finger.
/// </summary>
public Vector2 NormalizedSpeed { get; }
Could there be some documentation in here about what "normalized" means in this specific context? In graphics programming this could very well mean either -1 to 1 or 0 to 1.
I added (0..1) and (-1..1). Is this enough? In general, positions are normalized to the range 0 to 1, and speed values to the range -1 to 1, since they have a direction. I tried to add this information everywhere before, but I guess I forgot it here.
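To make the convention above concrete, here is a minimal sketch of how such values could be derived, assuming pixel-space input and a known window size. The helper names and the choice of a max-speed divisor are hypothetical illustrations, not part of the PR's code.

```csharp
using System;
using System.Numerics;

static class TouchNormalization
{
    // Hypothetical helper: maps a pixel-space position to (0..1)
    // relative to the window size, per the convention described above.
    public static Vector2 NormalizePosition(Vector2 positionPx, Vector2 windowSize)
        => positionPx / windowSize;

    // Hypothetical helper: maps a pixel-per-second speed to (-1..1),
    // preserving direction, relative to an assumed maximum speed.
    public static Vector2 NormalizeSpeed(Vector2 speedPxPerSec, float maxSpeed)
    {
        var v = speedPxPerSec / maxSpeed;
        return new Vector2(Math.Clamp(v.X, -1f, 1f), Math.Clamp(v.Y, -1f, 1f));
    }
}
```

So a finger at (640, 360) in a 1280x720 window would have a NormalizedPosition of (0.5, 0.5), and a leftward swipe would yield a negative NormalizedSpeed.X.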
I did a test of the multitouch feature and it is working well. When is it planned to be released?
I don't know. Nobody has reviewed this yet.
Everyone on the team has been very busy the past few months and hasn't had much time to put towards Silk.NET, so open PRs such as this one have stalled quite a bit. We apologize for the lengthy review/response times; we have no ETA for the next release or for when this PR will be merged.
I notice this currently doesn't have a proposal in documentation/proposals. Ideally, one should be written so we can review the API at a high level and keep a record of our discussion and decisions.
If this ends up being accepted, I am very willing to port the gesture recognition work I'm doing for Godot over to Silk. IIRC, the only work I have left on it is Godot-specific implementation details. It supports multiple types of gestures occurring simultaneously, and it felt really good back when I had it running on Godot 3.
Thanks @domportera. It's hopeful to see the positivity on this PR! At the moment this is blocked on a proposal needing to be written (as per
Looking at the code in this PR, there is actually a ton of overlap between what this addresses and what mine does, so maybe it's best not to step on anyone's toes. That being said, because of that overlap, I can probably whip up a proposal that aligns pretty closely with the original author's intent.

I'm not terribly informed on the state of touch in Silk, though, as I've only ever used Silk for desktop imgui stuff at this point. I'm assuming it currently looks like "raw" touch events with finger persistence from SDL's implementation, but not so low-level as to expose actual sensor information? Do other backends have a working touch implementation?
Anything you can get through SDL is at your disposal. I'm assuming GLFW doesn't have the APIs we require.
@domportera Note that there's some interest from the Stride project about sensors. I think it may be worth writing this proposal atop Multi-Backend Input (i.e. 3.0), or at least evaluating the differences between 2.X and 3.0 to see whether one proposal can cover both; if not, I'd rather we focus on 3.0. I have copied the feedback I have received from Stride so far on Multi-Backend Input below for reference:

"I read over the whole Input proposal (okay, I skimmed some parts of it), as I have decent experience in that area. There are a few things
Damn, okay, this seems right up my alley. I have another project (in its ABSOLUTE infancy, called omnio, made in Godot, which will also be using my gesture library and arbitrary sensors lmao) that is going to have to tackle very similar issues of abstracting inputs from all sorts of sensors, gamepads, and mouse/KB/touch. I'm also interested in expanding the input capabilities of tooll3 after our next release, and I've begun to sneak Silk windows into the codebase already.

So this seems like a really good opportunity to get the whole damn .NET ecosystem covered. If we wanna go that deep, I'm absolutely game and have started to put some thought into it. I work with weird sensor inputs in C# for a living and have had my fair share of pain points, so I can probably brainstorm something that more or less covers all the bases. I'm down to propose an everything-input proposal, but I'd need to seriously ramp up on Silk to keep it relevant.
Agree.
Summary of the PR
This adds touch devices to the input abstraction. The goal is to add touch gesture recognition to the SDL backend, enabling touch gestures and simple touch finger events on Android and iOS.
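As a rough sketch of what consuming such an abstraction might look like, consider the shape below. All interface, event, and member names here are hypothetical illustrations for discussion, not necessarily the API this PR introduces.

```csharp
using System;
using System.Numerics;

// Hypothetical shape of a touch device in an input abstraction
// like Silk.NET's device interfaces. Names are illustrative only.
public interface ITouchDevice
{
    // Raised when a finger makes contact; the position is assumed
    // to be normalized to (0..1) relative to the window.
    event Action<int /* fingerId */, Vector2 /* normalizedPosition */> FingerDown;

    // Raised when a finger lifts or moves, respectively.
    event Action<int, Vector2> FingerUp;
    event Action<int, Vector2> FingerMove;
}
```

The finger-id parameter is what would give consumers finger persistence across events, which SDL's touch events already provide per touch point.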
Related issues, Discord discussions, or proposals
I proposed this a long time ago and finally found some time for it.