
Extend pointer events to support raw trackpad data #206

Closed
RByers opened this issue May 11, 2017 · 23 comments

@RByers
Contributor

RByers commented May 11, 2017

From old bug tracker:

In addition to mice, styli and touch screens, I've heard requests for an API web sites could use to get detailed touchpad data (rather than assuming they're identical to a mouse). In particular, when the hardware and OS support it, it should be possible to receive co-ordinates for all the touch points on a trackpad.

The most interesting difference here is that the co-ordinates are not directly connected with screen co-ordinates. I could imagine having an additional pointer type where pointerdown corresponded to putting a finger on the trackpad, and there were additional fields for trackpad co-ordinates (perhaps in some fixed range like [0,100]).

I don't think it's critical that we flesh out the details of this scenario now, but I'd like to keep it in mind as we define potential extensibility points.

I still don't think this is at all urgent, but I'd like to have a place in GitHub to refer to for this possible future scenario.

@RByers
Contributor Author

RByers commented May 11, 2017

Note that there is presumably still a connection to the screen in this case - the normal co-ordinates and event targeting would presumably all represent the mouse cursor location. So it's not an entirely "off screen pointing device".

@RByers
Contributor Author

RByers commented Dec 8, 2017

Note Edge is shipping something kinda like this where touchpad emulates a touchscreen, but without any of the API / spec changes we've discussed years ago. @dtapuska and I expressed our reservations about this plan to @scottlow at TPAC. Chrome has no intention of following this approach.

Chrome gets the same perf benefit with passive wheel listeners (sadly opt-in) and soon nearly the same by default with async wheel events. Although we'd probably have to define a 'wheel-action' CSS property to handle all the same edge cases with full fidelity.
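As a hedged illustration of the opt-in mentioned above (not code from this thread), this is the common feature-detection pattern for passive listener support, with the wheel registration shown as a comment; `detectPassiveSupport` is an illustrative name:

```javascript
// Detect whether the environment honours the `passive` listener option.
// The trick: the engine reads option properties while parsing the options
// object, so a getter on `passive` fires only if the option is understood.
function detectPassiveSupport(addListener) {
  let supported = false;
  try {
    const opts = Object.defineProperty({}, 'passive', {
      get() { supported = true; return false; }
    });
    addListener('test', null, opts);
  } catch (e) { /* older engines throw on object-form options */ }
  return supported;
}

// In a browser you would then opt in explicitly, e.g.:
// const passive = detectPassiveSupport(window.addEventListener.bind(window));
// window.addEventListener('wheel', onWheel, passive ? { passive: true } : false);
```

A `{ passive: true }` wheel listener promises never to call `preventDefault()`, which is what lets the browser start scrolling without waiting on script.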

We don't believe we could stop firing wheel events completely for touchpad scrolling for web compat reasons. We know the vast majority of websites still don't support pointer events for touchscreen operations like panning. Also it could be confusing to emulate a touchscreen on devices with a touchpad but no touchscreen (MacBook, some Chromebooks).

All that said, I don't anticipate too much Interop pain from developers being surprised by the difference between Edge and Chrome here. I guess we'll see.

@RByers
Contributor Author

RByers commented Dec 8, 2017

A few questions for @scottlow on the design in Edge:

  1. What does 'navigator.maxTouchPoints' return on a PTP device with no touchscreen?
  2. Are the pointer/hover media queries impacted as if there was a real touch screen?
  3. How are the co-ordinates scaled to map from the touchpad surface?
  4. If I call 'releasePointerCapture' for one of these fake points, how will hit-testing behave?
  5. For zoom operations, how does hit-testing behave? Can the two pointers get targeted to different elements / frames?

In terms of the Interop risk, I expect the main developer pain point to continue to be the difference in presence of wheel events. So I don't think this is net-worse than today (I can see an argument for it being net-positive for Edge, even without a path for Chrome to be able to match).

@scottlow

  1. In our current implementation, it returns 0 since we did not touch the logic in this area as part of our initial commit. I've opened a bug to change this behavior so that we return 2 in this case instead.

  2. No. Since PTP is still a "fine" pointing device, we didn't think it made sense to update these media queries.

  3. For zooming, there is no scaling; each contact's relative position to the center at the start of the gesture becomes its relative position to the cursor position on the screen (a contact +200x/+200y himetric from the gesture center on the down will translate to a point +200x/+200y himetric from the cursor location on the screen). For panning, we do scale the input slightly based on velocity. It's worth noting that in both of the aforementioned cases, we believe that scaling is optional/best left to the discretion of implementing user agents so that they create experiences that "feel right" for them.

  4. In our current implementation, we do not do an implicit capture when we emulate touch. As a result, releasePointerCapture only does something if setPointerCapture is first called on an element other than the element that is under the cursor. We have a change going in currently to alter this behavior so that we will fire gotPointerCapture (as we do for touch) for implicit capture of the element under the cursor when a gesture begins. If releasePointerCapture is called in the implicit capture case, hit-testing will start on the element under the cursor (which will not move while we are processing gestures).

  5. All events are fired to the element under the cursor location, so pointers will not target different elements/frames.
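The zoom mapping described in (3) can be sketched as a pure function; the names and units are illustrative, not Edge's actual code:

```javascript
// Each contact keeps its offset from the gesture center, re-anchored at the
// on-screen cursor position, with no scaling (per the description above).
function mapContactToScreen(contact, gestureCenter, cursor) {
  return {
    x: cursor.x + (contact.x - gestureCenter.x),
    y: cursor.y + (contact.y - gestureCenter.y),
  };
}

// A contact +200/+200 from the gesture center lands +200/+200 from the cursor.
```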

@wffurr

wffurr commented Jan 22, 2019

I'd love access to these for a drawing surface, e.g. the drawing notes in keep.google.com. I don't have any way to differentiate between events from a mouse wheel and a touchpad. We previously had a default behavior where wheel would zoom and ctrl+wheel would scroll, as zooming was much more common, but this worked poorly with touchpad gestures.

Detecting rotation gestures would be a neat feature too.
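For what it's worth, the common (imperfect) heuristic for this today relies on Chromium and Firefox synthesizing wheel events with `ctrlKey` set for trackpad pinch gestures, and on trackpad scrolls tending to produce small fractional pixel deltas. A hedged sketch; the thresholds and names are illustrative, not a spec guarantee:

```javascript
// Rough classifier for wheel events; this is guesswork, not guaranteed behavior.
function classifyWheel(event) {
  if (event.ctrlKey) return 'pinch-zoom'; // trackpad pinch (or literal ctrl+wheel)
  const fractional = event.deltaY % 1 !== 0;
  if (event.deltaMode === 0 && fractional && Math.abs(event.deltaY) < 15) {
    return 'trackpad-scroll'; // small fractional pixel deltas suggest a trackpad
  }
  return 'wheel-scroll'; // likely a discrete mouse wheel
}
```

This is exactly the kind of heuristic a real trackpad pointer type would make unnecessary.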

@gked pinned this issue Jun 25, 2019
@patrickhlauke
Member

Closed (discussed in PEWG meeting today https://www.w3.org/2019/07/10-pointerevents-minutes.html) - concerns about being too low-level, crossing over with things such as sensor API, and - at least at this stage - few use cases/resources in this area.

@porg

porg commented Sep 1, 2020

@patrickhlauke As web apps facilitate more and more touch gestures, this will become a necessity. 3D web apps are a good example. The use case mentioned above, e.g. in ScrollMaps, is hindered by this limitation.

@aeharding

I wish this would be reconsidered. 😊

Just as an example, macOS has a scroll bounce/overflow effect used in every native app.

Blink (and thus, electron) only support this effect on the main body of the HTML page.

This means it's currently impossible to replicate this effect with web technologies. There are two possibilities to resolve this:

  1. Expose a more detailed, modern trackpad API for projects such as elastic-scroll-polyfill to better replicate this behavior with Javascript, or
  2. A CSS property to enable this effect on a scrollable <div>

I'd take either approach, but it really feels like the raw trackpad event data should be made available. In 2020, we have WebUSB, Bluetooth APIs and so much more, but for some reason we don't have trackpad event data to allow web apps to replicate a really fundamental macOS app design element.

@trusktr

trusktr commented Feb 1, 2021

It is impossible to make proper custom inertial scrolling based on current wheel events.

The closest I could get is the following, but because there's no way to detect start/end of interaction, the janky inertial scroll happens even when fingers are still on the trackpad (tested in Chrome):

https://codepen.io/trusktr/pen/YzGbeKG

The janky inertia is my fault: I could probably average the last few deltas. But if you try flinging the rotation, that works really well without needing such a trick. I hope for something as nice as the rotation (pointer events) but for the zoom (touchpad).

It would be super great to have the ability to know when any finger goes down, moves, and goes up on a trackpad.

I would imagine pointer events on touchpads would work the same, but the events would have some flag that says it's happening on a touchpad instead of in the window.
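The workaround hinted at above (averaging the last few deltas, and guessing the end of interaction from a gap between events) can be sketched like this. The class name, timeout, and history size are all illustrative guesses, since the platform provides no real touch-up signal:

```javascript
// Track wheel deltas and infer gesture boundaries from timing gaps.
class WheelGestureTracker {
  constructor(endTimeoutMs = 80, historySize = 4) {
    this.endTimeoutMs = endTimeoutMs; // gap after which we assume fingers lifted
    this.historySize = historySize;   // how many recent deltas to average
    this.deltas = [];
    this.lastTime = -Infinity;
  }

  // Feed each wheel event's deltaY and timestamp; returns true if this
  // event starts a new gesture (the previous one is considered ended).
  push(deltaY, timeMs) {
    const isNewGesture = timeMs - this.lastTime > this.endTimeoutMs;
    if (isNewGesture) this.deltas = [];
    this.deltas.push(deltaY);
    if (this.deltas.length > this.historySize) this.deltas.shift();
    this.lastTime = timeMs;
    return isNewGesture;
  }

  // Average of the recent deltas, used to seed the inertial velocity.
  releaseVelocity() {
    if (this.deltas.length === 0) return 0;
    return this.deltas.reduce((a, b) => a + b, 0) / this.deltas.length;
  }
}
```

In a browser you would call `tracker.push(e.deltaY, e.timeStamp)` from a `wheel` listener and start inertia from `releaseVelocity()` once no event arrives within the timeout - which is precisely the guesswork a real touch-up signal would eliminate.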

@dawCreator

I also believe it's important to expose touch locations on trackpads. If I could react to custom gestures I could improve the workflow on my website immensely.

@bxff

bxff commented Oct 5, 2022

Just like @trusktr and @aeharding mentioned, there is no way to properly implement macOS's rubberband effect/elastic scroll in browsers because of current API limitations. The main limitation for the rubberband effect is that there is no way to find out whether the scroll event ended, as in whether the finger is off the trackpad.

This is the same effect implemented in react-indiana-drag-scroll using the mouse event:
[Screen recording attachment, 2022-10-05]

In 2022 I believe this issue should be given serious consideration.

@patrickhlauke
Member

The main limitation for the rubberband effect is that there is no way to find [...] whether the finger is off the trackpad

Once the finger is lifted, it should fire the pointerup event the same way mouseup is fired, or am i missing something?

@bxff

bxff commented Oct 5, 2022

The main limitation for the rubberband effect is that there is no way to find [...] whether the finger is off the trackpad

Once the finger is lifted, it should fire the pointerup event the same way mouseup is fired, or am i missing something?

Note I am talking about scroll events; as far as I've noticed, pointer events aren't fired on scroll events, at least not on macOS.

Are pointer events supposed to be fired on scrolls on trackpads?

@aeharding

The main limitation for the rubberband effect is that there is no way to find [...] whether the finger is off the trackpad

Once the finger is lifted, it should fire the pointerup event the same way mouseup is fired, or am i missing something?

It is not currently possible to detect when a user lifts a finger (and puts a finger down) on a macOS touchpad. View codepen, move finger around the div, pause, and lift finger up. No event fired.

https://codepen.io/aeharding/pen/JjvBvpR

We are missing a lower-level API for the macOS touchpad. Native apps have this ability, like NSTouch. https://developer.apple.com/documentation/appkit/nstouch

@patrickhlauke
Member

Are pointer events supposed to be fired on scrolls on trackpads?

no, by their very nature pointer events do not fire when a gesture is executed with the pointer input. if the browser or OS takes over for a particular movement (like doing a scroll in response to a trackpad gesture), then the pointer is cancelled and control is ceded directly to the browser/OS, so no further pointer events are fired. as this is fundamental to the current behaviour/way that pointer events are specified, this is not a feature that is likely to be changed. you'd have to do something like that react-indiana-drag-scroll you mentioned, using touch-action: none, and handling it from basic principles using the various pointer* events
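A minimal sketch of that "from basic principles" approach, assuming an element whose panning the page takes over entirely (all names here are illustrative):

```javascript
// Take over panning for an element: touch-action: none stops the browser
// from consuming the gesture, so pointer* events keep firing.
function attachCustomPan(el, onPan) {
  el.style.touchAction = 'none';
  let lastX = 0, lastY = 0, active = false;
  el.addEventListener('pointerdown', (e) => {
    active = true;
    lastX = e.clientX;
    lastY = e.clientY;
    el.setPointerCapture(e.pointerId); // keep receiving moves off-element
  });
  el.addEventListener('pointermove', (e) => {
    if (!active) return;
    onPan(e.clientX - lastX, e.clientY - lastY); // report the pan delta
    lastX = e.clientX;
    lastY = e.clientY;
  });
  el.addEventListener('pointerup', () => { active = false; });
}
```

Because `touch-action: none` disables the browser's own gesture handling, the page then owns all scrolling of that element, including any inertia or rubberbanding it wants to simulate.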

@stam

stam commented Nov 3, 2022

This is a shame. For interacting in 3D space, mouse interaction and trackpad interaction are completely different, and there is literally no way to tell whether pointer events come from a mouse or a trackpad. They're all pointerType: "mouse".

@Exatex

Exatex commented Nov 28, 2022

@patrickhlauke I would like to weigh in on the "demand" side and heavily vote for a reopen. As @stam already said, this is currently an unsolvable issue for web-based 3D applications, and it severely limits the use of web technologies for 3D applications whose controls even remotely resemble non-web industry standards and that need to use all available degrees of freedom for spatial navigation. There is no real acceptable workaround. I fear the underlying assumptions of the current high-level solution (about which gestures correspond to which intentions) are simply no longer true.

Right now, we have to use heuristics to guess what the original gesture was, and thus have tons of users complaining that with certain input devices the inputs are mis-interpreted, rendering the webapp unusable. I understand there are some concerns on some technical implementations, but it is hard to imagine that whatever solution would be proposed is worse than us training an ML model running in the frontend just to guess the lost information of mouse events. This is actually our current plan if there will be no fix.

@porg

porg commented Nov 29, 2022

@patrickhlauke Do you see to what lengths developers who need trackpad data must go? ML to interpret mangled input data. It's horrible what efforts they have to make just because that data is not passed through…

@patrickhlauke
Member

as this issue has now morphed beyond "get raw trackpad data" it seems (because if you got raw data, you'd still have to interpret it to work out what gesture, if any, the user actually intended to do), suggest opening a more specific new issue that specifically outlines the exact ask that you have here with regards to 3D applications.

@funwithtriangles

Heya, colleague of @Exatex here 👋

I have opened a new issue to explain the problem in more detail.

@govizlora

as this issue has now morphed beyond "get raw trackpad data" it seems (because if you got raw data, you'd still have to interpret it to work out what gesture, if any, the user actually intended to do), suggest opening a more specific new issue that specifically outlines the exact ask that you have here with regards to 3D applications.

"get raw trackpad data" is still the only issue. Interpreting the gestures can be handled by the developers respectively, instead of being provided by w3c.

@patrickhlauke
Member

patrickhlauke commented Jan 31, 2023

define what exactly you mean by "get raw trackpad data". what does this data look like? how does it differ from what pointer events currently offer?

also, how would this be limited to just trackpads - is there any more generalised ask here for touchscreens or other pointer input types?

(it's not going to make it into version 3, and it may not even make it into pointer events but rather a sibling spec, but the conversation here has veered into odd "i can't tell a scrollwheel and a trackpad apart" which is a different concern altogether)

@patrickhlauke
Member

discussed this at today's pointer events working group meeting https://www.w3.org/2023/02/01-pointerevents-minutes.html#t04

while we understand the developer need/use cases here, we feel it's not quite the appropriate fit for the pointer events API - it is arguably far more low-level, and may be more closely related to the pointer lock API or a more generalised sensors API.

we recommend posting a proposal (including use cases) to the Web Incubator Community Group (WICG) https://discourse.wicg.io/ where it may find more traction and a more appropriate venue.
