Introduction to Mobile Touch Events

By publishing this Recommendation, W3C expects that the functionality specified in this Touch Interface Recommendation will not be affected by changes to HTML5 or Web IDL as those specifications proceed to Recommendation. The WG has completed and approved this specification's Test Suite and created an Implementation Report that shows that two or more independent implementations pass each test.

The Touch Events specification defines a set of low-level events that represent one or more points of contact with a touch-sensitive surface, and changes of those points with respect to the surface and any DOM elements displayed upon it (e.g. for touch screens) or associated with it (e.g. for drawing tablets without displays). It also addresses pen-tablet devices, such as drawing tablets, with consideration toward stylus capabilities.

Introduction

User agents that run on terminals which provide touch input for using web applications typically rely on interpreted mouse events to let users access interactive web applications. However, these interpreted events, being normalized data derived from the physical touch input, tend to limit the user experience that can be delivered. In addition, mouse events cannot represent concurrent input regardless of device capability, due to both system-level limitations and legacy compatibility constraints.

Meanwhile, native applications are capable of handling both cases with the provided system APIs.

The Touch Events specification provides a solution to this problem by specifying interfaces to allow web applications to directly handle touch events, and multiple touch points for capable devices.

The W3C's Protocols and Formats Working Group created a non-normative document that includes a mapping of hardware events (e.g. keyboard events) to touch events. For more information see Touch Events Accessibility Mapping.

This specification defines conformance criteria that apply to a single product: the user agent that implements the interfaces that it contains.

WindowProxy is defined in [[!HTML5]].

WebIDL Conformance

The IDL blocks in this specification are conforming IDL fragments as defined by the WebIDL specification [[!WEBIDL]].

A conforming Web Events user agent must also be a conforming ECMAScript implementation of the IDL fragments in this specification, with the following exception:

  • section 4.4.6 of Web IDL requires that IDL attributes are reflected as accessor properties on interface prototype objects. Instead of this, the user agent may reflect IDL attributes as data properties on the platform objects that implement the relevant interface. These data properties must have the same behavior when getting and setting as would be exhibited when invoking the getter and setter of the accessor properties on the platform object.

Note: Both ways of reflecting IDL attributes allow for simply getting and setting the property on the platform object to work. For example, given a Touch object aTouch, evaluating aTouch.target would return the EventTarget for the Touch object. If the user agent implements IDL attributes as accessor properties, then the property access invokes the getter which returns the EventTarget. If the user agent implements IDL attributes as data properties on the platform object with the same behavior as would be found with the accessor properties, then the object would appear to have an own property named "target" whose value is an EventTarget object, and the property access would return this value.
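
As a non-normative illustration, a script can observe which of these two reflection strategies a user agent chose by inspecting property descriptors. This is only a sketch and assumes the Touch interface object is exposed on the global object; applications do not normally need this check:

    document.addEventListener('touchstart', function (ev) {

        var aTouch = ev.changedTouches.item(0);

        // Accessor-property strategy: "target" is defined on Touch.prototype
        // as a getter/setter pair and is not an own property of the instance.
        var accessor = Object.getOwnPropertyDescriptor(Touch.prototype, 'target');

        // Data-property strategy: "target" is an own data property of the
        // Touch object itself, carrying the EventTarget as its value.
        var ownData = Object.getOwnPropertyDescriptor(aTouch, 'target');

        // Exactly one of "accessor" / "ownData" is expected to be defined,
        // but in both cases a plain property read behaves identically.
        var target = aTouch.target; // the EventTarget the touch started on

    }, false);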

Touch Interface

This interface describes an individual touch point for a touch event. Touch objects are immutable; after one is created, its attributes must not change.

readonly attribute long identifier
An identification number for each touch point. When a touch point becomes active, it must be assigned an identifier that is distinct from any other active touch point. While the touch point remains active, all events that refer to it must assign it the same identifier.
readonly attribute EventTarget target
The EventTarget on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of that element.
readonly attribute double screenX
The horizontal coordinate of the point relative to the screen, in pixels.
readonly attribute double screenY
The vertical coordinate of the point relative to the screen, in pixels.
readonly attribute double clientX
The horizontal coordinate of the point relative to the viewport, in pixels, excluding any scroll offset.
readonly attribute double clientY
The vertical coordinate of the point relative to the viewport, in pixels, excluding any scroll offset.
readonly attribute double pageX
The horizontal coordinate of the point relative to the viewport, in pixels, including any scroll offset.
readonly attribute double pageY
The vertical coordinate of the point relative to the viewport, in pixels, including any scroll offset.
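
For example, a touchstart listener can read the three coordinate pairs and the identifier of the touch that triggered it. This is a minimal sketch; the "touchable" element id is the one used in the usage examples later in this document:

    document.getElementById('touchable').addEventListener('touchstart', function (ev) {

        var touch = ev.changedTouches.item(0);

        // Position relative to the whole screen.
        var sx = touch.screenX, sy = touch.screenY;

        // Position relative to the viewport, ignoring scroll offsets.
        var cx = touch.clientX, cy = touch.clientY;

        // Position relative to the page, including scroll offsets; on an
        // unscrolled page pageX equals clientX and pageY equals clientY.
        var px = touch.pageX, py = touch.pageY;

        // The identifier stays stable for this touch point across
        // touchstart, touchmove, and touchend/touchcancel.
        var id = touch.identifier;

    }, false);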

TouchList Interface

This interface defines a list of individual points of contact for a touch event. TouchList objects are immutable; after one is created, its contents must not change.

A TouchList object's supported property indices ([[!WEBIDL]]) are the numbers in the range 0 to one less than the length of the list.

readonly attribute unsigned long length
returns the number of Touches in the list
getter Touch? item (in unsigned long index)
returns the Touch at the specified index in the list or null if the index is not less than the length of the list.
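
For example, a handler can walk a TouchList with length and item(); the sketch below collects the identifier of every touch point currently on the surface:

    document.addEventListener('touchmove', function (ev) {

        var ids = [];

        // Indices run from 0 to length - 1; item() returns null for any
        // index that is not less than length.
        for (var i = 0; i < ev.touches.length; i++) {
            ids.push(ev.touches.item(i).identifier);
        }

        // ids now holds one entry per active touch point on the surface.

    }, false);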

TouchEvent Interface

This interface defines the touchstart, touchend, touchmove, and touchcancel event types. TouchEvent objects are immutable; after one is created and initialized, its attributes must not change.

readonly attribute TouchList touches
a list of Touches for every point of contact currently touching the surface.
readonly attribute TouchList targetTouches
a list of Touches for every point of contact that is touching the surface and started on the element that is the target of the current event.
readonly attribute TouchList changedTouches

a list of Touches for every point of contact which contributed to the event.

For the touchstart event this must be a list of the touch points that just became active with the current event. For the touchmove event this must be a list of the touch points that have moved since the last event. For the touchend and touchcancel events this must be a list of the touch points that have just been removed from the surface.

readonly attribute boolean altKey
true if the alt (Alternate) key modifier is activated; otherwise false
readonly attribute boolean metaKey
true if the meta (Meta) key modifier is activated; otherwise false. On some platforms this attribute may map to a differently-named key modifier.
readonly attribute boolean ctrlKey
true if the ctrl (Control) key modifier is activated; otherwise false
readonly attribute boolean shiftKey
true if the shift (Shift) key modifier is activated; otherwise false
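
For devices that pair a hardware keyboard with a touch surface, the modifier attributes can be read directly on the event. The sketch below uses placeholder handler bodies and assumes the "touchable" element id from the usage examples:

    document.getElementById('touchable').addEventListener('touchstart', function (ev) {

        if (ev.shiftKey || ev.ctrlKey || ev.metaKey) {
            // A keyboard modifier was held while the touch began;
            // e.g. extend the current selection instead of replacing it.
        }

        if (ev.altKey) {
            // Alternate behaviour while the alt key is held.
        }

    }, false);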

TouchEvent Implementer's Note

User agents should ensure that all Touch objects available from a given TouchEvent are all associated to the same document that the TouchEvent was dispatched to. To implement this, user agents should maintain a notion of the current touch-active document. On first touch, this is set to the target document where the touch was created. When all active touch points are released, the touch-active document is cleared. All TouchEvents are dispatched to the current touch-active document, and each Touch object it contains refers only to DOM elements (and co-ordinates) in that document. If a touch starts entirely outside the currently touch-active document, then it is ignored entirely.

Usage Examples

The examples below demonstrate the relations between the different TouchList members defined in a TouchEvent.

touches and targetTouches of a TouchEvent

This example demonstrates the utility and relations between the touches and targetTouches members defined in the TouchEvent interface. The following code will generate different output based on the number of touch points on the touchable element and the document:

    <div id='touchable'>
        This element is touchable.
    </div>

    document.getElementById('touchable').addEventListener('touchstart', function(ev) {

        if (ev.touches.item(0) == ev.targetTouches.item(0))
        {
            /**
             * If the first touch on the surface is also targeting the
             * "touchable" element, the code below should execute.
             * Since targetTouches is a subset of touches which covers the
             * entire surface, ev.touches.length >= ev.targetTouches.length
             * is always true.
             */

            document.write('Hello Touch Events!');
        }

        if (ev.touches.length == ev.targetTouches.length)
        {
            /**
             * If all of the active touch points are on the "touchable"
             * element, the length properties should be the same.
             */

            document.write('All points are on target element');
        }

        if (ev.touches.length > 1)
        {
            /**
             * On a single touch input device, there can only be one point
             * of contact on the surface, so the following code can only
             * execute when the terminal supports multiple touches.
             */

            document.write('Hello Multiple Touch!');
        }

    }, false);

changedTouches of a TouchEvent

This example demonstrates the utility of changedTouches and its relation to the other TouchList members of the TouchEvent interface. The code is an example that triggers whenever a touch point is removed from the defined touchable element:

    <div id='touchable'>
        This element is touchable.
    </div>

    document.getElementById('touchable').addEventListener('touchend', function(ev) {

        /**
         * Example output when three touch points are on the surface,
         * two of them being on the "touchable" element and one point
         * in the "touchable" element is lifted from the surface:
         *
         * Touch points removed: 1
         * Touch points left on element: 1
         * Touch points left on document: 2
         */

        document.write('Removed: ' + ev.changedTouches.length);
        document.write('Remaining on element: ' + ev.targetTouches.length);
        document.write('Remaining on document: ' + ev.touches.length);

    }, false);

List of TouchEvent types

The following table provides a summary of the TouchEvent types defined in this specification. All of these events bubble. Some events are not cancelable (see preventDefault).

Event Type  | Sync/Async | Bubbling phase | Trusted proximal event target types | DOM interface | Cancelable | Default Action
touchstart  | Sync       | Yes            | Document, Element                   | TouchEvent    | Yes        | undefined
touchend    | Sync       | Yes            | Document, Element                   | TouchEvent    | Yes        | Varies: mousemove (if the point has moved), mousedown, mouseup, click
touchmove   | Sync       | Yes            | Document, Element                   | TouchEvent    | Yes        | undefined
touchcancel | Sync       | Yes            | Document, Element                   | TouchEvent    | No         | none

The touchstart event

A user agent must dispatch this event type to indicate when the user places a touch point on the touch surface.

The target of this event must be an Element. If the touch point is within a frame, the event should be dispatched to an element in the child browsing context of that frame.

If the preventDefault method is called on this event, it should prevent any default actions caused by any touch events associated with the same active touch point, including mouse events or scrolling.
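
As a sketch, a handler that wants to take over the interaction completely can cancel touchstart, which suppresses scrolling as well as the compatibility mouse events for this touch point (the "touchable" element id is assumed from the usage examples):

    document.getElementById('touchable').addEventListener('touchstart', function (ev) {

        // Canceling touchstart asks the user agent not to perform any
        // default action for this touch point, including panning/scrolling
        // and the compatibility mouse events.
        ev.preventDefault();

    }, false);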

The touchend event

A user agent must dispatch this event type to indicate when the user removes a touch point from the touch surface, also including cases where the touch point physically leaves the touch surface, such as being dragged off of the screen.

The target of this event must be the same Element on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of the target element.

The touch point or points that were removed must be included in the changedTouches attribute of the TouchEvent, and must not be included in the touches and targetTouches attributes.

If this event is cancelled, any sequence of touch events that includes this event must not be interpreted as a click.
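
For example, a script that has already handled the tap itself can cancel touchend so the user agent does not also interpret the sequence as a click (a sketch, with the tap handling left as a placeholder):

    document.getElementById('touchable').addEventListener('touchend', function (ev) {

        // Handle the tap in script here ...

        // ... and tell the user agent not to also synthesize a click.
        ev.preventDefault();

    }, false);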

The touchmove event

A user agent must dispatch this event type to indicate when the user moves a touch point along the touch surface.

The target of this event must be the same Element on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of the target element.

Note that the rate at which the user agent sends touchmove events is implementation-defined, and may depend on hardware capabilities and other implementation details.

A user agent should suppress the default action caused by any touchmove event until at least one touchmove event associated with the same active touch point is not cancelled. Whether the default action is suppressed for touchmove events after at least one touchmove event associated with the same active touch point is not cancelled is implementation dependent.
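
A common application of this rule is suppressing scrolling while the user drags an element; the sketch below cancels each touchmove for the element. Note that current browsers may additionally require the listener to be registered with { passive: false } for preventDefault to have an effect:

    document.getElementById('touchable').addEventListener('touchmove', function (ev) {

        // Canceling the touchmove events for this touch point asks the
        // user agent to suppress its default action (typically scrolling).
        ev.preventDefault();

        var touch = ev.changedTouches.item(0);
        // ... reposition the dragged element using touch.pageX / touch.pageY ...

    }, false);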

The touchcancel event

A user agent must dispatch this event type to indicate when a touch point has been disrupted in an implementation-specific manner, such as a synchronous event or action originating from the UA canceling the touch, or the touch point leaving the document window into a non-document area which is capable of handling user interactions (e.g. the UA's native user interface, or plug-ins). A user agent may also dispatch this event type when the user places more touch points on the touch surface than the device or implementation is configured to store, in which case the earliest Touch object in the TouchList should be removed.

The target of this event must be the same Element on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of the target element.

The touch point or points that were removed must be included in the changedTouches attribute of the TouchEvent, and must not be included in the touches and targetTouches attributes.
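
Because touchcancel ends a touch point without a corresponding touchend, applications that keep per-touch state typically release it in a touchcancel handler. A minimal sketch, assuming a script-defined activeDrags map keyed by identifier:

    var activeDrags = {}; // keyed by Touch.identifier, maintained elsewhere

    document.getElementById('touchable').addEventListener('touchcancel', function (ev) {

        // The canceled touch points are reported only in changedTouches.
        for (var i = 0; i < ev.changedTouches.length; i++) {
            var touch = ev.changedTouches.item(i);

            // Abandon any in-progress gesture for this touch point.
            delete activeDrags[touch.identifier];
        }

    }, false);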

Extensions to the Document Interface

The Document interface [[!DOM-LEVEL-3-CORE]] contains methods by which the user can create Touch and TouchList objects.

Touch createTouch()
Creates a Touch object with the specified attributes.

WindowProxy view
EventTarget target
long identifier
double pageX
double pageY
double screenX
double screenY
TouchList createTouchList()
Creates a TouchList object consisting of zero or more Touch objects. Calling this method with no arguments creates a TouchList with no objects in it and length 0 (zero).

Touch... touches

Some user agents implement an initTouchEvent method as part of the TouchEvent interface. When this method is available, scripts can use it to initialize the properties of a TouchEvent object, including its TouchList properties (which can be initialized with values returned from createTouchList). The initTouchEvent method is not yet standardized, but it may appear in some form in a future specification.
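
A rough sketch of the two factory methods follows; it only constructs the objects, since dispatching a synthetic event would rely on the non-standard initTouchEvent method mentioned above, whose signature differs between user agents (the "touchable" element id is assumed for illustration):

    // Create a synthetic Touch at page coordinates (100, 100) targeting
    // the "touchable" element (identifier 0 is an arbitrary choice).
    var target = document.getElementById('touchable');
    var touch = document.createTouch(window, target, 0,
                                     100, 100,   // pageX, pageY
                                     100, 100);  // screenX, screenY

    // A TouchList holding that single Touch; called with no arguments,
    // the resulting list would have length 0.
    var touchList = document.createTouchList(touch);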

Interaction with Mouse Events

The user agent may dispatch both touch events and mouse events [[!DOM-LEVEL-2-EVENTS]] in response to the same user input. If the user agent dispatches both touch events and mouse events in response to a single user action, then the touchstart event type must be dispatched before any mouse event types for that action. If the preventDefault method of touchstart, touchmove, or touchend is called, the user agent should not dispatch any mouse event that would be a consequential result of the prevented touch event.

If a Web application can process touch events, it can intercept them, and no corresponding mouse events would need to be dispatched by the user agent. If the Web application is not specifically written for touch input devices, it can react to the subsequent mouse events instead.

If the user agent interprets a sequence of touch events as a click, then it should dispatch mousemove, mousedown, mouseup, and click events (in that order) at the location of the touchend event for the corresponding touch input. If the contents of the document have changed during processing of the touch events, then the user agent may dispatch the mouse events to a different target than the touch events.

The default actions and ordering of any further touch and mouse events are implementation-defined, except as specified elsewhere.
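
One way to work with this model is to handle the interaction in the touch listeners and cancel the touch events, so the user agent never dispatches the derived mouse events; a mouse listener then fires only for genuine mouse input. A sketch with placeholder handler bodies:

    var el = document.getElementById('touchable');

    el.addEventListener('touchstart', function (ev) {
        // Touch path: handle the interaction here ...
        ev.preventDefault(); // ... and suppress the derived mouse events.
    }, false);

    el.addEventListener('mousedown', function (ev) {
        // Mouse path: only reached for real mouse input (or when the
        // touch handlers did not cancel the touch events).
    }, false);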

Glossary

active touch point
A touch point which is currently on the screen and is being tracked by the user agent. The touch point becomes active when the user agent first dispatches a touchstart event indicating its appearance. It ceases to be active after the user agent dispatches a touchend or touchcancel event indicating that the touch point is removed from the surface or no longer tracked.
touch point
The coordinate point at which a pointer (e.g. a finger or stylus) intersects the target surface of an interface. This may apply to a finger touching a touch screen, or a digital pen writing on a piece of paper.
preventDefault
If an event is cancelable, the preventDefault method is used to signify that the event is to be canceled, and any default actions defined in the user agent as a result of this event, or consequential events from the canceled event, will not occur. Calling this method on a non-cancelable event has no effect.

Issues

The working group maintains a list of open issues in this specification. These issues may be addressed in future revisions of the specification.

Acknowledgements

Many thanks to the WebKit engineers for developing the model used as a basis for this spec, Neil Roberts (SitePen) for his summary of WebKit touch events, Peter-Paul Koch (PPK) for his write-ups and suggestions, Robin Berjon for developing the ReSpec.js spec authoring tool, and the WebEvents WG for their many contributions.

Many others have made additional comments as the spec developed, which have led to steady improvements. Among them are Matthew Schinckel, Andrew Grieve, Cathy Chan, and Boris Zbarsky. If we inadvertently omitted your name, please let me know.

The group acknowledges the following contributors to this specification‘s test suite: Matt Brubeck, Olli Pettay, Art Barstow, Cathy Chan and Rick Byers.

Changes Since Last Publication

The following non-substantive changes were made since the 24 January 2013 Last Call Working Draft was published:

  • Added a non-normative note for implementers regarding event targets (changeset).
  • Added a non-normative note regarding mapping hardware events to touch events (changeset).
  • Minor Web IDL bug fixes and clarifications (changesets: 1, 2).
  • Added a non-normative note regarding the initTouchEvent method that is not standardized (changesets: 1, 2).