太阳火神的美丽人生 (http://blog.csdn.net/opengl_es)
This article is published under the Creative Commons "Attribution-NonCommercial-ShareAlike" license
=======================================
iOS 9.1
Live Photos
Support for Apple Pencil
=======================================
This article summarizes the key developer-related features introduced in iOS 9.1, which runs on currently
shipping iOS devices. The article also lists the documents that describe new features in more detail.
For late-breaking news and information about known issues, see iOS 9.1 Release Notes. For the complete list
of new APIs added in iOS 9.1, see iOS 9.1 API Diffs. For more information on new devices, see iOS Device
Compatibility Reference.
Live Photos
Live Photos is a feature of iOS 9 that allows users to capture and relive their favorite moments with richer
context than traditional photos. When the user presses the shutter button, the Camera app captures much
more content along with the regular photo, including audio and additional frames before and after the photo.
When browsing through these photos, users can interact with them and play back all the captured content,
making the photos come to life.
iOS 9.1 introduces APIs that allow apps to incorporate playback of Live Photos, as well as export the data for
sharing. The Photos framework lets you fetch a PHLivePhoto object, which represents all the data that
comprises a Live Photo, from the PHImageManager object. You can use a PHLivePhotoView
object (defined in the PhotosUI framework) to display the contents of a Live Photo. The PHLivePhotoView
view takes care of displaying the image, handling all user interaction, and applying the visual treatments to
play back the content.
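As a minimal Swift sketch of that flow, the code below fetches the newest asset in the library, requests a Live Photo for it through PHImageManager, and hands the result to a PHLivePhotoView. The fetch-the-latest-asset logic is an illustrative assumption (a real app would let the user pick an asset), and photo-library authorization is assumed to have been granted already.

    import Photos
    import PhotosUI
    import UIKit

    // Minimal sketch: fetch the most recent image asset and, if it resolves
    // to a Live Photo, hand it to a PHLivePhotoView for display. If the asset
    // is not a Live Photo, the result handler receives nil.
    func showLatestLivePhoto(in livePhotoView: PHLivePhotoView) {
        let options = PHFetchOptions()
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        guard let asset = PHAsset.fetchAssets(with: .image, options: options).firstObject else {
            return
        }

        PHImageManager.default().requestLivePhoto(
            for: asset,
            targetSize: livePhotoView.bounds.size,
            contentMode: .aspectFit,
            options: nil
        ) { livePhoto, _ in
            // The view takes care of showing the image, handling user
            // interaction, and applying the playback treatments.
            livePhotoView.livePhoto = livePhoto
        }
    }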
You can also use PHAssetResource to access the data of a PHLivePhoto object for sharing purposes. You
can request a PHLivePhoto object for an asset in the user’s photo library by using PHImageManager or
UIImagePickerController. If you have a sharing extension, you can also get PHLivePhoto objects by
using NSItemProvider. On the receiving side of a share, you can recreate a PHLivePhoto object from the
set of files originally exported by the sender.
The data of a Live Photo is exported as a set of files in a PHAssetResource object. The set of files must be
preserved as a unit when you upload them to a server. When you rebuild a PHLivePhoto with these files on
the receiver side, the files are validated; loading fails if the files don’t come from the same asset.
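A hedged Swift sketch of that round trip: on the sender, write each PHAssetResource of the asset out to a file; on the receiver, rebuild the PHLivePhoto from that same set of files with PHLivePhoto's requestLivePhotoWithResourceFileURLs method. The destination directory and the minimal error handling here are illustrative assumptions.

    import Photos

    // Sender side: export every resource file that makes up the asset.
    // The files must be kept together as a unit when uploaded.
    func exportResources(of asset: PHAsset, to directory: URL) {
        for resource in PHAssetResource.assetResources(for: asset) {
            let fileURL = directory.appendingPathComponent(resource.originalFilename)
            PHAssetResourceManager.default().writeData(for: resource,
                                                       toFile: fileURL,
                                                       options: nil) { error in
                if let error = error {
                    print("Export failed: \(error)")
                }
            }
        }
    }

    // Receiver side: rebuild the Live Photo from the exported files.
    // The files are validated; the request fails if they don't come
    // from the same original asset.
    func rebuildLivePhoto(from fileURLs: [URL],
                          completion: @escaping (PHLivePhoto?) -> Void) {
        PHLivePhoto.request(withResourceFileURLs: fileURLs,
                            placeholderImage: nil,
                            targetSize: .zero,
                            contentMode: .aspectFit) { livePhoto, _ in
            completion(livePhoto)
        }
    }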
To learn how to give users a great experience with Live Photos in your app, see Live Photos.
Support for Apple Pencil
iOS 9.1 introduces APIs that help you use coalesced and predictive touches that can be produced by Apple
Pencil on supported devices. Specifically, the UITouch class includes (a usage sketch follows the list):
● The preciseLocationInView: and precisePreviousLocationInView: methods, which give you
the precise location for a touch (when available)
● The altitudeAngle property and the azimuthAngleInView: and azimuthUnitVectorInView:
methods, which help you determine the altitude and azimuth of the stylus
● The estimatedProperties and estimatedPropertiesExpectingUpdates properties, which help
you prepare to update touches whose properties are estimated
● The UITouchTypeStylus constant, which is used to represent a touch received from a stylus
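A rough Swift sketch of how these APIs might fit together in a drawing view follows. The CanvasView class and its addSample helper are hypothetical; the UITouch, UIEvent, and UIResponder calls are the APIs named above, plus coalescedTouches(for:) on UIEvent and the touchesEstimatedPropertiesUpdated callback that delivers refined values for estimated properties.

    import UIKit

    // Hypothetical drawing view using the iOS 9.1 stylus APIs: precise
    // locations for stylus touches, altitude/azimuth to vary the stroke,
    // and estimated-property tracking so strokes can be refined later.
    class CanvasView: UIView {
        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            for touch in touches where touch.type == .stylus {
                // Coalesced touches recover the full-resolution input that
                // arrived between display frames.
                for sample in event?.coalescedTouches(for: touch) ?? [touch] {
                    let point = sample.preciseLocation(in: self) // finer than location(in:)
                    let tilt = sample.altitudeAngle              // 0 = parallel to the surface
                    let azimuth = sample.azimuthAngle(in: self)  // direction the stylus points
                    addSample(at: point, tilt: tilt, azimuth: azimuth,
                              estimated: !sample.estimatedPropertiesExpectingUpdates.isEmpty)
                }
            }
        }

        // The system delivers refined values here for touches whose
        // properties were initially estimated; match stored samples
        // (e.g. via estimationUpdateIndex) and redraw them.
        override func touchesEstimatedPropertiesUpdated(_ touches: Set<UITouch>) {
            setNeedsDisplay()
        }

        private func addSample(at point: CGPoint, tilt: CGFloat,
                               azimuth: CGFloat, estimated: Bool) {
            // Store the sample and trigger a redraw; drawing code elided.
            setNeedsDisplay()
        }
    }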
For an example of some ways to take advantage of these APIs in your app, see the sample project TouchCanvas:
Using UITouch efficiently and effectively. To learn how to add 3D Touch segues to your views, see Adding 3D
Touch Segues.
Addendum:
Judging from the screenshots on Apple's site, the pencil tip appears to retract slightly over a short distance. I'm not sure whether the rumored pressure-sensitivity levels refer to this mechanism, or whether the latest touchscreens themselves already recognize graded levels of pressure.
I'll add more after further investigation.