iOS 9.1 - Main Content: Apple Pencil API Introduction



The Beautiful Life of Sun Vulcan (http://blog.csdn.net/opengl_es)

This article is published under the "Attribution-NonCommercial-ShareAlike" Creative Commons license.

Reprints must keep this sentence: Sun Vulcan's Beautiful Life is a blog focused on agile development and research into mobile and IoT devices: iOS, Android, HTML5, Arduino, pcDuino. Otherwise, articles from this blog may not be reprinted or reproduced. Thank you for your cooperation.



=======================================

iOS 9.1


Live Photos

Support for Apple Pencil

=======================================


This article summarizes the key developer-related features introduced in iOS 9.1, which runs on currently shipping iOS devices. The article also lists the documents that describe the new features in more detail. For late-breaking news and information on known issues, see iOS 9.1 Release Notes. For the complete list of new APIs added in iOS 9.1, see iOS 9.1 API Diffs. For more information on the new devices, see iOS Device Compatibility Reference.



Live Photos


Live Photos is a feature of iOS 9 that allows users to capture and relive their favorite moments with richer context than traditional photos. When the user presses the shutter button, the Camera app captures much more content along with the regular photo, including audio and additional frames before and after the photo. When browsing through these photos, users can interact with them and play back all the captured content, making the photos come to life.

iOS 9.1 introduces APIs that allow apps to incorporate playback of Live Photos, as well as export the data for sharing. The Photos framework includes support for fetching a PHLivePhoto object from the PHImageManager object; the PHLivePhoto object represents all the data that comprises a Live Photo. You can use a PHLivePhotoView object (defined in the PhotosUI framework) to display the contents of a Live Photo. The PHLivePhotoView view takes care of displaying the image, handling all user interaction, and applying the visual treatments to play back the content.
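As a rough Swift sketch of how these pieces fit together (it assumes photo-library access is already authorized and that `asset` is a PHAsset backed by a Live Photo; the function name is illustrative, not an Apple API):

import Photos
import PhotosUI
import UIKit

// Sketch: fetch a PHLivePhoto for an existing asset and display it.
func showLivePhoto(for asset: PHAsset, in livePhotoView: PHLivePhotoView) {
    let options = PHLivePhotoRequestOptions()
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true   // allow an iCloud download if needed

    PHImageManager.default().requestLivePhoto(
        for: asset,
        targetSize: livePhotoView.bounds.size,
        contentMode: .aspectFill,
        options: options
    ) { livePhoto, _ in
        // The handler can run more than once (a degraded version may arrive
        // first); the final call carries the full-quality Live Photo.
        guard let livePhoto = livePhoto else { return }
        DispatchQueue.main.async {
            livePhotoView.livePhoto = livePhoto
            // PHLivePhotoView displays the still image, handles all user
            // interaction, and applies the playback visual treatment itself.
        }
    }
}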
You can also use PHAssetResource to access the data of a PHLivePhoto object for sharing purposes. You can request a PHLivePhoto object for an asset in the user's photo library by using PHImageManager or UIImagePickerController. If you have a sharing extension, you can also get PHLivePhoto objects by using NSItemProvider. On the receiving side of a share, you can recreate a PHLivePhoto object from the set of files originally exported by the sender.
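A minimal sketch of the export side might look like the following (the function name and flow are illustrative): it writes each PHAssetResource of a Live Photo asset to the temporary directory so the resulting files can be uploaded together as a unit.

import Photos

// Sketch: export the set of files that make up a Live Photo so they can be
// uploaded together. Assumes `asset` is a PHAsset backed by a Live Photo.
func exportLivePhotoResources(of asset: PHAsset,
                              completion: @escaping ([URL]) -> Void) {
    let resources = PHAssetResource.assetResources(for: asset)
    let destinationDir = URL(fileURLWithPath: NSTemporaryDirectory(), isDirectory: true)
    var exportedURLs: [URL] = []
    let group = DispatchGroup()

    for resource in resources {
        let fileURL = destinationDir.appendingPathComponent(resource.originalFilename)
        let options = PHAssetResourceRequestOptions()
        options.isNetworkAccessAllowed = true

        group.enter()
        PHAssetResourceManager.default().writeData(for: resource,
                                                   toFile: fileURL,
                                                   options: options) { error in
            DispatchQueue.main.async {
                if error == nil { exportedURLs.append(fileURL) }
                group.leave()
            }
        }
    }

    // The files must stay together: uploading only part of the set means the
    // receiver cannot rebuild a valid PHLivePhoto from them.
    group.notify(queue: .main) { completion(exportedURLs) }
}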
The data of a Live Photo is exported as a set of files in a PHAssetResource object. The set of files must be preserved as a unit when you upload them to a server. When you rebuild a PHLivePhoto with these files on the receiver side, the files are validated; loading fails if the files don't come from the same asset.
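On the receiving side, a rebuild might look roughly like the sketch below, where `fileURLs` is assumed to contain the downloaded files from a single Live Photo; if the files don't belong to the same asset, the request hands back nil.

import Photos
import UIKit

// Sketch: reconstruct a PHLivePhoto from the files originally exported by
// the sender. `fileURLs` is assumed to hold the paired still image and video.
func loadReceivedLivePhoto(from fileURLs: [URL],
                           placeholder: UIImage?,
                           targetSize: CGSize,
                           completion: @escaping (PHLivePhoto?) -> Void) {
    PHLivePhoto.request(withResourceFileURLs: fileURLs,
                        placeholderImage: placeholder,   // shown while loading
                        targetSize: targetSize,          // CGSize.zero requests the original size
                        contentMode: .aspectFit) { livePhoto, _ in
        // The handler may run more than once (a degraded preview can arrive
        // first). A nil livePhoto means validation failed, for example because
        // the files come from different assets.
        completion(livePhoto)
    }
}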
To learn how to give users a great experience with Live Photos in your app, see Live Photos.



Support for Apple Pencil


iOS 9.1 introduces APIs that help you use the coalesced and predictive touches that can be produced by Apple Pencil on supported devices. Specifically, the UITouch class includes the following (a short Swift sketch using these additions appears after the list):


The preciseLocationInView: and precisePreviousLocationInView: methods, which give you the precise location for a touch (when available)

The altitudeAngle property and the azimuthAngleInView: and azimuthUnitVectorInView: methods, which help you determine the altitude and azimuth angles of the stylus

The estimatedProperties and estimatedPropertiesExpectingUpdates properties, which help you prepare to update touches whose properties are estimated

The UITouchTypeStylus constant, which is used to represent a touch received from a stylus
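The sketch below shows one way these additions might be used from Swift in a drawing view (the class and the addSample helper are hypothetical names, not part of UIKit): it reads the precise location, altitude, and azimuth of each coalesced sample, and draws predicted touches provisionally to reduce perceived latency.

import UIKit

// Sketch: a drawing view that uses the UITouch additions from iOS 9.1.
class StylusCanvasView: UIView {

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, let event = event else { return }

        // Apply stylus-specific handling only to Apple Pencil touches.
        guard touch.type == .stylus else { return }

        // Coalesced touches: the high-frequency samples reported between
        // display refreshes, read with the precise location when available.
        for coalesced in event.coalescedTouches(for: touch) ?? [touch] {
            addSample(at: coalesced.preciseLocation(in: self),
                      altitude: coalesced.altitudeAngle,
                      azimuth: coalesced.azimuthAngle(in: self),
                      isPredicted: false)
        }

        // Predicted touches: estimated future samples; draw them provisionally
        // and replace them with real samples on the next frame.
        for predicted in event.predictedTouches(for: touch) ?? [] {
            addSample(at: predicted.preciseLocation(in: self),
                      altitude: predicted.altitudeAngle,
                      azimuth: predicted.azimuthAngle(in: self),
                      isPredicted: true)
        }
    }

    private func addSample(at point: CGPoint,
                           altitude: CGFloat,
                           azimuth: CGFloat,
                           isPredicted: Bool) {
        // Hypothetical placeholder: append the sample to your stroke model
        // here, then mark the view as needing display.
        setNeedsDisplay()
    }
}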


For an example of some ways to take advantage of these APIs in your app, see the sample project TouchCanvas: Using UITouch efficiently and effectively. To learn how to add 3D Touch segues to your views, see Adding 3D Touch Segues.




Addendum:

From the official website, the Apple Pencil tip appears to retract slightly over a short distance. It is not clear whether the much-discussed pressure sensitivity refers to this, or whether the latest touch screen itself can detect graduated pressure levels.

I will research this further and update this post.




