1. Overview
Apple opened the SiriKit interface to third-party applications in iOS 10, and QQ was among the first apps to adapt Siri's messaging and calling features. This means you can now ask Siri directly in iOS 10 to send a QQ message or place a QQ call for you. Sounds cool, doesn't it?
So what is it like for a third-party application to use Siri? Which applications can access SiriKit, and what does integration involve? This article answers these questions.
Figure 1: Sending a QQ message with Siri
2. Introduction to SiriKit
We all know Siri as the intelligent voice assistant on the iPhone, so what is SiriKit? SiriKit is the framework Apple provides for third-party applications to support Siri. The official documentation divides Siri's voice support into scenario-based domains; the domains SiriKit currently supports include VoIP calling, messaging, money transfer (payments), photo search, ride booking, CarPlay, and restaurant reservations. In other words, if your application offers one of these features, you can consider hooking it up to SiriKit.
When implementing SiriKit features we do not need to do any speech recognition ourselves; that is handled by Siri. Once Siri recognizes the speech, it abstracts the action the speech asks for into an intent object. Our integration work mainly consists of handling these intent objects and involves no natural language processing (NLP) at all.
There are already some articles online about SiriKit development, and you can also refer to Apple's official SiriKit Programming Guide; this article focuses on the experience of integrating SiriKit into QQ.
Figure 2: How SiriKit works
3. SiriKit Integration
To implement SiriKit support, you need to add an Intents extension target to your Xcode project. Like other extensions, the Intents extension is a plug-in that runs in its own process, independent of the app, and is mainly used to process and confirm intent requests from Siri. If you want Siri to show custom interfaces for your app's intents, you also need to add an Intents UI extension target, which is another standalone plug-in (so fully supporting SiriKit actually requires two extra targets, which is a bit of a pain). For extension development in general, see Apple's App Extension Programming Guide.
We will use QQ's messaging feature as an example to walk through the SiriKit integration process:
First, configure the Siri intents you need to support in the Intents extension's Info.plist file by adding INSendMessageIntent to the IntentsSupported array. If a feature should be disabled while the screen is locked, also add its intent to the IntentsRestrictedWhileLocked array, as shown in Figure 3; a sketch of the resulting plist fragment follows the figure.
Figure 3: Intents extension Info.plist configuration
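For reference, the relevant fragment of the Intents extension's Info.plist might look like the following sketch. Whether an intent belongs in IntentsRestrictedWhileLocked is a product decision, and the principal class name here is hypothetical:

```xml
<key>NSExtension</key>
<dict>
    <key>NSExtensionAttributes</key>
    <dict>
        <!-- Intents this extension can handle -->
        <key>IntentsSupported</key>
        <array>
            <string>INSendMessageIntent</string>
        </array>
        <!-- Intents that require the device to be unlocked (optional) -->
        <key>IntentsRestrictedWhileLocked</key>
        <array>
            <string>INSendMessageIntent</string>
        </array>
    </dict>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.intents-service</string>
    <key>NSExtensionPrincipalClass</key>
    <string>$(PRODUCT_MODULE_NAME).IntentHandler</string>
</dict>
```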
SiriKit integration splits into two parts, the Intents extension and the Intents UI extension, which are introduced separately below.
Intents Extension
When we say "Tell Wang Yizhan hello with QQ", the speech recognition is done automatically by Siri, which displays the recognized text in its own interface. As shown in Figure 4, a complete messaging utterance consists of four main parts:
Application name: tells Siri which app to use. Siri recognizes the app by its bundle display name automatically; no extra registration is needed.
Send-message intent: tells Siri to use the messaging feature. We observed that phrases like "send a message" are also recognized, but Apple's documentation does not spell out exactly which words map to the message intent.
Message recipient: tells Siri who will receive the message; "Wang Yizhan" is the nickname of one of my QQ friends.
Message content: tells Siri what to send; here the message is "I'm angry."
Figure 4: The send-message confirmation interface
The application name and the intent are mandatory; without them Siri cannot abstract your "intent". If the last two items are missing, our implementation can either ask the user to provide the missing data or ignore it. Once recognition completes, Siri abstracts the message content and recipient into an INSendMessageIntent and passes it to QQ's Intents extension.
We can also see from Figure 4 that Siri accurately picked my QQ friend "Wang Yizhan" out of my speech, yet "Wang Yizhan" is not a common phrase, so how does that work? The trick is that while QQ runs, it uploads all the user's friend nicknames to Siri's cloud, so Siri can recognize app-specific phrases for a specific user; for details see INVocabulary's setVocabularyStrings:ofType: method.
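A minimal Swift sketch of that sync, assuming a `nicknames` array obtained from the app's own friend list (note that INVocabulary must be called from the main app, not from the extension):

```swift
import Intents

// Push the current user's friend nicknames to Siri so that app-specific
// phrases such as "Wang Yizhan" can be recognized as message recipients.
// `nicknames` is a hypothetical stand-in for the real QQ friend list.
func syncFriendNicknames(_ nicknames: [String]) {
    // The set is ordered: put the most important names first.
    INVocabulary.shared().setVocabularyStrings(NSOrderedSet(array: nicknames),
                                               of: .contactName)
}
```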
Each domain feature maps to corresponding intents in Siri, and each intent has a specific handler protocol. For sending messages, these are INSendMessageIntent and INSendMessageIntentHandling respectively. As long as we implement the relevant methods of the INSendMessageIntentHandling protocol, Siri will use our INSendMessageIntentHandling object to process the message request once it has parsed an INSendMessageIntent. The overall flow is shown in Figure 5, and the extension's entry point is sketched after the figure:
Figure 5: Flow of sending a QQ message through Siri
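Before walking through the four steps, the extension needs an entry point that routes each parsed intent to a handler object. A minimal sketch modeled on the Xcode template, with SendMessageHandler as a hypothetical class fleshed out in the steps below:

```swift
import Intents

// Principal class of the Intents extension: Siri hands each parsed intent
// to handler(for:), which returns an object implementing the matching
// handling protocol (INSendMessageIntentHandling for message intents).
class IntentHandler: INExtension {

    override func handler(for intent: INIntent) -> Any {
        if intent is INSendMessageIntent {
            return SendMessageHandler()
        }
        return self   // fall back to the default for anything else
    }
}
```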
1) resolveRecipientsForSendMessage
Processes and confirms the recipient name that Siri extracts from the intent, for example checking whether the name is in the current QQ buddy list, and feeds a resolution result back to Siri. The resolution result represents the outcome of the application's processing of the intent; Table 1 lists several possible resolution results for outgoing messages, and a sketch of the method follows the table.
Table 1: Possible resolution results for message recipients
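A rough Swift sketch of this step. The Swift method names shown here follow Apple's current documentation and have shifted slightly across SDK versions; findFriends(named:) is an invented stand-in for a buddy-list lookup:

```swift
import Intents

// Hypothetical handler for message intents, referenced by IntentHandler above.
class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    func resolveRecipients(for intent: INSendMessageIntent,
                           with completion: @escaping ([INPersonResolutionResult]) -> Void) {
        guard let recipients = intent.recipients, !recipients.isEmpty else {
            // Siri heard no recipient at all: ask the user to supply one.
            completion([INPersonResolutionResult.needsValue()])
            return
        }
        completion(recipients.map { recipient -> INPersonResolutionResult in
            let matches = findFriends(named: recipient.displayName)
            switch matches.count {
            case 0:  return INPersonResolutionResult.unsupported()              // no such friend
            case 1:  return INPersonResolutionResult.success(with: matches[0])  // exact match
            default: return INPersonResolutionResult.disambiguation(with: matches) // let the user pick
            }
        })
    }

    func findFriends(named name: String) -> [INPerson] {
        return []   // stub: a real implementation would search the friend list
    }

    // resolveContent / confirm / handle are sketched in steps 2)–4) below.
}
```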
2) resolveContentForSendMessage
Analogous to recipient handling: in this method the message content recognized by Siri can be "decorated" before the resolution result is fed back to Siri. For example, QQ attaches a matching emoji for special words in some messages, such as "angry"; a sketch follows.
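Continuing the hypothetical SendMessageHandler from step 1, content resolution might look like this (attachEmoji(to:) is an invented helper standing in for QQ's emoji mapping):

```swift
import Intents

extension SendMessageHandler {

    func resolveContent(for intent: INSendMessageIntent,
                        with completion: @escaping (INStringResolutionResult) -> Void) {
        guard let text = intent.content, !text.isEmpty else {
            // Nothing to send yet: ask the user for the message body.
            completion(INStringResolutionResult.needsValue())
            return
        }
        // "Decorate" the recognized text before it is confirmed.
        completion(INStringResolutionResult.success(with: attachEmoji(to: text)))
    }

    func attachEmoji(to text: String) -> String {
        return text   // stub: a real implementation would map words like "angry" to emoji
    }
}
```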
3) confirmSendMessage
This method confirms whether the message should really be sent. You can do some verification work in this step, such as checking the login state, and then proceed with or cancel the send. Once confirmed, Siri shows the send-confirmation interface of Figure 4. If data from the containing app is needed here, it can be shared through the app group's shared container; a sketch of the step follows.
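A sketch of the confirmation step, with isLoggedIn standing in for a real check against state shared from the containing app via the app group container:

```swift
import Intents

extension SendMessageHandler {

    var isLoggedIn: Bool { return true }   // stub: read real login state from the shared container

    func confirm(intent: INSendMessageIntent,
                 completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // If verification fails, bounce the user to the app instead of sending.
        let code: INSendMessageIntentResponseCode =
            isLoggedIn ? .ready : .failureRequiringAppLaunch
        completion(INSendMessageIntentResponse(code: code, userActivity: nil))
    }
}
```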
4) handleSendMessage
As shown in Figure 4, when the user taps the "Send" button or confirms by voice, execution finally enters this method, where we implement the actual message-sending logic; after a successful send, Siri shows the message-sent interface of Figure 6.
Figure 6: The message-sent interface
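Finally, a sketch of the handling step; sendQQMessage(_:to:completion:) is an invented stand-in for QQ's real sending logic:

```swift
import Intents

extension SendMessageHandler {

    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Actually send the message, then report the outcome so Siri can
        // show the "sent" interface of Figure 6.
        sendQQMessage(intent.content ?? "", to: intent.recipients ?? []) { sent in
            let code: INSendMessageIntentResponseCode = sent ? .success : .failure
            completion(INSendMessageIntentResponse(code: code, userActivity: nil))
        }
    }

    func sendQQMessage(_ text: String, to recipients: [INPerson],
                       completion: @escaping (Bool) -> Void) {
        completion(true)   // stub: real sending logic goes here
    }
}
```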
Intents UI Extension
For intent types that support custom interfaces, you can provide a more polished custom interface in the Intents UI extension. Implementing the custom UI is relatively simple and works just like ordinary iOS app development, through UIViewController subclasses. We need to set the initial view controller (or a main storyboard) in the Intents UI extension's Info.plist; differentiated displays for different intent types are then achieved through child view controllers.
As shown in Figure 7, when a response is returned from the Intents extension, the system wakes the Intents UI extension and loads the initial view controller. The intent can be obtained in the configureWithInteraction:context:completion: method of the INUIHostedViewControlling protocol; in the messaging feature, for example, this method is called back once when the send is confirmed and once after it succeeds. Based on the type and state of the intent object, a customized display can be achieved by presenting the corresponding child view controller in each callback, as sketched below.
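A minimal sketch modeled on the Xcode template for an Intents UI extension; the branching logic is illustrative only:

```swift
import IntentsUI

// Entry point of the custom UI. configure(with:context:completion:) is
// called once per incoming INInteraction (e.g. at confirmation and again
// after a successful send); inspect the interaction to decide which child
// view controller to present.
class IntentViewController: UIViewController, INUIHostedViewControlling {

    func configure(with interaction: INInteraction,
                   context: INUIHostedViewContext,
                   completion: @escaping ((CGSize) -> Void)) {
        if interaction.intentHandlingStatus == .success {
            // e.g. embed a "message sent" child view controller
        } else {
            // e.g. embed a confirmation child view controller
        }
        completion(desiredSize)
    }

    var desiredSize: CGSize {
        // The largest size Siri allows the hosted view to occupy.
        return extensionContext!.hostedViewMaximumAllowedSize
    }
}
```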
Note that the Intents UI extension process does not necessarily exit after its interface is destroyed; it most likely just sleeps in the background and is woken the next time a response arrives.
Figure 7: Life cycle of the Intents UI extension
4. Summary
Overall, although Apple has opened SiriKit to only a limited set of scenarios this time, our adaptation experience shows how much importance Apple attaches to Siri. Moreover, since this is the first time the Siri interface has been opened to third-party applications, some problems are inevitable. We did encounter a number of bugs during development, most of which were resolved after we reported them to Apple, but speech recognition still has flaws, for example in SiriKit scenarios that mix Chinese and English. Hopefully Siri's support for Chinese will keep improving, and more scenarios will be opened up for third-party applications to adopt.