Download Xcode 8, and set up iOS 10 and Swift 3
(Optional) Building from the command line
Unless you want to build from the command line, using the Swift 3.0 toolchain doesn't require any changes to your project. If you do, open Xcode-beta, choose Xcode > Preferences from the menu bar at the top, then select Locations. At the bottom of that pane you'll see the "Command Line Tools" setting; select Xcode 8.0 there.
Now, navigate to your project's folder in Terminal and invoke the xcodebuild command to build the project.
(Optional) Migrating an existing Swift 2 app
If you want to add Siri support to a project already written in Swift 2.0, select the project, open Build Settings, and under Swift Compiler - Version set the Use Legacy Swift Language Version option to No. This will produce compiler errors that you can then fix one by one; updating your code this way is recommended, so that it keeps up with Swift's evolving syntax.
Getting started with SiriKit
First, in your app (or in a new single-view Swift template project), select the project file at the top of the navigator, then click the + button at the lower left of the targets list to add a new target.
In the window that appears, choose iOS > Application Extension, and then select Intents Extension.
This adds a new intent target to the project that will listen for Siri commands. Its Product Name should be similar to your project's name; for example, if your app is called MusicMatcher, you could name the intent MusicMatcherSiriIntent. Be sure to check the Include UI Extension option; we'll use it later, and this is the easiest way to add that extra extension.
The two targets we just created show up in the project's file hierarchy. Open IntentHandler.swift in the Intent folder and take a look at the sample code it contains. By default it provides example code that lets the user say commands such as "Start my workout using MusicMatcher", where MusicMatcher is the name of the app.
Running the sample app
This is a good time to build the code and try the commands on an actual iOS device. Go ahead and build the app target: choose MusicMatcher from the Scheme dropdown, select your device, and click Run.
You should see a blank app appear; in the background the extension is loaded into the device's system. Now press the Stop button to close the app.
Next, go back to your schemes, select the Intent target, and click Run.
A dialog will ask which app to attach to; choose the app you just ran: MusicMatcher. The app appears on the device again (still a blank app), but this time the debugger is attached to the Intents extension.
Now press the home button to return to the home screen. The app may even exit on its own, because what you are running is the Intents extension, not the app itself (this is not a crash!).
Enabling the extension
The extension is now installed, but as an iOS user you still have to enable it in the Siri settings. Open Settings on the test device, select the Siri menu, and you should see MusicMatcher in the list; switch it on to allow it to work with Siri.
Testing our first Siri command
Let's try a Siri command. Activate Siri by long-pressing the Home button or by saying "Hey Siri" (assuming you have the "Hey Siri" feature enabled).
Try a command such as "Start my workout using MusicMatcher".
"Sorry, you'll need to continue in the app."
If, like me, you ran into the error message "Sorry, you'll need to continue in the app." (for some reason this happens occasionally; what gives?), you may see something like the following in the console:
dyld: Library not loaded: @rpath/libswiftCoreLocation.dylib
  Referenced from: /private/var/containers/Bundle/Application/CC815FA3-EB04-4322-B2BB-8E3F960681A0/LockScreenWidgets.app/PlugIns/JQIntentWithUI.appex/JQIntentWithUI
  Reason: image not found
Program ended with exit code: 1
We need to link the CoreLocation framework in the project so that it gets included in our compiled Swift app.
Select the project root again and select the MusicMatcher target. Under General, find Linked Frameworks and Libraries. Click the + button and add CoreLocation.framework. Now build and run on the device again, and then build and run the intent target again, following the same steps as before.
Finally, activate Siri from the home screen.
"Hey Siri!"
"Start my workout using MusicMatcher"
Siri should respond with "OK. exercise started on MusicMatcher", and a UI should appear saying "Workout Started".
How does it work?
The IntentHandler class in the template adopts a long list of protocols.
First and foremost is INExtension, which is what lets the class act as an Intents extension in the first place. The rest are intent handler protocols whose callbacks the class can receive:
INStartWorkoutIntentHandling
INPauseWorkoutIntentHandling
INResumeWorkoutIntentHandling
INCancelWorkoutIntentHandling
INEndWorkoutIntentHandling
The first one, INStartWorkoutIntentHandling, is the one we just tested.
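Put together, the template's class declaration looks roughly like this (a sketch based on the protocol list above; the template of course also contains the corresponding method bodies):

import Intents

class IntentHandler: INExtension, INStartWorkoutIntentHandling, INPauseWorkoutIntentHandling,
                     INResumeWorkoutIntentHandling, INCancelWorkoutIntentHandling, INEndWorkoutIntentHandling {
    // The template's resolve/confirm/handle implementations for each workout intent live here.
}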
Command-click one of these protocol names and you'll see Apple's documentation:
/*!
 @brief Protocol to declare support for handling an INStartWorkoutIntent
 @abstract By implementing this protocol, a class can provide logic for resolving, confirming and handling the intent.
 @discussion The minimum requirement for an implementing class is that it should be able to handle the intent. The resolution and confirmation methods are optional. The handling method is always called last, after resolving and confirming the intent.
 */
In other words, this protocol tells SiriKit that we are prepared to handle the English sentence "Start my workout with AppName Here."
The exact phrasing will differ depending on the user's language, but the end goal is always to start a workout. The sample code already implements the handful of methods that the INStartWorkoutIntentHandling protocol calls. If you want to build a workout app, you can explore the rest on your own; for the remainder of this tutorial, though, I'll add a new intent handler for sending messages.
Adding a new message intent
Once you've confirmed the app runs correctly, let's continue and add a new intent type for sending messages. The documentation lists the following:
Send a message
Handler: INSendMessageIntentHandling protocol
Intent: INSendMessageIntent
Response: INSendMessageIntentResponse
We add the INSendMessageIntentHandling protocol to the class's protocol list in IntentHandler.swift. Since I don't actually want to use the workout intents, I'll delete them and leave only this one:
class IntentHandler: INExtension, INSendMessageIntentHandling {
    ...
If you build now it won't compile, because we still need to implement the methods required for conformance to the INSendMessageIntentHandling protocol.
If you want to check exactly which methods those are, just Command-click INSendMessageIntentHandling and look for the methods that are not preceded by the optional keyword.
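Abridged, the generated interface you'll see looks something like this (a sketch; only the distinction between the required and optional members matters here):

public protocol INSendMessageIntentHandling : NSObjectProtocol {

    // Required - note there is no `optional` keyword in front of it:
    public func handle(sendMessage intent: INSendMessageIntent, completion: (INSendMessageIntentResponse) -> Swift.Void)

    // Optional resolution and confirmation hooks:
    optional public func resolveContent(forSendMessage intent: INSendMessageIntent, with completion: (INStringResolutionResult) -> Swift.Void)
    // ... plus the other resolve methods and confirm(sendMessage:completion:) ...
}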
Here we find there is only one required method:
/*!
 @brief handling method
 @abstract Execute the task represented by the INSendMessageIntent that's passed in
 @discussion This method is called to actually execute the intent. The app must return a response for this intent.
 @param sendMessageIntent The input intent
 @param completion The response handling block takes a INSendMessageIntentResponse containing the details of the result of having executed the intent
 @see INSendMessageIntentResponse
 */
public func handle(sendMessage intent: INSendMessageIntent, completion: (INSendMessageIntentResponse) -> Swift.Void)
Conforming to the new message intent protocol
Back in the IntentHandler.swift file, add a MARK separator (MARK comments are very handy when navigating code with the jump bar):
// MARK: - INSendMessageIntentHandling
Below the MARK, let's implement the method. I've found Xcode 8 really helpful here: type the beginning of the method name, let autocomplete do the rest, and pick the matching method.
Inside the handler we need to create an INSendMessageIntentResponse and pass it to the completion closure. For now we'll assume every message is sent successfully, and return a success code along with a user activity in the INSendMessageIntentResponse, much like the default template implementation. We'll also add a print call so we know when the handler has been triggered by a Siri event:
func handle(sendMessage intent: INSendMessageIntent, completion: (INSendMessageIntentResponse) -> Void) {
    print("Message intent is being handled.")
    // Assume the send always succeeds and report success back to Siri.
    let userActivity = NSUserActivity(activityType: NSStringFromClass(INSendMessageIntent.self))
    let response = INSendMessageIntentResponse(code: .success, userActivity: userActivity)
    completion(response)
}
Adding the intent type to Info.plist
Before we can handle INSendMessageIntent, we need to add a few values to the Info.plist file; think of them as permissions for the app.
In the intent's Info.plist file, find and expand the NSExtension key, then NSExtensionAttributes, and then IntentsSupported. Add a new row for INSendMessageIntent to allow the app to handle message intents.
Testing the new intent
Now that the new intent is set up, let's test it. Remember, you must build and run the app on the device first, and then run the extension in order to debug it. If you don't, the extension either won't work or won't print logs to the Xcode console. Invoke the Siri intent and you should now see a new message window appear. For now it's empty, since we haven't written any logic for the app yet; we still need to implement the remaining callbacks and add some messaging logic to provide a better experience.
This tutorial was written on June 20th, 2016 using Xcode 8 Beta 1 and the Swift 3.0 toolchain.
This post is a follow-up in a multi-part SiriKit tutorial. If you have not read part 1 yet, I recommend starting there.
In order to make our Siri integration more useful, we can help fill out the content of our message using a callback method from the INSendMessageIntentHandling protocol. Investigating this protocol, you can see that these show up as optional methods:
resolveRecipients(forSendMessage intent: INSendMessageIntent, with completion: ([INPersonResolutionResult]) -> Swift.Void)
resolveContent(forSendMessage intent: INSendMessageIntent, with completion: (INStringResolutionResult) -> Swift.Void)
resolveGroupName(forSendMessage intent: INSendMessageIntent, with completion: (INStringResolutionResult) -> Swift.Void)
resolveServiceName(forSendMessage intent: INSendMessageIntent, with completion: (INStringResolutionResult) -> Swift.Void)
resolveSender(forSendMessage intent: INSendMessageIntent, with completion: (INPersonResolutionResult) -> Swift.Void)
So we can provide SiriKit with further information by implementing as many of these resolutions as we wish, effectively letting us supply information about the recipients, content, group name, service name, or sender. These should be relatively self-explanatory.
Let’s try providing some static data for our title and content, to demonstrate how resolutions work.
First, let's add the resolution for the content of the message, by implementing the resolveContent protocol method.
func resolveContent(forSendMessage intent: INSendMessageIntent, with completion: (INStringResolutionResult) -> Void) {
    let message = "My message body!"
    let response = INStringResolutionResult.success(with: message)
    completion(response)
}
Here we create a string resolution result, and call the success function. This is the simplest way to proceed, but we also have the option of returning a disambiguation, confirmationRequired, or unsupported response. We'll get to those later, but first let's actually use the data Siri is providing us.
Siri will send in its own transcription of our message in the intent object. We're interested in the content property, so let's take that and embed it inside of a string.
func resolveContent(forSendMessage intent: INSendMessageIntent, with completion: (INStringResolutionResult) -> Void) {
    let message = "Dictated text: \(intent.content!)"
    let response = INStringResolutionResult.success(with: message)
    completion(response)
}
The content property is an optional, and as such we need to make sure Siri actually provided a transcription. If no transcription was provided then a message won't be entirely useful, so we need to tell Siri that the information is missing and we need this value. We can do this by returning a resolution result calling the needsValue class method on INStringResolutionResult.
func resolveContent(forSendMessage intent: INSendMessageIntent, with completion: (INStringResolutionResult) -> Void) {
    if let content = intent.content {
        let message = "Dictated text: \(content)"
        let response = INStringResolutionResult.success(with: message)
        completion(response)
    } else {
        let response = INStringResolutionResult.needsValue()
        completion(response)
    }
}
Now SiriKit knows that when we try to send a message, the content value is a requirement. We should implement the same type of thing for the recipients. In this case, recipients can have multiple values, and we can look them up in a variety of ways. If you have a messaging app, you would need to take the INPerson intent object that is passed in and try to determine which of your own users the message is intended for.
This goes outside the scope of this Siri tutorial, so I'll leave it up to you to implement your own application logic for the resolveRecipients method. If you want to see an example implementation, Apple have released some sample code here.
We’ll be continuing to investigate iOS 10 and publish more free tutorials in the future. If you want to follow along be sure to subscribe to our newsletter and follow me on Twitter.
Thanks, Jameson