“Going back to Google’s earliest days we’ve always worked hard to create healthy open platforms,” said Scott Huffman, who leads the Assistant engineering team. “The Assistant will be our next thriving open ecosystem.” The first new program is called Actions on Google, which will launch in early December and will allow app makers to build Assistant into their services. The Embedded Assistant SDK, meanwhile, is coming in 2017 and will let third parties put Assistant in their own hardware.

Actions on Google

Google says there will be two different action types available to developers: Direct Actions and Conversation Actions. Both are fairly self-explanatory, but to be clear, Direct Actions are one-shot requests like “dim the lights,” “play my evening mood playlist,” or “play Narcos on Netflix” (coming soon). Conversation Actions, on the other hand, are for when you need a back-and-forth conversation with Assistant, such as making reservations, buying tickets, or ordering food. Google says a number of companies are already building Actions into their apps, including CNN, Foursquare, IFTTT, lly, LinkedIn, Netflix, Todoist, The Wall Street Journal, and WebMD. Google says that in the future, Actions on Google will work in three different “interfaces”: pure voice interactions, text-based conversations, and hybrids of the two.
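To make the distinction concrete, here is a minimal sketch of the kind of multi-turn fulfillment logic a Conversation Action implies. Everything in it is hypothetical: Google hadn't published the Actions on Google API at the time of the announcement, so this only illustrates the back-and-forth shape of an "ordering food" dialog, not any real interface.

```python
def handle_turn(state, utterance):
    """One turn of a hypothetical food-ordering Conversation Action.

    Unlike a Direct Action, which resolves in a single request, a
    Conversation Action has to carry state across turns. Here `state`
    is just a plain dict the developer threads between calls.
    Returns (assistant_reply, next_state).
    """
    step = state.get("step", "start")
    if step == "start":
        return "Sure, what would you like to order?", {"step": "item"}
    if step == "item":
        # Remember what the user asked for, then ask a follow-up question.
        return f"How many orders of {utterance}?", {"step": "qty", "item": utterance}
    if step == "qty":
        reply = f"Placing an order for {utterance} x {state['item']}. Anything else?"
        return reply, {**state, "step": "done", "qty": utterance}
    return "Thanks, your order is in!", state


# A sample back-and-forth:
reply, state = handle_turn({}, "order food")   # asks what to order
reply, state = handle_turn(state, "pad thai")  # asks for a quantity
reply, state = handle_turn(state, "2")         # confirms the order
```

The point is simply that a Conversation Action is a small dialog state machine, where each user utterance advances the exchange, rather than a single command that maps directly to one result.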

Embedded SDK

Huffman didn’t say much about the Embedded Assistant SDK, but from what he did say, it sounds like the SDK will be wide open. “We imagine a future where the Assistant will be able to help in any context, from any kind of device,” Huffman said. “Whether you’re tinkering with a Raspberry Pi in your basement, or you’re building a mass-market consumer device, you’ll be able to integrate the Assistant right into what you make.”