Amazon.com Inc’s chief technology officer is working toward a day when people can control almost any piece of software with their voice.
The company on Wednesday rolled out the technology powering Alexa, its voice assistant that competes with Apple Inc’s Siri, to developers so they can build chat features into their own apps, CTO Werner Vogels said in an interview.
The service, Amazon Lex, had been in a preview phase since late 2016.
The move underscores how Amazon is racing to be the top player in voice-controlled computing, after losing out in mobile to Apple and Alphabet Inc’s Google.
Vogels said that Amazon’s advances in processing how humans write and speak would make conversational assistants, or “chatbots,” more helpful than the clunky tools they have been in the past.
“There’s massive acceleration happening here,” he said before speaking at Amazon’s cloud-computing summit in San Francisco. “The cool thing about having this running as a service in the cloud instead of in your own data center or on your own desktop is that we can continuously make Lex better with the help of the millions of customers that are using it.”
Processing vast quantities of data is key to artificial intelligence, which lets voice assistants decode speech. Amazon will take the text and recordings people send to apps to train Lex – as well as Alexa – to understand more queries.
That could help Amazon catch up in data collection. Popular as Amazon’s Alexa-powered devices such as Echo speakers are, the company has sold an estimated 10 million or more of them; Apple has sold hundreds of millions of iPhones and other devices with Siri.
Vogels said people use Alexa for many tasks, from helping them cook to playing music, while they talk to assistants on their phones in fewer scenarios, such as while driving a car.
As with other cloud-based services, Amazon will charge developers based on how many text or voice requests Lex processes.
Still, the biggest payoff may come from e-commerce, which has already attracted many companies to build chatbots. Amazon has begun offering Alexa-only shopping deals to encourage purchases by voice, and Facebook Inc this week said its virtual assistant, called M, can help users order food from delivery.com.
“Voice is a big part of the computer interface of the future,” said Gene Munster, a veteran equity analyst and now head of research at Loup Ventures.
“Whoever owns voice will be the gateway of commerce,” he added.
Wells Fargo & Company is testing a “chatbot,” an automated program that can communicate with the bank’s customers on Facebook’s messaging platform to give them information on their accounts and help them reset their passwords.
The US bank said on Tuesday that it is piloting the virtual assistant with several hundred employees, and plans to extend testing to a few thousand customers later this spring.
Wells Fargo’s chatbot will use artificial intelligence to respond to natural-language messages from users, such as questions about how much money they have in their accounts or where the nearest bank ATM is.
Chatbots have risen in popularity in finance and other industries over the past few months because recent improvements in artificial intelligence have made them better at interpreting and responding to human language.
Banks and other financial firms are hopeful that chatbots can provide better, round-the-clock customer service at a fraction of the cost of large call centers staffed by humans.
French bank Societe Generale, for example, recently revealed that it was working with start-up Personetics Technologies to develop chatbots that could answer queries about equity funds in its Romanian banking unit.
Wells Fargo’s chatbot, which does not yet have a name, comes as the bank is investing heavily in the development of AI-based technology.
Facebook opened up its Messenger app to developers to create chatbots in April 2016 in a bid to expand its reach in customer service and enterprise transactions.