A few years ago, asking an AI assistant to retrieve sensitive financial information on behalf of a bank customer would have seemed far-fetched. However, this is the vision Scott Totman, head of mobile technology, payments and innovation at Capital One, had for the US bank.

Earlier this year, Capital One created a virtual assistant application - or ‘skill’ - for Amazon’s Alexa platform. The voice-activated service allows customers to use voice commands to perform actions such as tracking recent transactions on credit and current accounts, checking balances and even paying bills.

Speaking at AWS re:Invent in Las Vegas this week, Totman revealed how the company created the skill.

The project started with face-to-face consumer research sessions to work out whether customers would be willing to bank by voice with Alexa.

Totman found customers had a common set of concerns that would need to be addressed before the service launched. These included account security, a lack of physical privacy and sensitivity around sharing data with third parties.

Capital One realised early on that security would be integral to customer confidence in the platform. The skill is one of the first to use the open OAuth standard, so no banking credentials are passed between Capital One and Amazon. Furthermore, the initial linking of accounts isn’t a simple mobile integration but a two-step verification in which a secret phrase is sent to the user’s mobile device to authenticate the link.
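
To make the flow concrete, the sketch below shows roughly how an OAuth-style account linking step with an extra out-of-band phrase could work. All of the names and data structures here are hypothetical illustrations; Capital One’s actual implementation is not public.

```python
# Minimal sketch of OAuth-style account linking with an extra verification step.
# Function and class names (send_sms, PendingLink, etc.) are assumptions for
# illustration only, not Capital One's real design.
import secrets

PENDING = {}          # auth_code -> PendingLink
ACCESS_TOKENS = {}    # opaque token -> customer_id

class PendingLink:
    def __init__(self, customer_id, phrase):
        self.customer_id = customer_id
        self.phrase = phrase              # secret phrase sent to the user's phone
        self.phrase_confirmed = False

def start_link(customer_id, send_sms):
    """Step 1: the user authenticates with the bank; a secret phrase is texted to them."""
    code = secrets.token_urlsafe(16)      # OAuth authorization code
    phrase = secrets.token_hex(3)         # short one-time phrase
    PENDING[code] = PendingLink(customer_id, phrase)
    send_sms(customer_id, f"Your Alexa linking phrase: {phrase}")
    return code

def confirm_phrase(code, phrase):
    """Step 2: the user repeats the phrase, proving they hold the registered device."""
    link = PENDING.get(code)
    if link and secrets.compare_digest(link.phrase, phrase):
        link.phrase_confirmed = True
        return True
    return False

def exchange_code(code):
    """Token endpoint: Amazon only ever receives an opaque token, never bank credentials."""
    link = PENDING.pop(code, None)
    if not link or not link.phrase_confirmed:
        raise PermissionError("linking not verified")
    token = secrets.token_urlsafe(32)
    ACCESS_TOKENS[token] = link.customer_id
    return token
```

The point of the second step is that even a correct username and password isn’t enough on its own; the link is only completed once the phrase sent to the customer’s registered device has been confirmed.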

Totman identified two primary customer needs: the ability to perform a task without stopping what they are doing, and the ability to run basic financial health checks, what he calls “Am I OK?” questions.

He said the team developing the Alexa skill “had to balance security and convenience, so that people felt comfortable using it but not making it so inconvenient that you would just go on your mobile app and do it just as fast”.

David Isbitski, chief evangelist for Alexa and Echo, added: “Skills should provide an experience that is easier than using your phone,” otherwise people won’t use them.

Semantic search

Next, Totman and his small developer team had to work out how to communicate with customers using Alexa.

“Take ‘what’s left in my checking account?’. Is the person asking this affluent? Probably not, so we have to bear that in mind. While the Alexa platform, through the commercials, has been imbued with humour and levity, finances aren’t necessarily humorous to a lot of people. It’s stressful and personal, so we had to be sensitive to that,” Totman said.

At first Capital One wanted customers to state their account number or type in order to request a balance. However, “we found that people ask things in all sorts of ways,” Totman said.

Through its call centres, its user lab and feedback from the skill itself, Totman saw that people asked for information in many different ways, and the skill now handles more than 150 query types.

“Sometimes people don’t give us any information at all. They ask ‘what did I buy?’ and we have to figure out the time period and account, and help the user get some useful information without making them work for it,” Totman added.
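
Filling in the details a customer leaves out might look something like the sketch below: pick a sensible default account and time window when no slots arrive with the request. The intent structure and defaults are assumptions for illustration, not Capital One’s actual schema.

```python
# Rough sketch of defaulting missing "slots" for a vague query like "what did I buy?".
from datetime import date, timedelta

DEFAULT_LOOKBACK_DAYS = 7   # hypothetical default time window

def resolve_transaction_query(slots, accounts):
    """Return (account, start_date, end_date), falling back to defaults
    when the customer gave no account or time period."""
    account = slots.get("account") or accounts[0]     # assume the primary account
    end = slots.get("end_date") or date.today()
    start = slots.get("start_date") or end - timedelta(days=DEFAULT_LOOKBACK_DAYS)
    return account, start, end

# "What did I buy?" arrives with no slots at all:
account, start, end = resolve_transaction_query({}, ["checking", "credit card"])
print(f"Showing {account} purchases from {start} to {end}")
```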

Finally, voice brought a new set of challenges for user experience (UX) designers, because people don’t speak in a linear way. Totman explained the challenge: “There is a strong breadcrumb trail in web and mobile apps; you know where they came from and are going. With conversations they can go from asking for an account balance to wanting to pay a bill,” so it is important to work out context and understand when customers switch intent.
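
One simple way to handle that kind of intent switching is to track the current intent in session state and discard stale context whenever the customer changes topic. This is a minimal sketch under assumed intent names and session structure, not Capital One’s implementation.

```python
# Minimal sketch of detecting when a customer switches intent mid-conversation.
def handle_turn(intent, session):
    previous = session.get("current_intent")
    if previous and previous != intent:
        # Customer jumped topics (e.g. balance check -> bill payment):
        # drop half-collected context rather than reusing it.
        session.pop("pending_slots", None)
    session["current_intent"] = intent
    return session

session = {}
session = handle_turn("CheckBalanceIntent", session)
session["pending_slots"] = {"account": "checking"}
session = handle_turn("PayBillIntent", session)   # stale balance context is discarded
print(session)                                    # {'current_intent': 'PayBillIntent'}
```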

Avoiding the bad first date

Totman had some advice for fellow Alexa skill developers, which he likened to avoiding a bad first date: avoid awkward silences, forced conversation, a lack of personality and talking too much.

When it comes to silences: “Unlike a web page where you have progress bars, when you have a silence from an Echo it is ten times more painful than a webpage, so you have to have extremely responsive APIs,” Totman said.

Then, to avoid forced conversation, keep iterating the skill and responding to feedback so that every question can be answered, because “if you put in a feature that no one uses it doesn’t clutter up anything, it is just a question that no one asks. Just put it in,” Totman advised.

He added that looking for opportunities for humour and making sure responses aren’t too verbose are good guidelines when developing an Alexa skill.

What next?

Capital One is investigating how to offer customers more capabilities to perform core banking activities with Alexa, such as locking or activating credit cards without having to pick up the phone.

Totman said the Capital One developers are working on a skill which allows you to ask Alexa what happened last night. Totman demoed it on stage, and Alexa responded to the query with: “I don’t know what happened to you but I know what happened to your money”, followed by “make better choices, Scott.”

Explaining how they achieved this, Totman said: “It’s not that complicated behind the scenes. We grab transactions between 8pm and 3am and if it is over a threshold we shame you, and if it is lower we congratulate you.”
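
As described, the logic is simple enough to sketch: sum the late-night transactions and pick a response based on a threshold. The dollar threshold and data shapes below are assumptions for illustration only.

```python
# Sketch of the "what happened last night" logic Totman described:
# sum transactions between 8pm and 3am and respond based on a threshold.
from datetime import datetime

SHAME_THRESHOLD = 100.00   # hypothetical cut-off in dollars

def last_night_summary(transactions, name="Scott"):
    """transactions: list of (datetime, amount) tuples from the customer's account."""
    late_night = [
        amount for when, amount in transactions
        if when.hour >= 20 or when.hour < 3      # 8pm through 3am
    ]
    total = sum(late_night)
    if total > SHAME_THRESHOLD:
        return f"You spent ${total:.2f} last night. Make better choices, {name}."
    return f"Only ${total:.2f} last night. Nicely done, {name}."

txns = [(datetime(2016, 11, 30, 23, 15), 64.50), (datetime(2016, 12, 1, 1, 40), 48.75)]
print(last_night_summary(txns))
```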