At Foolproof we design experiences to improve the lives of millions, and as part of this we’re especially interested in the interaction between humans and technology. We’ve had our eye on Voice Recognition for some time, and on how this technology will affect the way consumers manage their personal finances in the future.
Using our voice to control the things around us isn’t new. Aural interaction and feedback have been common for people with accessibility needs for years, e.g. voice feedback on cash machines. We’re also used to engaging with Virtual Assistants like Siri and Cortana.
More recently we’ve invited these Virtual Assistants into our homes with the likes of Amazon Echo and Google Home. As we get more used to using our voice for basic commands like ‘what time is it?’ and ‘do I need an umbrella today?’, the next step is to use our voice to perform more complex tasks like banking.
The biggest challenge for financial services will be overcoming customer inertia and security concerns. Earlier this year Barclays, First Direct and HSBC deployed Voice Recognition into phone banking. In the US, Capital One customers can already carry out tasks like checking their account balance or making payments using Amazon Echo. Citibank is expected to follow suit in the near future.
Consumer behaviour and aural interfaces
Our research has uncovered some fundamental ways in which people approach and use aural interfaces.
Convenience is the most common reason for using voice over other modes of input
- It is quicker to say something than to type it.
- It is more natural to simply think something and verbalise it than it is to write it out.
Consumers expect their episode of interaction to be concise, and this has an impact on the types of questions they ask and the answers they expect.
- It is likely they will ask questions which do not require exhaustive answers.
In terms of aural interfaces and banking, we can add one point to the two above:
Security and privacy are paramount
- Both are extremely important for internet banking users regardless of channel. Given that this medium is new, it is critical to reassure users that their money is secure and their details kept private, safe from others who may also have access to the device.
We have observed how these behaviours influence the way people use aural interfaces for banking.
Consumers consider their social setting/context before using aural interfaces
They may not want others to hear the question they’re asking. A consumer might think:
- ‘Do I want others around me to know that I need to find out my balance right now…will they think I don’t have enough money if I have to check my bank balance before I pay for this?’
And they certainly may not want others to hear the answer to the question they have just asked.
A consumer might think:
- ‘Do I want my friends to hear my bank balance?’
Designing aural interfaces for banking
Such behaviours and expectations impact how we should approach designing interactions for this modality. We will briefly look at how the above behaviours might manifest themselves in a smart-home device such as the Amazon Echo or Google Home.
Consumers will most likely place these devices in areas where they spend most of their time, such as the kitchen or living room. Therefore, they need to be reassured that their sensitive banking information will be secure, and that no-one else can access it.
- Reassure users that they will only be able to access their bank account after they have successfully authenticated themselves (e.g. by voice recognition or by providing selected digits from a password).
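The authentication step can be sketched as a simple gate: the assistant refuses to read out anything sensitive until the challenge succeeds. This is a minimal illustration only; the names here (`Session`, `get_balance`) and the digit-check challenge are hypothetical, not a real banking or smart-speaker API.

```python
from dataclasses import dataclass

@dataclass
class Session:
    user_id: str
    authenticated: bool = False

def authenticate(session: Session, digits_given: str, digits_expected: str) -> bool:
    # A real system might use a voiceprint; here we check selected password digits.
    session.authenticated = (digits_given == digits_expected)
    return session.authenticated

def get_balance(session: Session, balances: dict) -> str:
    # Refuse to read out sensitive information until the user has authenticated.
    if not session.authenticated:
        return "Please authenticate first."
    return f"The balance of your current account is £{balances['current']}."
```

The key design point is that the gate lives in the skill itself, so every sensitive response passes through the same check.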
The types of questions users ask (and the responses they expect to get) will be influenced by context such as time and previous habits.
If a user asks about their bank account balance, they likely don’t want the balance of every account they have.
- Consumer question: “How much money is in my current account?”
- An appropriate system response: “The balance of Account A is X.” (Account A being the most frequently used account.)
- A poor-quality system response might be: “The balance on Account A is X, the balance on Account B is Y, the balance on Account C is Z.”
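A minimal sketch of this default, assuming the system tracks per-account usage counts (the account names and counts below are illustrative): answer with the most frequently used account only, rather than reading out every account.

```python
def balance_response(balances: dict, usage_counts: dict) -> str:
    # Keep the spoken answer concise: report only the most frequently
    # used account instead of listing every account the customer holds.
    most_used = max(usage_counts, key=usage_counts.get)
    return f"The balance of {most_used} is £{balances[most_used]}."
```

A fuller design would let the user follow up with "and my savings account?" rather than front-loading every balance into one long answer.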
Users could (and will) ask for the same information in a variety of different ways, and they may not use the correct terminology or product names.
For example, users might use a variety of the following questions to find out the same information, which is to see how much money they can spend:
- Consumer question: “How much money is in my account?”
- Consumer question: “What is the balance of my bank account?”
- Consumer question: “How close am I to my Credit Card limit?”
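One way to handle this variety is to map many phrasings onto a single intent. A production assistant would use a trained natural-language-understanding model; the keyword rules below are only a sketch of the idea, with an assumed intent name of `available_to_spend`.

```python
import re

# Illustrative only: real systems use statistical NLU, not keyword rules.
INTENT_PATTERNS = {
    "available_to_spend": [
        r"how much money is in my",
        r"balance of my",
        r"credit card limit",
    ],
}

def classify(utterance: str) -> str:
    # Match the utterance against each intent's patterns, first hit wins.
    text = utterance.lower()
    for intent, patterns in INTENT_PATTERNS.items():
        if any(re.search(p, text) for p in patterns):
            return intent
    return "unknown"
```

All three consumer questions above resolve to the same intent, even though none of them uses the bank's own product terminology.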
Users will expect systems to be intelligent enough to offer useful information that they did not ask for, but that would help them make good decisions.
If a regular transaction that will affect their ‘financial health’ and spending power, such as a direct debit, is coming up, they want to be told.
- User question: “How much money is in my current account?”
- System response: “There is £200 in Current Account A, but you have a direct debit for the gym of £50 coming out tomorrow.”
- System response: “There is £20 in Current Account A, but you have an overdraft facility of £500.”
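The direct-debit example above can be sketched as a response that volunteers imminent outgoings alongside the balance. The account name and the `(name, amount, when)` tuples are illustrative assumptions, not a real data model.

```python
def balance_with_context(balance: int, upcoming_debits: list) -> str:
    # Answer the balance question, then volunteer any imminent direct debits
    # that will reduce the customer's spending power.
    reply = f"There is £{balance} in Current Account A"
    notes = [
        f"you have a direct debit for the {name} of £{amount} coming out {when}"
        for name, amount, when in upcoming_debits
    ]
    if notes:
        reply += ", but " + " and ".join(notes)
    return reply + "."
```

When there are no upcoming debits the answer stays short, which keeps the interaction consistent with the expectation of concise responses.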
As consumers become more comfortable using their voice to engage with their devices, voice banking is sure to follow. The main things to keep in mind when designing for voice banking are:
- As always, security and privacy are paramount for every consumer, and users will expect checks and balances to be in place to ensure information is kept private and secure.
- Consumers consider their social setting/context before using aural interfaces as they may not want others to hear the question they’re asking, or the response given.
- Consumers most often use their voice for convenience. Therefore, questions will be short, and they will expect the device’s response to be concise.
- Systems need to be intelligent enough to offer useful information consumers did not ask for, but that would assist them in making a good decision.