I’ve touched on this before. Too many businesses assume that all their customers are the same. At the heart of the assumption is an over-developed sense of their own importance. Most consumer-facing businesses have a fan-base, a core of devoted users of their products and services. These are their visible customers; they act as brand ambassadors, tweeters and commentards. Their involvement with the business drives its product development as well as its marketing.
The danger is that these loyal, even fanatical, customers become the only focus for development. There is, of course, also a silent majority. Customers who buy the products, or use the services, but neither love them nor base their lives on them. They are not experts in navigating the product matrix; they may not use the website every day, or even every month; they may only dip their toes into the ecosystem. But they still spend money, and they are legion.
If you base your product development, and particularly your security decisions, on your core fan-base you risk alienating the quiet multitude. They are easily alienated, for they have no love for you; they will feel no emotional pain in choosing an alternative.
Why is this relevant to security? Because most businesses adopt a one-size-fits-all model of security. Since some people – the fan-base – put the business at the heart of their lives, the assumption is that everyone’s data must be protected like the Crown Jewels. If you use Apple for phones, computers, tablets, email, file data, music, photos and payment, then of course having your Apple ID compromised would be something like the end of the world. If your social life depends on your on-line gaming scores and your Steam library is worth more than your car, then of course Steam Guard makes sense. And so on, and so forth.
But if you have a phone which is a phone and email client, and you have a separate email service, and you pay with a credit card, and your video gaming is occasional and casual, and you don’t log in with Facebook or link your PayPal account to your bank account, then compromising most of your online identity does you little damage. My iCloud is empty. My PayPal balance is £0. I have a Windows phone; an Android tablet; a PC. The only card I use online has a derisory credit limit. My photos, data and music are inside my perimeter on old-fashioned spinning rust.
So most of the time, I don’t really care. I just want the convenience of the product, or the service, on the occasions I choose to use it. I don’t want to update it. I don’t need to be told about new features. I don’t want to have to jump through hoops, wait for emailed authorisation codes, remember a wildly complicated password and enter it three times, answer security questions or any of that. It’s inherently disposable; having the service improves my life, but losing it wouldn’t ruin it, and it holds no data that matters to me.
It’s simple. Users should be able to choose how secure they want to be.
Of course there are regulations to consider – services which retain my credit card details, for instance, have to be more secure, even if I don’t care about that card. But they should make it possible for me not to store a card, and in return let me lower the level of security.
Users would have to indemnify the service provider against losses, in case they were foolish enough to choose low security for things which mattered to them. Businesses would have to balance this against whatever duty of care their lawyers felt they owed their customers, but in the end we have to believe in individual responsibility, don’t we?
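To make the proposal concrete, here is a minimal sketch of what user-selectable security tiers might look like. Everything here is invented for illustration – the tier names, the account model and the rule that stored card details force a higher tier are assumptions, not any real provider’s scheme:

```python
from dataclasses import dataclass
from enum import Enum


class SecurityTier(Enum):
    MINIMAL = 1   # password only; user indemnifies the provider
    STANDARD = 2  # password plus emailed confirmation for sensitive actions
    MAXIMAL = 3   # two-factor authentication, device checks, the works


@dataclass
class Account:
    tier: SecurityTier
    stores_card: bool


def tier_allowed(account: Account) -> bool:
    """Regulatory floor: an account that retains card details
    may not sit on the minimal tier, however blasé its owner is."""
    if account.stores_card and account.tier is SecurityTier.MINIMAL:
        return False
    return True
```

The point of the `tier_allowed` check is that user choice operates above a regulatory floor: drop the stored card and the minimal tier opens up; keep it and the service is obliged to refuse.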
There is a real security principle behind this. In fact there are two:
- Over-securing is as bad as under-securing. If your security policies make life too difficult for users, they will work around them in order to get their jobs done. Their work-arounds will not only compromise the very security you’re trying to assure, but will also lower productivity.
- Security is a constant act of budget-juggling. The principle that the annual cost of mitigating risk should be lower than the annual loss expectancy from the unmitigated risk is a good one. The wider you spread the net, the less likely you are to adhere to this principle. When I teach data classification, the first point I make is that you should consider all of your information to be public unless you can prove otherwise. The business reflex is to do the opposite, but unless something would materially harm you if it appeared on the front page of the Daily Mail, it’s not actually confidential. The less confidential information you have, the better you can protect it.
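The budget-juggling principle above is the classic annual loss expectancy (ALE) rule: ALE is the single loss expectancy multiplied by the annual rate of occurrence, and a mitigation only pays for itself if it costs less per year than the ALE it removes. A quick sketch, with figures invented purely for illustration:

```python
def annual_loss_expectancy(single_loss_expectancy: float,
                           annual_rate_of_occurrence: float) -> float:
    """Classic risk arithmetic: ALE = SLE x ARO."""
    return single_loss_expectancy * annual_rate_of_occurrence


# Hypothetical numbers: a breach of a low-value account costs ~£200
# to clean up, and we expect one roughly every ten years.
ale = annual_loss_expectancy(200, 0.1)  # £20 per year

# A control costing £50 per user per year fails the test: it spends
# more annually than the unmitigated risk would lose.
mitigation_cost = 50
print(f"ALE = £{ale:.2f}; worth mitigating: {mitigation_cost < ale}")
```

The wider you spread the net – the more data you insist is confidential – the more risks you must price this way, and the harder it becomes to keep every mitigation on the right side of the inequality.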