Friday, 10 May 2013

Privacy learning from aviation safety...

I've written before about my belief that information should be treated like a dangerous chemical, or like a patient undergoing surgery:

privacy is a safety critical concern

Developing ideas borrowed from aviation and surgery, such as checklists, does improve the quality of work, and actually implementing them has been a very interesting experience. If anyone wants to learn about this, let me know; I'm very happy to talk about it.

Now that we have some parts of the "coalface work" working, i.e. real privacy engineering with real software engineers, software architects, programmers and code, we can reassess how we're managing privacy, and in turn how we're managing risk and improving the safety of our code, products, business and, ultimately, our customers from a privacy perspective.

One area that has succeeded very well is aviation, our original starting point for much of this safety discussion. Indeed, aviation safety precedes most other safety programmes by decades. I'd like to quote from the book Aviation Safety Programs (Richard Wood, 2nd edn 1997, Jeppesen):

Safety is not a moral problem or an ethical problem or a pain and suffering problem. It is an economic problem. Safety, by itself, generates a lot of sympathy, but very little action. Only the economics of safety generate action. In the entire history of safety, nothing good happened unless it was either economically beneficial or mandated by government.

Now simply replace "safety" with "privacy":

Privacy is not a moral problem or an ethical problem or a pain and suffering problem. It is an economic problem. Privacy, by itself, generates a lot of sympathy, but very little action. Only the economics of privacy generate action. In the entire history of privacy, nothing good happened unless it was either economically beneficial or mandated by government.

Interesting, don't you think? Ah, yes, we know privacy is an economic problem, and it certainly is being mandated by government. Adopting something like Privacy by Design today will not, by itself, change the quality of your product or your relationship with your customers. You'll get some sympathy, but ultimately does anything change? There's an outcry every time Facebook changes its privacy policies, but people still post private information there; similarly with Google and the rest. Privacy is talked about in moral and ethical terms: building trust with the customer.

Despite everything, Facebook hasn't lost 50%, 90% or even 1% of its customer base because of its perceived "anti-privacy" stance, and neither has Google, nor any other company I can think of. For privacy to generate action, it must be framed as an economic argument: if company X abuses its customers' data privacy, then it loses those customers and that business.

We have a lot to learn from a successful safety programme, and that's something I'm going to write about more in the coming days and weeks. How do we put a privacy programme together such that it really does benefit a company economically and addresses all the cross-cutting issues, from top management through to the programmer actually writing the code that runs the business?

First things first, though: when constructing such a programme, we need a group of people to lead the privacy culture within the company. That team needs day-to-day experience close to the coalface, and knowledge not just of the moral and ethical aspects but of the real science and engineering. Starting from there, we might just see where the economic benefits are won and lost.
