Lawrence Lessig stated that "code is law" - a maxim that, above all others, should guide software engineering, especially in the context of implementing privacy and security.
I want to talk about some issues that worry me slightly (ok, a lot!). The first is that, despite policies, laws and so on, the final implementation of anything related to privacy is in the code the programmers write. The second is that we are building our compliance programmes upon grand schemes and policies while paying piecemeal attention to the actual act of software engineering. The latter we attempt to wrap up in processes and "big ideas", for example Privacy by Design.
Now before the PbD people get too upset, there's nothing wrong with stating and enumerating your principles - the Agile Manifesto is a great example of this - however there is no doubt that many implementations of agile are poor at best and grossly negligent and destructive at worst. The term for the result is "technical debt".
Aside: the best people I've seen conduct software development in an agile manner are formal methods people...I guess due to the discipline and training in the fundamentals they've received. This also applies to experienced architects, engineers and programmers for whom much of this formality is second nature.
Addressing the first point: no matter how many policies or great consumer advocates or promises you make, at the end of the day privacy must be engineered into the architecture, design and code of your systems. It does not matter how many PowerPoint slides or policy documents or web pages you write - unless the programmers "get it", you can forget privacy, period!
Aside: Banning PowerPoint may not be such a bad idea....
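To make that first point concrete, here is a minimal, hypothetical sketch (the key handling and function names are invented for illustration) of what "engineering privacy into the code" can look like: the promise that only pseudonymous identifiers reach the analytics store is enforced at the point of collection, not in a policy document.

```python
import hashlib
import hmac

# Hypothetical example: a policy may say "analytics data is anonymised",
# but the promise only holds if the code enforces it. Here the raw email
# never reaches the analytics store; only a keyed pseudonym does.
PSEUDONYM_KEY = b"rotate-me-and-keep-me-out-of-source-control"

def pseudonymise(identifier: str) -> str:
    """Derive a stable pseudonym so events can be linked without storing the identifier."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

def record_event(email: str, action: str, store: list) -> None:
    # Data minimisation at the point of collection: drop what the policy
    # says we should not keep, instead of trying to filter it out later.
    store.append({"user": pseudonymise(email), "action": action})

events: list = []
record_event("alice@example.com", "viewed_pricing_page", events)
print(events)  # no raw email anywhere in the stored record
```

The point is not this particular technique; it is that the guarantee lives in the code path a programmer writes, not in a slide.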
Herein lies a problem: the very nature of privacy in your systems means that it crosscuts every aspect of your design and ultimately your whole information strategy, yet most of these concerns do not obviously manifest themselves in the design and code of your systems.
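As a hypothetical illustration of that crosscutting nature, consider something as mundane as logging: any module in the system can accidentally log personal data, so the control has to live in shared infrastructure rather than in any single component (the redaction pattern and logger name below are invented for illustration).

```python
import logging
import re

# A crosscutting concern: redaction sits in the logging pipeline itself,
# because *any* module in the system might log a value that turns out
# to be personal data.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

class RedactingFilter(logging.Filter):
    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = EMAIL_RE.sub("[redacted-email]", str(record.msg))
        return True

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("orders")
logger.addFilter(RedactingFilter())

# A developer three teams away writes this line without thinking about
# privacy; the crosscutting control catches it anyway.
logger.info("payment failed for bob@example.com, retrying")
```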
To solve this there must be a fundamental shift from the consumer advocacy and legal focus of privacy to a much deeper technical, engineering, even scientific approach. This does not just mean focusing on the design and code, though that is fundamental to the implementation, but addressing the whole stack of management and strategy, from the highest directors to the programmers.
I've seen efforts in this direction, but they stop at product management - "Hey, here are the privacy requirements - implement them!" This feels good, in that you are interacting, or believe you are interacting, with the products you are producing, but it still does not sufficiently involve the people who actually build them. Just producing requirements doesn't help: you need that interaction and communication right across the company.
Of course all of the above is extremely difficult, which leads us to the next point: how we build our compliance programmes in the first place. The simple question here is "are you fully inclusive?" - that is, do you include programmers and architects (technical people with everyday experience), or is the programme run by non-technical, or formerly technical, staff? Invariably it is the latter.
Compliance programmes must be inclusive, otherwise the inherent understanding required to successfully and sufficiently implement the ideas and strategies of that programme will be lost - usually in a sea of PowerPoint and policy documents.
Firstly, in order to achieve inherent privacy (or security, or anything else), the focus must lie on onboarding and educating the programmers, the designers and the architects, with less focus on management, prescription and consumer advocacy. Secondly, any compliance programme must be inclusive and understand the needs of said technical staff. Thirdly, the engineering and technical staff are the most critical components of your organisation.
Compliance programmes are often measured on the amount of documentation produced (the number of slides, even?), yet this ends up as a self-feeding process: for the compliance programme to survive, it needs to keep the fear of non-compliance at the fore. Read Jeff Jarvis' article Privacy Inc.: Scare and Sell and then Jim Adler's talk at PII2012 on The Emergent Privacy-Industrial Complex and you get an idea of what is going wrong. Avoid at all costs creating a privacy priesthood in your compliance programmes.
Aside: This might be good old-fashioned economics - if a compliance programme actually worked then there'd be no need for the programme in the end.
There are two interrelated caveats that also need to be discussed. The first is that any work on privacy will expose the flaws, holes and crosscutting issues across your products, development programmes and management and engineering skill bases. For example, a request to change a well-crafted design to cope with some misunderstood ambiguity in a privacy policy is going to end in tears for all concerned. It will demand of management, engineering and your compliance programme a much deeper [scientific] knowledge of what information your products are using, carrying, collecting and processing - to a degree uncommonly found in current practice.
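One hedged sketch of what that deeper knowledge might look like in practice (the field names and classifications below are invented): recording the classification of data alongside its definition, so the question "what personal data does this product carry?" can be answered from the code itself rather than reconstructed afterwards.

```python
from dataclasses import dataclass, field, fields

# Hypothetical sketch: tag each field with a data classification at the
# point of definition, so the data inventory lives next to the data model.
@dataclass
class CustomerRecord:
    customer_id: str = field(metadata={"classification": "pseudonymous"})
    email: str = field(metadata={"classification": "personal"})
    country: str = field(metadata={"classification": "personal"})
    plan: str = field(metadata={"classification": "non-personal"})

def personal_fields(cls) -> list:
    """List every field declared as carrying personal data."""
    return [f.name for f in fields(cls) if f.metadata.get("classification") == "personal"]

print(personal_fields(CustomerRecord))  # ['email', 'country']
```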
The fundamental knowledge required to really appreciate information management and privacy is extensive and complex. Awareness courses are a start, but I've seen precious few courses even attempting to cover the subject of privacy from a technical perspective.
Secondly, privacy will force you to examine your information strategy - or even create an information strategy - and ask very awkward and uncomfortable questions about your products and goals.