Blame the Street View data collection practices on a "more is more" engineering mindset. And rethink your notions about privacy for unencrypted Wi-Fi data.
Mathew J. Schwartz | May 01, 2012 10:50 AM
During a two-year period, Google captured oodles of Wi-Fi data worldwide as part of its Street View program. But why?
Blame the engineering ethos that's prevalent at high-technology companies like Google. You know the "more is more" mindset: more bells and whistles equals greater goodness.
But an unfiltered engineering mindset would help explain the apparent thinking behind the Street View wardriving program: "Well, if this Wi-Fi data is flying around and no one is encrypting it, what reasonable expectation do they have that it won't be sniffed and stored?"
This whole episode is starting to look like a search for a suitable scapegoat, whereas in reality the failures were multiple and in this instance happened to line up; cf. the Swiss-cheese model used in aircraft accident investigations.
I strongly doubt a "rogue engineer" was responsible; it takes many engineers to build and deploy these systems (consider the average size of an R&D team in most larger companies, and the process and personnel infrastructure surrounding it). Admittedly, it is possible that this code was added by a single engineer, but does Google really have such lax software engineering practices?
R&D teams are under pressure to collect as much information as possible from applications and services, or at least to have the ability to collect it, just in case. Mechanisms to capture Wi-Fi data and snoop on its contents are easily available and part of the usual networking toolkits; they need to be, otherwise the features you rely upon in your operating systems and elsewhere wouldn't work.
According to reports, the engineers asked legal for help, in which case we have two possible points of failure: the engineers might have been asking the wrong questions, and legal might not have understood or responded. The failure might also have been at the product or programme management level, which might have blocked these requests or overridden them. Or it might simply be that both engineering and legal acted correctly within the context of working at Google.
The statement in the InformationWeek article that "an unfiltered engineering mindset would help explain the apparent thinking" is extremely misleading: the failures are multiple and most likely don't primarily involve engineering at all. The real blame most likely lies elsewhere: product, perhaps, or programme management?
Also, even if Google has taken the so-called Privacy-by-Design ideas on board, their communication and implementation might be very different from what is intended. As a similar example, Agile methods come with a comparable set of principles, and their implementation is in many cases grossly misunderstood (at best).
While statements of principle like PbD or even the Agile Manifesto are fine, their implementation, both technically and within the culture of the company as a whole, is neglected at best and utterly misunderstood at worst. Herein lie the real problems... and, unfortunately, too many hard questions: changing a culture whilst maintaining maturity of method and process is hard.
I was also wondering about the article's title, "Engineering Trumped Privacy", and thought it both misleading and somewhat offensive (to engineering). However, the word "trumped" can be taken in a slightly different way, to mean that engineering exposed flaws in the overall implementation and understanding of privacy.
Finally, I do think the maxim "never attribute to malice what can be explained by stupidity" applies here. At the various levels within Google, I don't think there was any intention of doing anything "bad"; but for a company that tells everyone else not to be evil, malice does seem a logical choice, fuelled by whatever conspiracy theory you prefer.
Just as the pilot is "always" to blame in an aircraft crash, I wonder how Marcus Miller is feeling today...