According to a Google job listing and several stories this morning, Google is building an internal "Red Team" to address privacy and security issues. The job description reads:
As a Data Privacy Engineer at Google you will help ensure that our products are designed to the highest standards and are operated in a manner that protects the privacy of our users. Specifically, you will work as member of our Privacy Red Team to independently identify, research, and help resolve potential privacy risks across all of our products, services, and business processes in place today. Top candidates will have an intimate knowledge of the inner workings of modern web browsers and computer networks, enjoy analyzing software designs and implementations from both a privacy and security perspective, and will be recognized experts at discovering and prioritizing subtle, unusual, and emergent security flaws.
This is a positive move for Google on multiple levels. The new group will make it more likely that Google finds and addresses potential privacy and security issues and will help ensure that Google’s products operate in compliance with the company’s privacy policies.
The immediate problem I see is that the Red Team appears to be primarily or exclusively a software engineering team. It's not a policy-making or legal compliance group. Yet most of Google's privacy problems with regulators in Europe and the US (i.e., the FTC) have had little to do with software, security or technical issues. Rather, they concern policy decisions:
Search data retention at odds with European standards
Collecting WiFi data with Street View cars (and personal information at the same time)
Circumventing Safari default privacy settings
Consolidating privacy policies of multiple products into a single uber-privacy policy (the Europeans objected to the rapid and unilateral nature of the changes)
Launching new products (e.g., Buzz) without sufficient disclosures and user education
Each of these "scandals," challenges or issues had more to do with decisions that executives made than with software. Perhaps a Red Team would have caught some of the problems, for example the PII payload data being collected by Street View cars or the circumvention of Safari's default privacy settings.
But even if the Red Team had identified these issues internally, it's not clear that anything would have changed immediately absent outside pressure. Still, it's better to have a watchdog than not.
My view is that Google's Red Team needs at least a couple of members who can anticipate these larger "political" or policy problems, in addition to software engineers who can sniff out security holes.