IoT Privacy Forum founder Gilad Rosner features in the latest episode of O’Reilly’s Hardware Podcast, discussing his research for the British government, and the differences between European and US privacy cultures.

On his work used by the UK Parliament (paraphrased in parts):

That research came about when the UK government put out a call and said, we'd like to vacuum up a lot of social media data and analyze it for government purposes: "beneficial outcomes" rather than law enforcement. Trying to look at data and come up with new information that would theoretically be beneficial to society. They were wondering how they'd go about it — whether "public" social media posts could present ethical problems when inhaling all that data for analysis. The answer is: yes, there are ethical problems here, because even though information is set to "public", there's a concept of respecting the context in which the data was uploaded, or volunteered. When I tweet, I'm not necessarily expecting the government to mine that information about me.

When it comes to privacy and data protection, especially with a large actor like the government, one of the most important concerns is procedural safeguards. Governments have ideas all the time, often good ideas, but the apparatus of implementing these ideas is very large, bureaucratic, and diffuse. So what constrains these activities, to make sure they’re being done securely, in line with existing privacy regulations, and with people being sensitive to things not necessarily covered by regulation, but still potentially worrisome? How do we come up with ways of letting good ideas happen, but under control?

Academics Data Protection Law Policy Power Transparency

While waiting for a taxi in Barcelona, my friend, irritated that none appeared, told me that Uber had been made illegal in Spain. However, she was not 100% sure it was still illegal; perhaps something had changed. To answer this, she took the most direct route and opened Uber on her phone: Nope, we could not order transportation. I saw this moment as a wonderful inversion of that 90s internet tenet, 'code is law.' The main idea is that the architectures of electronic systems regulate behavior much as law does in the physical, social realm. In some of the more utopian and techno-deterministic formulations of this idea, the world of paper laws and hoary legislative debates would crumble before the might of the interweb and its meritocratic ways. Opening Uber on her phone to see if it could sell its wares in the Catalan capital was a wonderful reminder of how exaggerated the reports of regulation's demise have been. Academics love the word 'tension,' and companies like Uber and Airbnb cause mountains of it. They're such good case studies for regulation, perturbing existing regimes such as hotel taxation and taxi passenger safety, stepping on toes economic and political. The earliest discussions of the social impact of the internet invoked its borderless character and the inevitable clashes that would arise with national sovereignty. That tension is alive and well, visible in US alarmism over the coming EU General Data Protection Regulation, in the regulatory tussle over Uber and its so-called sharing economy kin, and in the recent invalidation of the Safe Harbor framework. Law may move slowly, but it still packs a punch.

Law Policy Power

Jay Stanley, Senior Policy Analyst for the ACLU Speech, Privacy & Technology Project, is doing a series of posts on the IoT. The first piece, ‘The Coming Power Struggles Over the “Internet of Things”,’ contemplates the extension of corporate power into more and more personal and intimate spaces. He begins with this example:

When I stick a movie into my DVD player and try to fast-forward through some of the annoying preliminaries that are at the front of the disc, I get an error message: “This operation prohibited by this disc” […] First of all, it’s not “the disc” that is prohibiting my attempt to forward through trumped up FBI warnings and Hollywood anti-piracy propaganda. It’s the Hollywood studios that have programmed my technology to prohibit my “operation.” […] The message is: “There’s no power play going on here, it’s just how the objective technology works!” What’s actually happening is the movie studios have decided to program technology (or pressure hardware manufacturers to do so) to take away your control over your time and what you watch, and force you to view content that they control, in order to advance their own interests. More broadly, this annoying little example highlights the power struggles we could face as computer chips come to saturate the world around us—the trend often called “the Internet of Things.”

It’s an interesting and important point (though I do wish it were a little less shrill). Questions of power inequity rarely surface in public discussions of data collection and system control, so I’m happy to see Stanley address it. His next piece, “The Internet of Kafkaesque Things,” is a thoughtful if rarefied discussion of the similarities and differences between computers and bureaucracies. Stanley worries that those similarities will transmit the inefficiencies and rabbit holes of bureaucracies into ever more personal spaces as devices become more connected:

The bottom line is that the danger is not just that … we will become increasingly subject to the micro-power of bureaucracies as computer chips saturate our lives. There is also the danger that the Kafkaesque absurdities and injustices that characterize bureaucracies will be amplified at those micro levels—and without even being leavened by any of the safety valves (formal or informal) that human bureaucracies often feature.

Both pieces are worth a read and I’m looking forward to the third piece in the series.

Intimacy Power