UC Berkeley’s Center for Long-Term Cybersecurity released “Cybersecurity Futures 2020,” a set of scenarios meant to spur conversations about the future of cybersecurity and related topics. Dr. Gilad Rosner, founder of the IoT Privacy Forum, was one of the contributors to the Intentional Internet of Things scenario, which provokes discussion with this image of the future:

“While the widespread adoption of IoT technologies may be predictable in 2016, the mechanism that will propel this shift is less so. In this scenario, government will intentionally drive IoT adoption to help societies combat recalcitrant large-scale problems in areas like education, the environment, public health, and personal well-being. This will be widely seen as beneficial, particularly as the technologies move quickly from being household novelties to tools for combating climate change and bolstering health. “Smart cities” will transition from hype to reality as urban areas adapt to the IoT with surprising speed. In this world, cybersecurity will fade as a separate area of interest; when digitally connected technologies are part of everyday life, their security is seen as inseparable from personal and national security. But while this world will offer fantastic benefits for public life and reinvigorate the role of governments, there will also be greater vulnerability as IoT technologies become more foundational to government functions and the collective good.”   (from: https://cltc.berkeley.edu/scenario/scenario-four/)

Main page: https://cltc.berkeley.edu/scenarios/

Intro and Executive Summary: https://cltc.berkeley.edu/files/2016/04/intro_04-27-04a_pages.pdf

Full report: https://cltc.berkeley.edu/files/2016/04/cltcReport_04-27-04a_pages.pdf

 

Academics Policy Privacy by Design Security

This Tuesday, Dr. Gilad Rosner, founder of the IoT Privacy Forum, will present a free one-hour webcast called Privacy, Society & the Internet of Things. It’s an exploration of the many meanings of ‘privacy,’ the privacy risks implied by a world of connected devices, and some of the frameworks emerging to address those risks. The webcast will be broadcast live at 10am PT / 1pm ET / 6pm GMT. Register for it here: http://www.oreilly.com/pub/e/3582

Conference Law Policy Privacy by Design

The Open Web Fellows program, a collaboration between the Ford Foundation and Mozilla, is an international leadership initiative that brings together technology talent and civil society organizations to advance and protect the open Web. Specifically, the goals of the Open Web Fellows program are to:

  • Increase public awareness and understanding of Internet policy issues
  • Support career paths in the Internet policy and advocacy sector
  • Celebrate and support the vibrant network of Internet advocacy organizations

Broadly speaking, we are looking for makers: those who see a problem in the world and can solve it through technology or media. The right candidate may bring specialties in design, development, storytelling, research, and policy analysis, and should be comfortable performing as a technologist, a tinkerer, and a curious contributor to the fellowship program. You do not need to have previous experience with Internet advocacy, policy, or activism.

Deadline: March 20, 2016

https://advocacy.mozilla.org/open-web-fellows/

Uncategorized

In this new article for O’Reilly, IoT Privacy Forum founder Dr. Gilad Rosner discusses how the IoT amplifies the problems of Notice & Choice and Consent. Academics and experts have long been aware that these mechanisms fail as privacy protection strategies, and some have called for them to be eliminated in favor of a controversial policy: letting businesses choose which data uses are and are not appropriate. The article examines the pros and cons of this approach with regard to connected devices.

https://www.oreilly.com/ideas/in-the-age-of-connected-devices-will-our-privacy-regulations-be-good-enough

Data Protection Law Policy

While waiting for a taxi in Barcelona, my friend, irritated that none had appeared, told me that Uber had been made illegal in Spain. She was not 100% sure it was still illegal, though; perhaps something had changed. To find out, she took the most direct route and opened Uber on her phone: nope, we could not order transportation. I saw this moment as a wonderful inversion of that 90s internet tenet, ‘code is law.’ The main idea is that the architectures of electronic systems regulate behavior much as law does in the physical, social realm. In some of the more utopian and techno-deterministic formulations of this idea, the world of paper laws and hoary legislative debates would crumble before the might of the interweb and its meritocratic ways. Opening Uber on her phone to see if it could sell its wares in the Catalan capital was a wonderful reminder that reports of regulation’s untimely demise have been greatly exaggerated.

Academics love the word ‘tension,’ and companies like Uber and Airbnb cause mountains of it. They make such good case studies for regulation, perturbing existing regimes such as hotel taxation and taxi passenger safety, stepping on toes economic and political. The earliest discussions of the social impact of the internet invoked its borderless character and the inevitable clashes that would arise with national sovereignty. That tension is alive and well, visible in US alarmism over the coming EU General Data Protection Regulation, in the regulatory tussle over Uber and its so-called sharing economy kin, and in the recent invalidation of the Safe Harbor framework. Law may move slowly, but it still packs a punch.

Law Policy Power

Data Protection Policy Privacy by Design Realpolitik

One of the debates in privacy is the continuing feasibility of Collection Limitation as a data protection principle. Historically, there was some basic tension with data retention pressures, as Steve Wilson noted: “Collection Limitation … can contradict the security or legal instinct to always retain as much data as possible, in case it comes in handy one day.” The IoT, broadly construed, adds a new pressure: the increasing ubiquity of sensor information.

Mobile phones, with all their sensors, have been a challenge to Collection Limitation for years. Consider the legions of apps that access all of the sensor data simply because they can. But that was one device; now multiply it by… choose your number du jour. The point is that the IoT (call it ubiquitous, pervasive, or contextual computing) is typified by enhanced monitoring. Collection Limitation is, simply put, the principle of gathering only the data you need for a particular application. The US, enormous market that it is, does not really enforce this principle. That’s unsurprising, as it mainly appears in ‘soft law’; that is, there are no sanctions to enforce it in the commercial world. It nominally exists in Europe, but there are very limited ways of enforcing it. How can this principle withstand the emergence of billions of all-seeing, all-hearing devices in the human environment?

In November of last year, two automotive trade bodies released a set of Vehicle Privacy Principles, written in conjunction with the law firm Hogan Lovells, with some assistance from the Future of Privacy Forum. I’ve written on these Principles before for O’Reilly Radar; I don’t like them because of their very weak consent provisions. Further, the Principles never mention the ability to kill all non-essential, non-driving-related sensors in the car, nor the ability to shut off location tracking. HERE is where I would like to see the Collection Limitation principle reassert itself, in combination with an improved consent posture. If a driver declines whatever shiny, amazing application in-car sensors would enable, and doesn’t want the car manufacturer, the dealership, and any partners to know where she or he is driving, the data should not be collected. Collection Limitation is meaningful here because the car is unique in its function and context: driving. And while the same data could likely be gathered from the driver’s phone, the phone could be off, location services disabled, what have you; it’s a separate consideration. If the car is a locus of sensors, a privacy-positive orientation would let the driver kill all non-essential sensing. This is also an argument in support of the continued existence of Collection Limitation.

Data Protection Policy

Behind the scenes Coalitions & Consortia

Jay Stanley, Senior Policy Analyst for the ACLU Speech, Privacy & Technology Project, is doing a series of posts on the IoT. The first piece, ‘The Coming Power Struggles Over the “Internet of Things”,’ contemplates the extension of corporate power into more and more personal and intimate spaces. He begins with this example:

When I stick a movie into my DVD player and try to fast-forward through some of the annoying preliminaries that are at the front of the disc, I get an error message: “This operation prohibited by this disc” […] First of all, it’s not “the disc” that is prohibiting my attempt to forward through trumped up FBI warnings and Hollywood anti-piracy propaganda. It’s the Hollywood studios that have programmed my technology to prohibit my “operation.” […] The message is: “There’s no power play going on here, it’s just how the objective technology works!” What’s actually happening is the movie studios have decided to program technology (or pressure hardware manufacturers to do so) to take away your control over your time and what you watch, and force you to view content that they control, in order to advance their own interests. More broadly, this annoying little example highlights the power struggles we could face as computer chips come to saturate the world around us—the trend often called “the Internet of Things.”

It’s an interesting and important point (though I do wish it were a little less shrill). Questions of power inequity rarely surface in public discussions of data collection and system control, so I’m happy to see Stanley address them. His next piece, “The Internet of Kafkaesque Things,” is a thoughtful if rarefied discussion of the similarities and differences between computers and bureaucracies. Stanley worries that those similarities will transmit the inefficiencies and rabbit holes of bureaucracies into ever more personal spaces as devices become more connected:

The bottom line is that the danger is not just that … we will become increasingly subject to the micro-power of bureaucracies as computer chips saturate our lives. There is also the danger that the Kafkaesque absurdities and injustices that characterize bureaucracies will be amplified at those micro levels—and without even being leavened by any of the safety valves (formal or informal) that human bureaucracies often feature.

Both pieces are worth a read and I’m looking forward to the third piece in the series.

Intimacy Power

Latest tweets are now displayed on the right side of the page. w00t. Little victories.

Behind the scenes