
Challenge.gov, the official hub for US federal tech and science competitions, has unveiled the Privacy Policy Snapshot Challenge, with a top prize of $20,000 (runners-up: $10,000 and $5,000).

Submissions are being accepted until April 10, 2017.

In their own words, they

call for designers, developers, and health data privacy experts to create an online Model Privacy Notice (MPN) generator. The MPN is a voluntary, openly available resource designed to help health technology developers who collect digital health data clearly convey information about their privacy and security policies to their users. Similar to a nutrition facts label, the MPN provides a snapshot of a product’s existing privacy practices, encouraging transparency and helping consumers make informed choices when selecting products. The MPN does not mandate specific policies or substitute for more comprehensive or detailed privacy policies.

An effective MPN would have to simplify nuanced information about multiple stakeholders’ complex data collection, retention, sharing, and usage practices. It must also prioritize a variety of objective facts about devices, their documentation, and how consent is obtained. Crucially, it should anticipate ways in which manufacturers might attempt to game the evaluation system, and mitigate those possibilities.
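
To make the idea concrete, here is a minimal, hypothetical sketch in TypeScript of the kind of structure a generator might work from, rendered as a plain-text, nutrition-label-style snapshot. The field names and the "ExampleFit" device are my own assumptions for illustration, not the official MPN fields.

```typescript
// Hypothetical sketch of a structured Model Privacy Notice and a plain-text
// "snapshot" renderer. The field names below are illustrative assumptions,
// not the official MPN template.

interface ModelPrivacyNotice {
  productName: string;
  dataCollected: string[];     // e.g. ["heart rate", "sleep patterns"]
  sharedWith: string[];        // third parties that receive the data
  retentionPeriod: string;     // e.g. "until account deletion"
  consentMethod: string;       // how user consent is obtained
  encryptionAtRest: boolean;
  encryptionInTransit: boolean;
}

// Render a short, nutrition-label-style summary from the structured notice.
function renderSnapshot(mpn: ModelPrivacyNotice): string {
  return [
    `Privacy snapshot: ${mpn.productName}`,
    `Data collected: ${mpn.dataCollected.join(", ")}`,
    `Shared with: ${mpn.sharedWith.length ? mpn.sharedWith.join(", ") : "no third parties"}`,
    `Retention: ${mpn.retentionPeriod}`,
    `Consent: ${mpn.consentMethod}`,
    `Encrypted at rest: ${mpn.encryptionAtRest ? "yes" : "no"}`,
    `Encrypted in transit: ${mpn.encryptionInTransit ? "yes" : "no"}`,
  ].join("\n");
}

// Example with a fictional device:
console.log(renderSnapshot({
  productName: "ExampleFit Tracker",
  dataCollected: ["step count", "heart rate"],
  sharedWith: ["analytics provider"],
  retentionPeriod: "until account deletion",
  consentMethod: "in-app opt-in at first launch",
  encryptionAtRest: true,
  encryptionInTransit: true,
}));
```

A real submission would of course need far richer fields and a careful evaluation rubric, but even this toy structure shows how objective facts can be separated from marketing language.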

Though this challenge only considers technology collecting health data, it will be instructive for similar initiatives in many other IoT fields. It’s a useful step in supporting the right of consumers to have transparent information about diverse privacy and security practices.

Full conditions and requirements can be found on the contest homepage.

Data Protection Health Privacy Policies Security Transparency User Control Wearables


Today was the first official day of 2017’s Consumer Electronics Show in Las Vegas, and I walked the show floor to see this year’s upcoming products and meditate on their privacy implications. I’m a nerd and geek at my core, and this was my first time at CES, so I was excited.

The first stop was a small section dedicated to technology for mothers and babies. The very limited number of companies surprised me, but one vendor told me this was because a) CES had only begun this topic area last year, and b) there were already well-established mother/baby technology shows elsewhere, and CES was seen as both expensive and lacking a critical mass of interested clientele. The first company I encountered was Mamava, who made privacy booths for mothers to express breast milk. As a technology, this was mainly about physical privacy rather than informational privacy, though there were some IoT-like features in the form of mobile phone-based unlocking and awareness of who was inside the booth. Next was a company called Bloomlife, who made what they claimed was the first IoT contraction sensor for consumer use. My presumption is that until the data they collect is shared with a HIPAA-covered entity, they would not be subject to HIPAA themselves, which is yet another glaring problem with sectoral privacy legislation. The mother/baby area was paired with beauty products, and aside from impressive wearable, self-contained breast pumps and questionable laser hair regrowth solutions, there wasn’t much of interest.

Next was the main show floor of the Sands Hotel, which was dedicated to health, sports, wearables, robots, and 3D printing, plus a special area for startups, university-led products, and those that received government funding. Honestly, not much blew me away; my inner geek was not very satisfied. From a privacy perspective, I took note of the proliferation of cameras, which is a long-established trend. I encountered a British company called Lyte who made sports sunglasses with an embedded HD camera. I noted that they did not have an external light indicating that they were recording, which would be a more privacy-positive feature, supporting the principle of transparency (e.g., notification). The CEO, whom I interviewed, said that this was because their key market was sports enthusiasts, and the glasses would be used in a sports context rather than just for looking cool in public. He said that as they look towards a more general user base, they would consider such things as an indicator light. I saw a number of robots with cameras in their heads, sometimes with face recognition capabilities, which of course makes me wonder about their data collection practices, i.e., who gets that face data, and is children’s data treated with greater care?

I’m quite interested in smart jewelry, in large part because great design is quite difficult. So often, technology just looks like… more technology, so I’m always pleased to see creative, artistic IoT products. One caught my eye today: the Leaf, by Bellabeat, which is an activity, sleep, stress, and menstrual cycle tracker. One of the touted features of the IoT is its unobtrusiveness, and the Leaf certainly makes its technology disappear.


The product that stood out the most for me today was not about data collection, networking, or connecting the physical world to the virtual one. It’s called the Gyenno Spoon, and it does one thing: it helps people with Parkinson’s and other tremors to use a spoon. That’s it. The video below illustrates how profoundly difficult it is to eat for people who suffer from tremors, and shows how an advancement like the Gyenno Spoon can improve well-being and dignity. I’ve been working in technology for over 20 years, and few things have moved me as much as this.

Finally, I chatted with a body camera manufacturer who was moving from supplying law enforcement to selling his product to other professionals. In my interview with the founder, he told me how lawyers, doctors, and tow truck drivers wanted a device to record their interactions so as to have evidence of their activities and to prevent harassment. Again, the theme of camera proliferation appeared, and I can’t help but wonder about the continuing normalization of citizens video-surveilling each other. I suppose it’s time to read more about surveillance studies. At least the Venture body camera has a recording indicator light.

Tomorrow, the main show floor at the Las Vegas Convention Center! Now, I’m off to find a buffet.

Conference Data Protection Intimacy Law Transparency Wearables

IoT Privacy Forum founder Gilad Rosner features in the latest episode of O’Reilly’s Hardware Podcast, discussing his research for the British government, and the differences between European and US privacy cultures.

On his work used by the UK Parliament (paraphrased in parts):

That research was when the UK government put out a call and said, we’d like to vacuum up a lot of social media data and analyze it for government purposes: “beneficial outcomes” rather than law enforcement. Trying to look at data and come up with new information that would theoretically be beneficial to society. They were wondering how they’d go about it — whether “public” social media posts could present ethical problems when inhaling all that data for analysis. The answer is: yes, there are ethical problems here, because even though information is set to “public”, there’s a concept of respecting the context in which the data was uploaded, or volunteered. When I tweet, I’m not necessarily expecting the government to mine that information about me.

When it comes to privacy and data protection, especially with a large actor like the government, one of the most important concerns is procedural safeguards. Governments have ideas all the time, often good ideas, but the apparatus of implementing these ideas is very large, bureaucratic, and diffuse. So what constrains these activities, to make sure they’re being done securely, in line with existing privacy regulations, and with people being sensitive to things not necessarily covered by regulation, but still potentially worrisome? How do we come up with ways of letting good ideas happen, but under control?

Academics Data Protection Law Policy Power Transparency

“Implementing transparency and control in the context of IoT raises a number of challenges. Addressing these challenges is the main goal of the UPRISE-IoT European (CHIST-ERA) project in which this PhD will be conducted.

Among these challenges, specific attention will be paid to the following topics:

  • Analysis of the physical environment of the users to get an accurate picture of the devices surrounding them and the data that they collect.
  • Analysis of the purposes of these collections (how the data are supposed to be used) and their legal basis (e.g. the privacy notices of the entities collecting the data).
  • Analysis of the potential privacy risks posed by these data collections.
  • Definition of a framework for interacting with the users. This framework should make it possible for users to get a good understanding of the above information and to express their wishes (e.g. through user-centric privacy policies) in a user-friendly and non-ambiguous way.

The PhD project will be conducted in collaboration with the members of the Inria PRIVATICS group and the other partners of the UPRISE-IoT project. It will not necessarily address all the above topics, and the specific focus will be adjusted in agreement with the successful candidate, based on his expertise and motivation.

    Location:

    The thesis will be located in the Inria Rhône-Alpes Research Center, either in Grenoble or in Lyon (south-east of France).

    Required skills:

    The candidate should have a Master’s degree in computer science or a related field. Knowledge and motivation for one of the following fields would be appreciated: networks, privacy, security, human computer interaction.
    Knowledge of French is not required.”

https://cappris.inria.fr/wp-content/uploads/2013/03/PhD-position-UPRISE.pdf

Academics Transparency User Control