
Challenge.gov, the official hub for US federal tech and science prize competitions, has unveiled the Privacy Policy Snapshot Challenge, with a top prize of $20,000 (runners-up: $10,000 and $5,000).

Submissions are being accepted until April 10, 2017.

In their own words, they

call for designers, developers, and health data privacy experts to create an online Model Privacy Notice (MPN) generator. The MPN is a voluntary, openly available resource designed to help health technology developers who collect digital health data clearly convey information about their privacy and security policies to their users. Similar to a nutrition facts label, the MPN provides a snapshot of a product’s existing privacy practices, encouraging transparency and helping consumers make informed choices when selecting products. The MPN does not mandate specific policies or substitute for more comprehensive or detailed privacy policies.

An effective MPN would have to simplify nuanced information about multiple stakeholders’ complex data collection, retention, sharing, and usage practices. It must also weigh and present a variety of objective facts about devices, their documentation, and their methods of obtaining consent. Crucially, it should anticipate ways in which manufacturers might try to game the evaluation system, and mitigate those possibilities.
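
To make the design problem concrete, here is a minimal sketch of the data model an MPN generator might sit on top of, written in Python. The field names (`data_collected`, `shared_with`, `retention`, and so on) are my own illustrative assumptions, not part of any official MPN specification:

```python
from dataclasses import dataclass

@dataclass
class PrivacySnapshot:
    """An illustrative 'nutrition label' for a product's privacy practices."""
    product: str
    data_collected: list[str]     # e.g., heart rate, sleep, location
    shared_with: list[str]        # third parties that receive the data
    retention: str                # how long the data is kept
    user_can_delete: bool         # can users request deletion?
    sold_to_advertisers: bool     # is the data sold for marketing?

    def render(self) -> str:
        """Render the snapshot as a short, scannable notice."""
        return "\n".join([
            f"Privacy Snapshot: {self.product}",
            f"  Collects:             {', '.join(self.data_collected)}",
            f"  Shares with:          {', '.join(self.shared_with) or 'no one'}",
            f"  Retention:            {self.retention}",
            f"  Deletable on request: {'yes' if self.user_can_delete else 'no'}",
            f"  Sold to advertisers:  {'yes' if self.sold_to_advertisers else 'no'}",
        ])

print(PrivacySnapshot(
    product="Acme Sleep Tracker",
    data_collected=["sleep stages", "heart rate"],
    shared_with=["cloud analytics provider"],
    retention="2 years after account closure",
    user_can_delete=True,
    sold_to_advertisers=False,
).render())
```

One design note: constraining entries to closed vocabularies and yes/no facts, rather than free text, is one plausible way to limit vendors’ room to game the label.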

Though this challenge only considers technology collecting health data, it will be instructive for similar initiatives in many other IoT fields. It’s a useful step in supporting the right of consumers to have transparent information about diverse privacy and security practices.

Full conditions and requirements can be found on the contest homepage.

Data Protection Health Privacy Policies Security Transparency User Control Wearables


Today was the first official day of 2017’s Consumer Electronics Show in Las Vegas, and I walked the show floor to see this year’s upcoming products and meditate on their privacy implications. I’m a nerd and geek at my core, and this was my first time at CES, so I was excited.

The first stop was a small section dedicated to technology for mothers and babies. The very small number of companies surprised me, but one vendor told me this was because a) CES had only begun this topic area last year, and b) there were already well-established mother/baby technology shows elsewhere, and CES was seen as both expensive and lacking a critical mass of interested clientele. The first company I encountered was Mamava, who made privacy booths for mothers to express breast milk. As a technology, this was mainly about physical privacy rather than informational privacy, though there were some IoT-like features in the form of mobile phone-based unlocking and awareness of who was inside the booth. Next was a company called Bloomlife, who made what they claimed was the first IoT contraction sensor for consumer use. My presumption is that until the data they collect is shared with a HIPAA-covered entity, they would not be subject to HIPAA themselves, which is yet another glaring problem with sectoral privacy legislation. The mother/baby area was paired with beauty products, and aside from impressive wearable, self-contained breast pumps and questionable laser hair regrowth solutions, there wasn’t much of interest.

Next was the main show floor of the Sands Hotel, which was dedicated to health, sports, wearables, robots, and 3D printing, plus a special area for startups, university-led products, and those that had received government funding. Honestly, not much blew me away; my inner geek was not very satisfied. From a privacy perspective, I took note of the proliferation of cameras, which is a long-established trend. I encountered a British company called Lyte who made sports sunglasses with an embedded HD camera. I noted that they did not have an external light indicating that they were recording, which would be a more privacy-positive feature, supporting the principle of transparency (e.g., notification). The CEO, whom I interviewed, said that this was because their key market was sports enthusiasts, and the glasses would be used in a sports context rather than just for looking cool in public. He said that as they look towards a more general user base, they would consider such things as an indicator light. I saw a number of robots with cameras in their heads, sometimes with face recognition capabilities, which of course makes me wonder about their data collection practices, i.e., who gets that face data, and is children’s data treated with greater care?

I’m quite interested in smart jewelry, in large part because great design is quite difficult. So often, technology just looks like… more technology, so I’m always pleased to see creative, artistic IoT products. One caught my eye today: the Leaf, by Bellabeat, which is an activity, sleep, stress, and menstrual cycle tracker. One of the touted features of the IoT is its unobtrusiveness, and the Leaf certainly makes its technology disappear.


The product that stood out the most for me today was not about data collection, networking, or connecting the physical world to the virtual one. It’s called the Gyenno Spoon, and it does one thing: it helps people with Parkinson’s and other tremors to use a spoon. That’s it. The video below illustrates how profoundly difficult it is to eat for people who suffer from tremors, and shows how an advancement like the Gyenno Spoon can improve well-being and dignity. I’ve been working in technology for over 20 years, and few things have moved me as much as this.

Finally, I chatted with a body camera manufacturer who was moving from supplying law enforcement to selling his product to other professionals. In my interview with the founder, he told me how lawyers, doctors, and tow truck drivers wanted a device to record their interactions so as to have evidence of their activities and to prevent harassment. Again, the theme of camera proliferation appeared, and I can’t help but wonder about the continuing normalization of citizens video-surveilling each other. I suppose it’s time to read more about surveillance studies. At least the Venture body camera has a recording indicator light.

Tomorrow, the main show floor at the Las Vegas Convention Center! Now, I’m off to find a buffet.

Conference Data Protection Intimacy Law Transparency Wearables


The Federal Trade Commission (FTC) is offering $25,000 (and runner-up prizes) for a “technical solution” that would protect consumers from the security risks of running out-of-date software on IoT devices in their homes.

Demonstrating growing concern about the security/privacy vulnerabilities of billions of connected devices, the FTC is hoping that the winning efforts will benefit the entire IoT spectrum, which goes far beyond the range of connected appliances, meters, screens, toys and gadgets expected to live in the residential home of the future.

The FTC’s press release states:

An ideal tool might be a physical device that the consumer can add to his or her home network that would check and install updates for other IoT devices on that home network, or it might be an app or cloud-based service, or a dashboard or other user interface. Contestants also have the option of adding features such as those that would address hard-coded, factory default or easy-to-guess passwords.

Such solutions could be scalable to entire workplaces, offering widespread protection against security threats.
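
As a hedged sketch of what such a tool might do, the Python below checks a pretend device inventory against a pretend manufacturer firmware feed and a list of weak default passwords. A real tool would need actual network discovery and manufacturer-supported update feeds; every name and value here is an assumption for illustration:

```python
# Hypothetical sketch of a home-network IoT update checker.
# The inventory, firmware feed, and password list are all assumed.

KNOWN_DEFAULT_PASSWORDS = {"admin", "password", "12345", ""}

# Inventory a real tool would build by scanning the home network.
devices = [
    {"name": "smart-cam-01", "firmware": "1.0.2", "password": "admin"},
    {"name": "thermostat", "firmware": "2.3.0", "password": "k3!x9$wq"},
]

# Feed a real tool would fetch from each manufacturer.
latest_firmware = {"smart-cam-01": "1.1.0", "thermostat": "2.3.0"}

def audit(device: dict) -> list[str]:
    """Return a list of warnings for a single device."""
    warnings = []
    latest = latest_firmware.get(device["name"])
    if latest and device["firmware"] != latest:
        warnings.append(f"outdated firmware {device['firmware']} (latest {latest})")
    if device["password"] in KNOWN_DEFAULT_PASSWORDS:
        warnings.append("factory-default or easy-to-guess password")
    return warnings

for d in devices:
    for w in audit(d):
        print(f"[{d['name']}] {w}")
```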

Contest submissions will be accepted from March 1 until noon on May 22, 2017. See the challenge homepage for further details.

The IoT Privacy Forum encourages more such government contests addressing privacy and security concerns in the IoT. Since privacy is more often a cost center than a revenue source, money and attention from government actors are a great way to stimulate markets and technology.

Data Protection Policy Privacy by Design Security Smart Home


Researchers at American University and the Center for Digital Democracy have today released a report on wearable eHealth devices, which represent a rapidly growing IoT sector.

Titled Health Wearable Devices in the Big Data Era: Ensuring Privacy, Security & Consumer Protection (download PDF here), the 122 pages cover privacy and security threats, the Big Data marketplace, predictive/targeting methods, the legal and regulatory environment, and an extensive section on promoting ethical data practices. The intro to the report states:

The report documents a number of current digital health marketing practices that threaten the privacy of consumer health information, including condition targeting, look-alike modeling, predictive analytics, scoring, and the real-time buying and selling of individual consumers.

The potential range of intensely personal data obtainable from wearable (not to mention implantable) devices is what makes them such a potent marketing tool:

An emerging set of techniques will be designed to harness the unique capabilities of wearables—such as biosensors that track bodily functions, and “haptic technology” that enables users to “feel” actual body sensations. Pharmaceutical companies are poised to be among the major beneficiaries of wearable marketing. (p.4)

Recognizing the cost-saving and preventative benefits of eHealth devices, the report calls urgently for “meaningful, effective and enforceable safeguards” at the foundations of the connected-health system. Regulation in the U.S. is currently “weak and fragmented,” it notes, and is totally unprepared for sophisticated technologies capable of “unprecedented” data collection.

Data Ownership Data Protection Intimacy Law Policy Privacy by Design Wearables


A former Uber employee is suing the company for whistleblower retaliation, exposing a startling set of claims about data privacy practices within the San Francisco-based corporation. Ward Spangenberg, 45, is a seasoned infosec expert who reportedly discovered extremely lax policies around data protection, retention, and security, and found that near-universal internal access to detailed personal information compromises the privacy of all Uber riders.

First up in Spangenberg’s declaration is the claim that “payroll information for all Uber employees was contained in an unsecured Google spreadsheet”.

He says that Uber collects “a myriad of data” about its customers, including names, emails, social security numbers, locations, device types, and “other data that the user may or may not know they were even providing to Uber by requesting a ride”. Furthermore,

Uber’s lack of security regarding its customer data was resulting in Uber employees being able to track high-profile politicians, celebrities and even personal acquaintances of Uber employees, including ex-boyfriends/girlfriends, and ex-spouses. I also reported that […] allowing all employees to access this information (as opposed to a small security team) was resulting in a violation of governmental regulations regarding data protection and consumer privacy rights.

Such a wealth of personal information, available to all “without regard to any particular employment or security clearance,” would make a mockery of Uber’s Vulnerability Management Policy, which “specifically stated, in writing” that:

the policy could not be followed if Uber deemed there was a “legitimate business purpose” for not doing so, or if a Director level employee or above permitted such an exception.

Finally, Uber “routinely deleted files which were subject to litigation holds,” while its Incident Response Team

would be called when governmental agencies raided Uber’s offices due to concerns regarding noncompliance with governmental regulations. In those instances, Uber would lock down the office and immediately cut all connectivity so that law enforcement could not access Uber’s information. I would then be tasked with purchasing all new equipment for the office within the day, which I did when Uber’s Montreal office was raided.

Spangenberg was reportedly “also a point person when foreign government agencies raided company offices abroad,” remotely encrypting office computers from Uber’s San Francisco HQ.

“My job was to just make sure that any time a laptop was seized, the protocol locked the laptops up,” he said.

You can read Will Evans’s excellent article on the story here. Ward Spangenberg’s full declaration can be read here.

Connected Cars Data Protection Privacy by Design Realpolitik Security

Privacy and consumer watchdog groups have filed a complaint with the Federal Trade Commission about toys that are insecure enough to be used to spy on children easily. The targets of the complaint are Genesis Toys, the maker of My Friend Cayla and i-Que, and Nuance Communications, a third-party provider of voice recognition technology who also supplies products to law enforcement and the intelligence community. 

The Electronic Privacy Information Center (EPIC), the Center for Digital Democracy, Consumers Union, and others have jointly filed the complaint, which boldly states in the introduction:

This complaint concerns toys that spy. By purpose and design, these toys record and collect the private conversations of young children without any limitations on collection, use, or disclosure of this personal information. The toys subject young children to ongoing surveillance and are deployed in homes across the United States without any meaningful data protection standards. They pose an imminent and immediate threat to the safety and security of children in the United States.

The complaint requests that the FTC investigate Genesis Toys for several problematic issues, ranging from easy unauthorized Bluetooth connections to the toys within a 50-foot range, to the difficulty of locating the Terms of Service. Many findings appear to violate the Children’s Online Privacy Protection Act (COPPA) and FTC rules prohibiting unfair and deceptive practices. These include collection of data from children younger than 13, vague descriptions of voice collection practices in the Privacy Policies, and contradictory/misleading information regarding third-party access to voice recordings.
 
Cayla’s companion app invites children to input their physical location, as well as their names, parents’ names, school, and their favorite TV shows, meals and toys. The complaint highlights that it’s unclear how long the manufacturer will hold this data, and if they will ever delete it even if requested:

The Privacy Policies for Cayla and i-Que state that Genesis does not retain personal information for “longer than is necessary.” The scope of what is “necessary” is undefined. Genesis permits users to request deletion of personal information the company holds about them, but advises users that “we may need to keep that information for legitimate business or legal purposes.”

Disturbingly, the complaint notes that each of the toys can be heavily compromised by two unauthorized phones working in tandem:

Researchers discovered that by connecting one phone to the doll through the insecure Bluetooth connection and calling that phone with a second phone, they were able to both converse with and covertly listen to conversations collected through the My Friend Cayla and i-Que toys.
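
The underlying weakness is that the dolls accept Bluetooth connections from any device in range, with no PIN or pairing confirmation. As a hedged illustration of how low that bar is, the sketch below uses the open-source PyBluez library only to discover nearby devices and open a connection; the device-name check and RFCOMM channel are illustrative assumptions, and actually reaching the dolls’ audio profile would take more than this:

```python
import bluetooth  # pip install pybluez

# Scan for nearby Bluetooth devices -- no pairing or authentication is
# needed just to see who is in range (roughly the 50 feet cited above).
nearby = bluetooth.discover_devices(duration=8, lookup_names=True)

for addr, name in nearby:
    print(f"{addr}  {name}")
    # A toy that requires no PIN will accept a connection from any device.
    # Channel 1 is an illustrative RFCOMM channel, not the dolls' actual service.
    if "Cayla" in name:
        sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
        sock.connect((addr, 1))  # succeeds only because pairing is wide open
        print("Connected without any PIN or user confirmation")
        sock.close()
```
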
BEUC, a European consumer organisation, has today joined the effort against the manufacturers by complaining to the European Commission, the EU network of national data protection authorities, and the International Consumer Protection and Enforcement Network.

It should be noted that Danelle Dobbins, then a Master’s student at Washington University in St. Louis, wrote about Cayla’s glaring security problems in a 2015 paper. Dobbins draws attention to the work of Ken Munro, a security specialist who hacked Cayla at the beginning of 2015 as seen in the below video (via the BBC).

The complaint further notes that children are being surreptitiously marketed to:

Researchers discovered that My Friend Cayla is pre-programmed with dozens of phrases that reference Disneyworld and Disney movies. For example, Cayla tells children that her favorite movie is Disney’s The Little Mermaid and her favorite song is “Let it Go,” from Disney’s Frozen. Cayla also tells children she loves going to Disneyland and wants to go to Epcot in Disneyworld.

This product placement is not disclosed and is difficult for young children to recognize as advertising. Studies show that children have a significantly harder time identifying advertising when it’s not clearly distinguished from programming.

The toys’ voice recognition feature comes from Nuance, who also offers products and services to law enforcement and intelligence agencies. The most disturbing element of the complaint is the suggestion that children’s personal data and interactions could end up being used in the development of Nuance’s intelligence and law enforcement products:

Nuance uses the voice and text information it collects to “develop, tune, enhance, and improve Nuance services and products.”… Nuance’s products and services include voice biometric solutions sold to military, intelligence, and law enforcement agencies…. The use of children’s voice and text information to enhance products and services sold to military, intelligence, and law enforcement agencies creates a substantial risk of harm because children may be unfairly targeted by these organizations if their voices are inaccurately matched to recordings obtained by these organizations.

This could be one of those moments that causes a policy reaction. While negative press may have an impact on the individual companies and their sectors, the only methods that can truly help prevent more of these kinds of unsafe products are regulation and the threat of lawsuits. Let’s hope that policymakers and regulators use this opportunity to scare other toy makers, demonstrate the power of sanction, punish the bad actors, and increase the potency of data security and children’s safety regulation.

Coalitions & Consortia Data Protection Intimacy Law Privacy by Design Security Toys

[Arxan connected-cars infographic]

I like this infographic (click above to expand image), though with due respect to the authors, I’m skeptical about the claim that ‘connected cars’ (as if there were only one thing called a connected car) have 10 times as much code as a Boeing 787. But I’m nitpicking. I appreciate that this graphic specifically calls out the OBD-II port as a worry spot, as well as noting that insurance dongles lack security. It would be great to do security analysis on all existing dongles in significant circulation to see how bad things really are. I also quite liked this: “LTE coverage and Wifi in the car expose you to the same vulnerabilities as a house on wheels.” That’s simple and effective writing – bravo Arxan.
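
On the OBD-II point, it is worth seeing just how little stands between that port and live vehicle data. The sketch below uses the open-source `python-obd` library with a standard ELM327 adapter; treat it as an illustration of unauthenticated access, not a security analysis:

```python
import obd  # pip install obd -- talks to a standard ELM327 OBD-II adapter

# Auto-detects the adapter on a serial/USB/Bluetooth port. The OBD-II
# standard requires no authentication: physical access is the only barrier.
connection = obd.OBD()

for cmd in (obd.commands.SPEED, obd.commands.RPM, obd.commands.COOLANT_TEMP):
    response = connection.query(cmd)
    if not response.is_null():
        print(f"{cmd.name}: {response.value}")
```

An aftermarket dongle left in that port has the same unauthenticated access continuously, and often adds a wireless interface on top.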

The Recommendations at the bottom are aimed at consumers. They’re all reasonable and this is the first time I’m seeing “Don’t jailbreak your car.” Again, good on you, Arxan. I’m amused by the suggestion to check your outlets periodically and make sure you know what’s installed. It’s like a combination of encouraging safe sex for your car combined with ‘watch out for spoofed ATMs.’

Arxan is, however, a B2B company, so I would like to see industry recommendations in addition to consumer ones. Of course, those suggestions are part of the services they offer, so they can’t give away too much for free, but still – a few pearls of wisdom would be welcome. I know it’s too much to ask for policy-oriented suggestions – especially ones that raise costs – so here are a few:

  • Security Impact Analysis should be a regulatory requirement for all cars that rise above a certain threshold of connectivity (a topic for exploration)

  • Strengthen data breach notification laws (a general suggestion, not just for cars or IoT)

  • Car companies should be required to have CISOs

Data Ownership Data Protection Policy Security User Control

IoT Privacy Forum founder Gilad Rosner features in the latest episode of O’Reilly’s Hardware Podcast, discussing his research for the British government, and the differences between European and US privacy cultures.

On his work used by the UK Parliament (paraphrased in parts):

That research was when the UK government put out a call and said, we’d like to vacuum up a lot of social media data and analyze it for government purposes: “beneficial outcomes” rather than law enforcement. Trying to look at data and come up with new information that would theoretically be beneficial to society. They were wondering how they’d go about it — whether “public” social media posts could present ethical problems when inhaling all that data for analysis. The answer is: yes, there are ethical problems here, because even though information is set to “public”, there’s a concept of respecting the context in which the data was uploaded, or volunteered. When I tweet, I’m not necessarily expecting the government to mine that information about me.

When it comes to privacy and data protection, especially with a large actor like the government, one of the most important concerns is procedural safeguards. Governments have ideas all the time, often good ideas, but the apparatus of implementing these ideas is very large, bureaucratic, and diffuse. So what constrains these activities, to make sure they’re being done securely, in line with existing privacy regulations, and with people being sensitive to things not necessarily covered by regulation, but still potentially worrisome? How do we come up with ways of letting good ideas happen, but under control?

Academics Data Protection Law Policy Power Transparency

I’m very happy to announce the publication of a new report: Privacy and the Internet of Things. Published by O’Reilly Media, the report explores the privacy risks implied by increasing numbers of devices in the human environment, and the historical and emerging frameworks to address them. It’s a free report, available for download here:

http://www.oreilly.com/iot/free/privacy-and-the-iot.csp

In this report, you will:

  • Learn the various definitions of the Internet of Things
  • Explore the meaning of privacy and survey its mechanics and methods from American and European perspectives
  • Understand the differences between privacy and security in the IoT
  • Examine major privacy risks implied by the proliferation of connected devices
  • Review existing and emerging frameworks for addressing IoT privacy risks
  • Find resources for further reading and research into IoT privacy

I’d be very happy to discuss any of the report’s content. Please feel free to email me at gilad(at)iotprivacyforum.org.

 

Academics Data Protection Policy Privacy by Design

Dr Gilad Rosner, the IoT Privacy Forum’s founder, was recently interviewed by the European Alliance for Innovation. Dr Rosner will be keynoting at the 3rd EAI International Conference on Safety and Security in Internet of Things, taking place in Paris in October. An excerpt from the interview:

How would you comment on the recent clashes between governments and tech firms, regarding privacy and security?

The internet age has been a windfall for law enforcement and intelligence gathering agencies. The routine collection of personal information through social media, search, mobile phones, and web usage has created enormous databases of people’s movements, activities and communications. All of this comes from commercial endeavor – that is, the internet and social media are propelled by private companies in search of profit. They create the products and they store the data, and so those private data stores represent an irresistible target for data-hungry government entities like the NSA and others.

The ‘government’ is never one thing. Governments are comprised of agencies, interests, and people – overlapping powers and agendas. In the case of law enforcement, different groups have different remits and varying degrees of power. Foreign intelligence gathering is not the same as domestic law enforcement, and the rules that enable and constrain different agencies vary widely. There is a blurry line between lawful and unlawful access to private stores of personal data. The Snowden disclosures gave the world some perspective about just how blurry that line is, and how the governance of intelligence gathering may be porous or insufficient.

Sociologists have noted that states ‘penetrate’ their populations; that people need to be made ‘legible’ so that the state can act upon them. A strong argument can be made that intelligence gathering – for foreign or domestic purposes – is a core characteristic of the modern state. As such, lawful and unlawful access (and who gets to say which is which?) are two sides of the same coin: the state’s desire for information about people. Part of the way liberal democracies are judged is through consideration of their legitimacy. When government actors are accused of hacking into private data stores or otherwise circumventing established legal methods of obtaining access, such as search warrants and subpoenas, that legitimacy is called into question. Still, the line is blurry, and because of the secretive nature of intelligence gathering, it’s difficult to get a complete picture of when agencies are acting within their rights, when company practice facilitates or hinders the transfer of personal data to government actors, and when everyone is acting within ‘normal’ operating procedures.

Read the whole interview here: http://blog.eai.eu/the-blurry-line-between-data-protection-and-coercion/

Academics Conference Data Protection