Researchers at American University and the Center for Digital Democracy have today released a report on wearable eHealth devices, which represent a rapidly growing IoT sector.

Titled Health Wearable Devices in the Big Data Era: Ensuring Privacy, Security & Consumer Protection (download PDF here), the 122-page report covers privacy and security threats, the Big Data marketplace, predictive/targeting methods, the legal and regulatory environment, and an extensive section on promoting ethical data practices. The introduction states:

The report documents a number of current digital health marketing practices that threaten the privacy of consumer health information, including condition targeting, look-alike modeling, predictive analytics, scoring, and the real-time buying and selling of individual consumers.
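
None of these brokers’ actual models are public, but the gist of look-alike modeling is simple enough to sketch: prospects are ranked by how closely their profiles resemble the average profile of a known “seed” audience. A minimal Python illustration, with entirely hypothetical wearable-derived features:

```python
import numpy as np

def lookalike_rank(seed_profiles, candidates):
    """Score candidates by closeness to the seed audience's average profile.

    Features are z-scored first so large-valued features (e.g. step counts)
    don't drown out the rest; a higher (less negative) score is a closer match.
    """
    all_rows = np.vstack([seed_profiles, candidates]).astype(float)
    mean, std = all_rows.mean(axis=0), all_rows.std(axis=0)
    zscore = lambda x: (np.asarray(x, float) - mean) / std
    centroid = zscore(seed_profiles).mean(axis=0)
    return -np.linalg.norm(zscore(candidates) - centroid, axis=1)

# Hypothetical features: [steps/day, resting heart rate, sleep hours, app opens/week]
seed = [[4000, 88, 5.5, 30], [3600, 85, 6.0, 26]]  # e.g. known buyers of a sleep aid
pool = [[12000, 58, 8.0, 3], [3900, 86, 5.7, 28]]  # prospects to score
print(lookalike_rank(seed, pool))  # the second prospect is the "look-alike"
```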

The potential range of intensely personal data obtainable from wearable (not to mention implantable) devices is what makes them such a potent marketing tool:

An emerging set of techniques will be designed to harness the unique capabilities of wearables—such as biosensors that track bodily functions, and “haptic technology” that enables users to “feel” actual body sensations. Pharmaceutical companies are poised to be among the major beneficiaries of wearable marketing. (p.4)

Recognizing the cost-saving and preventative benefits of eHealth devices, the report calls urgently for “meaningful, effective and enforceable safeguards” at the foundations of the connected-health system. Regulation in the U.S. is currently “weak and fragmented,” it notes, and is totally unprepared for sophisticated technologies capable of “unprecedented” data collection.

Data Ownership · Data Protection · Intimacy · Law · Policy · Privacy by Design · Wearables


A former Uber employee is suing the company for whistleblower retaliation, exposing a startling set of claims about data privacy practices within the San Francisco-based corporation. Ward Spangenberg, 45, is a seasoned infosec expert who reportedly discovered extremely lax policies on data protection, retention, and security, and found that near-universal internal access to detailed personal information compromises every Uber rider.

First up in Spangenberg’s declaration is the claim that “payroll information for all Uber employees was contained in an unsecured Google spreadsheet”.

He says that Uber collects “a myriad of data” about its customers, including names, emails, social security numbers, locations, device types, and “other data that the user may or may not know they were even providing to Uber by requesting a ride”. Furthermore,

Uber’s lack of security regarding its customer data was resulting in Uber employees being able to track high-profile politicians, celebrities and even personal acquaintances of Uber employees, including ex-boyfriends/girlfriends, and ex-spouses. I also reported that […] allowing all employees to access this information (as opposed to a small security team) was resulting in a violation of governmental regulations regarding data protection and consumer privacy rights.

Such a wealth of personal information, available to all “without regard to any particular employment or security clearance,” would make a mockery of Uber’s Vulnerability Management Policy, which “specifically stated, in writing” that:

the policy could not be followed if Uber deemed there was a “legitimate business purpose” for not doing so, or if a Director level employee or above permitted such an exception.
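
For contrast, the least-privilege control the declaration says was missing is not exotic. Here is a minimal sketch in Python, with invented names: rider PII is served only to a designated security team, and every lookup leaves an audit trail.

```python
from datetime import datetime, timezone

SECURITY_TEAM = {"analyst_a", "analyst_b"}  # the hypothetical "small security team"
audit_log = []

def get_rider_record(employee_id, rider_id, records):
    """Serve rider PII only to cleared staff, and log every access."""
    if employee_id not in SECURITY_TEAM:
        raise PermissionError(f"{employee_id} has no clearance for rider PII")
    audit_log.append((datetime.now(timezone.utc), employee_id, rider_id))
    return records[rider_id]

records = {"r_123": {"name": "…", "last_pickup": "…"}}  # placeholder rider store
print(get_rider_record("analyst_a", "r_123", records))  # allowed, and logged
# get_rider_record("driver_42", "r_123", records)       # would raise PermissionError
```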

Finally, Uber “routinely deleted files which were subject to litigation holds,” while its Incident Response Team

would be called when governmental agencies raided Uber’s offices due to concerns regarding noncompliance with governmental regulations. In those instances, Uber would lock down the office and immediately cut all connectivity so that law enforcement could not access Uber’s information. I would then be tasked with purchasing all new equipment for the office within the day, which I did when Uber’s Montreal office was raided.

Spangenberg was reportedly “also a point person when foreign government agencies raided company offices abroad,” remotely encrypting office computers from Uber’s San Francisco HQ.

“My job was to just make sure that any time a laptop was seized, the protocol locked the laptops up,” he said.

You can read Will Evans’s excellent article on the story here. Ward Spangenberg’s full declaration can be read here.

Connected Cars · Data Protection · Privacy by Design · Realpolitik · Security

Privacy and consumer watchdog groups have filed a complaint with the Federal Trade Commission about toys insecure enough to be easily used to spy on children. The targets of the complaint are Genesis Toys, maker of My Friend Cayla and i-Que, and Nuance Communications, a third-party provider of voice recognition technology that also supplies products to law enforcement and the intelligence community.

The Electronic Privacy Information Center (EPIC), the Center for Digital Democracy, Consumers Union, and others have jointly filed the complaint, which boldly states in its introduction:

This complaint concerns toys that spy. By purpose and design, these toys record and collect the private conversations of young children without any limitations on collection, use, or disclosure of this personal information. The toys subject young children to ongoing surveillance and are deployed in homes across the United States without any meaningful data protection standards. They pose an imminent and immediate threat to the safety and security of children in the United States.

The complaint requests that the FTC investigate Genesis Toys for several problematic issues, ranging from easy unauthorized Bluetooth connections to the toys within a 50-foot range, to the difficulty of locating the Terms of Service. Many findings appear to violate the Children’s Online Privacy Protection Act (COPPA) and FTC rules prohibiting unfair and deceptive practices. These include collection of data from children younger than 13, vague descriptions of voice collection practices in the Privacy Policies, and contradictory/misleading information regarding third-party access to voice recordings.
 
Cayla’s companion app invites children to input their physical location, as well as their names, parents’ names, school, and their favorite TV shows, meals, and toys. The complaint highlights that it is unclear how long the manufacturer will retain this data, or whether it will be deleted even on request:

The Privacy Policies for Cayla and i-Que state that Genesis does not retain personal information for “longer than is necessary.” The scope of what is “necessary” is undefined. Genesis permits users to request deletion of personal information the company holds about them, but advises users that “we may need to keep that information for legitimate business or legal purposes.”
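
“No longer than is necessary” is unenforceable precisely because it names no number. A retention rule only means something once it is pinned to one, as in this illustrative sketch (the 90-day window and record shape are invented, not Genesis’s actual policy):

```python
from datetime import datetime, timedelta, timezone

RETENTION_WINDOW = timedelta(days=90)  # hypothetical: "necessary" made concrete

def purge_expired(records, now=None):
    """Keep only records newer than the retention window."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["collected_at"] <= RETENTION_WINDOW]

records = [
    {"child_id": "c_1", "collected_at": datetime.now(timezone.utc) - timedelta(days=200)},
    {"child_id": "c_2", "collected_at": datetime.now(timezone.utc) - timedelta(days=10)},
]
print(purge_expired(records))  # only the 10-day-old record survives
```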

Disturbingly, the complaint notes that each of the toys can be heavily compromised by two unauthorized phones working in tandem:

Researchers discovered that by connecting one phone to the doll through the insecure Bluetooth connection and calling that phone with a second phone, they were able to both converse with and covertly listen to conversations collected through the My Friend Cayla and i-Que toys.
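
The first half of that attack works because the doll accepts any connection, with no PIN or pairing confirmation. As a rough illustration, assuming the PyBluez library and guessing at the doll’s RFCOMM channel, connecting takes only a few lines:

```python
import bluetooth  # PyBluez

# Scan for nearby devices; the doll advertises itself openly.
for addr, name in bluetooth.discover_devices(duration=8, lookup_names=True):
    if "Cayla" in (name or ""):
        sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
        sock.connect((addr, 1))  # channel 1 is a guess; no PIN is ever requested
        print(f"Connected to {name} ({addr}) with no authentication")
        sock.close()
```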

BEUC, a European consumer organisation, has today joined the effort against the manufacturers by complaining to the European Commission, the EU network of national data protection authorities, and the International Consumer Protection and Enforcement Network.

It should be noted that Danelle Dobbins, then a Master’s student at Washington University in St. Louis, wrote about Cayla’s glaring security problems in a 2015 paper. Dobbins draws attention to the work of Ken Munro, a security specialist who hacked Cayla at the beginning of 2015, as seen in the video below (via the BBC).

The complaint further notes that children are being surreptitiously marketed to:

Researchers discovered that My Friend Cayla is pre-programmed with dozens of phrases that reference Disneyworld and Disney movies. For example, Cayla tells children that her favorite movie is Disney’s The Little Mermaid and her favorite song is “Let it Go,” from Disney’s Frozen. Cayla also tells children she loves going to Disneyland and wants to go to Epcot in Disneyworld.

This product placement is not disclosed and is difficult for young children to recognize as advertising. Studies show that children have a significantly harder time identifying advertising when it’s not clearly distinguished from programming.

The toys’ voice recognition feature comes from Nuance, which also offers products and services to law enforcement and intelligence agencies. The most disturbing element of the complaint is the suggestion that children’s personal data and interactions could end up being used in the development of Nuance’s intelligence and law enforcement products:

Nuance uses the voice and text information it collects to “develop, tune, enhance, and improve Nuance services and products.”… Nuance’s products and services include voice biometric solutions sold to military, intelligence, and law enforcement agencies…. The use of children’s voice and text information to enhance products and services sold to military, intelligence, and law enforcement agencies creates a substantial risk of harm because children may be unfairly targeted by these organizations if their voices are inaccurately matched to recordings obtained by these organizations.
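
That fear of inaccurate matching is not abstract. Voice biometric systems typically reduce a recording to an embedding vector and declare a “match” when similarity crosses a threshold, so every threshold choice trades false accepts against false rejects. A toy sketch of the mechanism, with invented vectors and threshold:

```python
import numpy as np

def same_speaker(voiceprint_a, voiceprint_b, threshold=0.85):
    """Declare a match when cosine similarity crosses the threshold."""
    a, b = np.asarray(voiceprint_a, float), np.asarray(voiceprint_b, float)
    score = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return score >= threshold

# Two different speakers whose embeddings happen to be close:
child = [0.9, 0.2, 0.4]
suspect_recording = [0.8, 0.3, 0.4]
print(same_speaker(child, suspect_recording))  # True: a false match
```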

This could be one of those moments that causes a policy reaction. While negative press may have an impact on the individual companies and their sectors, the only measures that can truly help prevent more unsafe products of this kind are regulation and the threat of lawsuits. Let’s hope that policymakers and regulators use this opportunity to scare other toy makers, demonstrate the power of sanction, punish the bad actors, and increase the potency of data security and children’s safety regulation.

Coalitions & Consortia · Data Protection · Intimacy · Law · Privacy by Design · Security · Toys