Day 2 of CES comes to a close, and two trends from Day 1 remain: I saw more robots that do face recognition, and I remain underwhelmed. Today, I walked the cartoonishly huge Las Vegas Convention Center (LVCC), which was dedicated to automotive technology, more robots, more health tech, displays, and accessories. The LVCC is different from the Sands, which I walked on Day 1, in that it’s full of big, established manufacturers: Panasonic, LG, Samsung, Mercedes, Jaguar, Qualcomm, Volkswagen, and so on. The Sands had smaller companies, startups, and university-supported products. After two full days of walking the convention, the technologies that made the strongest showing were:

  • Virtual Reality
  • Drones
  • Robots
  • Camera products
  • Automotive (inside the cockpit)

Honorable mention goes to gesture control. I was surprised by the poor showing of health-oriented products. There were plenty of fitness products, but these were the ho-hum Fitbit-like wearables we’ve seen for some time now. I’m not a gamer, but I was quite impressed with the immersive VR tech I played with.

At the intersection of health and automotive was Mercedes. They put sensors in the steering wheel of one of their cars to detect heart rate. If the car determined that you were stressed, it could alter the interior conditions: lighting, music, and – I kid you not – aroma. The glove box contains a scent cartridge that the car introduces into the airflow system to aromatherapy you back to your happy place. Mercedes took the idea that ‘technology companies are really lifestyle companies’ a bit too far – the car offers you suggestions on how to live a fitter, healthier existence based on what it detects and knows about you. Being alone in a car can be a place of respite… I sure as heck don’t want to be nagged about my sloth while I’m driving an expensive car to In-N-Out Burger.


Day 2 was replete with robots. LG seems to want to bring WALL-E to life with its airport information robot, lawn mower robot, and household ‘Hub robot.’ Robots from less well-known manufacturers were also in full force, and as I mentioned in yesterday’s wrap-up, face recognition appears to be de rigueur. Wavebot, which looks like something you squeeze to plop out cookie dough or raw falafels, was being marketed as a butler/watchdog that continually roams your house. It and the other robots on display weren’t working because of the usual convention WiFi failures, so I didn’t really get a sense of their value… or their likelihood of killing me and my family because the face recognition broke. My big takeaway: the Age of Household Robots is nowhere near.

However… virtual assistant technology seems to be galloping at a rapid clip. I saw several integrations of Alexa into non-cylindrical devices. Volkswagen introduced it into their cars, LG put it in a fridge, and a small robot named Lynx allows Alexa to bumble around your house and wave at you. Toyota built its own chirpy virtual ‘companion’ named Yui that merges the virtual assistant with self-driving. Unsurprisingly, this was merely a concept demo. Sidebar: all of these virtual assistants are so nice. I’m from New York originally – will someone please build me an Alexa/Siri mod that makes it cranky and profane? “Siri, what’s the temperature?” “Stick yer friggin’ head out the window and check yourself, ya schmuck.” “Thanks, Siri… you feel like family.”

Qualcomm showed off a super tiny camera and sensor, which seemed ideal for integration into… well… anything. The person I spoke to mentioned toys, and after December’s ugly story about wildly insecure IoT toys that can be used to spy on children, this development made me more concerned than excited.

Ultimately, I found Day 2 less satisfying than Day 1, in large part because the big manufacturers are trotting out incremental product advancements, rather than the riskier, more interesting technology that smaller companies are launching. On the whole, CES failed to excite me; I am aware that writing that puts me at odds with lots of other analysts who want to tell you how amazing everything is. Still, a few things inspired me, a few surprised me, and some products just seemed stupid.

With regard to privacy, this was a Consumer Electronics Show, and so the products were mainly ones that we would bring into close contact with our lives – on our person, in our cars, and in our homes. So, for me, the privacy issues raised are:

  • Encroachment on intimacy: household robots, toys, and other near-at-hand objects with cameras and other sensors mean increased collection of intimate moments and activities.
  • Further disappearance of surveillance technology: smaller cameras mean less awareness of being monitored and recorded.
  • Normalization of child surveillance: cameras in the home and in toys mean much more collection of children’s behavior and interactions.
  • Expansion of stakeholders: the decreasing cost of including cameras, microphones, and other monitoring devices makes it easier for new entrants into technology markets to introduce monitoring features. Will those new entrants know how to handle intimate personal data respectfully?

I expand on these concerns in my report, Privacy and the Internet of Things, which you can download for free from O’Reilly. And, while CES didn’t impress me very much, I at least did not see things that filled me with terror. I did find a bust of Siegfried & Roy, which made the whole trip worthwhile.

Conference Connected Cars Data Ownership Drones Intimacy Security Smart Home Toys User Control Wearables


Today was the first official day of 2017’s Consumer Electronics Show in Las Vegas, and I walked the show floor to see this year’s upcoming products and meditate on their privacy implications. I’m a nerd and geek at my core, and this was my first time at CES, so I was excited.

The first stop was a small section dedicated to technology for mothers and babies. The very limited number of companies surprised me, but one vendor told me this was because a) CES had only begun this topic area last year, and b) there were already well-established mother/baby technology shows elsewhere, and CES was seen to be both expensive and lacking in a critical mass of interested clientele. The first company I encountered was Mamava, who made privacy booths for mothers to express breast milk. Mainly, as a technology, this was about physical privacy rather than informational privacy, though there were some IoT-like features in the form of mobile phone-based unlocking and awareness of who was inside the booth. Next was a company called Bloomlife, who made what they claimed was the first IoT contraction sensor for consumer use. My presumption is that until the data they collect is shared with a HIPAA-covered entity, they would not be subject to HIPAA themselves, which is yet another glaring problem with sectoral privacy legislation. The mother/baby area was paired with beauty products, and aside from impressive wearable, self-contained breast pumps and questionable laser hair regrowth solutions, there wasn’t much of interest.

Next was the main show floor of the Sands Hotel, which was dedicated to health, sports, wearables, robots, and 3D printing, plus a special area for startups, university-led products, and those that received government funding. Honestly, not much blew me away; my inner geek was not very satisfied. From a privacy perspective, I took note of the proliferation of cameras, which is a long-established trend. I encountered a British company called Lyte who made sports sunglasses with an embedded HD camera. I noted that they did not have an external light indicating that they were recording, which would be a more privacy-positive feature, supporting the principle of transparency (e.g., notification). The CEO, whom I interviewed, said that this was because their key market was sports enthusiasts, and the glasses would be used in a sports context rather than just for looking cool in public. He said that as they look towards a more general user base, they would consider such things as an indicator light. I saw a number of robots with cameras in their heads, sometimes with face recognition capabilities, which of course makes me wonder about their data collection practices, i.e., who gets that face data, and is children’s data treated with greater care?

I’m quite interested in smart jewelry, in large part because great design is quite difficult. So often, technology just looks like… more technology, so I’m always pleased to see creative, artistic IoT products. One caught my eye today: the Leaf, by Bellabeat, which is an activity, sleep, stress, and menstrual cycle tracker. One of the touted features of the IoT is its unobtrusiveness, and the Leaf certainly makes its technology disappear.


The product that stood out the most for me today was not about data collection, networking, or connecting the physical world to the virtual one. It’s called the Gyenno Spoon, and it does one thing: it helps people with Parkinson’s and other tremors to use a spoon. That’s it. The video below illustrates how profoundly difficult it is to eat for people who suffer from tremors, and shows how an advancement like the Gyenno Spoon can improve well-being and dignity. I’ve been working in technology for over 20 years, and few things have moved me as much as this.

Finally, I chatted with a body camera manufacturer who was moving from supplying law enforcement to selling his product to other professionals. In my interview with the founder, he told me how lawyers, doctors, and tow truck drivers wanted a device to record their interactions so as to have evidence of their activities and to prevent harassment. Again, the theme of camera proliferation appeared, and I can’t help but wonder about the continuing normalization of citizens video-surveilling each other. I suppose it’s time to read more about surveillance studies. At least the Venture body camera has a recording indicator light.

Tomorrow, the main show floor at the Las Vegas Convention Center! Now, I’m off to find a buffet.

Conference Data Protection Intimacy Law Transparency Wearables


Hello from CES 2017! I’ll be blogging and tweeting from the show floor of the 2017 Consumer Electronics Show in Las Vegas about the latest IoT technologies and their privacy and security implications. Follow me on @GiladRosner and @IoTPrivacyForum for updates.

– Gilad

Conference

Privacy and consumer watchdog groups have filed a complaint with the Federal Trade Commission about toys that are insecure enough to be used to spy on children easily. The targets of the complaint are Genesis Toys, the maker of My Friend Cayla and i-Que, and Nuance Communications, a third-party provider of voice recognition technology who also supplies products to law enforcement and the intelligence community. 

The Electronic Privacy Information Center (EPIC), the Center for Digital Democracy, the Consumers Union and others have jointly filed the complaint, which boldly states in the introduction:

This complaint concerns toys that spy. By purpose and design, these toys record and collect the private conversations of young children without any limitations on collection, use, or disclosure of this personal information. The toys subject young children to ongoing surveillance and are deployed in homes across the United States without any meaningful data protection standards. They pose an imminent and immediate threat to the safety and security of children in the United States.

The complaint requests that the FTC investigate Genesis Toys for several problematic issues, ranging from easy unauthorized Bluetooth connections to the toys within a 50-foot range, to the difficulty of locating the Terms of Service. Many findings appear to violate the Children’s Online Privacy Protection Act (COPPA) and FTC rules prohibiting unfair and deceptive practices. These include collection of data from children younger than 13, vague descriptions of voice collection practices in the Privacy Policies, and contradictory/misleading information regarding third-party access to voice recordings.
 
Cayla’s companion app invites children to input their physical location, as well as their names, parents’ names, school, and their favorite TV shows, meals and toys. The complaint highlights that it’s unclear how long the manufacturer will hold this data, and whether they will ever delete it even if requested:

The Privacy Policies for Cayla and i-Que state that Genesis does not retain personal information for “longer than is necessary.” The scope of what is “necessary” is undefined. Genesis permits users to request deletion of personal information the company holds about them, but advises users that “we may need to keep that information for legitimate business or legal purposes.”

Disturbingly, the complaint notes that each of the toys can be heavily compromised by two unauthorized phones working in tandem:

Researchers discovered that by connecting one phone to the doll through the insecure Bluetooth connection and calling that phone with a second phone, they were able to both converse with and covertly listen to conversations collected through the My Friend Cayla and i-Que toys.

BEUC, a European consumer organisation, has today joined the effort against the manufacturers by complaining to the European Commission, the EU network of national data protection authorities, and the International Consumer Protection and Enforcement Network.

It should be noted that Danelle Dobbins, then a Master’s student at Washington University in St. Louis, wrote about Cayla’s glaring security problems in a 2015 paper. Dobbins draws attention to the work of Ken Munro, a security specialist who hacked Cayla at the beginning of 2015 as seen in the below video (via the BBC).

The complaint further notes that children are being surreptitiously marketed to:

Researchers discovered that My Friend Cayla is pre-programmed with dozens of phrases that reference Disneyworld and Disney movies. For example, Cayla tells children that her favorite movie is Disney’s The Little Mermaid and her favorite song is “Let it Go,” from Disney’s Frozen. Cayla also tells children she loves going to Disneyland and wants to go to Epcot in Disneyworld.

This product placement is not disclosed and is difficult for young children to recognize as advertising. Studies show that children have a significantly harder time identifying advertising when it’s not clearly distinguished from programming.

The toys’ voice recognition feature comes from Nuance, who also offers products and services to law enforcement and intelligence agencies. The most disturbing element of the complaint is the suggestion that children’s personal data and interactions could end up being used in the development of Nuance’s intelligence and law enforcement products:

Nuance uses the voice and text information it collects to “develop, tune, enhance, and improve Nuance services and products.”… Nuance’s products and services include voice biometric solutions sold to military, intelligence, and law enforcement agencies…. The use of children’s voice and text information to enhance products and services sold to military, intelligence, and law enforcement agencies creates a substantial risk of harm because children may be unfairly targeted by these organizations if their voices are inaccurately matched to recordings obtained by these organizations.

This could be one of those moments that causes a policy reaction. While negative press may have an impact on the individual companies and their sectors, the only methods that can truly help prevent more of these kinds of unsafe products are regulation and the threat of lawsuits. Let’s hope that policymakers and regulators use this opportunity to scare other toy makers, demonstrate the power of sanction, punish the bad actors, and increase the potency of data security and children’s safety regulation.

Coalitions & Consortia Data Protection Intimacy Law Privacy by Design Security Toys

[Infographic: Arxan on connected car security – click to expand]

I like this infographic (click above to expand image), though with due respect to the authors, I’m skeptical about the claim that ‘connected cars’ (as if there’s only one thing called a connected car) have 10 times the amount of code of a Boeing 787. But I’m nitpicking. I appreciate that this graphic specifically calls out the OBD-II port as a worry spot as well as noting that insurance dongles lack security. It would be great to do security analysis on all existing dongles in significant circulation to see how bad things really are. I also quite liked this: “LTE coverage and Wifi in the car expose you to the same vulnerabilities as a house on wheels.” That’s simple and effective writing – bravo Arxan.

The Recommendations at the bottom are aimed at consumers. They’re all reasonable, and this is the first time I’m seeing “Don’t jailbreak your car.” Again, good on you, Arxan. I’m amused by the suggestion to check your outlets periodically and make sure you know what’s installed. It’s like a combination of encouraging safe sex for your car and ‘watch out for spoofed ATMs.’

Arxan is, however, a B2B company, so I would like to see, in addition to consumer recommendations, industry recommendations. Of course, those suggestions are part of the services they offer, so they can’t give away too much for free, but still – a few pearls of wisdom would be welcome. I know it’s too much to ask for policy-oriented suggestions – especially ones that raise costs – so here are a few:

  • Security Impact Analysis should be a regulatory requirement for all cars that rise above a certain threshold of connectivity (a topic for exploration)

  • Strengthen data breach notification laws (a general suggestion, not just for cars or IoT)

  • Car companies should be required to have CISOs

Data Ownership Data Protection Policy Security User Control

I’m happy to report that the IoT Privacy Forum will be getting a new website very soon….

Behind the scenes

I’m very happy to announce the publication of a new report: Privacy and the Internet of Things. Published by O’Reilly Media, the report explores the privacy risks implied by increasing numbers of devices in the human environment, and the historical and emerging frameworks to address them. It’s a free report, available for download here:

http://www.oreilly.com/iot/free/privacy-and-the-iot.csp

In this report, you will:

  • Learn the various definitions of the Internet of Things
  • Explore the meaning of privacy and survey its mechanics and methods from American and European perspectives
  • Understand the differences between privacy and security in the IoT
  • Examine major privacy risks implied by the proliferation of connected devices
  • Review existing and emerging frameworks for addressing IoT privacy risks
  • Find resources for further reading and research into IoT privacy

I’d be very happy to discuss any of the report’s content. Please feel free to email me at gilad(at)iotprivacyforum.org.

 

Academics Data Protection Policy Privacy by Design

“Implementing transparency and control in the context of IoT raises a number of challenges. Addressing these challenges is the main goal of the UPRISE-IoT European (CHIST ERA) project in which this PhD will be conducted.

Among these challenges, specific attention will be paid to the following topics:

  • Analysis of the physical environment of the users to get an accurate picture of the devices surrounding them and the data that they collect.
  • Analysis of the purposes of these collections (how the data are supposed to be used) and their legal basis (e.g. the privacy notices of the entities collecting the data).
  • Analysis of the potential privacy risks posed by these data collections.
  • Definition of a framework for interacting with the users. This framework should make it possible for users to get a good understanding of the above information and to express their wishes (e.g. through user-centric privacy policies) in a user-friendly and non-ambiguous way.

The PhD project will be conducted in collaboration with the members of the Inria PRIVATICS group and the other partners of the UPRISE-IoT project. It will not necessarily address all the above topics, and the specific focus will be adjusted in agreement with the successful candidate, based on his expertise and motivation.

    Location:

    The thesis will be located in the Inria Rhône-Alpes Research Center, either in Grenoble or in Lyon (south-east of France).

    Required skills:

    The candidate should have a Master’s degree in computer science or a related field. Knowledge and motivation for one of the following fields would be appreciated: networks, privacy, security, human computer interaction.
    Knowledge of French is not required.”

https://cappris.inria.fr/wp-content/uploads/2013/03/PhD-position-UPRISE.pdf

Academics Transparency User Control

While waiting for a taxi in Barcelona, my friend, irritated that none had appeared, told me that Uber had been made illegal in Spain. However, she was not 100% sure it was still illegal; perhaps something had changed. To answer this, she took the most direct route and opened Uber on her phone: nope, we could not order transportation. I saw this moment as a wonderful inversion of that 90s internet tenet, ‘code is law.’ The main idea is that the architectures of electronic systems regulate behavior much as law does in the physical, social realm. In some of the more utopian and techno-deterministic formulations of this idea, the world of paper laws and hoary legislative debates would crumble before the might of the interweb and its meritocratic ways. Opening Uber on her phone to see if it could sell its wares in the Catalan capital was a wonderful reminder that reports of regulation’s untimely demise have been greatly exaggerated. Academics love the word ‘tension,’ and companies like Uber and Airbnb cause mountains of it. They’re such good case studies for regulation, perturbing existing regimes such as hotel taxation and taxi passenger safety, stepping on toes economic and political. The earliest discussions of the social impact of the internet invoked its borderless character and the inevitable clashes that would arise with national sovereignty. That tension is alive and well, visible in US alarmism over the coming EU General Data Protection Regulation, in the regulatory tussle over Uber and its so-called sharing economy kin, and in the recent invalidation of the Safe Harbor framework. Law may move slowly, but it still packs a punch.

Law Policy Power

One of the debates in privacy is the continuing feasibility of Collection Limitation as a data protection principle. Historically, there was some basic tension with data retention pressures, as Steve Wilson noted: “Collection Limitation … can contradict the security or legal instinct to always retain as much data as possible, in case it comes in handy one day.” The IoT, broadly construed, adds a new pressure: the increasing ubiquity of sensor information.

Mobile phones with all their sensors have been a challenge to Collection Limitation for years. Consider the legions of apps that access all of the sensor data because they can. But that was one device – now multiply that by… choose your number du jour. The point is that the IoT/ubiquitous computing/pervasive computing/contextual computing are typified by enhanced monitoring. Collection Limitation is, simply put, the principle of only gathering the data you need for a particular application. The US, enormous market that it is, does not really enforce this principle. That’s unsurprising, as it mainly appears in ‘soft law,’ i.e., there are no sanctions to enforce it in the commercial world. It nominally exists in Europe, but there are very limited ways of enforcing it. How can this principle withstand the emergence of billions of all-seeing, all-hearing devices in the human environment?
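
To make the principle concrete, here is a minimal sketch of Collection Limitation treated as a design pattern rather than a legal abstraction. All of the names (CollectionPolicy, collect, the field names) are hypothetical illustrations, not any particular company’s API: an application declares the only fields it needs for a stated purpose, and the collection layer drops everything else at the source.

```python
# Minimal sketch of Collection Limitation as a design pattern.
# All names are hypothetical; the point is that data the application
# does not need for its stated purpose is never collected at all.

from dataclasses import dataclass, field
from typing import Any, Dict, Set


@dataclass
class CollectionPolicy:
    """Declares, per purpose, the only fields an application may collect."""
    purpose: str
    allowed_fields: Set[str] = field(default_factory=set)


def collect(reading: Dict[str, Any], policy: CollectionPolicy) -> Dict[str, Any]:
    """Keep only the fields the policy allows; everything else never leaves the device."""
    return {k: v for k, v in reading.items() if k in policy.allowed_fields}


if __name__ == "__main__":
    # A step counter needs acceleration data, not location or microphone levels.
    policy = CollectionPolicy(
        purpose="step_counting",
        allowed_fields={"accel_x", "accel_y", "accel_z"},
    )
    raw = {
        "accel_x": 0.1, "accel_y": 0.2, "accel_z": 9.8,
        "latitude": 41.39, "longitude": 2.17, "mic_level_db": 48,
    }
    print(collect(raw, policy))  # {'accel_x': 0.1, 'accel_y': 0.2, 'accel_z': 9.8}
```

The design choice here is that limitation happens at collection time, on the device, rather than as a retention policy applied after the data has already been gathered.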

In November of last year, two automotive trade bodies released a set of Vehicle Privacy Principles, written in conjunction with the law firm Hogan Lovells with some assistance from the Future of Privacy Forum. I’ve written on these Principles before for O’Reilly Radar – I don’t like them because of their very weak consent principles. Further, the Principles never mention the ability to kill all non-essential, non-driving-related sensors in the car, nor the ability to shut off location tracking. HERE is where I would like to see the Collection Limitation principle reassert itself, in combination with an improved consent posture. If a driver declines whatever shiny, amazing application in-car sensors would enable, and doesn’t want the car manufacturer, the dealership, and any partners to know where she or he is driving, the data should not be collected. Collection Limitation is meaningful here because the car is unique in its function and context – driving. And while the same data could likely be gathered from the driver’s phone, the phone could be off, location services disabled, what have you; it’s a separate consideration. If the car is a locus of sensors, a privacy-positive orientation would have the driver able to kill all non-essential sensing. This is also an argument in support of the continued existence of Collection Limitation.
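
As a rough illustration of what that ‘kill all non-essential sensing’ posture could look like, here is a small, hypothetical sketch (the class and sensor names are mine, not anything from the Vehicle Privacy Principles): sensors are tagged as driving-essential or not, and a single driver preference disables collection from everything that is not required to operate the car.

```python
# Hypothetical sketch of a driver-controlled Collection Limitation switch for a car.
# Sensor names and the essential/non-essential split are illustrative assumptions.

from dataclasses import dataclass
from typing import List


@dataclass
class VehicleSensor:
    name: str
    essential_for_driving: bool  # e.g., wheel speed yes; GPS tracking or cabin mic no


def active_sensors(sensors: List[VehicleSensor], allow_non_essential: bool) -> List[str]:
    """Return the sensors permitted to collect data under the driver's choice."""
    return [s.name for s in sensors if s.essential_for_driving or allow_non_essential]


if __name__ == "__main__":
    sensors = [
        VehicleSensor("wheel_speed", essential_for_driving=True),
        VehicleSensor("brake_pressure", essential_for_driving=True),
        VehicleSensor("gps_location", essential_for_driving=False),
        VehicleSensor("cabin_microphone", essential_for_driving=False),
    ]
    # The driver declines the shiny application: only driving-essential sensing remains.
    print(active_sensors(sensors, allow_non_essential=False))
    # ['wheel_speed', 'brake_pressure']
```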

Data Protection Policy