The 2nd European Workshop on Usable Security (EuroUSEC) will be an affiliated workshop at the 2nd IEEE European Symposium on Security and Privacy (EuroS&P) on April 29, 2017 in Paris at UPMC Campus Jussieu.

EuroUSEC is soliciting “previously unpublished work offering novel research contributions in any aspect of human factors in security and privacy for end-users and IT professionals,” including but not limited to:

  • innovative security or privacy functionality and design
  • new applications of existing models or technology
  • field studies of security or privacy technology
  • usability evaluations of new or existing security or privacy features
  • security testing of new or existing usability features
  • longitudinal studies of deployed security or privacy features
  • studies of administrators or developers and support for security and privacy
  • psychological, sociological and economic aspects of security and privacy
  • the impact of organizational policy or procurement decisions
  • methodology for usable security and privacy research
  • lessons learned from the deployment and use of usable privacy and security features
  • reports of replicating previously published studies and experiments
  • reports of failed usable privacy/security studies or experiments, with focus on the lessons learned

The submission deadline is March 17, 2017, and full instructions are published on the event homepage.

All affiliated workshops are listed on the EuroS&P 2017 homepage.

Academics Conference HCI Privacy by Design Privacy Impact Assessment Security

Challenge.gov, the official hub for US federal tech and science prize competitions, has unveiled the Privacy Policy Snapshot Challenge with a top prize of $20,000 (runners-up: $10k and $5k).

Submissions are being accepted until April 10, 2017.

In their own words, they

call for designers, developers, and health data privacy experts to create an online Model Privacy Notice (MPN) generator. The MPN is a voluntary, openly available resource designed to help health technology developers who collect digital health data clearly convey information about their privacy and security policies to their users. Similar to a nutrition facts label, the MPN provides a snapshot of a product’s existing privacy practices, encouraging transparency and helping consumers make informed choices when selecting products. The MPN does not mandate specific policies or substitute for more comprehensive or detailed privacy policies.

An effective MPN would have to simplify nuanced information about multiple stakeholders’ complex data collection, retention, sharing and usage practices. It must also prioritize a variety of objective facts about devices, their documentation and their methods of consent acquisition. Crucially, it should anticipate ways in which manufacturers might attempt to game the evaluation system, and mitigate those possibilities.
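As a thought experiment, here is a minimal Python sketch of the data model such a generator might sit on top of. The field names and the nutrition-label-style rendering are illustrative assumptions on my part, not the actual MPN questionnaire defined by ONC and the challenge materials.

from dataclasses import dataclass, field
from typing import List

@dataclass
class MPNEntry:
    practice: str   # e.g. "Shares data with third parties" (hypothetical field)
    answer: str     # e.g. "Yes, with advertisers"

@dataclass
class ModelPrivacyNotice:
    product: str
    entries: List[MPNEntry] = field(default_factory=list)

    def render(self) -> str:
        # Render a nutrition-facts-style snapshot as plain text.
        lines = [f"Privacy Snapshot: {self.product}", "-" * 40]
        lines += [f"{e.practice:<28} {e.answer}" for e in self.entries]
        return "\n".join(lines)

notice = ModelPrivacyNotice("AcmeFit Tracker", [
    MPNEntry("Collects health data", "Yes: heart rate, sleep"),
    MPNEntry("Shares with third parties", "Yes, with advertisers"),
    MPNEntry("Retention period", "Until account deletion"),
])
print(notice.render())

The hard part, as noted above, is not the rendering but deciding which objective facts belong in those fields and how to keep manufacturers from gaming them.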

Though this challenge only considers technology collecting health data, it will be instructive for similar initiatives in many other IoT fields. It’s a useful step in supporting the right of consumers to have transparent information about diverse privacy and security practices.

Full conditions and requirements can be found on the contest homepage.

Data Protection Health Privacy Policies Security Transparency User Control Wearables

Day 2 of CES comes to a close, and two trends from Day 1 remain: I saw more robots that do face recognition, and I remain underwhelmed. Today, I walked the cartoonishly huge Las Vegas Convention Center (LVCC), which was dedicated to automotive technology, more robots, more health tech, displays, and accessories. The LVCC is different from the Sands, which I walked on Day 1, in that it’s full of big, established manufacturers: Panasonic, LG, Samsung, Mercedes, Jaguar, Qualcomm, Volkswagen, and so on. The Sands had smaller companies, startups, and university-supported products. After two full days of walking the convention, the technologies that made the strongest showing were:

  • Virtual Reality
  • Drones
  • Robots
  • Camera products
  • Automotive (inside the cockpit)

Honorable mention goes to gesture control. I was surprised by the poor showing of health-oriented products. There were plenty of fitness products, but these were the ho-hum Fitbit-like wearables we’ve seen for some time now. I’m not a gamer, but I was quite impressed with the immersive VR tech I played with.

At the intersection of health and automotive was Mercedes. They put sensors in the steering wheel of one of their cars to detect heart rate. If the car determined that you were stressed, it could alter the interior conditions: lighting, music, and – I kid you not – aroma. The glove box contains a scent cartridge that it would introduce into the airflow system to aromatherapy you back to your happy place. Mercedes took the idea that ‘technology companies are really lifestyle companies’ a bit too far – the car offers you suggestions on how to live a fitter, healthier existence based on what it detects and knows about you. Being alone in a car can be a place of respite… I sure as heck don’t want to be nagged about my sloth while I’m driving an expensive car to In-N-Out Burger.

Day 2 was replete with robots. LG seems to want to bring WALL-E to life with its airport information robot, lawn mower robot, and household ‘Hub robot.’ Robots from less well-known manufacturers were also in full force, and as I mentioned in yesterday’s wrap-up, face recognition appears to be de rigueur. Wavebot, which looks like something you squeeze to plop out cookie dough or raw falafels, was being marketed as a butler/watchdog that continually roams your house. Both it and other robots on display were not working because of the usual convention WiFi failures, so I didn’t really get a sense of their value… or likelihood of killing me and my family because the face recognition broke. My big takeaway was: The Age of Household Robots is nowhere near.

However… virtual assistant technology seems to be galloping at a rapid clip. I saw several integrations of Alexa into non-cylindrical devices. Volkswagen introduced it into their cars, LG put it in a fridge, and a small robot named Lynx allows Alexa to bumble around your house and wave at you. Toyota built its own chirpy virtual ‘companion’ named Yui that merges the virtual assistant with self-driving. Unsurprisingly, this was merely a concept demo. Sidebar: all of these virtual assistants are so nice. I’m from New York originally – will someone please build me an Alexa/Siri mod that makes it cranky and profane? “Siri, what’s the temperature?” “Stick yer friggin’ head out the window and check yourself, ya schmuck.” “Thanks, Siri… you feel like family.”

Qualcomm showed off a super tiny camera and sensor, which seemed ideal for integration into… well… anything. The person I spoke to mentioned toys, and after December’s ugly story about wildly insecure IoT toys that can be used to spy on children, this development made me more concerned than excited.

Ultimately, I found Day 2 less satisfying than Day 1, in large part because the big manufacturers are trotting out incremental product advancements rather than the riskier, more interesting technology that smaller companies are launching. On the whole, CES failed to excite me; I am aware that writing that puts me at odds with lots of other analysts who want to tell you how amazing everything is. Still, a few things inspired me, a few surprised me, and some products just seemed stupid.

With regard to privacy, this was a Consumer Electronics Show, and so the products were mainly ones that we would bring into close contact with our lives – on our person, in our cars, and in our homes. So, for me, the privacy issues raised are:

  • Encroachment on intimacy: household robots, toys and other near-at-hand objects with cameras and other sensors mean increased collection of intimate moments and activities.
  • Further disappearance of surveillance technology: smaller cameras mean less awareness of being monitored and recorded.
  • Normalization of child surveillance: cameras in the home and in toys mean much more collection of children’s behavior and interactions.
  • Expansion of stakeholders: decreasing costs to include cameras, microphones and other monitoring devices mean it’s easier for new entrants into technology markets to introduce monitoring features. Will those new entrants know how to handle intimate personal data respectfully?

I expand on these concerns in my report, Privacy and the Internet of Things, which you can download for free from O’Reilly. And, while CES didn’t impress very much, I at least did not see things that filled me with terror. I did find a bust of Siegfried & Roy, which made the whole trip worthwhile.

Conference Connected Cars Data Ownership Drones Intimacy Security Smart Home Toys User Control Wearables

The Federal Trade Commission (FTC) is offering $25,000 (and runner-up prizes) for a “technical solution” that would protect consumers from the security risks of running out-of-date software on IoT devices in their homes.

Demonstrating growing concern about the security and privacy vulnerabilities of billions of connected devices, the FTC hopes the winning entries will benefit the entire IoT spectrum, which extends far beyond the connected appliances, meters, screens, toys and gadgets expected to populate the residential home of the future.

The FTC’s press release states:

An ideal tool might be a physical device that the consumer can add to his or her home network that would check and install updates for other IoT devices on that home network, or it might be an app or cloud-based service, or a dashboard or other user interface. Contestants also have the option of adding features such as those that would address hard-coded, factory default or easy-to-guess passwords.

Such solutions could be scalable to entire workplaces, offering widespread protection against security threats.
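To make the idea concrete, here is a minimal Python sketch of the audit logic such a tool might implement, under two loud assumptions: the device inventory and the feed of latest firmware versions are hypothetical placeholders, since real IoT devices expose no standard interface for either. That gap is exactly what the challenge is asking contestants to bridge.

# Hypothetical audit pass over a home network's device inventory.
DEFAULT_PASSWORDS = {"admin", "password", "12345", "root"}

def parse_version(v: str) -> tuple:
    # "1.0.2" -> (1, 0, 2) so versions compare numerically
    return tuple(int(x) for x in v.split("."))

def audit(devices, latest_versions):
    for dev in devices:
        name, installed = dev["name"], dev["firmware"]
        latest = latest_versions.get(name)
        if latest and parse_version(installed) < parse_version(latest):
            print(f"{name}: firmware {installed} is out of date (latest {latest})")
        if dev.get("password") in DEFAULT_PASSWORDS:
            print(f"{name}: factory-default password still set")

audit(
    devices=[{"name": "smart-bulb", "firmware": "1.0.2", "password": "admin"}],
    latest_versions={"smart-bulb": "1.2.0"},
)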

Contest submissions will be accepted from March 1 until midday on May 22, 2017. See the challenge homepage for further details.

The IoT Privacy Forum encourages more such government contests addressing privacy and security concerns in the IoT. Since privacy is more often a cost center than a revenue source, money and attention from government actors are a great way to stimulate markets and technology.

Data Protection Policy Privacy by Design Security Smart Home

A former Uber employee is suing the company for whistleblower retaliation, exposing a startling set of claims about data privacy practices within the San Francisco-based corporation. Ward Spangenberg, 45, is a seasoned infosec expert who reportedly discovered extremely lax policies around data protection, retention and security, and found that near-universal internal access to detailed personal information compromises the privacy of all Uber riders.

First up in Spangenberg’s declaration is that “payroll information for all Uber employees was contained in an unsecured Google spreadsheet”.

He says that Uber collects “a myriad of data” about its customers, including names, emails, social security numbers, locations, device types, and “other data that the user may or may not know they were even providing to Uber by requesting a ride”. Furthermore,

Uber’s lack of security regarding its customer data was resulting in Uber employees being able to track high-profile politicians, celebrities and even personal acquaintances of Uber employees, including ex-boyfriends/girlfriends, and ex-spouses. I also reported that […] allowing all employees to access this information (as opposed to a small security team) was resulting in a violation of governmental regulations regarding data protection and consumer privacy rights.

Such a wealth of personal information, available to all “without regard to any particular employment or security clearance,” would make a mockery of Uber’s Vulnerability Management Policy, which “specifically stated, in writing” that:

the policy could not be followed if Uber deemed there was a “legitimate business purpose” for not doing so, or if a Director level employee or above permitted such an exception.

Finally, Uber “routinely deleted files which were subject to litigation holds,” while its Incident Response Team

would be called when governmental agencies raided Uber’s offices due to concerns regarding noncompliance with governmental regulations. In those instances, Uber would lock down the office and immediately cut all connectivity so that law enforcement could not access Uber’s information. I would then be tasked with purchasing all new equipment for the office within the day, which I did when Uber’s Montreal office was raided.

Spangenberg was reportedly “also a point person when foreign government agencies raided company offices abroad,” remotely encrypting office computers from Uber’s San Francisco HQ.

“My job was to just make sure that any time a laptop was seized, the protocol locked the laptops up,” he said.

You can read Will Evans’s excellent article on the story here. Ward Spangenberg’s full declaration can be read here.

Connected Cars Data Protection Privacy by Design Realpolitik Security

Privacy and consumer watchdog groups have filed a complaint with the Federal Trade Commission about toys that are insecure enough to be used to spy on children easily. The targets of the complaint are Genesis Toys, the maker of My Friend Cayla and i-Que, and Nuance Communications, a third-party provider of voice recognition technology who also supplies products to law enforcement and the intelligence community. 

The Electronic Privacy Information Center (EPIC), the Center for Digital Democracy, the Consumers Union and others have jointly filed the complaint, which boldly states in the introduction:

This complaint concerns toys that spy. By purpose and design, these toys record and collect the private conversations of young children without any limitations on collection, use, or disclosure of this personal information. The toys subject young children to ongoing surveillance and are deployed in homes across the United States without any meaningful data protection standards. They pose an imminent and immediate threat to the safety and security of children in the United States.

The complaint requests that the FTC investigate Genesis Toys for several problematic issues, ranging from easy unauthorized Bluetooth connections to the toys within a 50-foot range, to the difficulty of locating the Terms of Service. Many findings appear to violate the Children’s Online Privacy Protection Act (COPPA) and FTC rules prohibiting unfair and deceptive practices. These include collection of data from children younger than 13, vague descriptions of voice collection practices in the Privacy Policies, and contradictory/misleading information regarding third-party access to voice recordings.
 
Cayla’s companion app invites children to input their physical location, as well as their names, parents’ names, school, and their favorite TV shows, meals and toys. The complaint highlights that it’s unclear how long the manufacturer will hold this data, and whether it will ever be deleted, even on request:

The Privacy Policies for Cayla and i-Que state that Genesis does not retain personal information for “longer than is necessary.” The scope of what is “necessary” is undefined. Genesis permits users to request deletion of personal information the company holds about them, but advises users that “we may need to keep that information for legitimate business or legal purposes.”

Disturbingly, the complaint notes that each of the toys can be heavily compromised by two unauthorized phones working in tandem:

Researchers discovered that by connecting one phone to the doll through the insecure Bluetooth connection and calling that phone with a second phone, they were able to both converse with and covertly listen to conversations collected through the My Friend Cayla and i-Que toys.
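For a sense of how low the bar is, here is a rough Python sketch of that first step using the PyBluez library, assuming (as researchers reported for Cayla) that the toy speaks classic Bluetooth and listens on RFCOMM channel 1. Any device that completes this connection without ever prompting for a PIN is open to exactly the attack described above.

import bluetooth  # PyBluez

# Scan for nearby classic-Bluetooth devices and try connecting
# with no PIN or pairing step at all.
for addr, name in bluetooth.discover_devices(duration=8, lookup_names=True):
    sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
    try:
        sock.connect((addr, 1))  # channel 1 assumed; no authentication
        print(f"{name} ({addr}) accepted an unauthenticated connection")
    except bluetooth.BluetoothError:
        print(f"{name} ({addr}) refused the connection")
    finally:
        sock.close()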
BEUC, a European consumer organisation, has today joined the effort against the manufacturers by complaining to the European Commission, the EU network of national data protection authorities, and the International Consumer Protection and Enforcement Network.

It should be noted that Danelle Dobbins, then a Master’s student at Washington University in St. Louis, wrote about Cayla’s glaring security problems in a 2015 paper. Dobbins draws attention to the work of Ken Munro, a security specialist who hacked Cayla at the beginning of 2015, as seen in the video below (via the BBC).

The complaint further notes that children are being surreptitiously marketed to:

Researchers discovered that My Friend Cayla is pre-programmed with dozens of phrases that reference Disneyworld and Disney movies. For example, Cayla tells children that her favorite movie is Disney’s The Little Mermaid and her favorite song is “Let it Go,” from Disney’s Frozen. Cayla also tells children she loves going to Disneyland and wants to go to Epcot in Disneyworld.

This product placement is not disclosed and is difficult for young children to recognize as advertising. Studies show that children have a significantly harder time identifying advertising when it’s not clearly distinguished from programming.

The toys’ voice recognition feature comes from Nuance, who also offers products and services to law enforcement and intelligence agencies. The most disturbing element of the complaint is the suggestion that children’s personal data and interactions could end up being used in the development of Nuance’s intelligence and law enforcement products:

Nuance uses the voice and text information it collects to “develop, tune, enhance, and improve Nuance services and products.”… Nuance’s products and services include voice biometric solutions sold to military, intelligence, and law enforcement agencies…. The use of children’s voice and text information to enhance products and services sold to military, intelligence, and law enforcement agencies creates a substantial risk of harm because children may be unfairly targeted by these organizations if their voices are inaccurately matched to recordings obtained by these organizations.

This could be one of those moments that causes a policy reaction. While negative press may have an impact on the individual companies and their sectors, the only methods that can truly help prevent more of these kinds of unsafe products are regulation and the threat of lawsuits. Let’s hope that policymakers and regulators use this opportunity to scare other toy makers, demonstrate the power of sanction, punish the bad actors, and increase the potency of data security and children’s safety regulation.

Coalitions & Consortia Data Protection Intimacy Law Privacy by Design Security Toys

Usable Security (USEC) Workshop 2017

Abstract Submission Deadline: 7 December 2016
Full Paper Submission Deadline: 14 December 2016

Notification: 21 January 2017
Camera ready copy due: 31 January 2017
Workshop: 26 February 2017 (co-located with NDSS 2017) at the Catamaran Resort Hotel & Spa in San Diego, CA

Conference Website: http://www.dcs.gla.ac.uk/~karen/usec/

Submission Instructions: http://www.dcs.gla.ac.uk/~karen/usec/submission.html

One cannot have security and privacy without considering both the technical and human aspects thereof. If the user is not given due consideration in the development process, the system is unlikely to enable users to protect their privacy and security on the Internet.

Usable security and privacy is more complicated than traditional usability, because traditional usability principles cannot always be applied. For example, one of the cornerstones of usability is that people are given feedback on their actions and are helped to recover from errors. In authentication, we obfuscate password entry (a usability failure) and give people no assistance in recovering from errors. Moreover, security is often not related to the actual functionality of the system, so people often see it as a bolt-on and an annoying hurdle. These and other usability challenges of security are the focus of this workshop.

We invite submissions on all aspects of human factors, including mental models, adoption, and usability, in the context of security and privacy. USEC 2017 aims to bring together researchers already engaged in this interdisciplinary effort with other computer science researchers in areas such as visualization, artificial intelligence, machine learning and theoretical computer science, as well as researchers from other domains such as economics, law, the social sciences, and psychology. We particularly encourage collaborative research from authors in multiple disciplines.

Topics include, but are not limited to:

  • Human factors related to the deployment of the Internet of Things (New topic for 2017)
  • Usable security / privacy evaluation of existing and/or proposed solutions
  • Mental models that contribute to, or complicate, security or privacy
  • Lessons learned from designing, deploying, managing or evaluating security and privacy technologies
  • Foundations of usable security and privacy incl. usable security and privacy patterns
  • Ethical, psychological, sociological, economic, and legal aspects of security and privacy technologies

We also encourage submissions that contribute to the research community’s knowledge base:

  • Reports of replicating previously published studies and experiments
  • Reports of failed usable security studies or experiments, with the focus on the lessons learned from such experience

Academics Security

[Infographic: Arxan on connected-car security]

I like this infographic, though with due respect to the authors, I’m skeptical of the claim that ‘connected cars’ (as if there’s only one thing called a connected car) have 10 times as much code as a Boeing 787. But I’m nitpicking. I appreciate that this graphic specifically calls out the OBD-II port as a worry spot, as well as noting that insurance dongles lack security. It would be great to do security analysis on all existing dongles in significant circulation to see how bad things really are. I also quite liked this: “LTE coverage and Wifi in the car expose you to the same vulnerabilities as a house on wheels.” That’s simple and effective writing – bravo Arxan.
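For the curious, here is a minimal Python sketch of why the OBD-II port deserves that worry: anything plugged into it can read live vehicle data with no authentication at all. It uses the python-OBD library; the specific commands queried are just examples.

import obd  # python-OBD

connection = obd.OBD()  # auto-detects a plugged-in OBD-II adapter

# No credentials, no pairing: the bus simply answers.
for cmd in (obd.commands.SPEED, obd.commands.RPM, obd.commands.COOLANT_TEMP):
    response = connection.query(cmd)
    if not response.is_null():
        print(f"{cmd.name}: {response.value}")

connection.close()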

The Recommendations at the bottom are aimed at consumers. They’re all reasonable, and this is the first time I’m seeing “Don’t jailbreak your car.” Again, good on you, Arxan. I’m amused by the suggestion to check your outlets periodically and make sure you know what’s installed. It’s like a combination of encouraging safe sex for your car and warning you to watch out for spoofed ATMs.

Arxan is, however, a B2B company, so I would like to see industry recommendations in addition to consumer ones. Of course, those suggestions are part of the services they offer, so they can’t give away too much for free, but a few pearls of wisdom would still be welcome. I know it’s too much to ask for policy-oriented suggestions – especially ones that raise costs – so here are a few:

  • Security Impact Analysis should be a regulatory requirement for all cars that rise above a certain threshold of connectivity (a topic for exploration)

  • Strengthen data breach notification laws (a general suggestion, not just for cars or IoT)

  • Car companies should be required to have CISOs

Data Ownership Data Protection Policy Security User Control

UC Berkeley’s Center for Long-Term Cybersecurity released “Cybersecurity Futures 2020,” a set of scenarios meant to spur conversations about the future of cybersecurity and related topics. Dr. Gilad Rosner, founder of the IoT Privacy Forum, was one of the contributors to the Intentional Internet of Things scenario, which provokes discussion with this image of the future:

“While the widespread adoption of IoT technologies may be predictable in 2016, the mechanism that will propel this shift is less so. In this scenario, government will intentionally drive IoT adoption to help societies combat recalcitrant large-scale problems in areas like education, the environment, public health, and personal well-being. This will be widely seen as beneficial, particularly as the technologies move quickly from being household novelties to tools for combating climate change and bolstering health. ‘Smart cities’ will transition from hype to reality as urban areas adapt to the IoT with surprising speed. In this world, cybersecurity will fade as a separate area of interest; when digitally connected technologies are part of everyday life, their security is seen as inseparable from personal and national security. But while this world will offer fantastic benefits for public life and reinvigorate the role of governments, there will also be greater vulnerability as IoT technologies become more foundational to government functions and the collective good.” (from: https://cltc.berkeley.edu/scenario/scenario-four/)

Main page: https://cltc.berkeley.edu/scenarios/

Intro and Executive Summary: https://cltc.berkeley.edu/files/2016/04/intro_04-27-04a_pages.pdf

Full report: https://cltc.berkeley.edu/files/2016/04/cltcReport_04-27-04a_pages.pdf

 

Academics Policy Privacy by Design Security

Ashkan Soltani, the FTC’s Chief Technologist, penned a good article on the particular security challenges of cheap, connected, low-power devices. He uses the venerable refrigerator example to get some important questions across:

“… a refrigerator was once just a refrigerator with one purpose: cooling food. Now that we live in an IoT world, embedded inside that refrigerator is a full-fledged network computer which could potentially be exploited to launch a DDOS attack against the consumer (or some external) network. As the technology behind the household items we buy evolves, so must the way we think about the long-term effect to consumers when they purchase them:

What will be the level of security and support while under warranty? If a critical vulnerability is discovered, will an update be provided? What happens after the warranty expires? Should modern refrigerators have a shelf-life, much like the food contained within?”

http://www.ftc.gov/news-events/blogs/techftc/2015/02/whats-security-shelf-life-iot

Security