
The 2nd European Workshop on Usable Security (EuroUSEC) will be an affiliated workshop at the 2nd IEEE European Symposium on Security and Privacy (EuroS&P) on April 29, 2017 in Paris at UPMC Campus Jussieu.

EuroUSEC is soliciting “previously unpublished work offering novel research contributions in any aspect of human factors in security and privacy for end-users and IT professionals,” including but not limited to:

  • innovative security or privacy functionality and design
  • new applications of existing models or technology
  • field studies of security or privacy technology
  • usability evaluations of new or existing security or privacy features
  • security testing of new or existing usability features
  • longitudinal studies of deployed security or privacy features
  • studies of administrators or developers and support for security and privacy
  • psychological, sociological and economic aspects of security and privacy
  • the impact of organizational policy or procurement decisions
  • methodology for usable security and privacy research
  • lessons learned from the deployment and use of usable privacy and security features
  • reports of replicating previously published studies and experiments
  • reports of failed usable privacy/security studies or experiments, with focus on the lessons learned

The submission deadline is March 17, 2017 and full instructions are published on the event homepage.

All affiliated workshops are listed on the EuroS&P 2017 homepage.

Academics Conference HCI Privacy by Design Privacy Impact Assessment Security


Challenge.gov, the official organizer of US federal tech/science competitions, has unveiled the Privacy Policy Snapshot Challenge with a top prize of $20,000 (runners-up: $10k and $5k).

Submissions are being accepted until April 10, 2017.

In their own words, they

call for designers, developers, and health data privacy experts to create an online Model Privacy Notice (MPN) generator. The MPN is a voluntary, openly available resource designed to help health technology developers who collect digital health data clearly convey information about their privacy and security policies to their users. Similar to a nutrition facts label, the MPN provides a snapshot of a product’s existing privacy practices, encouraging transparency and helping consumers make informed choices when selecting products. The MPN does not mandate specific policies or substitute for more comprehensive or detailed privacy policies.

An effective MPN would have to distill nuanced information about multiple stakeholders’ complex data collection, retention, sharing, and usage practices. It must also rank a variety of objective facts about devices, their documentation, and how consent is obtained. Crucially, it should anticipate ways in which manufacturers might try to game the evaluation system, and mitigate those possibilities.
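
To make the generator idea concrete, here is a minimal Python sketch of such a tool’s core: a structured record of a product’s practices rendered as a nutrition-label-style snapshot. The fields, product name, and label format are our own assumptions for illustration, not the challenge’s actual MPN template.

```python
from dataclasses import dataclass

@dataclass
class PrivacyPractices:
    """Hypothetical snapshot of a health product's privacy practices."""
    product: str
    data_collected: list[str]   # e.g. heart rate, sleep, location
    shared_with: list[str]      # third parties receiving the data
    retention: str              # how long data is kept
    user_can_delete: bool
    sold_to_advertisers: bool

def render_notice(p: PrivacyPractices) -> str:
    """Render a nutrition-label-style privacy snapshot as plain text."""
    return "\n".join([
        f"Privacy Notice: {p.product}",
        "-" * 40,
        "Data collected:      " + ", ".join(p.data_collected),
        "Shared with:         " + (", ".join(p.shared_with) or "no one"),
        f"Retention:           {p.retention}",
        f"You can delete data: {'yes' if p.user_can_delete else 'no'}",
        f"Sold to advertisers: {'yes' if p.sold_to_advertisers else 'no'}",
    ])

print(render_notice(PrivacyPractices(
    product="FitTrack Band",          # invented product
    data_collected=["heart rate", "sleep", "location"],
    shared_with=["analytics vendor"],
    retention="2 years after account closure",
    user_can_delete=True,
    sold_to_advertisers=False,
)))
```

The hard design work, as noted above, is choosing which objective facts belong on the label and wording them so they resist gaming.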

Though this challenge only considers technology collecting health data, it will be instructive for similar initiatives in many other IoT fields. It’s a useful step in supporting the right of consumers to have transparent information about diverse privacy and security practices.

Full conditions and requirements can be found on the contest homepage.

Data Protection Health Privacy Policies Security Transparency User Control Wearables


The Federal Trade Commission (FTC) is offering $25,000 (and runner-up prizes) for a “technical solution” that would protect consumers from the security risks of running out-of-date software on IoT devices in their homes.

The contest reflects growing concern about the security and privacy vulnerabilities of billions of connected devices, and the FTC hopes the winning entries will benefit the entire IoT spectrum, which extends far beyond the connected appliances, meters, screens, toys, and gadgets expected to populate the residential home of the future.

The FTC’s press release states:

An ideal tool might be a physical device that the consumer can add to his or her home network that would check and install updates for other IoT devices on that home network, or it might be an app or cloud-based service, or a dashboard or other user interface. Contestants also have the option of adding features such as those that would address hard-coded, factory default or easy-to-guess passwords.

Such solutions could be scalable to entire workplaces, offering widespread protection against security threats.
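
As a rough illustration of the app-or-dashboard variant, the Python sketch below audits a list of home devices against a firmware feed and flags stale versions and factory-default passwords. The inventory, the feed, and the password list are all invented for the example; a real tool would discover devices on the network and query vendor update services.

```python
# Hypothetical update/password auditor for home IoT devices.
# No real vendor API is assumed; all data below is invented.

KNOWN_WEAK_PASSWORDS = {"admin", "password", "12345", ""}

LATEST_FIRMWARE = {                   # stand-in for a vendor update feed
    "acme-cam": "2.1.0",
    "acme-thermostat": "1.4.2",
}

home_devices = [                      # stand-in for a network scan result
    {"model": "acme-cam", "firmware": "1.9.3", "password": "admin"},
    {"model": "acme-thermostat", "firmware": "1.4.2", "password": "k3!x9zQ"},
]

def audit(devices):
    """Yield a warning for each out-of-date or weakly secured device."""
    for d in devices:
        latest = LATEST_FIRMWARE.get(d["model"])
        if latest and d["firmware"] != latest:
            yield f"{d['model']}: firmware {d['firmware']} is behind {latest}"
        if d["password"] in KNOWN_WEAK_PASSWORDS:
            yield f"{d['model']}: factory-default or easy-to-guess password"

for warning in audit(home_devices):
    print(warning)
```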

Contest submissions will be accepted from March 1 until noon on May 22, 2017. See the challenge homepage for further details.

The IoT Privacy Forum encourages more such government contests addressing privacy and security concerns in the IoT. Since privacy is more often a cost center than a revenue source, money and attention from government actors are a great way to stimulate markets and technology.

Data Protection Policy Privacy by Design Security Smart Home


Researchers at American University and the Center for Digital Democracy have today released a report on wearable eHealth devices, which represent a rapidly growing IoT sector.

Titled Health Wearable Devices in the Big Data Era: Ensuring Privacy, Security & Consumer Protection (download PDF here), the 122 pages cover privacy and security threats, the Big Data marketplace, predictive/targeting methods, the legal and regulatory environment, and an extensive section on promoting ethical data practices. The intro to the report states:

The report documents a number of current digital health marketing practices that threaten the privacy of consumer health information, including condition targeting, look-alike modeling, predictive analytics, scoring, and the real-time buying and selling of individual consumers.

The potential range of intensely personal data obtainable from wearable (not to mention implantable) devices is what makes them such a potent marketing tool:

An emerging set of techniques will be designed to harness the unique capabilities of wearables—such as biosensors that track bodily functions, and “haptic technology” that enables users to “feel” actual body sensations. Pharmaceutical companies are poised to be among the major beneficiaries of wearable marketing. (p.4)

Recognizing the cost-saving and preventative benefits of eHealth devices, the report calls urgently for “meaningful, effective and enforceable safeguards” at the foundations of the connected-health system. Regulation in the U.S. is currently “weak and fragmented,” it notes, and is totally unprepared for sophisticated technologies capable of “unprecedented” data collection.

Data Ownership Data Protection Intimacy Law Policy Privacy by Design Wearables


A former Uber employee is suing the company for whistleblower retaliation, exposing a startling set of claims about data privacy practices within the San Francisco-based corporation. Ward Spangenberg, 45, is a seasoned infosec expert who reportedly discovered extremely lax policies on data protection, retention, and security, and found that near-universal internal access to detailed personal information compromises all Uber riders.

First up in Spangenberg’s declaration is that “payroll information for all Uber employees was contained in an unsecured Google spreadsheet”.

He says that Uber collects “a myriad of data” about its customers, including names, emails, social security numbers, locations, device types, and “other data that the user may or may not know they were even providing to Uber by requesting a ride”. Furthermore,

Uber’s lack of security regarding its customer data was resulting in Uber employees being able to track high-profile politicians, celebrities and even personal acquaintances of Uber employees, including ex-boyfriends/girlfriends, and ex-spouses. I also reported that […] allowing all employees to access this information (as opposed to a small security team) was resulting in a violation of governmental regulations regarding data protection and consumer privacy rights.

Such a wealth of personal information, available to all “without regard to any particular employment or security clearance”, would make a mockery of Uber’s Vulnerability Management Policy, which “specifically stated, in writing” that:

the policy could not be followed if Uber deemed there was a “legitimate business purpose” for not doing so, or if a Director level employee or above permitted such an exception.
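
For contrast, here is a minimal sketch, with invented names, of the kind of least-privilege check the declaration describes as absent: rider records gated to a small security team rather than open to all employees.

```python
# Hypothetical least-privilege gate of the kind the declaration says
# was missing. The team roster and function name are invented.
SECURITY_TEAM = {"alice", "bob"}

def can_view_rider_record(employee: str, has_active_case: bool) -> bool:
    """Allow access only to security-team members working a live case."""
    return employee in SECURITY_TEAM and has_active_case

assert can_view_rider_record("alice", has_active_case=True)
assert not can_view_rider_record("mallory", has_active_case=True)  # everyone else refused
```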

Finally, Uber “routinely deleted files which were subject to litigation holds,” while its Incident Response Team

would be called when governmental agencies raided Uber’s offices due to concerns regarding noncompliance with governmental regulations. In those instances, Uber would lock down the office and immediately cut all connectivity so that law enforcement could not access Uber’s information. I would then be tasked with purchasing all new equipment for the office within the day, which I did when Uber’s Montreal office was raided.

Spangenberg was reportedly “also a point person when foreign government agencies raided company offices abroad,” remotely encrypting office computers from Uber’s San Francisco HQ.

“My job was to just make sure that any time a laptop was seized, the protocol locked the laptops up,” he said.

You can read Will Evans’s excellent article on the story here. Ward Spangenberg’s full declaration can be read here.

Connected Cars Data Protection Privacy by Design Realpolitik Security


Professor Katherine Isbister at the University of California Santa Cruz’s Computational Media department is looking for new graduate students (Masters and Ph.D.) to join her group for Fall 2017.

Interested students should apply to the program using the university’s website (see below) and also send an email to [email protected] to indicate interest. Helpful information when sending an email of interest includes:

  • a description of research interests and skillset(s) related to developing and studying tangibles and wearables for play
  • a curriculum vitae
  • a copy of your unofficial transcript

Social Emotional Technology Lab
The mission of the newly founded Social Emotional Technology Lab at UC Santa Cruz is to build and study human-computer interaction technologies that enhance social and emotional experience. Our work takes place at the intersection of games and HCI research and practice. A partial list of projects can be found here: http://www.katherineinterface.com/page3/page3.html

Focal Projects
New graduate researchers are particularly sought to take part in research and development of a) playful tangible technology for self-regulation (‘Fidget Widgets’) and b) wearables to support collocated play. Recent publications from both projects are available upon request.

Relevant Skillsets
Programming and hardware prototyping
Experience designing and implementing games and playful systems
User research

Computational Media at UC Santa Cruz
Computational Media is all around us — video games, social media, interactive narrative, smartphone apps, computer-generated films, personalized health coaching, and more. To create these kinds of media, to deeply understand them, to push them forward in novel directions, requires a new kind of interdisciplinary thinker and maker. The new graduate degrees in Computational Media at UC Santa Cruz are designed with this person in mind.

Application Process
Applications for the new programs opened October 1st and close January 3rd. The GRE general test is required. For more information, please visit:

Academics HCI


Usable Security (USEC) Workshop 2017

Abstract Submission Deadline: 7 December 2016
Full Paper Submission Deadline: 14 December 2016

Notification: 21 January 2017
Camera ready copy due: 31 January 2017
Workshop: 26 February 2017 (co-located with NDSS 2017) at the Catamaran Resort Hotel & Spa in San Diego, CA

Conference Website: http://www.dcs.gla.ac.uk/~karen/usec/

Submission Instructions: http://www.dcs.gla.ac.uk/~karen/usec/submission.html

One cannot have security and privacy without considering both the technical and human aspects thereof. If the user is not given due consideration in the development process, the system is unlikely to enable users to protect their privacy and security on the Internet.

Usable security and privacy is more complicated than traditional usability, because traditional usability principles cannot always be applied. For example, one of the cornerstones of usability is that people are given feedback on their actions and are helped to recover from errors. In authentication, we obfuscate password entry (a usability fail) and give people no assistance to recover from errors. Moreover, security is often unrelated to the actual functionality of the system, so people see it as a bolt-on and an annoying hurdle. These and other usability challenges of security are the focus of this workshop.
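
The password-entry point is easy to see in code. Below is a minimal Python sketch using the standard library’s getpass, which masks input by design: the user gets no feedback while typing and, on a mismatch, no help locating the error.

```python
import getpass

# Masked entry: a mistyped character is invisible until the check fails.
pw = getpass.getpass("Choose a password: ")
confirm = getpass.getpass("Confirm password: ")

if pw != confirm:
    # Saying where the entries differ would leak information,
    # so recovery help is deliberately withheld too.
    print("Passwords do not match. Start over.")
```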

We invite submissions on all aspects of human factors, including mental models, adoption, and usability, in the context of security and privacy. USEC 2017 aims to bring together researchers already engaged in this interdisciplinary effort with other computer science researchers in areas such as visualization, artificial intelligence, machine learning, and theoretical computer science, as well as researchers from other domains such as economics, law, the social sciences, and psychology. We particularly encourage collaborative research from authors in multiple disciplines.

Topics include, but are not limited to:

  • Human factors related to the deployment of the Internet of Things (New topic for 2017)
  • Usable security / privacy evaluation of existing and/or proposed solutions
  • Mental models that contribute to, or complicate, security or privacy
  • Lessons learned from designing, deploying, managing or evaluating security and privacy technologies
  • Foundations of usable security and privacy incl. usable security and privacy patterns
  • Ethical, psychological, sociological, economic, and legal aspects of security and privacy technologies

We also encourage submissions that contribute to the research community’s knowledge base:

  • Reports of replicating previously published studies and experiments
  • Reports of failed usable security studies or experiments, with the focus on the lessons learned from such experience

Academics Security

Dr Gilad Rosner, the IoT Privacy Forum’s founder, was recently interviewed by the European Alliance for Innovation. Dr Rosner will be keynoting at the 3rd EAI International Conference on Safety and Security in Internet of Things, taking place in Paris in October. An excerpt from the interview:

How would you comment on the recent clashes between governments and tech firms, regarding privacy and security?

The internet age has been a windfall for law enforcement and intelligence gathering agencies. The routine collection of personal information through social media, search, mobile phones, and web usage has created enormous databases of people’s movements, activities and communications. All of this comes from commercial endeavor – that is, the internet and social media are propelled by private companies in search of profit. They create the products and they store the data, and so those private data stores represent an irresistible target for data-hungry government entities like the NSA and others.

The ‘government’ is never one thing. Governments are comprised of agencies, interests, and people – overlapping powers and agendas. In the case of law enforcement, different groups have different remits and varying degrees of power. Foreign intelligence gathering is not the same as domestic law enforcement, and the rules that enable and constrain different agencies vary widely. There is a blurry line between lawful and unlawful access to private stores of personal data. The Snowden disclosures gave the world some perspective about just how blurry that line is, and how the governance of intelligence gathering may be porous or insufficient.

Sociologists have noted that states ‘penetrate’ their populations; that people need to be made ‘legible’ so that the state can act upon them. A strong argument can be made that intelligence gathering – for foreign or domestic purposes – is a core characteristic of the modern state. As such, lawful and unlawful access (and who gets to say which is which?) are two sides of the same coin: the state’s desire for information about people. Part of the way liberal democracies are judged is through consideration of their legitimacy. When government actors are accused of hacking into private data stores or otherwise circumventing established legal methods of obtaining access, such as search warrants and subpoenas, that legitimacy is called into question. Still, the line is blurry, and because of the secretive nature of intelligence gathering, it’s difficult to get a complete picture of when agencies are acting within their rights, when company practice facilitates or hinders the transfer of personal data to government actors, and when everyone is acting within ‘normal’ operating procedures.

Read the whole interview here: http://blog.eai.eu/the-blurry-line-between-data-protection-and-coercion/

Academics Conference Data Protection

UC Berkeley’s Center for Long-Term Cybersecurity released “Cybersecurity Futures 2020,” a set of scenarios meant to spur conversations about the future of cybersecurity and related topics. Dr. Gilad Rosner, founder of the IoT Privacy Forum, was one of the contributors to the Intentional Internet of Things scenario, which provokes discussion with this image of the future:

“While the widespread adoption of IoT technologies may be predictable in 2016, the mechanism that will propel this shift is less so. In this scenario, government will intentionally drive IoT adoption to help societies combat recalcitrant large-scale problems in areas like education, the environment, public health, and personal well-being. This will be widely seen as beneficial, particularly as the technologies move quickly from being household novelties to tools for combating climate change and bolstering health. “Smart cities” will transition from hype to reality as urban areas adapt to the IoT with surprising speed. In this world, cybersecurity will fade as a separate area of interest; when digitally connected technologies are part of everyday life, their security is seen as inseparable from personal and national security. But while this world will offer fantastic benefits for public life and reinvigorate the role of governments, there will also be greater vulnerability as IoT technologies become more foundational to government functions and the collective good.”   (from: https://cltc.berkeley.edu/scenario/scenario-four/)

Main page: https://cltc.berkeley.edu/scenarios/

Intro and Executive Summary: https://cltc.berkeley.edu/files/2016/04/intro_04-27-04a_pages.pdf

Full report: https://cltc.berkeley.edu/files/2016/04/cltcReport_04-27-04a_pages.pdf

 

Academics Policy Privacy by Design Security

This Tuesday, Dr Gilad Rosner, founder of the IoT Privacy Forum, will present a free one-hour webcast called Privacy, Society & the Internet of Things. It explores the many meanings of ‘privacy,’ the privacy risks implied by a world of connected devices, and some of the frameworks emerging to address those risks. The webcast will be broadcast live at 10am PT / 1pm ET / 6pm GMT. Register for it here: http://www.oreilly.com/pub/e/3582

Conference Law Policy Privacy by Design