The 2nd European Workshop on Usable Security (EuroUSEC) will be an affiliated workshop at the 2nd IEEE European Symposium on Security and Privacy (EuroS&P) on April 29, 2017 in Paris at UPMC Campus Jussieu.

EuroUSEC is soliciting “previously unpublished work offering novel research contributions in any aspect of human factors in security and privacy for end-users and IT professionals,” including but not limited to:

  • innovative security or privacy functionality and design
  • new applications of existing models or technology
  • field studies of security or privacy technology
  • usability evaluations of new or existing security or privacy features
  • security testing of new or existing usability features
  • longitudinal studies of deployed security or privacy features
  • studies of administrators or developers and support for security and privacy
  • psychological, sociological and economic aspects of security and privacy
  • the impact of organizational policy or procurement decisions
  • methodology for usable security and privacy research
  • lessons learned from the deployment and use of usable privacy and security features
  • reports of replicating previously published studies and experiments
  • reports of failed usable privacy/security studies or experiments, with focus on the lessons learned

The submission deadline is March 17, 2017 and full instructions are published on the event homepage.

All affiliated workshops are listed on the EuroS&P 2017 homepage.



Professor Katherine Isbister at the University of California Santa Cruz’s Computational Media department is looking for new graduate students (Masters and Ph.D.) to join her group for Fall 2017.

Interested students should apply to the program using the university’s website (see below) and also send an email to [email protected] to indicate interest. Helpful information when sending an email of interest includes:

  • a description of research interests and skillset(s) related to developing and studying tangibles and wearables for play
  • a curriculum vitae
  • a copy of your unofficial transcript

Social Emotional Technology Lab
The mission of the newly founded Social Emotional Technology Lab at UC Santa Cruz is building and studying human computer interaction technologies that enhance social and emotional experience. Our work takes place at the intersection of games and HCI research and practice. A partial list of projects can be found here: http://www.katherineinterface.com/page3/page3.html

Focal Projects
New graduate researchers are particularly sought to take part in research and development of a) playful tangible technology for self-regulation (‘Fidget Widgets’) and b) wearables to support collocated play. Recent publications from both projects are available upon request.

Relevant Skillsets
Programming and hardware prototyping
Experience designing and implementing games and playful systems
User research

Computational Media at UC Santa Cruz
Computational Media is all around us — video games, social media, interactive narrative, smartphone apps, computer-generated films, personalized health coaching, and more. Creating these kinds of media, deeply understanding them, and pushing them forward in novel directions requires a new kind of interdisciplinary thinker and maker. The new graduate degrees in Computational Media at UC Santa Cruz are designed with this person in mind.

Application Process
Applications for the new programs opened October 1st and close January 3rd. The GRE general test is required. For more information, please visit:



Usable Security (USEC) Workshop 2017

Abstract Submission Deadline: 7 December 2016
Full Paper Submission Deadline: 14 December 2016

Notification: 21 January 2017
Camera ready copy due: 31 January 2017
Workshop: 26 February 2017 (co-located with NDSS 2017) at the Catamaran Resort Hotel & Spa in San Diego, CA

Conference Website: http://www.dcs.gla.ac.uk/~karen/usec/

Submission Instructions: http://www.dcs.gla.ac.uk/~karen/usec/submission.html

One cannot have security and privacy without considering both the technical and human aspects thereof. If the user is not given due consideration in the development process, the system is unlikely to enable users to protect their privacy and security on the Internet.

Usable security and privacy is more complicated than traditional usability, because traditional usability principles cannot always be applied. For example, one of the cornerstones of usability is that people are given feedback on their actions and are helped to recover from errors. In authentication, we obfuscate password entry (a usability failure) and give people no assistance in recovering from errors. Moreover, security is often not related to the actual functionality of the system, so people often see it as a bolt-on and an annoying hurdle. These and other usability challenges of security are the focus of this workshop.

We invite submissions on all aspects of human factors, including mental models, adoption, and usability, in the context of security and privacy. USEC 2017 aims to bring together researchers already engaged in this interdisciplinary effort with other computer science researchers in areas such as visualization, artificial intelligence, machine learning, and theoretical computer science, as well as researchers from other domains such as economics, law, sociology, and psychology. We particularly encourage collaborative research from authors in multiple disciplines.

Topics include, but are not limited to:

  • Human factors related to the deployment of the Internet of Things (New topic for 2017)
  • Usable security / privacy evaluation of existing and/or proposed solutions
  • Mental models that contribute to, or complicate, security or privacy
  • Lessons learned from designing, deploying, managing or evaluating security and privacy technologies
  • Foundations of usable security and privacy incl. usable security and privacy patterns
  • Ethical, psychological, sociological, economic, and legal aspects of security and privacy technologies

We also encourage submissions that contribute to the research community’s knowledge base:

  • Reports of replicating previously published studies and experiments
  • Reports of failed usable security studies or experiments, with the focus on the lessons learned from such experience


IoT Privacy Forum founder Gilad Rosner features in the latest episode of O’Reilly’s Hardware Podcast, discussing his research for the British government, and the differences between European and US privacy cultures.

On his work used by the UK Parliament (paraphrased in parts):

That research was when the UK government put out a call and said, we’d like to vacuum up a lot of social media data and analyze it for government purposes: “beneficial outcomes” rather than law enforcement. Trying to look at data and come up with new information that would theoretically be beneficial to society. They were wondering how they’d go about it — whether “public” social media posts could present ethical problems when inhaling all that data for analysis. The answer is: yes, there are ethical problems here, because even though information is set to “public”, there’s a concept of respecting the context in which the data was uploaded, or volunteered. When I tweet, I’m not necessarily expecting the government to mine that information about me.

When it comes to privacy and data protection, especially with a large actor like the government, one of the most important concerns is procedural safeguards. Governments have ideas all the time, often good ideas, but the apparatus of implementing these ideas is very large, bureaucratic, and diffuse. So what constrains these activities, to make sure they’re being done securely, in line with existing privacy regulations, and with people being sensitive to things not necessarily covered by regulation, but still potentially worrisome? How do we come up with ways of letting good ideas happen, but under control?


I’m very happy to announce the publication of a new report: Privacy and the Internet of Things. Published by O’Reilly Media, the report explores the privacy risks implied by increasing numbers of devices in the human environment, and the historical and emerging frameworks to address them. It’s a free report, available for download here:


In this report, you will:

  • Learn the various definitions of the Internet of Things
  • Explore the meaning of privacy and survey its mechanics and methods from American and European perspectives
  • Understand the differences between privacy and security in the IoT
  • Examine major privacy risks implied by the proliferation of connected devices
  • Review existing and emerging frameworks for addressing IoT privacy risks
  • Find resources for further reading and research into IoT privacy

I’d be very happy to discuss any of the report’s content. Please feel free to email me at gilad(at)iotprivacyforum.org.



“Implementing transparency and control in the context of IoT raises a number of challenges. Addressing these challenges is the main goal of the UPRISE-IoT European (CHIST-ERA) project in which this PhD will be conducted.

Among these challenges, specific attention will be paid to the following topics:

  • Analysis of the physical environment of the users to get an accurate picture of the devices surrounding them and the data that they collect.
  • Analysis of the purposes of these collections (how the data are supposed to be used) and their legal basis (e.g. the privacy notices of the entities collecting the data).
  • Analysis of the potential privacy risks posed by these data collections.
  • Definition of a framework for interacting with the users. This framework should make it possible for users to get a good understanding of the above information and to express their wishes (e.g. through user-centric privacy policies) in a user-friendly and unambiguous way.

The PhD project will be conducted in collaboration with the members of the Inria PRIVATICS group and the other partners of the UPRISE-IoT project. It will not necessarily address all the above topics, and the specific focus will be adjusted in agreement with the successful candidate, based on their expertise and motivation.


    The thesis will be located in the Inria Rhône-Alpes Research Center, either in Grenoble or in Lyon (south-east of France).

    Required skills:

    The candidate should have a Master’s degree in computer science or a related field. Knowledge and motivation for one of the following fields would be appreciated: networks, privacy, security, human computer interaction.
    Knowledge of French is not required.”



Dr Gilad Rosner, the IoT Privacy Forum’s founder, was recently interviewed by the European Alliance for Innovation. Dr Rosner will be keynoting at the 3rd EAI International Conference on Safety and Security in Internet of Things, taking place in Paris in October. An excerpt from the interview:

How would you comment on the recent clashes between governments and tech firms, regarding privacy and security?

The internet age has been a windfall for law enforcement and intelligence gathering agencies. The routine collection of personal information through social media, search, mobile phones, and web usage has created enormous databases of people’s movements, activities and communications. All of this comes from commercial endeavor – that is, the internet and social media are propelled by private companies in search of profit. They create the products and they store the data, and so those private data stores represent an irresistible target for data-hungry government entities like the NSA and others.

The ‘government’ is never one thing. Governments are comprised of agencies, interests, and people – overlapping powers and agendas. In the case of law enforcement, different groups have different remits and varying degrees of power. Foreign intelligence gathering is not the same as domestic law enforcement, and the rules that enable and constrain different agencies vary widely. There is a blurry line between lawful and unlawful access to private stores of personal data. The Snowden disclosures gave the world some perspective about just how blurry that line is, and how the governance of intelligence gathering may be porous or insufficient.

Sociologists have noted that states ‘penetrate’ their populations; that people need to be made ‘legible’ so that the state can act upon them. A strong argument can be made that intelligence gathering – for foreign or domestic purposes – is a core characteristic of the modern state. As such, lawful and unlawful access (and who gets to say which is which?) are two sides of the same coin: the state’s desire for information about people. Part of the way liberal democracies are judged is through consideration of their legitimacy. When government actors are accused of hacking into private data stores or otherwise circumventing established legal methods of obtaining access, such as search warrants and subpoenas, that legitimacy is called into question. Still, the line is blurry, and because of the secretive nature of intelligence gathering, it’s difficult to get a complete picture of when agencies are acting within their rights, when company practice facilitates or hinders the transfer of personal data to government actors, and when everyone is acting within ‘normal’ operating procedures.

Read the whole interview here: http://blog.eai.eu/the-blurry-line-between-data-protection-and-coercion/


UC Berkeley’s Center for Long-Term Cybersecurity released “Cybersecurity Futures 2020,” a set of scenarios meant to spur conversations about the future of cybersecurity and related topics. Dr. Gilad Rosner, founder of the IoT Privacy Forum, was one of the contributors to the Intentional Internet of Things scenario, which provokes discussion with this image of the future:

“While the widespread adoption of IoT technologies may be predictable in 2016, the mechanism that will propel this shift is less so. In this scenario, government will intentionally drive IoT adoption to help societies combat recalcitrant large-scale problems in areas like education, the environment, public health, and personal well-being. This will be widely seen as beneficial, particularly as the technologies move quickly from being household novelties to tools for combating climate change and bolstering health. “Smart cities” will transition from hype to reality as urban areas adapt to the IoT with surprising speed. In this world, cybersecurity will fade as a separate area of interest; when digitally connected technologies are part of everyday life, their security is seen as inseparable from personal and national security. But while this world will offer fantastic benefits for public life and reinvigorate the role of governments, there will also be greater vulnerability as IoT technologies become more foundational to government functions and the collective good.”   (from: https://cltc.berkeley.edu/scenario/scenario-four/)

Main page: https://cltc.berkeley.edu/scenarios/

Intro and Executive Summary: https://cltc.berkeley.edu/files/2016/04/intro_04-27-04a_pages.pdf

Full report: https://cltc.berkeley.edu/files/2016/04/cltcReport_04-27-04a_pages.pdf

