Professor Katherine Isbister at the University of California Santa Cruz’s Computational Media department is looking for new graduate students (Masters and Ph.D.) to join her group for Fall 2017.

Interested students should apply to the program using the university’s website (see below) and also send an email to katherine.isbister@ucsc.edu to indicate interest. Helpful information when sending an email of interest includes:

  • a description of research interests and skillset(s) related to developing and studying tangibles and wearables for play
  • a curriculum vitae
  • a copy of your unofficial transcript

Social Emotional Technology Lab
The mission of the newly founded Social Emotional Technology Lab at UC Santa Cruz is to build and study human-computer interaction technologies that enhance social and emotional experience. Our work takes place at the intersection of games and HCI research and practice. A partial list of projects can be found here: http://www.katherineinterface.com/page3/page3.html

Focal Projects
New graduate researchers are particularly sought to take part in research and development of a) playful tangible technology for self-regulation (‘Fidget Widgets’) and b) wearables to support collocated play. Recent publications from both projects are available upon request.

Relevant Skillsets
  • Programming and hardware prototyping
  • Experience designing and implementing games and playful systems
  • User research

Computational Media at UC Santa Cruz
Computational Media is all around us — video games, social media, interactive narrative, smartphone apps, computer-generated films, personalized health coaching, and more. To create these kinds of media, to deeply understand them, to push them forward in novel directions, requires a new kind of interdisciplinary thinker and maker. The new graduate degrees in Computational Media at UC Santa Cruz are designed with this person in mind.

Application Process
Applications for the new programs opened October 1st and close January 3rd. The GRE general test is required. For more information, please visit:

Academics HCI

Usable Security (USEC) Workshop 2017

Abstract Submission Deadline: 7 December 2016
Full Paper Submission Deadline: 14 December 2016

Notification: 21 January 2017
Camera ready copy due: 31 January 2017
Workshop: 26 February 2017 (co-located with NDSS 2017) at the Catamaran Resort Hotel & Spa in San Diego, CA

Conference Website: http://www.dcs.gla.ac.uk/~karen/usec/

Submission Instructions: http://www.dcs.gla.ac.uk/~karen/usec/submission.html

One cannot have security and privacy without considering both the technical and human aspects thereof. If the user is not given due consideration in the development process, the system is unlikely to enable users to protect their privacy and security on the Internet.

Usable security and privacy is more complicated than traditional usability, because traditional usability principles cannot always be applied. For example, one of the cornerstones of usability is that people are given feedback on their actions and are helped to recover from errors. In authentication, we obfuscate password entry (a usability fail) and give people no assistance in recovering from errors. Moreover, security is often not related to the actual functionality of the system, so people often see it as a bolt-on and an annoying hurdle. These and other usability challenges of security are the focus of this workshop.
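
To make the password-entry example above concrete, here is a minimal sketch (in Python, not taken from the CFP, with an illustrative hard-coded password) of the tension the CFP describes: input is masked, and the failure message is deliberately generic, so the user gets no help recovering from a simple typo.

```python
# Minimal sketch of the authentication usability/security tension:
# masked entry plus a deliberately uninformative failure message.
import getpass
import hmac

STORED_PASSWORD = b"correct horse battery staple"  # illustrative only

def login() -> bool:
    password = getpass.getpass("Password: ")  # masked entry: no visual feedback
    # Constant-time comparison is good security practice, but note what the
    # user experiences on failure: no hint about what went wrong or how to fix it.
    if hmac.compare_digest(password.encode("utf-8"), STORED_PASSWORD):
        print("Welcome.")
        return True
    print("Login failed.")  # wrong password? caps lock? the UI won't say
    return False

if __name__ == "__main__":
    login()
```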

We invite submissions on all aspects of human factors, including mental models, adoption, and usability, in the context of security and privacy. USEC 2017 aims to bring together researchers already engaged in this interdisciplinary effort with other computer science researchers in areas such as visualization, artificial intelligence, machine learning, and theoretical computer science, as well as researchers from other domains such as economics, law, the social sciences, and psychology. We particularly encourage collaborative research from authors in multiple disciplines.

Topics include, but are not limited to:

  • Human factors related to the deployment of the Internet of Things (New topic for 2017)
  • Usable security / privacy evaluation of existing and/or proposed solutions
  • Mental models that contribute to, or complicate, security or privacy
  • Lessons learned from designing, deploying, managing or evaluating security and privacy technologies
  • Foundations of usable security and privacy incl. usable security and privacy patterns
  • Ethical, psychological, sociological, economic, and legal aspects of security and privacy technologies

We also encourage submissions that contribute to the research community’s knowledge base:

  • Reports of replicating previously published studies and experiments
  • Reports of failed usable security studies or experiments, with the focus on the lessons learned from such experience

Academics Security

[Infographic: Arxan connected cars (click to expand)]

I like this infographic (click above to expand image) though with due respect to the authors, I’m skeptical about the claim that ‘connected cars’ (as if there’s only one thing called a connected car) have 10 times as much code as a Boeing 787. But I’m nitpicking. I appreciate that this graphic specifically calls out the OBD2 port as a worry spot, as well as noting that insurance dongles lack security. It would be great to do security analysis on all existing dongles in significant circulation to see how bad things really are. I also quite liked this: “LTE coverage and Wifi in the car expose you to the same vulnerabilities as a house on wheels.” That’s simple and effective writing – bravo Arxan.

The Recommendations at the bottom are aimed at consumers. They’re all reasonable, and this is the first time I’m seeing “Don’t jailbreak your car.” Again, good on you, Arxan. I’m amused by the suggestion to check your outlets periodically and make sure you know what’s installed. It’s like encouraging safe sex for your car combined with ‘watch out for spoofed ATMs.’

Arxan is, however, a B2B company, so I would like to see industry recommendations in addition to consumer recommendations. Of course, those suggestions are part of the services they offer, so they can’t give away too much for free, but still – a few pearls of wisdom would be welcome. I know it’s too much to ask for policy-oriented suggestions – especially ones that raise costs – so here are a few:

  • Security Impact Analysis should be a regulatory requirement for all cars that rise above a certain threshold of connectivity (a topic for exploration)

  • Strengthen data breach notification laws (a general suggestion, not just for cars or IoT)

  • Car companies should be required to have CISOs

Data Ownership Data Protection Policy Security User Control

IoT Privacy Forum founder Gilad Rosner features in the latest episode of O’Reilly’s Hardware Podcast, discussing his research for the British government, and the differences between European and US privacy cultures.

On his work used by the UK Parliament (paraphrased in parts):

That research came about when the UK government put out a call and said: we’d like to vacuum up a lot of social media data and analyze it for government purposes – “beneficial outcomes” rather than law enforcement – trying to look at the data and come up with new information that would theoretically be beneficial to society. They were wondering how to go about it – whether “public” social media posts could present ethical problems when inhaling all that data for analysis. The answer is: yes, there are ethical problems here, because even though information is set to “public”, there’s a concept of respecting the context in which the data was uploaded, or volunteered. When I tweet, I’m not necessarily expecting the government to mine that information about me.

When it comes to privacy and data protection, especially with a large actor like the government, one of the most important concerns is procedural safeguards. Governments have ideas all the time, often good ideas, but the apparatus of implementing these ideas is very large, bureaucratic, and diffuse. So what constrains these activities, to make sure they’re being done securely, in line with existing privacy regulations, and with people being sensitive to things not necessarily covered by regulation, but still potentially worrisome? How do we come up with ways of letting good ideas happen, but under control?

Academics Data Protection Law Policy Power Transparency

I’m happy to report that the IoT Privacy Forum will be getting a new website very soon…

Behind the scenes

I’m very happy to announce the publication of a new report: Privacy and the Internet of Things. Published by O’Reilly Media, the report explores the privacy risks implied by increasing numbers of devices in the human environment, and the historical and emerging frameworks to address them. It’s a free report, available for download here:

http://www.oreilly.com/iot/free/privacy-and-the-iot.csp

In this report, you will:

  • Learn the various definitions of the Internet of Things
  • Explore the meaning of privacy and survey its mechanics and methods from American and European perspectives
  • Understand the differences between privacy and security in the IoT
  • Examine major privacy risks implied by the proliferation of connected devices
  • Review existing and emerging frameworks for addressing IoT privacy risks
  • Find resources for further reading and research into IoT privacy

I’d be very happy to discuss any of the report’s content. Please feel free to email me at gilad(at)iotprivacyforum.org.

Academics Data Protection Policy Privacy by Design

“Implementing transparency and control in the context of IoT raises a number of challenges. Addressing these challenges is the main goal of the UPRISE-IoT European (CHIST-ERA) project in which this PhD will be conducted.

Among these challenges, specific attention will be paid to the following topics:

  • Analysis of the physical environment of the users to get an accurate picture of the devices surrounding them and the data that they collect.
  • Analysis of the purposes of these collections (how the data are supposed to be used) and their legal basis (e.g. the privacy notices of the entities collecting the data).
  • Analysis of the potential privacy risks posed by these data collections.
  • Definition of a framework for interacting with the users. This framework should make it possible for users to get a good understanding of the above information and to express their wishes (e.g. through user-centric privacy policies) in a user-friendly and non-ambiguous way.

The PhD project will be conducted in collaboration with the members of the Inria PRIVATICS group and the other partners of the UPRISE-IoT project. It will not necessarily address all the above topics, and the specific focus will be adjusted in agreement with the successful candidate, based on their expertise and motivation.

    Location:

    The thesis will be located in the Inria Rhône-Alpes Research Center, either in Grenoble or in Lyon (south-east of France).

    Required skills:

    The candidate should have a Master’s degree in computer science or a related field. Knowledge and motivation for one of the following fields would be appreciated: networks, privacy, security, human computer interaction.
    Knowledge of French is not required.”

https://cappris.inria.fr/wp-content/uploads/2013/03/PhD-position-UPRISE.pdf
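
The posting mentions “user-centric privacy policies” without specifying them further. As a purely illustrative sketch – not part of the UPRISE-IoT project, with all names hypothetical – such a policy might be modeled as a per-user set of rules over data types and collection purposes:

```python
# Illustrative sketch of a machine-readable, user-centric privacy policy:
# the user records, per data type and purpose, whether collection by
# surrounding devices is permitted. All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Rule:
    data_type: str   # e.g. "location", "audio"
    purpose: str     # e.g. "analytics", "advertising"
    allowed: bool

@dataclass
class UserPrivacyPolicy:
    user_id: str
    rules: list = field(default_factory=list)
    default_allow: bool = False  # deny unless the user explicitly permits

    def permits(self, data_type: str, purpose: str) -> bool:
        for rule in self.rules:
            if rule.data_type == data_type and rule.purpose == purpose:
                return rule.allowed
        return self.default_allow

# Example: a nearby device asks whether it may collect audio for analytics.
policy = UserPrivacyPolicy("alice", rules=[Rule("audio", "analytics", False)])
print(policy.permits("audio", "analytics"))   # False: collection refused
print(policy.permits("location", "routing"))  # False: default is deny
```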

Academics Transparency User Control

Dr Gilad Rosner, the IoT Privacy Forum’s founder, was recently interviewed by the European Alliance for Innovation. Dr Rosner will be keynoting at the 3rd EAI International Conference on Safety and Security in Internet of Things, taking place in Paris in October. An excerpt from the interview:

How would you comment on the recent clashes between governments and tech firms, regarding privacy and security?

The internet age has been a windfall for law enforcement and intelligence gathering agencies. The routine collection of personal information through social media, search, mobile phones, and web usage has created enormous databases of people’s movements, activities and communications. All of this comes from commercial endeavor – that is, the internet and social media are propelled by private companies in search of profit. They create the products and they store the data, and so those private data stores represent an irresistible target for data-hungry government entities like the NSA and others.

The ‘government’ is never one thing. Governments are composed of agencies, interests, and people – overlapping powers and agendas. In the case of law enforcement, different groups have different remits and varying degrees of power. Foreign intelligence gathering is not the same as domestic law enforcement, and the rules that enable and constrain different agencies vary widely. There is a blurry line between lawful and unlawful access to private stores of personal data. The Snowden disclosures gave the world some perspective about just how blurry that line is, and how the governance of intelligence gathering may be porous or insufficient.

Sociologists have noted that states ‘penetrate’ their populations; that people need to be made ‘legible’ so that the state can act upon them. A strong argument can be made that intelligence gathering – for foreign or domestic purposes – is a core characteristic of the modern state. As such, lawful and unlawful access (and who gets to say which is which?) are two sides of the same coin: the state’s desire for information about people. Part of the way liberal democracies are judged is through consideration of their legitimacy. When government actors are accused of hacking into private data stores or otherwise circumventing established legal methods of obtaining access, such as search warrants and subpoenas, that legitimacy is called into question. Still, the line is blurry, and because of the secretive nature of intelligence gathering, it’s difficult to get a complete picture of when agencies are acting within their rights, when company practice facilitates or hinders the transfer of personal data to government actors, and when everyone is acting within ‘normal’ operating procedures.

Read the whole interview here: http://blog.eai.eu/the-blurry-line-between-data-protection-and-coercion/

Academics Conference Data Protection

UC Berkeley’s Center for Long-Term Cybersecurity released “Cybersecurity Futures 2020,” a set of scenarios meant to spur conversations about the future of cybersecurity and related topics. Dr. Gilad Rosner, founder of the IoT Privacy Forum, was one of the contributors to the Intentional Internet of Things scenario, which provokes discussion with this image of the future:

“While the widespread adoption of IoT technologies may be predictable in 2016, the mechanism that will propel this shift is less so. In this scenario, government will intentionally drive IoT adoption to help societies combat recalcitrant large-scale problems in areas like education, the environment, public health, and personal well-being. This will be widely seen as beneficial, particularly as the technologies move quickly from being household novelties to tools for combating climate change and bolstering health. “Smart cities” will transition from hype to reality as urban areas adapt to the IoT with surprising speed. In this world, cybersecurity will fade as a separate area of interest; when digitally connected technologies are part of everyday life, their security is seen as inseparable from personal and national security. But while this world will offer fantastic benefits for public life and reinvigorate the role of governments, there will also be greater vulnerability as IoT technologies become more foundational to government functions and the collective good.” (from: https://cltc.berkeley.edu/scenario/scenario-four/)

Main page: https://cltc.berkeley.edu/scenarios/

Intro and Executive Summary: https://cltc.berkeley.edu/files/2016/04/intro_04-27-04a_pages.pdf

Full report: https://cltc.berkeley.edu/files/2016/04/cltcReport_04-27-04a_pages.pdf

Academics Policy Privacy by Design Security

This Tuesday, Dr Gilad Rosner, founder of the IoT Privacy Forum, will be doing a free one-hour webcast called Privacy, Society & the Internet of Things. It’s an exploration of the many meanings of ‘privacy,’ the privacy risks implied by a world of connected devices, and some of the frameworks emerging to address those risks. The webcast will be broadcast live at 10am PT / 1pm ET / 6pm GMT. Register for it here: http://www.oreilly.com/pub/e/3582

Conference Law Policy Privacy by Design