Contact Information
Professor Pamela Briggs
Privacy, Trust and Identity Permissions for Ambient Intelligence

Summary

Our technological future is likely to be centred on the concept of Ambient Intelligence (AmI). The term refers to the convergence of ubiquitous computing, ubiquitous communication, and interfaces that are both socially aware and capable of adapting to the needs and preferences of the user. It evokes a society in which humans will be surrounded by 'always-on', unobtrusive, interconnected intelligent objects, few of which will bear any resemblance to the computing devices of today.

One of the particular challenges of AmI, which marks it out from other e-Society developments, is that the user will be involved in huge numbers of moment-to-moment exchanges of personal data without explicitly sanctioning each transaction. Today we already carry devices (mobile phones, personal digital assistants) that exchange personal information with other devices, but we initiate most of those exchanges ourselves. In the future, devices embedded in the environment, and potentially in the body, will use software agents to communicate seamlessly about any number of things: our present state of health, our preferences for what to eat, our immediate schedule, our credentials, our destination, our need for a taxi to get us there in 10 minutes. Agent technologies will be required to manage this flow of information, and a great deal of exciting technical work is ongoing in the field. But personal and social concerns remain unanswered. How might we instruct these agents about when, where and to whom certain details can be released? Do people want information to be shared in this way? How will people understand and manage the process of information exchange on such a vast scale? Most crucially, how can individuals set and evaluate the privacy, trust and identity permissions that will govern what is being shared?
The AmI challenge is particularly pressing because in future there will be no obvious physical markers to tell us when we move from private to public cyberspaces, so individuals must be given a clearer vision of how and when to control personal data. The aim of this project is to develop a better understanding of the means by which people will seek to control their personal information. It breaks down into the following questions: What permission rules should be implemented in order that seamless information exchange be deemed acceptable in an AmI context? What means should exist for individuals to customise these rules for their own personal circumstances and beliefs? How much information should individuals be given about the data being exchanged? How, and how often, should the process be monitored?

The project will involve four phases. In Phase I, key AmI stakeholders will provide specific scenarios illustrating the ways in which privacy, trust and identity information might be exchanged in the future. In Phase II, the scenarios will be scripted and videotaped. In Phase III, the videotapes will be presented to focus groups drawn from representative sectors of society, with a view to drawing up a robust set of user-generated rules concerning privacy, trust and identity permissions for AmI. Finally, in Phase IV, a survey will validate these rules with a wider sample.

Workshop Call for Papers

For further information about the conference, see http://www.pervasive2006.org
Family and Communication Technology Workshop, May 2007, Northumbria University (E-Society funded) (pdf)
Call for papers for Social Science Computer Review (SSCORE). (pdf)

Publications

Little, L., & Briggs, P. (2007). Pervasive prying? Trust, privacy and identity issues for ubiquitous computing. Paper submitted to the Interact Conference, 2007. (pdf)

Little, L., & Briggs, P. (2006). Using AmI systems for exchanging health information: Considering trust and privacy issues. Paper presented at York. (pdf)

Little, L., Storer, T., Briggs, P., & Duncan, I. (2007). E-voting in an AmI world: Trust, privacy and social implications. Paper submitted to SSCORE special issue. (pdf)

Briggs, P., Little, L., Love, S., Marsh, S., & Coventry, L. (2005). Ambient intelligence: Does private mean public? Panel submission accepted for the BHCI Conference, Edinburgh, September 2005.

Briggs, P., & Marsh, S. (in press). Trust, forgiveness and regret: A psychological model? Paper accepted for Pervasive '06, workshop on Trust, Privacy and Identity Issues for Ambient Intelligence.

Briggs, P., Little, L., Love, S., & Duncan, I. (2006). Privacy, trust and identity issues for ambient intelligence. Workshop to be held at the Pervasive Conference, Dublin.

Little, L., Marsh, S., & Briggs, P. (2006). Trust and privacy permissions for an ambient world. Chapter submitted to R. Song, L. Korba, & G. Yee (Eds.), Trust in e-services: Technologies, practices and challenges.

Little, L., & Briggs, P. (2006). Tumult and turmoil: Privacy in an ambient world. Paper to be presented at the Pervasive workshop on Privacy, Trust and Identity Issues for Ambient Intelligence, Dublin.

Little, L., & Briggs, P. (2006). Investigating privacy in an ambient world. Paper accepted for the workshop on Privacy and HCI, CHI 2006, Montreal, Canada.

Little, L., & Briggs, P. (2005). Designing ambient intelligent scenarios to promote discussion of human values. Paper presented at the Ambient Intelligence workshop, Interact, Rome, September 2005.

Patrick, A., Marsh, S., & Briggs, P. (2005). Designing systems that people will trust. In L. Cranor & S. Garfinkel (Eds.), Designing Secure Systems That People Can Use. New York: O'Reilly & Associates.

Little, L., & Briggs, P. (2006). Tumult and turmoil: Privacy in an ambient world. Paper submitted to the CHI 2006 conference, Montreal, Canada.

Storer, T., Little, L., & Duncan, I. (2006). An exploratory study of voter attitudes towards a pollsterless remote voting system. In D. Chaum, R. Rivest, & P. Ryan (Eds.), IaVoSS Workshop on Trustworthy Elections (WOTE 06) Pre-Proceedings, Robinson College, University of Cambridge, England, pp. 77-86, June 2006. (pdf)