Identity, Trust, and Reputation have always been core concepts in society. Society has evolved a variety of mechanisms and general approaches to support them. However, with the ever-increasing shift to digital systems, there has been a need to reinterpret these concepts. We might legitimately ask: Who do you say you are and why should I believe you? Given you are who you say you are, why should I deal with you anyway? Why should I believe what you tell me? How should I use my (and others') past experiences of dealing with you to infer something about your future (mis)behaviour? Assuming we can answer such questions, what is then facilitated?
The aim of the module is to provide understanding of how we might seek to develop systems that allow us to answer these questions on a principled basis.
The module covers technical and non-technical aspects. For example, achieving anonymity in a system is a highly technical affair, but the goal of anonymity may be highly contentious (raising issues of accountability and privacy, views on which differ between cultures and between individuals within specific cultures). In addition, as digital devices have proliferated, the user has become an integral part of many systems and has emerged as a prime target for attack. Thus, the module addresses issues such as the users' awareness of false claims of identity and their ability to identify such claims.
Module learning outcomes
At the end of the module the student will:
Understand the importance of identity, trust, reputation and related concepts and why they are needed.
Understand how these concepts are interpreted and implemented in modern-day systems and applications.
Be able to identify major threats to identity, trust and reputation in a variety of system types.
Be able to assess the relative merits of specific solution approaches for particular contexts.
Be able to analyse various authentication mechanisms.
Be able to set identification and authentication requirements in an informed manner, and be able to assess the quality of solutions proffered by third parties.
Be aware of leading-edge research in the identified areas of interest and the challenges faced.
Indicative assessment
Task
% of module mark
Essay/coursework
100
Special assessment rules
None
Indicative reassessment
Task
% of module mark
Essay/coursework
100
Module feedback
Students will receive oral feedback during the classroom week, and written feedback on their assessment submission.
Indicative reading
D. Artz and Y. Gil, A survey of trust in computer science and the Semantic Web, Journal of Web Semantics: Science, Services and Agents on the World Wide Web, vol. 5, no. 2, June 2007, pp. 58-71.
M. Burrows, M. Abadi, and R. Needham, A logic of authentication, ACM Transactions on Computer Systems, vol. 8, no. 1, Feb. 1990, pp. 18-36.
K. Hoffman, D. Zage, and C. Nita-Rotaru, A survey of attack and defense techniques for reputation systems, ACM Computing Surveys, vol. 42, no. 1, Dec. 2009, pp. 1-31.
F. Hendrikx, K. Bubendorfer, and R. Chard, Reputation systems: A survey and taxonomy, Journal of Parallel and Distributed Computing, vol. 75, Jan. 2015, pp. 184-197.
D. Gollmann, Computer Security, Third Edition (Chapters 5, 11 and 12), Wiley, 2011.