
2
Framing the Security and Usability Challenges

Talks by Butler Lampson and Donald Norman provided workshop participants with an overview of key challenges related to security and usability. Lampson’s presentation discussed the current state of computer security and its relationship to usability considerations. Norman’s remarks centered on the issue of design as it relates to usability, security, and privacy. The following sections summarize these remarks.

AN OVERVIEW OF THE STATE OF COMPUTER SECURITY (BUTLER LAMPSON)

Computer security today is in bad shape: people worry about it a lot and spend a good deal of money on it, but most systems are insecure. The primary reason for this poor state of computer security, Lampson argued, is that users do not have a model of security that they can understand. Lampson suggested that research is needed to decide whether appropriate models can be elicited from what users already know, or whether there is a need to invent and promote new models.

Metrics play an important role in addressing the state of computer security. Security is about risk management: balancing the loss from breaches against the costs of security. Unfortunately, both are difficult to measure. Cost is partly in dollars budgeted for firewalls, software, and help desks but mostly in the time that users spend typing and resetting passwords, responding to warnings, finding workarounds so that they can do their jobs, and so forth. Frequently the costs and risks are unknown, and there are no easy ways to estimate them.

A proper allocation of economic incentives is essential to improving computer security. Users, administrators, organizations, and vendors respond to the incentives that they perceive. Users just want to get their work done. Without an appropriate understanding of the risks involved and how proper security may help avoid those risks, they view security as a burden, causing them to ignore it or to attempt to work around it. Organizations do not measure the cost of the time that users spend on security and therefore do not demand usable security. Vendors thus have minimal incentive to supply it.

Many people think that security in the real world is based on locks. In fact, real-world security depends mainly on deterrence and hence on the possibility and severity of punishment. The reason that one’s house is not burgled is not that the burglar cannot get through the lock on the front door; rather, it is that the chance of getting caught, while small, together with a significant punishment, makes burglary uneconomic. It is difficult to deter attacks on a computer connected to the Internet because it is difficult to find “the bad guys.” One way to fix this is to communicate only with parties that are accountable, that is, parties that one can punish. There are many different punishments: monetary fines, ostracism from some community, firing, jail, and other options.

Some punishments require identifying the responsible party in the physical world, but others do not. For example, to deter spam, one might reject e-mail unless it is signed by someone known to the receiver or unless it comes with “optional postage” in the form of a link certified by a trusted third party, such as Amazon or the U.S. Postal Service; if one clicks the link, the sender contributes a dollar to a charity.
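
To make the “optional postage” idea concrete, the following is a minimal sketch of what such a receiving policy might look like. The registry of known senders, the signature check, and the postage verifier are all hypothetical stand-ins for the trusted third parties described above; none of these names come from the workshop or from any real mail system.

```python
# Hypothetical sketch of an "accountable e-mail" receiving policy: accept a
# message only if it is signed by a sender the receiver already knows, or if it
# carries "optional postage" -- a token certified by a trusted third party that
# puts something of the sender's at stake. All names below are illustrative.

KNOWN_SENDERS = {"alice@example.org", "bob@example.org"}  # keys exchanged out of band

def verify_signature(message: dict) -> bool:
    """Placeholder: check the message signature against a known sender's key."""
    return message.get("signer") in KNOWN_SENDERS

def verify_postage_token(message: dict) -> bool:
    """Placeholder: a real check would ask the certifying third party."""
    return message.get("postage_token") is not None

def accept_mail(message: dict) -> bool:
    # Accountability first: a signature from someone the receiver knows.
    if verify_signature(message):
        return True
    # Otherwise require postage: the sender has put something at stake.
    if verify_postage_token(message):
        return True
    # No accountable party behind the message -- reject it.
    return False

print(accept_mail({"signer": "alice@example.org"}))  # True: known, accountable sender
print(accept_mail({"postage_token": "tok-123"}))     # True: postage puts something at stake
print(accept_mail({}))                               # False: no accountable party
```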

The choice of safe inputs and the choice of accountable sources are both made by one’s own system, not by any centralized authority. These choices will often depend on information from third parties about identity, reputation, and so forth, but which parties to trust is also one’s own choice. All trust is local.

To be practical, accountability needs an ecosystem that makes it easy for senders to become accountable and for receivers to demand it. If there are just two parties, they can get to know each other in person and exchange signing keys. Because this approach does not scale, there is also a need for third parties that can certify identities or attributes, as they do today for cryptographic keys. This need not hurt anonymity unduly, since the third parties can preserve anonymity except when there is trouble, or accept bonds posted in anonymous cash.

This scheme is a form of access control: you accept input from me only if I am accountable. There is a big practical difference from conventional access control, though, because accountability allows for punishment or for undoing things that should not have been allowed to occur. Auditing is crucial for establishing a chain of evidence, but very permissive access control becomes acceptable because one can deal with misbehavior after the fact rather than prevent it up front.
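
A minimal sketch, under assumptions not in the report, of what “permissive access control plus auditing” might look like in code: requests from accountable principals are allowed broadly, but every decision is appended to an audit log so that misbehavior can be traced, punished, or undone later. The accountability check, the blocked-action list, and the log format are illustrative placeholders.

```python
# Illustrative sketch of permissive, audit-backed access control: allow most
# actions from accountable principals, but record every decision in an
# append-only log that can serve as a chain of evidence after the fact.

import json
import time

AUDIT_LOG = "audit.log"      # assumed append-only log forming the chain of evidence
BLOCKED = {"format_disk"}    # the few actions still denied up front

def is_accountable(principal: str) -> bool:
    """Placeholder: a real system would consult certificates or a reputation service."""
    return principal.endswith("@known-domain.example")

def audit(entry: dict) -> None:
    entry["timestamp"] = time.time()
    with open(AUDIT_LOG, "a") as log:
        log.write(json.dumps(entry) + "\n")

def authorize(principal: str, action: str) -> bool:
    # Permissive by default for accountable parties; everything is logged.
    allowed = is_accountable(principal) and action not in BLOCKED
    audit({"principal": principal, "action": action, "allowed": allowed})
    return allowed
```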

One obvious problem with accountability is that one often wants to communicate with parties about whom one does not know much, such as unknown vendors or gambling sites. To reconcile accountability with the freedom to go anywhere on the Internet, Lampson suggested using two (or more) separate machines: a green machine that demands accountability and a red one that does not.

On the green machine one keeps important things, such as personal, family, and work data, backup files, and so forth. It needs automated management to handle the details of accountability for software and Web sites, but one chooses the manager and decides how high to set the bar: like one’s house or like a bank vault. Of course the green machine is not perfectly secure—no practical machine can be—but it is far more secure than what is generally available today.

On the red machine one lives wild and free, not putting anything there that one really cares about keeping secret or not losing. If anything goes wrong, the red machine is reset to some known state.
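
The green/red split can be illustrated with a small hypothetical sketch: a user-chosen list of accountable sites determines which environment handles a given destination, and the red environment can be reverted to a known-good state at any time. The site list and the reset mechanics are assumptions made for illustration, not anything described at the workshop.

```python
# Hypothetical sketch of the green/red split: important activity goes to the
# "green" environment, which only talks to accountable parties; everything else
# goes to the "red" environment, which holds nothing of value and can simply be
# reset to a known-good snapshot if anything goes wrong.

ACCOUNTABLE_SITES = {"bank.example", "employer.example"}  # chosen by the user, not a central authority

def choose_machine(hostname: str) -> str:
    return "green" if hostname in ACCOUNTABLE_SITES else "red"

def reset_red_machine() -> None:
    """Placeholder: restore the red environment to a known state (e.g., revert a VM snapshot)."""
    print("reverting red machine to baseline snapshot")

if __name__ == "__main__":
    for site in ("bank.example", "sketchy-casino.example"):
        print(site, "->", choose_machine(site))
    reset_red_machine()
```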

Things are so bad for usable security, Lampson concluded, that it will be necessary to give up on perfection and focus on essentials. The problem is rooted in metrics and incentives: neither the cost of providing security nor the cost of going without it is known, so users do not care much about it, and vendors in turn have no incentive to make security usable.

To fix this, it is necessary to measure the cost of security, and especially the time that users spend on it. Simple models of security that users can understand are needed. To make systems trustworthy, accountability is needed, and to preserve freedom, separate green and red machines are needed, to protect things that one really cares about from the wild things that can happen on the Internet.

USABLE SECURITY AND PRIVACY: IT’S A MATTER OF DESIGN (DONALD NORMAN)

Among the recurring questions at the workshop were these: Does added security make things more difficult to use? Will people always resent the extra steps? The answer to both questions is the same: Not necessarily. Consider the physical world of doors and locks mentioned earlier: one can see that they can get in the way of easy access but are tolerated because they seem necessary and because the amount of effort required to open them usually seems reasonable. This example highlights two key design issues: (1) the importance of users (and vendors) understanding the necessity for protection and (2) the reasonableness of the effort required.

Different groups are involved in ensuring the security of a computer system, each group requiring a different form of design assistance. System developers provide the underlying mechanisms, but the information technology (IT) administrators at the various sites determine just how security policies are to be enforced. The IT staff is under considerable pressure from its own administration to reduce security and privacy concerns, but to do so it must be well versed in technology, in the law, in the needs of the user community, and in the psychology of both the legitimate and the illegitimate users. What the community needs, Norman suggested, is a set of standardized scripts, templates, and system tools that allow IT staff to implement best practices in ways that are both effective and efficient, standardizing interactions across systems in order to simplify the life of users while still tailoring the requirements to any special needs of the organization. These tools do not exist today.
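
As a purely hypothetical illustration of the kind of standardized template Norman called for (the report notes that such tools do not exist today), one might imagine a shared policy baseline that IT staff tailor to local needs rather than writing rules from scratch. All field names and defaults below are invented for the sketch.

```python
# Illustrative sketch of a standardized security-policy template: a shared
# baseline of best practices that an organization's IT staff tailors to its
# special needs. The fields and defaults are assumptions for illustration only.

from dataclasses import dataclass, replace
from typing import Optional

@dataclass(frozen=True)
class SecurityPolicyTemplate:
    min_password_length: int = 12
    password_rotation_days: Optional[int] = None   # None: rotate only on suspected compromise
    require_mfa: bool = True
    session_timeout_minutes: int = 30

BASELINE = SecurityPolicyTemplate()

# An organization adjusts the shared baseline instead of inventing its own policy.
hospital_policy = replace(BASELINE, session_timeout_minutes=10)
```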

In the absence of standard guidelines and adequate tools, different systems implement the same policies with very different philosophies and requirements, complicating life for people who must use multiple systems. Developers who lack an understanding of real human behavior tend to impose logical rules and requirements on a bewildered, overwhelmed audience. The users, either not understanding the rationale or simply disagreeing with the necessity for the procedures imposed on them, see these as impediments to accomplishing their jobs. Moreover, the system developers may lack understanding of the clever ruses and social engineering skills of the illegitimate users, who break into systems the easy way: by lying, stealing, and deceiving. The strongest locks in the world do not deter the clever social engineer.

Security and privacy are difficult problems. Norman suggested that a way to improve security is to design systems that are easy to use for their intended purposes or by the intended people, but difficult for unauthorized people or uses. For these purposes, Norman added, one needs to consider components not normally considered in simple product design: means of authenticating identities or authority, needs, and permissions.

It also means undertaking research to ensure that systems are accompanied by a clear and understandable conceptual model, Norman concluded. Individuals do appear willing to adapt to the inconvenience of locks that seem reasonable for protection, but not to those that just get in the way. If people understand why they are required to implement security protocols, they might be more willing to pay a reasonable penalty of inconvenience.


