Information security is a hard economic issue

Ross Anderson is a reputed information security and cryptanalysis expert. He is the author of the paper under review today, an interesting paper written in 2001 – and, as mentioned in a postscript, right around the time of the attacks on the Twin Towers in New York on 11 September of that year.

Dr. Anderson has done research in the economics of information security, and this is still an important topic today – of greater and greater relevance – on which he has published more recent work; but I decided to bring back this earlier paper because of its foundational character, and because it serves as a proper introduction to the topic from an economic perspective for Information Age readers.

The main conclusion that can be drawn from Dr. Ross Anderson’s paper is that the perverse incentives surrounding the economics of computer security, where the advantage or economic gain of attackers is typically greater than that of defenders, call for aligning the incentives in the design of computer security systems with the right economic insights. Since developing optimal security technology is so difficult, aligning these incentives is the only way to make the adoption of that technology rational.

But I will delve into some details of the paper that I consider worth underlining:

Why information security is hard – an economic perspective

 

Abstract:

According to one common view, information security comes down to technical measures. Given better access control policy models, formal proofs of cryptographic protocols, approved firewalls, better ways of detecting intrusions and malicious code, and better tools for system evaluation and assurance, the problems can be solved. The author puts forward a contrary view: information insecurity is at least as much due to perverse incentives. Many of the problems can be explained more clearly and convincingly using the language of microeconomics: network externalities, asymmetric information, moral hazard, adverse selection, liability dumping and the tragedy of the commons.
In the early days of computer systems, the economics of their incentives were assumed to be determined mainly by specific technical processes and protocols. But the rapid development of the software industry, with its strong network of professionals whose incentives could easily be tweaked, made the computer systems industry realize that it was not immune to the well-known problems arising from the network structure of industrial organization. It was vulnerable to well-studied microeconomic phenomena such as network externalities, the Tragedy of the Commons, liability dumping and others. That is the view defended by Dr. Ross Anderson in his paper: the computer security industry is one whose optimal organization cannot be achieved through technical means alone. There was a need to address economic incentives properly. As examples:
(…)
A different kind of incentive failure surfaced in early 2000, with distributed denial of service attacks against a number of high-profile web sites. These exploit a number of subverted machines to launch a large coordinated packet flood at a target. Since many of them flood the victim at the same time, the traffic is more than the target can cope with, and because it comes from many different sources, it can be very difficult to stop [7]. Varian pointed out that this was also a case of incentive failure [20]. While individual computer users might be happy to spend $100 on anti-virus software to protect themselves against attack, they are unlikely to spend even $1 on software to prevent their machines being used to attack Amazon or Microsoft.
(…)
This is an example of what economists refer to as the ‘Tragedy of the Commons’ [15]. If a hundred peasants graze their sheep on the village common, then whenever another sheep is added its owner gets almost the full benefit – while the other ninety-nine suffer only a small decline in the quality of the grazing. So they aren’t motivated to object, but rather to add another sheep of their own and get as much of the grazing as they can. The result is a dustbowl; and the solution is regulatory rather than technical. A typical tenth-century Saxon village had community mechanisms to deal with this problem; the world of computer security still doesn’t. Varian’s proposal is that the costs of distributed denial-of-service attacks should fall on the operators of the networks from which the flooding traffic originates; they can then exert pressure on their users to install suitable defensive software, or, for that matter, supply it themselves as part of the subscription package.

These observations prompted us to look for other ways in which economics and computer security interact.
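The incentive arithmetic in the quoted passage can be sketched numerically. A minimal Python sketch, with illustrative figures of my own choosing (the per-sheep benefit, total grazing cost and number of peasants are assumptions, not numbers from the paper):

```python
# Toy model of the Tragedy of the Commons incentive arithmetic.
# All numbers are illustrative assumptions.

PEASANTS = 100
BENEFIT_PER_SHEEP = 10.0   # value the owner captures from one extra sheep
TOTAL_GRAZING_COST = 50.0  # total degradation an extra sheep imposes on the common

# The grazing cost is spread over everyone, so each peasant bears only a sliver.
cost_per_peasant = TOTAL_GRAZING_COST / PEASANTS

owner_net_gain = BENEFIT_PER_SHEEP - cost_per_peasant       # what the owner sees
village_net_gain = BENEFIT_PER_SHEEP - TOTAL_GRAZING_COST   # what the village sees

print(f"Owner's net gain from adding a sheep:   {owner_net_gain:+.1f}")
print(f"Village's net gain from the same sheep: {village_net_gain:+.1f}")
# Adding a sheep is individually rational (positive for the owner) but
# collectively ruinous (negative overall); every peasant faces the same
# incentive, and the common becomes a dustbowl.
```

The same structure explains the $100-versus-$1 observation above: the anti-virus benefit is captured by the buyer, while the benefit of anti-flooding software is spread over everyone else.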

From network externalities to perverse incentives

 

This led to the realization that computer security systems confront the same issues found in industries such as phone companies, airlines and credit card companies. These issues have mainly to do with network effects and network externalities:

(…)

The more people use a typical network, the more valuable it becomes. The more people use the phone system – or the Internet – the more people there are to talk to and so the more useful it is to each user. This is sometimes referred to as Metcalfe’s law, and is not limited to communication systems. The more merchants take credit cards, the more useful they are to customers, and so the more customers will buy them; and the more customers have them, the more merchants will want to accept them. So while networks can grow very slowly at first – credit cards took almost two decades to take off – once positive feedback gets established, they can grow very rapidly. The telegraph, the telephone, the fax machine and most recently the Internet have all followed this model.
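Metcalfe’s law, as quoted above, can be illustrated in a few lines of Python: with n users there are n(n−1)/2 possible pairwise connections, so this proxy for network value grows roughly as n². Using pair counts as the value proxy is my own illustrative choice, a common reading of the law rather than anything from the paper:

```python
# Metcalfe's law sketch: a network's value grows with the number of
# possible connections between its users, i.e. roughly as n squared.

def metcalfe_value(users: int) -> int:
    """Number of distinct user pairs, a simple proxy for network value."""
    return users * (users - 1) // 2

for n in (10, 100, 1000):
    print(f"{n:>5} users -> {metcalfe_value(n):>7} possible connections")
# Ten times the users yields roughly a hundred times the connections,
# which is why positive feedback, once established, is so strong.
```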

(…)

A good introduction to network economics is by Shapiro and Varian [17]. For our present purposes, there are three particularly important features of information technology markets.

  • First, the value of a product to a user depends on how many other users adopt it.

 

  • Second, technology often has high fixed costs and low marginal costs. The first copy of a chip or a software package may cost millions, but subsequent copies may cost very little to manufacture. This isn’t unique to information markets; it’s also seen in business sectors such as airlines and hotels. In all such sectors, pure price competition will tend to drive revenues steadily down towards the marginal cost of production (which in the case of information is zero). So businesses need ways of selling on value rather than on cost.

 

  • Third, there are often large costs to users from switching technologies, which leads to lock-in. Such markets may remain very profitable, even where (incompatible) competitors are very cheap to produce. In fact, one of the main results of network economic theory is that the net present value of the customer base should equal the total costs of their switching their business to a competitor [19].
[Image: Metcalfe’s law – network effect]
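The lock-in result quoted in the third bullet can be given a rough numerical reading: if switching to a competitor costs a customer S, a vendor can sustain a price premium whose discounted lifetime value is at most S, since charging more would make switching worthwhile. A minimal Python sketch, with illustrative figures of my own (the switching cost, discount rate and horizon are assumptions, not values from the paper):

```python
# Lock-in sketch: the net present value of the premium a vendor can
# extract from a locked-in customer equals that customer's switching
# cost. All figures are illustrative assumptions.

SWITCHING_COST = 300.0   # what it costs a customer to move to a competitor
DISCOUNT_RATE = 0.10     # annual discount rate
YEARS = 30               # horizon over which the premium is collected

# Discount factor for a constant annual payment over YEARS years.
annuity = sum(1 / (1 + DISCOUNT_RATE) ** t for t in range(1, YEARS + 1))

# The largest sustainable annual premium p satisfies p * annuity = S.
max_premium = SWITCHING_COST / annuity
npv = max_premium * annuity

print(f"Sustainable annual premium: {max_premium:.2f}")
print(f"NPV of that premium stream: {npv:.2f} (equals the switching cost)")
```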
Network economics with these kinds of effects often leads to a diverse suite of perverse incentives and behaviours that at a first superficial look might appear irrational, but which, because of those effects, are perfectly rational:
The huge first-mover advantages that can arise in economic systems with strong positive feedback are the origin of the so-called ‘Microsoft philosophy’ of ‘we’ll ship it on Tuesday and get it right by version 3’. Although sometimes attributed by cynics to a personal moral failing on the part of Bill Gates, this is perfectly rational behaviour in many markets where network economics apply.
Another network effect is liability dumping:
(…) In fact, the access controls in Windows NT are often irrelevant, as most applications either run with administrator privilege (or, equivalently, require dangerously powerful operating system services to be enabled). This is also explained simply from the viewpoint of network economics: mandatory security would subtract value, as it would make life more difficult for the application developers. Indeed, Odlyzko observes that much of the lack of user-friendliness of both Microsoft software and the Internet is due to the fact that both Microsoft and the Internet achieved success by appealing to developers. The support costs that Microsoft dumps on users – and in fact even the cost of the time wasted waiting for PCs to boot up and shut down – greatly exceed its turnover [16].
The more technical issues of security administration, and the choice of cryptographic protocol, are also caught in this incentive loop, reinforcing the point made by the author:
Network owners and builders will also appeal to the developers of the next generation of applications by arranging for the bulk of the support costs to fall on users rather than developers, even if this makes effective security administration impractical. One reason for the current appeal of public key cryptography may be that it can simplify development – even at the cost of placing an unreasonable administrative burden on users who are neither able nor willing to undertake it [9]. The technical way to try to fix this problem is to make security administration more ‘user-friendly’ or ‘plug-and-play’; many attempts in this direction have met with mixed success. The more subtle approach is to try to construct an authentication system whose operators benefit from network effects; (…)

Conclusions

 

The rest of the paper contains more interesting topics on economics and information security. These include topics of relevance today, such as information warfare, where the author draws important distinctions between offence and defence incentives, and the ways economic insights can help us understand and respond adequately to situations of information asymmetry or to the dangers of ‘security-by-obscurity’. But I will end this review with the main conclusions of this important paper, conclusions whose implications we are still witnessing in the current state of affairs in matters of information and economic security:

(…)

Much has been written on the failure of information security mechanisms to protect end users from privacy violations and fraud. This misses the point. The real driving forces behind security system design usually have nothing to do with such altruistic goals. They are much more likely to be the desire to grab a monopoly, to charge different prices to different users for essentially the same service, and to dump risk. Often this is perfectly rational.

In an ideal world, the removal of perverse economic incentives to create insecure systems would depoliticize most issues. Security engineering would then be a matter of rational risk management rather than risk dumping. But as information security is about power and money – about raising barriers to trade, segmenting markets and differentiating products – the evaluator should not restrict herself to technical tools like cryptanalysis and information flow, but also apply economic tools such as the analysis of asymmetric information and moral hazard. As fast as one perverse incentive can be removed by regulators, businesses (and governments) are likely to create two more.

In other words, the management of information security is a much deeper and more political problem than is usually realized; solutions are likely to be subtle and partial, while many simplistic technical approaches are bound to fail. The time has come for engineers, economists, lawyers and policymakers to try to forge common approaches.

Inserted Image: Metcalfe’s Law

Featured Image: Secure Networks: Remember the DMZ in 2012

 
