Why Information Security is Hard
-- An Economic Perspective
Ross Anderson
University of Cambridge Computer Laboratory,
JJ Thomson Avenue, Cambridge CB3 0FD, UK
Ross.Anderson@cl.cam.ac.uk
Abstract
According to one common view, information security comes down to technical measures. Given better access control policy models, formal proofs of cryptographic protocols, approved firewalls, better ways of detecting intrusions and malicious code, and better tools for system evaluation and assurance, the problems can be solved.

In this note, I put forward a contrary view: information insecurity is at least as much due to perverse incentives. Many of the problems can be explained more clearly and convincingly using the language of microeconomics: network externalities, asymmetric information, moral hazard, adverse selection, liability dumping and the tragedy of the commons.
1 Introduction
In a survey of fraud against autoteller machines [4], it was found that patterns of fraud depended on who was liable for them. In the USA, if a customer disputed a transaction, the onus was on the bank to prove that the customer was mistaken or lying; this gave US banks a motive to protect their systems properly. But in Britain, Norway and the Netherlands, the burden of proof lay on the customer: the bank was right unless the customer could prove it wrong. Since this was almost impossible, the banks in these countries became careless. Eventually, epidemics of fraud demolished their complacency. US banks, meanwhile, suffered much less fraud; although they actually spent less money on security than their European counterparts, they spent it more effectively [4].
There are many other examples. Medical payment systems that are paid for by insurers rather than by hospitals fail to protect patient privacy whenever this conflicts with the insurer's wish to collect information about its clients. Digital signature laws transfer the risk of forged signatures from the bank that relies on the signature (and that built the system) to the person alleged to have made the signature. Common Criteria evaluations are not made by the relying party, as Orange Book evaluations were, but by a commercial facility paid by the vendor. In general, where the party who is in a position to protect a system is not the party who would suffer the results of security failure, then problems may be expected.
A different kind of incentive failure surfaced in early 2000, with distributed denial of service attacks against a number of high-profile web sites. These exploit a number of subverted machines to launch a large coordinated packet flood at a target. Since many of them flood the victim at the same time, the traffic is more than the target can cope with, and because it comes from many different sources, it can be very difficult to stop [7]. Varian pointed out that this was also a case of incentive failure [20]. While individual computer users might be happy to spend $100 on anti-virus software to protect themselves against attack, they are unlikely to spend even $1 on software to prevent their machines being used to attack Amazon or Microsoft.
This is an example of what economists refer to as the Tragedy of the Commons [15]. If a hundred peasants graze their sheep on the village common, then whenever another sheep is added its owner gets almost the full benefit, while the other ninety-nine suffer only a small decline in the quality of the grazing. So they aren't motivated to object, but rather to add another sheep of their own and get as much of the grazing as they can. The result is a dustbowl; and the solution is regulatory rather than technical. A typical tenth-century Saxon village had community mechanisms to deal with this problem; the world of computer security still doesn't. Varian's proposal is that the costs of distributed denial-of-service attacks should fall on the operators of the networks from which the flooding traffic originates; they can then exert pressure on their users to install suitable defensive software, or, for that matter, supply it themselves as part of the subscription package.

These observations prompted us to look for other ways in which economics and computer security interact.
2 Network Externalities
Economists have devoted much effort to the study of networks such as those operated by phone companies, airlines and credit card companies.

The more people use a typical network, the more valuable it becomes. The more people use the phone system -- or the Internet -- the more people there are to talk to and so the more useful it is to each user. This is sometimes referred to as Metcalfe's law, and is not limited to communication systems. The more merchants take credit cards, the more useful they are to customers, and so the more customers will buy them; and the more customers have them, the more merchants will want to accept them. So while such networks can grow very slowly at first -- credit cards took almost two decades to take off -- once positive feedback gets established, they can grow very rapidly. The telegraph, the telephone, the fax machine and most recently the Internet have all followed this model.
As well as these physical networks, the same principles apply to virtual networks, such as the community of users of a mass-market software architecture. When software developers started to believe that the PC would outsell the Mac, they started developing their products for the PC first, and for the Mac only later (if at all). This effect was reinforced by the fact that the PC was easier for developers to work with. The growing volume of software available for the PC but not the Mac made customers more likely to buy a PC than a Mac, and the resulting positive feedback squeezed the Mac out of most markets. A similar effect made Microsoft Word the dominant word processor.
A good introduction to network economics is by Shapiro and Varian [17]. For our present purposes, there are three particularly important features of information technology markets.
- First, the value of a product to a user depends on how many other users adopt it.

- Second, technology often has high fixed costs and low marginal costs. The first copy of a chip or a software package may cost millions, but subsequent copies may cost very little to manufacture. This isn't unique to information markets; it's also seen in business sectors such as airlines and hotels. In all such sectors, pure price competition will tend to drive revenues steadily down towards the marginal cost of production (which in the case of information is zero). So businesses need ways of selling on value rather than on cost.

- Third, there are often large costs to users from switching technologies, which leads to lock-in. Such markets may remain very profitable, even where (incompatible) competitors are very cheap to produce. In fact, one of the main results of network economic theory is that the net present value of the customer base should equal the total costs of their switching their business to a competitor [19].
All three of these effects tend to lead to "winner take all" market structures with dominant firms. So it is extremely important to get into markets quickly. Once in, a vendor will try to appeal to complementary suppliers, as with the software vendors whose bandwagon effect carried Microsoft to victory over Apple. In fact, successful networks tend to appeal to complementary suppliers even more than to users: the potential creators of "killer apps" need to be courted. Once the customers have a substantial investment in complementary assets, they will be locked in. (There are a number of complexities and controversies; see for example [14]. But the above simplified discussion will take us far enough for now.)
These network effects have significant consequences for the security engineer, and consequences that are often misunderstood or misattributed. Consultants often explain that the reason a design broke for which they were responsible was that the circumstances were impossible: the client didn't want a secure system, but just the most security I could fit on his product in one week on a budget of $10,000. It is important to realize that this is not just management stupidity. The huge first-mover advantages that can arise in economic systems with strong positive feedback are the origin of the so-called "Microsoft philosophy" of "we'll ship it on Tuesday and get it right by version 3". Although sometimes attributed by cynics to a personal moral failing on the part of Bill Gates, this is perfectly rational behaviour in many markets where network economics apply.
Another common complaint is that software platforms are shipped with little or no security support, as with Windows 95/98; and even where access control mechanisms are supplied, as with Windows NT, they are easy for application developers to bypass. In fact, the access controls in Windows NT are often irrelevant, as most applications either run with administrator privilege (or, equivalently, require dangerously powerful operating system services to be enabled). This is also explained simply from the viewpoint of network economics: mandatory security would subtract value, as it would make life more difficult for the application developers. Indeed, Odlyzko observes that much of the lack of user-friendliness of both Microsoft software and the Internet is due to the fact that both Microsoft and the Internet achieved success by appealing to developers. The support costs that Microsoft dumps on users -- and in fact even the cost of the time wasted waiting for PCs to boot up and shut down -- greatly exceed its turnover [16].
Network owners and builders will also appeal to the developers of the next generation of applications by arranging for the bulk of the support costs to fall on users rather than developers, even if this makes effective security administration impractical. One reason for the current appeal of public key cryptography may be that it can simplify development -- even at the cost of placing an unreasonable administrative burden on users who are neither able nor willing to undertake it [9]. The technical way to try to fix this problem is to make security administration more user-friendly or plug-and-play; many attempts in this direction have met with mixed success. The more subtle approach is to try to construct an authentication system whose operators benefit from network effects; this is what Microsoft Passport does, and we'll discuss it further below.
In passing, it is worth mentioning that (thanks to distributed denial of service attacks) the economic aspects of security failure are starting to get noticed by government. A recent EU proposal recommends action by governments in response to market imperfections, where market prices do not accurately reflect the costs and benefits of improved network security [11]. However, this is only the beginning of the story.
3 Competitive applications and corporate warfare
Network economics has many other effects on security engineering. Rather than using a standard, well analyzed and tested architecture, companies often go for a proprietary obscure one -- to increase customer lock-in and increase the investment that competitors have to make to create compatible products. Where possible, they will use patented algorithms (even if these are not much good) as a means of imposing licensing conditions on manufacturers. For example, the DVD Content Scrambling System was used as a means of insisting that manufacturers of compatible equipment signed up to a whole list of copyright protection measures [5]. This may have come under severe pressure, as it could prevent the Linux operating system from running on next-generation PCs; but efforts to foist non-open standards continue in many applications from SDMI and CPRM to completely proprietary systems such as games consoles.
A very common objective is differentiated pricing. This is usually critical to firms that price a product or service not at its cost but at its value to the customer. This is familiar from the world of air travel: you can spend $200 to fly the Atlantic in coach class, $2000 in business class or $5000 in first. Some commentators are surprised by the size of this gap; yet a French economist, Jules Dupuit, had already written in 1849:
    [I]t is not because of the few thousand francs which would have to be spent to put a roof over the third-class carriage or to upholster the third-class seats that some company or other has open carriages with wooden benches ... What the company is trying to do is prevent the passengers who can pay the second-class fare from traveling third class; it hits the poor, not because it wants to hurt them, but to frighten the rich ... And it is again for the same reason that the companies, having proved almost cruel to the third-class passengers and mean to the second-class ones, become lavish in dealing with first-class customers. Having refused the poor what is necessary, they give the rich what is superfluous. [10]
This is also a common business model in the software and online services sectors. A basic program or service may be available free; a much better one for a subscription; and a Gold service at a ridiculous price. In many cases, the program is the same except that some features are disabled for the budget user. Many cryptographic and other technical protection mechanisms have as their real function the maintenance of this differential.
Another business strategy is to manipulate switching costs. Incumbents try to increase the cost of switching, whether by indirect methods such as controlling marketing channels and building industries of complementary suppliers, or, increasingly, by direct methods such as making systems incompatible and hard to reverse engineer. Meanwhile competitors try to do the reverse: they look for ways to reuse the base of complementary products and services, and to reverse engineer whatever protection the incumbent builds in. This extends to the control of complementary vendors, sometimes using technical mechanisms.
Sometimes, security mechanisms have both product differentiation and higher switching costs as goals. An example which may become politicized is accessory control. According to one company that sells authentication chips into the automotive market, some printer companies have begun to embed cryptographic authentication protocols in laser printers to ensure that genuine toner cartridges are used. If a competitor's cartridge is loaded instead, the printer will quietly downgrade from 1200 dpi to 300 dpi. In mobile phones, much of the profit is made on batteries, and authentication can be used to spot competitors' products so they can be drained more quickly [3].
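Accessory-control schemes of this kind typically rest on a shared-secret challenge-response exchange between host and accessory. The sketch below is only a minimal illustration of that general idea, not a description of any vendor's actual protocol; the key and function names are hypothetical.

    import hashlib, hmac, os

    SHARED_KEY = b"hypothetical-key-in-genuine-cartridges"

    def cartridge_respond(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
        # A genuine accessory proves knowledge of the key by MACing the challenge.
        return hmac.new(key, challenge, hashlib.sha256).digest()

    def printer_accepts(cartridge) -> bool:
        # The host sends a fresh random challenge and checks the response.
        challenge = os.urandom(16)
        expected = hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, cartridge(challenge))

    if __name__ == "__main__":
        print(printer_accepts(cartridge_respond))        # genuine accessory: True
        print(printer_accepts(lambda c: b"\x00" * 32))   # clone without the key: False

The economic point is that the check adds nothing to print quality; its function is to raise the cost of entry for competing accessory makers.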
Another example comes from Microsoft Passport. This is a system whose ostensible purpose is single signon: a Passport user doesn't have to think up separate passwords for each participating web site, with all the attendant hassle and risk. Instead, sites that use Passport share a central authentication server run by Microsoft to which users log on. They use web redirection to connect their Passport-carrying visitors to this server; authentication requests and responses are passed back and forth by the user's browser in encrypted cookies. So far, so good.
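To make the redirection pattern concrete, here is a minimal sketch of how a central server can vouch for a user to a participating site via a token carried by the browser. It illustrates only the generic single-signon idea described above; the token format, the HMAC tag and all names are assumptions of this sketch, not details of Passport itself.

    import base64, hashlib, hmac, json, time
    from typing import Optional

    AUTH_SERVER_KEY = b"hypothetical-key-shared-with-participating-site"

    def issue_token(user_id: str) -> str:
        # The central server authenticates the user, then hands the browser
        # a tamper-evident token to present to the participating site.
        body = json.dumps({"user": user_id, "issued": int(time.time())}).encode()
        tag = hmac.new(AUTH_SERVER_KEY, body, hashlib.sha256).hexdigest()
        return base64.urlsafe_b64encode(body).decode() + "." + tag

    def site_verify(token: str, max_age: int = 300) -> Optional[str]:
        # The participating site checks the tag and freshness rather than
        # keeping its own password database.
        encoded, tag = token.rsplit(".", 1)
        body = base64.urlsafe_b64decode(encoded)
        good = hmac.new(AUTH_SERVER_KEY, body, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(tag, good):
            return None
        claims = json.loads(body)
        return claims["user"] if time.time() - claims["issued"] <= max_age else None

    if __name__ == "__main__":
        print(site_verify(issue_token("alice")))  # "alice" if intact and fresh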
But the real functions of Passport are somewhat more subtle [18]. First, by patching itself into all the web transactions of participating sites, Microsoft can collect a huge amount of data about online shopping habits and enable participants to swap it. If every site can exchange data with every other site, then the value of the network to each participating web site grows with the number of sites, and there is a strong network externality. So one such network may come to dominate, and Microsoft hopes to own it. Second, the authentication protocols used between the merchant servers and the Passport server are proprietary variants of Kerberos, so the web server must use Microsoft software rather than Apache or Netscape (this has supposedly been fixed with the latest release, but participating sites still cannot use their own authentication server, and so remain in various ways at Microsoft's mercy).
So Passport isn't so much a security product as a play for control of both the web server and purchasing information markets. It comes bundled with services such as Hotmail, is already used by 40 million people, and does 400 authentications per second on average. Its known flaws include that Microsoft keeps all the users' credit card details, creating a huge target; various possible middleperson attacks; and that you can be impersonated by someone who steals your cookie file. (Passport has a logout facility that's supposed to delete the cookies for a particular merchant, so you can use a shared PC with less risk, but this feature didn't work properly for Netscape users when it was first deployed [13].)
The constant struggles to entrench or undermine monopolies and to segment and control markets determine many of the environmental conditions that make the security engineer's work harder. They make it likely that, over time, government interference in information security standards will be motivated by broader competition issues, as well as by narrow issues of the effectiveness of infosec product markets (and law enforcement access to data).

So much for commercial information security. But what about the government sector? As information attack and defense become ever more important tools of national policy, what broader effects might they have?
4 Information Warfare -- Offense and Defense
One of the most important aspects of a new technology package is whether it favours offense or defense in warfare. The balance has repeatedly swung back and forth, with the machine gun giving an advantage to the defense in World War 1, and the tank handing it back to the offense by World War 2.
The difficulties of developing secure systems using a penetrate-and-patch methodology have been known to the security community since at least the Anderson report in the early 1970s [2]; however, a new insight on this can be gained by using an essentially economic argument, which enables us to deal with vulnerabilities in a quantitative way [6].
To simplify matters, let us suppose a large, complex product such as Windows 2000 has 1,000,000 bugs, each with an MTBF of 1,000,000,000 hours. Suppose that Paddy works for the Irish Republican Army, and his job is to break into the British Army's computer to get the list of informers in Belfast; while Brian is the army assurance guy whose job is to stop Paddy. So he must learn of the bugs before Paddy does.

Paddy has a day job so he can only do 1000 hours of testing a year. Brian has full Windows source code, dozens of PhDs, control of the commercial evaluation labs, an inside track on CERT, an information sharing deal with other UKUSA member states -- and he also runs the government's scheme to send round consultants to critical industries such as power and telecomms to advise them how to protect their systems. Suppose that Brian benefits from 10,000,000 hours a year worth of testing.
After a year, Paddy finds a bug, while Brian has found 100,000. But the probability that Brian has found Paddy's bug is only 10%. After ten years he will find it -- but by then Paddy will have found nine more, and it's unlikely that Brian will know of all of them. Worse, Brian's bug reports will have become such a firehose that Microsoft will have killfiled him.

In other words, Paddy has thermodynamics on his side. Even a very moderately resourced attacker can break anything that's at all large and complex. There is nothing that can be done to stop this, so long as there are enough different security vulnerabilities to do statistics: different testers find different bugs. (The actual statistics are somewhat more complicated, involving lots of exponential sums; keen readers can find the details at [6].)
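The arithmetic behind the paragraph above can be reproduced with a short calculation. This sketch takes the 100,000-of-1,000,000 coverage figure quoted in the text as the defender's yearly sweep, and treats the sweeps and the attacker's bugs as independent; those independence assumptions are mine, and the fuller treatment is in [6].

    N_BUGS = 1_000_000      # bugs assumed in the product
    BRIAN_FRACTION = 0.10   # share of the bug space the defender covers per year
                            # (100,000 of 1,000,000 bugs, the figure used in the text)
    PADDY_PER_YEAR = 1      # bugs the attacker turns up per year

    for years in (1, 10):
        # Chance the defender has stumbled on one particular bug after
        # `years` of independent sweeps, each covering 10% of the bug space.
        p_one = 1 - (1 - BRIAN_FRACTION) ** years
        paddy_bugs = PADDY_PER_YEAR * years
        # Rough chance the defender knows *every* bug the attacker holds
        # (an overestimate for bugs the attacker found late).
        p_all = p_one ** paddy_bugs
        print(f"year {years}: P(defender knows a given bug) ~ {p_one:.0%}, "
              f"P(defender knows all {paddy_bugs} attacker bugs) ~ {p_all:.1%}")

After one year the defender's chance of knowing the attacker's bug is about 10%; after ten years he probably knows any given bug (about 65%), but the chance of knowing all ten of the attacker's bugs is only a percent or so.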
There are various ways in which one might hope to escape this statistical trap.

- First, although it's reasonable to expect a 35,000,000 line program like Windows 2000 to have 1,000,000 bugs, perhaps only 1% of them are security-critical. This changes the game slightly, but not much; Paddy now needs to recruit 100 volunteers to help him (or, more realistically, swap information in a grey market with other subversive elements). The effort required of the attacker is still much less than that needed for effective defense.

- Second, there may be a single fix for a large number of the security critical bugs. For example, if half of them are stack overflows, then perhaps these can all be removed by a new compiler.

- Third, you can make the security critical part of the system small enough that the bugs can be found. This was understood, in an empirical way, by the early 1970s. However, the discussion in the above section should have made clear that a minimal TCB is unlikely to be available anytime soon, as it would make applications harder to develop and thus impair the platform vendor's appeal to developers.
So information warfare looks rather like air warfare looked in the 1920s and 1930s. Attack is simply easier than defense. Defending a modern information system could also be likened to defending a large, thinly-populated territory like the nineteenth-century Wild West: the men in black hats can strike anywhere, while the men in white hats have to defend everywhere. Another possibly relevant analogy is the use of piracy on the high seas as an instrument of state policy by many European powers in the sixteenth and seventeenth centuries. Until the great powers agreed to deny pirates safe haven, piracy was just too easy.
The technical bias in favour of attack is made even worse by asymmetric information. Suppose that you head up a U.S. agency with an economic intelligence mission, and a computer scientist working for you has just discovered a beautiful new exploit on Windows 2000. If you report this to Microsoft, you will protect 250 million Americans; if you keep quiet, you will be able to conduct operations against 400 million Europeans and 100 million Japanese. What's more, you will get credit for operations you conduct successfully against foreigners, while the odds are that any operations that they conduct successfully against U.S. targets will remain unknown to your superiors. This further emphasizes the motive for attack rather than defense. Finally -- and this appears to be less widely realized -- the balance in favour of attack rather than defense is still more pronounced in smaller countries. They have proportionally fewer citizens to defend, and more foreigners to attack.
In other words, the increasing politicization of information attack and defense may even be a destabilizing factor in international affairs.
5 Distinguishing Good from Bad
Since Auguste Kerckhoffs wrote his two seminal papers on security engineering in 1883 [12], people have discussed the dangers of security-by-obscurity, that is, relying on attackers being ignorant of the design of a system. Economics can give us a fresh insight into this. We have already seen that obscure designs are often used deliberately as a means of entrenching monopolies; but why is it that, even in relatively competitive security product markets, the bad products tend to drive out the good?
The theory of asymmetric information gives us an explanation of one of the mechanisms. Consider a used car market, on which there are 100 good cars (the "plums"), worth $3000 each, and 100 rather troublesome ones (the "lemons"), each of which is worth only $1000. The vendors know which is which, but the buyers don't. So what will be the equilibrium price of used cars?

If customers start off believing that the probability they will get a plum is equal to the probability they will get a lemon, then the market price will start off at $2000. However, at that price only lemons will be offered for sale, and once the buyers observe this, the price will drop rapidly to $1000 with no plums being sold at all. In other words, when buyers don't have as much information about the quality of the products as sellers do, there will be severe downward pressure on both price and quality. Infosec people frequently complain about this in many markets for the products and components we use; the above insight, due to Akerlof [1], explains why it happens.
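The downward spiral can be made concrete with a tiny simulation of the market just described. This is only an illustrative sketch of Akerlof's argument using the numbers in the text; the iteration and stopping rule are my own simplification.

    PLUM_VALUE, LEMON_VALUE = 3000, 1000    # what each kind of car is really worth
    price = (PLUM_VALUE + LEMON_VALUE) / 2  # buyers start by assuming a 50/50 mix

    while True:
        # Sellers only offer cars worth no more than the going price.
        offered = [v for v in (PLUM_VALUE, LEMON_VALUE) if v <= price]
        fair_price = sum(offered) / len(offered)  # what the offered mix is worth
        if fair_price == price:                   # beliefs match supply: equilibrium
            break
        price = fair_price                        # buyers revise the price downwards

    print(price)  # 1000.0 -- only lemons trade; no plums are sold at all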
The problem of bad products driving out good ones can be made even worse when the people evaluating them aren't the people who suffer when they fail. Much has been written on the ways in which corporate performance can be adversely affected when executives have incentives at odds with the welfare of their employer. For example, managers often buy products and services which they know to be suboptimal or even defective, but which are from big name suppliers. This is known to minimize the likelihood of getting fired when things go wrong. Corporate lawyers don't condemn this as fraud, but praise it as due diligence.

Over the last decade of the twentieth century, many businesses have sought to fix this problem by extending stock options to ever more employees. However, these incentives don't appear to be enough to ensure prudent practice by security managers. (This might be an interesting topic for a PhD; does it come down to the fact that security managers also have less information about threats, and so cannot make rational decisions about protection versus insurance, or is it simply due to adverse selection among security managers?)
This problem has long been perceived, even if not in precisely these terms, and the usual solution proposed is an evaluation system. This can be a private arrangement, such as the equipment tests carried out by insurance industry laboratories for their member companies, or it can be public sector, as with the Orange Book and the Common Criteria.
For all its faults, the Orange Book had the virtue that evaluations were carried out by the party who relied on them -- the government. The European equivalent, ITSEC, introduced a pernicious innovation: the evaluation was not paid for by the government but by the vendor seeking an evaluation of its product. This got carried over into the Common Criteria.
This change in the rules provided the critical perverse incentive. It motivated the vendor to shop around for the evaluation contractor who would give his product the easiest ride, whether by asking fewer questions, charging less money, taking the least time, or all of the above. To be fair, the potential for this was realized, and schemes were set up whereby contractors could obtain approval as a CLEF (commercial licensed evaluation facility). The threat that a CLEF might have its license withdrawn was supposed to offset the commercial pressures to cut corners.
But in none of the half-dozen or so disputed cases I've been involved in has the Common Criteria approach proved satisfactory. Some examples are documented in my book, Security Engineering [3]. The failure modes appear to involve fairly straightforward pandering to customers' wishes, even (indeed especially) where these were in conflict with the interests of the users for whom the evaluation was supposedly being prepared. The lack of sanctions for misbehaviour -- such as a process whereby evaluation teams can lose their accreditation when they lose their sparkle, or get caught in gross incompetence or dishonesty -- is probably a contributory factor.
But there is at least one more significant perverse incentive. From the user's point of view, an evaluation may actually subtract from the value of a product. For example, if you use an unevaluated product to generate digital signatures, and a forged signature turns up which someone tries to use against you, you might reasonably expect to challenge the evidence by persuading a court to order the release of full documentation to your expert witnesses. A Common Criteria certificate might make a court much less ready to order disclosure, and thus could severely prejudice your rights. A cynic might suggest that this is precisely why it's the vendors of products which are designed to transfer liability (such as digital signature smartcards), to satisfy due diligence requirements (such as firewalls) or to impress naive users (such as PC access control products), who are most enthusiastic about the Common Criteria.
So an economist is unlikely to place blind faith in a Common Criteria evaluation. Fortunately, the perverse incentives discussed above should limit the uptake of the Criteria to sectors where an official certification, however irrelevant, erroneous or misleading, offers competitive advantage.
6 Conclusions
Much has been written on the failure of information security mechanisms to protect end users from privacy violations and fraud. This misses the point. The real driving forces behind security system design usually have nothing to do with such altruistic goals. They are much more likely to be the desire to grab a monopoly, to charge different prices to different users for essentially the same service, and to dump risk. Often this is perfectly rational.
In an ideal world, the removal of perverse economic incentives to create insecure systems would depoliticize most issues. Security engineering would then be a matter of rational risk management rather than risk dumping. But as information security is about power and money -- about raising barriers to trade, segmenting markets and differentiating products -- the evaluator should not restrict herself to technical tools like cryptanalysis and information flow, but also apply economic tools such as the analysis of asymmetric information and moral hazard. As fast as one perverse incentive can be removed by regulators, businesses (and governments) are likely to create two more.

In other words, the management of information security is a much deeper and more political problem than is usually realized; solutions are likely to be subtle and partial, while many simplistic technical approaches are bound to fail. The time has come for engineers, economists, lawyers and policymakers to try to forge common approaches.
Acknowledgements
I got useful comments on early drafts of some of
this material from Avi Rubin, Hal Finney, Jack Lang,
Andrew Odlyzko and Hal Varian.
Postscript
The dreadful events of September 11th happened just before this manuscript was finalised. They will take some time to digest, and rather than rewriting the paper it seemed better to add this short postscript.

I believe that the kind of economic arguments advanced here will be found to apply to protecting bricks as much as clicks. It may take years for the courts to argue about liability; there will remain a strong public interest in ensuring that the operational responsibility for protection does not become divorced from the liability for the failure of that protection.
The arguments in section 4 are also brought into sharper relief. In a world in which the black hats can attack anywhere but the white hats have to defend everywhere, the black hats have a huge economic advantage. This suggests that local defensive protection is not enough; there is an essential role for global defence, of which deterrence and retribution may be an unavoidable part.
The suppression of piracy, mentioned in that section, may be a useful example. It might also be a sobering one. Although, from the late seventeenth century, major governments started to agree that the use of pirates as instruments of state policy was unacceptable, there was no single solution. It took many treaties, many naval actions, and the overthrow of a number of rogue governments, over a period of more than a century, to pacify the world's oceans. The project became entwined, in complex ways, with other campaigns, including the abolition of slavery and the spread of colonialism. Liberals faced tough moral dilemmas: was it acceptable to conquer and colonise a particular territory, in order to suppress piracy and slavery there? In the end, economic factors appear to have been politically decisive; piracy simply cost business too much money. History may not repeat itself, but it might not be wise to ignore it.
References
[1] GA Akerlof, "The Market for Lemons: Quality Uncertainty and Market Mechanism", Quarterly Journal of Economics v 84 (August 1970) pp 488-500

[2] J Anderson, Computer Security Technology Planning Study, ESD-TR-73-51, US Air Force Electronic Systems Division (1973); http://csrc.nist.gov/publications/history/index.html

[3] RJ Anderson, Security Engineering -- A Guide to Building Dependable Distributed Systems, Wiley (2001) ISBN 0-471-38922-6

[4] RJ Anderson, "Why Cryptosystems Fail", in Communications of the ACM vol 37 no 11 (November 1994) pp 32-40

[5] JA Bloom, IJ Cox, T Kalker, JPMG Linnartz, ML Miller, CBS Traw, "Copy Protection for DVD Video", in Proceedings of the IEEE v 87 no 7 (July 1999) pp 1267-1276

[6] RM Brady, RJ Anderson, RC Ball, "Murphy's law, the fitness of evolving species, and the limits of software reliability", Cambridge University Computer Laboratory Technical Report no. 476 (1999); at http://www.cl.cam.ac.uk/~rja14

[7] CERT, Results of the Distributed-Systems Intruder Tools Workshop, Software Engineering Institute, Carnegie Mellon University, http://www.cert.org/reports/dsit_workshop-final.html, December 7, 1999

[8] W Curtis, H Krasner, N Iscoe, "A Field Study of the Software Design Process for Large Systems", in Communications of the ACM v 31 no 11 (Nov 88) pp 1268-1287

[9] D Davis, "Compliance Defects in Public-Key Cryptography", in Sixth Usenix Security Symposium Proceedings (July 1996) pp 171-178

[10] J Dupuit, "De l'influence des péages sur l'utilité des voies de communication", 1849, Annales des Ponts et Chaussées

[11] European Union, Network and Information Security: Proposal for a European Policy Approach, COM(2001)298 final, 6/6/2001

[12] A Kerckhoffs, "La Cryptographie Militaire", in Journal des Sciences Militaires, 9 Jan 1883, pp 5-38; http://www.fabien-petitcolas.net/kerckhoffs/

[13] DP Kormann, AD Rubin, "Risks of the Passport Single Signon Protocol", in Computer Networks (July 2000); at http://avirubin.com/vita.html

[14] SJ Liebowitz, SE Margolis, "Network Externalities (Effects)", in The New Palgrave's Dictionary of Economics and the Law, MacMillan, 1998; see http://wwwpub.utdallas.edu/~liebowit/netpage.html

[15] WF Lloyd, Two Lectures on the Checks to Population, Oxford University Press (1833)

[16] AM Odlyzko, "Smart and stupid networks: Why the Internet is like Microsoft", ACM netWorker, Dec 1998, pp 38-46, at http://www.acm.org/networker/issue/9805/ssnet.html

[17] C Shapiro, H Varian, Information Rules, Harvard Business School Press (1998), ISBN 0-87584-863-X

[18] J Spolsky, "Does Issuing Passports Make Microsoft a Country?", at http://joel.editthispage.com/stories/storyReader$139

[19] H Varian, Intermediate Microeconomics -- A Modern Approach, Fifth edition, WW Norton and Company, New York, 1999; ISBN 0-393-97930-0

[20] H Varian, "Managing Online Security Risks", Economic Science Column, The New York Times, June 1, 2000, http://www.nytimes.com/library/financial/columns/060100econ-scene.html