is viewed as a bug
in the system, not a feature.
Uri Rivner of RSA, which makes and sells cryptographic
solutions including hand-held authenticators, agrees:
‘…[I]n the online consumer authentication market, usability
is in many cases of greater importance than security. It’s
true that some people [would] like to see changes in the
banks’ security procedures and [would] appreciate it if the
financial institution handed them authentication devices or
came up with other visible security measures.
But other customers don’t really care about all of that; they
demand security from the bank, but all they really want is
to access their account, pay bills and transfer money
without any delay or additional challenge…’ [6]
Q. OK, let’s go back to the failure points above. Can you give
historical examples of each sort of failure, to paint a picture
of the sorts of thing that can go wrong? Let’s start with the
most exciting-sounding one: a cryptosystem which got cracked.
A. An example many people probably know about is Wired
Equivalent Privacy (WEP), the authentication and encryption
system originally proposed for wireless networking. WEP
relies on a secret key, either 40 or 104 bits in length; to access
and use the network, you need to know the key. (This, in turn,
means you can read all the traffic on the network, just as if
you were on a LAN.)
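To make the mechanics concrete, here is a minimal sketch of the keystream generation WEP builds on (the RC4 cipher, keyed WEP-style with a 24-bit IV prepended to the secret key). This is for illustration only; the IV and key values are made up, and a real WEP frame also carries an integrity checksum.

```python
def rc4_keystream(key: bytes, n: int) -> bytes:
    """Generate n bytes of RC4 keystream for the given key."""
    # Key-scheduling algorithm (KSA): permute S under the key
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA): emit keystream bytes
    out = []
    i = j = 0
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

# WEP-style keying (illustrative values): 24-bit per-packet IV
# concatenated with a 40-bit shared secret, then XORed with the payload.
iv = bytes([0x01, 0x02, 0x03])
secret = bytes([0xAA] * 5)                 # 40-bit shared key
keystream = rc4_keystream(iv + secret, 16)
ciphertext = bytes(p ^ k for p, k in zip(b"hello, network!!", keystream))
```

Because the IV is sent in the clear and the secret portion is fixed, every packet's keystream depends on a key whose first bytes an eavesdropper partly knows, which is exactly what the statistical flaw described next exploits.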
As it happens, the cipher used by WEP has a statistical flaw
which affects the randomness of its early output bytes.
Interestingly, the cipher, RC4, is also used in SSL (which we
will talk about later), but