Last weekend I had a romantic getaway with my lovely bride of 13 years, to celebrate our anniversary. We went to Newport, Rhode Island. If you have the opportunity to go there and stay in a lovely boutique hotel, eat at the finest restaurants, and enjoy the company of the love of your life, you should probably do that. Highly recommended.
I also brought along a book to read, since I had decided that it would be a twitter-free weekend. I asked my wife to just put some mindless Dan Brown I hadn’t read onto the Kindle, and she obliged with a book called “Digital Fortress.” You know Dan Brown. He’s the guy who wrote Da Vinci Code and Angels & Demons. Books that are full of interesting art history and geography lessons. I’ve learned quite a lot from his books. But now I wonder how much of what I learned was wrong.
Just after college, I had a subscription to Scientific American magazine. I loved that magazine. It suited my penchant for learning all sorts of cool sciency stuff. And then I read an article about computer science — my wheelhouse. And it was wrong. It was so, so wrong. Error after error after error. And I realized that if SA was printing such complete hogwash about a subject I know, why should I think that the articles about everything I don’t know are any more accurate? For an atheist, this is about as close as one gets to a crisis of faith. I canceled my subscription.
So I’m reading “Digital Fortress,” which is about computers and cryptography and stuff like that. And I understand that it is a work of fiction. I’m fine with that. Remember the scene in Jurassic Park where the girl sits down at the computer, sees a rich 3D display, and says, “It’s a unix system. I know this”?
Unlike most computer geeks, I’m fine with that scene. It’s absurd, but it’s entertainment, and the fact is that a real unix system at that time would just have been a white $ on a black screen. The (somewhat clunky) 3D animation makes for better storytelling.
The primary fictional device that Dan Brown uses is treating data as code. An email is data. So if he says a “tracer” email sends a message back, that doesn’t really make any sense. But that’s OK. Because in fact, there are ways that something like that could happen. For example, several years ago one of Microsoft’s core libraries for displaying JPEG images had a bug that allowed executable code to be embedded. If you opened a carefully constructed JPEG in your mail program, it could run some code that could, for example, find your email address and send a message back to the sender.
So I’m not going to quibble with the fiction stuff. My problem is with the “fact” stuff. This author loves to tell little asides during the story. It’s one of the things I liked most about his other books. I like learning little factlets, such as that the word “quarantine” came from the practice of keeping ships in harbor for 40 days to ensure the people on them didn’t have plague. (I looked that up. It appears he got that one right.)
But, as with my Scientific American experience, I know the truth behind the little asides he’s relaying, and they are bullshit. He isn’t giving the reader bits of history. He’s repeating popular apocrypha.
[The internet] had been created by the Department of Defense three decades earlier — an enormous network of computers designed to provide secure government communications in the event of nuclear war.
Uh, no. My first job out of college was for a company called Bolt, Beranek and Newman (BBN), which happened to be the company that created the internet. And I’m on a mailing list of ex-BBN employees, many of whom are the actual guys who did it. So I know this one cold.
It was created by DoD (specifically, [D]ARPA; I put the D in brackets because that letter comes and goes on the name of the agency depending on whether it’s politically fashionable to be a hawk or a dove). So he got that right. The rest of the sentence is completely wrong.
ARPANET was never enormous. There were just a handful of nodes. It was never secure. The communications took place over plain old telephone lines connected with modems. It wasn’t intended for communications, the way most people understand that word. It was originally a way to let people using terminals in one place interact with computers in another place. Universities and government labs, mostly. And one at [D]ARPA. And it never, ever, ever had anything to do with Nuclear War.
The nuclear war myth probably started because of the way the original ARPANET was assembled. It was fault-tolerant. Since each computer was connected to more than one other, and messages could be routed from their source to the destination multiple ways, it was resilient to computers going down or backhoes cutting telephone lines. Except it really wasn’t all that resilient, in my experience. But the perceived resilience led some people to think it would be resistant to nukes for some reason, which is pretty silly, because nukes would have wrecked the phone companies, and the T1 lines that connected all the computers to each other would have gone down.
But you don’t have to believe me, just look at this Wikipedia article about the ARPANET.
Here’s another example:
It seemed a moth had landed on one of the computer’s circuit boards and shorted it out. From that moment on, computer glitches were referred to as bugs.
Uh, no. The first sentence is true. There’s even a notebook with the actual moth at the Smithsonian museum. But the second sentence is bullshit. Computer glitches were always called bugs. That’s why the lab techs thought it was so funny that there was a moth that caused a problem. Using the word “bug” to describe glitches in systems goes way, way back. Edison used it.
But again, there is no reason to take my word for it. Just read the Wikipedia entry on Software Bugs.
There are other mistakes of a factual nature:
“…if the key is a standard sixty-four bit — even in broad daylight, nobody could possibly read and memorize all sixty-four characters.”
The woman who says this is established as a computer genius, so I’m pretty sure she wouldn’t confuse bits and bytes. Danny, though, seems like exactly the sort who might.
A bit can have one of two values: on/off, 1/0, true/false. Like that. If you have 5 of these, you can encode 32 values, which is enough for the alphabet. If you have 6, you can encode all the capital and lowercase letters (52 total), the digits 0-9, and still have room for a couple pieces of punctuation like + and -. So if you need to encode a 64-bit key into something a person could read, you could do it with eleven 6-bit blocks, each block represented by a capital or lowercase letter, a digit, or a + or -.
For example: BVtK4JXGQ2U
So our hero should have said “nobody could possibly read and memorize all eleven characters.” But, well, they could actually.
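That eleven-character scheme is essentially standard base64, which also spends 6 bits per character, though its alphabet uses + and / rather than + and -. A quick Python sketch of the arithmetic (the key bytes here are an arbitrary example, not anything from the book):

```python
import base64

# A 64-bit (8-byte) key. Any 8 bytes would do; this one is fixed for the example.
key = bytes.fromhex("0123456789abcdef")

# Standard base64 carries 6 bits per character (A-Z, a-z, 0-9, +, /).
# 64 bits / 6 bits per character = 10.67, so 11 characters cover the key;
# the trailing "=" in the raw output is just padding and carries no key bits.
encoded = base64.b64encode(key).decode().rstrip("=")

print(encoded)       # ASNFZ4mrze8
print(len(encoded))  # 11
```

Eleven characters, not sixty-four — short enough that someone glancing over your shoulder really could read and memorize it.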
[debugging a computer program] was like inspecting an encyclopedia for a single typo.
It really isn’t anything like that, actually. If you wrote the code yourself (which she did in this case) it’s more like searching an encyclopedia for a wrinkled page where you already have a pretty good idea where to look, and then finding a big black ink blot on that page.
The really bad news is that I’m only halfway through the damn book. But at this point it’s like a bad information treasure hunt. I have to keep reading it to see which “fact” he’s going to bungle next.
Note that I’m not saying you shouldn’t read the book. As far as I know, the asides in the Da Vinci Code and Angels & Demons were just as horrifically wrong, and that didn’t make the books bad. It’s just best not to read suspense thrillers about subjects you know. And it’s a good idea to take anything you learn in a Dan Brown book with a grain of salt.