Tuesday, April 3, 2012

C is for Cats. I mean Computers!

I debated internally on what to choose for C. I originally settled on cats. Truth is, I don't care much for cats, but they make great characters in fantasy books because of how contentious they can be. Contention = conflict = moving the story forward.
But then my mind said to me, "Self. You're an idiot! You work on computers for a living. Why can't you blog about that instead of cats, which you despise in general?" And I said, "Brain, you're a genius! Where are you when I'm writing?"

So, C is for Computers.

Nothing irks me more than reading a book or watching a movie where computers are a character and it is handled like the writer knows nothing about computers. Take the new "Doctor Who" for example. In one of the earlier episodes the Doctor is combating a robot (robot = movable computer) that has a power surge for about five seconds, then stops and says, "Okay, I have downloaded the entire Internet, and with this knowledge I will destroy you." (or something like that)

My mind and I both yelled, "WHAT?!?!?"
I hate to say this, but everybody raves about Doctor Who and I couldn't bring myself to watch another episode. I know, I'm sure I'm breaking some sort of law of nerdyism, but there you have it. It just ruined it for me!

Ever see The Net with Sandra Bullock? My brain feels like it divides by zero every time I think about it. Ugh! And what was that other disaster that came out lately? Oh yeah, Eagle Eye!

There have been notable exceptions. I loved the first two Terminator movies; they seemed to get it mostly right. Except that if machines do decide to take over the world, it won't be John Connor who saves the day. It will be somebody on par with Stephen Hawking or Bill Gates armed with an arsenal of EMPs. Either that, or a fleet of robots coded to defend you (i.e., Transformers).
If you pit humans against computers in a physical confrontation, especially in any story set in the future, the humans are going to lose. Machines are faster, more adaptable to their environment, have quicker reflexes, have perfect aim, and are MADE OF METAL, for crying out loud.

If you are going to write about computers, please, for the love of all that is digital, DO IT RIGHT!! Learn about Moore's Law. Understand the limitations on data storage, bandwidth, processing power, heat, and data access speeds. Don't be fooled by Watson on Jeopardy. You saw a little box; what you didn't see was the large data room filled to the brim with servers. A modern-day robot with the brainpower of Watson isn't going to fit in something the size of your average human.
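For the writers who like to see the numbers: Moore's Law (transistor counts roughly doubling every two years) is easy to play with yourself. Here's a quick Python sketch. The baseline figure of about 1.4 billion transistors is my own ballpark assumption for a high-end 2012 desktop chip, just for illustration.

```python
def transistors(years_from_now, baseline=1.4e9, doubling_period=2.0):
    """Project a transistor count forward, assuming Moore's Law holds:
    the count doubles once every `doubling_period` years."""
    return baseline * 2 ** (years_from_now / doubling_period)

# Ten years out is five doublings, so the count grows by 2**5 = 32x.
print(f"{transistors(10) / 1.4e9:.0f}x the baseline")  # 32x the baseline
```

The point for fiction: even generous exponential growth is a multiplier on today's hardware, not magic. Work out how many doublings your story's timeline actually allows before you shrink a server room into a skull.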

I know, this is a rant, something I'm not usually prone to (too much). But know that I'm not alone here. Do it right, or lose credibility. And if your story loses credibility it isn't going to go far, unless it has a big star like Sandra Bullock or Shia LaBeouf to give it legs.

And now for a little-known fact. One of the first things I learned when working with computers is that when they are programmed wrong, it is called a bug. Want to know why? In 1947 computers filled entire rooms and even university floors. Sadly, they had far less computing power than your cheapest cell phone. Anyway, they threw a math problem at one computer (the Harvard Mark II) and the result came back wrong. So they investigated and found a moth stuck in one of the relays was causing the error. They removed the "bug" and the computer started working. Since then, computers gone wrong have been known to have bugs in them. One more interesting thing... here is a picture of the world's first official computer bug: