Hardware Bible – Introduction

Part I: Introduction

Before venturing off on some quest, most people find it handy to know what they are looking for. Imagine King Arthur’s knights off looking for the Holy Grail and returning with an old fishing net, a lipstick-stained Starbucks cup, an empty wine bottle, and so on—with each knight claiming he had found the one true holy relic from the Last Supper. Without guidance as to what a “grail” really is, each could easily justify his find and claim success in his quest. Certainly such a free-form search would have changed the face of literature, but anyone depending on the magical powers of the Grail would likely be disappointed.

Although you’re more likely to succeed in a quest to learn about computers than Arthur’s knights were in finding the Grail, knowing what you’re after will truly help you know when you’ve got it.

The title of this book, taken in its most general sense, might imply that it’s all about nuts and bolts, but we’re going to deal with a more specific kind of hardware—the type that goes into computer systems and the devices that computer systems go into. Of course, we will talk about real nuts and bolts—both the ones you use to put computers together and the figurative kind. However, our aim here is understanding the digital hardware that defines our lives and the age in which we live.

The first step on our quest will be defining what a computer really is, what it does, and how it relates to us. Imagine King Arthur holding up a placard and pointing to it saying, “This is what the Grail you seek looks like.” According to legend, he actually had it easier—the Grail appeared to all the knights of the Round Table in a vision, so they all got a peek at the real thing before they donned their armor. Odds are you own or have access to a computer already, so you’re ahead of the knights. Not only have you already seen one, you’ve also likely touched one.

If you have a computer handy, try it. Touch your computer. Get a tingly feeling? If you did, there’s probably something wrong with the grounding of your electrical supply. Otherwise, you probably feel like you’ve followed someone’s guidance and have done something that, after thinking about it, is pretty stupid. You probably don’t feel you know anything more about computers than you did before you made this touch. That’s because, unlike the Grail, the humble computer isn’t going to fill you with mystical knowledge. Therefore, you know right away that the computer is nothing mystical, nothing you have to hold in awe. More than that, you need never touch a computer and feel stupid again. You’ve already done that.

Let’s begin our quest and peer inside to see what a computer really is…intelligently.

Chapter 1. Computers

Pinning down the meaning of a word is devilishly difficult, particularly when it’s used by a sideshow huckster, lawyer, or some other professional liar. Definitions vary not only with who is speaking, but also when. Language evolves and meanings shift. Words often go to extremes, shifting meanings even to opposites—cold becomes hot, bad becomes good, and rap becomes popular.

Computer is one of those shifty words. Exactly what makes up a computer depends on who you are and what era you’re talking about. Today a computer is something you can hold in your hand or at least lift with your hands. Thirty years ago, you would have needed a hand from a friend or two to roll a computer around. Sixty years ago, a computer could have given you a hand.

Computers in History

Strictly speaking, a computer is something that computes, which is not a particularly informative definition. In the vagueness of the term, however, you’ll find an interesting bit of history. The word computer does not necessarily mean an electronic machine or a machine at all. If you were a researcher a hundred years ago and you wanted to take a break from heavy-duty math work, such as creating a tide table, you might have taken your computer out to lunch. Scientists engaged in difficult mathematics often employed a bevy of computers—men and women with pencils, papers, and green eye-shades who computed the numbers they needed.

Up until the end of World War II, a computer was a person who computed. She might use a pencil (a pen if she were particularly confident of her results), a slide rule, or even a mechanical calculator. Poke a few numbers in, pull a crank, and the calculator printed an answer in purple ink on paper tape—at least if the questions involved simple arithmetic, such as addition. If she did a lot of calculations, the black ink of the numbers soon faded to a pale gray, and she grew calluses on her fingertips and cranking hand.

The early machines for mathematics were once all known as calculators, no matter how elaborate—and they could be quite elaborate. Charles Babbage, a 19th-century English country gentleman with a bold idea and too much time on his hands, conceived the idea of a machine that would replace the human computers used to calculate values in navigational tables. Babbage foresaw his mechanical computer-replacement as having three advantages over number-crunchers who wielded pencil and paper: The machine would eliminate mistakes, it would be faster, and it would be cheaper. He was right about all but the last, and for that reason he never saw the most intricate machines he designed actually built. Moreover, he never called his unbuilt machines “computers.” His names for them were the Difference Engine and the Analytical Engine. Even though Babbage’s machines are considered the forerunners of today’s computers—sometimes even considered the first computers by people who believe they know such things—they really weren’t known as “computers” in Babbage’s time. The word was still reserved for the humans who actually did the work.

The word computer was first applied to machines after electricity replaced blood as the working medium inside them. In the early part of the 20th century, researchers struggled with the same sort of problems as those in Babbage’s time, and they solved them the same way. In the 10 years from 1937 to 1947, scientists created the first devices that are classed as true computers, starting with an electrically powered mechanical machine and ending with an all-electronic device powered by an immense number of vacuum tubes, which required an equally immense amount of good fortune for them to all work long enough to carry out a calculation. Nobody called them computers just yet, however.

The first of these machines—a mechanical computer of which Babbage would have been proud—was the IBM-financed Automatic Sequence Controlled Calculator, often called the Harvard Mark I. The five-ton design comprised 750,000 parts, including switches, relays, and rotating shafts and clutches. It stretched out for 50 feet and was eight feet tall. It sounded, according to an observer of the time, like a roomful of ladies knitting.

Many of the fundamentals of today’s computers first took form in the partly electronic, partly mechanical machine devised by John Vincent Atanasoff at Iowa State College (now University). His ideas and a prototype built with the aid of graduate student Clifford Berry have become a legend known as the Atanasoff Berry Computer (with the acronym ABC), the first electronic digital computer—although it was never contemporaneously called a “computer.” Iowa State called the device “the world’s fastest calculator” as late as 1942.

In Britain, cryptanalysts developed a vacuum-tube (valve in Britain) device they called Colossus that some people now call the first electronic computer—usually British folk who don’t want you to forget that the English can be clever, too. But the rest of the world never called Colossus a computer—or anything else—because it was top secret until the end of the century.

The present usage of the word computer goes back only to June 5, 1943, when ENIAC (the most complex vacuum tube-based device ever made) was first proposed as a collaboration between the United States Army and the University of Pennsylvania. The original agreement on that date first used the description that became its name, as well as the name for all subsequent machines: the Electronic Numerical Integrator and Computer.

Three years and $486,804.22 later, the machine made its first computation at the university. The 30-ton behemoth, and its offspring, captured the imagination of the world, and the term computer shifted from flesh-and-blood human calculators to machines. In Hollywood, such thinking machines grew even bigger and took over the world, at least in 1950s science fiction movies. In business, ENIAC’s offspring, the Univac, took over billing for utilities and gave a new name to bookkeeping foul-ups and bureaucratic incompetence: computer error. Also, scientists tried to figure out how to squeeze a room-sized computer into a space capsule, into which they could barely shoehorn a space-suited human being.

The scientists pretty much figured things out—they created the microprocessor, which led to the age of microcircuits—but not until after a few scientific diversions, including sending men to the moon. Oddly enough, although modern microelectronic circuitry is credited as an offshoot of the space program (shrinking things down and making them lighter was important to an industry in which lifting each ounce cost thousands of dollars), in the history of technology, the moon landing (1969) comes two years before the invention of the microprocessor (1971).

Once the microprocessor hit, however, tinkerers figured out how to make small computers cheap enough that everyone could afford one.

Computers in Today’s World

The glitz is gone. Computers no longer rule Hollywood. Robots and aliens (and alien robots in particular) are now the big villains set to rule the world. Businesses no longer seek the fastest computers but instead strive for slowness—that is, stretching out the time between buying and replacing machines.

Computers are now as disposable as old newspapers. We all have one—or will soon. But their advantages hark back to Babbage’s claims for his unrealized Analytical Engine. They are accurate, fast, and cheap—particularly when you compare using a computer for calculations with doing the same work by hand. Got an aspirin?

The one claim that we won’t award to the modern computer is being smarter than human calculators. Computers aren’t smarter than you are, even if one can balance a checkbook and you can’t. Certainly the computer has a better head for numbers than you do. After all, computers are specifically designed to work with numbers. For people, numbers are—at best—an afterthought, at least if you’re not an accountant. People have bigger worries, such as finding food, shelter, and sex. Computers have us to take care of those details for them (maybe not the sex) so that they can concentrate on calculating.

Computers are good at calculating—and they’re fast. Even today’s cheapest personal computers can figure the product of two 40-bit numbers billions of times in the fraction of a second it takes one of us human beings even to realize we have two 40-bit numbers to multiply.

Scientists and engineers like to make comparisons between the intelligence of their computers (usually the fastest computer ever built, which changes month to month) and the thinking ability of animate creatures, typically something like “an insect brain.” Most scientists know it’s all balderdash, but they make these claims because it gets them big headlines. No mere bug can multiply two 40-bit numbers—or even wants to.

Computers are more accurate because they are designed that way. Using digital logic, their thinking automatically wipes out any noise that could confuse their calculations. By elaborating on the math with error-checking, they can quickly detect and prevent most mistakes. They don’t think at all the way you and I do. They have to be told exactly what to do, led by the hand through the process of finding an answer by a set of instructions we call a program.

Computers also have better memories than people. Again, they are designed that way. One of our human advantages is that we can adapt and shift the way we deal with things thanks to our malleable memories. Remember Mom’s cooking? Computers have long-term memories designed for just the opposite purpose—to remember everything, down to the last detail, without error and without limit. Even Deep Blue, the computer that finally beat a flesh-and-blood chess Grand Master, would quickly succumb to the elements if left outside. Although able to calculate billions of chess moves in seconds, it lacks the human sense to come in out of the rain. And the feet.

What we call “computer intelligence” is something far different from what we call “intelligence” in humans. That’s actually good, because even experts can’t agree on what human intelligence is, or even how humans think. Things are much more straightforward for computers. We know how they think as well as how to measure how well they do their jobs.

Computer Intelligence

Biomechanical engineers are changing the way people think about insect intelligence. In an effort to develop machines that walk like insects, they have discovered that neither form nor function alone dictates the other. The two are intimately linked.

Although some researchers have worked hard to develop machines that walk like insects by adding more and more intelligence, others have gone back to the drawing board with a more mechanical approach. By mimicking the structure of the insect body, they have been able to make mechanical insects that duplicate the gait of your typical six-legged bug. These mechanical insects can even change their gait and crawl over obstacles, all using no more intelligence than an electric motor. In other words, biomechanical engineers have discovered that the intelligence is in the design, not the insect.

Comparing the thinking power of an insect’s tiny brain with the power of a computer becomes a question of design. The insect is designed for eating garbage and buzzing around your head. Although you might feed your computer a diet of garbage, it’s really designed for executing programs that calculate answers for you, thus letting you communicate and otherwise waste your time. Any comparison between bug and computer is necessarily unfair. It must inevitably put one or the other on alien turf. The computer is easily outclassed by anything that walks, but walking bugs won’t help you solve crossword puzzles.

With that perspective, you can begin to see the problems with comparing computer and human intelligence. Although they sometimes perform the same tasks—for example, playing chess—they have wildly different designs. A human can forage in the jungle (African or Manhattan), whereas a computer can only sit around and rust. A computer can turn one pixel on your monitor green at 7:15 a.m. on August 15, 2012, whereas you might not even bother waking up for the event.

Computers earn our awe because they can calculate faster and remember better. Even the most complex computer applications—video games, photo editors, and digital animators—all reduce to these two capabilities.

Calculating involves making decisions and following instructions. People are able to and often do both, so comparing computers to people is only natural. However, whereas it might take you 10 minutes to decide between chocolate and vanilla, a computer makes a decision in about a billionth of a second. If you had to make all the decisions your computer does to load a copy of Microsoft Windows, multicellular organisms could evolve, rise up from the sea, mutate into dinosaurs and human beings, and then grow wings and fly to nearby planets before you finished. So it stands to reason that computers are smarter, right?

As with insects, the playing field is not quite level. Your decision isn’t as simple as it looks. It’s not a single issue: Your stomach is clamoring for chocolate, but your brain knows that with just one taste, a zit will grow on your nose and make you look like a rhinoceros. And there are those other flavors to tempt you, too. You might just have several billion conflicts to resolve before you can select a cone, plain or sugar, and fill it full of…did you want ice cream or frozen yogurt, by the way?

No doubt some of us follow instructions better than others. Computers, on the other hand, can’t help but follow their instructions. They are much better at it, patiently going through the list step by step. But again, that doesn’t make them smarter than you. On the contrary, if you follow instructions to the letter, you’re apt to end up in trouble. Say you’ve found the map to Blackbeard’s treasure, and you stand, shovel in hand, at the lip of a precipice. If the map says to take 10 paces forward, you’re smart enough to know better. A computer confronted with the same sort of problem would plummet from its own digital precipice. It doesn’t look ahead (well, some new machines have a limited ability to sneak a peek at coming instructions), and it doesn’t know enough to stop short of trouble.

Computers have excellent memory. But unlike a person’s, a computer’s memory isn’t relational. Try to remember the name of an old friend, and you might race through every name you know in alphabetical order. Or you might remember he had big ears, and instantly the name “Ross” might pop into your mind. You’ve related your friend’s name to a distinguishing feature.

By itself, a computer’s memory is more like a carved stone tablet—permanent, unchanging, and exacting. A computer has to know precisely what it’s looking for in order to find it. For example, a computer can find a record of your 2002 adjusted gross income quite easily, but it can’t come up with Ross’s name based on the fact that he has big ears. On the other hand, you, as a mere human, have about zero chance of remembering the 2002 adjusted gross incomes of 10,000 people—something even a 10-year-old personal computer can tackle adroitly.

The point is not to call either you or your computer dumb but rather to make you see the differences in the way you both work. The computer’s capabilities complement your own. That’s what makes computers so wonderfully useful. They can’t compete with people in the brain-function department, except in the imaginations of science fiction writers.

So What Is a Computer?

In today’s world and using today’s technology, a computer is an electronic device that uses digital logic under the direction of a program for carrying out calculations and making decisions based on those calculations. By this essential definition, a computer has four elements to its construction: electronics, digital logic, programming, and calculating/decision-making.

That’s a mouthful of a definition, and one that’s pretty hard to swallow, at least in one piece. To help you understand what all this means, let’s take a closer look at each of the elements of this definition.

Electronics

The technology that makes the circuits of a computer work is called electronics. Dig into this word, and you’ll find that it comes from electrons, the lightweight particles carrying a negative charge that comprise one of the fundamental constituents of atoms. The flow of electrons through metals is what we normally think of as electricity, a word taken from the Greek word for amber, elektron. Static electricity, the stuff that creates sparks when you touch a metal doorknob after shuffling across a carpet on a dry winter day, was once most readily produced by rubbing amber with a silk cloth. So, the stuff that makes computers work is named after what’s essentially petrified pine tree sap.

Electronics is a technology that alters the flow of charges through electrical circuits. In the more than two centuries since Benjamin Franklin started toying with kites and keys, scientists have learned how to use electricity to do all sorts of useful things, with two of the most important being to operate motors and burn light bulbs. Motors can use electricity to move and change real-world objects. Lights not only help you see but also let you see what an electrical circuit is doing.

For the record, this electricity stuff isn’t essential for building a computer. Babbage showed how it could be done with cams, levers, gears, and an old crank (which might have been Babbage himself). If you, like Babbage, have too much time on your hands, you could build a computer for yourself that runs hydraulically or with steam. In fact, scientists hope to build computers that toy with quantum states.

However, electricity and electronics have several big advantages for running a computer over nearly everything else (except that quantum stuff, and that’s why scientists are interested in it). Electricity moves quickly, at nearly the speed of light. Electrical devices are easy to interface with (or connect to) the real world. Think of those motors and electric lights. They operate in the real world, and an electrical computer can change the currents that run these motors and lights. Moreover, engineers have mastered the fabrication of electrical circuits of all sizes, down to those so small you can’t see them, even if you can see the results of their work on your computer screen. And above all, electrical circuits are familiar, with off-the-shelf parts readily available, so you can easily tinker with electrical devices and build them economically. And that’s the bottom line. Just as celebrities are famous primarily for being famous, electronics are used a lot because they are used a lot.

Digital

Most of the circuits inside a computer use a special subset of electronic technology called digital electronics. The most important characteristic of the digital signals in these circuits is that they usually have only two states, which are most often described as on and off (or one and zero). Some special digital systems have more than two states, but they are more than you need to understand right now.

Usually the states are defined as the difference between two voltage levels, typically zero and some standard voltage, or between a positive and negative voltage value. The important part about the states of digital electronics is not what they are but what is between them—nothing. Certainly you can find a whole world of numbers between zero and one—fractions come to mind—but with digital technology the important fact is not whether there could be something between the digital states, but that anything other than the two digital states gets completely ignored. In essence, digital technology says if something is not a zero it is a one. It cannot be anything else.

Think about it: defining the world this way makes a certain kind of sense. For example, an object is either a horse or it is not a horse. Take a close look. It has hooves, four legs, a mane, and a tail, so you call it a horse. If it has six legs and a horny shell and, by the way, you just stepped on it, it is probably not a horse. Yes, we could get on shaky ground with things such as horseflies, but nit-picking like that is just noise, which is exactly what the two digital states ignore.

This noise-free either/or design is what makes digital technology so important. Noise is normally a problem with electrical signals. The little bit of electricity in the air leaks into the electrical signals flowing through wires. The unwanted signal becomes noise, something that interferes with the signal you want to use. With enough noise, the signal becomes unusable. Think of trying to converse over the telephone with a television playing in the background. At some point, turning the TV up too much makes it impossible to hold a phone conversation. The noise level is simply too high.

Digital signals, however, allow circuits to ignore the noise. For computers, that’s wonderful, because every little bit of added noise could confuse results. Adding two plus two would equal four plus some noise—perhaps just a little, but a little might be the difference between being solvent and having your checking account overdrawn. Noise-free digital technology helps ensure the accuracy of computer calculations.

But sometimes things can get tricky. Say you encounter a beast with hooves, four legs, a mane, a tail, and black-and-white stripes. That’s a horse of a different color—a creature that most zoologists would call a zebra and spend hours telling you why it’s not a horse. The lesson here (besides being careful about befriending didactic zoologists) is that how you define the difference between the two digital states is critical. You have to draw the line somewhere. Once you do—that is, once you decide whether a zebra is a horse or not—it fits the two-state binary logic system.
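
If you like seeing the idea in concrete form, here is a minimal sketch in Python (not from the original text) of that line-drawing. The threshold voltage and the signal values are invented for illustration; real logic families define their own levels.

    # Anything above an assumed threshold counts as a one; anything below
    # it counts as a zero. The exact numbers here are illustrative only.
    THRESHOLD_VOLTS = 1.5

    def to_digital(voltage):
        """Snap an analog voltage to a clean digital state."""
        return 1 if voltage >= THRESHOLD_VOLTS else 0

    # A nominal "one" (3.0 V) and "zero" (0.0 V), each corrupted by a little
    # noise, still come out as clean ones and zeros.
    noisy_signal = [2.8, 0.3, 3.2, 0.1, 2.9]
    print([to_digital(v) for v in noisy_signal])   # prints [1, 0, 1, 0, 1]

Everything between the two states, noise included, simply disappears in the translation.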

Logic

Logic is what we use to make sense of digital technology. Logic is a way of solving problems, so you can consider it a way of thinking. For computers, “logically” describes exactly how they think. Computers use a special system of logic that defines rules for making decisions based, roughly, on the same sort of deductive reasoning used by Sherlock Holmes, some great Greek philosophers, and even you (although you might not be aware of it—and might not always use it).

Traditional logic uses combinations of statements to reach a conclusion. Here is an example of logical reasoning:

  1. Dragons eat people.

  2. I am a people.

  3. There is a dragon waiting outside my door.

Conclusion: If I go outside, I will be eaten.

This sort of reasoning works even when you believe more in superstition than science, as the example shows.

Computer circuitry is designed to follow the rules of a formal logic system and will always follow the rules exactly. That’s one reason why people sometimes believe computers are infallible. But computers make no judgments about whether the statements they operate on are true or false. As long as they get logically consistent statements (that is, the statements don’t contradict one another), they will reach the correct conclusions those statements imply.

Computers are not like editors, who question the content of the information they are given. To a computer, proposition 1, “Dragons eat people,” is accepted unquestioningly. Computers don’t consider whether a race of vegetarian dragons might be running about.
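
To make the point concrete, here is a hedged illustration in Python (not from the original text): the dragon example reduced to bare true-or-false statements. The variable names are invented, and the computer never asks whether any of them describes the real world; it only combines the values it is given.

    # Premises, accepted without question.
    dragons_eat_people = True
    i_am_a_person = True
    dragon_outside_door = True

    # The conclusion follows mechanically from a logical AND of the premises.
    eaten_if_i_go_outside = dragons_eat_people and i_am_a_person and dragon_outside_door
    print(eaten_if_i_go_outside)   # prints True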

Computers don’t judge the information they process because they don’t really process information. They process symbols represented by electrical signals. People translate information into symbols that computers can process. The process of translation can be long and difficult. You do some of it by typing, and you know how hard that is—translating thoughts into words, then words into keystrokes.

Computers work logically on the symbols. For the computer, these symbols take electronic form. After all, electrical signals are the only things that they can deal with. Some symbols indicate the dragon, for example. They are the data. Other symbols indicate what to do with the data—the logical operations to carry out. All are represented electronically inside the computer.
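
As a small, hypothetical illustration of symbols taking electronic (and therefore numeric) form, here is how Python exposes the character codes behind a word. The choice of “dragon” simply continues the example; the numbers are standard ASCII codes.

    # The word "dragon" as a computer holds it: not as an idea, but as a
    # pattern of numbers standing in for the letters.
    word = "dragon"
    codes = [ord(letter) for letter in word]   # numeric code for each letter
    print(codes)                  # prints [100, 114, 97, 103, 111, 110]
    print(word.encode("ascii"))   # the same symbols as raw bytes: b'dragon'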

Engineers figured out ways of shifting much of the translation work from you to the computer. Consequently, most of the processing power of a computer is used to translate information from one form to another, from something compatible with human beings into an electronic form that can be processed by the logic of the computer. Yes, someone has to write logic to make the translation—and that’s what computer programming is all about.

Programmability

A computer is programmable, which means that it follows a program. A program is a set of instructions that tells the computer how to carry out a specific task. In that way, a program is like a recipe for chocolate chip cookies (a metaphor we’ll visit again) that tells, step by step, how to mix ingredients together and then burn the cookies.

Programmability is important because it determines what a computer does. Change the program the computer follows, and it will start performing a new, different task. The function of the computer is consequently determined by the program.

That’s how computers differ from nearly all machines that came before them. Other machines are designed for a specific purpose: A car carries you from here to there; an electric drill makes holes in boards and whatever else gets in the way; a toaster makes toast from bread. But a computer? It can be a film editor, a dictation machine, a sound recorder, or a simple calculator. The program tells it what to do.
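
To see programmability boiled down to a toy, consider this minimal sketch in Python (not from the original text). The “machine” and its two instructions are invented for illustration: one interpreter, two different programs, two different behaviors from the same hardware.

    def run(program, value):
        """Follow a program, a list of instructions, step by step."""
        for instruction, operand in program:
            if instruction == "add":
                value += operand
            elif instruction == "multiply":
                value *= operand
        return value

    doubler = [("multiply", 2)]
    add_then_triple = [("add", 5), ("multiply", 3)]

    print(run(doubler, 10))           # prints 20
    print(run(add_then_triple, 10))   # prints 45

Change the list of instructions, and the same machine does something entirely different, which is the whole point.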

Calculating and Decision-Making

The real work that goes on inside your computer bears no resemblance to what you ask it to do. Programs translate human-oriented tasks into what the computer actually does. And what the computer does is amazingly simple: It reacts to and changes patterns of bits, data, and logic symbols.

Most of the time the bit-patterns the computer deals with represent numbers. In fact, any pattern of binary bits—the digital ones and zeroes that are the computer’s fodder—translates directly into a binary number. The computer manipulates these bit-patterns or binary numbers to come up with its answers. In effect, it is calculating with binary numbers.

The computer’s calculations involve addition, subtraction, multiplication, division, and a number of other operations you wouldn’t consider arithmetic, such as logic operations (and, or, and not) and strange tasks such as moving bits in a binary code left and right.
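
For readers who want to see those operations spelled out, here is a brief, illustrative example using Python’s built-in bitwise operators. The particular bit patterns are arbitrary.

    a = 0b1100   # the bit pattern 1100, which is the number 12
    b = 0b1010   # the bit pattern 1010, which is the number 10

    print(a + b)             # 22       -- ordinary arithmetic
    print(bin(a & b))        # 0b1000   -- AND: bits set in both patterns
    print(bin(a | b))        # 0b1110   -- OR: bits set in either pattern
    print(bin(~a & 0b1111))  # 0b11     -- NOT, limited here to four bits
    print(bin(a << 1))       # 0b11000  -- shift the bits one place left
    print(bin(a >> 2))       # 0b11     -- shift the bits two places right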

More importantly, the computer can compare two binary codes and do something based on that comparison. In effect, it decides whether one code is bigger (or smaller, or whatever) than another code and acts on that decision.
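
And here, again only as an illustrative Python sketch with arbitrary codes, is the compare-and-act step in the same vein:

    code_a = 0b0101   # 5
    code_b = 0b0011   # 3

    # The comparison decides which operation the machine performs next.
    if code_a > code_b:
        result = code_a - code_b
    else:
        result = code_b - code_a

    print(result)   # prints 2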

Work your way up through this discussion, and you’ll see that those codes aren’t necessarily numbers. They could be translations of human concerns and problems—or just translations of letters of the alphabet. In any case, the computer can decide what to do with the results, even if it doesn’t understand the ideas behind the symbols it manipulates.

The whole of the operation of the computer is simply one decision after another, one operation after another, as instructed by the computer’s program, which is a translation of human ideas into a logic system, expressed in binary code, that can be carried out by electronic signals.

So what is a computer? You have one answer—one that more than anything else tells you that there is no easy answer to what a computer is. In fact, it takes an entire book to explain most of the details, the ins and outs of a computer. This book.
