
Essential Guide to Memory

In this chapter…

Overview of Memory Chips

Nonvolatile ROM

Volatile RAM

Memory Interfaces

Future Memories

Overview of Memory Chips

Memory chips are among the most plentiful kinds of chips, making up a big part of semiconductor shipments and sales around the world. They’re produced and sold by the millions, and nearly every electronic item that has any kind of chip in it has at least one memory chip as well. Computers are the biggest users of memory chips. Most PCs have at least eight memory chips, and they often contain two dozen or more. Memory chips also lead the way in new advances in chip manufacturing. New silicon technologies are tried out on memory chips before they’re used anywhere else.

Chip manufacturing gets better, faster, and more advanced all the time, as we know, and nowhere is this more evident than in memory chips. In fact, memory chips are often used as the yardstick for the whole semiconductor industry’s progress. They’re the standard by which other chips are measured.

As advanced as they are, memory chips are also among the cheapest of all chips. Even though they continually get better, faster, and more efficient, memory chips never seem to get more expensive. It’s this constant increase in functionality (or spiraling decrease in price, depending on your perspective) that has, more than anything else, led to the invasion of electronics into everyday low-cost items in our homes, cars, and pockets. If memory weren’t so cheap, microprocessors, PCs, cell phones, and video games wouldn’t be so inexpensive, either.

About Memory Chips

As you know, memory chips store data. They do this electronically in the form of bits. A bit of memory is simply a tiny amount of electricity. Each bit either has an electric charge or it doesn’t; there’s no in between. That makes life easy for the memory chip. Each bit is either one way or the other: either black or white, up or down, true or false, one or zero. It’s pretty easy to remember a bit when there are only two options.
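
If you like to see ideas in code, here’s a tiny Python sketch of the same point (it’s not a model of any real chip, and the cell values are made up purely for illustration): each storage cell is either charged or not, and a row of those ones and zeros adds up to an ordinary number.

    # Eight made-up storage cells, each holding a charge (1) or no charge (0).
    bits = [1, 0, 1, 1, 0, 0, 1, 0]

    # Reading the cells from the most significant end, the ones and zeros
    # combine into a single number.
    value = 0
    for bit in bits:
        value = value * 2 + bit

    print(value)  # prints 178, the number these eight bits happen to encode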

Each bit of storage in a memory chip uses a small amount of silicon on the chip. The bigger the memory chip, the more bits it can store. The storage capacity of memory chips has grown dramatically over the past 30 years, from chips that held 1,000 bits to today’s chips, which hold 1 billion bits—a million-fold improvement. Yet for all that, memory prices have remained surprisingly flat.

If you’re making memory chips, you try to squeeze each bit into as little space as possible to save silicon. If each bit is even slightly smaller, a million-bit chip will be significantly smaller and cheaper to manufacture. Memory prices are quite competitive, so even a tiny size or cost advantage over your competitors can spell millions of dollars of extra profit over a few months of production. Memory makers never stop trying to make transistors slightly smaller or pack them closer together. As with any commodity, small differences in manufacturing efficiency can translate into big differences in the bottom line.

The constant pressure and competition among memory makers and the lack of any big differentiating features has kept the prices of these chips consistently low. Figure 7.1 shows how the cost per bit of memory has plummeted over several generations of silicon technology. If the cost per bit falls, the cost for a given type of memory chip falls with it. Expensive memory chips quickly become cheap memory chips, and then disappear from the market completely. The commercial shelf life of memory chips is only slightly longer than that of romance novels.

Figure 7.1. The cost to manufacture 1 million bits (1 Mbit) of semiconductor memory chips has fallen by a factor of 1 million since the first memory chips were introduced.

Even as the cost per bit sinks like a rock, the number of bits per chip rises like a rocket. The net effect is that the price of a new memory chip has remained remarkably constant for years. You just get more for your money.
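
To see how that works arithmetically, here’s a quick Python sketch with invented numbers (the starting capacity, the starting cost per bit, and the tidy factor-of-four steps are assumptions for illustration, not industry data): if each generation quadruples the bits per chip while the cost per bit drops by roughly the same factor, the price of the newest chip barely moves.

    bits_per_chip = 64_000_000   # assumed starting point: a 64-Mbit chip
    cost_per_mbit = 4.00         # assumed starting cost: $4 per million bits

    for generation in range(4):
        chip_price = (bits_per_chip / 1_000_000) * cost_per_mbit
        print(f"generation {generation}: {bits_per_chip / 1_000_000:,.0f} Mbit "
              f"at ${cost_per_mbit:.2f}/Mbit = ${chip_price:.0f} per chip")
        bits_per_chip *= 4       # the next generation holds four times the bits...
        cost_per_mbit /= 4       # ...at roughly one-quarter the cost per bit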

Memory chips are sold in several standard sizes, or capacities, like cartons of milk. You can’t buy an arbitrary number of memory bits like bananas to suit yourself; you have to choose from among a few standard sizes. Each new memory chip usually offers four times the capacity of the chip that came before it. For example, you might find chips that store 64 million bits, 256 million bits, and 1,024 million bits for sale at the same time. If what you really wanted was a 288-million-bit chip, you’re out of luck and you will have to pay for the 1,024-million-bit chip.
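
Here’s a small Python sketch of that rounding-up rule. The list of standard capacities and the helper function are hypothetical, included only to show how the 288-million-bit request from the example ends up costing you a 1,024-million-bit chip.

    # Example standard capacities from the text, in millions of bits (Mbit).
    STANDARD_CAPACITIES_MBIT = [64, 256, 1024]

    def smallest_chip_for(required_mbit):
        """Return the smallest standard chip that can hold the requested bits."""
        for capacity in STANDARD_CAPACITIES_MBIT:
            if capacity >= required_mbit:
                return capacity
        raise ValueError("bigger than the largest standard chip")

    print(smallest_chip_for(288))  # prints 1024: you pay for the bigger chip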

Apart from capacity, the only thing that differentiates one memory chip from another is speed. Some memory chips can retrieve their data faster than others. Large-capacity chips are generally slower than small-capacity chips (because it takes longer to locate any given bit in a bigger array). Some customers will pay a price premium for faster memory chips, but others won’t. If you’re a memory maker, you have to decide whether to design your chips for maximum capacity (making them slower) or for speed (making them less capacious). The balancing act is a delicate one, and it depends on market conditions, which are difficult to predict.

Types of Memory

Memory chips come in two basic types: permanent and impermanent. The permanent kind stores data bits forever, even after electrical power has been turned off or disconnected from the chip. Their memory lasts for years and years without batteries. The impermanent kind, on the other hand, works only while the power is on. When you power down or disconnect these chips, all their memory is lost. Although that might seem pointless, the latter category is actually the more popular of the two.

Chips with permanent memories are called nonvolatile memory chips. Impermanent memory chips are called volatile memory. That doesn’t mean the chips are explosive, only that their memories are easily lost.

Another way to categorize memory chips is by ROM versus RAM. Your basic read-only memory (ROM) can’t be erased or have new data written into it. It’s permanent, like a stone tablet. You can erase and reprogram a random-access memory (RAM) chip as often as you want. All RAM chips are volatile, losing their memories if not carefully tended. All ROM chips are nonvolatile, storing their data forever, but not all nonvolatile chips are ROMs; some can be erased or changed. Got it?
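
One way to keep the two classifications straight is to treat them as two separate questions: does the chip keep its data without power, and can its contents ever be changed after manufacture? The little Python table below is only a memory aid using chip types covered later in this chapter; the entries simply restate the rules above.

    # Two questions per chip: keeps data without power? contents changeable?
    memory_types = {
        "Masked ROM": (True,  False),  # nonvolatile, permanent
        "EPROM":      (True,  True),   # nonvolatile, yet erasable (ultraviolet light)
        "Flash ROM":  (True,  True),   # nonvolatile, yet erasable (electrically)
        "SRAM":       (False, True),   # volatile RAM
        "DRAM":       (False, True),   # volatile RAM
    }

    for name, (keeps_data, changeable) in memory_types.items():
        power = "keeps" if keeps_data else "loses"
        print(f"{name:<10} {power} its data without power; changeable: {changeable}")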

RAM chips are more expensive than ROM chips but they’re indispensable for computers, handheld devices, and most other computer-like products. Microprocessors need an erasable “workspace” in which to perform their calculations and follow their programmed instructions. RAM chips provide this necessary capacity.

ROMs are cheaper than RAMs and they are used for permanent or semipermanent storage. ROM chips are useful for storing permanent bits of data, such as the security code for your garage-door opener or the tiny program that runs a thermostat. Computers also use ROM chips to store some permanent programs that are “etched in stone” and should never be lost, such as how to start up the computer and how to access a floppy diskette.

Nonvolatile ROM

ROM chips come in a few different types. Like all memory chips, ROM prices are based mostly on capacity, with a small price premium for speed. Some ROMs can retrieve data faster than others, and this might be worth a little extra money to someone building a fast system.

Masked ROM

The simplest and cheapest ROMs are called masked ROMs (MROMs). These chips are mass-produced with particular data bits already stored in them. Once the MROMs are made, the data can never be changed. This is usually the point: MROMs are good for mass-produced products like cars, in which the software for the antilock brake system should never be altered.

Because their data patterns are permanently set when the chips are built, MROMs are only made in large volumes for big customers. It isn’t cost-effective for a semiconductor maker to produce just a few thousand ROMs once in a while. The cost of stopping the production line and setting up the tooling is prohibitive. So although MROMs are the cheapest form of ROMs, they’re reserved for only the largest customers or the most popular products.

Programmable ROM

A more cost-effective alternative for the average consumer is the programmable ROM (PROM). These are like ROM chips but they are manufactured blank, with no data bits stored in them. (Viewed another way, all the bits are zeros.) The customer then loads the contents of the PROM using a machine called a PROM burner. There’s only one chance to program a PROM; after the data has been loaded, the PROM’s contents are permanent.
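
Here’s a toy Python model of that burn-once behavior, following the text’s convention that a blank PROM holds all zeros. The class and its names are invented for illustration; a real PROM burner obviously doesn’t work this way, but the one-chance rule is the same.

    class ToyPROM:
        """A pretend PROM: every cell starts blank and can be programmed once."""

        def __init__(self, size):
            self.cells = [0] * size          # manufactured blank: all zeros
            self.burned = [False] * size     # tracks which cells are used up

        def burn(self, address, bit):
            if self.burned[address]:
                raise RuntimeError("cell already programmed; contents are permanent")
            self.cells[address] = bit
            self.burned[address] = True

    prom = ToyPROM(8)
    prom.burn(3, 1)                          # the one and only chance
    try:
        prom.burn(3, 0)                      # a second attempt is refused
    except RuntimeError as error:
        print(error)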

Because PROMs can only be “burned” once, they’re called one-time programmable ROMs (OTP ROMs). These chips are a good way for medium-sized companies to enjoy the low cost of mass-produced ROMs without the risk of large-volume commitments. Companies can keep a supply of blank PROMs in stock for as long as necessary, burning their contents at the last minute. Sometimes this is literally true: fast-selling electronic goods are sometimes stored in the warehouse with a wire hanging from the shipping carton. The product receives its final programming just before it goes on the truck.

Erasable Programmable ROM

The next step up the cost-versus-flexibility scale is the erasable programmable ROM (EPROM). Unlike ROMs and PROMs, EPROMs can be erased if you have the right equipment. EPROMs are more expensive than either ROMs or PROMs, and they can’t be used in some safety-critical equipment because of the possibility of erasure. Still, EPROMs are extremely popular in computers and other high-volume products where standards and features need to change from time to time.

EPROMs are manufactured and shipped blank, just like PROMs. Customers load data bits into them just the same way, too, using a PROM burner. The only difference between a PROM and an EPROM is that you can erase an EPROM if you expose it to ultraviolet light for long enough. Usually 30 minutes under a special lamp is enough to erase an EPROM and return it to its original blank condition.

This violates our earlier rule about ROMs being permanent and not erasable, but EPROMs still qualify as ROMs. Because it takes half an hour or more to erase and reprogram an EPROM, these chips do not compete with RAMs, which can be erased and reprogrammed in a fraction of a second. Even though EPROMs can be erased, they’re still nonvolatile; any data stored in an EPROM will last for years unless it’s deliberately erased.

EPROMs have a small round window on the top of their package, shown in Figure 7.2, to let ultraviolet light through. The window allows light to shine directly on the silicon chip inside. The EPROM chip, unlike most chips, is designed to be sensitive to ultraviolet light, gradually losing its memory as the light shines on it.

Figure 7.2. Erasable programmable read-only memory (EPROM) chips have a window in the top that exposes the silicon chip underneath. Shining ultraviolet light through the window erases the chip’s contents.

Electrically Erasable Programmable ROM

Electrically erasable programmable ROMs (EEPROMs) are an advancement over EPROMs because they can be erased electronically, instead of through exposure to ultraviolet light. This makes EEPROM chips easier to use in many systems. For example, an EEPROM can be permanently soldered onto a printed circuit board and still be erased without removing it, whereas a normal EPROM would have to be removed from the system and placed under an ultraviolet lamp. This makes EEPROMs more user friendly in PCs and other consumer items where the tools and the skills to remove and erase an EPROM just aren’t there.

In typical engineering shorthand, EEPROMs are often called E²PROMs, or “e-squareds.” EEPROMs have many useful advantages over traditional EPROMs, but they never captured very much market share because they were quickly overtaken by flash ROMs, which are covered next.

Flash ROM

Flash ROMs are now the most popular type of erasable ROM (and no, erasable ROM is not really an oxymoron). Like EEPROMs, flash ROMs can be erased and reprogrammed completely electronically, without removing the chip from the system. This makes flash ROMs very useful in products for which a relatively naïve customer might be asked to erase and reprogram the chips without any specialized knowledge. MP3 music players and occasionally PCs fall into this category.

Where flash ROMs differ from EEPROMs is speed. EEPROMs are slow to erase but flash ROMs are pretty quick, as their name suggests. You can erase and reprogram an entire flash ROM in a few seconds, versus several minutes for a conventional EEPROM. For most consumer products, that difference has enormous psychological value. Customers are often nervous enough about reprogramming or updating the software on their products; getting it over with quickly alleviates a lot of tension.

Flash ROMs were originally much more expensive than EEPROMs, which were, in turn, more expensive than EPROMs, and so on. The price of flash chips has come down considerably in the 10 years or so since they were first developed because their volume has increased. Flash ROMs have now almost entirely replaced EEPROMs and are even encroaching on EPROM and PROM territory. The advantages of flash ROM’s erasability, relative speed, and high volume make it nearly ideal as an all-purpose permanent memory chip.

Volatile RAM

The other major category of memory chips besides ROMs is RAMs. RAM chips store data bits just like ROM chips do, but with two major differences: RAM chips can be erased and reprogrammed almost instantaneously, and the data they store is not permanent. RAM chips “forget” everything they’re storing when you turn off their electricity. RAM chips are used in computers and other products that need lots of short-term memory storage.

RAM chips are inferior to ROM chips in almost every way, yet they are necessary because they work like electronic scratchpads. Even flash ROM chips take far too much time to erase and reuse to be practical for the main memory in a computer. RAM chips are more expensive and harder to use than ROM chips, but they’re the most popular kind of memory chips made. The price of every new PC is determined, in part, by the spot price of RAM chips when the computer was built.

RAM chips come in two basic types: SRAM and DRAM. The initial S and D stand for static and dynamic RAM, respectively. Neither description is particularly useful to us here, and even the term RAM is confusing. Random-access is meant to indicate that a computer can access any bit stored in a RAM chip in random order. For example, it doesn’t have to read out the bits in sequence, like watching a videotape from start to finish. RAM chips can provide any bit of data at any time, randomly.

Honestly, ROM chips can also read out data randomly, so RAM doesn’t accurately describe the difference between them at all. Read-write memory (RWM) would be more descriptive, but harder to pronounce.
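
The difference is easy to see in a few lines of Python. The list below stands in for a memory array (purely an illustration, not a hardware model): random access jumps straight to any address, whereas sequential access, like a videotape, has to pass everything in between.

    memory = list(range(1000))     # pretend these are 1,000 stored values

    # Random access: read address 742 directly, in one step.
    print(memory[742])

    # Sequential access: start at the beginning and step forward until we
    # arrive at the same address.
    position = 0
    while position < 742:
        position += 1
    print(memory[position])        # same value, reached the slow way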

SRAM

SRAM chips are simpler to design, use less electricity in operation, cost less to manufacture, and are faster than DRAMs. Overshadowing all these advantages, however, SRAMs have one major drawback: They can store only about one-quarter as many data bits as a DRAM chip. The capacity of both SRAMs and DRAMs increases year by year, but SRAMs are always a factor of four behind their DRAM cousins. Because of that, SRAMs are generally used only in products that run on small batteries (where SRAM’s efficiency is valuable) or those that need to be very fast.

Unlike ROM chips, SRAM chips can store data as easily as they retrieve it. In fact, they can store data slightly faster than they retrieve it. This makes SRAM chips important for very fast computers. They are also used for the instruction cache and data cache inside microprocessor chips. An erasable ROM might be erased and reprogrammed with new data once per month, but SRAMs store new data thousands of times per second. SRAMs don’t have to be erased all at once like EPROMs, either. They can change one bit of data while holding millions of other bits intact.
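
A quick Python sketch makes that last contrast concrete. The byte arrays and sizes below are invented for illustration: the SRAM-style write touches one location and leaves everything else alone, while the EPROM-style device has to be wiped completely before anything new can go in.

    sram = bytearray(1024)              # a kilobyte of pretend SRAM cells
    sram[17] = 0xAB                     # change one byte; the other 1,023 stay intact

    eprom = bytearray(b"\x42" * 1024)   # pretend EPROM already holding old data
    eprom[:] = bytes(1024)              # to change anything, erase it all first...
    eprom[17] = 0xAB                    # ...then reprogram the new contents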

DRAM

DRAMs are the volume leader of the entire memory chip market. Computer makers and many consumer electronics products use DRAMs in enormous quantities because they have the greatest storage capacity for the money. SRAM chips might be faster and use less energy, but DRAMs are the cheapest way to get lots of erasable memory.

To the engineer, DRAMs have some drawbacks, but their economic advantages outweigh their technical shortcomings. For starters, DRAM chips use more electrical power to operate than SRAMs do. The difference can be dramatic (to an engineer, that is). Battery-powered products that run for hours with SRAM chips might run for only 20 minutes using DRAM chips. Laptop PCs, unfortunately, use DRAMs because of their lower cost per bit; SRAM-based PCs would have noticeably longer battery life but be far more expensive.

DRAMs also “forget” easily, a serious snag for a memory chip. In fact, the average DRAM can store data only for about one one-hundredth of a second, even if the power is on all the time. After that, it risks forgetting what bits it was storing. To prevent this, DRAM chips must have their memories refreshed about 100 times per second. This takes special circuits, and sometimes a separate chip, just to keep refreshing each DRAM’s memory. All this requires even more electricity to operate.

DRAMs can’t be used while they’re refreshing, either. That means each DRAM chip is offline about 100 times per second, or about 5 percent of the time. These little timeouts aren’t noticeable to the average person, but they complicate the design of every computer, which has to tolerate memory chips that sometimes respond and sometimes are busy refreshing themselves.
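
The 5 percent figure is easy to reconstruct with a little arithmetic. In the Python sketch below, the time taken by each refresh pass is an assumed number chosen to match the text’s estimate; real chips vary, and the names are made up for illustration.

    refreshes_per_second = 100      # from the text: refreshed about 100 times per second
    seconds_per_refresh = 0.0005    # assumption: each refresh pass takes about 0.5 ms

    busy_fraction = refreshes_per_second * seconds_per_refresh
    print(f"DRAM is busy refreshing about {busy_fraction:.0%} of the time")  # about 5%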

DRAMs don’t retain the contents of their memory when the power is turned off. This makes them volatile memory chips, like SRAMs. That’s why most personal computers have a button or a menu option to shut down; it gives your PC a chance to copy the contents of its volatile memory chips onto the hard disk before the power goes away.

Memory Interfaces

For all the different kinds of memory chips in existence, there’s no standard way to connect these chips to other chips in a system. Flash ROMs, EEPROMs, DRAMs, and SRAMs all connect in different and incompatible ways to microprocessors and other chips. Figuring out ways to make these chips all work together is one of the reasons electrical engineers get paid so well.

ROMs are generally used as low-speed devices, so the problem isn’t so bad. Many products that use ROM chips aren’t particularly performance sensitive, so getting data out of the ROM in a quick and timely manner often isn’t important. RAMs are another matter.

RAMs, and especially DRAMs, are used mostly in computer systems where speed and low cost are paramount. It’s important, therefore, to come up with a quick and reliable way to connect fast DRAMs to even faster microprocessors. Transmitting data bits back and forth between these two chips at high speed is a science unto itself, and many different methods have emerged. Most of these are arcane topics best suited for engineering conferences.


Future Memories

The need for electronic data storage will probably be with us forever, and if history is any indication, the demand for memory will grow at an amazing rate. The downward spiral in the cost per bit of memory has fueled an equal and opposite reaction by increasing demand. As soon as memory becomes affordable, we find ways to fill it up. The first Cray-1 supercomputer had a whopping (for its time) 8 megabytes of memory, the same amount as the first Palm Pilot handheld organizer. IBM’s first PC came with 64 kilobytes (KB) of memory; now PCs have more memory than that embedded in their CD-ROM drives.

Memory chips have driven semiconductor manufacturing in many areas. The incessant pressure to reduce cost and increase storage capacity pushed DRAM makers to experiment with the first stacked transistors. Like apartment buildings in crowded cities, they started building upward, increasing capacity in three dimensions. DRAM makers are also the first to benefit from any fractional improvement in photolithography that slightly decreases the size of transistors. The (almost) guaranteed large volume of DRAM sales encourages these companies to invest heavily in any technology that can increase their production capacity. Consumers benefit from a semiconductor race that keeps all the DRAM makers fighting for every advantage or any sliver of market share.

With memory makers scrambling for tiny advantages here and there, any truly momentous breakthrough would be a godsend. Storing data bits is pretty simple in concept, so researchers have experimented with numerous ways to completely reinvent the humble memory chip. Optical chips that use photons of light; three-dimensional arrays of silicon chips; iron-based chips that store data magnetically; subatomic phenomena that rely on quantum concepts of “spin” and “charm”—they’ve all been tried and tested. Eventually, some or all of these might lead to commercially viable products.

In the meantime, memory capacity will continue to rise as prices fall. The constant improvements in conventional semiconductor manufacturing that got us to this point show no sign of letting up. New hurdles present themselves, to be sure, but the forces of innovation prevail each time. The economic rewards for overcoming each set of technological hurdles ensure that the world’s best and brightest minds are brought to bear. With no end in sight, the million-fold increase in memory capacity and million-fold decrease in cost should multiply for another few powers of 10.