

Computer memory

Is your memory like an elephant's... or is it more like a sieve? You often hear people comparing themselves to one of those things, but you almost never hear someone say their memory is like a computer. That's partly because human brains and computer memories have very different purposes and operate in quite different ways. But it also reflects the fact that where we humans often struggle to remember names, faces, and even the day of the week, computer memories are the closest thing we have to memory perfection. How exactly do these "remarkable rememberers" actually work? Let's take a closer look!

Photo: A computer memory chip like this is an example of an integrated circuit. That means it's a miniaturized collection of thousands of electronic parts (usually called components) created on a tiny chip of silicon about the size of a pinkie nail. This one is a 1-gigabit NAND flash memory chip from a USB memory stick.


Contents

  1. What is memory?
  2. The two types of memory
  3. Internal memory
  4. The growth of RAM
  5. Auxiliary memory
  6. How memories store information in binary
  7. A brief history of computer memory

What is memory?

The basic purpose of memory—human or machine—is to keep a record of information for a period of time. One of the really noticeable things about human memory is that it's extremely good at forgetting. That sounds like a major defect until you consider that we can only pay attention to so many things at once. In other words, forgetting is most likely a clever tactic humans have evolved that helps us to focus on the things that are immediately relevant and important in the endless clutter of our everyday lives—a way of concentrating on what really matters. Forgetting is like turning out old junk from your closet to make room for new stuff. [1]

Computers don't remember or forget things the way that human brains do. Computers work in binary (explained more fully in the box below): they either know something or they don't—and once they've learned, barring some sort of catastrophic failure, they generally don't forget. Humans are different. We can recognize things ("I've seen that face before somewhere") or feel certain that we know something ("I remember learning the German word for cherry when I was at school") without necessarily being able to recollect them. Unlike computers, humans can forget... remember... forget... remember... making memory seem more like art or magic than science or technology. When clever people master tricks that allow them to memorize thousands of pieces of information, they're celebrated like great magicians—even though what they've achieved is far less impressive than anything a five-dollar USB flash memory stick could do!


Illustration: Computers remember things in a very different way from human brains, although it is possible to program a computer to remember things and recognize patterns in a brain-like way using what are called neural networks. Historic illustration of brain anatomy c.1543 by Jan Stephan van Calcar, who worked closely with the pioneering anatomist Andreas Vesalius.


The two types of memory

One thing human brains and computers do have in common is different types of memory. Human memory is actually split into a short-term "working" memory (of things we've recently seen, heard, or processed with our brains) and a long-term memory (of facts we've learned, events we've experienced, things we know how to do, and so on, which we generally need to remember for much longer). A typical computer has two different kinds of memory as well.

There's a built-in main memory (sometimes called internal memory), made up of silicon chips (integrated circuits). It can store and retrieve data (computerized information) very quickly, so it's used to help the computer process whatever it's currently working on. Generally, internal memory is volatile, which means it forgets its contents as soon as the power is switched off. That's why computers also have what's called auxiliary memory (or storage), which remembers things even when the power is disconnected. In a typical PC or laptop, auxiliary memory is generally provided by a hard drive or flash memory. Auxiliary memory is also called external memory because, in older, larger computers, it was typically housed in a completely separate machine connected to the main computer box by a cable. In a similar way, modern PCs often have plug-in auxiliary storage in the form of USB flash memory sticks, SD memory cards (which plug into things like digital cameras), plug-in hard drives, CD/DVD ROMs and rewriters, and so on.


Photo: These two hard drives are examples of auxiliary computer memory. On the left, we have a 20GB PCMCIA hard drive from an iPod. On the right, there's a somewhat bigger 30GB hard drive from a laptop. The 30GB hard drive can hold about 120 times more information than the 256MB flash memory chip in our top photo. See more photos like this in our main article on hard drives.

In practice, the distinction between main memory and auxiliary memory can get a little blurred. Computers have a limited amount of main memory (typically somewhere between 512MB and 4GB on a modern computer). The more they have, the more quickly they can process information, and the faster they get things done. If a computer needs to store more information than its main memory has room for, it can temporarily move less important things from the main memory onto its hard drive, in what's called virtual memory, to free up some space. When this happens, you'll hear the hard drive clicking away at very high speed as the computer reads and writes data back and forth between its virtual memory and its real (main) memory. Because hard drives take more time to access than memory chips, using virtual memory is a much slower process than using main memory—and it really slows your computer down. That's essentially why computers with more memory work faster.
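
If you're curious how your own machine divides things up, here's a minimal Python sketch that prints main memory and swap (virtual memory) usage. It assumes you've installed the third-party psutil package (for example, with pip install psutil); slightly confusingly, psutil calls main memory "virtual_memory" and the disk-based overflow "swap_memory":

    # Peek at main memory versus swap (virtual memory) usage.
    # Assumes the third-party psutil package: pip install psutil
    import psutil

    ram = psutil.virtual_memory()   # despite the name, this is main (physical) memory
    swap = psutil.swap_memory()     # the disk space used for virtual memory paging

    print(f"Main memory: {ram.total / 1e9:.1f} GB total, {ram.percent}% in use")
    print(f"Swap space:  {swap.total / 1e9:.1f} GB total, {swap.percent}% in use")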

Internal memory

RAM and ROM

The chips that make up a computer's internal memory come in two broad flavors known as RAM (random access memory) and ROM (read-only memory). RAM chips remember things only while a computer is powered on, so they're used for storing whatever a computer is working on in the very short term. ROM chips, on the other hand, remember things whether or not the power is on. They're preprogrammed with information in the factory and used to store things like the computer's BIOS (the basic input/output system that operates fundamental things like the computer's screen and keyboard). RAM and ROM are not the most helpful names in the world, as we'll shortly find out, so don't worry if they sound baffling. Just remember this key point: the main memory inside a computer is based on two kinds of chip: a temporary, volatile kind that remembers only while the power is on (RAM) and a permanent, nonvolatile kind that remembers whether the power is on or off (ROM).

The growth of RAM

Today's machines have vastly more RAM than early home computers. This table shows typical amounts of RAM for Apple computers, from the original Apple I (released in 1976) to the iPhone 12 smartphone (released over four decades later), which has about half a million times more RAM onboard! (These are rough comparisons based on the idea of KB meaning about a thousand bytes, MB meaning about a million, and GB meaning about a billion. In fact, KB, MB, and GB can be a little bit ambiguous, since in computer science, 1KB is actually 1024 bytes. Don't worry about it: it really doesn't change these comparisons very much.)

Year   Machine     Typical RAM   ~ × Apple I
1976   Apple I     8KB           1
1977   Apple ][    24KB          3
1980   Apple III   128KB         16
1984   Macintosh   256KB         32
1986   Mac Plus    1MB           125
1992   Mac LC      10MB          1250
1996   PowerMac    16MB          2000
1998   iMac        32MB          4000
2007   iPhone      128MB         16000
2010   iPhone 4    512MB         64000
2016   iPhone 7    3GB           375000
2020   iPhone 12   4GB           500000
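
If you'd like to check those multiples for yourself, here's a quick Python sketch that reproduces a few rows of the right-hand column, using the rough rule from the note above that KB, MB, and GB mean about a thousand, a million, and a billion bytes:

    # Rough RAM comparisons from the table above.
    KB, MB, GB = 1_000, 1_000_000, 1_000_000_000

    machines = [
        ("Apple I", 1976, 8 * KB),
        ("Apple ][", 1977, 24 * KB),
        ("Macintosh", 1984, 256 * KB),
        ("iMac", 1998, 32 * MB),
        ("iPhone 12", 2020, 4 * GB),
    ]

    apple_i_ram = machines[0][2]
    for name, year, ram in machines:
        print(f"{year} {name}: {ram // apple_i_ram}x the Apple I's RAM")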


Photo: The Apple ][ had a basic 4K of memory, expandable to 48K. That seemed a huge amount at the time, but a modern smartphone has about 60,000 times more RAM than its 48K predecessor. In 1977, a 4K RAM upgrade for an Apple ][ cost a whopping $100, which works out at $1 for 41 bytes; today, it's easy to find 1GB for less than $10, so $1 buys you over 100MB—about 2.5 million times more memory for your money!


Random and sequential access

This is where things can get slightly confusing. RAM has the name random access because (in theory) it's just as quick for the computer to read or write information from any one part of a RAM memory chip as from any other. (Incidentally, that applies just as much to most ROM chips, which you could say are examples of nonvolatile RAM chips!) Hard drives are also, broadly speaking, random-access devices, because it takes roughly the same time to read information from any point on the drive.

Picture: 1) Random access: A hard drive can read or write any piece of information in more or less the same amount of time, just by scanning its read-write head back and forth over the spinning platter. 2) Sequential access: A tape drive has to spool the tape backward or forward until it's at the right position before it can read or write information.

Not all kinds of computer memory are random access, however. It used to be common for computers to store information on separate machines, known as tape drives, using long spools of magnetic tape (like giant-sized versions of the music cassettes in old-fashioned Sony Walkman cassette players). If the computer wanted to access information, it had to spool backward or forward through the tape until it reached exactly the point it wanted—just like you had to wind back and forth through a tape for ages to find the track you wanted to play. If the tape was right at the beginning but the information the computer wanted was at the very end, there was quite a delay waiting for the tape to spool forward to the right point. If the tape just happened to be in the right place, the computer could access the information it wanted pretty much instantly. Tapes are an example of sequential access: information is stored in sequence, and how long it takes to read or write a piece of information depends on where the tape happens to be in relation to the read-write head (the magnet that reads and writes information from the tape) at any given moment.
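
To get a feel for the difference in software, here's a toy Python sketch (purely an illustration; real disks and tapes behave very differently) that contrasts jumping straight to the last byte of a file with "spooling" past every byte that comes before it:

    # Toy contrast between random and sequential access, using an ordinary file.
    import os
    import tempfile
    import time

    path = os.path.join(tempfile.gettempdir(), "access-demo.bin")
    with open(path, "wb") as f:
        f.write(b"x" * 10_000_000)   # a 10-million-byte scratch file

    target = 9_999_999               # position of the last byte

    # Random access, disk-style: jump straight to the byte we want.
    start = time.perf_counter()
    with open(path, "rb") as f:
        f.seek(target)
        f.read(1)
    print(f"Seek straight there: {time.perf_counter() - start:.4f} seconds")

    # Sequential access, tape-style: read past every byte that comes first.
    start = time.perf_counter()
    with open(path, "rb") as f:
        for _ in range(target):
            f.read(1)
        f.read(1)
    print(f"Read through the lot: {time.perf_counter() - start:.4f} seconds")

    os.remove(path)                  # tidy up the scratch file

On a typical machine, the seek takes a tiny fraction of a second, while reading through everything first takes noticeably longer—the same basic reason a tape drive keeps you waiting while a disk doesn't.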


Photo: Sequential access in action: This is the operator's terminal of an IBM System/370 mainframe computer dating from 1981. You can see a bank of five tape drives whirring away in the background and, behind them, cupboards filled with stored tapes. If the computer needed to read some really old data (say, last year's payroll records or a backup of data made a few days ago), a human operator had to search for the correct tape in the cupboard and then "mount it" (load it into the drive) before the machine could read it! We still talk about "mounting" discs and drives to this day, even when all we mean is getting a computer to recognize some part of its memory that isn't currently active. Photo courtesy of NASA Glenn Research Center (NASA-GRC).


Photo: One small part of a giant library of magnetic tapes! Tape libraries were commonplace well into the 1980s. Although we hear of it much less now, tape is still widely used today. Photo courtesy of the Carol M. Highsmith Archive, Library of Congress, Prints and Photographs Division.

DRAM and SRAM

RAM comes in two main varieties called DRAM (dynamic RAM) and SRAM (static RAM). DRAM is the less expensive of the two and has a higher density (packs more data into a smaller space) than SRAM, so it's used for most of the internal memory you find in PCs, games consoles, and so on. SRAM is faster and uses less power than DRAM and, given its greater cost and lower density, is more likely to be used in the smaller, temporary, "working memories" (caches) that form part of a computer's internal or external memories. It's also widely used in portable gadgets such as cellphones, where minimizing power consumption (and maximizing battery life) is extremely important.

The differences between DRAM and SRAM arise from the way they're built out of basic electronic components. Both types of RAM are volatile, but DRAM is also dynamic (it needs power to be zapped through it occasionally to keep its memory fresh) where SRAM is static (it doesn't need "refreshing" in the same way). DRAM is more dense (stores more information in less space) because it uses just one capacitor and one transistor to store each bit (binary digit) of information, where SRAM needs several transistors for each bit.
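
To see why that refreshing matters, here's a toy Python simulation (a cartoon of the physics, with made-up numbers rather than real DRAM timings): each cell is a leaky capacitor that gradually loses its charge unless a refresh circuit periodically reads it and tops it back up:

    # A cartoon of DRAM refresh: each cell is a leaky capacitor storing 1 or 0.
    LEAK_PER_TICK = 0.2      # fraction of charge lost each time step (invented)
    READ_THRESHOLD = 0.5     # below this charge, a stored 1 reads back as 0

    def simulate(bits, ticks, refresh_every=None):
        charge = [1.0 if b else 0.0 for b in bits]
        for t in range(1, ticks + 1):
            charge = [c * (1 - LEAK_PER_TICK) for c in charge]   # charge leaks away
            if refresh_every and t % refresh_every == 0:
                # The refresh circuit rewrites every surviving cell at full strength.
                charge = [1.0 if c > READ_THRESHOLD else 0.0 for c in charge]
        return [1 if c > READ_THRESHOLD else 0 for c in charge]

    data = [1, 0, 1, 1, 0, 1]
    print(simulate(data, ticks=10))                   # no refresh: the 1s fade to 0s
    print(simulate(data, ticks=10, refresh_every=2))  # with refresh: the data survives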


Photo: Most memory chips are two-dimensional, with the transistors (electronic switches) that store information laid out in a flat grid. By contrast, in this 3D stack memory from 1999, the transistors are arranged vertically, as well as horizontally, so more information can be packed into a smaller space. Photo courtesy of NASA Langley Research Center (NASA-LaRC) and Internet Archive.

ROM

Like RAM, ROM also comes in different varieties—and, just to confuse matters, not all of it is strictly read-only. The flash memory you find in USB memory sticks and digital camera memory cards is actually a kind of ROM that retains information almost indefinitely, even when the power is off (much like conventional ROM), but can still be reprogrammed relatively easily whenever necessary (more like conventional RAM). Technically speaking, flash memory is a type of EEPROM (electrically erasable programmable ROM), which means information can be stored or wiped out relatively easily just by passing an electric current through the memory. Hmmm, you might be thinking, doesn't all memory work that way... by passing electricity through it? Yes! But the name is really a historic reference to the fact that erasable and reprogrammable ROM used to work a different way. Back in the 1970s, the most common form of erasable and rewritable ROM was EPROM (erasable programmable ROM). EPROM chips had to be erased by the relatively laborious and inconvenient method of first removing them from their circuit and then blasting them with powerful ultraviolet light. Imagine if you had to go through that long-winded process every time you wanted to store a new set of photos on your digital camera memory card.


Photo: An old-fashioned 32K EPROM chip (an AMD AM27C256) dating from 1986. You could only erase and reprogram these by blasting ultraviolet light through the little circular window!

Gadgets such as cellphones, modems, and wireless routers often store their software not on ROM (as you might expect) but on flash memory. That means you can easily update them with new firmware (relatively permanent software stored in ROM) whenever an upgrade comes along, by a process called "flashing." As you may have noticed if you've ever copied large amounts of information to a flash memory, or upgraded your router's firmware, flash memory and reprogrammable ROM work more slowly than conventional RAM memory and take longer to write to than to read.

Auxiliary memory

The most popular kinds of auxiliary memory used in modern PCs are hard drives, CD/DVD ROMs, and solid-state drives (SSDs), which are similar to hard drives except that they store information on large amounts of flash memory instead of spinning magnetic discs.


Photo: The 3.5-inch floppy disk was the most popular form of auxiliary memory in the 1980s and 1990s—these were the flash sticks of their time! Inside the hard, plastic protective case, there's a flimsy spinning circle of magnetic material—that's the floppy bit. You can see it if you gently slide the shutter at the top.

But in the long and fascinating history of computing, people have used all kinds of other memory devices, most of which stored information by magnetizing things. Floppy drives (popular from about the late 1970s to the mid-1990s) stored information on floppy disks. These were small, thin circles of plastic, coated with magnetic material, spinning inside durable plastic cases, which were gradually reduced in size from about 8 inches, through 5.25 inches, down to the final popular size of about 3.5 inches. Zip drives were similar but stored much more information on higher-capacity disks inside chunky cartridges. In the 1970s and 1980s, microcomputers (the forerunners of today's PCs) often stored information using cassette tapes, exactly like the ones people used back then for playing music. You might be surprised to hear that big computer departments still widely use tapes for backing up data today, largely because this method is so simple and inexpensive. It doesn't matter that tapes work slowly and sequentially when you're using them for backups, because generally you want to copy and restore your data in a very systematic way—and time isn't necessarily that critical.


Photo: Memory as it used to be in 1954. This closet-sized magnetic core memory unit (left), as tall as an adult, was made up of individual circuits (middle) containing tiny rings of magnetic material (ferrite), known as cores (right), which could be magnetized or demagnetized to store or erase information. Since any core could be read from or written to as easily as any other, this was a form of random access memory. Photos courtesy of NASA Glenn Research Center (NASA-GRC).

Going back even further in time, computers of the 1950s and 1960s recorded information on magnetic cores (small rings made from ferromagnetic and ceramic material) while even earlier machines stored information using relays (switches like those used in telephone circuits) and vacuum tubes (a bit like miniature versions of the cathode-ray tubes used in old-style televisions).

How memories store information in binary

Whether they're handling photos, videos, text files, or sounds, computers store and process all kinds of information in the form of numbers, or digits. That's why they're sometimes called digital computers. Humans like to work with numbers in the decimal (base 10) system, with ten different digits ranging from 0 through 9. Computers, on the other hand, work using an entirely different number system called binary, based on just two digits, zero (0) and one (1). In the decimal system, the columns of numbers correspond to ones, tens, hundreds, thousands, and so on as you step to the left—but in binary the same columns represent powers of two (one, two, four, eight, sixteen, thirty-two, sixty-four, and so on). So the decimal number 55 becomes 110111 in binary, which is 32+16+4+2+1. You need a lot more binary digits (also called bits) to store a number. With eight bits (also called a byte), you can store any decimal number from 0–255 (00000000–11111111 in binary).
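
Here's a minimal Python sketch of that conversion, so you can check the arithmetic or try other numbers yourself:

    # Convert decimal 55 to binary by checking each column value in turn.
    number = 55
    bits = ""
    for column in (32, 16, 8, 4, 2, 1):   # binary column values, highest first
        if number >= column:
            bits += "1"                   # this column is "switched on"
            number -= column
        else:
            bits += "0"                   # this column is "switched off"
    print(bits)          # prints 110111

    # Python's built-in bin() function does the same job in one step:
    print(bin(55))       # prints 0b110111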

One reason people like decimal numbers is because we have 10 fingers. Computers don't have 10 fingers. What they have instead is thousands, millions, or even billions of electronic switches called transistors. Transistors store binary numbers when electric currents passing through them switch them on and off. Switching on a transistor stores a one; switching it off stores a zero. A computer can store decimal numbers in its memory by switching a whole series of transistors on or off in a binary pattern, rather like someone holding up a series of flags. The number 55 is like holding up five flags and keeping one of them down in this pattern:

Artwork: 55 in decimal is equal to (1×32) + (1×16) + (0×8) + (1×4) + (1×2) + (1×1) = 110111 in binary. A computer doesn't have any flags inside it, but it can store the number 55 with six transistors switched on or off in the same pattern.

So storing numbers is easy. But how can you add, subtract, multiply, and divide using nothing but electric currents? You have to use clever circuits called logic gates, which you can read all about in our logic gates article.
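
As a tiny taste of how that works, here's a minimal Python sketch of a half adder, the simplest adding circuit, which combines an XOR gate and an AND gate to add two binary digits:

    # A half adder: adds two bits using the same logic a real circuit would.
    def half_adder(a, b):
        total = a ^ b     # XOR gate: the sum bit is 1 if exactly one input is 1
        carry = a & b     # AND gate: the carry bit is 1 only if both inputs are 1
        return total, carry

    for a in (0, 1):
        for b in (0, 1):
            total, carry = half_adder(a, b)
            print(f"{a} + {b} = carry {carry}, sum {total}")

Chain circuits like this together (via what are called full adders) and you can add binary numbers of any size, which is essentially what the arithmetic unit inside a computer's processor does.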

A brief history of computer memory

Here are just a few selected milestones in the development of computer memory; for the bigger picture, please check out our detailed article on the history of computers.



References

  1. For more about human memory strategies, see Daniel Schacter's The Seven Sins of Memory: How the Mind Forgets and Remembers, Houghton Mifflin Harcourt, 2002. On the specific idea that forgetting is a useful feature of memory, see Scott A. Small's Forgetting: The Benefits of Not Remembering, Crown, 2021.
