The Smallest New Thing in Computers

Daisy Raymondson

Writer’s comment: I wrote “The Smallest New Thing in Computers” for my technical writing class, English 104E. The topic occurred to me because quantum computers are the ultimate goal of some research I helped with over the summer at Los Alamos Labs. In the paper, I tried to show how some of the most fascinating fundamental research in physics can be applied in unexpected ways. I modelled the format and style after Newsweek, which includes stories on scientific topics for an educated audience of non-scientists.
—Daisy Raymondson

Instructor’s comment: Students typically come into our English 104E Scientific & Technical Writing classes concerned about their ability to write effective technical documents like lab reports, journal articles, and literature reviews. Rarely do they worry much about the lay paper assignment, and yet that assignment is in many ways the most formidable of all. Scientists who cannot communicate effectively with lay and executive readers risk cutting themselves off from the very agencies and institutions that might otherwise fund their research. Witness the current plight of physics research in the United States. Under the present White House, many American physicists have suddenly found themselves essentially de-funded, largely because they have failed to articulate cogently enough the value of their current work, whether it concern the magnetic containment of fusion energy or the untapped promise of quantum computers.
         Thankfully, in Daisy Raymondson we have an aspiring physicist who understands how to express complex scientific ideas with such clarity, precision, and irresistible logic that even non-scientific readers can appreciate the significance of the cutting edge research she so magisterially describes.
—Victor Squitieri, English Department



The Smallest New Thing In Computers

Future computers will harness the mysteries of quantum mechanics

         Imagine a computer that could break all the encryption on the Internet. All those online purchases you make, that convenient online banking—all would be transparent to this computer.
         Current encryption relies on the fact that not even the fastest computers can factor large integers into prime numbers in a reasonable amount of time. It can take months or years to do the computations necessary to steal one person’s credit card number, so it’s just not worth the trouble.
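The asymmetry can be seen in a few lines of Python. This is a toy illustration, not actual encryption code: real keys use integers hundreds of digits long, far beyond what a brute-force search could factor in any reasonable time, while multiplying the two primes together remains instantaneous.

```python
def factor(n):
    """Return the prime factors of n by trial division (brute force)."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # d divides n: record it, strip it out
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains is itself prime
        factors.append(n)
    return factors

# Multiplying two primes is fast...
print(101 * 103)        # 10403
# ...but recovering them means searching for divisors,
# and the search time explodes as the numbers grow.
print(factor(10403))    # [101, 103]
```

For the tiny number above the search finishes instantly; double the number of digits and the work roughly squares, which is why month- or year-long factorizations protect today's transactions.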
         But researchers are now laying the groundwork for quantum computers, which would possess enough sheer computational power to break those codes in seconds. Fortunately, quantum computers will not mean the end of safe encryption. Some of the same principles being used to create quantum computers are also being used to develop quantum encryption, which improves upon our current encryption. But factorization is not the only use for quantum computers; it is merely the most striking example of their competitive edge over conventional computers, which will soon reach the limits of their potential.
         Computers have steadily increased in speed and decreased in size since the days when they performed only simple calculations and took up entire rooms. In the last couple of decades, we have come to accept these trends as the norm and expect them to continue into the foreseeable future.
         But Moore’s law—the rule of thumb that the number of transistors on a computer chip doubles roughly every two years, with a corresponding gain in speed—will not hold forever. Building faster computers requires making them smaller and packing more transistors into a smaller area. The size of a computer chip not only determines what devices it can be used in, but also limits its speed. Today’s chips work so fast that the time it takes electrons to carry signals from one part of a chip to another makes up a significant fraction of the time used in performing calculations.
         All computers store information in bits. A single bit is a one or a zero, and long sequences of bits make up binary codes that represent numerical information. When a conventional computer makes a calculation, it represents the bits in an electrical circuit. Whether a particular transistor is on or off indicates the value, either one or zero, of that bit.
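As a small illustration in Python, here is how a sequence of ones and zeros encodes an ordinary number—each bit position stands for a power of two:

```python
# The number 13 stored as bits, most significant bit first.
bits = [1, 1, 0, 1]

value = 0
for b in bits:
    value = value * 2 + b   # shift the running total left, add the new bit

print(value)      # 13
print(bin(13))    # '0b1101' -- Python's own binary notation agrees
```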
         Even though new fabrication technologies will allow us to shrink future transistors even more, the laws of physics will limit just how small we can make them. Quantum mechanics, essentially a description of the behavior of light and atoms (see box), tells us that on very small scales, particles like electrons begin to behave as waves, making it increasingly difficult to confine and control them. Thus at some point, transistors become too small to remain in the on or off position reliably.
         But we can turn these effects to our advantage with quantum computers, which replace circuits and transistors with individual ionized atoms. The net charge on the ions makes it possible to suspend them in electromagnetic fields.
         The justification for using ions in place of transistors comes from quantum mechanics: a particle (an ion, for example) can have only certain values of energy, and any energy between these values is forbidden. Thus ions suspended in electromagnetic fields can be used as bits. An ion in its lowest possible energy state is a zero; an ion in an excited state (higher energy level) is a one.
         If an ion makes a transition to a higher energy level, the extra energy must come from somewhere. The atom has to absorb the energy, usually in the form of a photon, or “particle” of light. An atom (or ion) will remain in an excited state until it emits another photon with the same energy. The time a given atom spends in an excited state varies at random (another feature of quantum mechanics), but the average lifetime is well-known to experimenters and depends on the energy state.
         For quantum computing experiments, scientists choose ions with relatively long-lived excited states, called metastable states, to use for their bits. The ground state is a zero; an excited state is a one. Lasers provide the light needed to excite the ions. Instead of flipping switches in circuits, quantum computers make calculations through the interactions of the trapped ions. The state of each ion influences the states of its neighbors, so a chain of trapped ions mimics an electrical circuit.
         Here is where quantum computers differ fundamentally from their conventional counterparts. We can observe an ion in either the “zero” or the “one” state. But if we do not observe it, then not only do we not know which state it occupies, it is not really in either one (see box on previous page). Or, in a way, the ion is in both states at the same time. This mixing of states, known as the principle of superposition, leads to a startling result. Instead of functioning as a single bit, each ion acts like many bits, each available to interact with neighboring ions to perform calculations.
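One way to see why superposition multiplies computing power: describing n quantum bits classically requires tracking an amplitude for every possible string of ones and zeros—2 to the power n of them, all at once. The Python sketch below is a classical simulation of that bookkeeping (not a real quantum computer), showing an equal superposition over three quantum bits:

```python
import itertools

n = 3                             # number of quantum bits (qubits)

# Every possible n-bit string is a basis state: 2**n of them.
states = list(itertools.product([0, 1], repeat=n))

# In an equal superposition, each state gets the same amplitude;
# the probability of observing a state is the amplitude squared,
# so all 2**n probabilities sum to exactly 1.
amplitude = (1 / 2**n) ** 0.5

print(len(states))                # 8 basis states for just 3 qubits
for s in states:
    print(s, amplitude)
```

Ten conventional bits hold one 10-bit value; ten qubits in superposition carry amplitudes for all 1,024 values simultaneously, which is the source of the “phenomenal” speedup described next.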
         The resulting increase in computing power is phenomenal. Quantum computers could take on many problems which are currently intractable because of the mind-boggling amount of number crunching required. All this increased power would be available for running simulations and refining theories in physics, chemistry, geology, and medicine.
         Estimates of when we will see quantum computers in use vary, but most experts give a lower limit of fifteen years. Still, it is never too early to begin thinking of the possibilities.

Quantum Mechanics governs the behavior of very small particles such as photons, atoms, and electrons. Some salient features:
  • Small particles behave with characteristics of both particles and waves.
  • We cannot know the exact position and speed of a particle at the same time.
  • A particle can take on only certain discrete values of energy. To move from one energy state to another, it must absorb or emit a photon carrying energy equal to the energy difference between the states.


The Superposition Principle and Schrödinger’s Cat
  • An atom or ion can exist only in certain discrete energy levels. Energies between these levels are not allowed.
  • To move between energy levels, the atom must emit or absorb a photon, or particle of light, carrying energy equal to the difference between the two levels.
  • If we do not observe an atom, it has some probability to be in either state. In fact, until we observe it, it exists in a combination of both states at once, called a superposition of states.

A famous “thought experiment” proposed by Schrödinger illustrates the principle of superposition. Imagine we have a cat in a box. We will measure the energy level of a certain atom. If the atom is in its ground state, which we will call “zero,” we will release poisonous gas into the box and kill the cat. If the atom is in an excited state, which we will call “one,” we will let the cat live. If we directly control the release of the gas, we will know immediately whether the cat lives.

But what if we set a machine to release the gas upon detection of a one, and we do not observe the result? Then we do not know until we look into the box whether the cat is alive or dead. According to the most widely accepted interpretation of quantum mechanics, even if we wait a long time, the cat is really both alive and dead until we observe it, since both possibilities still exist. The cat exists in a superposition of alive and dead states!