A computer memory unit based on a single atom could mark the end of Moore's Law, the rule of thumb named after Gordon Moore, cofounder of Fairchild Semiconductor and Intel, who observed that the number of components that could be squeezed onto an integrated circuit, and with it computing power, doubles roughly every two years. A study published in Nature now shows that a single bit, a binary digit, can be stored in a single atom. That is the ultimate limit of cramming components onto chips, at least until subatomic particles become amenable to manipulation. The team demonstrated how to read and write 1s and 0s to an individual atom using the electrified tip of a scanning tunneling microscope.
The Moore, the merrier