When will we reach an endpoint? The answer will surprise you
“When it comes to atoms, language can be used only as in poetry. The poet, too, is not nearly so concerned with describing facts as with creating images.” – Niels Bohr, recipient of the 1922 Nobel Prize in Physics
I’ve had this ongoing notion that researchers will soon create the ultimate small storage particle. In discussing this with some nanotech friends, they suggested we may reach an endpoint when we get down to the size of the electron. So I decided to run with that assumption and calculate, based on Moore’s Law, how long it would take to reach the point where we are storing information on individual electrons.
Moore’s Law has been talked about so much in recent years that some people think it was actually a law enacted by Congress and signed by the President.
Moore’s Law is the empirical observation of Intel co-founder Gordon Moore that “the number of transistors on an integrated circuit will double every 24 months.” (His original 1965 paper projected a doubling every year; he revised the period to two years in 1975.) Although it is often quoted as doubling every 18 months, Intel’s official Moore’s Law page, as well as interviews with Gordon Moore himself, confirm his figure of every two years.
Expanding on this thinking, a similar law (sometimes called Kryder’s Law) has held for hard disk storage: the cost per unit of information stored keeps falling as density rises. Hard drive capacity has been growing at roughly the same exponential rate as transistor counts, although recent trends show the pace slowing, and the historical rate has not been met for the last four years.
Another version states that RAM storage capacity increases at the same rate as processing power.
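To make the compounding concrete, here is a minimal sketch of what a fixed doubling period implies. This is my own illustration, not from the article, and the starting count and time horizon are arbitrary:

```python
def transistors_after(years, start_count=1.0, doubling_period=2.0):
    """Projected count after `years` if it doubles every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# Ten doublings over 20 years multiply the count by 2**10 = 1024.
print(transistors_after(20))  # 1024.0
```

Swapping in the often-quoted 18-month period, `transistors_after(20, doubling_period=1.5)`, gives roughly 10,000 instead, which is why the exact doubling period matters so much over long horizons.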
In writing this article, I asked retired University of Colorado professor Mark Dubin if he could do the Moore’s Law math to determine how long it will be before we are storing information on individual electrons. Here is how he calculated it:
Extending Moore’s Law into the future, we will reach the size of the hydrogen atom in 2058 and the size of an electron in 2133
Professor Mark Dubin: “I included the diameter of a hydrogen atom, which may be more relevant than the non-quantum, classical scattering radius of an electron. My resulting graph (above) shows that the hydrogen atom radius will be reached by about 2058, and the electron “radius” by about 2133.
The commonly accepted doubling period for Moore’s Law, as stated above, is about two years. So the hydrogen atom radius would be reached in about 50 years (26 iterations) from today, and the electron radius in 124 years (62 iterations) from today.
The size of a quantum dot might be more relevant. Currently the smallest lithography quantum dot is about 1-2 nanometers (0.001-0.002 microns) in diameter, which on the graph corresponds to about 2050 (43 years and 22 iterations from now). I bring this up because I think that quantum computing using quantum dots will be needed at that small scale.
Given that very crude quantum computing was just recently achieved with four or so qubits, 2050 may be a realistic estimate of when useful quantum computing might be generally available. The reason I think quantum dots are more relevant than the electron itself is that quantum computing relies on the energy levels of electrons that are normally part of some atom. Of course, free electron gases have been achieved on 2-dimensional bounded surfaces at low temperatures, so eventually the electron could be the ultimate limit (but then again there are other sub-atomic particles). However, that takes us past lithography, which is what Moore’s Law is based on.”
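Professor Dubin’s note doesn’t spell out the starting feature size or the per-iteration shrink factor behind his graph, but the shape of the calculation is simple. The sketch below is my own reconstruction with assumed inputs: a 65 nm starting feature size, a hydrogen-atom (Bohr) radius of about 0.053 nm, a classical electron radius of about 2.8 × 10⁻⁶ nm, and a √2 linear shrink per two-year iteration (i.e. bit density doubling). It lands in the same ballpark as, but not exactly on, his figures:

```python
import math

def iterations_to_reach(start_nm, target_nm, shrink_factor):
    """Moore's Law iterations needed for a feature of size `start_nm`
    to shrink to `target_nm`, dividing by `shrink_factor` each iteration."""
    return math.ceil(math.log(start_nm / target_nm) / math.log(shrink_factor))

def years_to_reach(start_nm, target_nm, shrink_factor, years_per_iteration=2.0):
    return iterations_to_reach(start_nm, target_nm, shrink_factor) * years_per_iteration

# Assumed inputs -- not taken from the article's graph.
START_NM = 65.0        # hypothetical present-day feature size
HYDROGEN_NM = 0.053    # Bohr radius of the hydrogen atom
ELECTRON_NM = 2.8e-6   # classical electron radius
SHRINK = math.sqrt(2)  # linear shrink if bit density doubles per iteration

print(years_to_reach(START_NM, HYDROGEN_NM, SHRINK))  # ~42 years
print(years_to_reach(START_NM, ELECTRON_NM, SHRINK))  # ~98 years
```

A larger assumed shrink per iteration, or a smaller starting size, pulls both dates earlier, which is one reason estimates like 2058 and 2133 are so sensitive to the underlying assumptions.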
I have no delusions about Moore’s Law holding true for the next 124 years. But this project is a very nice way of putting nanotechnology, and all of its degrees of smallness, into perspective.
By Thomas Frey