“The next step in the silicon industry’s steadfast pursuit of ever smaller and faster chips has been unveiled. Intel has shown off what it says are the world’s first working chips which contain transistors with features just 32 billionths of a metre wide. Their production means the industry axiom that has underpinned all chip development for the last 40 years, known as Moore’s Law, remains intact. Speaking to BBC News, Dr Gordon Moore said that he expected the proposition that bears his name should continue ‘for at least another decade’.”
I clearly remember gaming magazines around 1997 claiming that chip manufacturers had hit physical barriers at 0.35 microns, roughly ten times the feature size of today’s chips. I may remember the numbers wrong, but the point is the same. There is clearly room for improvement and development.
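For a rough sense of the numbers, here is a quick back-of-the-envelope sketch in Python (the 0.35 micron and 32 nm figures come from the comments above; the square-law area scaling is the usual simplifying assumption):

    # Rough comparison of the 0.35 micron (350 nm) process mentioned above
    # with the 32 nm process from the article.
    old_feature_nm = 350.0   # 0.35 microns, circa 1997
    new_feature_nm = 32.0    # the chips in the article

    linear_shrink = old_feature_nm / new_feature_nm
    area_shrink = linear_shrink ** 2   # transistor area scales roughly with the square

    print(f"Linear shrink: {linear_shrink:.1f}x")   # ~10.9x, i.e. the "ten times" above
    print(f"Area shrink:   {area_shrink:.0f}x")     # ~120x more transistors per unit area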
Absolutely, although I do think that Dr Moore has it right about reaching individual atoms being the limit. Anything subatomic and we are looking at a Schrödinger machine.
Man, that would rock!
I clearly remember in my early chip days some of the best minds in semiconductors (IBM, Bell Labs) suggesting that 1u was near the end of the road and that you couldn’t build a DRAM > 64MB, but that was back in the 3u and 64KB days. Those brick walls were easily crushed.
What could stop Moore’s Law from continuing is not the scientific and engineering creativity of process engineers but the sheer capital required to do it, and only Intel is really up to it, up to a point.
will eventually win.
In the past, the various supposed limits that were proposed and then beaten were always about how to assemble the transistors on the chip.
Now they’re about leakage current and heat dissipation.
The smaller the feature, the higher the leakage current, and the greater the waste heat.
This is compounded by the density reducing the surface area available for cooling.
While there’s still some room left for improvement in heat removal, the 2nd law is not your friend.
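To put some toy numbers behind the heat argument, here is a simplified Python sketch (my own illustrative model, not real process data): dynamic power goes roughly as C·V²·f per transistor, and once the supply voltage stops scaling down with the feature size, power per unit area climbs with every shrink, and that is before leakage is even counted.

    # Toy model of why shrinking features stops helping with heat once the
    # supply voltage can no longer be reduced. All values are illustrative.
    def power_density(feature_nm, voltage, freq_ghz):
        """Relative dynamic power per unit area for a toy transistor.

        Assumptions (illustrative only): gate capacitance scales linearly
        with feature size, transistor footprint scales with its square.
        """
        cap_f = feature_nm * 1e-18                        # toy capacitance in farads
        dynamic_power_w = cap_f * voltage**2 * freq_ghz * 1e9
        area_m2 = (feature_nm * 1e-9) ** 2
        return dynamic_power_w / area_m2

    # Classic (Dennard) scaling: voltage shrinks with the feature -> density stays flat.
    print(power_density(90, 1.2, 2.0))   # ~3.2e7 (relative units)
    print(power_density(45, 0.6, 4.0))   # ~3.2e7, same density despite 2x frequency

    # Reality: voltage stuck near ~1 V -> density climbs with each shrink,
    # and leakage (ignored here) only makes it worse.
    print(power_density(90, 1.0, 3.0))
    print(power_density(45, 1.0, 3.0))   # roughly double the 90 nm figure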
That’s true with current chip material, but who’s to say that some clever spark / organisation won’t develop a material that has dramatically better cooling and far less leakage at lower densities?
Regardless of the material, the 2nd Law of Thermodynamics will get you in the end!
Only if we stick to our conventional chip specification.
You’re very much thinking ‘inside the box’.
Who’s to say that in the (distant) future we will even be using electron-based processors? It might sound a bit sci-fi now, but look how fast IT technology has moved over the decades, and how much our current systems would look like science fiction next to systems within recent memory, let alone 30+ years ago, or even before the invention of the transistor.
So personally I think the only upper barriers to Moore’s Law are our imagination and motivation to develop.
Chips based on electrical current and semiconductors may not be the last word in computer technology, but there are some obvious physical limits which you simply cannot cross. What the exact limits are is debatable; in the case of using photons, the speed of light comes to mind, and since anything smaller than the currently stable atomic particles (electron, proton, neutron [*]) is either too unstable to be of use or simply cannot exist free (such as the quarks, uud or udd, that make up protons and neutrons), another limit is the size of the smallest stable particle. You simply cannot make anything smaller than that, at least according to current knowledge.
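On the speed-of-light point, a quick order-of-magnitude sketch (my own numbers) of how far any signal, photon or electron, can possibly travel in one clock cycle:

    # How far a signal can travel in a single clock cycle, even at light speed.
    c = 299_792_458.0                 # metres per second, speed of light in vacuum
    for freq_ghz in (1, 3, 10):
        cycle_s = 1.0 / (freq_ghz * 1e9)
        reach_cm = c * cycle_s * 100
        print(f"{freq_ghz} GHz: ~{reach_cm:.1f} cm per cycle")
    # At 3 GHz that is about 10 cm, already comparable to the size of a motherboard.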
Let us not forget quantum effects, either. Even though quantum computing, at least according to my limited understanding, makes use of said effects, they are also the ultimate limit on building anything that small out of atomic or subatomic particles.
Hypothetically you could probably make a Boolean circuit out of pure energy, but even then quantum effects play a role, because, as the basic tenet of quantum physics says, energy is quantized; light, for instance, comes in packets called photons.
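As a small illustration of that quantization, the energy of a single photon follows E = h·f = h·c/λ (textbook constants, illustrative wavelength):

    # Energy carried by one photon: E = h * c / wavelength.
    h = 6.626e-34                 # Planck constant, J*s
    c = 3.0e8                     # speed of light, m/s
    wavelength_nm = 500           # green visible light
    energy_j = h * c / (wavelength_nm * 1e-9)
    print(f"{energy_j:.2e} J per photon")   # ~4e-19 J, the smallest "packet" at this wavelength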
Even if you could somehow circumvent all these limits, one limit stands: you cannot build a computer that could simulate the entire universe it is in. I remember reading a better argument against this, but simple reductio ad absurdum using common sense works: Suppose you can build a computer that is able to simulate the entire universe. It could simulate every single subatomic particle, every photon, all forces of interaction, everywhere in the universe. However, because it simulates the entire universe, it would contain (a simulation of) itself. The simulation of the computer would also have to contain a simulation of itself, and so on ad infinitum. This is not possible unless the computer is infinite from the start, which man-made objects can never be. (The universe need not be infinite.)
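The infinite regress in that argument can be sketched as a toy program (purely illustrative): a simulator that must include a model of itself never bottoms out.

    # Toy illustration of the self-containment problem: a universe simulator
    # that has to simulate itself as well has no base case to stop at.
    def simulate(universe, depth=0):
        print("  " * depth + "simulating the universe (which contains this simulator)")
        return simulate(universe, depth + 1)   # infinite regress

    # simulate("everything")   # uncomment to watch it recurse until the stack overflows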
When we hit the physical limits at the small end, let us think big. How about a computer the size of a city? An asteroid? A planet? A solar system? Massive parallelism would get a whole new meaning. Of course this has the obvious problems that information from one end of the computer to the other cannot travel faster than light, but those are implementation details.
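Those “implementation details” are substantial, though; a quick sketch (distances are round figures) of the one-way, speed-of-light latency across computers of those sizes:

    # One-way signal latency across very large computers, at light speed.
    c = 299_792_458.0   # m/s
    sizes_m = {
        "city (50 km across)": 50e3,
        "planet (Earth diameter, ~12,742 km)": 12_742e3,
        "solar system scale (1 AU, Earth-Sun distance)": 1.496e11,
    }
    for name, metres in sizes_m.items():
        print(f"{name}: {metres / c:.3g} s one-way")
    # 1 AU works out to roughly 500 s, about 8 minutes each way.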
[*] A free neutron may spontaneously turn into a proton and an electron through beta decay, and a proton may, hypothetically, decay into a neutral pion and a positron, but proton decay is predicted to be extraordinarily rare and has never been experimentally observed. http://en.wikipedia.org/wiki/Proton#Stability
If you want to take the argument to that insane level then one could argue that you’re looking at a processing unit as a strictly 3-dimensional object. The expanse of the universe can be no distance at all if folded through the 4th dimension (just as folding a 2D sheet of paper through the 3rd dimension can make opposite corners of the paper adjacent).
Hell, you could throw science theory completely out of the window and argue that computers of the future will be built using TARDIS technology.
However, all this is an absurd take on my original comment, which was that if one limit is reached then humans will find a way to work around that limit (i.e. invent a device which doesn’t rely on a specific component with said limit – be it silicon, electrons or even the speed of light itself).
Old thread, but for future reference:
http://events.berkeley.edu/index.php/?event_ID=2216&date=2007-05-31…
In my (layman’s) mind the biggest limitation is the lithography technology that is used to fabricate computer chips.
They are essentially using photosensitive coatings and chemical baths to run off copies on a finely detailed copy machine. Multiple layers notwithstanding, this limits us to essentially two-dimensionally constrained chip designs.
I believe that at some point (at which there will be a rather large temporary jump in the cost of the tiniest and most powerful chips) the technology to fabricate microprocessors will need to move to scanning tunneling microscopes, or nano-robots, or something that can actually build at that scale. Then who knows what we may see emerge.
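For context on why optical lithography is seen as the bottleneck, the usual rule of thumb is the Rayleigh criterion, minimum feature ≈ k1·λ/NA. A quick sketch with typical published values (the exact k1 and NA vary by process, so these are illustrative):

    # Rayleigh criterion for optical lithography: the smallest printable
    # feature is roughly k1 * wavelength / numerical_aperture.
    def min_feature_nm(wavelength_nm, numerical_aperture, k1=0.4):
        return k1 * wavelength_nm / numerical_aperture

    # 193 nm ArF light with immersion optics (NA ~1.35):
    print(min_feature_nm(193, 1.35))         # ~57 nm
    print(min_feature_nm(193, 1.35, 0.25))   # ~36 nm with a very aggressive k1
    # Printing features below that needs tricks such as double patterning
    # or a shorter wavelength, which is where the cost jump comes in.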
Most people will tell you that light isn’t affected by gravity or that light always travels at the same speed.
Light IS affected (bent) when going past stars, and light absolutely goes at different speeds. Einstein predicted this and it has been proven. Scientists have actually caused light to move faster and slower. Also, light goes faster as it falls into a black hole.
There is also the fact that we can’t see everything. When people talk about light they are only talking about visible light. Light has many wavelengths, and as far as I know current optical technology only uses visible light. There are faster forms of light than that.
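One well-established part of the “different speeds” point: light does travel more slowly through a material than through vacuum, by the material’s refractive index, v = c/n (typical textbook values below):

    # Speed of light in a medium: v = c / n, where n is the refractive index.
    c = 299_792_458.0   # m/s in vacuum
    media = {"vacuum": 1.0, "water": 1.33, "glass": 1.5, "diamond": 2.42}
    for name, n in media.items():
        print(f"{name}: {c / n / 1e8:.2f} x 10^8 m/s")
    # Diamond slows light to well under half its vacuum speed.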
Twenty years ago scientists would have said a lot of what is happening now is impossible. Obviously they were wrong. People will be amazed at what exists 20 years from now.
Picture 200 years ago. Think of what has been invented since then.