dsldragon2002: What is the technological limit of computing using current technology, and when will it be reached?
Last I read, Blue Gene was the fastest / most powerful supercomputer, but is there a theoretical limit to this line of technology?
Answers and Views:
Answer by imillard35
The upper limit seems to be imposed by the nature of matter itself.
How fast can the electrical signal travel through the circuit paths of the CPU? Even the speed of light gets in the way. Scientists are looking at room-temperature superconductors to help with this problem – but those aren't available yet.
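To put a rough number on the speed-of-light point: at a 3 GHz clock, light itself covers only about 10 cm per cycle, and real on-chip signals are slower still. A minimal back-of-the-envelope sketch in Python (the clock rates are illustrative assumptions, not tied to any particular chip):

```python
# Rough arithmetic: how far can a signal travel in one clock cycle?
# Assumes propagation at the speed of light; real on-chip signals are
# slower, so these distances are optimistic upper bounds.
C = 299_792_458  # speed of light, m/s

for clock_ghz in (1, 3, 10):           # illustrative clock rates
    period_s = 1 / (clock_ghz * 1e9)   # duration of one clock cycle, s
    distance_cm = C * period_s * 100   # distance light covers per cycle, cm
    print(f"{clock_ghz:>2} GHz: light travels ~{distance_cm:.0f} cm per cycle")
```

At 10 GHz the entire round-trip budget is roughly 3 cm, which is why propagation delay becomes a genuine design constraint.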
Another problem is how close the circuit pathways can be to each other. Current technology puts these paths only a fraction of a micron apart. The individual atoms of silicon and metal are too large to allow much greater density of circuitry.
The signals are essentially radio frequency, and there is a problem with leakage (crosstalk) between paths.
It will only be another couple of years before the rate of increase begins to slow, as we approach the boundaries of our present technology.
Answer by James H
Well, with computers, you have to be a little more careful to distinguish between hardware, software, architecture, and theory. Supercomputers are mostly used only by people in the DOE and DOD. And that's mostly because they're the only people who are obsessed with the type of systems that used to be called "Real Time" systems, but are now mostly called "IBM spinoffs".
Answer by Aaron777
Current computing technology relies on hundreds, if not thousands, of different technologies to operate. Many of these technologies may still be able to scale for years or decades, while some may have already "hit the wall", so to speak. So it's difficult to say that in "x" years computers won't get any faster. You may find that the rate of improvement slows, however.
If you want to look at some of the most fundamental technologies today, photolithography would definitely be one of the most important. This is the broad technology used to create silicon chips. Photolithography itself relies on hundreds of different technologies: spin coating, wet etching, dry etching, DRIE, doping, etc.
But one of the most important factors in photolithography is the process size, which refers to the smallest features the process is able to produce. Chip manufacturers have been able to steadily shrink this minimum feature size over the years; this is what has allowed them to pack more and more transistors into a chip, increasing computing power while lowering costs and electricity usage (the sketch below shows the arithmetic). There are limits to this scaling, however, and many believe they could be hit within the next decade or so. After that? It's hard to say. New substrates are being investigated that may allow further scaling, and as I said, there are plenty of other avenues for improving overall computer performance: faster memory, faster storage networks, improved bus speeds, and more efficient software, for example.
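To make the scaling payoff concrete: if transistor density scales with the inverse square of the feature size, halving the feature size roughly quadruples the transistor count in the same area. A rough sketch (the node sizes and starting count below are illustrative assumptions, not actual process data):

```python
# Back-of-the-envelope: transistor density vs. process feature size.
# Assumes density scales as 1 / (feature size)^2, ignoring the many
# real-world layout and yield constraints.
nodes_nm = [90, 65, 45, 32, 22]   # illustrative process nodes, nm
base_count = 100e6                # assumed transistor count at 90 nm

for node in nodes_nm:
    count = base_count * (nodes_nm[0] / node) ** 2
    print(f"{node:>3} nm node: ~{count / 1e6:,.0f} M transistors, same die area")
```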
If you're talking theoretical limits, then the only limit with today's technology is money. If Blue Gene is the fastest supercomputer in the world, then what if I bought two of them and hooked them together? Voila, new fastest supercomputer. Then I could buy 4, or 8. You see where I'm going. Realistically there are limits, but I doubt they will be reached anytime soon. And anyone who does give you a set timeline doesn't know what they're talking about.
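One caveat on the "buy two and hook them together" idea: the aggregate peak performance doubles, but the actual speedup on any given program is capped by whatever fraction of it can't be parallelized (Amdahl's law). A quick sketch, where the 5% serial fraction is an assumed example, not a measurement:

```python
# Amdahl's law: speedup(n) = 1 / (s + (1 - s) / n),
# where s is the serial (non-parallelizable) fraction of the work.
def speedup(serial_fraction: float, n_machines: int) -> float:
    return 1 / (serial_fraction + (1 - serial_fraction) / n_machines)

# Assumed example: a workload that is 5% serial.
for n in (1, 2, 4, 8, 1024):
    print(f"{n:>4} machines: {speedup(0.05, n):6.2f}x speedup")
```

Even with 1024 machines, a 5%-serial workload tops out below a 20x speedup, which is the "realistically there are limits" side of the argument.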
If you're really interested in high-performance computers, you should research quantum computers. They're a fundamentally different type of computer that could potentially perform orders of magnitude faster than even the fastest supercomputers in existence today. Whether or not they will ever come to fruition is still up in the air, but steady progress is being made.
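For one concrete sense of what "fundamentally different" can mean: Grover's quantum search algorithm finds an item in an unsorted collection of N entries with on the order of √N queries, versus roughly N classically. A small comparison (the collection sizes are illustrative):

```python
# Query counts for unstructured search: classical ~N, Grover ~sqrt(N).
import math

for n in (10**6, 10**9, 10**12):   # illustrative search-space sizes
    print(f"N = {n:.0e}: classical ~{n:,} queries, "
          f"Grover ~{math.isqrt(n):,} queries")
```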