In a new paper, the researchers document ample room for improving computational performance through better software, algorithms, and specialized chip architectures.
One opportunity lies in slimming down so-called software bloat to wring the most out of existing chips. Quickly written software often carries inefficient code, and it frequently fails to take full advantage of changes in hardware architecture, such as the multiple cores, or processors, found in today's chips.
Thompson and his colleagues showed that they could get a computationally intensive calculation to run some 47 times faster just by switching from Python, a popular general-purpose programming language, to the more efficient C. Further tailoring the code to take full advantage of a chip with 18 processing cores sped things up even more. That sounds like good news for continuing progress, but Thompson worries it also signals the decline of computers as a general-purpose technology.
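To get a feel for the kind of gap involved, here is a minimal sketch, not the researchers' actual benchmark: it uses a matrix multiplication as a stand-in for a computationally intensive calculation and times a naive interpreted-Python loop against the same product done by compiled code (NumPy's BLAS backend):

```python
import random
import time

import numpy as np

N = 256  # kept small so the pure-Python version finishes in seconds

A = [[random.random() for _ in range(N)] for _ in range(N)]
B = [[random.random() for _ in range(N)] for _ in range(N)]

# Straightforward interpreted Python: a triple loop over rows,
# columns, and the shared dimension.
t0 = time.perf_counter()
C = [[0.0] * N for _ in range(N)]
for i in range(N):
    for k in range(N):
        aik = A[i][k]
        row_b = B[k]
        row_c = C[i]
        for j in range(N):
            row_c[j] += aik * row_b[j]
t_py = time.perf_counter() - t0

# The same product via NumPy, which hands the work to compiled,
# vectorized (and often multi-threaded) BLAS routines.
An, Bn = np.array(A), np.array(B)
t0 = time.perf_counter()
Cn = An @ Bn
t_np = time.perf_counter() - t0

print(f"pure Python: {t_py:.2f} s")
print(f"compiled BLAS via NumPy: {t_np:.4f} s ({t_py / t_np:.0f}x faster)")
```

The exact ratio depends on the machine and the BLAS build, but the ordering is the point: interpreted loops lose by orders of magnitude to compiled code that exploits the chip's cores and vector units.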
Indeed, the move to chips designed for specific applications, particularly in AI, is well under way. Deep learning and other AI applications increasingly rely on graphics processing units (GPUs) adapted from gaming, which can handle parallel operations, while companies like Google, Microsoft, and Baidu are designing AI chips for their own particular needs.
AI, particularly deep learning, has a huge appetite for computer power, and specialized chips can greatly speed up its performance, says Thompson.
But the trade-off is that specialized chips are less versatile than traditional CPUs. Quantum computing, carbon nanotube transistors, even spintronics are enticing possibilities, but none are obvious replacements for the promise that Gordon Moore first saw in a simple integrated circuit. We need the research investments now to find out.
Moore's Law says that the complexity of computer chips ought to double roughly every two years. The last two technology transitions have signalled that our cadence today is closer to two and a half years than two.
Compounded over more than five decades, that means memory chips today store around 2 billion times as much data as they did when Moore first made his observation. Or, in more general terms, computer hardware today is around 2 billion times as powerful for the same cost. Analogies struggle to convey a factor of 2 billion. And it is worth remembering that Moore's Law is not at all a law in the sense of a law of physics, but instead merely an observation.
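As a back-of-the-envelope check on that figure (a sketch assuming simple compounding; the actual cadence shifted over the years): each doubling multiplies capacity by two, so $n$ doublings give a factor of $2^n$, and about 31 doublings account for 2 billion.

```latex
% n doublings multiply capacity by 2^n; roughly 31 doublings
% reproduce the "2 billion" figure:
\[
  2^{31} = 2{,}147{,}483{,}648 \approx 2.1 \times 10^{9}
\]
% At one doubling every two years that is about 62 years of growth;
% the faster early cadence Moore first described shortens the span.
```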
But on its 40th anniversary in 2005, Intel was happy to celebrate Moore's Law, and Moore was pleased to note that it still seemed to be accurate. Even Intel is competing with itself and its industry to create what ultimately may not be possible. In 2012, with its 22-nanometer (nm) processor, Intel was able to boast of having the world's smallest and most advanced transistors in a mass-produced product.
In 2014, Intel launched an even smaller, more powerful 14nm chip; and today, the company is struggling to bring its 10nm chip to market. For perspective, one nanometer is one billionth of a meter, smaller than the wavelength of visible light. The diameter of an atom ranges from about 0.1 to 0.5 nanometers.
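Putting those scales side by side (standard physical figures, not numbers from the article):

```latex
\[
  1\,\mathrm{nm} = 10^{-9}\,\mathrm{m}, \qquad
  \lambda_{\mathrm{visible}} \approx 380\text{--}700\,\mathrm{nm}, \qquad
  d_{\mathrm{atom}} \approx 0.1\text{--}0.5\,\mathrm{nm}
\]
% A 10 nm feature is therefore only on the order of a few dozen
% silicon atoms wide.
```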
The vision of an endlessly empowered and interconnected future brings both challenges and benefits. Shrinking transistors have powered advances in computing for more than half a century, but soon engineers and scientists must find other ways to make computers more capable. Instead of physical processes, applications and software may help improve the speed and efficiency of computers. Cloud computing, wireless communication, the Internet of Things (IoT), and quantum physics all may play a role in the future of computer tech innovation.
Despite the growing concerns around privacy and security, the advantages of ever-smarter computing technology can help keep us healthier, safer, and more productive in the long run. In 1965, Gordon Moore posited that roughly every two years, the number of transistors on microchips would double. What this means, specifically, is that transistors in integrated circuits have become faster.
Transistors are tiny electrical switches, made from semiconducting materials such as silicon, and as they shrink, electricity travels across the circuit faster. The faster the integrated circuit conducts electricity, the faster the computer operates. What this means is that computers are projected to reach their limits, because transistors will eventually be unable to operate within ever-smaller circuits at increasingly higher temperatures.
This is because cooling the transistors would require more energy than the energy passing through the transistors themselves.