The next century of computing

80 brief predictions for the future

In this article, I will give 80 brief predictions about the future of computing and its impact on the world at large. These are largely predictions that you won't find elsewhere, and this is certainly not an exhaustive list of my ideas. However, based on much of my theoretical research and various trends I've seen unfold, these are the areas where I expect things to end up deviating from common expectations.

Many of them take the form of a niche that I see existing now or in the future. How quickly these predictions come true will heavily depend on how quickly people can find these niches and start filling them. In some ways this is a guide for people who may be interested in building a future that I hope you will agree with me is more interesting and inspiring than many common visions of the future.

This is not the place for me to justify these predictions with long explanations, of course. Those explanations may be fleshed out in future articles.

Let's start by getting the obvious out of the way. Moore's Law is coming to an end. It will slow down rather than come to a screeching halt, but Dennard scaling has already broken down, eliminating many of the real benefits of further scaling for chips that aren't almost entirely memory.

The end of Moore's Law will quickly make hardware much weirder. The coming decades will see a Cambrian explosion of bizarre hardware.

Existing architectures will be dropped, and not just x86, ARM, or RISC-V; it will go much further than people think. The basic concept of computing as a machine executing a stream of instructions, shuffling data between processor and memory, will eventually be abandoned in favor of more exotic models. The models we have today will prove to be largely arbitrary, withholding potential efficiencies and theoretical insights: they reflect the naive computing theory of the 1940s and 50s, which has yet to die, far more than any fundamental nature of computation.

Legacy code will still be executable via emulation, but native hardware support for it will be dropped. Modern hardware already expends over 99.9% of its complexity and energy on smoke and mirrors to maintain the illusion of being an incredibly fast PDP-11. Sophisticated features such as out-of-order execution and cache coherency will be removed and replaced with simpler, more efficient, and more scalable hardware.
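To make the emulation idea concrete, here is a minimal sketch of a software fetch-decode-execute loop for a hypothetical legacy instruction set, written in Python. The opcodes, encoding, and register count are invented purely for illustration; a real binary-translation layer would be far more sophisticated.

```python
# Minimal sketch of a software emulator for a hypothetical legacy ISA.
# The opcodes, registers, and encoding are invented for illustration only.

LOAD_IMM, ADD, JUMP_IF_ZERO, HALT = range(4)

def run(program, num_regs=4):
    regs = [0] * num_regs
    pc = 0
    while True:
        op, a, b = program[pc]
        pc += 1
        if op == LOAD_IMM:        # regs[a] = immediate value b
            regs[a] = b
        elif op == ADD:           # regs[a] += regs[b]
            regs[a] += regs[b]
        elif op == JUMP_IF_ZERO:  # if regs[a] == 0, jump to instruction b
            if regs[a] == 0:
                pc = b
        elif op == HALT:
            return regs

# Example: load 2 and 3 into registers 0 and 1, add them, halt.
print(run([(LOAD_IMM, 0, 2), (LOAD_IMM, 1, 3), (ADD, 0, 1), (HALT, 0, 0)]))
# -> [5, 3, 0, 0]
```

The point is only that a straightforward interpreter like this, or a faster binary translator built on the same idea, is enough to keep old software running once the hardware stops pretending.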

General-purpose processors will get several orders of magnitude faster even after Moore's Law ends, but only by exploiting a trade-off between performance and familiarity.

Hardware will begin to conform less to human models and more to physics. Computers are made of real atoms, use real energy, produce real heat, occupy real space, and take real time to send bits between different places in that space. These are things that cannot be abstracted away without incurring significant costs, and with Moore's Law no longer subsidizing efficiency elsewhere for free, real efficiency gains will come from deconstructing these old models and from a deeper understanding of the relationship between computation and fundamental physics.
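As a rough illustration of why the time it takes to move bits across real space cannot be abstracted away, here is a back-of-envelope sketch. The clock rates are only illustrative, and on-chip signals travel well below the vacuum speed of light, so these are optimistic upper bounds.

```python
# Back-of-envelope: how far can a signal travel in one clock cycle?
# Clock rates are illustrative; real on-chip propagation is much slower
# than light in vacuum, so these distances are generous upper bounds.

C = 3.0e8  # speed of light in vacuum, metres per second

for ghz in (1, 3, 5):
    period_s = 1.0 / (ghz * 1e9)
    distance_mm = C * period_s * 1000
    print(f"{ghz} GHz: one cycle = {period_s * 1e12:.0f} ps, "
          f"light travels at most {distance_mm:.0f} mm")
# 1 GHz -> ~300 mm, 3 GHz -> ~100 mm, 5 GHz -> ~60 mm
```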

Much of this forced overhaul of hardware design, with compatibility pushed out to emulation, will fix old design flaws along the way. For example, modern operating systems run to millions of lines of code only because of arbitrary hardware decisions in the 1990s that made it a performance advantage to pull driver code into the operating system.

It will once again become practical for ordinary people to write operating systems from scratch.

Silicon compilers will become commonplace. In response, innovative fabs will expand their wafer-sharing (multi-project wafer) services. It will soon be possible to design a custom processor or ASIC, upload the files to TSMC's website, pay $500, and have a bundle of 10 chips delivered to your door a few months later. Hardware engineering will become almost as mainstream as software engineering.
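To give a flavor of what this style of workflow can already look like, here is a minimal sketch of a toy design written with the Amaranth HDL library for Python (one plausible choice among several). It only shows how close this kind of hardware description is to ordinary software, not a production tape-out flow.

```python
# Minimal sketch of a hardware design in Python using the Amaranth HDL
# library (https://amaranth-lang.org). The design is a toy: a free-running
# counter whose top bit drives an LED. A silicon-compiler flow would take a
# module like this through synthesis, place-and-route, and tape-out.

from amaranth import Elaboratable, Module, Signal

class Blinky(Elaboratable):
    def __init__(self, counter_width=24):
        self.counter = Signal(counter_width)
        self.led = Signal()

    def elaborate(self, platform):
        m = Module()
        # Increment the counter every clock cycle.
        m.d.sync += self.counter.eq(self.counter + 1)
        # Drive the LED from the counter's most significant bit.
        m.d.comb += self.led.eq(self.counter[-1])
        return m
```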

In the short term, the inevitable JavaScript frameworks for chip design will create massive security issues, baked into immutable silicon. Software engineers are not prepared for hardware engineering. The low costs and low volumes of shared-wafer prototyping will alleviate the problems somewhat, but eventually the powerful formal-methods tools that hardware engineers have already been using for decades will be put into the hands of a much wider audience.
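As a small taste of what those formal-methods tools do, here is a sketch using the Z3 SMT solver's Python bindings to prove a bit-level identity of the kind that shows up in datapaths. The specific property and the 8-bit width are chosen purely for illustration.

```python
# Sketch of hardware-style formal verification using the Z3 SMT solver's
# Python bindings (pip install z3-solver). We prove, for all 8-bit values,
# that x & (x - 1) clears exactly the lowest set bit of x when x != 0.
# The property and the width are illustrative, not from any real design.

from z3 import BitVec, Implies, prove

x = BitVec("x", 8)
lowest_set_bit = x & -x                 # isolates the lowest set bit of x
claim = Implies(x != 0, (x & (x - 1)) == x - lowest_set_bit)

prove(claim)   # prints "proved" if the property holds for every 8-bit x
```

Exhaustively testing the same property in simulation would take 256 cases here, but solvers like this scale to properties where exhaustive testing is hopeless.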

The computer industry...
