Moore and more and more

More than 40 years after Gordon Moore's prediction that processing power would double every 18 months, there is no stopping the demand or satisfying the desire for computer memory, says Craig Barrett.

Technology has changed just about every aspect of our lives: the way we communicate; the way we access information; the way we conduct business; and the way we entertain ourselves. It has become so fundamental to those four basic human functions that it is impossible to get away from it.

I would like to suggest that technology will continue to advance, and I cannot do that without talking about Moore's Law. When Intel co-founder Gordon Moore made this forecast in 1965, he did not believe that it would hold for much more than a few years.

However, it has indeed continued for more than 40 years, with memory density and computer power doubling roughly every 18-24 months. When I suggest that this will continue for a further 15 or 20 years, you can imagine the continued capability that we are bringing forth.

Smaller than viruses

The structures we are making today are substantially smaller than viruses. Just think what electronic functionality you could expect in about 2025: 20 years hence, there will have been roughly 10 doublings between now and then, a thousandfold increase. We are talking about devices that you cannot even imagine today, much as you could not have imagined current devices 20 years ago.
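The arithmetic behind that claim can be sketched in a few lines. This is an illustrative calculation under the doubling cadences the lecture quotes (18-24 months); the function names and figures are for illustration only, not Intel data.

```python
# Moore's Law as compound doubling: capability multiplies by 2
# every fixed interval. Cadences of 18-24 months are the
# assumption quoted in the lecture.

def doublings(years: int, months_per_doubling: int) -> int:
    """Whole number of doublings that fit in the given span."""
    return (years * 12) // months_per_doubling

def growth_factor(years: int, months_per_doubling: int) -> int:
    """Overall capability multiplier after those doublings."""
    return 2 ** doublings(years, months_per_doubling)

# Ten doublings over 20 years at a 24-month cadence:
print(doublings(20, 24))      # 10
print(growth_factor(20, 24))  # 1024, i.e. roughly a thousandfold

# The faster 18-month cadence gives 13 doublings, about 8000x.
print(growth_factor(20, 18))
```

The point of the sketch is that the growth is exponential in the number of doublings, so a modest change in cadence (18 versus 24 months) changes the 20-year outcome by nearly an order of magnitude.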

When I tell people this they always ask the same question Moore was asked on the day I joined Intel 32 years ago: 'What will you use all those transistors for? Do we not have enough already?'

When personal computers were introduced, the 8080-based PC contained just a few thousand transistors. Today a run-of-the-mill microprocessor has perhaps a few hundred million transistors, and at the highest end about two billion. People like Lord Browne, group chief executive of BP, want more and more computer power for the petrochemical industry.

If you talk to people doing digital content creation, they want more and more computer capability. If you talk to the financial industry, they want more computer capability. There is no stopping the demand or satisfying the desire for more computing and memory capability.

If you look at some of the interesting aspects of what you can do with these added capabilities, you see that computers can begin to act as more than stand-alone computational elements. By ‘stand-alone computational element’ I mean that you put some data in and the computer does something, and then spits the answer back at you.

What you would really like would be for computers to be self-aware, or user-aware. Our most powerful microprocessor can respond to external stimuli with something like the capability of a cockroach; the human being, by contrast, is a very adaptable computer in terms of external recognition and response. It takes a great deal of computer power to try to emulate that capability.

However, as we gain more and more computer capability, we come closer and closer to that goal in this area of proactive computing. We all know how effective computers are in predetermined scenarios, such as playing chess, where there is only a finite number of moves you can make.

It is pretty clear now that computers can do a very good job against the best human chess players. The challenge is to expand beyond that to totally unforeseen circumstances, which demand an immense amount of computing and recognition capability.

New capabilities

Technology is not slowing down. Moore's Law will continue to apply, and it will do so on an exponential basis, so you can expect everything else to follow on an exponential basis.

There will be new capabilities and new applications of technology around this. The base technology is essentially a tool; we keep applying this tool to different problems, and it keeps producing wonderful results.

The best part is that the tool becomes twice as powerful every two years.

Edited extracts of the Future Technology Horizons lecture given by Intel chairman Dr Craig Barrett at the Royal Academy of Engineering