It's almost 40 years since Intel released its first CPU, the 4004, a ground-breaking processor that crammed all the computing power of ENIAC - the first general purpose computer - into a tiny silicon chip.
Of course, what was impressive in 1971 looks, well, rather less spectacular today. And now the Intel 4004 is most useful as an example of just how far CPU technology has come in the four decades since.
Take the clock rate, for instance: the 4004 ran at just 108kHz. A mere eight years later, the Intel 8088 (the power behind the first IBM PC) would run at 5MHz, something like a 50x speed boost. And a modern 3GHz CPU has a clock rate around 30,000 times faster than the humble 4004.
And the increase in the transistor count, probably a more accurate indicator of CPU power, has been even more spectacular. The Intel 4004 contained a mere 2,300; the 8088 increased this to more than 29,000; and if we jump to a modern high-end Intel Xeon chip, the transistor count is now more than 2 billion: nearly a million times more than that first CPU.
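Those ratios are easy to sanity-check. Here's a quick back-of-the-envelope calculation (a Python sketch using the figures above; the 3GHz clock and 2 billion transistor count are the round numbers quoted, not any specific part):

```python
# Back-of-the-envelope check of the figures quoted above.
clock_4004_hz = 108e3             # Intel 4004: 108kHz
clock_8088_hz = 5e6               # Intel 8088: 5MHz
clock_modern_hz = 3e9             # a typical modern 3GHz CPU

transistors_4004 = 2_300
transistors_8088 = 29_000
transistors_xeon = 2_000_000_000  # "more than 2 billion"

print(f"8088 vs 4004 clock:       {clock_8088_hz / clock_4004_hz:,.0f}x")        # ~46x
print(f"3GHz CPU vs 4004 clock:   {clock_modern_hz / clock_4004_hz:,.0f}x")      # ~27,800x
print(f"Xeon vs 4004 transistors: {transistors_xeon / transistors_4004:,.0f}x")  # ~870,000x
```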
WHERE IT BEGAN: Intel's first CPU, the 4004
The CPU world has generated plenty of other amazing statistics over the past few years, but of course this isn't just about the figures: what's more important are the technologies that they've made available.
Intelligent cameras with face recognition; spreadsheets with analytical powers that even governments couldn't match just a few years ago; cars that are beginning to drive themselves; incredibly cheap and versatile tablets; and smartphones with more power than a 1970s mainframe are all becoming commonplace.
That's all great, but it does leave us with a couple of questions.
Can the CPU industry really maintain this level of performance increase?
And if so, what new technologies might it deliver in the next few years?
Moore's Law
In 1965, Intel co-founder Gordon Moore predicted that the number of transistors on a chip would double roughly every two years, an observation so accurate and important that it became known as "Moore's Law". And it's this rapidly increasing complexity and power that has driven the amazing gains in CPU performance we've seen ever since.
ON THE MONEY: In 1965 Gordon Moore predicted exponential growth in CPU power that would lead to "home computers", "automatic controls for automobiles" and "personal, portable communications equipment"
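To see how quickly that doubling compounds, here's a minimal sketch (assuming a simple two-year doubling period, which is only an approximation of the real industry cadence) that projects the 4004's 2,300 transistors forward - and lands surprisingly close to the real figures quoted earlier:

```python
# Moore's Law as a simple formula: count(t) = count_0 * 2 ** (years / 2)
def projected_transistors(start_count, years, doubling_period=2.0):
    """Project a transistor count forward under a fixed doubling period."""
    return start_count * 2 ** (years / doubling_period)

# Starting from the 4004's 2,300 transistors in 1971:
for year in (1979, 2011):
    count = projected_transistors(2_300, year - 1971)
    print(f"{year}: ~{count:,.0f} transistors")
# 1979: ~36,800      (the real 8088 had around 29,000)
# 2011: ~2.4 billion (a modern high-end Xeon has more than 2 billion)
```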
Can this continue? Antonio Gonzalez, director of Intel Labs Barcelona is confident, at least for the near term:
"The trend in CPU performance has been spectacular in all the key parameters: every year more power, more speed, transistors are smaller, energy consumption is down. There are always new challenges to be faced, but I'm sure the trend will continue. We are in good shape to keep evolving the technology for at least a decade, and perhaps much more."
It isn't going to be easy, however. Transistor sizes are already tiny, as Gonzalez explains: "Currently we're working to a 32nm process [structures that are 32 billionths of a metre in size], and this will be down to 22nm very soon. That's less than the size of the influenza virus, and 10,000 times smaller than the diameter of a human hair."
It's also approaching the size of an atom (around 0.1nm), though, and that's going to present us with a fundamental barrier before too long.
And Nandan Nayampally, director of product marketing at ARM, sees another problem. "Starting from around 40nm, the problem isn't really the transistors, it's the wire delay [the time it takes to send signals around the chip]. We are beginning to hit that limit, and it's a major issue for speed."
New approaches will be required to maintain the growth in CPU performance, then, and fortunately the industry is already working on a number of promising projects.
IBM has been researching 3D chip stacking, for instance, which sees CPU cores no longer placed next to each other, but stacked vertically. The cores can be linked by connections covering their entire surface, greatly improving transfer speeds, and they're more easily cooled so heat is less of an issue.
BUILDING UP: Stacking CPU cores on top of each other is a promising idea that could deliver significant speed increases
And Intel reports that it is investigating "the use of compound semiconductors (rather than silicon)... [which] show promise of greatly improving transistor performance while reducing power" (while many other researchers are looking at possible replacements for silicon).
Moore's law will be running into some rather fundamental roadblocks in the 2020s, then, but there are plenty of new technologies on the horizon, and these may enable it to continue for some time yet.
Multi-core
There's more to processor performance than simply packing transistors onto a die, of course, as John Moore, vice president of design and innovation at UK semiconductor trade association NMI, points out. "There's only so far we can go before heat density begins to cause problems", he says: that is, the more transistors on a chip, the hotter it gets, and that causes all kinds of reliability issues. "The days of the big, powerful single-core beasts are ending: the future is more about distributed processing; multi-core."
Intel's Antonio Gonzalez agrees. "We could use all these extra transistors to create a single big core," he says, "however, placing multiple cores on a single die delivers a much better solution in terms of performance and energy efficiency."
We've seen some of the results of this approach today, where even budget systems may come with quad-core CPUs, essentially delivering four processors for the price of one. But Gonzalez believes this is just the start.
Intel has already produced an 80-core "Teraflops Research Chip", he points out, and even this isn't an exceptional number: "We could easily have CPUs with hundreds of cores. There are still challenges to meet before this will be practical, but I'm optimistic that we are heading in the right direction; this is a very valuable trend."
HARDCORE: An Intel research project has already produced a chip with 80 cores - and we might see CPUs with hundreds more in the future
Not everyone is quite so enthusiastic about such high numbers of cores. Alex Katouzian, VP of product management at Qualcomm CDMA, comments: "Our systems can go up to at least four cores, 2.5GHz, and I'm sure we'll be able to push these up even more, but from a software perspective there's only so much extra performance this will buy. Because while some applications are truly multi-threaded - web browsing, gaming - most remain single-threaded. And so in the mobile space at least, there will quickly come a time when you stop adding more cores, and instead concentrate on power efficiency."
ALL IN ONE: Some CPUs are about pure performance. Qualcomm's Snapdragon aims to be a complete system, including a GPU, 3G/LTE modem, Wi-Fi, GPS, Bluetooth, FM radio and more on one chip
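Katouzian's point about diminishing returns has a well-known formalisation in Amdahl's law: if only part of a program can run in parallel, the serial remainder caps the benefit of adding cores. A minimal sketch in Python (the 30% and 95% figures are purely illustrative):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only a fraction of a
    program's work can be spread across multiple cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A largely single-threaded app barely benefits from extra cores,
# while a well-threaded one keeps scaling for much longer.
for fraction in (0.30, 0.95):
    for cores in (2, 4, 8, 100):
        print(f"{fraction:.0%} parallel, {cores:3d} cores: "
              f"{amdahl_speedup(fraction, cores):5.2f}x speedup")
```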
John Moore also believes the move to multi-threaded applications is key. "The real challenge now isn't the hardware so much as creating the software that can take advantage of it. We have generations of software engineers who have been raised to think sequentially, and breaking a problem down into parallel threads instead - and then understanding how to test and debug that code - is no easy task."
However, Intel recognises these issues, says Gonzalez: "We're investigating new programming paradigms for the future," he points out, "as well as working on new tools that will help to parallelise these apps much more easily than is possible right now."
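To give a flavour of what that shift in thinking looks like in practice - and this is just standard Python, not any of the Intel tools Gonzalez mentions - here's a minimal sketch that takes a sequential loop over a CPU-bound task and spreads the same work across a pool of worker processes:

```python
from concurrent.futures import ProcessPoolExecutor

def heavy_work(n):
    """Stand-in for a CPU-bound task (here, a naive primality test)."""
    return n, all(n % d for d in range(2, int(n ** 0.5) + 1))

if __name__ == "__main__":
    numbers = [1_000_003, 1_000_033, 1_000_037, 1_000_039]

    # Sequential version: one core works through the list, one item at a time.
    sequential = [heavy_work(n) for n in numbers]

    # Parallel version: the same work farmed out across the available cores.
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(heavy_work, numbers))

    assert sequential == parallel  # same results, potentially much sooner
```

The hard part, as John Moore says, isn't the syntax: it's deciding which pieces of a real application can safely run at the same time, and then testing and debugging the result.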
But, crucially, multi-core doesn't just mean adding more regular CPU cores to a die, as ARM's Nayampally points out. "ARM started this multi-processing push back in 2003", he says, "but these extra cores could be DSPs, a GPU, modem, GPS unit, whatever best suits your application."
And this approach, common in the mobile space, is now being taken up by Intel, says Gonzalez: "We're also looking to add more specialised components to the processor, like the GPU and multimedia support in the Sandy Bridge CPUs. This kind of integration cuts latencies, delivers a much better solution in terms of performance and energy efficiency, and delivers these benefits right now - even on single-threaded apps."
Future developments
So what new technologies will these future CPU improvements enable?
For the next few years it looks like business as usual. Transistors will continue to shrink; the number of cores will go up; and our desktops will increase in speed just as they always have.
Applications will become increasingly multi-threaded, and take advantage of other performance-boosting technologies, like GPU acceleration. And a strong focus on energy efficiency means that portable devices should see improvements in both CPU power and battery life.
After that, though, the availability and low price of fast, low-power CPUs will mean that they'll increasingly be used all around us, suggests Intel's Antonio Gonzalez: "In the near future we are going to see a much wider range of computing devices than we have today. There will be more powerful systems in our car, in the house, security systems, even simple robots that may help in domestic activities [that's cleaning the carpet, rather than preparing lunch, probably]. Just a host of devices that in some way analyse the environment around them and provide real world responses."
This isn't science fiction, either: it's happening now. For instance, researchers at the University of Southampton are developing accelerometer chips that monitor how a section of rail track behaves as a train passes over it. The chips can then detect variations in that response which might indicate a loss of structural integrity, and warn engineers before the problem becomes serious enough to cause an accident.
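The Southampton system itself is far more sophisticated, but the underlying idea - compare each new vibration signature against the track section's established baseline, and raise a warning when it drifts too far - can be sketched in a few lines. All the numbers below are purely illustrative:

```python
from statistics import mean, stdev

def flag_anomaly(baseline, reading, threshold=3.0):
    """Flag a reading that strays too many standard deviations
    from the established baseline for this section of track."""
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(reading - mu) > threshold * sigma

# Illustrative peak-vibration readings from previous train passes.
baseline = [1.02, 0.98, 1.05, 1.01, 0.97, 1.03, 0.99, 1.04]

for reading in (1.00, 1.06, 1.45):
    status = "warn engineers" if flag_anomaly(baseline, reading) else "normal"
    print(f"reading {reading:.2f}: {status}")
```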
NEXT-GEN: A new generation of chips is being developed that will sense the world, and respond to it - for example, monitoring rail tracks and alerting engineers to imminent structural failure
The Royal Veterinary College is developing another accelerometer-based chip that, when attached to the leg of a horse, can analyse its gait pattern. Detecting small changes in that pattern could help to quickly identify and rest injuries that might otherwise go unnoticed, at least initially, and the same technique may also be applicable to professional athletes.
Elsewhere, Qualcomm believes that our homes will become filled with new smart devices that will help us out in all sorts of interesting ways. And they won't be boring, static boxes: rather, just like your mobile, they'll have internet connectivity and you'll be able to customise them. "In many instances, consumers will be able to download applications to products such as home energy management systems, security systems and connected home products to add new features and functionalities," writes Qualcomm.
Intelligence everywhere
This extra intelligence will be everywhere, presenting all kinds of new opportunities. We may see alarm clocks with the connectivity and intelligence to detect traffic problems, for instance, and know to wake you a little earlier. NEC is currently testing smart advertising billboards that can identify the age and gender of passers-by and tailor their displays to show something appropriate to that demographic. And many similar ideas will see smart devices everywhere adapt their functions to suit our presence and needs.
Qualcomm's Alex Katouzian suggests that augmented reality technologies will also become more important. "You might see a sign on a bus, advertising a newly released record," he says. "You point your mobile at it, and view a piece of concert footage from that artist, a video, or whatever." The system is smart enough to provide instant context to whatever you're viewing, in many different situations.
LIFE-CHANGING: Future CPUs will power smart devices that will change our lives in a host of interesting ways
And this is just the start. The University of Washington has, since 2009, been producing contact lenses with built-in electronics, including a small heads-up display. These are early days, but link this with a sufficiently powerful CPU and, as Microsoft's Steve Clayton points out, there are all kinds of amazing possibilities:
Heads-up displays of all types could be imagined. You could walk into a room and instantly be reminded of everyone's name, their kids' birthdays and their latest status updates on Facebook or Twitter - all without them seeing what you see.
When navigating a strange city, you could be directed without ever having to look down at a map. As an article on TechNet notes, there are other uses too, such as helping diabetic wearers keep tabs on blood-sugar levels without needing to pierce a finger.
Amazing. But again, this isn't just science fiction - all these ideas are grounded in solid projects that have delivered very real results. There's still a long way to go, but as we've seen, there's also plenty of scope for CPU improvements to make this kind of thing happen.
So while the huge growth in processor performance over the past few decades has undoubtedly delivered a host of interesting new technologies, there are many more just around the corner - some that we can barely imagine.