We’re approaching the limits of computer power – we need new programmers now

Way back in the 1960s, Gordon Moore, the co-founder of Intel, observed that the number of transistors that could be fitted on a silicon chip was doubling every two years. Since the transistor count is related to processing power, that meant that computing power was effectively doubling every two years. Thus was born Moore’s law, which for most people working in the computer industry – or at any rate those younger than 40 – has provided the kind of bedrock certainty that Newton’s laws of motion did for mechanical engineers.
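That schedule compounds quickly: doubling every two years means growth by a factor of 2^(t/2) after t years – roughly a thirty-twofold increase per decade, and about a thousandfold (2^10 = 1,024) every 20 years.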

There is, however, one difference. Moore’s law is just a statement of an empirical correlation observed over a particular period in history and we are reaching the limits of its application. In 2010, Moore himself predicted that the laws of physics would call a halt to the exponential increases. “In terms of size of transistor,” he said, “you can see that we’re approaching the size of atoms, which is a fundamental barrier, but it’ll be two or three generations before we get that far – but that’s as far out as we’ve ever been able to see. We have another 10 to 20 years before we reach a fundamental limit.”

We’ve now reached 2020 and so the certainty that we will always have sufficiently powerful computing hardware for our expanding needs is beginning to look complacent. Since this has been obvious for decades to those in the business, there’s been lots of research into ingenious ways of packing more computing power into machines, for example using multi-core architectures in which a CPU has two or more separate processing units called “cores” – in the hope of postponing the awful day when the silicon chip finally runs out of road. (The new Apple Mac Pro, for example, is powered by a 28-core Intel Xeon processor.) And of course there is also a good deal of frenzied research into quantum computing, which could, in principle, be an epochal development.
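To make the multi-core idea concrete, here is a minimal sketch in C using POSIX threads – everything in it (the thread count, the array size, the summation workload) is an illustrative assumption, not a detail of any chip mentioned above. Each thread sums one slice of an array, so separate cores can work on separate slices at the same time:

```c
/* A minimal sketch of multi-core parallelism with POSIX threads.
 * The thread count, array size and workload are illustrative
 * assumptions only. Each thread sums one slice of the array and
 * the main thread combines the partial results. */
#include <pthread.h>
#include <stdio.h>

#define N_THREADS 4
#define N_ITEMS   (1u << 22)

static double data[N_ITEMS];

typedef struct {
    size_t begin, end;  /* half-open range [begin, end) */
    double partial;     /* this thread's result */
} chunk_t;

static void *sum_chunk(void *arg) {
    chunk_t *c = arg;
    double s = 0.0;
    for (size_t i = c->begin; i < c->end; i++)
        s += data[i];
    c->partial = s;     /* each thread writes only its own slot */
    return NULL;
}

int main(void) {
    for (size_t i = 0; i < N_ITEMS; i++)
        data[i] = 1.0;

    pthread_t tid[N_THREADS];
    chunk_t chunk[N_THREADS];
    size_t step = N_ITEMS / N_THREADS;

    for (int t = 0; t < N_THREADS; t++) {
        chunk[t].begin = t * step;
        chunk[t].end = (t == N_THREADS - 1) ? N_ITEMS : (t + 1) * step;
        pthread_create(&tid[t], NULL, sum_chunk, &chunk[t]);
    }

    double total = 0.0;
    for (int t = 0; t < N_THREADS; t++) {
        pthread_join(tid[t], NULL);  /* wait, then fold in the result */
        total += chunk[t].partial;
    }
    printf("sum = %.0f\n", total);   /* prints: sum = 4194304 */
    return 0;
}
```

Compile with `cc -pthread sum.c`. The catch – and part of why multi-core is a stopgap rather than a substitute for Moore's law – is that software only gets faster if it is written to split its work up this way.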

But computing involves a combination of hardware and software and one of the predictable consequences of Moore’s law is that it made programmers lazier. Writing software is a craft and some people are better at it than others. They write code that is more elegant and, more importantly, leaner, so that it executes faster. In the early days, when the hardware was relatively primitive, craftsmanship really mattered. When Bill Gates was a lad, for example, he wrote a Basic interpreter for one of the earliest microcomputers, the TRS-80. Because the machine had only a tiny read-only memory, Gates had to fit it into just 16 kilobytes. He wrote it in assembly language to increase efficiency and save space; there’s a legend that for years afterwards he could recite the entire program by heart.

There are thousands of stories like this from the early days of computing. But as Moore’s law took hold, the need to write lean, parsimonious code gradually disappeared and incentives changed. Programming became industrialised as “software engineering”. The construction of sprawling software ecosystems such as operating systems and commercial applications required large teams of developers; these then spawned associated bureaucracies of project managers and executives. Large software projects morphed into the kind of death march memorably chronicled in Fred Brooks’s celebrated book, The Mythical Man-Month, which was published in 1975 and has never been out of print, for the very good reason that it’s still relevant. And in the process, software became bloated and often inefficient.

But this didn’t matter because the hardware was always delivering the computing power that concealed the “bloatware” problem. Conscientious programmers were often infuriated by this. “The only consequence of the powerful hardware I see,” wrote one, “is that programmers write more and more bloated software on it. They become lazier, because the hardware is fast they do not try to learn algorithms nor to optimise their code… this is crazy!”
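That complaint is easy to illustrate. Below is a hypothetical C sketch of one task – does a list contain a duplicate? – done two ways: a brute-force version that compares every pair, and a leaner one that sorts first so any duplicates end up adjacent. The data is invented for illustration.

```c
/* Two algorithms for one task: does the array contain a duplicate?
 * dupe_naive compares every pair: O(n^2).
 * dupe_lean sorts, then scans neighbours: O(n log n).
 * The array contents are arbitrary illustrative data. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static int cmp_int(const void *a, const void *b) {
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

/* the "bloatware" habit: brute force, quadratic time */
static int dupe_naive(const int *v, size_t n) {
    for (size_t i = 0; i < n; i++)
        for (size_t j = i + 1; j < n; j++)
            if (v[i] == v[j])
                return 1;
    return 0;
}

/* the leaner version: sort once, duplicates become adjacent */
static int dupe_lean(int *v, size_t n) {
    qsort(v, n, sizeof *v, cmp_int);
    for (size_t i = 1; i < n; i++)
        if (v[i] == v[i - 1])
            return 1;
    return 0;
}

int main(void) {
    int a[] = {9, 4, 7, 1, 4, 8};
    size_t n = sizeof a / sizeof *a;

    int b[sizeof a / sizeof *a];
    memcpy(b, a, sizeof a);   /* dupe_lean reorders its input */

    printf("naive: %d, lean: %d\n", dupe_naive(a, n), dupe_lean(b, n));
    return 0;
}
```

On six numbers the difference is invisible; on a million entries the naive version performs on the order of 5×10^11 comparisons against roughly 2×10^7 operations for the sorted one – a gap it would take about three decades of Moore's-law doublings to paper over.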

It is. In a lecture in 1997, Nathan Myhrvold, who was once Bill Gates’s chief technology officer, set out his Four Laws of Software. 1: software is like a gas – it expands to fill its container. 2: software grows until it is limited by Moore’s law. 3: software growth makes Moore’s law possible – people buy new hardware because the software requires it. And, finally, 4: software is only limited by human ambition and expectation.

As Moore’s law reaches the end of its dominion, Myhrvold’s laws suggest that we basically have only two options. Either we moderate our ambitions or we go back to writing leaner, more efficient code. In other words, back to the future.

What I’m reading
John Naughton’s recommendations

What just happened?
Writer and researcher Dan Wang has a remarkable review of the year in technology on his blog, including an informed, detached perspective on the prospects for Chinese domination of new tech.

Algorithm says no
There’s a provocative essay by Cory Doctorow, on the LA Review of Books blog, about the innate conservatism of machine learning.

Fall of the big beasts
“How to lose a monopoly: Microsoft, IBM and antitrust” is a terrific long-view essay about company survival and change by Benedict Evans on his blog.

This article was written by John Naughton from The Guardian and was legally licensed through the NewsCred publisher network. Please direct all licensing questions to legal@newscred.com.