One good law deserves another, or why fly on one wing? Seriously, I intended to add some information about Moore's Law here because it comes up in conversation often, and did again today, so I took that as a cue to look it up and turn the findings into a blog post. And yes, there's a lot to be found when you go poking around Moore's Law.
Murphy’s law is an adage or epigram that is typically stated as: “Anything that can go wrong will go wrong”. Just so that’s clear.
Moore's Law is not an adage or epigram; it is a scientific observation and projection.
Gordon Moore. Photo by Steve Jurvetson from Menlo Park, USA ("Moore Fish"), CC BY 2.0.
Moore's law is the observation that the number of transistors in a dense integrated circuit doubles about every two years. The observation is named after Gordon Moore, the co-founder of Fairchild Semiconductor and Intel (and Intel's former CEO), whose 1965 paper described a doubling every year in the number of components per integrated circuit and projected that this rate of growth would continue for at least another decade. In 1975, looking forward to the next decade, he revised the forecast to doubling every two years. The period is often quoted as 18 months because of a prediction by Intel executive David House, who combined the effect of more transistors with the transistors being faster.
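To make the difference between those forecast periods concrete, here is a small sketch (my own illustration, not from the post) that projects transistor counts under a constant doubling period. The starting count of one million transistors is a hypothetical example.

```python
# Illustrative sketch: projecting transistor counts under Moore's-style
# doubling. The starting count below is hypothetical.

def projected_transistors(start_count, years, doubling_period_years):
    """Transistor count after `years`, doubling every `doubling_period_years`."""
    return start_count * 2 ** (years / doubling_period_years)

start = 1_000_000  # hypothetical chip with one million transistors today

# Doubling every 2 years (Moore's 1975 revision): 2**5 = 32x after a decade.
print(projected_transistors(start, 10, 2.0))

# Doubling every 18 months (the David House figure): roughly 100x after a decade.
print(projected_transistors(start, 10, 1.5))
```

The gap between 32x and roughly 100x over a single decade shows why the quoted period matters so much for long-range projections.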
Consequences
The primary driving force of economic growth is the growth of productivity, and Moore's law factors into productivity. Moore (1995) expected that "the rate of technological progress is going to be controlled from financial realities." The reverse could and did occur around the late 1990s, however, with economists reporting that "productivity growth is the key economic indicator of innovation" — that is, technological progress drove the economics rather than the other way around.
Technological change is a combination of more technology and of better technology. A 2011 study in the journal Science showed that the rate of change of the world's capacity to compute information peaked in 1998, when the world's technological capacity to compute information on general-purpose computers grew at 88% per year. Since then, technological change has clearly slowed. In recent times, every new year allowed humans to carry out roughly 60% more computation than could possibly have been executed by all existing general-purpose computers in the year before. This is still exponential, but it shows that the rate of technological change varies over time.
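Both of those figures are still exponential growth, just compounding at different speeds. A quick sketch (my own, using the 88% and 60% rates quoted above) shows how much the cumulative multiplier diverges over five years:

```python
# Illustrative sketch: cumulative compute-capacity growth at a constant
# annual rate, using the 88%/year (1998 peak) and ~60%/year (recent)
# figures quoted from the Science study.

def growth_factor(annual_rate, years):
    """Total multiplier after `years` at a constant annual growth rate."""
    return (1 + annual_rate) ** years

print(growth_factor(0.88, 5))  # at the 1998 peak rate of 88%/year, ~23x
print(growth_factor(0.60, 5))  # at the more recent ~60%/year rate, ~10.5x
```

Even the slower 60% rate more than decuples capacity in five years, which is why the text can call the post-1998 trend both "slowed" and "still exponential."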