Saturday, January 26, 2013

Is Moore's Law Slowing Down?


Man vs. Machine

by Rana Foroohar

Remember the booming economy of the 1990s? A big factor in that growth was technology, which fueled productivity gains at a much faster clip than it does now. Moore's law--the observation credited to Intel co-founder Gordon Moore that computer chips double in power roughly every 18 months--appeared to be squarely in effect. From 1995 to 2005, large companies invested heavily in technology that increased efficiency and productivity, eventually creating entirely new areas of business and boosting employment growth. The fact that American companies invested more than, for example, European ones is a key reason many U.S. multinationals increased revenue and market share during that time. So given the rise of social media, big data and other tech trends, can we expect a similar boost to growth sometime in the near future?
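
As a rough, hypothetical sketch (not part of Foroohar's article), the short Python calculation below shows what an 18-month doubling period implies over the 1995-2005 decade she cites; the clean doubling interval is an assumption, and real chip progress is messier.

```python
# Back-of-the-envelope illustration of the 18-month doubling described above.
# The clean doubling period is an assumption; real chip progress is messier.

DOUBLING_PERIOD_MONTHS = 18

def growth_factor(months: float) -> float:
    """How many times more capable a chip is after `months`, under pure doubling."""
    return 2 ** (months / DOUBLING_PERIOD_MONTHS)

# The 1995-2005 decade cited in the article:
decade_months = 10 * 12
print(f"Growth over 10 years: ~{growth_factor(decade_months):.0f}x")  # roughly 100x
```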

No--at least according to "Is I.T. Over?," a new report by JPMorgan Chase's chief U.S. economist, Michael Feroli. Using U.S.-government data, Feroli shows that prices for IT equipment--things like software, computers and networking technology--are declining at the slowest pace in over a generation. That's important, because a slower price decline for technology implies slower gains in the power of technology. As Feroli writes, an average computer may retail for about $1,000, but historically "the power of that computer has increased dramatically" over time. As the power of new devices increases, prices of old ones fall. The fact that they aren't falling so quickly now means that technology isn't increasing at the same pace it once did.
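
To make the price-versus-power logic concrete, here is a minimal sketch with hypothetical numbers (not taken from Feroli's report): if a computer's sticker price stays at $1,000 while its power rises, the quality-adjusted price per unit of computing falls, and slower power gains show up as a slower measured decline.

```python
# Hypothetical numbers, not from Feroli's report: the point is only that a
# constant $1,000 sticker price plus rising power shows up as a falling
# quality-adjusted price, and that slower power gains mean a slower decline.

STICKER_PRICE = 1_000.0

def quality_adjusted_price(relative_power: float) -> float:
    """Price per unit of computing power, relative to the base-year machine."""
    return STICKER_PRICE / relative_power

for power_gain in (2.0, 1.25):  # fast vs. slow improvement over one generation
    decline = 1 - quality_adjusted_price(power_gain) / STICKER_PRICE
    print(f"{power_gain:.2f}x the power -> quality-adjusted price falls {decline:.0%}")
# 2.00x the power -> quality-adjusted price falls 50%
# 1.25x the power -> quality-adjusted price falls 20%
```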

This doesn't mean that Moore's law is dead. Strictly speaking, it refers to the number of transistors that can be squeezed onto a chip. Other factors, like microarchitecture and memory, can constrain computer advances even if the sheer number of circuits continues to increase. The bottom line, though, is that slower tech-price declines and slower gains in computing power suggest that the pace of innovation in the near future is likely to resemble that of the recent past. In other words, it will be sluggish for the next few years.

Indeed, a number of economists, including Northwestern University's Robert Gordon, believe that we are entering an even longer period of slow tech gains and slow growth. Gordon argues that the productivity gains of the decade beginning in 1995 were nothing compared with earlier, arguably more cataclysmic tech shifts like the advent of the combustion engine, electricity and indoor plumbing. "Which changes your life more," he asks, "an iPad or running water?" What's more, even if innovation were to continue into the future at its pre-2005 rate, Gordon says, the U.S. faces new headwinds--including debt levels, an aging population, environmental challenges, inequality and lower levels of education relative to international standards--that will hinder growth more than in the past.

There may be a silver lining to this story. Despite the boost it has given to overall growth, the white-hot pace of tech advancement over the past few decades has also been a key driver of higher unemployment and inequality, as less-educated workers have lost their jobs to machines. Research shows that technology powers job growth only if educational levels keep pace with technological change--a relationship that began to break down in the U.S. in the 1970s. If IT advances are finally slowing, "then workforce skills may be better able to catch up with the level of technology," notes Feroli. In an era when many economists believe inequality is an obstacle to growth, that's a rare bit of good economic news.

2 comments:

Anonymous said...

It seems technological advancement is more about improving what we have already invented, rather than revolutionizing. Things like YouTube and FB are great, but they aren't revolutionary, although they have changed our lives.

It's disappointing because the changes of today seem so small-scale compared with what we actually need. We are still stuck on oil, and green technologies IMO still don't have the answer.

The next big thing is meant to be 3D printing, but sadly this may be more of a threat to our livelihoods and may contribute to unemployment.

Derek Mathias said...

Well, what you're seeing is only what's going on at the surface. All the real work goes on in the background before suddenly being sprung on us. Few people saw Siri or Watson coming, for example. And few in the public eye realize the massive potential of 3D printing. You can't look at what we have NOW and expect that to be it...you have to imagine integrating that capability with other technologies that are currently in development to get an idea of the true potential.

Siri and Watson presage artificial assistants and companions indistinguishable from human minds. 3D printing integrated with nanotechnology presages home nanofactories where we can build virtually ANY material goods.

If you're worried about unemployment, then you're likely in for a big disappointment, since mature nanotechnology should bring about the elimination of almost every job held today! New technologies USUALLY result in at least some unemployment, while opening up opportunities in additional fields. Nanotech promises to be a complete game-changer.
