Wednesday, January 7, 2015

Major Innovations--Even IoT--Take Decades to Produce Clear Productivity Increases

Vodafone notes it has been 30 years since the first mobile call was carried on the Vodafone network, in January 1985. At the time, Vodafone forecast it would sell about a million subscriptions; BT predicted a total market of only about 500,000.

In 1995, after a decade of availability, U.K. mobile adoption had reached seven percent. By 1998 it was 25 percent; by 1999, 46 percent. Just five years later, adoption exceeded 100 percent, with more active subscriptions than people.

We might argue about when mobility became an essential service for consumers or businesses. But we might all agree that point has been reached, and that the bigger question is how much more vital mobility will become, and how far it will displace older modes of communication, computing, shopping, working and learning.

Mobile usage in the U.S. market followed a similar trajectory, with 340,000 subscribers in 1985, growing to 33.8 million by 1995. By 2005, mobile adoption had grown exponentially to about 208 million accounts.
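As a rough sketch of what that growth implies (using only the subscriber figures above, which are approximate, and the standard compound annual growth rate formula), the annualized rates work out like this:

```python
# Rough sketch: annualized growth implied by the U.S. subscriber figures above.
# The subscriber counts are the approximate figures cited in the text.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate: (end / start) ** (1 / years) - 1."""
    return (end / start) ** (1 / years) - 1

subs_1985 = 340_000       # U.S. mobile subscribers, 1985
subs_1995 = 33_800_000    # U.S. mobile subscribers, 1995
subs_2005 = 208_000_000   # U.S. mobile accounts, 2005

print(f"1985-1995: {cagr(subs_1985, subs_1995, 10):.0%} per year")  # ~58% per year
print(f"1995-2005: {cagr(subs_1995, subs_2005, 10):.0%} per year")  # ~20% per year
```

Even in the second decade, roughly 20 percent compound annual growth is what "exponential" adoption looks like in practice.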

Those figures hint at the value consumers and businesses perceive. Growing abandonment of fixed-line voice, plus rising volumes of mobile-initiated Internet sessions, website visits, email and social media use, provide further evidence of mobile’s perceived value.

But even looking only at ubiquity of usage, it took 20 years for mobility to become something “everybody” uses. It took 40 years for electrification to change productivity in measurable ways.

Keep that in mind when thinking about the “Internet of Things.” Although U.S. businesses and organizations made huge investments in information technology in the 1980s, many would argue the benefits did not appear until the late 1990s.

Most of us instinctively believe that applying more computing and communications necessarily improves productivity, even when we cannot actually measure the gains.

But investments do not always translate immediately into measurable productivity gains. This “productivity paradox” was apparent for much of the 1980s and 1990s, when one would have struggled to find clear evidence of productivity gains from a rather massive investment in information technology.

Some would say the uncertainty covers a wider span of time, dating back to the 1970s and including even the “Internet” years from 2000 to the present.

Computing power in the U.S. economy increased by more than two orders of magnitude between 1970 and 1990, for example, yet productivity, especially in the service sector, stagnated.

And though it seems counter-intuitive, some argue the Internet has not clearly affected economy-wide productivity.

Whether that is simply because we cannot yet measure the changes is part of the debate. To be sure, it is hard to assign a value to activities that have no incremental cost, such as listening to a streamed song instead of buying a compact disc. And many of the potential productivity gains we might be seeing are of that sort.

The other issue is that revenue is decreasing in many industries, even as most users and buyers would say value is much higher.

A productivity gain, by definition, means getting more output per unit of input.
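As a minimal sketch of that definition (the numbers here are hypothetical, chosen only to show the arithmetic), a gain appears whenever the output-to-input ratio rises:

```python
# Minimal sketch: productivity as the ratio of output to input.
# All figures below are hypothetical, for illustration only.

def productivity(output: float, inputs: float) -> float:
    return output / inputs

before = productivity(output=100, inputs=50)  # 2.0 units of output per unit of input
after = productivity(output=120, inputs=48)   # 2.5 units of output per unit of input

gain = after / before - 1
print(f"Productivity gain: {gain:.0%}")  # 25%
```

The point of the streaming-music example above is that when the “output” has no price, this ratio becomes hard to compute at all.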

In other words, it is “how” technology is used that counts, not the amount of raw computing power or connectivity. And there is good reason to believe new technology does not reshape productivity until whole processes change. Automating typing is helpful, for example, but the biggest gains arguably come from changing the entire content production ecosystem.
