The genie's out of the bottle: why the future is determined by technology laws

Dr Angus Hervey

21st August 2014
 

If you've been rummaging around the geekier corners of the internet for any amount of time you'll no doubt have heard of something called Moore's Law. It owes its name to a guy called Gordon Moore who, back in 1965, published a famous graph showing that the number of 'components per integrated function' on a silicon chip (a measure of computing power) seemed to be doubling roughly every year and a half. In practice, this means that computing power doubles once every 18 months. For many people Moore's Law is the backbone of any discussion of technological progress. Although it was originally based on only five data points, it turned out to be an astonishingly accurate prediction, having now held true for almost 50 years. Today it has settled into an almost iron rule of innovation, and as the global economy moves into an era dominated by things that get done on computers, its implications are becoming ever more profound.

The reason people get excited about this stuff is that it means technological progress is exponential in nature. In 1969 the United States put two men on the moon. That mission required more than 3,500 IBM employees and the most sophisticated computer programs ever written. Today, the HTC Nexus One smartphone holds more computing power than all of that technology combined. They're literally using these phones to launch satellites. And as we are fond of repeating at Future Crunch, our brains just aren't wired to think exponentially. We look at our laptop and think that in ten years' time it's going to be ten times better. But it won't be – it's going to double its computing power roughly seven times in that period, so it's actually going to be about 128 times better. That means that in 2024 not only is your device way more powerful, it's also cheaper, uses less energy and probably won't be a laptop anymore.
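That arithmetic is easy to check yourself. Here's a minimal Python sketch, assuming an idealised clean 18-month doubling period (real hardware is messier):

```python
# How much better does a device get over ten years if its
# capability doubles every 18 months?
def improvement(years, months_per_doubling=18):
    """Capability multiplier after `years`, rounded to whole doublings."""
    doublings = round(years * 12 / months_per_doubling)
    return 2 ** doublings

print(improvement(10))  # 7 doublings -> 128x better, not 10x
```

The non-intuitive part is exactly the gap between the linear guess (10x) and the exponential answer (128x).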

Image courtesy Trickvilla

Ray Kurzweil (arguably the world's best known futurist) views Moore's Law as part of a longer and larger process of technological progress, stretching back a long way through time. Kurzweil claims that before the integrated circuit even existed, four previous technologies – electromechanical, relay, vacuum tube and transistor – had all improved along the very same trajectory. He formulated this as the Law of Accelerating Returns, and his belief is that it will continue beyond integrated circuits into new technologies that will lead to something called the technological singularity. If you're keen to go down the rabbit hole on this you won't find any shortage of material out there. Just be prepared to kiss goodbye to many hours of your life staring at the internet. It's also worth pointing out the common (but mistaken) belief that Moore's Law makes predictions regarding all forms of technology, when it was originally intended to apply only to semiconductor circuits. A lot of people now use Moore's Law in place of some of the broader ideas put forth by Kurzweil. This is a mistake – Moore's Law is not a scientific imperative, but rather a very accurate predictive model. Moore himself says that his predictions as applied to integrated circuits will no longer hold after about 2020, when integrated circuit geometry will be about one atom thick. That's the point at which Kurzweil says other technologies, such as biochips and nanotechnology, will come to the forefront to move digital progress inexorably forward.

That's not what this blog post is about though. We're more interested in future crunch – the kinds of things you start to see happen when multiple technology laws begin to take hold at the same time. I use this term loosely, since most of these 'laws' are actually predictions or observations of patterns. And what's interesting is just how many of them are out there. In fact, it turns out that Moore's Law is just the tip of the iceberg. Koomey's Law, for example, states that the energy needed for a given amount of computation halves every year and a half. This less well known trend in electrical efficiency has been remarkably stable since the 1950s – long before the microprocessor was invented. Moreover, it's actually faster than Moore's Law, since the number of computations you can perform per unit of energy has been doubling approximately every 1.57 years (as opposed to the number of components on a silicon chip, which has been tracking closer to 2 years).
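To see what that difference in doubling period adds up to, here's a quick sketch comparing the two trends over a decade, using the 1.57-year and 2-year figures quoted above:

```python
# Fold-increase over a decade under Koomey's Law (computations per joule,
# doubling every ~1.57 years) versus Moore's Law (components per chip,
# here taken as doubling every 2 years).
def fold_increase(years, doubling_period_years):
    return 2 ** (years / doubling_period_years)

koomey = fold_increase(10, 1.57)  # ~83x more computations per joule
moore = fold_increase(10, 2.0)    # 32x more components per chip
```

A half-year difference in doubling period more than doubles the gain after ten years – another reminder of how exponentials compound.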

Image courtesy Lorenzo Todi

Kryder's Law is the storage equivalent of Moore's; it states that our ability to cram bits onto shrinking hard drives is also doubling roughly once every 18 months. In 1980, Seagate introduced the world's first 5.25-inch hard disk drive (remember floppy disks?), which could store up to 5 MB of data at a price tag of US$1,500. Today, 34 years later, you can buy a 6000 GB drive from the same company for $600. That represents more than a million-fold increase in capacity, combined with a seven-fold decrease in price (accounting for inflation). Not even Moore's silicon chips can boast that kind of progress.
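Those Seagate figures make for a nice bit of straight arithmetic (prices and capacities are the ones quoted above; units are decimal MB/GB):

```python
# Seagate 1980: 5 MB for $1,500. Seagate 2014: 6000 GB for $600.
capacity_1980_mb = 5
capacity_2014_mb = 6000 * 1000          # 6000 GB expressed in MB
price_1980, price_2014 = 1500, 600      # US$, nominal

capacity_gain = capacity_2014_mb / capacity_1980_mb  # 1,200,000-fold
cost_per_mb_1980 = price_1980 / capacity_1980_mb     # $300 per MB
cost_per_mb_2014 = price_2014 / capacity_2014_mb     # $0.0001 per MB
```

In nominal cost per megabyte, that's a three-million-fold drop before you even adjust for inflation.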

In the field of biotechnology, advances are also outpacing Moore's. In 1990 the US government set out to complete one of the most ambitious scientific projects ever undertaken – to map the human genome. They committed more than $3.5 billion, and gave themselves 14 years to complete it. Seven years in, they had only completed 1% and had burned through more than half their funding. The government and sponsors started panicking. Yet by 2003, the Human Genome Project was finished ahead of schedule and $500 million under budget. This was made possible by exponential improvements in genome sequencing technology; past a certain point they even started outpacing Moore's Law. This kind of progress is astonishing when you think about it. It cost $3 billion and took 13 years to sequence the first human genome. Currently, it takes less than a day, and by the end of this year it's predicted to cost less than $1,000.
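Just how much faster than Moore's Law is that? A rough sketch, using only the dollar figures and dates quoted above (so a crude average over a very uneven decade):

```python
import math

# Sequencing cost: ~$3 billion for the first genome (finished 2003)
# down to a predicted ~$1,000 by end of 2014.
halvings = math.log2(3_000_000_000 / 1_000)          # ~21.5 halvings
months_per_halving = (2014 - 2003) * 12 / halvings   # ~6 months per halving
```

A cost halving roughly every six months, against Moore's 18 – about three times faster.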

Image courtesy IEEE

Communications technologies are also progressing exponentially. For example, if you look at the number of possible simultaneous "conversations" (voice or data) that can theoretically be conducted over a given area in all of the useful radio spectrum, it turns out these have doubled every 30 months for the past 104 years. This observation was made by a guy named Marty Cooper, probably the most influential man nobody has ever heard of. He's the father of the mobile phone; the modern-day equivalent of Alexander Graham Bell. While working for Motorola in the 1970s he looked at the cellular technology used in car phones and decided that it ought to be small enough to be portable. Not only did he conceive of the mobile phone (citing Star Trek as his inspiration), he subsequently led the team that developed it and brought it to market in 1983. He's also the first person in history to make a handheld cellular phone call in public.

Image courtesy Daily Beast

'Cooper's Law' is even more remarkable than Moore's Law, since it's held true since the first ever radio transmission by Marconi in 1895. Radio technology a century ago meant that only about 50 separate conversations could be accommodated on the surface of the earth; the effectiveness of personal communications has improved more than a trillion-fold since then. And unlike Moore's Law, there's no hard physical limit, because there's no limitation on the re-use of radio spectrum – the same frequencies can be re-used across ever-smaller cells, so capacity can keep expanding indefinitely.
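The trillion-fold claim checks out against the 30-month doubling figure, as a couple of lines of arithmetic show:

```python
# Cooper's Law: wireless capacity doubles every 30 months, sustained
# over the 104 years since Marconi's first transmission.
doublings = 104 * 12 / 30   # ~41.6 doublings
gain = 2 ** doublings       # ~3.3 trillion-fold improvement
```

Forty-odd doublings is all it takes to turn 50 simultaneous conversations into trillions.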

Running side by side with this theoretically unlimited increase in wireless capacity is an exponential increase in the amount of data we can transmit through optical fiber cables. Butters' Law says that the amount of data coming out of an optical fiber doubles every nine months, meaning that the cost of transmitting a bit over an optical network halves during the same period. Unfortunately, that rate of progress doesn't filter down to us as consumers – Nielsen's Law states that the bandwidth available to average home users only doubles once every 21 months. Still, it's an exponential function, and it's the reason why telecoms companies have been able to make so much money while still bringing down the cost of data traffic. Anyone remember dial-up modems? Imagine trying to stream today's HD videos on something that still sounds like this.
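The gap between those two doubling periods is where the telecoms margin lives. Over the same ten-year stretch:

```python
# Butters' Law (fiber capacity, doubling every 9 months) versus
# Nielsen's Law (home bandwidth, doubling every 21 months).
def fold_increase(years, months_per_doubling):
    return 2 ** (years * 12 / months_per_doubling)

backbone = fold_increase(10, 9)   # roughly 10,000-fold
home = fold_increase(10, 21)      # roughly 50-fold
```

The backbone gets around two hundred times cheaper per bit relative to what the home user sees – plenty of room to cut prices and still grow profits.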

So what happens when you get increased connectivity through improved data transmission and falling costs? Well, you get larger networks; and according to something known as Reed's Law, the utility of large networks (particularly social networks) increases exponentially with the number of participants. The precise size of that increase is a topic of debate – Metcalfe's Law, for example, states that the value of a telecommunications network is proportional to the square of the number of connected users. In other words, if your network is 10 people, then its value is 100. But if that network doubles to 20 people, its value goes up four times, to 400.
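The two laws are easy to put side by side. A sketch, with the proportionality constants set to 1 purely for illustration:

```python
# Metcalfe's Law: value grows with the square of the user count.
def metcalfe(n):
    return n ** 2

# Reed's Law: value grows with the number of possible sub-groups, ~2^n.
def reed(n):
    return 2 ** n

ratio = metcalfe(20) / metcalfe(10)  # doubling the users quadruples the value
```

Under Reed's formulation the same doubling from 10 to 20 users multiplies the value over a thousand-fold, which is why the two laws give such different answers about what networks are worth.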

This idea was formulated in the 1980s and was based on connected devices such as computers, fax machines and telephones. Today though, critics point out that this is too simplistic, since it measures only the potential number of contacts (the technological side), whereas the social utility of a network like the modern-day internet depends on the actual number of nodes in contact (the social side). Since the same user can be a member of multiple social networks, it's not clear what the total value will end up being. Research around Dunbar's Number also suggests that there's a limit to the number of connections our brains can manage – most people don't want or need networks larger than 150 or 200 other people. Very large networks pose a further problem, since size introduces friction and complicates connectivity, discovery, identity management and trust, making the effect smaller.

Image courtesy TrialPay

However, I'd argue that the exact proportion doesn't matter. All that matters is that the network effect compounds: a doubling of a network's size more than doubles its value. And when we take this into consideration along with Cooper's, Butters' and Nielsen's Laws, we start seeing exponential increases in how easily and cheaply people can connect and use such technologies. For example, there are now more mobile phones used by Africans than Europeans or North Americans, despite those two continents having more than 20 times Africa's per capita income.

This means that the so-called digital divide may disappear far more quickly than most people realise; it's why organisations like Quartz get so excited about the mobile web and their Next Billion events. And it's why tech giants like Google are pushing hard on things like Project Loon – the sooner they can get everyone connected, the more value they derive from the exponential increase in the size of the global internet. Even more interestingly, it may be that it's the tail wagging the dog – Silicon Valley investor Steve Jurvetson thinks that the shape of these laws only makes sense once you substitute 'ideas' for participants. In other words, technology is driving its own progress by steadily expanding its own capacity to bring ideas together. The implication is that the genie's already out of the bottle; short of arresting half the planet's people, we couldn't stop the march of increased connectivity even if we wanted to.

What about other forms of technology that don't involve computer chips and phones? Well, if you've been following renewable energy trends for any time you'll be familiar with Swanson's Law. This states that the cost of the photovoltaic cells needed to generate solar power falls by 20% with each doubling of global manufacturing capacity. It's represented below, in a now famous graph showing the fall of solar costs over the last forty years. Today the panels used to make solar power plants cost US$0.74 per watt of capacity – a 60% decline since 2008. Power station construction costs add about $4 to that, but these, too, are falling as builders work out how to do the job better. And running a solar power station is cheap because the fuel is free. Obama's former Energy Secretary Steven Chu says that solar becomes price-competitive with fossil fuels at a cost of around $0.50 per watt. Swanson's Law means we're pretty much already up and over the tipping point when it comes to deciding whether to build new power stations using renewable technology rather than fossil fuels.
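Swanson's Law is a classic learning curve, and you can sketch it in a few lines – starting from the $0.74/W panel cost quoted above and asking how many capacity doublings it takes to reach Chu's $0.50/W threshold:

```python
import math

# Swanson's Law: module cost falls 20% with each doubling of
# cumulative global manufacturing capacity.
def cost_after_doublings(start_cost, n, learning_rate=0.20):
    return start_cost * (1 - learning_rate) ** n

# Doublings needed to get from $0.74/W down to $0.50/W:
doublings_needed = math.log(0.50 / 0.74) / math.log(0.80)  # ~1.8 doublings
```

Less than two doublings of global capacity away – which is what "up and over the tipping point" means in practice.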

Image courtesy The Economist

Then there's Haitz's Law, which states that every decade the amount of light generated per LED increases by a factor of 20, while the cost per lumen (the unit of useful light emitted) falls by a factor of 10 for a given wavelength (colour) of light. In other words, we are seeing exponential improvements in LED technology, which means it's becoming the dominant way we produce light. At home we get brighter, more efficient lighting at lower cost, while commercially it means that LED lighting can now be used for more specialised applications, such as in large stadiums and amphitheatres. The result is lower electricity consumption, a reduction in overall carbon emissions, and less of the toxins, such as mercury, found in older lighting technologies. It's no surprise that these kinds of technologies are improving so rapidly – they're all built on semiconductors, the same foundation found in computers and communications networks.
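Those per-decade factors are easier to feel as annual rates. Converting them is one line of arithmetic each:

```python
# Haitz's Law per-decade factors as compound annual rates:
# light output per LED up 20x per decade, cost per lumen down 10x per decade.
annual_light_gain = 20 ** (1 / 10)    # ~1.35x brighter each year
annual_cost_factor = 10 ** (-1 / 10)  # cost falls to ~0.79x each year
```

Roughly 35% more light and a 21% price cut per lumen, every single year, compounding for decades.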

So what does it all mean? Well, one of the difficulties in making predictions about the future is that it's almost impossible to know what people will actually do with new technologies. It's easy to predict a fall in the cost of computing power; it's far more difficult to predict that this will lead to things like the sharing economy or the rise of mobile personal technologies. That said, there are a couple of things it's possible to see on the horizon as a result of these converging laws. However, you'll have to wait until the next blog post to hear what those are (I know, I know. But one thing at a time, right?). For now, perhaps it's time to start keeping an eye out for other interesting technology laws. For better or for worse, science and technology have unleashed patterns and forces over which we have no control. Far better to understand and talk about them, in order to harness their power, than to stand by and let them wash over us, like islands in the stream.