
Heavy traffic on the information superhighway

– Jeff Smith is Senior Director of Infrastructure Services at Global Crossing EMEA. The opinions expressed are his own. –

For many years now, number crunchers have obsessed over the growth of data, marvelling at the way the computer age has generated enormous amounts of content, while IT types have speculated about how disks, tapes and other storage devices would need to evolve to accommodate it. Now, however, the problem has spread and the new fear is greater: could the digitisation of the world’s information lead to a catastrophic communications breakdown?

Consider this head-spinning set of numbers. According to EMC, the data created in 2010 amounted to 1.2 zettabytes, the equivalent of 75 billion 16GB iPads, or enough to fill Wembley Stadium 41 times. And in the age of the Internet, a lot of that data doesn’t just reside on physical media but is repeatedly shunted around the globe. On mobile networks alone, 8,000 petabytes will be sent in 2011, according to a May 2011 report by ABI Research, and that figure is set to grow by about 50 per cent annually for the next five years. Overall, IP traffic will grow to 767 exabytes in 2014, according to Cisco. For scale, a petabyte is over one million gigabytes and an exabyte is 1,000 petabytes.
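For readers who like to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python. It is my own illustration rather than anything from the reports cited, and it uses decimal prefixes (binary units would give slightly larger figures); the 50 per cent growth rate and five-year horizon are the ones quoted above.

```python
# Decimal unit prefixes: 1 ZB = 10^21 bytes, 1 EB = 10^18, 1 PB = 10^15, 1 GB = 10^9.
ZB, EB, PB, GB = 10**21, 10**18, 10**15, 10**9

# 1.2 zettabytes expressed as 16GB iPads.
data_2010_bytes = 1.2 * ZB
ipad_bytes = 16 * GB
print(f"16GB iPads needed: {data_2010_bytes / ipad_bytes:,.0f}")  # about 75 billion

# The unit relationships quoted in the article (exactly a million in decimal prefixes).
print(f"1 PB = {PB // GB:,} GB, 1 EB = {EB // PB:,} PB")

# Mobile traffic: 8,000 PB in 2011, growing roughly 50 per cent a year for five years.
mobile_pb = 8_000
for year in range(2011, 2017):
    print(year, f"{mobile_pb:,.0f} PB")
    mobile_pb *= 1.5
```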

This data growth is part of a broader picture of non-stop innovation that characterises the technology sector. The situation is exacerbated by the ease with which files can be exchanged using social networks, email, instant messaging and other systems.

However, there’s no need for panic and despair, and it’s worth remembering that concerns over the Internet’s ability to withstand wave after wave of demand are nothing new. The Lawrence Berkeley National Laboratory newsletter reported that in October 1986 the net suffered a “congestion collapse” and “slowed to the pace of the telegraph”, with emails taking a day to deliver. This was despite the fact that at the time it hosted only about 10,000 users sending data at up to 56 kilobits per second.

Even legends of the industry are not immune to spurious predictions of an apocalyptic meltdown. One of the founders of computer networking, Bob Metcalfe, once predicted in a magazine column that the Internet would “go spectacularly supernova and in 1996 catastrophically collapse”. Metcalfe had the self-deprecating grace to later put that column through a blender and consume it in front of an audience. The scares have continued, with the Internet Innovation Alliance, a think tank, predicting Internet brownouts by 2012.

Certainly it’s true that additional network capacity is consumed almost as soon as it becomes available, but there are many causes for optimism. Content delivery networks have proven an effective way to organise the Internet’s traffic, storing data close to where it is needed so that it does not have to take the long way around. ‘Fatter pipes’ (faster networks capable of carrying more data) and sub-sea links have given us all more breathing space, while compression techniques crunch data more effectively than ever and private networks offer back roads that take the weight off the main information thoroughfares.
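To illustrate the caching idea behind content delivery networks, the sketch below models a hypothetical edge node that serves popular objects locally and only goes back to a distant origin on a miss. The class, object names and capacity are invented for the example and do not describe any real CDN software; the point is simply that repeat requests never take the long haul.

```python
from collections import OrderedDict

class EdgeCache:
    """Toy model of a CDN edge node: keep recently requested objects nearby
    so repeat requests avoid the round trip to the origin server."""

    def __init__(self, capacity=2):
        self.capacity = capacity
        self.store = OrderedDict()  # object name -> content, in least-recently-used order

    def fetch(self, name, origin):
        if name in self.store:
            self.store.move_to_end(name)      # refresh its position in the LRU order
            return self.store[name], "edge hit"
        content = origin[name]                # slow path: fetch from the distant origin
        self.store[name] = content
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)    # evict the least recently used object
        return content, "origin fetch"

# Hypothetical origin holding a few pieces of content.
origin = {"video.mp4": b"...", "page.html": b"...", "logo.png": b"..."}
cache = EdgeCache(capacity=2)
for request in ["video.mp4", "video.mp4", "page.html", "logo.png", "video.mp4"]:
    _, where = cache.fetch(request, origin)
    print(request, "->", where)
```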

Of course, to paraphrase Thomas Jefferson, the price of Internet freedom is eternal vigilance. We need to pay attention to the number and location of peering points so that political actions, acts of God or other extraordinary conditions do not cause a broader slowdown. Satellite networks will need to be bolstered, and the eventual deployment of the Galileo satellite navigation system will reduce our dependence on the US for GPS. And of course it would be nice if some programmers weren’t so sloppy, because their bloated code creates unnecessary problems for all of us.

