There is an article in The Economist with three technology predictions for 2008. Normally they’re pretty good on technology, and the predictions seem sound enough, but the article contains a couple of bloopers.
1. Surfing will slow
Peering into Tech.view’s crystal ball, the one thing we can predict with at least some certainty is that 2008 will be the year we stop taking access to the internet for granted. The internet is not about to grind to a halt, but as more and more users clamber aboard to download music, video clips and games while communicating incessantly by e-mail, chat and instant messaging, the information superhighway sometimes crawls with bumper-to-bumper traffic.
The biggest road-hog remains spam (unsolicited e-mail), which by most estimates accounts for some 90% of all e-mail traffic. Phone companies and other large ISPs (internet service providers) have tolerated it for years because it would cost too much to fix. Besides, eliminating spam would only benefit their customers, not themselves.
How so? Because the big fat pipes used by ISPs operate symmetrically, with equal bandwidth for upstream and downstream traffic. But end-users have traditionally downloaded megabytes of information from the web, while uploading only kilobytes of keystrokes and mouse clicks. So, when spammers dump billions of pieces of e-mail onto the internet, it travels over the phone companies’ relatively empty upstream segments.
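A toy calculation makes the asymmetry concrete. All the figures below are hypothetical, chosen only to illustrate why a symmetric pipe leaves upstream capacity sitting idle for spammers to use:

```python
# Illustrative sketch with made-up numbers, not measurements from the article.
# A symmetric backbone link has equal capacity in each direction, but consumer
# traffic has historically been lopsided, leaving upstream headroom that bulk
# e-mail can quietly occupy.

LINK_CAPACITY_MBPS = 1000  # same in each direction on a symmetric pipe

downstream_demand_mbps = 800  # web pages and downloads flowing to users
upstream_demand_mbps = 50     # keystrokes, clicks and requests from users
spam_burst_mbps = 400         # bulk e-mail injected on the upstream side

def utilisation(demand_mbps, capacity_mbps=LINK_CAPACITY_MBPS):
    """Fraction of the link's one-way capacity in use."""
    return demand_mbps / capacity_mbps

print(f"downstream:           {utilisation(downstream_demand_mbps):.0%}")
print(f"upstream before spam: {utilisation(upstream_demand_mbps):.0%}")
print(f"upstream with spam:   {utilisation(upstream_demand_mbps + spam_burst_mbps):.0%}")
```

Even with a large spam burst, the upstream side stays far below the downstream side, which is why carriers could shrug the problem off for so long.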
That can’t last. For a start, millions of gadgets are joining the human hordes. Any gizmo worth its silicon these days has its own internet connection—so it can update itself automatically, communicate autonomously with other digital species, and anticipate its user’s every whim.
Soon, portable media-players, personal navigators, digital cameras, DVD players, flat-panel TV sets, and even mobile phones won’t be able to function properly without access to the internet. Expect even digital picture frames to have a WiFi connection so they can grab the latest photos from Flickr.
Meanwhile, users are changing the way they use the internet: they are now uploading, as well as downloading, gigabytes galore—thanks to the popularity of social networks like Facebook, YouTube and MySpace.
Hailed by the industry as the wave of the future, “user-generated content” is proving to be a tsunami of unprecedented proportions. Everyone, it seems, is suddenly a budding Martin Scorsese, bent on sharing his or her home-made videos with fellow YouTubers.
Once, the biggest files being shared via Napster and other P2P (peer-to-peer) networks were MP3 music tracks occupying a few modest megabytes. Today, music videos and TV episodes of hundreds of megabytes are being swapped over the internet by BitTorrent, Gnutella and other file-sharing networks.
And it’s all two-way traffic. The whole point of P2P is that everyone who is downloading is simultaneously uploading to others.
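The arithmetic behind that claim is easy to sketch. The swarm size and file size below are hypothetical; the point is that every byte one peer downloads must have been uploaded by another, so aggregate traffic flows equally in both directions:

```python
# Illustrative sketch (hypothetical numbers): in a P2P swarm, every byte a
# peer downloads was uploaded by some other peer, so aggregate upstream
# traffic necessarily matches aggregate downstream traffic.

FILE_SIZE_MB = 700  # e.g. one standard-definition TV episode

def swarm_traffic(num_peers, file_size_mb=FILE_SIZE_MB):
    """Total megabytes moved in each direction once every peer has the file."""
    downloaded = num_peers * file_size_mb
    uploaded = downloaded  # every downloaded byte came from another peer
    return downloaded, uploaded

down, up = swarm_traffic(num_peers=1000)
print(f"downloaded: {down:,} MB, uploaded: {up:,} MB")
```

That symmetry is exactly what breaks the old assumption, described above, that upstream pipes stay empty.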
That’s just the beginning. Legal or otherwise, swapping multi-gigabyte high-definition video and movie files is becoming increasingly common.
In fact, it will soon be the norm. Television networks have found they can make more money from advertising while giving their shows away free over the internet than they can from broadcasting them. Now the movie studios are learning to do much the same.
The result is gridlock. That the telephone companies are running out of bandwidth can be seen from their equipment orders.
Cisco, the leading supplier of the core routers used to direct traffic over the internet’s backbone, has just had another bumper quarter, with net income up 37% over the same period a year ago. Juniper Networks, another maker of routers, did even better. Both companies credit the proliferation of social networks, the craze for internet searching, multimedia downloading, and the widespread adoption of P2P sharing for the surge in new business.
While major internet service providers like AT&T, Verizon and Comcast all plan to upgrade their backbones, it will be a year or two before improvements begin to show. By then, internet television will be in full bloom, spammers will have multiplied ten-fold, WiFi will be embedded in every moving object, and users will be screaming for yet more capacity.
In the meantime, accept that surfing the web is going to be more like travelling the highways at holiday time. You’ll get there, eventually, but the going won’t be great.
2. Surfing will detach
Earlier this month, Google bid for the most desirable chunk (known as C-block) of the 700-megahertz wireless spectrum being auctioned off by the Federal Communications Commission (FCC) in late January 2008. The 700-megahertz frequencies used by channels 52 to 69 of analog television are being freed up by the switch to all-digital broadcasting in February 2009.
The frequencies concerned are among the world’s most valuable. They were used for broadcasting UHF television because they suffer little atmospheric absorption, can be beamed for miles, and can penetrate all the nooks and crannies in buildings. Their relatively short wavelength also makes the transmission equipment compact and the antennas small.
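The antenna claim can be checked with a back-of-envelope calculation. The wavelength formula is standard physics; the FM-band comparison frequency is our own illustrative choice, not from the article:

```python
# Back-of-envelope check: wavelength = c / frequency, and a simple
# quarter-wave antenna is a quarter of that length. Shorter wavelength
# means smaller antennas, which is why 700 MHz suits handheld gear.

C = 299_792_458  # speed of light, m/s

def wavelength_m(freq_hz):
    """Free-space wavelength in metres for a given frequency in hertz."""
    return C / freq_hz

for label, freq in [("700 MHz (UHF TV)", 700e6), ("100 MHz (FM radio)", 100e6)]:
    lam = wavelength_m(freq)
    print(f"{label}: wavelength {lam:.2f} m, quarter-wave antenna {lam / 4 * 100:.1f} cm")
```

At 700 MHz the wavelength is about 43 cm, so a quarter-wave antenna is roughly 11 cm, short enough to tuck inside a handset; at FM-radio frequencies the same antenna would be about seven times longer.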
Mobile phone companies lust after the 700 megahertz frequencies because of their long range and broadband capabilities. They see lots of lucrative things like mobile television and other broadband services to offer customers.
But the 700 megahertz band is also the last great hope for a “third pipe” for internet access in America. Such a wireless network would offer consumers a serious alternative to the pricey and poor DSL (digital subscriber line) services they get from the likes of AT&T and Verizon, and to the marginally better cable broadband Comcast provides.
Over the past couple of months, techdom has been abuzz with rumours about Google getting into the mobile phone business—with a G-Phone to trump Apple’s iPhone. That’s highly unlikely.
The speculation was triggered by the company’s recent unveiling of its Android operating system for mobile phones. But the whole point of Android is not to allow Google to make fancy handsets, but to make it easier for others to do so.
The aim, of course, is to flood the market with “open access” phones that have none of the restrictions that big carriers impose—like not being able to download software and games from other makers, or search the internet freely, or make free VoIP (voice over internet protocol) calls from within a WiFi hotspot.
Android has been made available to a group of manufacturers orchestrated by Google and known as the Open Handset Alliance. One of the nimblest of the group, HTC of Taiwan, has already started showing a BlackBerry-like prototype based on the Android operating system. Expect to see a raft of Android phones from other manufacturers over the coming months.
Nor is Google in the business of building a network of cellular antennas and fat communications pipes. Should it win the bidding for C-block, it would presumably team up with Frontline Wireless, a startup with serious expertise and money behind it.
That’s because Google’s core business is organising knowledge and giving users access to it. Google makes its money—and lots of it—from matching advertisers to consumers who use its search engine to look up things, not from tinkering with slim-margin ventures like wireless networks.
But despite owning the world’s largest knowledge base—with over 60% of the online search market—Google is at the mercy of others who control the on-ramps to the internet. That rankles.
Worse, it has no way of getting at the other billion users who rely more on mobile phones than personal computers to organise their lives. Clearly, the time has come to muscle into the moribund mobile-phone business.
Bidding $5 billion or more (the reserve is $4.6 billion) to beat out wireless heavyweights like AT&T and Verizon could give Google the option to become a cell-phone operator in partnership with Frontline, with a ready supply of handsets from its alliance partners and none of the hassles of running a network. Alternatively, it could become an internet service provider with a long-range wireless network to rival the WiMAX networks being built by Sprint and others.
But Google may want to do neither. Sceptics note that Google single-handedly persuaded the FCC to attach all manner of “open access” provisions to the C-block of frequencies—something that was anathema to the mobile-phone companies. Verizon even sued the FCC in a bid to block its move to open access.
Having failed to do so, Verizon now says it will open its network to third-party devices sometime in the future—and presumably for an additional charge. But the FCC is not just taking Verizon’s word for it.
The winner of the C-block of frequencies, whoever that may be (and Verizon is the odds-on favourite), will have to open the network to any device that meets the basic specification. And the devices themselves will have to be open to other suppliers’ software and services.
In short, win or lose, Google has already achieved its objective. Internet searches will doubtless be as popular among mobile-internet surfers as among their sedentary cousins. Owning at least 60% of the mobile search market is the prize Google has been after all along.
3. Surfing—and everything else computer-related—will open
Rejoice: the embrace of “openness” by firms that have grown fat on closed, proprietary technology is something we’ll see more of in 2008. Verizon is not the only one to cry uncle and reluctantly accept the inevitable.
Even Apple, long a bastion of closed systems, is coming round to the open idea. Within days of its launch, its heavily protected iPhone was hacked by owners determined to run third-party software, such as Skype, on it.
Apple’s initial response was to attempt a heavy-handed crackdown. But then a court decision in Germany forced its local carrier to unlock all iPhones sold there. Good news for iPhone owners everywhere: a flood of third-party applications is now underway.
The trend toward openness has been given added impetus by the recent collapse of the legal battles brought by SCO, a software developer. Formerly known as the Santa Cruz Operation, the firm bought the Unix operating system and core technology in 1995 from Novell (which, in turn, had bought it from its original developer, AT&T).
Short of cash, SCO initiated a series of lawsuits against companies developing Linux software, claiming it contained chunks of copyrighted Unix code. Pressured by worried customers fearing prosecution, a handful of Linux distributors settled with SCO just to stay in business.
But IBM, which uses Linux, was having none of it, and fought SCO through the courts until the claims collapsed. SCO is now operating under Chapter 11 of the American bankruptcy code.
The verdict removed, once and for all, the burden that had been inhibiting Linux’s broader acceptance. Linux is now accepted as being Unix-like, but not a Unix-derivative.
Bulletproof distributions of Linux from Red Hat and Novell have long been used on back-office servers. Since the verdict against SCO, Linux has swiftly become popular in small businesses and the home.
That’s largely the doing of Gutsy Gibbon, the code-name for Ubuntu 7.10 from Canonical. Along with distributions such as Linspire, Mint, Xandros, OpenSUSE and gOS, Ubuntu (and its siblings Kubuntu, Edubuntu and Xubuntu) has smoothed most of Linux’s geeky edges while polishing it for the desktop.
No question, Gutsy Gibbon is the sleekest, best integrated and most user-friendly Linux distribution yet. It’s now simpler to set up and configure than Windows. A great deal of work has gone into making the graphics, and especially the fonts, as intuitive and attractive as the Mac’s.
Like other Linux desktop editions, Ubuntu works perfectly well on lowly machines that couldn’t hope to run Windows XP, let alone Vista Home Edition or Apple’s OS X.
Your correspondent has been happily using Gutsy Gibbon on a ten-year-old desktop with only 128 megabytes of RAM and a tiny 10 gigabyte hard-drive. When Michael Dell, the boss of Dell, runs Ubuntu on one of his home systems, Linux is clearly doing many things right.
And because it is free, Linux has become the operating system of choice for low-end PCs. It started with Nicholas Negroponte, the brains behind the One Laptop Per Child project that aims to deliver computerised education to children in the developing world. His clever XO laptop, costing less than $200, would never have seen the light of day without its Linux operating system.
But Mr Negroponte has done more than create one of the world’s most ingenious computers. With a potential market measured in the hundreds of millions, he has frightened a lot of big-time computer makers into seeing how good a laptop they can build for less than $500.
All start with a desktop version of Linux. Recent arrivals include the Asus Eee from Taiwan, which lists for $400. The company expects to sell close on four million Eees this financial year. Another Taiwanese maker, Everex, is selling its gPC desktop through Walmart for $199.
When firms are used to buying $1,000 office PCs running Vista Business Edition and loading each with a $200 copy of Microsoft Office, the attractions of a sub-$500 computer using a free operating system like Linux and a free productivity suite like OpenOffice suddenly become very compelling.
And that’s not counting the $20,000 or more needed for Microsoft’s Exchange and SharePoint server software. Again, open-source equivalents running on Linux provide such server software for free.
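The savings are easy to total up. The sketch below uses the article’s own figures ($1,000 Vista PCs, $200 Office licences, a sub-$500 Linux machine, $20,000 of server software); the 50-seat office is an illustrative assumption:

```python
# Rough per-office cost comparison using the article's figures. The seat
# count is an illustrative assumption; the server figure is the article's
# "$20,000 or more" taken at its minimum.

SEATS = 50

PC_VISTA = 1_000        # office PC running Vista Business Edition
OFFICE_LICENCE = 200    # per-seat copy of Microsoft Office
SERVER_SOFTWARE = 20_000  # Exchange and SharePoint, taken as a flat figure

PC_LINUX = 500          # sub-$500 PC; Linux, OpenOffice and servers are free

proprietary = SEATS * (PC_VISTA + OFFICE_LICENCE) + SERVER_SOFTWARE
open_stack = SEATS * PC_LINUX

print(f"proprietary stack: ${proprietary:,}")
print(f"open stack:        ${open_stack:,}")
print(f"saving:            ${proprietary - open_stack:,}")
```

On these assumptions the open stack costs less than a third of the proprietary one, which is the pull the article is describing.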
Pundits agree: neither Microsoft nor Apple can compete at the new price points being plumbed by companies looking to cut costs. With open-source software maturing fast, Linux, OpenOffice, Firefox, MySQL, Evolution, Pidgin and some 23,000 other Linux applications available for free seem more than ready to fill that gap. By some reckonings, Linux fans will soon outnumber Macintosh addicts. Linus Torvalds should be rightly proud.
Via The Economist