Monday, August 20, 2007

How to compete with BT in the Openreach model?

Comments on my previous post and the whole debate about the BBC iPlayer have got me thinking. According to Ian Wild of Plusnet (see his comments), the amount of money that a Wholesale Broadband Access (IPstream) provider needs to pay for backhaul is £180 - £200 per Mbps per month. The use of PPPoA also means that you can't keep local traffic local and off the backhaul. This can get very expensive very fast, since traffic per customer will grow by 50% per year, minimum. To get an idea: if you need to budget 100 kbps of peak-time traffic per customer, costing £20 per month, next year it will be £30 and the year after £45 (unless the regulator regularly pushes the prices down). So this is a no-win situation for the ISPs.
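
The compounding is easy to check. A back-of-the-envelope sketch in Python, using the figures above (£200 per Mbps per month as the upper bound of the quoted backhaul price, 100 kbps peak traffic per customer, 50% annual growth; the per-customer numbers are illustrative):

```python
# Rough sketch: how backhaul cost per customer compounds year over year.
# Assumed figures (from the post): ~£200 per Mbps per month for backhaul,
# 100 kbps of peak-time traffic per customer today, 50% traffic growth per year.

PRICE_PER_MBPS = 200.0  # pounds per Mbps per month (upper bound quoted)
GROWTH = 1.5            # 50% traffic growth per year

def monthly_cost(kbps_today: float, years_out: int) -> float:
    """Backhaul cost per customer per month, `years_out` years from now."""
    kbps = kbps_today * GROWTH ** years_out
    return kbps / 1000.0 * PRICE_PER_MBPS

for year in range(3):
    print(f"year {year}: £{monthly_cost(100.0, year):.2f} per customer per month")
```

Which gives £20, £30 and £45 for years 0, 1 and 2, exactly the progression above: at a flat regulated price, the bill grows as fast as the traffic does.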

What I am wondering now: under what conditions is BT charged in areas where there is no ULL available? Is Openreach charging BT the fee it would charge the Wholesale Broadband Access providers? If so, then why doesn't BT complain about the iPlayer seriously hurting margins? Or is it effectively a ULL provider everywhere in the UK, feeling no pain on its backbone because it doesn't need to pay the high backhaul charges? Or better yet, is part of BT's backhaul paid for by the rising backhaul charges levied on the WBA providers?

Maybe somebody can explain things to me. There might very well be no conspiracy here. :-) And though it might explain the position of the likes of Tiscali, it still doesn't put them in the right. (It might put Ofcom on the spot!)

Wednesday, August 15, 2007

BBC's iPlayer as the poster child for net neutrality

It's interesting to see how in the UK some of the lesser ISPs (Tiscali and their lot) have complained in the press about the public broadcasting behemoth BBC and its iPlayer. iPlayer is the BBC's attempt to copy the success of the Dutch public broadcasters' "Uitzending Gemist" (literally: "missed a broadcast"). It does this using Kontiki, a peer-to-peer program. The ISPs claim that this comes at their expense. IPdev-Blog and Telebusilis have analyzed this in some detail.

Jeremy Penston of IPdev has analyzed very well why ISPs won't invest in new networks and network expansions themselves. The process, in short, is one of mutually assured destruction. If two companies build the same network, they create an oversupply of network connections and bandwidth in the market. They will end up in a price war from which neither can bail out, and both will go bankrupt in the end. (Even if one of them wins the first round, the losing network can be revived from bankruptcy at marginal cost and start the second price war.) The solution seems to be a regional or national public infrastructure. I agree with his ideas and hope to publish a paper along those lines soon.

However, both Telebusilis and IPdev argue that the content creators should, in one way or another, finance the build-out of extra capacity in the network. They argue it is not fair for the BBC to come up with a new service that taxes the networks of ISPs in Britain (up to 67 pence per hour of viewing). I couldn't disagree more with them. I think it's only the ISP and its customers that should pay for it. It's the end-user that creates the costs, and it's there that the costs should lie.

We live in great times: on a daily basis, people all around the net invent new high-bandwidth services to use over the internet. I'm watching my three-day-old cousin in a hospital on a high-def webcam. You can watch live concerts at Fabchannel. People dress up in Second Life. In Twente, security companies watch their customers' premises using dedicated light paths. Every TV channel and production company is looking into the on-demand opportunity. These new ideas have ever higher bandwidth demands.

There are several strategies to minimize the costs for content producers. Bill Norton of Equinix has made a very good analysis of the costs of video distribution over the internet. His analysis shows that using a peer-to-peer model (like the iPlayer) is the most cost-effective option for the content provider. Or as Cringely paraphrases it:

Norton's analysis, which appears to me to be well thought-out, concludes that P2P is vastly cheaper than any of the other approaches. He concludes that distributing a 1.5 gigabyte movie over the Internet in high volume will cost $0.20 using the current transit model (a single huge distribution server), cost $0.24 using an edge-caching CDN like Akamai, cost $0.17 with a homemade CDN like I used last season to distribute NerdTV, or cost $0.0018 to distribute using P2P. That makes P2P 35 times cheaper than any of the alternate approaches. And (...) Norton further makes the point that none of these distribution models does anything to soften the blow on the ISP. CDNs in particular cost more -- that more being revenue to the CDN -- yet do nothing for the ISP.

Well, the BBC could also do the calculations, and it came up with the advised solution. Which might actually be a better solution for ISPs as well, mind you. This is not often mentioned, but a well-designed P2P protocol keeps local traffic local. So if your neighbour wants to watch a movie that you happen to have on your PC, in an ideal world he would not need to burden the backhaul links from your town to the main switching office; everything stays local. This relieves the ISP's network of heavy backhaul traffic. Just imagine if an entire town were streaming from the servers of the BBC: at 1 Mbit/s per stream, a town with 10,000 parallel streams would be hitting 10 Gbit/s on the backhaul. This way the ISP can save on its backhaul and also on its interconnects with, e.g., the BBC. (Just how perfect the world of P2P protocols is can be seen at IPdev here and here.)
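
The town example works out as follows. A minimal sketch, using the post's figures of 10,000 parallel 1 Mbit/s streams; the 80% "locality" share (the fraction of traffic a P2P protocol serves from peers in the same town) is a made-up illustration, not a measured number:

```python
# Back-of-the-envelope: backhaul load if a whole town streams from central
# servers, versus a P2P protocol that keeps part of the traffic local.
# Assumed figures: 10,000 parallel streams at 1 Mbit/s each (from the post);
# the locality share is hypothetical.

def backhaul_gbps(streams: int, mbps_per_stream: float, locality: float = 0.0) -> float:
    """Backhaul demand in Gbit/s after `locality` share is served by local peers."""
    return streams * mbps_per_stream * (1.0 - locality) / 1000.0

print(backhaul_gbps(10_000, 1.0))                 # everything hits the backhaul
print(backhaul_gbps(10_000, 1.0, locality=0.8))   # 80% served by neighbours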

So why shouldn't the bandwidth hogs be paying for their bandwidth? The BBC has enough money, and it does pay for satellite capacity, so why should it get away for free? Well, the BBC isn't the only one designing high-bandwidth services; as said, it's everybody. All those new services mentioned contribute to the networks creaking under heavy loads. Remote security cams, baby cams, people with no First Life. All of them strain the network. Even normal surfing of the web adds to it. The question of who pays then quickly becomes a question of who we can most easily extort money from. Well, auntie Beeb is old and wealthy, so it might be easy to beat her up for her pension. It's much harder for a UK ISP to do the same to a Dutch hospital, or a security company, or a Japanese public broadcaster, though they might contribute as much to the demise of individual links as the BBC does. (Think of it as cars on the road: all the cars contribute to congestion, foreign and domestic, business or pleasure.) So what you get is that the costs are disproportionately allocated to those companies that are most easily taxed.

Another reason against taxing content providers is that the revenue stream will be so attractive for improving the competitiveness of the ISP that there is no reason to assume the money will go into network upgrades. It might just as well go into more advertising or lower prices. Even better, there is no reason to expect the taxation to cease once the network has increased its capacity. Like so many taxes, it tends to linger long after it has done its job. For the economist, it's rather like a terminating monopoly and will require equal amounts of regulation.

A third reason is that imposing a "Save the ISP" tax is detrimental to innovation. Think of it: would you want to father the new Skype if the bandwidth tax bill ends up on your doorstep? Of course not. That would be ridiculous.

By now people will be confused. It must be expensive, they think, to get a new network that can handle this amount of traffic. But again they are wrong. You can get a nationwide fiber-to-the-home network for roughly 35 euro per house per month (or an investment of between 1,000 and 2,000 euro per house). For most countries that is significantly less than their investments in roads, and it is equal to what it would cost now to build an electricity network from scratch. Yes, there are upfront costs, but it would last 50 years, allow for all kinds of innovations, etc. If the market doesn't provide this, you have a market imperfection that might require limited government intervention in the civil-engineering part of the physical network, if the benefits outweigh the costs; see e.g. Stokab in Sweden. But there are billionaires around willing to cherry-pick FTTH networks (Dik Wessels with Reggefiber). And there are even smart incumbents upgrading their networks to VDSL2 (KPN, Deutsche Telekom) or FTTH (Verizon), and new entrants (Free). Though we are still a bit away from universal 1-gigabit home connections for 35 euro a month.

35 euro per month buys you the fiber network (less if we fund it partially with government money). Interestingly, it doesn't matter whether you use it at 1 Mbit/s or 100 Mbit/s or even a gigabit: it all costs exactly the same. Different speeds on your ADSL line, e.g. 8 Mbit/s versus 1 Mbit/s, are only a means of price differentiation and have nothing to do with sending more bits over the network being more expensive. It doesn't get you the traffic yet, though. International and interregional traffic costs money. The way this is dealt with in many countries is with monthly traffic caps, e.g. of 40 gigabytes; if you use more, you pay more, or there is an acceptable use policy. The way this could work in the future is that you have a gigabit line to your house and a terabyte per month of interregional/international traffic (local traffic is free). If you go over, you pay more.
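
To see why the line speed and the traffic cap are genuinely separate products, here is a small sketch of how long a cap lasts at different line speeds. The 1-terabyte cap is the hypothetical figure from the paragraph above; the line speeds are illustrative:

```python
# Rough sketch: hours of flat-out use before a monthly traffic cap is reached.
# Assumed figures: a 1 TB (1,000 decimal GB) monthly cap as suggested in the
# post; the line speeds are illustrative.

def hours_to_cap(cap_gigabytes: float, line_mbps: float) -> float:
    """Hours of continuous full-speed use before the cap is reached."""
    cap_megabits = cap_gigabytes * 1000.0 * 8.0  # decimal GB -> megabits
    return cap_megabits / line_mbps / 3600.0

for mbps in (1.0, 8.0, 100.0, 1000.0):
    print(f"{mbps:>6.0f} Mbit/s line: 1 TB cap lasts {hours_to_cap(1000.0, mbps):.1f} h flat out")
```

A 1 Mbit/s line physically cannot exceed the terabyte in a month, while a gigabit line burns through it in a little over two hours, which is exactly why the cap, not the line speed, is where the traffic costs should be priced.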

Now we arrive at the problem with high-bandwidth applications like P2P applications or babycams. The way Joost and the BBC's iPlayer work, they exchange traffic even when the user isn't actively watching. Users actually have no way of knowing or limiting the amount of traffic these applications use. With a babycam you could calculate it, but it's not intuitive. This should be fixed. A user should know how much cost they are incurring by using innovative applications. They can then limit their usage according to their needs. It will also push ISPs to increase the monthly traffic cap to offer their customers more than the competitor does. ISPs can then extract the money from their customers based on the amount of bits, not on the type of application or on which granny to beat up. If customers want to use more, they pay the ISP and they get the bits, regardless of what they use them for.

Alright, this seems too easy. Networks get paid for by the customer, and it seems like content providers are getting a free ride on the network-innovation train. The content providers have all this income from advertising and they should share... shouldn't they? There are several arguments against this. First of all, it's highly questionable whether there really is that much money in advertising. The total turnover of the Dutch advertising industry is 6 billion euro, and this supports ten TV channels, around 10 national newspapers, a couple of hundred magazines, thousands of websites, etc. Some of it doesn't even support content, like billboards and classifieds systems like Monsterboard. (In comparison, the mobile telecoms sector makes 6 billion a year too, with four networks.)

Secondly, efficiency in distribution leaves room for innovation elsewhere. Just as containers revolutionized shipping and cemented China's position as factory of the world, so too will new networks and P2P decrease transaction costs and revolutionize the delivery of content. This will lead to globalisation of the content market, and the infrastructure will enable all kinds of innovations, from babycams to immersive content. If there are excess profits to be made in the content market through advertising and pay-per-view models, there will be new entrants into the market, and the breadth and hopefully the quality of the content will go up. This will redistribute the wealth in the market to such an extent that the big advantage of content owners over ISPs that some see will disappear. Efficient markets hate long-term excessive profits for an entire industry, though one company may prosper because of enormous economies of scale and network effects.

Therefore the conclusion is:
New applications will demand more and more bandwidth, and their combined usage will compound the problem. This will push ISPs to deliver more bandwidth and traffic, and users will be paying for this one way or another. If the market doesn't provide the bandwidth, government should. ISPs taxing those who design high-bandwidth applications is not a solution; it would be a disaster. We need innovation in content as well as in applications and services. In order to relieve the backhaul, local traffic should stay local, and local interconnection should be possible between ISPs and private networks; see NDIX for a great example (yes, I once worked there).

Saturday, August 04, 2007

Wishlist for Google Apps Enterprise

This is a wishlist of stuff I would like to have in my company to make my life easier. It's all about how we deal with information in organisations. There is so much information in companies, and most of it is tacit knowledge. This kind of knowledge is locked away in people's minds, mailboxes, bookmark lists, RSS readers, implicit references in memos, discussions, interactions. In the end it comes down to Google's mission: to make the world's information accessible. Microsoft gave us the office tools to produce information, but failed us badly in making it accessible. I've written this with an eye to Google, because they seem best positioned to deliver some of these advances, but hey, anybody can try to realize this dream, be they Microsoft, Zimbra or OpenOffice.

My Google Wishlist:

- Google Reader with Google Apps for enterprises. This way it should be possible to see what feeds your coworkers subscribe to, see what is hot on those lists, share the most important articles with your coworkers, etc. And for good measure it should include a company Digg/del.icio.us function.
- desktop and company-wide search
- Google Reader Enterprise version with sharing, searching, mining, and statistics on what is most read, shared, dugg, etc.
- in-company social-network pages to replace those tired phonebooks with MySpace/Orkut-like profiles. This can also provide clues about the projects we're in and therefore a web of relevance
- Google proxy sniffer (might be a privacy/security concern) that analyzes via the proxy which webpages are read most and are therefore important for our company
- Google Wiki. Well, they own JotSpot already; give it back to us and let every company grow its own wiki, or else we'll use Socialtext, Confluence or CentralDesktop
- Google GrandCentral, to finally be able to manage our internal telephone system, including IM, and have that well integrated with our calendar function, so that when somebody calls us, the system knows what to do and how to reach us properly
- Google BlackBerry functions. For the life of me I don't understand why the Crackberry can only function in such a limited way for in-company use... make it useful. Let me access all my company information on it, not just my mail, but also my intranet
- Google IM... Buy Jabber.com and build the best in-company IM system, one that can interact with other in-company IM systems just like e-mail systems interact, without the need for a third party in the middle
- Gmail/Calendar etc. Enterprise, without the need to host it at Google, but with the ability to run it in-company or at a third party. The apps are cool, but big companies never want to give everything to Google. They just want to give it to Suresh of Accenture in Bangalore.
- Google Spreadsheets that can actually integrate the data in the spreadsheet with the real world out there. So if I make a spreadsheet showing sales per region, I can push one button and get a map overview projected on Google Earth, integrate it with stats from the national bureau of statistics, or hook it up with data from Google Finance. Or that can actually animate the information in the spreadsheet, just like the Gapminder software they bought from Prof. Hans Rosling. (Google him, he's brilliant)
- Google Document Management System that actually allows us to manage documents the way we want, instead of the way the idiots at Hummingbird and Documentum want us to do things. I don't want to fill in a gazillion fields to store one document. I want to make it, store it, retrieve it and share it, without everything becoming too hard.
- Google company blogs
- Google subscription manager. Companies have many subscriptions to magazines and newspapers which give access to archives. However, employees never have the list of usernames and passwords. Help us manage this.

Sounds like a rather nice business plan for the Google Apps division.