Thursday 5 November 2015

Web page weights, and the rise of the baby hippo

Web pages, like our large friend on the left there, are big and getting bigger. Once upon a time, web pages were just text, but these days they may include many high-res images, JavaScript, fonts and many other elements that all contribute to the total amount of data that needs to be transferred to create the web page. This is leading to concerns over 'web bloat'.

Not all of these files need to be downloaded before you start using the page. 'Below the fold' content (which initially sits off the bottom of your screen) can be downloaded while you're reading the content at the top.

For some sites, below the fold content is massive. The 'height' of the Daily Mail homepage is 5.16 metres, with less than 10% of the content initially visible. In one sense this approach is quite wasteful of internet traffic - the Daily Mail will send you all 5.16 metres, even if you never scroll past the top 30cm (assuming you don't click away elsewhere). But internet traffic is cheap, so the Mail isn't unduly worried.

The net result of larger and richer pages has been steady growth in 'page weights' - the amount of data that makes up a web page. They are now averaging a little over 2MB on the desktop:

Source: HTTP Archive
Technical change in Oct 2012 means data on either side not comparable

It's a toss-up whether this growth is exponential (+20-25%/year) or linear (+345 KB/year), but either way it's substantial and ongoing. That means more traffic for networks to carry, and more bandwidth needed to ensure web pages load briskly. (In practice, for technical reasons, latency is often a more important factor than bandwidth, and beyond 5 Mbps there seem to be diminishing returns).
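The difference between the two growth models compounds quickly. A quick back-of-envelope sketch, using the figures above and a ~2 MB starting point (round numbers, not the underlying HTTP Archive data):

```python
# Project desktop page weight under the two growth models above:
# exponential (~20-25%/year) versus linear (+345 KB/year).
def project(base_kb, years, rate=None, increment_kb=None):
    """Project page weight 'years' ahead, exponentially or linearly."""
    if rate is not None:
        return base_kb * (1 + rate) ** years
    return base_kb + increment_kb * years

base = 2048  # ~2 MB average desktop page, in KB

for y in (1, 3, 5):
    exp_kb = project(base, y, rate=0.225)        # midpoint of 20-25%/year
    lin_kb = project(base, y, increment_kb=345)  # +345 KB/year
    print(f"{y}y out: exponential {exp_kb/1024:.1f} MB, linear {lin_kb/1024:.1f} MB")
```

Five years out, the exponential model implies roughly a 5.5 MB page against the linear model's 3.7 MB, so which curve is the right fit matters a good deal to network planners.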

However, the growth in desktop website page weights is not the whole story - the mother hippo has been joined by a baby hippo. In recent years there has been a massive shift to mobile consumption, and page views from mobile devices now represent almost 40% of the total. (In Africa it's over 60%).

This matters in the context of page weights because mobile pages tend to be much lighter - roughly half the weight of fixed pages. For mobile devices, web page designers need to be conscious of higher consumer data charges, smaller screens and so on. Consequently both the number and size of files transferred are lower.

Source: HTTP Archive, StatCounter, author's analysis
Weight average based on UK traffic mix

Clearly mobile page weights are growing steadily too, but because they start so much lower than desktop page weights, the shift to mobile is suppressing the growth in average consumed page weight, just as that hippo calf has reduced the average hippo weight in the enclosure. While desktop pages have been growing at +345 KB/year, the average consumed page is only growing at +230 KB/year.
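The mechanics of that suppression are simple to illustrate. The sketch below uses round numbers in the spirit of the post (2 MB desktop pages, mobile at half that, mobile share rising from 40%), not the exact underlying data:

```python
# Illustrative blended page weight across a desktop/mobile traffic mix.
def blended_weight(desktop_kb, mobile_kb, mobile_share):
    """Traffic-weighted average page weight, in KB."""
    return desktop_kb * (1 - mobile_share) + mobile_kb * mobile_share

# Year 1: mobile is ~40% of page views, at roughly half the desktop weight
y1 = blended_weight(2048, 1024, 0.40)
# Year 2: desktop grows +345 KB, mobile grows more modestly, and the
# mobile share keeps rising - which drags down the blended growth
y2 = blended_weight(2048 + 345, 1024 + 170, 0.45)
print(f"blended growth: +{y2 - y1:.0f} KB")
```

Even though the desktop figure grew by the full +345 KB, the blended average grew by only around +215 KB in this toy example - the same effect as the +230 KB/year observed figure.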

However, baby hippos don't stay baby hippos, and once the transition to mobile devices is complete, the growth in average page weight will accelerate again - unless of course we've shifted all our usage to apps by then, which are even lighter than mobile pages.

Thursday 22 October 2015

4K TV : 0.004K TV after compression

High resolution video is often cited as a driver for ultra-fast broadband. Here, for example, is Hyperoptic (a UK ISP) suggesting that if you want to watch 4K TV, you need 1 Gbps. 100 Mbps supposedly won't be enough.

A 4K TV isn't for everyone - apart from anything else it's very large, as you can see (though it isn't mandatory to install two Korean ladies with each set). However, it's certainly becoming more popular, and by 2020 over a third of West European households are expected to have a 4K set.

But what is frequently glossed over in broadband discussions is how little bandwidth is currently required for 4K TV, and how much less will be required in future. To be sure, how much bandwidth is needed for 4K is not a simple question. It depends on (at least) three things: the resolution of the video, the nature of the content and the time you have to compress it.

Uncompressed, high quality 4K can require 3 Gbps or more. However, in practice 4K is never delivered to consumers uncompressed. A compression algorithm (codec) is used to convert the raw digital video into a far smaller data stream. Many techniques are used in such algorithms. For instance, if a portion of the image is unchanged since the previous frame, the algorithm may (effectively) say 'for this portion of the screen, same again'. This requires far less data than retransmitting each pixel in that part of the screen. Or, if a large part of the image is all the same colour, then the algorithm may transmit the boundaries of the colour block, rather than separately transmitting the colour of each pixel within it.
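Both of those techniques can be shown in miniature. The toy sketch below treats a 'frame' as a simple list of pixel colours - real codecs such as H.264 or HEVC are vastly more sophisticated, but the principle of exploiting redundancy is the same:

```python
# Toy illustration of two codec techniques described above: a frame
# delta ('same again' for unchanged pixels) and run-length coding of
# uniform colour blocks.
def frame_delta(prev, curr):
    """Encode only the pixels that changed since the previous frame."""
    return [(i, px) for i, (old, px) in enumerate(zip(prev, curr)) if px != old]

def run_length(frame):
    """Encode runs of identical colour as [colour, count] pairs."""
    runs = []
    for px in frame:
        if runs and runs[-1][0] == px:
            runs[-1][1] += 1
        else:
            runs.append([px, 1])
    return runs

prev = ['blue'] * 8
curr = ['blue'] * 6 + ['red'] * 2        # only the last two pixels changed
print(frame_delta(prev, curr))           # [(6, 'red'), (7, 'red')]
print(run_length(curr))                  # [['blue', 6], ['red', 2]]
```

In both cases eight pixels are described in two entries - and on a mostly-static frame of millions of pixels, the savings are correspondingly dramatic.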

The effectiveness of such techniques depends on many things, including the sophistication of the algorithm, the available processing power & time and the nature of the content (content with lots of movement is inherently more difficult, for instance).

However, the reduction in bandwidth is generally dramatic. Netflix, who know as much about 4K streaming as anyone, say they average 15.6 Mbps. However, sports content (which has lots of movement and must be compressed in real-time) can require more. BT's 4K Sport currently uses 20-30 Mbps.

Thus even today 4K is well within the capabilities of sub-FTTH broadband, and it is baffling that Hyperoptic think 100 Mbps is insufficient. Moreover, 4K's requirements are only going to fall. Moore's Law means we have ever more processing power to play with, which can be traded off against bandwidth, to maintain picture quality while using fewer Mbps. In addition, processing algorithms grow ever more sophisticated. As a result, roughly 9% less bandwidth has been needed each year to support a given picture quality. Because video is such an important component of traffic these days, investment in codecs appears to be growing, meaning that the 9% rate may actually accelerate.
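That 9% annual gain compounds meaningfully. A quick sketch, starting from Netflix's 15.6 Mbps average:

```python
# Compound the ~9% annual codec efficiency gain mentioned above,
# starting from Netflix's 15.6 Mbps average for 4K.
def bitrate_after(start_mbps, years, annual_gain=0.09):
    """Bitrate needed for the same picture quality 'years' from now."""
    return start_mbps * (1 - annual_gain) ** years

for y in (5, 10):
    print(f"after {y} years: {bitrate_after(15.6, y):.1f} Mbps")
```

On this trajectory, today's 15.6 Mbps stream needs under 10 Mbps in five years and around 6 Mbps in ten - comfortably within reach of ordinary ADSL, even before any acceleration in codec investment.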

Companies are already claiming dramatically lower bandwidths for 4K in trials. For instance, V-Nova has reported streaming 4K at just 6 Mbps in a trial with EE (the UK's largest mobile operator). Tveon, a Canadian start-up, is even more aggressive, suggesting that with their technology 2 Mbps will be enough. (That's better than a 1000:1 compression of the raw stream).
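A quick sanity check on those compression ratios, taking ~3 Gbps as the uncompressed 4K stream cited earlier:

```python
# Implied compression ratios versus a ~3 Gbps raw 4K stream.
raw_mbps = 3000
for name, mbps in [("Netflix", 15.6), ("V-Nova trial", 6), ("Tveon claim", 2)]:
    print(f"{name}: {raw_mbps / mbps:.0f}:1")
```

Netflix's average works out at roughly 190:1, V-Nova's trial at 500:1, and Tveon's claimed 2 Mbps at 1500:1 - which is indeed better than 1000:1.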

While these claims will need to be proven out, they nonetheless suggest the potential for dramatic improvement. Indeed, even at double V-Nova's 6 Mbps, most ADSL lines would be able to support 4K TV.

Your future TV may or may not be 4K, and you may or may not be able to see the difference even if it is. However, that monster TV won't be a justification for bringing fibre to your front door.

Monday 12 January 2015

Killer Gigabit Apps - and why 1,259 experts are wrong

Sandy Lindsay, Master of Balliol College Oxford (1924-49), was once locked in debate with the fellows (professors) at the college on a contentious issue. It came to a final vote, in which the fellows, to a man, voted against the Master. He scowled around the room, saying “Gentlemen, we appear to have reached an impasse.”

In this post I’m going to take a similarly hubristic approach, by disagreeing with 1,259 experts. The 1,259 experts are cited in a recent report from the Pew Research Center, Killer Apps in the Gigabit Age. The Pew Research Center is a US non-partisan body which publishes much valuable material on media and the internet (among other topics). I’ve frequently cited their work. This report too is full of interesting ideas – my main problem with it is its title, for reasons I’ll come on to.

For the report Pew took responses from 1,464 experts, of whom 1,259 said they believed major new applications would capitalise on a significant rise in US bandwidth in the years ahead – the Gigabit Age of the title.

Pew also asked the experts what those applications might be – and here’s where it gets interesting. The experts had many, many responses – Pew needs almost 50 pages just to summarise them. But almost none of the proposed applications need gigabit speeds or anything like them.

To take one example, telepresence is a recurring theme in the responses. This may or may not become widespread in the future - but the key point is that it does not require a gigabit. Even professional telepresence systems with a screen down the middle of the conference table seating six at your end and another six in Timbuktu (or wherever your counterparts are) require just 18 Mbps according to Cisco and Polycom, who make such systems. So if you decide to chop your dining table in two and install multiple hi-def screens so you can have permanent telepresence with your Auntie Ethel, bandwidth will be the least of your worries.

Virtual reality is also oft mentioned in Pew's report. Oculus Rift is the closest we have to usable VR. It's in advanced prototype stage, and is already impressive. The official verdict of this 90-year-old tester (having a virtual tour of Tuscany) is 'holy mackerel!'

I haven't been able to track down official views on the bandwidth required for Oculus Rift, but the displays are 1,000 x 1,000 pixels per eye. In combination that's about a quarter of the resolution of a 4K TV (with similar frame rates). Given that 4K requires 16 Mbps, this suggests that VR may actually be a relatively low bandwidth application.
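The pixel arithmetic behind that comparison is straightforward (and bitrate doesn't scale perfectly linearly with pixel count, so the final figure is only indicative):

```python
# Check the resolution comparison above: two 1,000 x 1,000 eye
# displays versus a 3,840 x 2,160 4K/UHD panel.
vr_pixels = 2 * 1000 * 1000   # both eyes combined
uhd_pixels = 3840 * 2160      # standard UHD resolution

print(f"VR is {vr_pixels / uhd_pixels:.0%} of 4K resolution")   # 24%
print(f"implied bitrate: ~{16 * vr_pixels / uhd_pixels:.1f} Mbps")  # ~3.9 Mbps
```

Even if the true figure were double that crude pro-rata estimate, VR would still sit comfortably within a basic broadband connection.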

Some of the experts mentioned holographic displays. Bandwidth for these? Who knows. We'll put them in the 'maybe' category.

A number of the experts mentioned e-health, including monitoring vital signs, remote consultation and so on. Again, these are not high speed apps – they require kilobits or a few megabits at most. Several of the respondents cited that old chestnut, remote surgery. Does anyone seriously think this is enabled by improved home bandwidth?

Wearable computing, the internet of things, life logging and a wide array of other possibilities were mentioned in the report – but again, there is no reason to expect these to need gigabit speeds or anything like them.

So the real story here is not that there's a cornucopia of apps that require gigabits. Rather it is that a respected research institute could ask over a thousand experts, and still not find a single clear case of an application requiring gigabit speeds. Change the title to 'Lack of Killer Apps for a Gigabit Age', and the Pew report is spot on.