
Brandon's History Of Online 91Èȱ¬


Brandon Butterworth | 13:27 UK time, Tuesday, 18 December 2007

Brandon Butterworth is a Principal Technologist in the 91Èȱ¬'s research and development team and the man who first registered the bbc.co.uk domain. He's such a key figure in the history of the 91Èȱ¬'s technical infrastructure that he has a room named after him at bbc.co.uk towers. This post is part of the tenth birthday celebrations of bbc.co.uk. [Update 24/9/09: Brandon is now Chief Scientist, 91Èȱ¬.]

Imagine there's no interweb...

...that's unpossible. We have a lot to thank the internet for - besides a new language and LOLcats.

The first 10 years were the best

When I put the 91Èȱ¬ on the net, it wasn't a single moment; we'd been working towards it for a while. We'd had email, file transfer and piracy since the late 1980s, ours via dial-up.

The USA had proper internet.

I wanted it.

First was the grand unification of internal networks, sharing a connection and the cost and bringing together radio, TV and the World Service. I installed a circuit from the first fully commercial ISP in the UK - Pipex - and we were on the net. It didn't do much, but what had been terminally slow over batch dial-up became fast and, for the first time, we had direct connectivity to other hosts on the net.

Then the web arrived. We set up www.bbc.co.uk and started playing with mostly laughable content until we found a few like-minded people around the 91Èȱ¬ who had more time and material for producing content. Little of the content is still online - except one of the first sites, which is still available.

From then on (summer 1994), programme-makers joined in or started projects of their own. We grew on donated time and hardware, experimenting with technology and content.

Having drawn radio and TV content into the website, we also put the internet into programmes. Email feedback seems trivial now, but being able to respond to a programme and have the presenter answer you on air was far simpler than a phone-in. Emailing questions into live political chat shows hooked News, and Radio 3 produced a programme live from user-generated content and streamed it.

We had our first foreign-language site too, though we later found that the producer had jumped the gun on a co-ordinated effort by the World Service to do many languages, such was the enthusiasm.

Some projects, including the 91Èȱ¬ Networking Club, found the expense of running their own independent services too much to continue and returned to the main site. This aggregation was to prove essential in growing the site and making it one of the most popular and cost-effective on the net. As 91Èȱ¬ budgets were tight, we reduced the cost of adding a new site by having it share the same resources and technology as the rest: add a new feature for one, and they all could add it. This meant less waste, too: no project sitting on idle resources that another needed.

Keeping it together let us negotiate better prices and had a huge impact on use - there was one brand and everyone knew where to get 91Èȱ¬ content. There had been lots of discussion over one domain or many. This was partly decided by squatters' use of some obvious addresses: if we used random domains, it was likely that we'd eventually find that an obvious one was already a porn site. Guessing a site name within bbc.co.uk meant that site navigation and search could help people find the right one; guessing a separate domain got you an unhelpful page, someone else's site, or just an error.

That's not to say there weren't disasters.

This is actually the 1998 logo for Radio 1 online

Radio 1 inhaled deeply for its June 1996 site relaunch. The redesign was leading-edge, dynamically generated by Dynamo - a Java server engine for publishing dynamic content. Everything came from the database, which also stored user data for the really dynamic elements such as message boards. It also crashed when more than a few people used it at once, leading to a quick hack to make a regular site from the content. Sadly, lots of interesting features lost their edge, but it now worked and the Shockwave elements were fine.

John Peel at Glastonbury

This shifted the risk tolerance for many years. It was realised that a working site was more valuable than the shiniest possibility, and that what is impressive in development may not scale when faced with a 91Èȱ¬-sized audience.

It wasn't all bad, though: the site also led in other areas that did work. It featured an early version of the Radio Player. We set up streaming of 30+ live programmes per week (around 80 hours) that played on demand until the next live transmission. There were also lots of live webcasts from events such as Glastonbury, made up of audio with an updating webcam, as video streaming hadn't been invented yet.

The UK servers over time: one rack, to three, then two rows of nine

As the site grew, we hit the limit of our Pipex line and set up some servers in Telehouse Docklands, where higher-speed connections were cheaper. This was important as streaming became popular: the webcasts carried content not entirely covered by radio or TV.

Budget

Telehouse New York, April 1997

"Nothing special," we thought in 1996. "Who wants to listen to a boring Budget speech?" That November, though, lots of people did; the 10-times peak for events became the norm, and the traffic started to stick. It helped News win the argument for the Election 97 site, though there were only six weeks left when we got approval.

The budget was too small and bandwidth was cheaper in New York, so I designed an architecture to exploit that with two sets of servers - one in Telehouse Docklands and one in Telehouse New York - with their own special server that directed people to the appropriate server farm for their continent.
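The directing logic is essentially a geographic lookup: send UK visitors to the Docklands farm and everyone else to New York. Below is a minimal sketch of that idea only, with hypothetical hostnames, and assuming the client's country is already known (the post doesn't say how the real "special server" decided).

    # A sketch of the split-site idea described above, not the 91Èȱ¬'s actual
    # code: the hostnames and the country lookup are invented for illustration.

    UK_FARM = "farm-docklands.example.net"   # hypothetical hostname
    US_FARM = "farm-newyork.example.net"     # hypothetical hostname

    def pick_farm(client_country_code: str) -> str:
        """Send UK visitors to the Docklands farm and everyone else to New York."""
        if client_country_code == "GB":
            return UK_FARM
        return US_FARM

    if __name__ == "__main__":
        for code in ("GB", "US", "AU"):
            print(code, "->", pick_farm(code))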

The growth of the New York servers

We also made a system for distributing content to the servers, called ftpborg. The servers were scavenged from other projects. Most of the site - around 8,000 pages - was made automatically from the election computer via a CPS built in a few days.
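The post doesn't describe how ftpborg worked internally, so the following is only a sketch of the general "push the same content to every server" idea, with invented hostnames and credentials.

    # ftpborg's internals aren't documented here; this just illustrates the idea
    # of pushing each published file to every server in the farm so they all
    # serve identical content. Hostnames and credentials are made up.

    from ftplib import FTP
    from pathlib import Path

    MIRRORS = ["web1.example.net", "web2.example.net"]  # hypothetical farm members

    def push_file(local_path: Path, remote_name: str,
                  user: str = "borg", password: str = "secret") -> None:
        """Upload one file to every mirror so the farms stay in sync."""
        for host in MIRRORS:
            with FTP(host) as ftp, open(local_path, "rb") as src:
                ftp.login(user, password)
                ftp.storbinary(f"STOR {remote_name}", src)

    # Example: push_file(Path("vote97/index.html"), "index.html")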

The Politics 97 Hong Kong handover animation

The split site met a criterion that was to figure heavily later - keep the UK users' traffic independent of the rest of the world's. UK licence-fee payers got preferential access; the rest of the world got whatever was left that we could afford to give them.

Chuffed with their success, News followed on with the Politics 97 site, which had our first public video streaming of the Hong Kong handover.

The Diana coverage video stream

So far, we all had other jobs to do, as the site was neither official nor funded. Options were considered, including not having a site at all or giving the content to others. Pressure mounted, meetings went on and on... and then something happened. There had been a car crash and reports claimed that Diana was in it. There was nobody around to update the homepage, so I worked with a friend at World Service, who was in on the Sunday updating their site, until News arrived. We added a video stream while they built a tribute site. On Monday, we started planning the funeral webcast and by the Saturday we'd organised a syndicated stream fed from our site to EU ISPs and others.

By a week later - 10 September - the response to the Diana coverage had convinced everyone that the internet would be big and that the 91Èȱ¬ would be there - properly. With an October deadline, there was no point continuing with meetings. A committee wasn't going to make it. A ninja squad was needed.

I got a small bucket of cash and got told to do whatever was needed.

We got a bunch of shiny new Sun servers - 64-bit, too - installed them in Docklands and New York along with large internet links, and updated lots of the software ready for the launch at the end of October.

The next 10 years were the best, too

News launched first in mid-November; not expecting the main infrastructure to be ready, they'd rented servers from an ISP. They had our first video-on-demand service: the One, Six and Nine O'Clock news bulletins in streamed video - as the iPlayer does today.

The main site launched on 15 December.

1998 was a year of events that were fun to webcast - Glastonbury among them - though less interesting for infrastructure development, as some applications that were requested failed to deliver: frustration all round.

In April we set up a 24/7 stream for the World Service in case war broke out again. It didn't, but the stream was left running as it was too popular to switch off.

News was having performance problems with its outsourced servers, including ISPs having trouble reaching them. News's own ISP didn't have an open peering policy, leading to questions from others as to why they should pay to get to the News site when they peered with us on the main site for free. This came to a head in November; our offer was to add News hosting to the New York server farm, relieving their servers of non-UK traffic. This helped, and as their ISP was pulling out of the hosting market, News set up a set of servers on the 91Èȱ¬ network in the UK too. This worked well with the huge growth.

The technology stayed the same for some time; we spent the next year tweaking it as applications and sites were added. I built up an operations team - one that had started in early 1998 with a couple of contractors - to handle the day-to-day support needed by the content producers and developers. An important part of their work became security checking developers' code. The 91Èȱ¬ was promoting internet use in the UK, bringing in new, naive and vulnerable users. The last thing we wanted was tabloids running "91Èȱ¬ hacked" headlines and users being put off.

I was aiming for one team looking after operations on a technical and editorial (webmaster) level, but after hiring them, politics pulled them apart into two divisions - though they worked closely together.

Kingswood Warren, August 1999

The site was getting quite busy; armed with the traffic we'd generated, it was time to play the next level: our own network.

So far we'd gone through ISPs for internet connectivity. Rather than use one to forward all of our traffic to all of the others, we had the option of delivering it to each of them directly - but that was only sensible in certain circumstances. For the UK, it meant that we had to be an ISP and join LINX, where most UK ISPs exchanged traffic.

See the network develop.

Above is May 2000: see also Oct 2000; Jan 2001; May 2001; Dec 2001; Nov 2003; Jul 2004

I'd been looking at this for some time, but finally, in late 1999, found a way through the LINX membership rules that had prevented us joining so far. LINX worked well, and we made similar arrangements in New York, later expanding to the EU and the US west coast. We still took transit from a couple of ISPs to cover failures and to reach ISPs not present at the locations we'd built our network into - around 15% of the total traffic - as that was cheaper.
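As a rough feel for the economics of that choice: the 15% share comes from the paragraph above, but every price and traffic figure below is invented purely to illustrate why peering for most traffic and keeping a little transit can beat sending everything via one upstream ISP.

    # Illustrative numbers only - prices and traffic volumes are made up.

    traffic_mbps = 100          # hypothetical average outbound traffic
    transit_per_mbps = 300.0    # hypothetical monthly price per Mbps of transit
    peering_fixed = 5_000.0     # hypothetical monthly cost of exchange membership, ports, kit
    transit_share = 0.15        # share of traffic still sent via transit (from the post)

    all_transit = traffic_mbps * transit_per_mbps
    peer_plus_transit = peering_fixed + traffic_mbps * transit_share * transit_per_mbps

    print(f"all transit:       {all_transit:10,.0f} per month")
    print(f"peering + transit: {peer_plus_transit:10,.0f} per month")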

As other high traffic sites grew, many followed the same path to self-hosting.

The Domecam

Having installed new infrastructure, we spent 1998-99 using it, taking on more content and applications with entertaining live webcasts. At Glastonbury, we fitted a remote-control camera to a pole by the cow shed with visibility of the whole site (nobody wanted to clean that kit when it came back). We set up a webcam on a lighthouse to watch the Millennium Dome being built; it lasted longer than the Dome. We did webcasts from all over - bird sanctuaries where BT delivered a phone line to a box in the middle of a field, and the Zambian desert for the 2001 eclipse, streaming live video via a satphone link. We set up Outside Broadcasts with an IP-over-satellite link so more webcasts could be done from around the UK.

Plymouth webcam, May 2001; One Big Sunday, July 2001

Our Kingswood site did need some attention. The computer suite was completely rewired, and the now-inadequate wall-mounted aircon was replaced by large floor-standing units. This was done with everything live: replacing the power wiring and main feeder meant running for a day on a mobile generator, with lots of hired-in theatre lighting wiring run around the floors.

An attempt to have a unified CPS didn't work out, though quite a lot of work went into testing packages. A consultancy reviewed the infrastructure delivery and operation, reporting that we were considerably underspending compared to market norms, which quelled thoughts of outsourcing it for a while.

The millennium bug reared up and distracted us for a while: we were certain that the infrastructure was fine, but to comply with 91Èȱ¬ policy we had to upgrade systems. As a test, we left one unpatched. It was fine. The only problem was content: scripts displaying 19100 instead of 2000 on web pages. It was quickly fixed, as we had most of the ops team on site for our own New Year's Eve party and to be around just in case the world failed as some predicted. We didn't use the special phone BT installed.
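That "19100" is the signature of a well-known class of date bug. The sketch below reproduces it, without claiming to know which language the original scripts were written in.

    # Date APIs such as C's struct tm and old JavaScript's getYear() return
    # years counted from 1900, so the year 2000 comes back as 100. Scripts that
    # built the year by gluing a literal "19" in front then printed "19100".

    years_since_1900 = 2000 - 1900              # what such an API returned on 1 Jan 2000

    buggy_year = "19" + str(years_since_1900)   # -> "19100"
    fixed_year = str(1900 + years_since_1900)   # -> "2000"

    print(buggy_year, fixed_year)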

Kingswood Warren, July 2001

In September 2001, I was sat in an operations meeting when the pager went off and didn't stop: something big was happening. There was a massive influx of traffic to the site - a , it seemed. Damion called us back: "there was this plane...". We turned on a TV and saw a burning World Trade Center tower. Then another plane. Ops worked on keeping the servers happy, raising the webmaster and News to agree sheddable load. This was the first time, so it took a while to get a new light home page in place. Our New York server farm was two blocks from the WTC site; it survived but suffered as power failed. The dust eventually clogged the generators and there were problems getting in fuel. The only outage was in the days after; we covered that by moving all traffic to London. The sites were designed to operate as hot spares for each other. We had planned around London suffering at some point, but it was the opposite.
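A rough sketch of the load-shedding idea only: the post says just that a lightweight home page was agreed and put in place, so the filenames and capacity figure below are invented.

    # Above a traffic threshold, serve a small static page instead of the full one.

    FULL_PAGE = "homepage_full.html"     # the normal, image-heavy page (hypothetical name)
    LIGHT_PAGE = "homepage_light.html"   # stripped-down page for emergencies (hypothetical name)

    CAPACITY_REQS_PER_SEC = 5_000        # made-up figure for what the farm can sustain

    def page_to_serve(current_reqs_per_sec: int) -> str:
        """Switch to the lightweight page when demand exceeds what we can serve."""
        if current_reqs_per_sec > CAPACITY_REQS_PER_SEC:
            return LIGHT_PAGE
        return FULL_PAGE

    print(page_to_serve(1_200))    # quiet day -> full page
    print(page_to_serve(25_000))   # major-event surge -> light page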

91Èȱ¬T's first birthday party, April 2002

We resurrected podcasts in 2002. Until then, we'd been providing audio in Real, originally downloadable until we had to disable that due to rights problems. All along, we'd been trying to promote open standards for access to our content, but sadly support was low. I'd wanted to use an open standard for audio streaming in 1998 - some of our audio engineers were working on the standard - but the market was dominated by proprietary systems promoting their own codecs instead.

When Ogg Vorbis started, it seemed that an open standard would finally emerge and be widely supported. We ran trials of streamed and downloadable programmes, which some people used on portable audio players - podcasts in all but name. Really, we needed to do MP3 to make it easier for general use, but due to its associations with piracy, it wasn't acceptable to the rights holders.

I had similar ambitions for video - a common format - which is now starting to take off many years later. To control costs and make content universally available, we've always looked for common standards so that, as with TV and radio, the audience can choose which manufacturer or player to use, and so that the content only needs to be produced once, rather than in many formats. There is a significant cost for additional formats that could otherwise go towards new services.

Packing to leave Kingswood, September 2002

So far, I'd been running the department in Kingswood which housed the operations and development teams, the master content servers, streaming audio and video encoding, and content ingest. Then, in 2002, 91Èȱ¬ Technology formed, bought the remains of a dead dot.com and decided to move both operations to a site in Maidenhead. We lost half the staff in the move, and new management was brought in to look after the combined operation, later to be sold to Siemens.

The master content servers and stream encoding were at Kingswood until 2003

The infrastructure remained much the same after that, with obvious increases in capacity needed to keep up with demand. In 2007, to reduce costs, the USA network was closed down and all hosting consolidated over here...

...taking us back to where we started: commercial hosting in just the UK.

Brandon Butterworth is Principal Technologist, Kingswood Warren.

Comments

  1. At 02:31 PM on 18 Dec 2007, Matt wrote:

    Brandon, thanks so much for that in-depth look at the last ten years. I think you could entertain me for hours going into more detail, although I'm glad you didn't as I have work to do :)

    Thanks
    Matt

  2. At 05:14 PM on 18 Dec 2007, a commenter wrote:

    Brandon, thank you so much for your insight. It is great to hear (and see) the story behind the headlines. Obviously during September 11th, as a web reader you do not care about the 91Èȱ¬ infrastructure coping with demand but I am glad there was a team worried about all these things!

  3. At 06:13 PM on 18 Dec 2007, James Gawn wrote:

    Just wanted to add my thanks for the fascinating blog post.

  4. At 12:36 AM on 19 Dec 2007, Dickon Hood wrote:

    Yup, that's pretty much how I remember it...

    Much fun.

  5. At 09:17 AM on 21 Dec 2007, Alan Crosby wrote:

    Ahhh, memories. I look so much younger in the 91Èȱ¬T First Birthday picture! And I'm pretty sure I can make my reflection out (along with PaulB) in the reflection on the screen of the One Big Sunday webcasting kit (what's scary is that we actually used that webcasting kit the other day for an internal comms webcast. Apparently, it needs a new CMOS battery -- after 6 years, I'm unsurprised).

  6. At 12:45 PM on 09 Jan 2008, Viv Gregory (Layer3) wrote:

    Thanks for an amazing view of how it all came together; it's highly likely that we (Layer3 Systems Ltd) wouldn't have got started without the inspiration of seeing what you were doing with your team inside the 91Èȱ¬!

    Best Wishes.

  7. At 10:21 AM on 09 Apr 2008, James Porritt wrote:

    The best days of our lives - doubt I'll work anywhere with that sort of feeling again.

    (I'm in the bottom right 'Packing to leave Kingswood' shot, and in the 91Èȱ¬T first birthday shot. The denim jacket is long gone :)
