Facebook’s datacenters at the edge of the Arctic Circle in Luleå, Sweden need sufficient connectivity to deliver performance, now and in the future, for everything Facebook expects its users to do. Standard wide area network connectivity is not an option at this scale – leased lines would be far too expensive, while chaining repeated metro Ethernet connections would introduce too much latency into the network.
Because of these challenges, Facebook has had to look beyond “normal” approaches to connecting Luleå to the rest of its network. It has chosen to implement a wide area network capable of carrying up to 8 Tb/s of traffic from Luleå across its major European hubs. 8 Tb/s is enough, Facebook says, to carry over 1 million concurrent high definition video streams.
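That headline figure is easy to sanity-check. A minimal back-of-envelope sketch, assuming roughly 8 Mb/s per high definition stream (the per-stream bitrate is an assumption here, not a Facebook figure – the article only gives the 8 Tb/s total and the “over 1 million streams” claim):

```python
# Back-of-envelope check of the claim that an 8 Tb/s backbone
# can carry over 1 million concurrent HD video streams.

BACKBONE_BPS = 8e12    # 8 Tb/s Lulea-to-Europe capacity (article figure)
HD_STREAM_BPS = 8e6    # assumed ~8 Mb/s per HD stream (not from the article)

concurrent_streams = BACKBONE_BPS / HD_STREAM_BPS
print(f"{concurrent_streams:,.0f} concurrent HD streams")  # 1,000,000
```

At ~8 Mb/s per stream the numbers line up exactly with the claim; at lower HD bitrates the backbone would carry correspondingly more streams.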
The Technology Behind Facebook’s WAN
There are only a few technology providers on the market who could provide this capability. Building an almost 4,000 km network with no data regeneration along the way is no simple matter. An Intelligent Transport Network (ITN) from Infinera, using its DTN-X platform and its FlexCoherent network optimization modules, enables Facebook to run greater-than-terabit-per-second connections over a single optical fiber. Using DTN-X, multiple photonic integrated circuits (PICs) can be aggregated to provide the desired bandwidth – meaning Facebook can grow its inter-facility bandwidth as its needs grow… without the need for constant forklift upgrades.
What Does This Mean To Facebook Users?
Facebook has around 1.4 billion active users worldwide, with around 350 million of those in Europe. Flexible inter-facility bandwidth capability for Facebook itself will certainly make data flows from one datacenter to another fast and effective. Simply by improving data accessibility across the European hubs, overall performance should improve for Facebook’s end users.
However, it is not just a case of speeding up the core – the connection between the user and Facebook may well be across a 3G (or worse) connection.
Does this matter?
Well, yes – as organizations become heavier users of Facebook for their outreach, the performance of Facebook matters to them just as much as the performance of the brand’s own website. Whereas an organization can do something about its own website (using content delivery networks (CDNs) and optimized code), it can do little about the overall performance of Facebook.
But is the use of such a long-distance optical network enough for Facebook? Not really – it also needs to go even further, enabling the same kind of connectivity across the Atlantic and bringing the US and Europe together as peer systems. It also needs to extend this improved connectivity to other regions – such as from the US to Asia/Australasia, and from Europe to Africa – to complete a low-latency, high-speed peer global network.
Software-Defined Social Network?
The social networking giant must also address the issue of the connection from the user to the network. Infinera’s aforementioned ITN can be used in a Software-Defined mode…and this may give a hint to where Facebook could go in the future. New clients could be created across a range of devices that use Software-Defined WAN (SD-WAN) constructs to optimize the user’s end-to-end experience, using WAN acceleration approaches across the public internet.
This will become more important as users move toward sharing higher quality video content. Facebook may choose to downgrade video quality (which requires in-line conversion), or it could embrace 4K content and take on the likes of Netflix, Amazon, and others in the content space – but only if it can get the required data across that public network.
With a single stream of 4K content requiring around 12 Mb/s, Facebook’s European backbone should have no problem supporting its European user base when it comes to making 4K content available. However, with 20 Mb/s connectivity still relatively uncommon across Europe, Facebook could find business users being led to believe that high quality video streams are worth paying for on its network, only for their audiences to blame them for buffering, poor lip sync, and skipping.
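The backbone-versus-last-mile gap described above can be sketched with the article’s own figures. A minimal calculation, assuming the cited 12 Mb/s per 4K stream and a 20 Mb/s access link as representative (real household links and codec bitrates will vary):

```python
# Rough arithmetic behind the 4K discussion: the backbone has
# enormous headroom, while a typical access link has very little.

BACKBONE_BPS = 8e12      # 8 Tb/s European backbone (article figure)
UHD_STREAM_BPS = 12e6    # ~12 Mb/s per 4K stream (article figure)
ACCESS_LINK_BPS = 20e6   # 20 Mb/s access link cited in the article

backbone_streams = BACKBONE_BPS // UHD_STREAM_BPS
spare_fraction = (ACCESS_LINK_BPS - UHD_STREAM_BPS) / ACCESS_LINK_BPS

print(f"backbone carries ~{backbone_streams:,.0f} concurrent 4K streams")
print(f"one 4K stream leaves only {spare_fraction:.0%} of a 20 Mb/s link free")
```

The imbalance is the point: the core can serve hundreds of thousands of simultaneous 4K streams, but a single stream consumes 60% of a 20 Mb/s connection, leaving little margin for contention, other household traffic, or wireless loss – hence the buffering complaints landing on the content owner.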
New technologies and video codecs can help ease the burden in this area, with a hybrid hardware/software approach (Perseus) from V-Nova showing particular promise. If such an approach can gain enough acceptance to become a de facto standard, then for companies such as Facebook looking to provide a solid end-user experience via a high-performance, low-latency global network, a mix of SD-WAN plus advanced codec support may well keep users happy.
Image credit: Luleå Data Center