The demand for fast, flawless network connectivity continues to grow as our world becomes increasingly digitally interconnected. High-speed 4G and 5G networks provide instant access to data-intensive apps, games and videos on billions of devices, while the Internet of Things (IoT) allows those devices to interact seamlessly with one another. Augmented Reality (AR), autonomous vehicles and smart home systems, such as virtual assistants and smart appliances, will soon be standard in our homes. New analytical tools, including real-time aircraft tracking and maintenance, credit card fraud detection and high-frequency financial trading, are made possible by the powerful predictive abilities of big data.
In short, we are addicted to our data. To put things into perspective, about four billion smartphones are in use today. With the world facing a pandemic, internet traffic has increased dramatically: Netflix, YouTube and Amazon Prime have reduced the quality of their streams to cope with the influx. Every one of these devices ultimately connects to a data centre somewhere on Earth, which is why data-driven companies are constantly looking for ways to manage user data demand and the high traffic that comes with it. As a result, data centres have become a remarkable, fast-growing service industry.
However, data centre growth creates a different set of problems. Connectivity is only possible with an effective and reliable network that can manage the large volume of big data. Data-driven companies are constantly trying to meet demand, and one solution is to use “hyperscale” data centres – facilities occupying more than 200,000 square feet.
More Fibre in Data Centres
Inside these hyperscale data centres, millions of servers operate together over fibre optic networks to handle the massive data traffic generated by users. These networks typically consist of hundreds of thousands of metres of fibre optic cable and hundreds of thousands of optical connections, all of which enable fast and efficient data handling. Such a massive concentration of servers runs very hot and requires extensive cooling. Companies therefore rely on climate-control systems to lower the indoor temperature, which consume enormous amounts of electricity and, in turn, produce large quantities of greenhouse gases.
Another problem is their massive footprint. Because of their size, hyperscale data centres are often built in remote areas where open land is plentiful and cheap. However, a far-away data centre can suffer performance problems such as latency. These delayed responses are unacceptable for many critical applications, such as GPS responses for military systems, remote medical monitoring and diagnostics, or vital financial transactions.
One way to reduce the land footprint of hyperscale data centres is to squeeze more fibre into a smaller cable. Many fibre-optic cable makers have changed their cable construction to pack thousands of optical fibres into a single cable. Recently, a UHCF (Ultra-High Count Fibre) cable was introduced that packs 6,912 fibres into one cable. Such a cable carries two to three times the data in even less space, reducing the land a data centre requires. This in turn allows for more accessible data centre locations and better energy efficiency.
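To get a rough sense of what the higher fibre count means, the back-of-envelope sketch below compares the 6,912-fibre UHCF cable mentioned above with a hypothetical conventional high-count cable. The 1,728-fibre baseline and the 100 Gb/s per-fibre rate are illustrative assumptions, not vendor specifications.

```python
# Back-of-envelope cable capacity comparison.
# Fibre counts and per-fibre rate below are illustrative assumptions,
# not specifications of any particular product.

def aggregate_capacity_tbps(fibre_count, gbps_per_fibre):
    """Total cable capacity in Tb/s, assuming every fibre runs at the same rate."""
    return fibre_count * gbps_per_fibre / 1000

# Assumed per-fibre rate: 100 Gb/s, a common data-centre link speed.
conventional = aggregate_capacity_tbps(1728, 100)  # assumed conventional cable
uhcf = aggregate_capacity_tbps(6912, 100)          # UHCF cable from the article

print(f"Conventional 1,728-fibre cable: {conventional:.0f} Tb/s")
print(f"UHCF 6,912-fibre cable:         {uhcf:.0f} Tb/s")
print(f"Capacity ratio:                 {uhcf / conventional:.1f}x")
```

Under these assumptions, the UHCF cable offers four times the aggregate capacity of the conventional cable in a similar or smaller cross-section, which is the space saving the article describes.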
With higher fibre-count cables, more of the connectors’ end faces become vulnerable to contamination. All connectors eventually get dirty because their moving parts – such as springs and latches – generate wear debris. This can lead to fibre network problems such as insertion loss (a weakened signal), back-reflection (signal diverted back towards its source), or even a complete system shutdown. Moreover, multi-fibre connectors are more prone to micro-scratches, and their moulded plastic attracts dust that can be difficult to clean.
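Both insertion loss and back-reflection are quantified in decibels as simple ratios of optical power. The sketch below shows the standard formulas; the power values used are illustrative assumptions, not measurements from any specific connector.

```python
# Standard dB formulas for optical connections.
# Power values below are illustrative assumptions only.
import math

def insertion_loss_db(p_in_mw, p_out_mw):
    """Insertion loss in dB: how much power the connection attenuates (lower is better)."""
    return 10 * math.log10(p_in_mw / p_out_mw)

def return_loss_db(p_in_mw, p_reflected_mw):
    """Return loss in dB: how little power is reflected to the source (higher is better)."""
    return 10 * math.log10(p_in_mw / p_reflected_mw)

# A clean connection might pass ~95% of the light; a dirty end face far less.
print(f"Clean connector:  {insertion_loss_db(1.0, 0.95):.2f} dB insertion loss")
print(f"Dirty end face:   {insertion_loss_db(1.0, 0.50):.2f} dB insertion loss")
print(f"Back-reflection:  {return_loss_db(1.0, 0.0001):.0f} dB return loss")
```

The dirty end face here loses half the optical power, about 3 dB, which illustrates why end-face inspection and cleaning matter more as connector counts grow.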
Surge in Internet Traffic
Another challenge faced by hyperscale data centres is the recent unprecedented surge in internet traffic caused by an influx of work-from-home activity. Many businesses require employees to engage in video conferencing, schools require students to attend online classes via similar platforms, while others turn to online streaming services, such as YouTube and Netflix, to fulfil their entertainment needs. Voice communications, particularly Wi-Fi calling, have seen activity increase by up to 100%. Wireless network operators are working tirelessly to handle these growing levels of activity, but the surge still causes network congestion and slows internet speeds.
Despite these hiccups, the pandemic is actually driving the biggest internet expansion in years. People are using the internet more widely, and usage peaks are now spread across different times of day. Internet health checks from Ookla, creator of the popular internet speed test, indicate that broadband is in overall good health despite some minor slowdowns. Finally, the current situation may encourage newer technology to be deployed earlier than expected: the increased demand has strengthened the case for more robust, faster, next-generation wireless service. 5G offers download speeds up to 100 times faster than 4G, so the crisis may actually accelerate 5G adoption.
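The claimed speed-up translates directly into download time. The sketch below compares download times for a file at an assumed 4G peak rate and at a 5G rate 100 times faster; the file size and the 4G speed are illustrative assumptions, not network measurements.

```python
# Illustrative download-time comparison.
# The file size and 4G peak rate are assumptions; 5G is modelled as the
# article's "100 times faster".

FILE_GB = 2.0                          # e.g. a feature-length HD film (assumed)
SPEED_4G_MBPS = 100                    # assumed 4G peak rate
SPEED_5G_MBPS = 100 * SPEED_4G_MBPS    # 100x faster, per the article

def download_seconds(size_gb, speed_mbps):
    """Seconds to download size_gb gigabytes at speed_mbps megabits per second."""
    return size_gb * 8000 / speed_mbps  # 1 GB = 8000 megabits

print(f"4G: {download_seconds(FILE_GB, SPEED_4G_MBPS):.0f} s")
print(f"5G: {download_seconds(FILE_GB, SPEED_5G_MBPS):.1f} s")
```

Under these assumptions, a download that takes over two and a half minutes on 4G finishes in under two seconds on 5G.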
Benet Leong is the Technical Solutions Manager for fibre instruments and also leads the business development efforts for enterprise and data centres in the SEA region for VIAVI Solutions. He has 15 years of professional experience in Japan and SEA in the network test and measurement industry, including fibre testing, network infrastructure testing and management, as well as software development for enterprise systems.