In a very short period, wireless local area networking (WLAN) has moved from an all-too-often insecure bonus connection or luxury for a few, to an essential need that has to be managed and delivered like any other part of IT infrastructure.
Even public Wi-Fi networks have mostly lost their hobbyist feel, along with the marketing strategies that were once oriented towards creating new income streams for coffee shops.
Wireless networks might not be using licensed spectrum in the same way that licensed public carriers operate, but they are becoming professionally delivered services that are relied upon as much as cellular mobile networks.
In buildings and across campuses – offices, factories, colleges and hospitals – wireless networks deliver high-bandwidth mobile connectivity where there is little or no cellular coverage.
Where once it was sufficient to reach formal and informal workstations and desks, mobile devices such as tablets, together with small smart things connecting via the internet of things (IoT), mean wireless coverage is expected and demanded everywhere.
4A quality service
Increasingly, demand for voice and video over WLAN means that the capacity to scale has become as important as coverage, and user expectations of a 4A quality service – anyone, anytime, anywhere on anything – have soared.
Delivering this is a growing challenge. While many in the industry still focus on the wireless end – radio signals and access points – there is much more activity in the area of fidelity: delivering integrated management and security.
Enterprises looking to extend or deploy new WLAN need to ensure they use wireless technology that will extend to every corner where access is required and provide sufficient capacity for users and their growing networked applications, yet still be straightforward to manage and secure.
Wireless standards are still emerging and evolving rather patchily from an enterprise perspective, and many organisations will have been affected by the poor way 802.11n was deployed. Older wireless devices that support only earlier standards can degrade a wireless network, either through interference or through the slower legacy modes a network must sometimes fall back to in order to support them. This meant some legacy, and especially very early, WLAN deployments needed to be ripped out and replaced – not just at the WLAN equipment level, but also at the device level.
However, the current prevailing flagship standard, 802.11ac, is designed to extend and co-exist with earlier deployments such as 802.11n and can be regarded as an evolutionary improvement.
The latest iteration of this standard, known as 802.11ac Wave 2, has been supported in mainstream smartphones and other mobile devices since late 2016 and is appearing in access point products from all major suppliers. Key to the increased performance is multi-user multiple input, multiple output (MU-MIMO), delivered by an array of antenna streams. Several suppliers were early proponents of this approach, including Ruckus (now part of Arris as its carrier-grade wireless offering) and Xirrus (now part of Riverbed).
Standardisation of the techniques means that not only can any multi-antenna device beamform to any other device at any time, but the receiving device can now assist in the process. This bi-directional approach, based on radio sounding, ensures the radio energy is steered towards its intended recipient.
This gives Wave 2 a boost in capacity and performance by simultaneously supporting multiple devices and additional channels, and by providing dedicated bandwidth per user. Although work continues on implementing the whole 802.11ac standard, on a third wave and on a further standard beyond it, 802.11ax, this should not hold enterprises back.
Wireless radio systems have moved on from narrow channels and slow, “amateurish” performance to multichannel, fast, professionally formed services. Those wanting ultimate performance in noisy and high-demand environments should look towards systems that offer bi-directional beamforming, and look for the larger number of concurrent streams – currently 4×4, rising to 8×8 in the next wave of 802.11ac.
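To see why stream counts matter, the theoretical 802.11ac PHY rate scales linearly with the number of spatial streams. The following is a simplified back-of-the-envelope sketch using figures from the published VHT tables (best-case modulation, short guard interval), ignoring real-world overheads and contention:

```python
# Rough theoretical 802.11ac (VHT) peak PHY rates.
# Simplified assumptions: 256-QAM (8 bits/subcarrier), rate-5/6 coding,
# short guard interval. Real throughput is considerably lower.

DATA_SUBCARRIERS = {20: 52, 40: 108, 80: 234, 160: 468}  # per channel width (MHz)
SYMBOL_TIME_US = 3.6          # OFDM symbol duration with short guard interval
BITS_PER_SUBCARRIER = 8 * 5 / 6  # 256-QAM carrying a rate-5/6 code

def phy_rate_mbps(streams: int, width_mhz: int = 80) -> float:
    """Peak PHY rate in Mbit/s for a given spatial-stream count and channel width."""
    return streams * DATA_SUBCARRIERS[width_mhz] * BITS_PER_SUBCARRIER / SYMBOL_TIME_US

for n in (1, 4, 8):
    print(f"{n}x{n}, 80 MHz: {phy_rate_mbps(n):.0f} Mbit/s")
# 1x1 gives ~433 Mbit/s; 4x4 ~1,733 Mbit/s; an 8x8 system doubles that again
```

The linear scaling is the point: each extra spatial stream adds another full channel's worth of peak capacity, which is why 8×8 systems are worth watching.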
In fact, 802.11ac is not the only wireless standard becoming important in many working environments: the higher frequency (60GHz) 802.11ad, also known as WiGig, is an interesting emerging proposition. Higher frequency means more capacity but shorter range, because the signals are absorbed very easily by the construction materials of a modern office, factory or home. However, in a similar way to how Bluetooth allows ad-hoc short-range wireless connectivity for audio – but at much higher bandwidth – this standard could offer the same for video and audio as a cable replacement technology.
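The range penalty at 60GHz can be illustrated with the standard free-space path loss formula; the sketch below compares it with 5GHz at the same distance, and deliberately ignores the extra oxygen absorption and material losses at 60GHz, which make matters worse still:

```python
import math

def fspl_db(distance_m: float, freq_ghz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_ghz * 1e9 / c)

# At any given distance, moving from 5 GHz to 60 GHz costs a fixed extra loss
# of 20*log10(60/5) dB, regardless of distance.
extra_loss = fspl_db(10, 60) - fspl_db(10, 5)
print(f"Extra path loss at 60 GHz vs 5 GHz: {extra_loss:.1f} dB")  # ~21.6 dB
```

Roughly 21.6dB of additional loss before walls and furniture are even considered, which is why WiGig is an in-room, cable-replacement proposition rather than a whole-building one.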
WiGig may have a widespread impact in organisations, especially in investments involving AV/IT integration or other IT to room facilities – docking stations, monitors, and so on. Increasingly, those responsible for wireless networking will need to be collaborating with those delivering AV experiences and those managing the building fabric of the workplace, as both wireless and the IoT bring all these elements closer together. Expect WiGig to become more important over the next couple of years, but not an immediate concern for those deploying WLAN today.
Wireless started off as a technology deployed in “islands of need”, which led to it having its own tools for management and security. Over time, this has become less tenable, so wired and wireless management needs to blur into a unified platform delivering connectivity. This is the route some companies are taking through acquisition, as wired and wireless suppliers merge – HP with Aruba, Extreme Networks with Enterasys several years ago and, latterly, with Avaya Networking.
WLAN architectures have changed over the years as access points have increased in number and slimmed down, with centralised controllers taking over more of the effort. The problem is that this can become unwieldy, making it difficult to apply policies, such as security, quickly enough. The flexibility to deploy, and to adapt rapidly as needs shift dynamically, requires a different architectural approach to networking.
Software-defined networking (SDN) began in the datacentre but has now spread to WAN. The
premise of a software-defined wide area network (SD-WAN) is a network virtualised and built on distributed open hardware, managed by centralised, remote software in the cloud. It is not necessarily about replacing exotic technology with low-cost simple hardware, but separating the capabilities. The hardware still provides the “muscle” required for high performance, with the cloud taking the “brains” to make management simpler.
Stable radio technology
This software-defined approach to networking is moving into WLAN and will no doubt lead to further mergers and acquisitions. Enterprises should feel assured that the radio technology is standardising and stable, but management systems will continue to change. For this reason, as well as the benefits of ensuring that consistent security policies can be propagated across the whole network, enterprises should increasingly look at cloud-based approaches to wireless network management, such as Ubiquiti airControl, ExtremeCloud, Aruba Central or Aerohive Connect.
As larger organisations virtualise their IT infrastructure, a cloud-managed architecture for WLANs comes into its own. It is more flexible, easier to scale and simpler to apply consistent policies across the network.
This makes it cheaper to deploy, manage and keep secure. If architected well, the control of policies is managed from the cloud, but individual access points are still smart enough to make critical decisions about the radio environment for load balancing, controlling interference, and so on.
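As a purely illustrative sketch of this split of responsibilities (the names and structure here are hypothetical, not any supplier's actual API), policy comes from the central controller while each access point keeps enough local intelligence to make radio decisions on its own:

```python
# Hypothetical sketch of the control split described above: the cloud
# controller owns policy ("brains"), while each access point retains local
# smarts ("muscle") for radio decisions such as channel selection.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Policy:
    """Centrally managed settings, pushed consistently to every AP."""
    ssid: str
    max_clients_per_ap: int = 50

@dataclass
class AccessPoint:
    name: str
    policy: Optional[Policy] = None
    clients: int = 0
    # Locally observed channel utilisation (fraction of airtime busy)
    channel_utilisation: dict = field(
        default_factory=lambda: {36: 0.2, 44: 0.7, 157: 0.4})

    def apply_policy(self, policy: Policy) -> None:
        self.policy = policy  # cloud pushes one consistent config everywhere

    def pick_channel(self) -> int:
        # Local decision: choose the least-busy channel this AP can see,
        # which still works if the controller is temporarily unreachable.
        return min(self.channel_utilisation, key=self.channel_utilisation.get)

    def admit(self) -> bool:
        # Local load-balancing: refuse clients beyond the centrally set cap,
        # nudging them towards a neighbouring AP.
        if self.policy and self.clients >= self.policy.max_clients_per_ap:
            return False
        self.clients += 1
        return True

ap = AccessPoint("ap-floor2")
ap.apply_policy(Policy(ssid="corp-wlan", max_clients_per_ap=2))
print(ap.pick_channel())  # 36, the least-utilised channel in this example
```

The design point is that losing the cloud connection degrades management, not service: the radio-level decisions live at the edge.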
This combination of central control and local power and intelligence makes the WLAN more resilient and reliable.
With wireless, rather than a fixed network, becoming the primary mode of access, it becomes increasingly important to understand the radio landscape from the experience of real users and their devices. Different mobile devices not only support different wireless standards, but their antennae have different characteristics. Traditionally, for example, laptops have better signal reception and wireless range than either tablets or smartphones. Increasingly, these smaller devices are being used in the workplace and for rich applications with high demands on the network capacity and latency, such as voice and video.
Measuring the coverage and capacity to deliver an acceptable user experience, in real time and in all locations, has become a very useful addition to the network manager's toolkit. This is important across campuses, hospitals and other multi-building estates, and tools such as Ubiquiti’s airLink simulator, Netscout’s AirCheck or Cisco’s CleanAir provide ways to make the invisible visible. This is useful not only for ensuring legitimate users have the coverage and capacity they require, but also for spotting rogue radio waves – whether from hostile security threats or simply signal interference.
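One common building block behind this kind of survey work is the log-distance path loss model, which turns a measured received signal strength into a rough range estimate. A simplified sketch (the reference level and path loss exponent are assumptions that must be calibrated per environment; an exponent around 3.0 is a typical figure for obstructed indoor spaces):

```python
def estimate_distance_m(rssi_dbm: float,
                        rssi_at_1m: float = -40.0,
                        path_loss_exp: float = 3.0) -> float:
    """Log-distance path loss model: RSSI(d) = RSSI(1m) - 10*n*log10(d).

    rssi_at_1m and path_loss_exp are illustrative assumptions; both must be
    calibrated for the actual building and hardware before being relied upon.
    """
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exp))

# A reading 30 dB below the 1-metre reference implies roughly 10 m away
print(f"{estimate_distance_m(-70.0):.1f} m")  # 10.0 m under these assumptions
```

It is a crude model – multipath and walls distort it badly – which is exactly why the commercial tools combine many such measurements from real client devices rather than trusting any single reading.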
The WLAN landscape is changing. Many of the specialist Wi-Fi companies have been acquired by companies with a wider proposition, offering either standardised network infrastructure or management. Choosing a wireless solution should now involve fewer considerations of the radio element; instead, increasing use by different types of devices and users in unusual locations – tablet, smartphone and IoT users could be anywhere – means radio wave capacity is in high demand.
Higher frequency bands, beam forming and MU-MIMO are helping the situation, but users have high demands and expectations. The vital element of any deployment will therefore be the mechanisms that assist in management, and increasingly this will mean a virtualised model with control in the cloud.
Rob Bamforth is a principal analyst at Quocirca.