In the beginning (the 1980s) there was 1G, although it was never called that at the time; the name came later, as the other Gs began to take shape. Mobile telephony started with the analogue phone, often referred to as “the brick”. These handsets were bulky and limited to making telephone calls: no photo messages, texts or internet browsing were available then. Motorola was the first company to produce a handheld mobile phone.
On April 3, 1973 Martin Cooper, a Motorola researcher and executive, made the first mobile telephone call from handheld subscriber equipment, placing a call to Dr Joel S Engel of Bell Labs. The prototype handheld phone used by Dr Cooper weighed 1.1 kg and measured 23 cm long, 13 cm deep and 4.45 cm wide. The prototype offered a talk time of just 30 minutes and took 10 hours to re-charge.
If you’d asked someone in the telephony industry back in the early days of digital what the G stood for, they’d probably have said GSM – originally a European standard called Groupe Spécial Mobile. The name changed to Global System for Mobile Communications when the industry realised that people who didn’t speak French might also want to buy it.
GSM arose from a European mandate which declared that there should be a single interoperable mobile standard available to anyone. Previously, a traveller could in theory have needed several different phones to stay in touch with the office while travelling throughout Europe.
The original GSM became what is now referred to as 2G, which stood for Second Generation. This new digital approach to telephony brought clearer audio for phone calls and enabled the transfer of data at 9.6kbps, later increasing to a then “blistering” 57.6kbps. In reality, few networks or phones supported speeds of more than 28.8kbps. GSM/2G heralded the start of mobile digital data.
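To put those rates in perspective, a quick back-of-the-envelope calculation (a hypothetical Python sketch, not from the original text) shows how long even a modest file transfer took at 2G-era speeds:

```python
# Rough transfer-time comparison for 2G-era data rates.
# Rates are nominal line rates in kilobits per second; real-world
# throughput was lower still once protocol overhead was accounted for.

RATES_KBPS = {
    "GSM at 9.6 kbps": 9.6,
    "GSM at 28.8 kbps": 28.8,
    "GSM at 57.6 kbps": 57.6,
}

def transfer_seconds(size_kb: float, rate_kbps: float) -> float:
    """Seconds needed to move size_kb kilobytes at rate_kbps kilobits/sec."""
    return size_kb * 8 / rate_kbps  # 8 bits per byte

for name, rate in RATES_KBPS.items():
    # A 100 KB image -- tiny by today's standards
    print(f"{name}: {transfer_seconds(100, rate):.0f} s for a 100 KB file")
```

At the original 9.6kbps, that 100 KB file takes well over a minute; even at the headline 57.6kbps it is still around 14 seconds, which is why 2G data was largely confined to text.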
But there was a problem preventing the mobile providers from using 2G to capitalise on the machine-to-machine (M2M) business and thereby earning regular, recurring revenue from devices which wouldn’t be calling up customer support every time something went wrong. The problem was that GSM data was circuit switched, meaning that, like a dial-up modem, the caller had to establish a phone call before any data could be sent.
This led to the development of 2.5G, more commonly referred to as GPRS (General Packet Radio Service), which brought the Internet model of packet connectivity to mobile, with a maximum data rate of 114kbps. Few networks supported rates above 56kbps, however. The development of GPRS led to the first appearance of an internet browser on the mobile phone.
It became clear that 2G was limited, and so development began on 3G, which would be able to cope with an increasing number of users and higher data throughput, making usable data rates of 100kbps and beyond a reality. But there was a spanner in the works.
Governments around the world had been licencing the spectrum for GSM networks (basically the frequency bands or frequency ranges on which cellular devices broadcast) without really taking note of what was being done with them. Once they saw money was being made from the granting of these licences, they threw open the playing field, resulting in bidding wars and an explosion of potential Mobile Network Operators (MNOs) trying to justify why they should be granted a licence to operate.
In the sea of chaos which followed, good things did arise; the creation of video calling and Multimedia Messaging were two of the most memorable. When things settled down, 3G carried on in much the same way as it had when it was still 2G, and began to justifiably call itself a mobile broadband experience.
Operators, notably Cellnet in the UK, had tried to pre-empt this using GPRS when they rolled out WAP (Wireless Application Protocol) in 2002. It became more commonly known as the Worthless Application Protocol (a name attributed to The Globe and Mail’s “Survivor’s guide to wireless wonkery”, published on September 23, 2005) or “Wait and Pay” because of its sluggish response. However, as the technology improved, so did WAP.
The iPhone phenomenon
Then something happened to mobile telephony which could never have been predicted.
Apple launched the iPhone in 2007. The iPhone was a revelation. It was, to all intents and purposes, a mini personal computer, and it took off like a rocket. However, its higher price meant many users weren’t willing to also pay higher rates to access the mobile network, and there was an explosion in the use of Wi-Fi networks to get around paying for a data package contract.
The faster Wi-Fi data rates began to make mobile video a more compelling experience, which led to market pressure to replicate it when using 3G. Mobile operators began offering “all you can use” deals and, as a result, the network capabilities started to reach capacity.
Wi-Fi in mobile handsets wasn’t new. The industry had been working with it as far back as 2004, when work began on UMA (Unlicensed Mobile Access), a method to allow phones to roam seamlessly between mobile and Wi-Fi networks.
Just two years after the launch of the iPhone, it had become clear that 3G networks were going to be overwhelmed by the growth of bandwidth-intensive applications like streaming media. As a result, the industry began looking to data-optimized 4th-generation technologies, with the promise of speed improvements up to 10-fold over existing 3G technologies. The first two commercially available technologies billed as 4G were the WiMAX standard (offered in the US by Sprint) and the LTE standard, first offered in Scandinavia by TeliaSonera.
Then another spanner in the works arose with the development of “over the top” (OTT) apps like Skype, WhatsApp and Snapchat, which replicated the “all you can use” data packages but were either free or far cheaper to use. This coincided with the rise of social media networks like Facebook and Twitter and suddenly the world was wanting to use its phones for mass communication in an instant.
Again there was a need for expanded capacity, and the development gurus at the 3GPP (3rd Generation Partnership Project) came up with LTE (Long Term Evolution), an evolution of the 3G standard offering yet more data. But, bizarrely, it had no voice capability – a sure sign of the rising importance of data transfer via social networks over the old-fashioned “making a call”. The MNOs quickly realised this was going to be a problem, as they were looking at having to switch off some of the old 2G and 3G networks to give the new 4G technologies enough spectrum, and so VoLTE (Voice over LTE) was developed.
One of the main ways in which 4G differed technologically from 3G was in its elimination of circuit switching, instead employing an all-IP network. Thus, 4G ushered in the treatment of voice calls just like any other type of streaming audio media, utilising packet switching over the internet, LAN (Local Area Network) or WAN (Wide Area Network) via VoIP (Voice over Internet Protocol).
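The arithmetic behind treating a call as packets is simple enough to sketch. The figures below are the standard ones for a G.711 (64kbps PCM) call carried over RTP, though the sketch itself is an illustration rather than anything from the text:

```python
# Illustrative VoIP packetization arithmetic for a G.711 call:
# a 64 kbps codec with 20 ms of audio carried in each RTP packet.
# Header sizes are the usual IPv4 (20) + UDP (8) + RTP (12) bytes.

CODEC_BPS = 64_000      # G.711 PCM bitrate
PACKET_MS = 20          # milliseconds of audio per packet
HEADER_BYTES = 40       # IPv4 + UDP + RTP headers

payload_bytes = CODEC_BPS // 8 * PACKET_MS // 1000   # audio bytes per packet
packets_per_sec = 1000 // PACKET_MS                  # packets sent each second
wire_bps = (payload_bytes + HEADER_BYTES) * 8 * packets_per_sec

print(f"{payload_bytes} B payload, {packets_per_sec} pkt/s, {wire_bps} bps on the wire")
```

The call becomes a steady stream of 50 small packets a second, indistinguishable to the network from any other audio stream, which is precisely the point of the all-IP design.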
As Nick Hunn, CTO at WiFore, says in his What is 5G? And do we need it? blog on LinkedIn:
“This highlights another aspect of the change in power within the industry, with phone manufacturers taking decisions away from network operators. For many years the network operators bought the vast majority of phones and sold them on to consumers. The operators strongly resisted any feature going into the phone which didn’t make them money.
They couldn’t see the point of GPS, sensors or cameras. They eventually relented on cameras when they tried to promote video calls using 3G, only to find that users had a totally different idea about how to use them. Whereas the operators used to control the specification of the phones they sold, today those decisions are made by phone manufacturers who now design them for customer appeal, not for network operators.”
According to Hunn, this same change in the balance of power towards the phone manufacturers themselves instead of the network operators, has come because of Wi-Fi. Before 2005 few handsets had Wi-Fi chips and the MNOs resisted the additional cost. As the internet became commonplace, this changed rapidly with phones using Wi-Fi to support data rates of multiple Mbps and today nearly every smartphone has Wi-Fi capability – driven almost entirely by user demand. Hunn adds:
“The irony is that many of those who were responsible for its early years of growth still fail to realise that they are no longer leading, rather than being led. The original pioneers of the mobile phone experience – Nokia, Siemens, Philips and Ericsson – are gone. Motorola soldiers on in name alone, having been sold to a Chinese laptop manufacturer, and Blackberry is as bitter as its namesake after Michaelmas.”
So what is 5G and where will the mobile telephony industry go next? It’s unclear at the moment what will focus the development of the next generation protocols. We don’t know what the 5G network will look like or where it will come from.
But there are development signs already.
Prof Rahim Tafazolli, who leads the UK’s multimillion-pound Government-funded 5G Innovation Centre at the University of Surrey, points to the opportunity for properly connected smart cities, remote surgery, driverless cars and the “internet of things”. Prof Tafazolli now believes it is possible to run a wireless data connection at an astounding 800Gbps – that’s 100 times faster than current 5G testing.
The huge rise in connected devices will be due to a boom in inanimate objects using the 5G network – known as the “Internet of Things” (IoT). It won’t just be products like remotely controlled heating or that mythical fridge ordering you more milk: trains could tell you which seats are free while they are in the station. Devices will be able to choose dynamically between three still-to-be-determined bandwidths to prevent any of the frequencies from becoming overloaded, explained Prof Tafazolli.
Samsung Electronics announced in 2013 that it had successfully developed the world’s first adaptive array transceiver technology operating in the millimetre-wave Ka bands for cellular communications. The new technology will sit at the core of the 5G mobile communications system and will provide data transmission up to several hundred times faster than current 4G networks. 5G will be capable of providing a ubiquitous Gbps experience to subscribers anywhere, offering data transmission speeds of up to several tens of Gbps per base station.
Ofcom, the UK’s communications watchdog, has announced that it is taking early steps to see how 5G technology might be made available, despite the fact many British mobile phone users are only just switching to 4G networks.
Multiple-input multiple-output (MIMO) technology is set to be a key part of these efficiency measures, according to researchers. MIMO uses several small antennas to serve individual data streams. Samsung’s impressive download speeds were delivered using the technology.
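The reason MIMO matters is captured by a textbook approximation: under rich scattering, channel capacity grows roughly linearly with the smaller of the transmit and receive antenna counts. The sketch below illustrates that scaling with hypothetical figures, and is not a claim about any specific 5G system:

```python
import math

# Idealised MIMO capacity scaling (textbook approximation):
#   C ~ min(Nt, Nr) * B * log2(1 + SNR)
# where Nt/Nr are antenna counts, B is bandwidth in Hz and SNR is linear.

def mimo_capacity_bps(n_tx: int, n_rx: int,
                      bandwidth_hz: float, snr_linear: float) -> float:
    """Approximate aggregate capacity of an Nt x Nr MIMO link."""
    return min(n_tx, n_rx) * bandwidth_hz * math.log2(1 + snr_linear)

# A hypothetical 20 MHz channel at a linear SNR of 15 (about 11.8 dB)
for antennas in (1, 2, 4, 8):
    c = mimo_capacity_bps(antennas, antennas, 20e6, 15)
    print(f"{antennas}x{antennas}: {c / 1e6:.0f} Mbps")
```

Doubling the antenna count at both ends roughly doubles the achievable rate without using any extra spectrum, which is why base stations bristling with small antennas are central to the 5G efficiency story.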
5G is also likely to use many more base stations, including macro sites and smaller stations employing a range of radio technologies, to ensure better coverage. The Australian minister for communications, Malcolm Turnbull, even suggested that there could soon be a 5G base station on every home and lamppost. If it works well down under, the measure could be adopted throughout the world.