From 1G to 5G: A Brief History of the Evolution of Mobile Standards

On December 1, 2018, South Korea became the first country to offer 5G (the fifth generation of mobile wireless standards), and it’s fair to say that the mobile industry has made breathtaking advances since the first mobile phone call was placed back in 1973. Mobile devices have reshaped our world in ways we never could have predicted. Most countries plan to begin adopting 5G in 2020, a shift that is set to help drive the Internet of Things (IoT) and big data.

Every successive generation of wireless standards – abbreviated to “G” – has introduced dizzying advances in data-carrying capacity and decreases in latency, and 5G will be no exception. Although formal 5G standards have yet to be set, 5G is expected to be at least three times faster than the current 4G standard.

To truly understand how we got here, it’s useful to chart the unstoppable rise of wireless standards from the first generation (1G) to where we are today, on the cusp of a global 5G rollout.

1G: Where it all began


The first generation of mobile networks – or 1G as they were retroactively dubbed when the next generation was introduced – was launched by Nippon Telegraph and Telephone (NTT) in Tokyo in 1979. By 1984, NTT had rolled out 1G to cover the whole of Japan.

In 1983, the US approved its first 1G operations, and Motorola’s DynaTAC became one of the first ‘mobile’ phones to see widespread use stateside. Other countries, such as Canada and the UK, rolled out their own 1G networks a few years later.

However, 1G technology suffered from a number of drawbacks. Coverage was poor and sound quality was low. There was no roaming support between operators and, because different systems operated on different frequency ranges, there was no compatibility between them. Worst of all, calls weren’t encrypted, so anyone with a radio scanner could listen in on a call.

Despite these shortcomings and the DynaTAC’s hefty $3,995 price tag ($9,660 in today’s money), 1G still managed to rack up an astonishing 20 million global subscribers by 1990. There was no turning back; the success of 1G paved the way for the second generation, appropriately called 2G.

2G: The Cultural Revolution


The second generation of mobile networks, or 2G, was launched under the GSM standard in Finland in 1991. For the first time, calls could be encrypted and digital voice calls were significantly clearer with less static and background crackling.

But 2G was about much more than telecommunications; it helped lay the groundwork for nothing short of a cultural revolution. For the first time, people could send text messages (SMS) and picture and multimedia messages (MMS) on their phones. The analog past of 1G gave way to the digital future presented by 2G, leading to mass adoption by consumers and businesses alike on a scale never before seen.

Although 2G’s transfer speeds were initially only around 9.6 kbit/s, operators rushed to invest in new infrastructure such as mobile cell towers. By the end of the era, speeds of 40 kbit/s were achievable and EDGE connections offered speeds of up to 500 kbit/s. Despite relatively sluggish speeds, 2G revolutionized the business landscape and changed the world forever.
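To put those figures in perspective, here’s a back-of-the-envelope sketch (the speeds are the ones quoted above; the 1 MB file size is an illustrative assumption, not a figure from the era):

```python
# Rough illustration: how long a hypothetical 1 MB file would take to
# download at each of the headline 2G-era speeds quoted above.
SPEEDS_KBIT_S = {"early 2G": 9.6, "late 2G": 40, "EDGE": 500}

def download_seconds(size_mb: float, kbit_s: float) -> float:
    """Seconds to transfer size_mb megabytes at kbit_s kilobits per second."""
    return (size_mb * 8 * 1000) / kbit_s  # MB -> kilobits, then divide by rate

for era, speed in SPEEDS_KBIT_S.items():
    print(f"{era}: {download_seconds(1, speed):.0f} s")
```

At 9.6 kbit/s, that single megabyte takes nearly 14 minutes; EDGE cuts it to about 16 seconds – a vivid sense of why operators raced to build faster networks.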

3G: The ‘Packet-Switching’ Revolution


3G was launched by NTT DoCoMo in 2001 and aimed to standardize the network protocol used by vendors. This meant that users could access data from any location in the world as the ‘data packets’ that drive web connectivity were standardized. This made international roaming services a real possibility for the first time.  

3G’s increased data transfer capabilities (four times faster than 2G) also led to the rise of new services such as video conferencing, video streaming, and voice over IP (such as Skype). In 2002, the BlackBerry was launched, and many of its powerful features were made possible by 3G connectivity.

The twilight era of 3G saw the launch of the iPhone in 2007, meaning that 3G’s network capacity was about to be stretched like never before.

4G: The Streaming Era


4G was first deployed in Stockholm, Sweden, and Oslo, Norway, in 2009 as the Long Term Evolution (LTE) 4G standard. It was subsequently introduced throughout the world and made high-quality video streaming a reality for millions of consumers. 4G offers fast mobile web access (up to 1 gigabit per second for stationary users), which facilitates gaming services, HD video, and high-quality video conferencing.

The catch was that while transitioning from 2G to 3G was as simple as switching SIM cards, mobile devices needed to be specifically designed to support 4G. This helped device manufacturers scale their profits dramatically by introducing new 4G-ready handsets, and it was one factor behind Apple’s rise to become the world’s first trillion-dollar company.

While 4G is the current standard around the globe, some regions are plagued by network patchiness and have low 4G LTE penetration. According to Ogury, a mobile data platform, UK residents can only access 4G networks 53 percent of the time, for example.

5G: The Internet of Things Era


With 4G coverage so low in some areas, why has the focus shifted to 5G already?

5G has actually been years in the making.

During an interview with TechRepublic, Kevin Ashton described how he coined the term “the Internet of Things” – or IoT for short – during a PowerPoint presentation he gave in the 1990s to convince Procter & Gamble to start using RFID tag technology.

The phrase caught on and IoT was soon touted as the next big digital revolution that would see billions of connected devices seamlessly share data across the globe. According to Ashton, a mobile phone isn’t a phone; it’s the IoT in your pocket – a collection of network-connected sensors that help you accomplish everything from navigation to photography to communication and more. The IoT will see data move out of centralized server centers and into what are known as ‘edge devices’, such as Wi-Fi-enabled fridges, washing machines, and even cars.

By the early 2000s, developers knew that 3G and even 4G networks wouldn’t be able to support a network of that scale. Because 4G’s latency of between 40ms and 60ms is too slow for real-time responses, a number of researchers started developing the next generation of mobile networks.

In 2008, NASA helped launch the Machine-to-Machine Intelligence (M2Mi) Corp to develop IoT and M2M technology, as well as the 5G technology needed to support it. In the same year, South Korea developed a 5G R&D program, while New York University founded the 5G-focused NYU WIRELESS in 2012.

The superior connectivity offered by 5G promises to transform everything from banking to healthcare. 5G opens the door to innovations such as remote surgery, telemedicine, and even remote vital-sign monitoring that could save lives.

Three South Korean carriers – KT, LG Uplus and SK Telecom – rolled out live commercial 5G services in December 2018 and have promised a simultaneous March 2019 consumer launch across the country.

As we’ve seen, 5G stands poised to act as the mobile network of the future, helping to make the IoT a reality. This wouldn’t have been possible without the steady march of technological progress from 1G to the present day. As Ashton points out, the IoT isn’t just “the refrigerator talking to the toaster”; it’s a way to facilitate countless increases in human productivity.

One caveat is that unlike previous generations such as 3G and 4G, which could piggyback on the infrastructure left by the generation before them, 5G is far more expensive and complicated to implement. 5G requires many more base stations than 4G, positioned closer together, raising as-yet unstudied questions about possible health effects. According to Bloomberg, upgrading to 5G could collectively cost the tech industry over $200 billion, and the benefits may not be worth the costs.

I’ll dive into the controversies and complications surrounding 5G in a future post… Stay tuned!