What Is Edge Computing?
It used to be that computers were complex, bulky machines you only saw in offices. If you were lucky enough to use a computer terminal, you were a pioneer of computing technology. It was the personal computer that first gave ordinary people the chance to own that hardware themselves.
Since then, plenty of technologies have come and gone. We now live in a world where computers fit in the palm of your hand and go anywhere you go. Some people still own a personal computer, but it gets used for an ever-shrinking set of tasks. Everything is on the cloud now – your files, your documents, the movies you watch, and the music you listen to. When was the last time you bought a DVD set of a TV show you liked?
The thing with cloud computing is that most organizations and people rely on a few providers. The names IBM, Amazon, Google, and Microsoft are the giants when it comes to machine learning, hosting, compute resources, and infrastructure in the cloud.
Among public cloud providers, Amazon is king, with over $12 billion in revenue.
All that being said, it means the opportunities for growth in cloud computing itself are limited. Everything that can be put on the cloud is already there, which leaves little room for a genuinely new service or product.
And this is the reason why edge computing is fast becoming THE buzzword right now.
What is edge computing?
The thing with the cloud is that it is all centralized. If you need something, you access it from the cloud. If you need to process something, you do it through the cloud. If you want to store something, you put it on the cloud. All that information going back and forth is going to take time, eat up bandwidth, and make you more vulnerable to cyber attacks.
Edge computing helps you avoid all that. It optimizes cloud computing systems by processing your requests near the source rather than sending them to data centers hosted far away. For most cloud services, this means relying less on the provider’s servers and running artificial intelligence or machine learning services locally.
But edge computing does not only mean local processing. It can also mean having a server physically closer to you. So instead of sending your data halfway across the world (which takes more time, uses more bandwidth, and gives hackers more chances to intercept it), you send it to a much closer server – often a micro data center of 100 sq. ft. or less.
A good example to illustrate this is an Amazon Alexa speaker. When you ask Alexa whether it is going to rain today, the speaker sends your query to Amazon’s data centers as a compressed file. Amazon’s systems uncompress it, figure out what you asked (the weather), connect to a weather API for the answer, and then send that answer back to your Alexa speaker. All that work!
With edge computing, the processing happens locally. Instead of going through a middleman (Amazon’s servers), Alexa would figure out what you said and fetch the answer you want on its own. This is why Amazon is currently working on artificial intelligence chips for its speakers.
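The contrast between the two paths can be sketched in a few lines of toy code. This is purely illustrative – the function names, the "compression," and the keyword-based intent detection are all stand-ins, not how Alexa actually works:

```python
# Toy contrast between a cloud round trip and on-device (edge) handling
# of a voice query. All names and steps here are illustrative stand-ins.

def cloud_lookup(query: str) -> str:
    """Cloud path: the query is shipped off, decoded remotely,
    interpreted remotely, and the answer travels all the way back."""
    compressed = query.encode("utf-8")    # stand-in for audio compression
    decoded = compressed.decode("utf-8")  # remote server uncompresses it
    intent = "weather" if "rain" in decoded else "unknown"
    return f"cloud answered intent: {intent}"

def edge_lookup(query: str) -> str:
    """Edge path: the device interprets the query itself and only
    reaches out (e.g. to a weather API) for the data it needs."""
    intent = "weather" if "rain" in query else "unknown"
    return f"device answered intent: {intent}"

print(cloud_lookup("will it rain today"))
print(edge_lookup("will it rain today"))
```

Both paths produce the same answer; the difference is that the edge path skips the round trip to a distant data center for the interpretation step.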
Aside from making things a whole lot faster, edge computing is also more secure. Every bit of information you send to the cloud is an attack point that hackers can use to steal your data; transmissions can be intercepted in man-in-the-middle attacks.
With edge computing, far less is sent over the air. Smartphones, for instance, routinely send account details and passwords over the network. Compare that with how iPhones encrypt your biometric and facial recognition data on the device itself: with the information needed to unlock your phone stored locally, the whole process is more secure.
Furthermore, you also save on bandwidth: whatever you do not send to the cloud uses no bandwidth at all.
Just think about it. A typical home Internet connection has no problem sending one security camera’s feed to the cloud. But what if you have 20 security cameras, all of them sending HD video feeds to the cloud? You would probably curse at the amount and frequency of buffering you would experience on YouTube.
With 20 security cameras, you would definitely want to cut down on bandwidth. Instead of sending everything to the cloud, the cameras would process the video locally and decide which footage is important and which merely shows your empty living room for hours on end. By uploading only the important clips, edge computing saves you a whole lot of bandwidth.
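A minimal sketch of that idea: an edge device scores each frame for activity and uploads only the interesting ones instead of streaming everything. The byte-string "frames," the change threshold, and the scoring method are all hypothetical simplifications of real motion detection:

```python
# Edge-side filtering sketch: upload only frames that changed enough.
# Frames are stand-in byte strings; real cameras would compare pixels.

def frame_activity(prev: bytes, curr: bytes) -> float:
    """Fraction of bytes that differ between two equal-length frames."""
    changed = sum(a != b for a, b in zip(prev, curr))
    return changed / len(curr)

def filter_frames(frames, threshold=0.1):
    """Yield only frames whose change from the previous frame exceeds
    the threshold -- the clips worth sending to the cloud."""
    prev = frames[0]
    for curr in frames[1:]:
        if frame_activity(prev, curr) > threshold:
            yield curr
        prev = curr

empty_room = b"\x00" * 100
visitor = b"\x00" * 50 + b"\xff" * 50  # half the frame changed

feed = [empty_room, empty_room, visitor, empty_room]
uploaded = list(filter_frames(feed))
print(f"uploaded {len(uploaded)} of {len(feed)} frames")  # uploaded 2 of 4 frames
```

Note that the return to an empty room also counts as a change here, which is why two frames are uploaded: both the start and the end of a motion event are flagged.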
All in all: what would edge computing entail?
With the Internet of Things (IoT), sensors are pretty much the most important components. The sensors gather the data you need and send it to the cloud.
The cloud then processes all that data to give you the information and reports you need. Which of your factory’s machines are close to breaking down? The cloud answers that for you.
With the advent of edge computing, mere sensors are not enough. Each of your machines would carry a module that not only gathers information about possible issues, but can also process that information and alert you directly, without going through the cloud.
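Such an edge module can be sketched as a simple local check. The sensor readings, the vibration threshold, and the alert format below are all hypothetical; the point is that the decision happens on the machine, not in the cloud:

```python
# Edge-module sketch: check sensor readings locally and alert directly,
# rather than forwarding every raw reading to the cloud for analysis.

VIBRATION_LIMIT = 7.0  # assumed wear threshold, arbitrary units

def check_readings(machine: str, readings: list) -> list:
    """Return alert messages for readings that exceed the local limit."""
    return [
        f"ALERT {machine}: vibration {r} exceeds {VIBRATION_LIMIT}"
        for r in readings
        if r > VIBRATION_LIMIT
    ]

alerts = check_readings("press-3", [2.1, 3.4, 8.9])
for a in alerts:
    print(a)  # the alert goes straight to you, no cloud round trip
```

Only the alerts (and perhaps a periodic summary) would ever need to leave the device, which is the same bandwidth-saving pattern as the security-camera example above.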
In the same manner, the age of manually installing software would be over. You would not need to put the software on your device yourself; you would simply use it as provided by Amazon et al.
So what exactly is edge computing?
Edge computing is the tech world’s way of saying: do the work on your end of the network, close to where the data is, rather than shipping off jobs that could be done better locally. It allows for faster, more secure cloud transactions without expending too much bandwidth.
Put another way, cloud computing is centralized computing: you send all your data to the cloud provider’s servers, where the processing, AI, machine learning, and everything else are performed, and the results are sent back to you. Edge computing is the opposite – decentralized computing, done as close to you as possible.
Photo courtesy of Brookhaven National Laboratory.