You might have heard the term “encryption backdoor” in the news recently. We’ll explain what it is, why it’s one of the most hotly contested topics in the tech world, and how it could affect the devices you use every day.
An Access Key into a System
Most of the systems consumers use today have some form of encryption. To get past it, you have to provide some kind of authentication. For example, if your phone is locked, you have to use a password, your fingerprint, or facial recognition to access your apps and data.
These systems generally do an excellent job of protecting your personal data. Even if someone steals your phone, they can't access your information unless they figure out your passcode. Plus, most phones can wipe their storage or lock themselves out for a time if someone repeatedly tries to force them to unlock.
A backdoor is a built-in way of circumventing that type of encryption. It essentially allows a manufacturer to access all the data on any device it creates. And it’s nothing new—this reaches all the way back to the abandoned “Clipper chip” in the early ’90s.
Many things can serve as a backdoor. It can be a hidden aspect of the operating system, an external tool that acts as a key for every device, or a piece of code that creates a vulnerability in the software.
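To see why security experts worry about that last category, here's a purely illustrative sketch of a "master key" backdoor hidden in an authentication check. The `MASTER_KEY` constant, the user table, and the function names are all invented for this example; no real system is being described.

```python
import hmac
import hashlib

# Hypothetical values, invented for illustration only.
MASTER_KEY = "vendor-secret-1234"
USERS = {"alice": hashlib.sha256(b"correct horse").hexdigest()}

def unlock(username: str, password: str) -> bool:
    # The backdoor: a vendor-known master key unlocks ANY account,
    # regardless of the per-user check below. Anyone who discovers
    # this constant can bypass the encryption entirely.
    if hmac.compare_digest(password, MASTER_KEY):
        return True

    # The legitimate path: compare a hash of the supplied password
    # against the stored hash for this user.
    stored = USERS.get(username)
    if stored is None:
        return False
    attempt = hashlib.sha256(password.encode()).hexdigest()
    return hmac.compare_digest(attempt, stored)
```

The weakness is structural: once `MASTER_KEY` leaks, whether through a breach, an insider, or reverse engineering, every device built this way is open, no matter how strong each user's individual password is.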
RELATED: What Is Encryption, and How Does It Work?
The Problem with Encryption Backdoors
In early 2016, encryption backdoors became the subject of a heated global debate when Apple and the FBI were embroiled in a legal battle. Through a series of court orders, the FBI sought to compel Apple to unlock an iPhone that had belonged to a deceased terrorist. Apple refused to create the necessary software, and a hearing was scheduled. However, the FBI turned to a third party, which exploited a security hole to bypass the encryption, and the case was dropped.
The debate has continued among technology firms and in the public sector. When the case first made headlines, nearly every major technology company in the U.S. (including Google, Facebook, and Amazon) supported Apple’s decision.
Most tech giants don’t want the government to compel them to create an encryption backdoor. They argue that a backdoor makes devices and systems significantly less secure because you’re designing the system with a vulnerability.
While only the manufacturer and the government would know how to access the backdoor at first, hackers and malicious actors would eventually discover it. Soon after, exploits would become available to many people. And if the U.S. government gets the backdoor method, would the governments of other countries get it, too?
This creates some frightening possibilities. Systems with backdoors would likely increase the number and scale of cybercrimes, from targeting state-owned devices and networks to creating a black market for illegal exploits. As Bruce Schneier wrote in The New York Times, it also potentially opens up critical infrastructure systems that manage major public utilities to foreign and domestic threats.
Of course, it also comes at the cost of privacy. An encryption backdoor in the hands of the government would allow it to view any citizen's personal data at any time without consent.
An Argument for a Backdoor
Governments and law enforcement agencies that want an encryption backdoor argue that critical evidence shouldn't be beyond their reach. Some murder and theft investigations have stalled because investigators couldn't access locked phones.
The information stored on a smartphone, such as calendars, contacts, messages, and call logs, is material a police department might have the legal right to search with a warrant. The FBI has said it faces a "Going Dark" challenge as more data and devices become inaccessible.
The Debate Continues
Whether companies should create a backdoor in their systems remains a significant policy debate. Lawmakers and public officials frequently point out that what they really want is a “front door” that allows them to request decryption under specific circumstances.
However, a front door and an encryption backdoor are largely the same thing: both involve deliberately building a vulnerability into a system to grant access to a device.
Until an official decision is rendered, this issue will likely continue to pop up in the headlines.