One of the changes Apple has made in iOS 12 is much tighter protection against devices designed to brute-force iPhone passcodes. Unless the device has been unlocked within the past hour, the USB port will be restricted to charging, requiring the phone to be unlocked before it will permit data access.
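In essence, the restriction is a simple time-based rule. The sketch below is not Apple's implementation – the type names and the decision function are purely hypothetical – but it illustrates the policy as described: once an hour has passed since the last unlock, the port reverts to charge-only until the passcode is entered again.

```swift
import Foundation

// Hypothetical illustration of the USB Restricted Mode rule described above.
// None of these types exist in iOS; this only models the stated behaviour.

enum USBPortMode {
    case chargeAndData   // normal behaviour shortly after an unlock
    case chargeOnly      // data access blocked until the device is unlocked again
}

struct USBRestrictionPolicy {
    /// Grace period during which data access remains available after an unlock.
    let gracePeriod: TimeInterval = 60 * 60  // one hour

    /// Decide what the port should allow, given when the device was last unlocked.
    func portMode(lastUnlock: Date, now: Date = Date()) -> USBPortMode {
        let elapsed = now.timeIntervalSince(lastUnlock)
        return elapsed <= gracePeriod ? .chargeAndData : .chargeOnly
    }
}

// Example: a device last unlocked 90 minutes ago is charge-only, so a
// passcode-cracking box plugged into the port gets no data channel.
let policy = USBRestrictionPolicy()
let ninetyMinutesAgo = Date().addingTimeInterval(-90 * 60)
print(policy.portMode(lastUnlock: ninetyMinutesAgo))  // chargeOnly
```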

From much of the reporting on this, you could easily get the impression that Apple’s aim here is to thwart law enforcement investigations – and that simply isn’t the case …

Let’s look at a few examples of the coverage this is getting …

The New York Times: Apple to Close iPhone Security Hole That Police Use to Crack Devices

Reuters: Apple to undercut popular law-enforcement tool for cracking iPhones

The Verge: Apple will update iOS to block police hacking tool

Mashable: Apple’s officially making it harder for cops to bust into your iPhone

But according to a new Reuters report, Apple is planning to release a new feature to iOS that would make those devices useless in the majority of cases, potentially sparking a return to the encryption standoff between law enforcement and device manufacturers.

I could go on (and on), but you’ve probably seen other headlines yourself.

Apple intends to update its iOS with a new feature that will make it significantly more difficult for law enforcement agencies to access data on locked iPhones.

To be fair, many pieces that start in this vein do go on to point out that tools like GrayKey are used by criminals as well as law enforcement. But the overwhelming impression given is that Apple is out to make life hard for law enforcement.

The reality is that Apple is dealing with one simple fact known to every security professional but seemingly not to the law enforcement agencies that are complaining about the move: you cannot have a security hole that is used only by the good guys. Anything law enforcement can use with good intentions, criminals can use with bad intentions.

You could argue that Apple could have a special law enforcement mode, but again: any backdoor into iOS intended for use by the good guys will inevitably fall into the wrong hands.

Some persist, suggesting Apple could have this mode require a special device available only in a locked strongroom at Apple Park, with law enforcement agencies having to go there (with a court order) to access it. But, as I’ve said before, this simply isn’t a realistic scenario.

So soon, the FBI would hold the key. Then other law enforcement agencies. In time, that key would be held in every police precinct house. We would then be trusting more than a million people with access to that key to abide by the rules. Government agencies don’t always have the best of track records in doing that.

And even if we assumed not one single bad apple among those million people, you’d also be trusting every courier not to lose one in transit – and if you trust couriers that much, I’d love to know which companies you use!

But it’s worse than this. The very fact that a backdoor exists means that hackers know it can be done. Sooner or later, they are going to figure out how, and then they can create their own devices.

So no, this cannot be safely done. Apple has no desire to hinder the work of law enforcement agencies, but it’s not them the company is trying to thwart: it’s the bad actors who would use the same vulnerability for nefarious ends. That’s why Apple is doing this – not to make life harder for cops.