There’s been a lot in the news over the past few months regarding how much information the NSA/FBI/etc. want to gather. They want our data and they want all of it in real time. Unsurprisingly, quite a few people have an issue with that stance and companies have started to use privacy as a feature.

Apple has led the privacy charge. Last year, with iOS 8, Apple implemented device security in such a way that even Apple cannot break it. Before iOS 8, a government agency could issue a formal request to Apple to decrypt a phone so its data could be accessed. It took a while and there was a long waiting list, but it was possible.

This year Google has joined Apple on the privacy front. New versions of the Android OS have encryption turned on by default, and the encryption is solid. If you own an Android phone or an iPhone and have a strong password protecting it, you can be reasonably certain that your data is safe.

Our government agencies tell us that they need to be able to decrypt phones to gather evidence for investigations. That’s all well and good. I want our law enforcement agencies to catch bad guys. But what they’re asking for is practically impossible to implement.

Right now we encrypt things and then use a key or set of keys to decrypt them. You do this every time you enter your password into your iPhone. Your password is the key. It's not just the stuff you keep on your phone, either. You send data in the form of text messages and status updates out onto the Internet all the time, and law enforcement wants access to that, too. Apple's implementation of iMessage is also secured. When you send an iMessage, only you and the recipient(s) can read it. No one else can. Apple does this via a key exchange between your devices.
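To make the idea of a key a little more concrete, here is a minimal sketch in Python, using the third-party cryptography library, of how a password can be stretched into an encryption key that locks and unlocks a message. This is an illustration only, not how Apple or Google actually implement their device encryption, and the password and message here are made up.

```python
# Illustrative sketch only; not Apple's or Google's actual implementation.
# Requires the third-party "cryptography" package (pip install cryptography).
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

password = b"a strong passphrase"  # hypothetical user password
salt = os.urandom(16)              # random salt, stored alongside the encrypted data

# Stretch the password into a 32-byte key.
kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=480_000)
key = base64.urlsafe_b64encode(kdf.derive(password))

cipher = Fernet(key)
locked = cipher.encrypt(b"Pick up milk on the way home.")  # unreadable without the key
unlocked = cipher.decrypt(locked)                          # readable again with the key

print(unlocked.decode())  # prints: Pick up milk on the way home.
```

Without the password, the encrypted bytes are just noise. That, in a nutshell, is why a phone with strong encryption and a strong password is so hard to open.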

The term everyone uses when they hear that someone else wants access is "backdoor." A backdoor into secure software is typically meant to be kept secret, and it often has weaker security precisely because it relies on that secrecy. Journalists and bloggers all use the term. They say things like "The NSA wants a backdoor into your private life."

Apple has actually been sued over the security of its devices. The Department of Justice wants access to those devices. So far, Apple has fought the suits and won. But if Apple were to give the DOJ access, it wouldn't need to be through a backdoor. Apple could simply create a second set of keys for the front door.

A second set of keys to the front door sounds like a reasonable solution on paper but in practice it’s impossible to trust. How many agencies are going to get a copy of the keys? How many people in each agency should get a copy? Can we trust these people to not make more copies and hand them out? Can we trust the agencies to keep the keys secure?

The last question is the biggie here. The answer is "no." We cannot trust the NSA or the FBI to make reasonable efforts to keep the keys secure. The DOJ has a pretty tarnished track record when it comes to data security, and I don't want it to have any keys to my information.

Even if we could trust the DOJ with keys to our information, I still wouldn't want it to have them. Most people are doing mundane things like texting about groceries and don't seem to care. They say things like "If you have nothing to hide, then you have nothing to fear." That's like saying "I don't care about the right to free speech because I have nothing to say."

I'd rather not have my privacy infringed upon without rigorous due process.

Jason Ogaard is a software engineer who formerly lived in Hutchinson. He welcomes your technology questions, and he’ll answer them in this space. Please send your questions to technobabble@hutchinsonleader.com.