Encryption -- the scrambling of private data to keep it private -- is more popular than law enforcement prefers.

Photo: Tashfeen Malik, left, and Syed Farook pass through O'Hare International Airport in Chicago on July 27, 2014. A U.S. magistrate ordered Apple to help the FBI bypass the security feature on a county-owned iPhone used by Farook, one of the San Bernardino, Calif., shooters. (U.S. Customs and Border Protection via AP)

New Apple and Android devices are encrypted by default to block people who aren't you -- say, your mother-in-law or the local district attorney -- from seeing the files stored on your phone without your permission.
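For readers curious what that scrambling actually looks like, here is a toy sketch using the third-party Python "cryptography" package. It is an illustration of the idea only, not how Apple or Google implement device encryption; phones use hardware-backed keys, but the principle is the same: no key, no data.

```python
# Toy illustration of symmetric encryption (pip install cryptography).
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()          # the secret only the owner holds
locker = Fernet(key)

ciphertext = locker.encrypt(b"my private photos and messages")
print(ciphertext)                    # unreadable bytes on their own

print(locker.decrypt(ciphertext))    # right key -> original data

try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)  # wrong key
except InvalidToken:
    print("Without the owner's key, the data stays scrambled.")
```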

A lot of phones have fingerprint sensors. After the FBI paid $1 million for an iPhone security hack its own experts don't understand, I figured we'd soon hear of police using the fingers of unconscious suspects to unlock phones.

Or maybe we won't. The U.S. Senate is drafting a bill that would require all hardware and software developers to give the government a "back door" to private data.

The Feinstein-Burr bill has been called "The Bill That Bans Your Browser" because it would make modern web browsers -- which use encryption to keep things like your credit card number a secret -- illegal.
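You can see that browser encryption at work with a short sketch. This one uses only the Python standard library and assumes you can reach example.com over the network; it prints the TLS session a browser would negotiate before sending anything like a credit card number.

```python
# Peek at the encrypted channel a browser sets up before sending
# anything sensitive. Standard library only; assumes outbound
# network access to example.com on port 443.
import socket
import ssl

context = ssl.create_default_context()
with socket.create_connection(("example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print("Protocol:", tls.version())   # e.g. TLSv1.3
        print("Cipher:  ", tls.cipher())    # negotiated cipher suite
```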

According to a leaked early draft of the bill obtained by The Hill, it would require companies to provide "intelligible" data about user activities on demand, or to provide technical assistance. For free, I presume.

In other words, federal law would require Apple to unlock any iPhone accompanied by a court order. The same would go for Google, Mozilla, Snapchat, WhatsApp, or any other company that employs encryption technology.

The problems with such a law seem numerous.

The Bill of Rights, which lays out how citizens are protected from their government, ensures a right to privacy, but that has never stopped a warrant. There's also the right not to incriminate yourself, which is probably why the bill targets companies instead of individuals.

More problematic: how does the U.S. government force a Chinese manufacturer to obey U.S. law?

Last time I looked, most phones were made in other countries. The Feinstein-Burr bill would likely make U.S. software developers keep more than just their profits offshore.

How does U.S. law affect data traveling through a Dutch Tor server?

What happens when people start using open source software not created by an identifiable person or company?

The biggest problem? Consumers want to keep their information private and successful companies like to give people the things they want. And Apple and Google can afford better lawyers than the government.

Rep. Darrell Issa, a California Republican, says the bill "is flawed and technically naive."

"Americans deserve to know that the information retained on these devices will remain secure. This legislation would effectively prohibit any company who wants to improve the security of its products from doing so because government’s ability to access our personal and private information is more important than innovation," said Issa.

Wired magazine went a bit further, calling the bill "ludicrous, dangerous and technically illiterate."

One provision of the law requires Google and Apple to ensure that every "secure" app offered to the public has a "back door" that can be used by law enforcement.

What are the chances those security loopholes would be discovered by people who are not the police? Higher than Obama choosing his major is my guess.