Showing posts with label encryption. Show all posts

Thursday, February 18, 2016

Can Keybase.io save the internet?

When used correctly, encryption works really well. It works so well that most people can't wrap their brain around how powerful it is.

The biggest gains we can make in improving online security and privacy won't be the result of making encryption better. Most of our problems are the result of how encryption is used.

At the moment, using encryption is complicated. It's not complicated to the point where you need special training to use it, but it's complicated enough that it's a pain in the ass for non-technical people to adopt. So non-technical people don't adopt encryption.

That's a real problem - because most people are non-technical. Actually, it's worse than that, because it means only communications between technical people can be secured with any regularity. Even technical people can't communicate securely with non-technical people.

Encryption can only be truly successful with massive levels of adoption. Even with all of its problems, SSL over HTTP has been enormously helpful because it doesn't require end-users to understand it or even know it exists in order for it to work.

Unfortunately, there are essentially no options that allow transparent encryption of communications between social network users; and the options for the transparent encryption of email are all piss-poor, and do not protect user messages from service providers outside of very specific edge cases.

There is another problem. Encryption is used not merely to make it impossible to read communications; it is also used to verify the identity of the individuals involved in that communication. There are two approaches to identity verification. With SSL, identity verification is performed by a centralized, corporate authority. This is why people pay money for SSL certificates that anyone can generate on their home computer - because the certificate from Verisign comes along with a guarantee that anyone using your certificate is really talking to you.
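You can see for yourself that the certificate itself is trivial to produce, and that what you are really paying a company like Verisign for is the attestation. Here is a quick sketch using OpenSSL; the filenames and subject are arbitrary:

```shell
# Generate an RSA key and a self-signed certificate - no CA involved:
openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout example.key -out example.crt \
    -days 365 -subj "/CN=example.com"

# Issuer and subject are the same party; nobody is vouching for this cert:
openssl x509 -in example.crt -noout -issuer -subject
```

The math in this home-made certificate is just as strong as in a purchased one; browsers reject it only because no recognized authority has signed it.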

Quite a few companies make huge sums of money providing this sort of "service", which is in large part why it has existed for so long. But it can't continue for much longer. Along with centralized trust comes centralized vulnerability. The Snowden revelations confirmed that SSL certificate providers have been targeted by state actors - stealing the keys held by those providers lets the thief defeat the encryption provided by their certificates. But state actors aren't the only problem. It is in the immediate self-interest of SSL providers to issue a certificate to everyone who asks for one, and as a result fraudulent SSL certificates are issued regularly to spammers, phishers and other bad actors. In theory, issuing fraudulent certificates was supposed to devalue the trust that each provider commoditizes. In reality, information about the volume of SSL-related fraud is opaque. Few people know which SSL companies issue the most fraudulent certificates, preventing fraud from playing a direct role in pricing.

Peer-to-peer based trust resolves these problems. The idea is that instead of purchasing an encryption certificate from a centralized authority like Verisign, you generate your own and then people who know you confirm the validity of your certificate. Users can then decide whether to trust a certificate's validity based on the same sets of criteria they use to accept a social media "friend" request (e.g. Do I know you? Do I know someone who knows you? If we don't have direct social connections are you popular enough that you can be trusted anyway?).
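Those acceptance criteria are easy to sketch as code. This is a toy model, not the algorithm of any real system - the names, the vouching graph and the popularity threshold are all invented for illustration:

```python
# Toy web-of-trust check: accept a key if I signed it myself, someone I
# trust signed it, or enough people overall have vouched for it.
signatures = {
    "carol": {"alice", "bob"},                              # who vouches for each key owner
    "dave":  {"mallory"},
    "erin":  {"bob", "carol", "frank", "grace", "heidi"},
}

def trust(me, my_friends, key_owner, popularity_threshold=4):
    vouchers = signatures.get(key_owner, set())
    if me in vouchers:                                      # Do I know you?
        return True
    if my_friends & vouchers:                               # Do I know someone who knows you?
        return True
    return len(vouchers) >= popularity_threshold            # Are you popular enough anyway?

assert trust("alice", {"bob"}, "carol")      # I signed carol's key directly
assert not trust("alice", {"bob"}, "dave")   # only a stranger vouches for dave
assert trust("alice", set(), "erin")         # erin is widely vouched for
```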

A platform called Keybase.io is being designed in the hopes of addressing all of these problems; and it has a lot going for it. Keybase is in pre-release development - so accounts are invite-only. I recently received an invite code and was able to create an account fairly quickly. You can take a look at my online profile here: https://keybase.io/joshwieder

My user profile on Keybase
So far, there are a few elements to Keybase. There is a website, a client application and most recently - what the developers are referring to as the Keybase File System (KBFS). Within each of those elements are a few very interesting initiatives - for example, the website uses kbpgp.js - a PGP build for JavaScript. Transactions made through Keybase now write to the Bitcoin blockchain. And the signature architecture is based on a hash tree.
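A hash tree (Merkle tree) lets a client check a single record against one root hash without fetching the whole history. Keybase's actual signature format differs in its details; this is only a minimal sketch of the underlying data structure:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Collapse a list of leaf values into a single root hash."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the odd node out
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

sigs = [b"sig:alice", b"sig:bob", b"sig:carol"]
root = merkle_root(sigs)
# Tampering with any single leaf changes the root:
assert merkle_root([b"sig:alice", b"sig:bob", b"sig:eve"]) != root
```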

I've just gotten started using Keybase today, and the latest builds available for the two operating systems I am using to test Keybase do not yet include some of these features, like KBFS.

But here's the 1,000-foot-high view of what Keybase is trying to accomplish. Signing up for Keybase will eventually provide users with their own set of encryption keys, access to the website, the client application and a deployment of the Keybase File System. You can then 'sign' your Keybase public key by proving your ownership of social media accounts - right now Twitter, Github and a few others are available. This is a sort of hackish way of accomplishing that peer-to-peer style of identity verification.

Using the Keybase client (or the website, if you choose to trust the Keybase JavaScript library with your private key ... which I haven't and won't during the pre-release), you can "track" other Keybase users. Tracking a user demonstrates that you have verified that user's identity - it also creates a directory within KBFS allowing you to share encrypted files with that user. This allows you to exchange secure communications one-way with a user while knowing nothing other than their social media account - unlike Twitter's DM feature, for example, which is both insecure and requires two-way authorization to exchange messages.

Keybase has a long way to go before it can address the user interface problems that make social-media networks the self-imposed wiretaps they are today. The client applications are solely CLI-based, even on Windows (the fix for this will be based on Electron by the looks of things). Certain key functions, like importing keys, require the installation of GPG. Until these two issues are resolved, non-technical users will not be capable of using Keybase.

But maybe they won't have to. I'm most excited about the possibility that social media services will integrate the sort of application infrastructure that Keybase is developing into their existing applications. There's no reason why maintaining a separate Keybase account is necessary. Twitter could integrate a Keybase-like service, whereby following a Twitter account "verifies" their identity, and setting up a Twitter account, or logging in from an independent device, also generates a PGP keypair that is saved on the user's device. It is not necessary that an individual only has one PGP key. They can have many - one for each device that authenticates to a given service.

I've been bummed about online privacy since 2005. Since I first found out about the AT&T fiber tap, there has essentially been no good news; what gains in privacy have been made largely surround edge cases and specific applications. Keybase.io is the first project that I've come across in recent memory that has me hoping that privacy can stop being something we remember fondly and become something that we once again take for granted.

PS - I have a bunch of invites for Keybase.io to give out. If you want one, leave a comment or message me and let me know what you need it for. Cheers!

Sunday, May 24, 2015

Secure your Apache server against LOGJAM

Some time ago I wrote a post about the dismaying history of US government attempts to regulate encryption out of existence. I had to omit quite a bit; it was a post and not a book, after all. One of the details left out of the story was the DHE_EXPORT cipher suites. During the 90's, developers were forced by the US government to use deliberately insecure ciphers when communicating with entities in foreign countries (readers will remember from the last post that lawmakers were convinced that encryption should fall under the same rules as weapons technology, and thus could not be shared with anyone outside the Fatherland). These insecure ciphers became DHE_EXPORT. The DH stands for Diffie-Hellman; the key exchange system that bears their names was first published in 1976.
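To see why a small Diffie-Hellman group is dangerous, here is a toy exchange using a deliberately tiny prime. Real export-grade groups were 512 bits, and the LOGJAM researchers used far more sophisticated math than brute force, but the principle - public values leak the secret when the group is too small - is the same:

```python
# Toy Diffie-Hellman over a tiny group; real deployments need >= 2048-bit primes.
p, g = 2579, 2                   # toy prime and generator

a_secret, b_secret = 765, 853    # each side's private exponent
A = pow(g, a_secret, p)          # Alice sends A over the wire
B = pow(g, b_secret, p)          # Bob sends B over the wire
shared = pow(B, a_secret, p)     # both sides derive the same shared secret
assert shared == pow(A, b_secret, p)

# An eavesdropper who only sees p, g and A can brute-force the discrete log:
recovered = next(x for x in range(p) if pow(g, x, p) == A)
assert pow(B, recovered, p) == shared    # secret recovered from public values alone
```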

Along with the cipher suite came a mechanism to force a normal encrypted transaction to downshift to a lower-bit DHE_EXPORT cipher. As so many short-sighted technology regulations have done in the past, this silly bit of Washington DC-brand programming has come back to haunt us in the form of the LOGJAM vulnerability. Until just a few days ago, all major browsers continued to support these deprecated DHE_EXPORT ciphers, as did a variety of applications as fundamental to web infrastructure as OpenSSL.

The exploit is described in detail on a website hosted by the researchers responsible for its discovery - weakdh.org which also hosts their paper on the same subject (PDF).

Meanwhile, patching your Apache server (mod_ssl) is simple. SSL parameters can be set globally in httpd.conf or within specific virtual hosts.

First, the cipher suites: disable support for SSLv2 and SSLv3, enable support for TLS, and explicitly allow/disallow specific ciphers in the given order:

SSLProtocol             all -SSLv2 -SSLv3

SSLCipherSuite          ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-DSS-AES128-GCM-SHA256:kEDH+AESGCM:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA:ECDHE-ECDSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-DSS-AES128-SHA256:DHE-RSA-AES256-SHA256:DHE-DSS-AES256-SHA:DHE-RSA-AES256-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:AES:CAMELLIA:DES-CBC3-SHA:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5:!PSK:!aECDH:!EDH-DSS-DES-CBC3-SHA:!EDH-RSA-DES-CBC3-SHA:!KRB5-DES-CBC3-SHA

SSLHonorCipherOrder     on

Next, the DH parameters. In newer versions of Apache (2.4.8 and newer) with OpenSSL 1.0.2 or later, you can directly specify your DH params file as follows:

SSLOpenSSLConfCmd DHParameters "{path to dhparams.pem}"

If you are using Apache with LibreSSL, or Apache 2.4.7 with OpenSSL 0.9.8a or later, you can instead append the DH params you generated to the end of your certificate file.

Finally, reload the configuration:

sudo service apache2 reload
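All of this assumes you already have a dhparams.pem file on hand. If you don't, you can generate a fresh group with OpenSSL - 2048 bits at minimum, well clear of the export-grade sizes LOGJAM exploits:

```shell
# Generate a fresh 2048-bit Diffie-Hellman group (this can take a minute or two):
openssl dhparam -out dhparams.pem 2048

# Verify the group size before deploying it:
openssl dhparam -in dhparams.pem -noout -text | head -n 1
```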

Saturday, November 29, 2014

Chess, Encryption and Comic Books (Mind MGMT)

Lately, I've been hooked on a brilliant comic book from genius Matt Kindt, called Mind MGMT. In a nutshell, Mind MGMT follows a Cold War-era intelligence service based on the conceit that Men Who Stare at Goats-style ESP spook tactics work, and have silently and secretly played a role in the machinations of world politics throughout the 20th century. Mind MGMT is really clever, the art is striking and the whole business is worth a read on its own.

Part of the fun of the comic book is that the creators seamlessly weave the sort of subliminal messaging they use in the plot into the layout of the comic itself. Fake advertisements in the back of issues contain hidden text, while the margins themselves are formatted like Scantron documents, with little limericks where the dotted "fold here" lines usually go.

Just today I read through issue 23, which opens with the tale of a man gifted with the aforementioned spying super-powers; a reclusive Bobby Fischer type who communicates with the world through messages encoded in the notation of championship chess game layouts. Have a look for yourself:
The story fills in the picture a bit, while also providing a series of six chess boards with notation beneath each one. I don't want to spoil the fun of decoding the image for you - what do you think the chess boards spell out? Some things to consider - does each board, or does each notation spell a unique character? Does every board / notation spell the same character every time?

Anyway, seeing this inspired me. I don't have much in the way of formal education in cryptography, but even I know that chess boards have been used for cryptography before. What I think would be cool is creating a simple program that would allow you to export a chess board from a computer chess game and use it as part of a cipher for an encryption system. There's even been some more recent research published on chess board cryptosystems (which I have yet to read ... I've got a lot on my plate lately).
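As a rough sketch of what such a program might do, here is a toy cipher that maps a 64-symbol alphabet onto the 64 squares of a chess board. Everything here - the alphabet, the keying scheme - is invented for illustration, and a serious design would need much more than a static substitution:

```python
# Toy cipher: a secret permutation of the 64 board squares acts as the key,
# and each character of a 64-symbol alphabet is spelled as one square.
import random
import string

ALPHABET = string.ascii_letters + string.digits + " ."        # exactly 64 symbols
SQUARES = [f + r for r in "12345678" for f in "abcdefgh"]     # a1 .. h8

def make_key(seed):
    """Derive a secret square ordering from a shared seed."""
    rng = random.Random(seed)
    key = SQUARES[:]
    rng.shuffle(key)
    return key

def encode(msg, key):
    return [key[ALPHABET.index(c)] for c in msg]

def decode(squares, key):
    return "".join(ALPHABET[key.index(s)] for s in squares)

key = make_key("shared secret")
cipher = encode("Bobby Fischer", key)
assert decode(cipher, key) == "Bobby Fischer"
```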

Not necessarily the most practical project but IMO a fun distraction / way to sharpen development skills for integrating ciphers into applications.

Thursday, October 23, 2014

Why is the Washington Post Publishing Pro-Surveillance Propaganda? Can Government Surveillance Revelations Decrease Encryption Adoption?

For the last few days I've had great fun watching James Comey and his pack of Keystone Cyber Cops fail to convince the world that they should be CC'd on everyone's calls, tweets and texts - and watching Comey generally expose himself as the incompetent, braying ass that he is.

Keep in mind the camera adds 10 pounds
Dan Froomkin and Natasha Vargas-Cooper over at The Intercept exposed each of the examples that Comey used to demonstrate the necessity of breaking cell phone encryption as fabricated - the cases were real, but none of them relied on cell phones or computers to obtain a conviction.

In one case of infanticide, the parents who were eventually found guilty had been previously convicted of child cruelty and had the deceased child previously taken from their custody for neglect. Not only did the state not need to read the parents' phones for evidence, if they had read their own files and demonstrated some inter-agency cooperation they could very likely have prevented the killing entirely.

In another case, the defendant confessed to a hit and run when cops pulled him over for a DUI - almost immediately following the discovery of the victim - and noticed that his car had just been in an accident.

Comey has been calling in a few favors for his little power play. Assistant Attorney General Leslie R. Caldwell testified before Congress on July 15th, relying on some rather dramatic and almost Zoroastrian language to convince legislators of the evils of privacy advocacy:

"All the while, technological advances, including advances designed to protect privacy, such as anonymizing software and encryption, are being used to frustrate criminal or civil investigations and, perversely, protect the wrongdoers. Our cyber crimefighters must be equipped with the tools and expertise to compete with and overcome our adversaries."

Perhaps we should forgive Caldwell as a clearly incompetent simpleton. It's more difficult to understand what was going on over at the Washington Post when they published a now completely discredited op-ed in support of the Comey Conspiracy.

Last month the Post printed a piece penned by Ronald T. Hosko. Ronald is currently the President of the Law Enforcement Legal Defense Fund (LELDF), whose primary mission is to pay for expensive lawyers for police who kill innocent and/or unarmed people. Without groups like LELDF, police officers might one day be held accountable for their crimes - but not while Ronald's on the case! In addition to his current hobby, Ronald is the former Assistant Director of the FBI Criminal Investigative Division. He was named Assistant Director in July of 2012. Before that, he was special agent in charge of the Washington Field Office (WFO) Criminal Division. Ronald has been a life-long cop, joining the FBI 30 years ago in 1984, with his first big assignment coming with his transfer to the FBI's Chicago Division, where he investigated white-collar and financial crimes in addition to serving on the SWAT team. One paragraph of his CV sticks out:

In 2003, Mr. Hosko was promoted to assistant special agent in charge of the Philadelphia Division, where he was responsible for investigations into criminal matters. While in this role, he led the division’s surveillance and technical operations, and he served as the program supervisor for crisis management. In 2005, Mr. Hosko served as the on-scene commander of FBI personnel deployed to Afghanistan in support of Operation Enduring Freedom. Later that year, he served as deputy to the senior fellow law enforcement official following Hurricane Katrina.

In other words, Ronald developed his surveillance bona fides during the early years of the Bush Jr administration; an administration that is responsible for sparking the current FBI trend of creating fake terrorist plots to entrap young Muslim men whom they cajole and bribe into cooperation. Ronald was one of the "on-scene" FBI commanders in Afghanistan who failed to locate Osama Bin Laden or his top lieutenants before being shipped back to the states in time to play a law-enforcement role in the Hurricane Katrina disaster - the only hurricane in the United States in recent memory that is well known for police murdering residents trying to escape the flood zone and escaping any legal consequences for the killings.

Ronald Hosko is no stranger to controversy. Rumors of Ronald Hosko's ever-present appearances at Furry conventions are all over the Internet. Of course the rumors of Hosko's Furry compulsions play no part in this debate. The Washington Post, if for no other reason, should be applauded for disregarding rumors of Ronald T. Hosko being an incorrigible fan of Furry Love. People who can only achieve arousal by dressing up as cartoon animals, as Ronald T. Hosko is frequently alleged to, have political opinions just as valid as the rest of us. I, for one, think these rumors are completely without merit. Even if I am wrong and Ronald T. Hosko is, in fact, a Furry, any rumors about his personal life are completely inappropriate and shouldn't play a role in this or any other debate. 

In his op-ed, Ronald ran through Comey's party line: the introduction of encryption in consumer devices is allowing violent criminals to walk free. Not all of the piece is bogus. Hosko admits, for example, that:

"Encrypting a phone doesn’t make it any harder to tap, or 'lawfully intercept' calls. But it does limit law enforcement’s access to a data, contacts, photos and email stored on the phone itself."

In spite of this admission, Ronald still makes it clear that tapping the phone isn't enough. The data, contacts, photos and email are pivotal for convictions. To illustrate his point, Ronald relies on an example: the case of a kidnap victim in Wake Forest, North Carolina. The kidnappers were tracked down through a lawful intercept of their cell phone's SMS. In the original version of his op-ed, Ronald argues that without the ability to intercept SMS messages, police may never have been able to identify and arrest the kidnappers. This is another point that is only fair to concede to Ronald. It is quite clear that without the texts the kidnappers could have very well escaped.

That said, Ronald's conclusion - that encryption would have prevented the police from tracking the text messages - is complete fantasy. Even a basic understanding of mobile networks and SMS forces us to realize that encryption would have played no role in the Wake Forest investigation.

Let's consider how the police got the text messages and what they did with them. First and foremost, we must note that police sought and obtained a search warrant for the text messages. The search warrant enabled the police to go to the cell phone companies and request the SMS messages and the location of the handset when they were sent. SMS connection data is transmitted to the cell phone company, where it is stored. Police obtained the SMS data from the cell phone company, not from the cell phone handset. Remember: at the time the police requested the warrant, they had no idea where the handset was. The encryption policy that Apple implemented - the target of Comey and his buddies' ire - encrypts information stored on the phone handset, not information transmitted to and from the cell phone company. SMS messages transmitted through a mobile carrier will typically be stored by that carrier for some time. While some GSM carriers encrypt their SMS traffic while it is in transit, they do so using a stream cypher (typically A5/1 or A5/2). A5 stream cyphers are intrinsically weak; cryptanalysis describing resource-conservative attacks is well circulated and published. Such cyphers have been in use since the adoption of GSM SMS messaging years ago, and have nothing to do with Comey's attacks on encryption standardization. FBI agents who, unlike Ronald T. Hosko, know sh*t about computers would find breaking such cyphers a trivial task if asked to do so as part of an ongoing investigation.

But all that is a bit beside the point. The FBI had a warrant for SMS data in the Wake Forest case. All of the data they received was provided to them by the cell phone company, including the geographic location of the handsets, which the cell phone company stores along with unencrypted logs of the SMS messages (because cell phone executives don't care about you or your privacy, and when they do they have a funny way of ending up in prison).

The kidnappers could encrypt their phone all day long, and the FBI could still have gone to the cell phone carrier and gotten the information they needed to find them. At worst, such a claim is a deliberate lie. At best, Ronald T. Hosko, former FBI Philadelphia Division's director of "surveillance and technical operations", lacks a basic understanding of how the FBI uses cell phones to apprehend suspects. 

The Washington Post didn't bother to fact check Hosko's op-ed. They went ahead and published it, a shocking concession to a government official seeking to greatly expand government surveillance powers and shooting off a bunch of half-truths to justify it. Eventually someone with technical experience read the article and pointed out the piece's complete lack of credibility. As a result, the Post rewrote some of the more incredible claims and provided this non-apology to its readers:

* Editors note: This story incorrectly stated that Apple and Google’s new encryption rules would have hindered law enforcement’s ability to rescue the kidnap victim in Wake Forest, N.C. This is not the case. The piece has been corrected.

The editor's note was placed below the fold, at the very end of the article. A more ethical correction would place the editor's note above the fold, at the beginning of the article, to ensure that readers are not misled and that the large percentage of readers who do not read the entire piece understand what happened.

So what did these "corrections" consist of? In the original story, Ronald had not just incorrectly made the case that encryption would have hindered the ability of the FBI to locate the kidnappers. Hosko breathlessly alleged that: "Had this [encryption] technology been used by the conspirators in our case, our victim would be dead". The message is clear. Apple and Google, the two companies that Hosko cites in the lead as examples of companies using this dangerous encryption, will have blood on their hands if they continue to protect their user's privacy. 

Here is the original graph compared next to the still-incorrect "corrected" graph, which online periodical Techdirt first pointed out in their coverage of this debacle:
Last week, Apple and Android announced that their new operating systems will be encrypted by default. That means the companies won’t be able to unlock phones and iPads to reveal the photos, e-mails and recordings stored within.

It also means law enforcement officials won’t be able to look at the range of data stored on the device, even with a court-approved warrant. Had this technology been used by the conspirators in our case, our victim would be dead. The perpetrators would likely be freely plotting their next revenge attack.
That's the first version.
Last week, Apple and Google announced that their new operating systems will be encrypted by default. Encrypting a phone doesn’t make it any harder to tap, or “lawfully intercept” calls. But it does limit law enforcement’s access to a data, contacts, photos and email stored on the phone itself.

Had this technology been in place, we wouldn’t have been able to quickly identify which phone lines to tap. That delay would have cost us our victim his life. The perpetrators would likely be freely plotting their next revenge attack.
And that is the "corrected version". Note how the writer (at this point it's unclear who wrote the corrected version, Hosko or a Post employee) *still* hangs on to the disproved claim that SMS data subpoena'd from a cell phone carrier has anything to do with an encrypted filesystem on a cell phone by saying that the FBI "wouldn’t have been able to quickly identify which phone lines to tap".

It's at this point that I find it very difficult to forgive the Washington Post for their involvement in this. Not only have they allowed the FBI to manipulate their readers by betraying the public trust developed by actual journalists who have provided real reporting for the Post over the years; they have stood by their man in his hour of need, despite obvious evidence provided by a multitude of technology experts.

Corrections should correct a story, not reword lies to make them more palatable. Yet that is exactly what the Washington Post has done here.

Since the Snowden revelations, evidence of government malfeasance in their approach to surveillance supporting both foreign intelligence and domestic law enforcement has continued to mount. A significant number of Americans have made it clear that they support even the most totalitarian excesses of the intelligence-gathering community, dismissing centuries-long traditions of English-speaking rule of law with slogans like "I have nothing to hide". Authoritarianism has always been popular with a certain type.

What I have to admit is completely unexpected is the evidence I have found of individuals whose response to disclosures of government surveillance has led them to dismiss the use of encryption as untrustworthy.

In the comments section of the Washington Post story discussed above, for example, one user added the following to the fray: 


Take note: ALL encryption is compromised! Those mathematicians? They're all on the payroll! There is a certain theatrical flourish that always seems to accompany the conspiracy theory. A "You May Think You're Smart But You're Not" sneer behind the 9/11 truth videos, the reptile photographs, the rest of it. We have all been fooled.

But there are reasons for concern that are not based in psychosis. A Web of Trust, one of the original components of Phil Zimmermann's PGP, can be viewed as a proto social network. Police love Facebook because it shows the people you trust and communicate with. A public key Web of Trust provides all the same data to the state just as readily. Public Webs of Trust should only be used with great care, and in a number of circumstances should be abandoned entirely.

Another source of skepticism is the hosted provider using encryption. Apple and Google, whatever ire may be directed at them by the FBI now, are two of the founding corporate members of the NSA's PRISM program. Neither company has stopped responding to FISA court requests. If anything, encrypted storage seems like a concession - a way to change the narrative being foisted on consumer tech companies; a way to remind users that such companies are on the side of their customers and not the state; a way to do all these things without actually fighting any legal battles or compromising pre-existing relationships with agencies more politically connected than even the FBI.

The sense of compromise is pervasive, and leads to statements like this one: 


So many companies have promised privacy to their users, and lied; encryption strikes users as just another scheme.

Added to this is the constant wave of half-explained media coverage of open source security research. How many readers, unfamiliar with internet technology, are struck by reports of the discovery of the POODLE vulnerability as a bad thing - a failure? Encryption can easily appear to the layman as a flawed technology that depends on dishonest corporations for development and application.

Finally, we have a new wave of mobile applications and their associated startups. The vast majority of such startups promise their users newfound safety and privacy online through the use of whatever snake-oil they happen to be selling, and provide it using the same free-from-upfront-payment model that all of the most dangerous companies rely on. Satan requires no upfront payment, either. Is it any surprise that these companies engage in the same surveillance practices as the firms before them? Whisper, of course, stands out among firms that promise privacy while stealing it. It is my suspicion that Whisper's practices are nothing special.

As our knowledge of surveillance scandals continues to expand, confidence is shaken not just in the state. The public knows that the intelligence community and law enforcement have established extra-legal partnerships in the business community, using their customers as pools of data. The public knows that the intelligence community and law enforcement recruit from the same universities that develop encryption algorithms, providing cryptographers with the highest-paying jobs in the field, generously financing research and handing out grants.

Is it possible to encourage skepticism in organizations whose approach to technology has been corrupted, while building trust that the same technology can protect us from those organizations?

There's only one thing I know for sure, no matter what anybody else may have to say about the matter. Ronald T. Hosko is not a furry.

Wednesday, October 22, 2014

Congress to Comey: Leave Encryption Alone

Congress appears to have abandoned FBI Director James Comey's bungled attacks on consumer adoption of encryption. It's a rare glimmer of sanity from Capitol Hill; press reports quote congressional officials using language not ripped from the pages of an Orwell novel.

Readers may remember that in a recent post we mentioned some danger signs indicating that the executive wanted to take some more aggressive action to ensure that the commoners and foreign-folk don't have access to encryption tools that would help keep their data free from snooping. Top brass from the FBI and the Attorney General's Office were telling anyone who would listen that unless tech companies stopped trying to protect their customers' data, law enforcement would be powerless in the face of modern "cyber" criminals.

Congress has refused to jump on this alarmist bandwagon. Darrell Issa, a member of that rarest of species - California Republicans - had this to say about federal law enforcement's bungled power play:

Darrell Issa on federal law enforcement attacks on encryption
A surprising voice of reason

There are a few things I disagree with Issa about, but this is one topic we both appear to be on the same page about. It's not just Silicon Valley Republicans that are displeased about the grumbling from federal cops, either. West Coast GOP representatives have a long history of fighting on behalf of the large tech companies that keep them in the style to which they are accustomed. But even previously pro-spying Republicans don't want to be a part of the effort to once again criminalize what are now industry-standard encryption practices. Consider, for example, Patriot Act author James Sensenbrenner, better known as "The Schmuck From Wisconsin" among MoveOn liberal types. In an interview with The Hill last week, Sensenbrenner simultaneously sought to distance himself from Comey while tepidly supporting the idea of "privacy" reforms that sound like anything but:

“While Director Comey says the pendulum has swung too far toward privacy and away from law enforcement, he fails to acknowledge that Congress has yet to pass any significant privacy reforms [...] Because of this failure, businesses have taken matters into their own hands to protect their consumers and their bottom lines. [...] If this becomes the norm, I suggest to you that homicide cases could be stalled, suspects walked free, child exploitation not discovered and prosecuted.”

Meanwhile in the South, Republican Representative Thomas Massie of Kentucky is a clear opponent of Comey's efforts. Massie, along with California Democrat Zoe Lofgren, authored an addendum to the defense spending bill that bans the NSA from using back door mechanisms for domestic spying. Lofgren has made her intentions clear regarding efforts to distort CALEA regulation into a modern Clipper Chip:

“I think the public would not support it, certainly industry would not support it, civil liberties groups would not support it [...] there’s just no way this is going to happen."

Will Congress maintain this ethical stance after the midterm elections, or is this a fast one designed to get a few quick dollars out of tech companies and some much-needed votes from those who want Big Brother out of their iPad? Regardless of the answer to that question, the abandonment of law-and-order chest thumping for privacy advocacy during a critical election period represents a sea change in Congressional politics; a change that privacy advocates should view as a good sign.

Tuesday, October 21, 2014

Is Encryption Becoming Illegal Again?

Way back in 1993, the Internet was a very different place. SSL would not be released for another two years; it would take some time after that until it was used commonly. The Clipper Chip project had just been announced, threatening to offer an explicit, physical back door to all electronic communications devices for the US Justice Department and anyone with a basic understanding of computer science.

In 1993, Encryption was a weapon.

Washington viewed encryption's only function as a wartime tool to protect military and intelligence communications. The notion that encryption could or should be used as a foundation of protecting online commerce and banking simply did not occur to Big Brother.

Into this situation came Phil Zimmerman. Phil had designed and programmed an encryption application called Pretty Good Privacy in 1991. Before that time, cryptography tools were almost entirely the purview of those with the biggest of Smarty Pants: mathematicians, logicians, researchers, hackers. Things had started to change a little bit. The internet was taking networking technology out of the university and placing it in people's homes. Some computer enthusiasts were becoming aware of encryption, but tended to use tools relying on outdated algorithms that were easily broken. After all, who was watching?

Phil Zimmerman
The Clipper Chip made the public aware that the United States was watching; they wanted to see everything, monitor everyone. As one military official would later describe this totalitarian data lust: "Let’s collect the whole haystack. Collect it all, tag it, store it. . . . And whatever it is you want, you go searching for it." Lots of people were uncomfortable with this idea. A domestic market for encryption was born. But to meet demand, the encryption used would have to not suck. It didn't necessarily have to use the absolute best, military-grade algorithms available, but it did have to be tough enough to confound government decryption efforts and make snooping unattractive. The encryption would have to be Pretty Good.

Phil's program became widely popular; it quickly dominated this new domestic encryption market. However, there were already encryption companies in the US. Unlike Phil's company, these companies sold encryption only to the US government and government contractors. Because of this business model, their interests were closely aligned with the government. They really didn't like the idea of some average Joe giving encryption that was as strong as or stronger than their own to anybody who asked for it. One company that really didn't like Phil was RSA.

To fast forward for just a minute, RSA was in the news very recently. RSA is still around today. Things have changed, of course. Today encryption is widely used throughout the internet, by everyone. Just by doing a Google search you use encryption. RSA has adapted to this new world; they now sell encryption products to companies in addition to the US government. They even sell encryption to people outside of the United States (a particularly eye-rolling development, as we will understand in a minute). Despite these changes RSA has never forgotten where they came from. They still do business with the US government. And when the government asks them nicely, RSA will do things for the government that endanger all of their commercial relationships. A recent expose uncovered that RSA had received a secret payment from the US intelligence community of $10 million. In return, RSA used a flawed random number generator (the Dual_EC_DRBG algorithm) in the encryption software that they sell to companies. It's a clever flaw - you would have to look very closely at RSA's software, and know a lot about programming and encryption, in order to catch it. None of RSA's customers caught the backdoor. It hadn't occurred to anyone to look. People trusted RSA. Using the flaw, the US intelligence community, and RSA, could decrypt things that had been encrypted with the product. RSA and the US government are very close.

Let's go back in time again and pick up where we left off. We are back in 1995. RSA knows about Phil Zimmerman and his PGP program, and they don't like Phil. In its early versions, PGP used an RSA algorithm to protect session keys and create digital signatures. RSA was horrified that their technology would help lead to the distribution of military grade encryption "for the masses" (Phil liked to use that phrase in his press releases and marketing). RSA quickly claimed that Zimmerman was breaking RSA licensing rules.

But a licensing dispute wasn't enough to make PGP go away. And it wasn't just RSA that didn't like Phil - the US government was increasingly distressed by Phil's popularity. The entire executive branch was plugging the Clipper Chip, explaining diligently how police investigators were at a disadvantage. Technology had rapidly outpaced the law - there were processes in place to deal with phone wiretaps, rules forcing phone companies to help, case law. But what if crooks were using email? What if they used PGP? Terrorists could be using PGP to hide their plots. They could be selling PGP technology to Saddam Hussein or the Ayatollah. And don't even get them started about the pornography. Phil was interfering with this full court press lobbying effort by telling people that the government's proposed rules would let them read everybody's messages and that they could protect their privacy using cheap and simple encryption tools.

RSA increasingly began to panic. Would the White House blame RSA if Phil killed the Clipper Chip? Losing a few contracts to a competitor was one thing - Phil was threatening the whole business model, and he was using RSA to do it.

He had to be stopped.

Remember at the beginning of this article, how I said that in 1993, encryption was a weapon? Like the war on drugs and the war on terrorism, this metaphor was treated literally in legislation. Washington claimed that encryption technology was protected under the United States Arms Export Control Act. Encryption had long been at the center of armed conflict - the cracking of the German Enigma code by Alan Turing during World War II is widely believed to have been as important to winning the war as any specific weapon, if not more so. Throughout the Cold War, Warsaw Pact and NATO intelligence services assigned some of their brightest minds to code breaking to get a glimpse into the other empire's government. Now, in the 90's, there was the Middle East to think about. Saddam Hussein could have been using encryption to hide his attempts at building weapons of mass destruction; Russians could be using encryption to sell off military assets to third world countries. In the post-Soviet world, the US was the last superpower left standing, and to find its next enemy it needed to be able to sniff through the mail.

To get rid of Zimmerman, RSA and the government would have to portray him not as a privacy advocate for US citizens, but as a shadowy double agent, looking to take valuable American military secrets and sell them to the highest foreign bidder.

RSA had been watching Phil closely, and they believed they had evidence that the Department of Justice could use to indict him. The PGP website allowed visitors to download the PGP software from anywhere. There were warnings on the page informing downloaders that they would be breaking the law by downloading PGP from outside of the United States, but that was it. An Iraqi spy only had to click a box to get the 128-bit goods? This was too dangerous to continue. Zimmerman was a terrorist.

RSA took their findings to the Department of Justice (DOJ), who promptly began an investigation, looking to indict Phil under the Arms Export Control Act (AECA).

To outsiders, it looked like a fairly open-and-shut case. Privacy advocates, security experts and constitutional lawyers might have viewed the investigation as the opening aria to a miscarriage of justice, but it appeared unstoppable.

People outside of the US had in fact downloaded PGP. At the time, the AECA mandated that encryption be limited to flimsy 40-bit keys in order to allow international transfer. PGP's weakest keys were 128 bits. At times, Zimmerman appeared to thumb his nose at prosecutors. He wrote a book about PGP, and his publisher distributed the book internationally. The book contained the entire source code to PGP. By tearing off the covers, typing the text on the pages into a computer and compiling the resulting file, anyone with the book could have a working copy of PGP. The book sold for $60: a lot to ask for a book, but a bargain for cutting-edge encryption software.

The press loved Phil. Zimmerman and PGP were featured prominently in publications ranging from technical journals, to consumer electronic porn like Wired, to the Washington Post. The investigation of Zimmerman continued for years. Washington clearly hated the idea of taking on Phil with his profile this high. No one was buying the Phil-as-spy narrative. The public saw Phil as an idealistic computer nerd; a story they had become used to during the Dot Com boom. People like Phil were enabling the public to do amazing things and enriching the economy to heights unheard of for generations. The nation had a budget surplus for the first time that anyone could remember. It became increasingly clear: imprisoning Phil would risk transforming him from idealistic nerd to a human rights martyr. Clearly, Washington didn't want to play de Klerk to Phil's Mandela. And that was the best-case scenario. What if they lost their case? The investigation had dragged on for three years. It was now 1996: an election year. Phil had bipartisan support. Liberals wanted to use encryption to protect dissidents in third world countries. Conservatives wanted the government to stop trying to bankrupt profitable tech companies with decades-old regulation. After a three year investigation, DOJ walked away from Zimmerman without filing any charges.

That didn't stop Washington from going after others working with encryption who were not media darlings. In 1995 Daniel J. Bernstein was forced to fight the government in court over rules that criminalized the publication of an academic paper related to his encryption program Snuffle, which he had written while studying at Berkeley. The next year Peter Junger, a professor at Case Western Reserve University, went to court over licensing rules that covered his university course on computer law, which included class materials on encryption regulation. Five years before the Patriot Act, mere discussion of the law had become a crime.

Junger initially lost in the Northern District of Ohio (Junger v. Daley, 8 F. Supp. 2d 708); the court's Judge Gwin ruled that software is not expression because software is "inherently functional" and a "device". Fortunately, Junger successfully sought relief from the Appellate Court in the Sixth Circuit, which agreed with Junger that his class was speech protected by the First Amendment, and not a weapon (Junger v. Daley, 209 F.3d 481). This case is vitally important to the recent developments we will discuss shortly, because the regulations that were used against Junger were not part of the Arms Export Control Act that was the basis for the complaint against Zimmerman. With Junger, it was the Department of Commerce that demanded he apply for permission before discussing the law with his students over the Internet - a license under the Department of Commerce's "Export Administration Regulations" (EAR), which by then had absorbed encryption export controls from the State Department's "International Traffic in Arms Regulations" (ITAR). As Peter Junger and his attorneys explained the rules in a 1997 press release: "Under the EAR [...] one is permitted to export such software in books and other ``hard copy'', but is still required to obtain a license before publishing the same software on the Internet or the World Wide Web or in other electronic form." Write a book about the law, and it is protected speech. Take that book and post it on a website, and the book becomes a weapon.

This brings us to today. Over the last 17 years (1997-2014), encryption has changed from a weapon of mass destruction into a fundamental internet protocol. Netscape's SSL was updated to version 3, then superseded by TLS. Hash functions are now a basic component of operating systems distributed to every individual with a computer. Encrypted storage is a cross-industry recommended best practice when dealing with customer information as simple as a name, phone number and address.

Today, the controversy comes when a company does not use encryption. Even more surprisingly, government regulations for a variety of industries, such as HIPAA and Sarbanes-Oxley, now compel companies to use encryption as part of their operations. Every reputable E-Commerce transaction uses encryption. Without encryption, it's doubtful there would even be such a thing as "E-Commerce".

These regulations apply to large multi-national corporations doing business in the United States. For example, it is taken for granted that a large bank will have foreign customers. And yet, the government requires that large bank to protect all of their customers using encryption.

Such customers must have a basic understanding of encryption technology in order to rely on encrypted services; alternatively, they must purchase products from people with such an understanding to assist them with these tasks. So, for example, let's consider a Canadian citizen who works in upstate New York. She commutes while living right across the border in Canada. In order to get paid, this Canadian citizen has an American bank account. When she is at home, she checks her bank account balance using the bank's website.

Our Canadian friend is not very technical, but like most folks today she is familiar with life online. She has a social media page, uses search engines and email. When she checks her bank account online, she barely notices the little green lock icon in the address bar of her browser - a browser she downloaded from the website of an American company based in Silicon Valley.

If we consider this for a moment, what has happened here is that two American companies have exported encryption technology to our Canadian friend: her bank and her browser. If she used a search engine to remember the URL of her bank, and if like most search engines that search engine uses a TLS connection by default, a third company enters the conspiracy. Each of these companies exported to a foreigner encryption software that is exponentially more powerful than the PGP of 1993 - today's keys are usually between 1024 and 4096 bits. When Zimmerman was investigated the limit was 40 bits, and PGP's default was 128 bits - 512 bits was the really strong stuff. Today, 1024 bits is considered weak.

The regulations have changed to accommodate the new reality. The Department of Commerce (DoC) now maintains a black list - a list of individuals, corporations, governments and entities that no technology company can provide encryption tools to without facing consequences. DoC refers to its ominous blacklist as the "BIS List" - BIS being the department within DoC that handles the list, the Bureau of Industry and Security.

Within the BIS List are a number of more specific lists. There is the Entity List, the Denied Persons List and the Unverified List. And that's just the DoC. Other federal bureaucracies like the Department of State and the Department of the Treasury maintain their own separate black lists of parties to which American firms may not provide encryption tools. Helpfully, Washington posts this "Consolidated Screening List" on a website where you can download the whole business in a CSV. I have my own copies of these documents for anyone who would like to review them.
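If you want to automate checks against that CSV, a few lines of Python will do it. This is a minimal sketch under stated assumptions: the file path is wherever you saved the download, and the "name" column header is a guess - verify it against the header row of your copy, since the government's column layout changes over time.

```python
import csv

def screen(csv_path, query):
    """Return rows of a Consolidated Screening List CSV whose 'name'
    column contains the query, case-insensitively. The 'name' header
    is an assumption - check it against your copy of the file."""
    query = query.lower()
    with open(csv_path, newline="", encoding="utf-8") as f:
        return [row for row in csv.DictReader(f)
                if query in row.get("name", "").lower()]
```

Anyone doing real export screening should treat a script like this as a first pass only - the lists are full of inconsistent spellings and aliases, which is exactly why the official site offers its own fuzzy search.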

It is unclear what lands someone on one of these lists. DoC states the following on their website:

"[...]the Entity List in February 1997 as part of its efforts to inform the public of entities who have engaged in activities that could result in an increased risk of the diversion of exported, reexported and transferred (in-country) items to weapons of mass destruction (WMD) programs. Since its initial publication, grounds for inclusion on the Entity List have expanded to activities sanctioned by the State Department and activities contrary to U.S. national security and/or foreign policy interests."

So originally this was explained as a WMD anti-proliferation measure. The BIS List kept companies from selling aluminum tubes and suspiciously-colored cakes; sounds quite prim and proper, frankly. And yet, in the very next sentence DoC dismisses the WMD rationale, expanding its mandate to hassle anyone involved in "activities contrary to U.S. national security and/or foreign policy interests." Does this mean Pizza Hut needs to apply for a license to deliver to Michael Moore's house?

It's been unclear what these rules mean for firms dealing in encryption, because the rules have remained firmly outside of the public eye - until this month (October, 2014). This month the Department of Commerce's Bureau of Industry and Security sent out a press release in which DoC bragged of how it shook down Intel (through Intel's subsidiary Wind River Systems) for $750K. Intel has been a pillar of US IT infrastructure and development for decades; the Federal Government does billions of dollars in business with both Intel and Intel's partners. The specific allegations were stranger than the target of the shake-down. DoC claimed that between 2008 and 2011 Intel had provided encryption tools to "governments and various end users" in China, Hong Kong, Russia, Israel, South Africa, and South Korea. It's a bizarre list of countries with which to form a basis of export allegations. China, though consistently unpopular politically, is on the short list of top US trade partners. Russia, while spending less than China in US markets, perhaps, is still an official US ally and trade partner. Israel and South Korea are two of the closest allies of the US in their respective regions. Hong Kong, while the odd man out in a few ways, is certainly not an enemy of the US, and US firms spend huge amounts in Hong Kong markets. There is no official embargo against any of these countries. The United States government approved transfers of sensitive military technology to China during the Clinton administration, around the same time that it was crucifying mathematics professors and students for violating weapons export laws. The US stationed its own nuclear weapons in South Korea for decades. The National Security Agency has given Israel raw, uncensored data from its massive domestic spying program. Washington huffs and puffs at Russia over its cruel adventures in Ukraine, and snipes behind the back of the Chinese for human rights abuses. Never in my lifetime has armed conflict between these nations and the US ever been even a remote possibility.

It remains unclear why the government pursued Intel for behavior that is practiced so widely by so many US firms, but the DoC's approach this month has obvious parallels to its behavior in the 90's during its initial campaign to limit the distribution of encryption technology. Following the Snowden leaks, the reality of pervasive domestic spying has changed from tin-foil-hat conspiracy theory to unassailable fact. As with the Clipper Chip, demand for privacy is increasing. There are calls to push back against domestic surveillance through legislation, and through the more direct action of using more advanced and easier-to-use encryption. The latter scenario - a world where domestic surveillance is rendered useless through the widespread use of encryption - is much more terrifying to Washington than Phil Zimmerman ever was. FBI Director James Comey has gone on a bit of a press junket, claiming that companies as servile in their relationship with Washington as Apple and Google have "gone too far" by setting basic encryption measures as the default for even their least savvy users. Federal law enforcement is once again pushing a Clipper Chip to monitor digital communications before they are encrypted in transit. This time, the regulation is being pushed under a framework called CALEA - a requirement that internet service providers install so-called "lawful intercept" capabilities that allow cops to snoop on their customers. Encryption must be bypassed in order to meet CALEA's lawful intercept requirement, argue Comey and his White House allies.

The situation remains in flux; it remains unclear how far the US government is willing to go in its desire to "collect it all". The domestic spying infrastructure it has constructed up to this point is indeed massive - much larger than any measures the Stasi or KGB had ever dreamed of - yet it was constructed in absolute secret. Though network industry insiders were aware of what was happening as early as Total Information Awareness (TIA) and AT&T's secret room 641A, those outside of the industry dismissed claims of domestic spying as paranoid conspiracy theories. What did AT&T core network engineers like Mark Klein know about communications, anyway? The public has seen Law & Order - the system works when the Good Guys (Police) whack the Bad Guy Who Obviously Did It (Detainee) with a phone book and he confesses. If anything, the system is broken because too many Bad Guys Who Obviously Did It "get off" because of "activist judges" and their "technicalities". The system isn't broken because a top-secret surveillance industry of some 2 million people monitors every electronic communication on the planet, using its snooping as justification to torture people for being born in the wrong country, or to assassinate an American child for having the wrong father, or as a reason to bomb a wedding (or eight weddings), or maybe just to shoot a couple of pregnant women and then cut the bullets out of their bodies with a knife while their families watched so that we could tell reporters that the Taliban did it.

Secrecy and disinformation allowed the construction of a global infrastructure to support pervasive surveillance, torture and assassination. Much will depend on Washington's desire and ability to continue building that system in the light of day.

UPDATE: Comey isn't the only top executive branch official calling for the expansion of lawful intercept interpretation. White House cybersecurity czar Michael Daniel, an official whose asinine and self-contradictory job title calls for a sacking, told the Christian Science Monitor that he also wants to peek a little further into your laptop, tablet and cell phone: "We don't want to have something that puts it utterly beyond the reach of law enforcement in the appropriate circumstances." By 'appropriate circumstances', presumably Daniel means when information exists on a computer.

Saturday, November 17, 2012

Decrypting Data That Has Been Encrypted by ASP.NET

A colleague of mine let me know about an easy way to use .NET's configuration decryption mechanism from the command line. From the directory of the framework version, issue the following command, replacing the configuration section name and the application's physical path where appropriate:

C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727>aspnet_regiis -pdf "sectionName" D:\path\
Decrypting configuration section...
Succeeded!

Neat!

Wednesday, September 19, 2012

More Fun With PCI

I received a notification from a large security auditing firm stating that, of the ciphers currently available, only RC4 ciphers will be considered PCI compliant.

My assumption based on the notification is that this move is intended as a rejection of CBC (Cipher Block Chaining). Well, that's fine as far as I am concerned. CBC has some serious issues as implemented in SSL v3 / TLS v1.0. In a nutshell, an attacker who can monitor the session and inject chosen plaintext can exploit the predictable initialization vectors of these block cipher modes to narrow down ranges of possible data in SSLv3 and achieve partial payload decryption in TLS. So-called "stream" ciphers like RC4 are immune to this particular attack vector. You don't get private keys from the attack, it's by no means a fast attack (a minimum of three hours), and you need access to monitor the session. Further, patches for CBC exist to mitigate the exploit (for example, the NSS libraries used by Mozilla have been patched).
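To make the CBC problem concrete, here is a toy sketch of the principle behind the attack. The "block cipher" below is a hash-based stand-in (an assumption for brevity - it is not AES and cannot decrypt), but the chaining matches SSLv3/TLS 1.0: the IV of the next record is the last ciphertext block of the previous one, which the attacker can see. That lets an attacker who can inject plaintext test a guess about a previously encrypted block:

```python
import hashlib

def E(key, block):
    """Toy 16-byte pseudorandom function standing in for a block cipher."""
    return hashlib.sha256(key + block).digest()[:16]

def cbc_encrypt(key, iv, blocks):
    out, prev = [], iv
    for b in blocks:
        prev = E(key, bytes(x ^ y for x, y in zip(b, prev)))  # chain, then encrypt
        out.append(prev)
    return out

key, iv0 = b"server-secret", bytes(16)
secret = b"PIN=1234;pad=pad"                 # one 16-byte block to steal
c = cbc_encrypt(key, iv0, [secret])

# TLS 1.0 reused the last ciphertext block as the next IV. To test a guess G,
# the attacker injects G xor next_iv xor old_iv and compares ciphertext blocks.
next_iv = c[0]
guess = b"PIN=1234;pad=pad"
probe = bytes(g ^ a ^ b for g, a, b in zip(guess, next_iv, iv0))
print(cbc_encrypt(key, next_iv, [probe])[0] == c[0])   # True -> guess confirmed
```

A wrong guess produces a different ciphertext block, so the attacker can brute-force low-entropy plaintext (cookies, PINs) one block at a time - or, in the real BEAST, one byte at a time. TLS 1.1 fixed this by giving every record a fresh random IV.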


I will save debunking the man-in-the-middle hysteria for a later post. What frustrates me about the requirement of RC4 stream ciphers for PCI compliance is not that CBC ciphers are no good - they are weak - it is the notion that RC4 is somehow sufficient. Some points to consider:


-RC4 exploit using SSH with null password prevention enabled

-RC4 is frequently implemented poorly within applications other than sshd, for example by using poor to no random number generation

-Successful attack vectors exist, but they have yet to be put into a helpful graphical interface for use by your neighborhood teenager (as the BEAST framework did for CBC). Paul and Maitra published on RC4 key reconstruction techniques in 2007 (Permutation after RC4 Key Scheduling Reveals the Secret Key, SAC 2007), based on keystream byte assignment biases first published by Roos in 1995. This means that, unlike with CBC, there is a published algorithmic approach to full key recovery for RC4.
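As a small illustration of how leaky RC4's keystream is, the sketch below measures the Mantin-Shamir second-byte bias - a relative of the Roos-style biases cited above, though far simpler than the Paul-Maitra key reconstruction itself. Against an ideal stream cipher the second output byte would be 0 with probability 1/256; RC4 roughly doubles that:

```python
import random

def rc4_keystream(key, n):
    """Standard RC4: key scheduling (KSA) followed by n keystream bytes (PRGA)."""
    S = list(range(256))
    j = 0
    for i in range(256):                              # KSA
        j = (j + S[i] + key[i % len(key)]) & 0xFF
        S[i], S[j] = S[j], S[i]
    out, i, j = [], 0, 0
    for _ in range(n):                                # PRGA
        i = (i + 1) & 0xFF
        j = (j + S[i]) & 0xFF
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) & 0xFF])
    return out

# Count how often the second keystream byte is 0 across many random 128-bit keys.
random.seed(1)
trials = 20000
zeros = sum(rc4_keystream([random.randrange(256) for _ in range(16)], 2)[1] == 0
            for _ in range(trials))
print(zeros / trials)   # close to 2/256 (~0.0078), not the ideal 1/256 (~0.0039)
```

Biases of this family are exactly why "just switch everything to RC4" is not a safe recommendation, whatever the auditors say.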


It can't be stressed enough that all of these vectors assume an attacker on your wireless or local network. If that is the case, then SSL is the very least of your problems. While theoretical dismissal of this or that cipher based on penetration ability is sound and valuable, the PCI standard suggests a holistic approach to security. The various levels of PCI-DSS compliance amount to an admission that the goals of securing a system will differ based on the purpose of that system. Banning the use of CBC will not get fewer sites hacked, but it will keep administrators preoccupied with yet more busy work, switching from one cipher with published flaws to another cipher with published flaws.

Sunday, April 29, 2012

Phil Zimmerman's Latest Project

Phil Zimmerman of PGP encryption fame is launching a new project, Silent Circle - an application suite complete with encrypted VOIP, email and IM. Exciting stuff! Let's hope it works out better than Hushmail!

Wednesday, April 18, 2012

Random Number Generation

Latest Update from Basement Dweller News:

A great primer on random number generation from a few smart cookies at Intel, by way of IEEE:
http://spectrum.ieee.org/computing/hardware/behind-intels-new-randomnumber-generator/0

On a very related note, let's keep our eyes on systemic issues with encryption keys in the wild:
http://eprint.iacr.org/2012/064.pdf

I have yet to formalize an opinion as to the validity of any systemic key issues intrinsic to RSA (because I was a "D" math student, I have to wait for the grown-ups to weigh in on these Deep Thoughts). I would like to see larger keys standardized, and I don't see any good reason not to.

A compelling critique of the survey, urging for additional data before judgment is reached:
http://dankaminsky.com/2012/02/14/ronwhit/
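For the curious, the failure mode that survey describes - distinct RSA moduli in the wild sharing a prime factor, typically thanks to bad randomness at key-generation time - takes only a few lines to demonstrate. The primes below are toy values (an assumption for readability; real keys use primes of 512 bits and up), but Euclid's algorithm is just as cheap on real-sized moduli:

```python
from math import gcd

# Two parties generate RSA keys with a bad RNG and end up sharing a prime.
p, q1, q2 = 10007, 10009, 10037      # toy primes, nowhere near realistic sizes
n1, n2 = p * q1, p * q2              # the two public moduli

shared = gcd(n1, n2)                 # Euclid: no factoring required at all
print(shared == p)                   # True - the common prime falls right out
print(n1 // shared == q1 and n2 // shared == q2)   # both keys fully factored
```

This is why the surveys matter: an attacker doesn't need to factor anything, only to compute pairwise GCDs across a large pile of harvested public keys.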
