From: Bill Kelly
Date: 2014-01-23T02:17:01-08:00
Subject: [ruby-core:60017] Re: [ruby-trunk - Bug #9424] ruby 1.9 & 2.x has insecure SSL/TLS client defaults

Martin.Bosslet@gmail.com wrote:
>
> Rolling your own defaults is dangerous: Even skilled developers like the
> Debian developers can get it wrong sometimes, with disastrous consequences.

The Debian blunder has been referenced twice in this discussion, but I
think the comparison is not apt. The Debian maintainer _removed lines of
code_ from the OpenSSL PRNG implementation. [1]

This is hardly in the same category as tightening the defaults to
exclude specific ciphers or protocol features already known to be weak
or exploitable.

> It hurts even more that in such cases everyone will start pointing fingers,
> asking: "Why didn't you stick to the library defaults???"

As opposed to asking: "Why didn't you remove known weak ciphers and
exploitable protocol features from the defaults when you were warned
about them???"

> I would prefer a whitelisting approach instead of blacklisting as in the
> patch that was proposed. Blacklisting is never airtight, as it doesn't protect
> us from future shitty algorithms creeping in.

I wonder. In the blacklisting case, we're not required to make guesses
about the future. We're merely switching off already-known weak or
exploitable features.

Whitelisting goes a step further, gambling that what we know today about
the subset of defaults considered superior will continue to hold true
down the road.

It's not clear to me that's better than the more conservative step of
simply blacklisting specific defaults already known to be problematic.

Regards,

Bill

[1] The details are perhaps interesting: http://research.swtch.com/openssl
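
For illustration, the blacklisting approach described above can be
expressed with Ruby's OpenSSL bindings as an exclusion-based cipher
string. This is a sketch only, not the proposed patch; the particular
exclusions below are my own example of "already-known weak" families:

```ruby
require 'openssl'

ctx = OpenSSL::SSL::SSLContext.new

# Blacklisting: start from the library's DEFAULT cipher list and
# subtract families known to be weak or exploitable. (Illustrative
# exclusions, not the exact list from the proposed patch.)
ctx.ciphers = "DEFAULT:!aNULL:!eNULL:!EXPORT:!LOW:!RC4"

# Switch off the broken protocol versions explicitly as well.
ctx.options |= OpenSSL::SSL::OP_NO_SSLv2 | OpenSSL::SSL::OP_NO_SSLv3

# Whitelisting would instead enumerate an explicit allow-list, e.g.:
#   ctx.ciphers = "ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256"
# which bets that today's preferred subset stays preferred.

enabled = ctx.ciphers.map { |name, _version, _bits, _alg_bits| name }
```

Note that the blacklist version keeps tracking whatever OpenSSL adds to
DEFAULT in future releases, while the whitelist freezes the set at
whatever we enumerate today.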