From: Bill Kelly
Date: 2014-01-23T20:28:56-08:00
Subject: [ruby-core:60047] Re: [ruby-trunk - Bug #9424] ruby 1.9 & 2.x has insecure SSL/TLS client defaults

shyouhei@ruby-lang.org wrote:
> B Kelly wrote:
>>
>> I think we're talking at cross-purposes. Your arguments focus on what
>> would be ideal: an upstream patch by OpenSSL. I think nobody disagrees
>> that would be ideal, and presumably most among us are familiar with the
>> downsides of maintaining downstream patches.
>
> Then how can it be legitimate for you to blame Debian people?
> I don't wanna be raped like them.

Interesting. I feel I must be communicating unclearly.

I'm not someone who blamed Debian. (It's my preferred Linux distro.)

Indeed, the Debian maintainer who removed the lines of code affecting the
OpenSSL PRNG first posted on the OpenSSL mailing list, explaining his
situation and asking whether it was OK to remove the code.

As I wrote in an earlier post, I find the details of what transpired in the
Debian/OpenSSL blunder interesting. In particular, the details show how
difficult it is to point a finger at any specific person or part of the
process in the Debian/OpenSSL situation. Mistakes were made, and yet the
actions taken at each discrete step in the process seemed fairly
reasonable.

And in that /particular/ sense, I recognize the parallels being drawn to
the debate here about hardening the OpenSSL defaults for Ruby.

My position has simply been that I regard the following scenarios as
categorically distinct:

1. "I don't know what these lines of code in OpenSSL do, but Valgrind
   complains. Is it OK if I remove them?"

2. "SSLv2, TLS compression, and certain specific ciphers are regarded by
   the security community as weak or exploitable. Is it reasonable and
   beneficial to Ruby users if we exclude them from our defaults?"

To me, there appears to be a vast distance between #1 and #2.
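For what it's worth, the kind of hardening described in #2 can be sketched
in a few lines of Ruby today. This is only an illustration of the idea, not
the proposed patch; which OP_NO_* constants are actually defined depends on
the OpenSSL build Ruby is linked against, and the cipher string here is one
plausible choice, not a vetted default:

```ruby
require 'openssl'

ctx = OpenSSL::SSL::SSLContext.new

# Refuse the protocol versions regarded as broken.
# (OP_NO_SSLv2 may be a no-op on OpenSSL builds that removed SSLv2 entirely.)
ctx.options |= OpenSSL::SSL::OP_NO_SSLv2 | OpenSSL::SSL::OP_NO_SSLv3

# Disable TLS compression (CRIME mitigation), where the constant exists.
if defined?(OpenSSL::SSL::OP_NO_COMPRESSION)
  ctx.options |= OpenSSL::SSL::OP_NO_COMPRESSION
end

# Exclude cipher suites widely regarded as weak or exportable.
# This particular string is an example, not an endorsed default.
ctx.ciphers = "HIGH:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5"

# Verify the server certificate instead of the insecure VERIFY_NONE.
ctx.verify_mode = OpenSSL::SSL::VERIFY_PEER
```

The question in this thread is simply whether an SSLContext should come out
of the box looking roughly like this, rather than requiring each
application to remember to do it.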
My recent posts on this thread have been in part an attempt to understand
the opposing view by eliciting responses from those who disagree.

Regards,

Bill