From: merch-redmine@...
Date: 2021-06-15T18:00:33+00:00
Subject: [ruby-core:104285] [Ruby master Bug#17951] Collisions in Proc#hash values for blocks defined at the same line

Issue #17951 has been updated by jeremyevans0 (Jeremy Evans).

I have submitted a pull request with @xtkoba's fix: https://github.com/ruby/ruby/pull/4574

----------------------------------------
Bug #17951: Collisions in Proc#hash values for blocks defined at the same line
https://bugs.ruby-lang.org/issues/17951#change-92501

* Author: decuplet (Nikita Shilnikov)
* Status: Open
* Priority: Normal
* ruby -v: ruby 3.0.1p64 (2021-04-05 revision 0fb782ee38) [x86_64-darwin20]
* Backport: 2.6: UNKNOWN, 2.7: UNKNOWN, 3.0: UNKNOWN
----------------------------------------
```ruby
require 'set'

def capture(&block)
  block
end

# Creates 1,000 blocks with identical bodies, all defined on the same line
blocks = Array.new(1000) { capture { :foo } }

hashes   = blocks.map(&:hash).uniq
ids      = blocks.map(&:object_id).uniq
equality = blocks.map { blocks[0].eql?(_1) }.tally
hash     = blocks.to_h { [_1, nil] }
set      = blocks.to_set

puts(hashes.size)       # => 11
puts(ids.size)          # => 1000
puts(equality.inspect)  # => {true=>1, false=>999}
puts(hash.size)         # => 1000
puts(set.size)          # => 1000
```

The script builds one thousand blocks and then compares them in various ways. I would expect proc objects to be completely opaque and therefore treated as separate, unequal objects. All of the checks except the first confirm this expectation. However, `Proc#hash` does not return 1000 distinct results; instead, the number of distinct hashes varies between 3 and 20 on my machine.

As I understand it, the current behavior does not violate Ruby's guarantees, but I would expect `Proc#hash` results to be as unique as `Proc#object_id`, or at least far more unique than they currently are. The problem likely occurs only for blocks defined on the same line.

Related issue: https://bugs.ruby-lang.org/issues/6048

-- 
https://bugs.ruby-lang.org/
Unsubscribe:
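For context, a minimal sketch of the contract at issue (the `capture` helper mirrors the one in the report; exact collision counts vary by Ruby version and machine, so no particular hash count is asserted). Ruby only requires that `eql?` objects share a hash; distinct objects are merely *allowed* to collide, which is what makes the heavy same-line collisions legal but costly:

```ruby
# Sketch of the eql?/hash contract for Proc objects (assumption:
# behavior as reported on Ruby 3.0.x; collision counts vary).
def capture(&block)
  block
end

blocks = Array.new(1000) { capture { :foo } }

# Each proc is a distinct live object, so object_ids never collide.
unique_ids = blocks.map(&:object_id).uniq.size

# No two distinct procs are eql? to each other (identity semantics).
any_equal = blocks.combination(2).any? { |a, b| a.eql?(b) }

# Hash collisions among non-eql? objects are legal, but every
# collision means extra probing when procs serve as Hash keys or
# Set members, which is the practical cost described in the report.
distinct_hashes = blocks.map(&:hash).uniq.size
puts "unique object_ids: #{unique_ids}"
puts "any pair eql?:     #{any_equal}"
puts "distinct hashes:   #{distinct_hashes} of 1000"
```

With the fix in the pull request above, the distinct-hash count should move much closer to 1000; before it, same-line blocks collapse into a handful of buckets.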