Is chunking really fixed?
I was under the impression that with the changes in 4.0, the following should work:
```rust
#[test]
fn hash_chunking() {
    let c1: &[u8] = b"This hashing algorithm was extracted from the Rustc compiler.";
    let c2: &[u8] = b" This is the same hashing algorithm used for some internal operations in Firefox.";
    let c3: &[u8] = b" The strength of this algorithm is in hashing 8 bytes at a time on 64-bit platforms, where the FNV algorithm works on one byte at a time.";

    // Hash the input in three separate writes.
    let mut h1 = H::with_seeds(HASH_SEED_1, HASH_SEED_2, HASH_SEED_3, HASH_SEED_4);
    h1.write(c1);
    h1.write(c2);
    h1.write(c3);
    let hash1 = h1.finish();

    // Hash the same bytes in a single write.
    let mut c4 = Vec::<u8>::new();
    c4.extend_from_slice(c1);
    c4.extend_from_slice(c2);
    c4.extend_from_slice(c3);
    let mut h2 = H::with_seeds(HASH_SEED_1, HASH_SEED_2, HASH_SEED_3, HASH_SEED_4);
    h2.write(&c4);
    let hash2 = h2.finish();

    println!("hash1: {}, hash2: {}", hash1, hash2);
    assert_eq!(hash1, hash2);
}
```
But the final assertion fails: the chunked and single-write hashes differ. Is chunking still broken?
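For comparison, here is a minimal sketch of the same chunk-invariance check using the standard library's `DefaultHasher` (SipHash-based), which streams bytes through an internal buffer, so chunk boundaries in `write` calls should not affect the result. The helper `hash_chunks` and the test strings are my own, not from the library under discussion:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::Hasher;

// Feed each chunk to the hasher in sequence and return the final hash.
fn hash_chunks(chunks: &[&[u8]]) -> u64 {
    let mut h = DefaultHasher::new();
    for c in chunks {
        h.write(c);
    }
    h.finish()
}

fn main() {
    let chunked = hash_chunks(&[b"part one ", b"part two"]);
    let whole = hash_chunks(&[b"part one part two"]);
    // With std's streaming SipHash implementation these should match,
    // regardless of how the input is split across write() calls.
    assert_eq!(chunked, whole);
    println!("chunk-invariant hash: {}", chunked);
}
```

Note that the `Hasher::write` documentation does not guarantee this equivalence in general; whether a given hasher is chunk-invariant is a property of that implementation.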