See the comments. What does the `mut self_encoded` mean here?
pub trait DecodeLength {
    fn len(self_encoded: &[u8]) -> Result<usize, Error>;
}
// This compiles.
impl DecodeLength for i32 {
    // here
    fn len(mut self_encoded: &[u8]) -> Result<usize, Error> {
        usize::try_from(u32::from(Compact::<u32>::decode(&mut self_encoded)?))
            .map_err(|_| "Failed convert decoded size into usize.".into())
    }
}
// This does not compile, because the signature of this `len` is incorrect.
// impl DecodeLength for i32 {
//     fn len(self_encoded: &mut [u8]) -> Result<usize, Error> {
//         Ok(2)
//     }
// }
>Solution :
You must remember that when you change your fn like this

fn len(mut self_encoded: &[u8]) -> usize {
    2
}

you did not change the actual input: the input is still `&[u8]`. You only told the compiler that the value of the input variable (the binding) can be changed, like this:
impl DecodeLength for i32 {
    // here
    fn len(mut self_encoded: &[u8]) -> usize {
        self_encoded = b"123"; // OK: rebinding the variable to a new &[u8]
        2
    }
}
But when you change the input type from `&[u8]` to `&mut [u8]`, you have chosen a new type for the input. Now the compiler gives you an error and says "expected `&[u8]` but found `&mut [u8]`".
// Error: expected fn pointer `fn(&[u8]) -> _`, found fn pointer `fn(&mut [u8]) -> _`
impl DecodeLength for i32 {
    fn len(self_encoded: &mut [u8]) -> usize {
        2
    }
}
Remember: `&[u8]` and `&mut [u8]` are different types with different uses.
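A minimal sketch of that difference (the function names `rebind` and `mutate` are mine, not from the original code): a `mut` binding to a `&[u8]` lets you repoint the binding, while a `&mut [u8]` lets you write through to the bytes themselves.

```rust
// A mutable binding to a shared slice: we can repoint the binding,
// but not mutate the bytes it points at.
fn rebind(mut s: &[u8]) -> &[u8] {
    s = &s[1..]; // OK: reassigning the binding itself
    s
}

// A mutable slice: we can mutate the bytes, but (without `mut s`)
// not repoint the binding.
fn mutate(s: &mut [u8]) {
    s[0] = 9; // OK: writing through the mutable reference
}

fn main() {
    let data = [1u8, 2, 3];
    assert_eq!(rebind(&data), &[2, 3]);

    let mut buf = [1u8, 2, 3];
    mutate(&mut buf);
    assert_eq!(buf, [9, 2, 3]);
}
```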
[Edit]
You can't use `?` in a function that doesn't return a `Result` (or `Option`); `?` needs a compatible return type to propagate the error into.
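A quick illustration of that rule, using a hypothetical `parse_ok` helper built only on the standard library:

```rust
use std::num::ParseIntError;

// OK: the function returns Result, so `?` can propagate the error.
fn parse_ok(s: &str) -> Result<usize, ParseIntError> {
    let n = s.parse::<usize>()?; // `?` returns early on Err
    Ok(n + 1)
}

// This would NOT compile, because `?` has nowhere to send the error:
// fn parse_bad(s: &str) -> usize {
//     s.parse::<usize>()? // error[E0277]: `?` used in a function that returns `usize`
// }

fn main() {
    assert_eq!(parse_ok("41"), Ok(42));
    assert!(parse_ok("oops").is_err());
}
```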
[Edit 2]
Look at the following code:
impl DecodeLength for i32 {
    // here
    fn len(mut self_encoded: &[u8]) -> usize {
        let mut_ref = &mut self_encoded; // this gives you `&mut &[u8]`
        let mut_ref2 = self_encoded.as_mut(); // Error: `as_mut` is only valid for `&mut` references
        1
    }
}
You can't turn a `&[u8]` into a `&mut [u8]`; that will give you an error. Taking `&mut self_encoded` only gives you a `&mut &[u8]`, a mutable reference to the binding, not mutable access to the underlying bytes.
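That `&mut &[u8]` shape is exactly what a call like `decode(&mut self_encoded)` relies on: the callee advances the caller's slice by reassigning it through the mutable reference. A self-contained sketch with a hypothetical `read_u8` reader (not the real codec API):

```rust
// Hypothetical reader: takes `&mut &[u8]` so it can consume bytes
// by repointing the caller's slice past what it has read.
fn read_u8(input: &mut &[u8]) -> Result<u8, &'static str> {
    let (&first, rest) = input.split_first().ok_or("input exhausted")?;
    *input = rest; // advance the caller's slice
    Ok(first)
}

fn main() {
    // `mut` on the binding is required so we can take `&mut` to it,
    // just like `mut self_encoded` in the trait impl above.
    let mut bytes: &[u8] = &[10, 20, 30];
    assert_eq!(read_u8(&mut bytes), Ok(10));
    assert_eq!(read_u8(&mut bytes), Ok(20));
    assert_eq!(bytes, &[30]); // one byte left
}
```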
The simple way is to change your `DecodeLength` trait like this:

pub trait DecodeLength {
    fn len(self_encoded: &mut [u8]) -> Result<usize, Error>;
}
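A self-contained sketch of that modified trait, with `String` standing in for the codec crate's `Error` type so it runs on its own; the body here is a stub, not the real decoding logic:

```rust
// Stand-in for the codec crate's Error type (assumption for this sketch).
type Error = String;

pub trait DecodeLength {
    fn len(self_encoded: &mut [u8]) -> Result<usize, Error>;
}

impl DecodeLength for i32 {
    fn len(self_encoded: &mut [u8]) -> Result<usize, Error> {
        // With `&mut [u8]` the bytes themselves are writable:
        if let Some(first) = self_encoded.first_mut() {
            *first = 0;
        }
        Ok(self_encoded.len())
    }
}

fn main() {
    let mut buf = [1u8, 2, 3];
    assert_eq!(<i32 as DecodeLength>::len(&mut buf), Ok(3));
    assert_eq!(buf, [0, 2, 3]); // first byte was overwritten
}
```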