Two seemingly equivalent functions giving different results

I have two functions that check whether an array is sorted. The first function seems to be accurate most of the time but not always; the second one seems to work the way it’s supposed to 100% of the time. The conditional statements are the same, so what is causing this difference in behavior?

function test(arr) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i + 1] - arr[i] < 0) {
      return false
    } else {
      return true
    }
  }
}

console.log(test([2, 20, 1]))

function test2(arr) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i + 1] - arr[i] < 0) return false;
  }
  return true;
}

console.log(test2([2, 20, 1]))

> Solution:

The issue is that the first function returns a value on the very first iteration of the loop. On that first pass, it evaluates:

if (arr[i + 1] - arr[i] < 0) {
  return false
} else {
  return true
}

Whether or not the condition is met, this code returns something on the very first iteration. Remember that when a function returns, it exits immediately, so no further elements are ever compared. The first function is therefore only comparing the first two elements of the array.

In the second function, each pass through the loop returns false ONLY if the condition is met (i.e. a pair of elements is out of order); otherwise it simply continues to the next pair. It returns true at the very end, AFTER the entire loop finishes without finding any out-of-order pair.
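For reference, here is one way the first function could be rewritten in that early-return-on-failure style (a sketch; `isSorted` is an illustrative name). Note that looping only up to `arr.length - 1` also avoids comparing against `undefined` on the last iteration, which both original functions quietly do (it happens to be harmless there because `undefined - arr[i]` is `NaN`, and `NaN < 0` is false):

```javascript
function isSorted(arr) {
  // Stop at the second-to-last index so arr[i + 1] is always defined.
  for (let i = 0; i < arr.length - 1; i++) {
    // Return early only on failure; otherwise keep checking.
    if (arr[i + 1] < arr[i]) return false;
  }
  // Only after every adjacent pair has been checked can we say it is sorted.
  return true;
}

console.log(isSorted([2, 20, 1])); // false
console.log(isSorted([1, 2, 20])); // true
```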
