
Why `null >= 0 && null <= 0` but not `null == 0`?

Posted by: admin November 30, 2017


I had to write a routine that increments the value of a variable by 1 if its type is number and assigns 0 to the variable if not, where the variable is initially null or undefined.

The first implementation was v >= 0 ? v += 1 : v = 0 because I thought anything not a number would make an arithmetic expression false, but it was wrong since null >= 0 is evaluated to true. Then I learned null behaves like 0 and the following expressions are all evaluated to true.

  • null >= 0 && null <= 0
  • !(null < 0 || null > 0)
  • null + 1 === 1
  • 1 / null === Infinity
  • Math.pow(42, null) === 1

Of course, null is not 0. null == 0 is evaluated to false. This makes the seemingly tautological expression (v >= 0 && v <= 0) === (v == 0) false.

Why is null like 0, although it is not actually 0?
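For reference, here is a sketch of the routine described above that avoids the `null >= 0` surprise by checking the type explicitly rather than relying on an arithmetic comparison (the function name `incrementIfNumber` is made up for illustration):

```javascript
// Increment only when the value is actually a number; otherwise reset to 0.
// typeof null is "object" and typeof undefined is "undefined", so neither
// passes the check, unlike with the v >= 0 test.
function incrementIfNumber(v) {
  return typeof v === "number" ? v + 1 : 0;
}

incrementIfNumber(null);      // 0, not 1 — null is not a number
incrementIfNumber(undefined); // 0
incrementIfNumber(41);        // 42
```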


Your real question seems to be:


null >= 0; // true


null == 0; // false

What really happens is that the Greater-than-or-equal operator (>=) performs type coercion (ToPrimitive) with a hint type of Number; in fact, all the relational operators have this behavior.

null is treated in a special way by the Equals operator (==). In brief, it only compares equal to undefined (and to itself):

null == null; // true
null == undefined; // true

Values such as false, '', '0', and [] are subject to numeric type coercion; all of them coerce to zero.
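You can verify this difference directly: each of those values compares equal to 0 with ==, while null does not:

```javascript
// Values that == coerces to 0, unlike null:
console.log(false == 0); // true — ToNumber(false) is 0
console.log('' == 0);    // true — ToNumber('') is 0
console.log('0' == 0);   // true — ToNumber('0') is 0
console.log([] == 0);    // true — [] -> '' -> 0
console.log(null == 0);  // false — null only equals null and undefined
```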

You can see the inner details of this process in The Abstract Equality Comparison Algorithm and The Abstract Relational Comparison Algorithm.

In Summary:

  • Relational Comparison: if both values are not of type String, ToNumber is called on both. This is the same as putting a unary + in front, which coerces null to 0.

  • Equality Comparison: only calls ToNumber on Strings, Numbers, and Booleans.
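The two bullet points above can be seen side by side — the unary + shows what the relational comparison operates on, while the equality comparison never converts null at all:

```javascript
console.log(+null);     // 0 — what ToNumber produces for null
console.log(null >= 0); // true  — compares ToNumber(null), i.e. 0 >= 0
console.log(null == 0); // false — == does not call ToNumber on null
```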


I’d like to extend the question to further improve visibility of the problem:

null >= 0; //true
null <= 0; //true
null == 0; //false
null > 0;  //false
null < 0;  //false

It just makes no sense. Like human languages, these things need to be learned by heart.


JavaScript has both strict and type-converting comparisons

null >= 0; // true, but (null == 0) || (null > 0) is false
null <= 0; // true, but (null == 0) || (null < 0) is false
"" >= 0;   // also true

For relational abstract comparisons (<= , >=), the operands are first converted to primitives, then to the same type, before comparison.
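A quick way to see what that conversion yields is `Number()`. This also explains why `"" >= 0` and `null >= 0` are true while `undefined >= 0` is false — undefined converts to NaN, and every comparison with NaN is false:

```javascript
console.log(Number(null));      // 0
console.log(Number(''));        // 0
console.log(Number(undefined)); // NaN
console.log(undefined >= 0);    // false — NaN >= 0 is false
```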

typeof null returns "object"

When the type is object, JavaScript tries to convert the object to a primitive value;
the following steps are taken (ECMAScript 2015):

  1. If PreferredType was not passed, let hint be “default”.
  2. Else if PreferredType is hint String, let hint be “string”.
  3. Else PreferredType is hint Number, let hint be “number”.
  4. Let exoticToPrim be GetMethod(input, @@toPrimitive).
  5. ReturnIfAbrupt(exoticToPrim).
  6. If exoticToPrim is not undefined, then
    a) Let result be Call(exoticToPrim, input, «hint»).
    b) ReturnIfAbrupt(result).
    c) If Type(result) is not Object, return result.
    d) Throw a TypeError exception.
  7. If hint is “default”, let hint be “number”.
  8. Return OrdinaryToPrimitive(input,hint).

The allowed values for hint are “default”, “number”, and “string”. Date objects are unique among built-in ECMAScript objects in that they treat “default” as being equivalent to “string”.
All other built-in ECMAScript objects treat “default” as being equivalent to “number”. (ECMAScript 2015)
So I think null converts to 0.
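The hint mechanism in the steps above can be observed with a custom @@toPrimitive method (the object and its return values below are made up purely to show which hint each operation passes):

```javascript
const obj = {
  [Symbol.toPrimitive](hint) {
    if (hint === 'number') return 42;          // used by relational operators
    if (hint === 'string') return 'forty-two'; // used by string conversion
    return 'default';                          // used by + and ==
  }
};

console.log(obj >= 42); // true — relational comparison passes hint "number"
console.log(`${obj}`);  // "forty-two" — template literals pass hint "string"
console.log(obj + '');  // "default" — the + operator passes hint "default"
```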


I had the same problem!
Currently my only solution is to test the cases separately:

var a = null;
var b = undefined;

if (a === 0 || a > 0) { } // false — works as intended
if (b === 0 || b > 0) { } // false — works as intended

if (a >= 0) { } // true — the surprising case
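As an alternative to spelling out both cases, a typeof guard makes one test that null and undefined both fail (a sketch, not the only option — `Number.isFinite` works similarly):

```javascript
var a = null;
var b = undefined;

// typeof null is "object" and typeof undefined is "undefined",
// so neither value reaches the >= 0 comparison.
console.log(typeof a === 'number' && a >= 0); // false
console.log(typeof b === 'number' && b >= 0); // false
console.log(typeof 0 === 'number' && 0 >= 0); // true
```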