JavaScript regex - null argument makes the regex match

I'm writing a regex to be used with JavaScript. When testing I came across some strange behavior and boiled it down to the following:

/^[a-z]/.test("abc"); // <-- returns true as expected
/^[a-z]/.test(null);  // <-- returns true, but why?

I was assuming that the last case would return false, since the value is null and thus does not start with a character in the range. So, can anyone explain to me why this is not the case?

If I do the same test in C#:

var regex = new Regex("^[a-z]");
var res = regex.IsMatch(null); // <-- ArgumentNullException

... I get an ArgumentNullException, which makes sense. So I guess that when testing a regex in JavaScript, you have to do the null check manually?

I have tried searching for an explanation, but without any luck.
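(For anyone landing here: a minimal sketch of such a manual guard, using a hypothetical helper name, could look like this.)

```javascript
// Reject anything that is not actually a string before testing,
// so null/undefined/numbers are not silently coerced to strings.
function startsWithLowercase(value) {
  if (typeof value !== "string") return false;
  return /^[a-z]/.test(value);
}

console.log(startsWithLowercase("abc")); // true
console.log(startsWithLowercase(null));  // false
```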



That's because test converts its argument to a string: null is converted to the string "null".

You can check that in the console:

/^[a-z]/.test("null")

returns true.

The call to ToString(argument) is specified in the ECMAScript specification for RegExp.prototype.test (see the ToString abstract operation, which maps null to "null").
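You can observe the same conversion directly, since String(x) performs the spec's ToString operation:

```javascript
// String(x) applies the same ToString conversion that test uses internally.
console.log(String(null));            // "null"
console.log(/^[a-z]/.test(null));     // true — matches the "n" in "null"
console.log(/^[a-z]/.test("null"));   // true — same result on the explicit string
```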


Here null is being coerced to its string form, which is "null".

And "null" matches your regex, which is why it evaluates to true.

In JavaScript, most values have a toString method that is called internally when a string conversion is needed. (null and undefined are exceptions: they have no toString method, but the spec's ToString operation still converts them to "null" and "undefined".)
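The same coercion applies to any non-string argument, which can produce surprising matches or non-matches:

```javascript
// Every argument to test() is first converted to a string.
console.log(/^[a-z]/.test(undefined)); // true  — "undefined" starts with "u"
console.log(/^[a-z]/.test(true));      // true  — "true" starts with "t"
console.log(/^[a-z]/.test(123));       // false — "123" starts with "1"
console.log(/^[a-z]/.test({}));        // false — "[object Object]" starts with "["
```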

