var a = new Date(),
    now = a.getTime(),
    then = Date.UTC(2009, 10, 31),
    diff = then - now,
    daysleft = parseInt(diff / (24 * 60 * 60 * 1000));
console.log(daysleft);
The days-left result is off by 30 days.
What is wrong with this code?
Edit: I changed the variable names to make the example clearer.
Months are zero-based (0-11), while days and years are one-based, so the 10 in Date.UTC(2009, 10, 31) means November, not October.
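A minimal sketch of the corrected calculation, assuming the intended target date was October 31, 2009 (month index 9, since January is 0):

var now = new Date().getTime(),
    then = Date.UTC(2009, 9, 31),  // 9 = October, because months are 0-based
    daysLeft = Math.floor((then - now) / (24 * 60 * 60 * 1000));
console.log(daysLeft);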
JS had to "look like Java" only less so, be Java's dumb kid brother or boy-hostage sidekick. Plus, I had to be done in ten days or something worse than JS would have happened.
As Eric said, this is because months are given in the 0-11 range.
This is common behavior; the same is true of Perl's localtime() results, and probably of many other languages.
It was most likely inherited from Unix's localtime() call (see "man localtime").
The reason is that days and years are plain integers, while the month is effectively an index into an array of month names, and array indexes start at 0 in most languages, especially in C, where the underlying Unix call is implemented.
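As a small illustration of how that indexing plays out (not from the original post): passing 10 as the month means November, and since November has only 30 days, day 31 rolls over into December, which is where the roughly month-long discrepancy comes from.

var d = new Date(2009, 10, 31);  // month 10 = November; "Nov 31" rolls over
console.log(d.toDateString());   // "Tue Dec 01 2009", not the intended Oct 31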
var date1 = new Date();
// Date.UTC(year, month, day[, hrs][, min][, sec]) -- the month is already 0-based,
// so pass getMonth() as-is; Date.UTC is a plain function, not a constructor.
var date1Utc = Date.UTC(date1.getFullYear(), date1.getMonth(), date1.getDate(),
                        date1.getHours(), date1.getMinutes(), date1.getSeconds());
var date2 = new Date();
var date2Ms = date2.getTime();
alert(date1Utc);
alert(date2Ms);
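Note that Date.UTC() returns the number of milliseconds since the epoch rather than a Date object, which is why calling it with "new", as in the original snippet, throws an error; the value it returns can be compared directly with getTime().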