Why does JavaScript's getMonth() count from 0, while getDate() counts from 1?

This question is purely to satisfy my curiosity.

In JavaScript's Date object, when you call getMonth(), it returns the month, but counted from 0:

0 = January
1 = February
...

But when you call getDate(), it counts from 1:

1 = 1
2 = 2
...

Why the inconsistency?

54452 views

I assume it's because it would be easier to reference in an array of names, i.e.

var months = ["January", "February", "March", "April", "May", "June", "July",
"August", "September", "October", "November", "December"];


var d = new Date();


var namedMonth = months[d.getMonth()];

If getMonth() returned 1-12, then programmers would have to write d.getMonth()-1 every time they wanted a fancy named month.

Days of the month don't have specific "names" per se. getDate() returns 1-31 (depending on the month), and we usually just refer to days by their number.

The same concept as getMonth() applies to getDay() as well, which returns 0-6 based on the day of the week:

var days = ["Sunday", "Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday"];


var namedDay = days[d.getDay()];

All this outputs something like:

console.log("Month: months[" + d.getMonth() + "]: " + namedMonth);
// Month: months[3]: April
console.log("Day: days[" + d.getDay() + "]: " + namedDay);
// Day: days[4]: Thursday
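As an aside, if you only need the month's name, you don't have to maintain the array yourself: toLocaleString can look it up. This is a sketch using the standard Intl/ECMA-402 options (the helper name monthName is hypothetical, not an API from the answers above):

```javascript
// Look up the localized month name directly, instead of
// indexing into a hand-maintained array of names.
function monthName(d) {
  return d.toLocaleString('en-US', { month: 'long' });
}

console.log(monthName(new Date(2024, 0, 15))); // "January"
```

This sidesteps the 0-based/1-based question entirely, since no manual indexing is involved.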

If you want to call it an inconsistency, you need to ask the creators of the language specification. According to this page, JavaScript is based on ECMAScript (EDIT: see @MichaelGeary's comment).

And if you read from page 165 here, you will see that everything works exactly as designed.

For you it may be an inconsistency. For me it's rather a feature: 0-based values let you index into an array directly without doing calculations (see @Christopher's answer). In the case of the day of the month, there is no array you would realistically index into. It would be weird to have an array of day-of-month names, like this:

var namesOfDays = [
"Sunday", "Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", // and again at least 4 times ...
"Sunday", "Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday",
"Sunday", "Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday",
"Sunday", "Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday",
"Sunday", "Monday", "Tuesday"
]

Link1

Date.prototype.getDate()
Returns the day of the month (1-31) for the specified date according to local time.

Link2

A Date object contains a number representing a particular instant in time to within a millisecond. For example, if you specify 150 seconds, JavaScript redefines that number as two minutes and 30 seconds.

When you implement methods in JavaScript to find the difference between two times specified in milliseconds, you need to return a value that is greater than 0, for obvious reasons.

var startTime = new Date('1/1/1990');
var startMsec = startTime.getMilliseconds(); // milliseconds component of the date: 0
startTime.setTime(5000000);                  // set to 5,000,000 ms after the epoch
var elapsed = (startTime.getTime() - startMsec) / 1000; // (5000000 - 0) / 1000
document.write(elapsed);


// Output: 5000

As "SomeShinyObject" explained,

var months = ["January", "February", "March", "April", "May", "June", "July",
"August", "September", "October", "November", "December"];

helps with referencing them through an array index.

Hence getDay(), getHours(), and getMonth() start from 0.
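The split between the two conventions is easy to see by constructing a known date and comparing the accessors (note that the month argument to the Date constructor is itself 0-based, so 0 below means January):

```javascript
// (2024, 0, 1) is January 1st, 2024, at midnight local time.
var d = new Date(2024, 0, 1, 0, 0, 0);

console.log(d.getMonth()); // 0 -> January  (0-based)
console.log(d.getDate());  // 1 -> the 1st  (1-based)
console.log(d.getHours()); // 0 -> midnight (0-based)
console.log(d.getDay());   // 1 -> Monday   (0-based, 0 = Sunday)
```

So getDate() is the odd one out: every other field counts from 0.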

Coming a bit late to this, but the correct answer is here:

https://stackoverflow.com/a/41992352/134120

They (the creators of JavaScript) copied the functionality from the corresponding class in Java (which in turn seems to have been copied from C). And so we're propagating the mistakes of the past 🤦‍♂️
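If the inherited convention bites, one common workaround is to normalize at the boundary of your own code. A minimal sketch (toHumanReadable is a hypothetical helper name, not a standard API):

```javascript
// Hypothetical helper: convert a Date into 1-based fields for display.
function toHumanReadable(d) {
  return {
    year: d.getFullYear(),   // already the full year
    month: d.getMonth() + 1, // shift the inherited 0-based month to 1-12
    day: d.getDate()         // already 1-based
  };
}

console.log(toHumanReadable(new Date(2024, 11, 25)));
// { year: 2024, month: 12, day: 25 }
```

Doing the +1 in exactly one place keeps the off-by-one from leaking through the rest of the code.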