If you run this JavaScript function on the 31st of a month, the result can be a month off. The best part is that this is the intended behavior. JavaScript is a cursed language.
function getMonthName(monthNumber) {
  const date = new Date();
  date.setMonth(monthNumber - 1);
  return date.toLocaleString([], { month: 'long' });
}
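To make the failure reproducible, here's a minimal sketch with the date pinned to March 31, 2024 instead of using new Date():

const date = new Date(2024, 2, 31);  // March 31, 2024
date.setMonth(1);                    // ask for February; Feb 31 doesn't exist
console.log(date.toLocaleString([], { month: 'long' }));  // "March" — it rolled over to March 2

setMonth() keeps the current day-of-month, so setting February while sitting on the 31st silently normalizes Feb 31 into early March.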
What would you expect "-1 month" to do for a date like the 31st of March?
Would the result be the same as "-1 month" on the 29th of March?
And if you go back two months so that the 31st exists again, should applying "-1 month" twice give a different result than applying "-2 months" once?
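As it happens, yes. A quick sketch, assuming a start date of March 31, 2024:

const twice = new Date(2024, 2, 31);             // March 31, 2024
twice.setMonth(twice.getMonth() - 1);            // Feb 31 rolls over to March 2
twice.setMonth(twice.getMonth() - 1);            // March 2 minus a month is February 2
console.log(twice.getMonth(), twice.getDate());  // 1 2 → February 2

const once = new Date(2024, 2, 31);              // March 31, 2024 again
once.setMonth(once.getMonth() - 2);              // straight to January 31
console.log(once.getMonth(), once.getDate());    // 0 31 → January 31

So "-1 month" applied twice lands on February 2, while "-2 months" lands on January 31.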
I think it's just a stupid way to implement something like this, since a "month" isn't a defined size. Picking a fixed value and documenting it properly would be a decent solution, but no one should use that kind of function in the first place.
It is a stupid way to implement it, but the called function is named setMonth()! The minus one is performed externally, so if you set February you expect February; the validation should adjust the other fields instead...
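For what it's worth, the original helper is easy to make safe: build a fresh date with the day pinned to 1 instead of mutating "now". A minimal sketch (the year 2000 is arbitrary):

function getMonthName(monthNumber) {
  const date = new Date(2000, monthNumber - 1, 1);  // day 1 can never overflow
  return date.toLocaleString([], { month: 'long' });
}

Since the day-of-month is always 1, the rollover can't happen no matter which day the function runs on.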