I am trying to convert a UTC date to milliseconds since the UNIX epoch.
For this, I am doing the following:
const date = new Date(); // local date
const utc = moment().utc(date); // UTC date
const utcMillis = utc.valueOf(); // Millis since UNIX epoch
I currently live in Spain (UTC+2), and I have noticed that
date.getTime() === moment().utc(date).valueOf()
Why?
Note: you can test this online here
> Solution:
const date = new Date(); // local date
There is (currently) no such thing as a "local date" in ECMAScript. All Date instances are just an offset from the ECMAScript epoch, 1970-01-01T00:00:00Z, which is a fairly common epoch in programming.
const utc = moment().utc(date); // UTC date
The utc method puts a moment object into UTC mode, so its get and set methods use their UTC variants. In this instance, a new moment object is created for the current instant, so its time value matches what getTime (or valueOf, the result is identical) returns on date. And since that time value is already measured from the UTC epoch, UTC mode has no effect on the outcome of valueOf.
If you just want the current time value (i.e. millisecond offset from the ECMAScript epoch), then Date.now() does exactly that (allowing for system clock accuracy) without a library or jumping through hoops. 🙂
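So the whole snippet in the question collapses to one built-in call:

```javascript
// Milliseconds since 1970-01-01T00:00:00Z, no library needed.
const utcMillis = Date.now();

// Equivalent, but allocates a Date object first:
const viaDate = new Date().getTime();
```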