As a software developer, the last thing I want to see is another obscure timezone to deal with. It does seem like it'd be important to set a standard here though, and it's unlikely to affect most software engineers anyway unless we start seeing colonization at some point.
Also, in general, I can't imagine how relativity will work with this kind of timezone conversion.
Relativity is exactly the problem: time runs at a different rate on the Moon. I have no idea how this is supposed to be handled in software.
Say you have a lunar instant and an Earth instant, and you want to figure out how much time elapsed between them. I guess you'd essentially need to pick a frame of reference and then account for relativity as you convert the lunar time to UTC. But I'm not a physicist; I'm not sure doing that even makes sense.
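To make the "pick a frame and convert" idea concrete, here's a minimal sketch. It assumes a single constant rate offset (roughly the ~56 µs/day by which a lunar surface clock runs fast relative to Earth, per NIST/NASA estimates) and a hypothetical shared epoch; a real lunar time scale would model the rate properly rather than treat it as constant.

```python
from datetime import datetime, timedelta, timezone

# Assumption: lunar proper time runs ~56 microseconds per day faster than
# Earth time. Treating that as a constant rate is a simplification.
LUNAR_RATE = 56e-6 / 86400  # fractional rate offset (s of drift per s elapsed)

# Hypothetical epoch at which both clocks read the same instant.
EPOCH = datetime(2000, 1, 1, tzinfo=timezone.utc)

def lunar_to_utc(lunar: datetime) -> datetime:
    """Map a 'lunar clock' reading onto UTC, first-order rate model only."""
    elapsed_lunar = (lunar - EPOCH).total_seconds()
    # The lunar clock ticks *more* than an Earth clock over the same
    # interval, so the corresponding UTC instant is slightly earlier.
    return EPOCH + timedelta(seconds=elapsed_lunar / (1 + LUNAR_RATE))

def elapsed_between(lunar: datetime, earth_utc: datetime) -> timedelta:
    """Elapsed time between a lunar instant and a UTC instant,
    measured in the Earth frame (i.e. after picking a frame)."""
    return earth_utc - lunar_to_utc(lunar)
```

After a year the two clocks disagree by about 20 ms under this model, which is why the conversion can't just be a fixed offset the way timezones are.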
Time passes differently for satellites in lower orbits as well. GPS, for example, rate-adjusts its onboard clocks to take up the slack (a net effect of about 38 microseconds per day), and it's done at very high precision.
Honestly, it should be really easy to figure out: take two synchronized high-precision clocks, put one in orbit and keep one on Earth, then subtract one time from the other after a few days. (At that precision, you also need to account for the time it takes to radio the signal back to Earth.)
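A quick back-of-envelope for that experiment, to show the scales involved. These are rough textbook numbers (mean Earth–Moon distance, and the ~56 µs/day lunar drift estimate), not mission data:

```python
# How big are the effects the two-clock experiment would need to resolve?
C = 299_792_458             # speed of light, m/s
EARTH_MOON_M = 384_400_000  # mean Earth-Moon distance, m

one_way_delay = EARTH_MOON_M / C      # radio lag, roughly 1.3 s
drift_per_day = 56e-6                 # ~56 microseconds/day faster on the Moon
drift_after_week = 7 * drift_per_day  # accumulated offset after a week

print(f"radio lag: {one_way_delay:.2f} s, "
      f"drift after 7 days: {drift_after_week * 1e3:.2f} ms")
```

Note the radio lag is about four orders of magnitude larger than a week's worth of relativistic drift, which is why you can't just eyeball the comparison; the signal-delay correction dominates.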
The main issue comes from someone trying to build a library for this. For example, try answering the question "what time was it 2y 46d 2h 15m ago on the moon in lunar time?" (assuming it was asked on Earth, in some known timezone). Needing to look up from some table on a time server to answer all these questions sounds like a nightmare.
On Earth, there is a table of leap seconds... and the standard even allows for negative ones, though none has been needed yet. That alone is a good reason why writing time libraries is better left to people who specialize in writing time libraries.
The relativity part also made me think: the Moon orbits Earth at about 3,700 km/h... but Earth's equator itself "orbits" Earth's poles at about 1,700 km/h... so if one has relativistic effects on time, roughly half that speed must be having some effect too, right...? Someone at the South Pole would also see a clock on the equator tick a tiny bit slower each day... and so on for all the clocks at different latitudes, everyone relative to everyone else, so you can't tell the time on Earth "precisely" without taking the exact location into account... 😬
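Here's the naive special-relativity-only estimate of that pole-vs-equator effect. It's a sketch of the first-order time-dilation term only; in reality, clocks at sea level all tick at the same rate because the equatorial bulge's gravitational potential cancels the velocity effect (the geoid is an equipotential surface), so this number is an upper bound on the intuition, not a measured offset.

```python
# Naive special-relativistic estimate: how much slower would an
# equatorial clock tick relative to a clock at the pole?
C = 299_792_458              # speed of light, m/s
EQUATOR_M = 40_075_000       # Earth's equatorial circumference, m
SIDEREAL_DAY_S = 86_164      # one rotation relative to the stars, s

v = EQUATOR_M / SIDEREAL_DAY_S        # equatorial speed, ~465 m/s
gamma_minus_1 = 0.5 * (v / C) ** 2    # first-order time dilation factor
slowdown_per_day = gamma_minus_1 * 86_400  # seconds "lost" per day

print(f"{slowdown_per_day * 1e6:.3f} microseconds per day")
```

The answer comes out around a tenth of a microsecond per day: real, but a couple of orders of magnitude below the Moon's ~56 µs/day offset.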
If it was that easy, I don't think the US government would have mandated a whole project to figure it out. NASA would have done it by now and been using it internally for a while before anybody noticed.
That's not sarcasm - that's kinda how NASA solves weird (to baselines) problems like this. They just sort of do it, it's done, and then somebody might get around to publishing a paper about it. At least in the years I worked there (GSFC, 2010-2013), it was the kind of thing engineers would chat about while waiting for the coffee maker to finish brewing a fresh pot, or maybe doodle on a pad while waiting for a run to finish.