One small trait I really liked about Hoshi was how she could sense when they went into warp and it discombobulated her for a second. She made a little startled sound. To me that seems really true to life, especially in an era when they were still sort of working the bugs out of everything. I would think it might give some people migraines, or they would get nauseous or black out or something.
The Universal Translator is basically magic. TOS came closest to describing how it works, and it boiled down to, “IDK man it does some brain scans to detect your language structure”. There’s no satisfying answer as to why it knows the “Washington State Bridge” is a combination of a proper noun, a geopolitical concept, and a general noun.
In Enterprise, the Universal Translator is generally depicted as a modern miracle of technology, but one without useful internal intelligence. If it hears a few snippets of Romanian, it’s just going to start brute forcing a translation matrix with every technique it has at its disposal. More speech gives it more data to work with, but it’s still just cycling through its options.
Sato’s familiarity with xenolinguistics allows her to aid the Universal Translator by narrowing the system’s options or directing it down specific paths. She doesn’t know or learn the alien languages in the traditional sense, but she’s shown to have a knack for picking up on patterns and syntax. Again with the Romanian example, she’s doing the alien equivalent of saying, “This sounds European, skip trying to translate this as an Asian language for now”. The Universal Translator has fewer options to run through and gets to a successful translation matrix faster.
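For what it's worth, that kind of pruning is easy to picture in code. This is just a hypothetical sketch, with made-up family names and a stubbed-out "fitting" step standing in for whatever the UT actually does:

```python
# Hypothetical sketch of "narrowing the options": without a hint the
# translator grinds through every candidate language family, but a
# linguist's hint prunes the search space first. Family names and the
# stubbed fitting step are invented for illustration.
CANDIDATE_FAMILIES = {
    "asian": ["model-A", "model-B"],
    "european": ["model-C", "model-D"],
    "other": ["model-E"],
}

def build_matrix(sample, hint=None):
    """Cycle through candidate models, skipping families the hint rules out."""
    families = {hint: CANDIDATE_FAMILIES[hint]} if hint else CANDIDATE_FAMILIES
    for family, models in families.items():
        for model in models:
            # Stand-in for the expensive brute-force fitting the UT would do.
            print(f"trying {model} ({family}) on {sample!r}")
    return "translation matrix (stub)"

# With the hint, the UT never wastes cycles on the non-European families.
build_matrix("snippet of alien speech", hint="european")
```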
Throw unimaginable amounts of data at a GPU: convert words to numbers with an algorithm called word2vec, then look for correlations between words in context, English on one side, the unknown language on the other, just looking for any connection at all. If the vector products are above a threshold (we call that activation), then there might be a relationship.
Once you have one, you experiment with it and try to build more associative connections on top of it. The GPUs just churn endlessly till they go 'bing'.
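Very loosely, a toy version of that "vector products above a threshold" step might look like this. It's only a sketch: the tiny hand-made vectors stand in for real word2vec embeddings, and the cutoff is arbitrary:

```python
# Toy sketch of flagging candidate word pairs whose vectors "activate".
# Real embeddings would come from training on huge corpora; these are
# hand-made three-dimensional stand-ins for illustration.
import numpy as np

english = {
    "water": np.array([0.9, 0.1, 0.3]),
    "fire":  np.array([0.1, 0.8, 0.2]),
}
unknown = {
    "mekh":   np.array([0.85, 0.15, 0.25]),
    "guhtah": np.array([0.2, 0.7, 0.4]),
}

ACTIVATION_THRESHOLD = 0.9  # arbitrary cutoff for "there might be a relationship"

def cosine(a, b):
    """Normalized vector product, so scores are comparable across word pairs."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Brute-force every English/unknown pair and keep the ones above threshold.
for en_word, en_vec in english.items():
    for xx_word, xx_vec in unknown.items():
        score = cosine(en_vec, xx_vec)
        if score >= ACTIVATION_THRESHOLD:
            print(f"candidate link: {en_word} <-> {xx_word} (score {score:.2f})")
```

A real system would then test those candidate links against more context, which is roughly the "experiment with it, build more associative connections" part.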
We did a simpler version of this as humans with the Rosetta Stone: it had Greek, which we knew, and two forms of Egyptian, which we didn't.
Yeah, the show is ludicrous in having someone point to a broom, say 'mekh', and the next sentence is 'I'm afraid the current structure of relations in the Alpha Quadrant requires us to demand you no longer perform warp travel through our sector, as we do not tolerate antimatter reactions under our religion', translated from 'guh-tah!'
It’s pretty simple once you start from the fact that you have a species that can read the thoughts of other people, even outside their own species, and even inject thoughts into those other beings’ minds. So there must be something sentient species emit that can be detected and understood, and that can also be pushed back at them. Once that mechanism is understood, a machine can be developed to apply it to language. Sato is really good at using that machine.
Handwaving tech and physics is easy. I'm not touching linguistics, other than to say I don't think you can explain it away. Languages are hard, even if by some magic all species everywhere have some common base, which I doubt.
I think in TOS Uhura knows a bunch of different languages and she's like, "this bit is similar to this language and this bit is similar to that language".
I always thought that made some kind of sense.
Maybe it's been explained, but on Earth there are many languages/cultural dialects/regional slang/etc. Those are all absent for the Klingons/Romulans/Ferengi/etc.
Dialects are mentioned in Enterprise. In the first episode, Klaang’s dialect is the reason Enterprise’s UT can’t lock on and translate. Later, in the Augment arc, it’s mentioned that Enterprise has been programmed with 7 Klingon dialects.
Admittedly I've been thinking I'm overdue to rewatch nearly all of Ent, but I have a vague memory of this getting explained (in a hand-wavy kind of way). Maybe in the scene where Archer tells small children they recycle their poop? Or maybe my brain invented a scene to maintain my suspension of disbelief.
I kind of assumed that it's some kind of brain-scanning tech that can extract meaning directly from the language processing part of the brain, and it just needs some calibration for each language. If two random ships can synchronize a communication frequency and video format, they can probably also have some standard brain-scan info dump, so the scan could be done by the speaker.
It was supposed to be that she was the most talented linguist on the planet, fluent in 23 languages or something like that, to the point where she knew linguistics so well it was super easy for her to pick up new languages. I was watching this when it was new as a kid, and she's what got me into linguistics. I know it's not that easy, but man, she inspired me.