Season 1, episode 17, “Home Soil”
Lesson: real-time language translation tools will be incredibly valuable
This post is part of my ongoing quest to watch every episode of Star Trek: The Next Generation and pull one startup, entrepreneurship, tech, or investing lesson from each.
The Enterprise checks on a team of engineers running a 30-year terraforming project on a planet called Velara III. All planets that the Federation selects for terraforming must be devoid of life. If they’re inhabited, changing their environment to be fit for human occupancy could kill any life present and would violate the Prime Directive. The engineers seem anxious to continue their work, yet things have started going wrong and threaten to derail the project. Turns out the strange occurrences were caused by an inorganic life form, sort of like an AI brain, whose individual cells were linked together in a layer of underground water that the terraformers were draining. The terraforming work was, in effect, killing them.
No one knew that until the Enterprise beamed the life forms aboard to examine them, thinking at first that they were simply strange flashing light patterns. These patterns were the life form’s language. When the life form starts reproducing and taking over areas of the ship, Picard and the crew turn on a universal translation device to speak with them and defuse hostilities, eventually returning them to their planet and leaving with the engineers. They wasted decades of their lives terraforming a planet that actually had inhabitants because the Federation’s definition of life as organic and carbon-based was too narrow.
The terraforming engineers were unknowingly committing genocide by transforming the planet, and they could have avoided death and wasted work by being able to communicate with the inorganic life form. The universal translator existed, but the engineers didn’t think to use it because they didn’t categorize the anomalies they saw as life or speech.
Sadly, we don’t have universal translators today. These would be devices that could translate literally any language into any other. If an alien species landed on Earth tomorrow, we’d be able to understand them with a true universal translator. Think of the Babel fish in The Hitchhiker’s Guide to the Galaxy.
But we do have some nifty, close to real-time translation devices today that are probably further evolved than you’d think. Here are a few:
Microsoft Translator translates conversations among up to 100 people in video and/or audio chats. It can translate both spoken and written communication, although the languages available for speech translation are more limited (just nine, as opposed to 60 for written). Of these 60, one is Klingon. HELLO, STAR TREK REFERENCE. The app translates speech into text displayed on the listener’s device, rather than playing audio in their ear, so calling this “real time” is a bit of a stretch. It’s also available on Skype.
Google Translate has three cool real-time features. The first, based on technology called Word Lens built by a company called Quest Visual, uses computer vision to read and translate text within the camera app. Hold your camera up to a street sign in Tokyo and it’ll analyze the text and overlay the English version, for example. Other apps that offer the same in-camera translation include Waygo (for Japanese, Chinese, and Korean) and TextGrabber (covering 100+ languages).
The second lets two people speaking in different languages have their conversation with a phone in front of them that displays what each is saying in the other’s language, right as it happens.
The third is an integration with Google Pixel Buds, Google’s answer to Apple’s AirPods. You tap the right earbud and say a phrase, which opens the Translate app on your phone (which must be a Google Pixel device, at least as of this writing). Then whenever you tap the earbud and talk, it sends your message to the phone, which reads it aloud in the language you pick. To hear the other person’s response, you press a microphone icon in the app, record their reply, and you’ll hear it played back in your language through the earbuds.
Again, calling this true “real time” doesn’t quite fit, but it’s still an impressive feature. It supports 40 languages, though reviewers say the translations can be iffy, and often unintentionally hilarious.
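Under the hood, consumer features like these sit on top of an ordinary machine-translation API. As a rough illustration, here’s a minimal Python sketch that builds a request against Google’s Cloud Translation v2 REST endpoint. The endpoint and parameters are the real public API, but the key is a placeholder, and the request is constructed without being sent (actually sending it requires a valid API key and billing setup):

```python
import json
import urllib.request

# Real Cloud Translation v2 endpoint; "YOUR_API_KEY" is a placeholder
# you'd replace with a key from the Google Cloud console.
ENDPOINT = "https://translation.googleapis.com/language/translate/v2"

def build_translate_request(text, target_lang, api_key="YOUR_API_KEY"):
    """Build (but don't send) a POST request asking for a translation."""
    body = json.dumps({"q": text, "target": target_lang}).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT + "?key=" + api_key,
        data=body,
        headers={"Content-Type": "application/json"},
    )

# Ask for a German rendering of the episode's famous insult.
req = build_translate_request("ugly giant bags of mostly water", "de")
# With a real key, urllib.request.urlopen(req) returns JSON containing
# the translated text under data.translations[0].translatedText.
```

The hard part, of course, isn’t the plumbing shown here but the translation model behind the endpoint; the apps above wrap exactly this kind of call in speech recognition on one end and text-to-speech on the other.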
So we don’t quite have the ability to understand a bunch of pulsating lights angrily calling us “ugly giant bags of mostly water” (which you have to agree is pretty accurate), but with advances in AI and computing power, we’re getting there.