In December 2020, a project called Breakthrough Listen detected what was briefly thought to be an extraterrestrial signal, or technosignature, formally named Breakthrough Listen Candidate 1 (BLC1). Although the signal has since been debunked as human radio interference, BLC1 was the first meaningful candidate recorded since the project began in 2015. The signal, which appeared to come from the nearby star system Proxima Centauri, piqued public interest in how we would respond if we ever received that fateful call.
This line of research has long been underway at the Search for Extraterrestrial Intelligence (SETI) Institute. In fact, the Breakthrough Listen team that first reported the BLC1 signal is hosted at the Berkeley SETI research centre in California. The SETI Institute was founded in 1984 and has remained steadfast in its search for any sign of extraterrestrial intelligence. Embedded within that search is the investigation of language and, specifically, the pursuit of an ultimate universal language. The first attempt at such a language, and one that has since remained a mould for the exolinguistic field, was drafted in a 1960 book called Lincos: Design of a Language for Cosmic Intercourse by the German-born Dutch mathematician Hans Freudenthal. The language detailed in Lincos was intended to be spoken in radio waves, its phonetics rendered through symbols from maths, science and Latin. It rested on the expectation that any civilization capable of building devices to receive radio waves would first require mathematics.
Since Lincos, researchers have explored other potential formats for messages and universal languages. In 1974, the Arecibo Message combined modulated radio waves with binary symbolism. Aimed at the globular cluster M13, it faces a 25,000-year journey before reaching its destination. Soon after this first deliberate attempt at ET communication, NASA launched the Voyager Golden Records in 1977. The Records encapsulate multiple forms of communication, from recorded sound and brainwaves to symbolism and imagery, on a journey expected to last 40,000 years.
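The Arecibo Message's binary symbolism relied on a simple arithmetical nudge: the transmission was exactly 1,679 bits long, and 1,679 is a semiprime (23 × 73), so the only non-trivial ways to arrange the stream into a rectangle are a 23 × 73 or 73 × 23 grid, pointing any recipient toward the intended two-dimensional image. A minimal sketch of that idea (the function name is ours, invented for illustration):

```python
# Sketch: why the Arecibo Message was exactly 1679 bits long.
# 1679 = 23 * 73 is a semiprime, so the only non-trivial
# rectangular arrangements of the bitstream are 23x73 and 73x23,
# hinting that the signal should be decoded as a 2D image.

def rectangular_layouts(n):
    """Return all (rows, cols) pairs with rows * cols == n, excluding 1xn."""
    return [(r, n // r) for r in range(2, n) if n % r == 0]

layouts = rectangular_layouts(1679)
print(layouts)  # [(23, 73), (73, 23)] -- only two candidate grids
```

A message of, say, 1,680 bits would instead admit dozens of rectangular layouts, burying the picture in ambiguity.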
More recently, Yvan Dutil and Stéphane Dumas, a pair of Canadian physicists, created an alphabet based on Freudenthal's Lincos. With this alphabet, the pair worked with Messaging Extraterrestrial Intelligence (METI) International and the American company Team Encounter to send a message containing more information than had ever been sent into the cosmos before. Called the Cosmic Call Primer, it comprised 23 pages of degradation-resistant communications covering basic maths, science and biology. The pixelated symbols within it, which make up an alphabet, a number system and whole concepts and ideas, are each entirely unique: no matter how the message is received, mirrored, upside-down or otherwise, it will always say what it means. The first set of messages was sent out in 1999 from a radio transmitter in Evpatoria, Ukraine, with another following in 2003. The signals have nine destinations in total; the first expected arrival is at HIP 4872, a star in the northern constellation of Cassiopeia, in April 2036.
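That "entirely unique" property can be stated concretely: no glyph, under any combination of rotation and mirroring, may coincide with a differently-meaning glyph, or the message could be misread if received flipped or inverted. A toy check of that design rule, using invented 3 × 3 placeholder bitmaps rather than the actual Cosmic Call symbols:

```python
# Toy check of the Cosmic Call design rule: every glyph must remain
# distinguishable from every other glyph under rotation and mirroring.
# The 3x3 bitmaps below are invented placeholders, not the real symbols.

def transforms(glyph):
    """All orientations (4 rotations, each with an optional mirror) of a bitmap."""
    results = []
    g = [list(row) for row in glyph]
    for _ in range(4):
        g = [list(row) for row in zip(*g[::-1])]            # rotate 90 degrees
        results.append(tuple(map(tuple, g)))
        results.append(tuple(tuple(row[::-1]) for row in g))  # mirrored copy
    return results

def orientation_safe(glyphs):
    """True if no orientation of one glyph matches any orientation of another."""
    seen = {}
    for name, glyph in glyphs.items():
        for t in transforms(glyph):
            if seen.get(t, name) != name:
                return False   # two different glyphs collide
            seen[t] = name
    return True

glyphs = {
    "zero": [[1, 1, 1], [1, 0, 1], [1, 1, 1]],  # closed ring
    "one":  [[0, 1, 0], [1, 1, 0], [0, 1, 0]],  # asymmetric mark
}
print(orientation_safe(glyphs))  # True: these two never collide
```

A glyph set that fails this check, such as a pair of shapes that are mirror images of one another, would be rejected under the same rule.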
On October 4th, 2022, METI will send out another transmission, this time from the Goonhilly Satellite Earth Station in Cornwall, UK. Here, pulses will communicate a basic counting system, information on the periodic table and atoms, and a selection of musical pieces. The message is aimed at TRAPPIST-1, a star surrounded by seven planets, three of which orbit in the zone where water can remain liquid and so have the potential to support life. This safe zone is aptly named the Goldilocks zone.
In all cases, these transmissions are based on ideograms intended for civilizations that use our brand of mathematics and physics. But are there grounds to explore artificial languages that take a more emotional format? In the early 19th century, Wilhelm von Humboldt, a Prussian philosopher, developed the framework for modern ethnolinguistics. His thinking suggested that language is a representation of one's culture and individuality. With this in mind, how can we preserve our own humanity in a language that attempts to reach civilizations that likely exist under entirely different ruling principles? As much as we want communication to succeed, do we not also want to preserve something of the stuff that makes Earth human?
In considering what language we should develop when reaching out to extraterrestrial civilizations, it's also timely to consider how our own language may change and adapt as we get closer to homestead-based space travel. In their study, “Language Development during Interstellar Travel”, linguistics professors Andrew McKenzie and Jeffrey Punske explore how the language spoken aboard ‘generation ships’ will diverge significantly from the one travellers left with. As new vocabulary and concepts develop over long-term voyages, a language unrecognisable from Earth's can easily emerge. The study examines how language has changed with isolation in the past, as in the case of Polynesian explorers, among whom unique vocabulary and grammar structures materialised.
Today’s language has already seen some mutations in our time of digital isolation. The use of emoticons and memes is an example of a contemporary, image-first language. These gestural snapshots behave as universal shorthand for complex emotions and concepts, and they help signal community through shared references that can transcend borders and spoken language. Humboldt would surely see this development as a form of digital ethnolinguistics.
Gestural languages allow a lot of information to be packed into a single symbol. In this vein, Sign Language has long been considered a strong contender as the universal language of space. As it stands, the International Space Station uses two primary languages: English and Russian. Every astronaut must learn both in order to communicate, read essential messages and materials, and decode machinery. Sign Language could present a beneficial replacement.
Not only does Sign Language offer an easier-to-learn alternative to Russian and English, but it also has operational advantages. As sound is not transmitted in space, Sign Language would allow for communication even during emergencies on the space station, through radio interference or while conducting a spacewalk. Another benefit of Sign Language as a future language for space travel is its improvement to overall cognitive performance. Because gestured language requires a signer to navigate facial expression, classifiers, spatial perception and a complex grammatical system, the modality encourages positive adaptations to manual and visual cognition. In studies such as Acredolo & Goodwyn (2000), non-signing peers notably underperformed in cognitive tests of visual-spatial skills, spatial memory and mathematical problem solving. The increase in both nonverbal intelligence and performance IQ for signing participants was consistent regardless of when the participant learned Sign Language. When considering our potential for communicating with another civilization, improved cognition and an understanding of language modalities could help our brains better adapt to new languages or modes of communication.
Sign Language has already had a place in space travel. It started with the Gallaudet Eleven, a group of 11 deaf men from Gallaudet University who helped scientists understand what prolonged zero gravity would do to the human body. Because 10 of the 11 participants had lost their hearing to spinal meningitis at an early age, they were immune to motion sickness. More than a decade of tests and experiments eventually gave researchers enough information to send the first humans to space. Since then, astronauts Bill Readdy and Tracy Caldwell have sent back signed messages from space. As a precise form of human-robot communication, scientists have also developed haptic gloves that can be used to send messages and direct robots aboard space shuttles: through Sign Language performed in the haptic glove, robots receive signals for movement and actions. And in 2012, Robonaut 2 signed the traditional programming greeting, “Hello World,” back to Earth in ASL.
Our proposed language takes on the shapes and gestures embedded in ASL to create a graphic system that can flex between visual and manual modalities. It decodes the gestures of ASL into symbols to create an emotion-forward language that is human-centric in its focus on the hand and the body. This language draws on the history of artificial languages and the adaptations of digital language to create an inclusive and adaptable method for communication in space.