Lessons in instrument design from Star Trek

Story


by S. Astrid Bin 

Editor’s Note: Longtime fans of this site may be familiar with its “tag line,” “Stop watching sci-fi. Start using it.” So I was thrilled when a friend told me they had seen Astrid present how she had made an instrument from a Star Trek episode real! Please welcome Astrid as she tells us about the journey and lessons learned from making something from a favorite sci-fi show real. —Christopher

I’ve been watching Star Trek for as long as I can remember. Though it’s always been in the cultural air, it wasn’t until March 2020—when we were all stuck at home with Netflix and nothing else to do—that I watched all of it from the beginning.

Discovering Trek Instruments

I’m a designer and music researcher, and I specialise in interfaces for music. When I started this Great Rewatch with my husband (who is an enormous Trek fan, so nothing pleased him more), I started noting every musical instrument I saw. What grabbed me was that they were so different from the instruments I write about, design, make, and look at, because none of these instruments, you know, actually worked. They were pure speculation, free even of the conventions of the last couple of decades, since computers became small and powerful enough that digital musical instruments became a common sight on Kickstarter. I got excited every time I saw a new one.

What struck me the most about these instruments is that how they worked didn’t ever seem to enter into the mind of the person who dreamed them up. This was quite a departure for me, as I’ve spent more than ten years designing instruments and worrying about the subtleties of sensors, signal processing, power requirements, material response, fabrication techniques, sound design, and countless other factors that come into play when you make novel digital musical instruments. The instruments in Star Trek struck me as anarchic, because it was clear the designers didn’t consider at all how they would work, or, if they did, they just weren’t concerned. Some examples: Tiny instruments make enormous sounds. Instruments are “telepathic”. Things resonate by defying the laws of physics. Some basic sound design is tossed in at the end, and bam, job done.

Some previous instrument design projects. From left: Moai (electronic percussion), Keppi (electronic percussion), Gliss (synth module interaction, as part of the Bela.io team)

I couldn’t get over how different this was to the design process I was used to. Of course, this is because the people designing these instruments weren’t making “musical instruments” the way we know them, as functional cultural objects that produce sound of some kind. Rather, Trek instruments are storytelling devices, alluring objects that have a narrative and character function, and the sound they make and how they might work is completely secondary. These instruments have a number of storytelling purposes, but most of all they serve to show that alien civilisations are as complex, creative and culturally sophisticated as human ones.

This was striking, because I was used to the opposite; so often the technical aspects of an instrument—and there are many, from synthesis to sensors—somehow become the most significant determining factor in an instrument’s final form.

The Aldean Instrument

There was one instrument that especially intrigued me, the “unnamed Aldean instrument” from Season 1, Episode 16 of Star Trek: The Next Generation, “When the Bough Breaks”. This instrument is a light-up disc that is played by laying hands on it, through which it translates your thoughts to sound. In this episode the children of the Enterprise are kidnapped by a race of people who can’t reproduce (spoiler alert: it was an environmental toxin, they’re fine now) and the children are distributed among various families. One girl is sent to a family of very kind musicians, and the grandfather teaches her to play this instrument. When she puts her hands on it, lays her fingers over the edge, and is very calm, it plays some twinkly noise; but then she gets anxious when she remembers she’s been kidnapped, and it makes a burst of horrible noise.

[If you have a subscription to Paramount, you can see the episode here. —Ed.]

This instrument was fascinating for a lot of reasons. It looked so cool with the light-up sides and round shape, and it was only on screen for about four tantalising seconds. Unlike other instruments that were a bit ridiculous, I kept thinking about this one because it was uniquely beautiful, and it seemed like a lot of thought went into it.

I researched the designers of Trek instruments and this instrument was the only one that had a design credit: Andrew Probert. Andrew is a prolific production designer who’s worked mainly in science fiction, and he’s been active for decades, designing everything from the bridge of the Enterprise to the DeLorean in Back to the Future. He’s still working, his work is fantastic, and he has a website, so I emailed him and asked him what he could tell me about the design process.

He got back to me straight away and said he couldn’t remember anything about it, but he dug out his production sketch for me:

Courtesy of Andrew Probert, https://probert.artstation.com/

The sketch was so gloriously beautiful that I couldn’t resist building it. I had so many questions that could only be answered by bringing it into reality: How would I make it work like it did in the show? How would I make it come alive slowly, and require calmness? How was I going to make that shape? Wait, this thing is supposed to translate moods—what does that even mean? How was I going to achieve the function and presence that this instrument had in the show, and what would I learn?

Building the Aldean Instrument

Translating moods

When I discussed this project with people, the question I got asked most often was “So how are you going to make it read someone’s mind?”

While the instrument doesn’t read minds, the idea of translating moods gave me pause and eventually led me to think of affective computing, an area of computing that was originated by a woman named—brace yourself—Rosalind Picard. Picard says that affective computing refers to computing that relates to, arises from, or deliberately impacts emotions.

Affective computing considers two variable and intersecting factors: Arousal (on a scale of “inactive” to “active”), and valence (on a scale from “unpleasant” to “pleasant”). A lot of research has been done on how various emotions fall into this two-dimensional space, and how emotional states can be inferred by sensing these two factors.
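The four quadrants of this two-dimensional space each correspond to a family of emotional states. A minimal sketch of that idea in Python (the quadrant labels here are my own illustrative shorthand, not terms from Picard's work):

```python
def quadrant(arousal: float, valence: float) -> str:
    """Name the region of the arousal/valence space for values
    scaled to [-1, 1]: arousal runs inactive (-1) to active (+1),
    valence runs unpleasant (-1) to pleasant (+1)."""
    if arousal >= 0 and valence >= 0:
        return "excited/happy"   # active + pleasant
    if arousal >= 0:
        return "tense/angry"     # active + unpleasant
    if valence >= 0:
        return "calm/content"    # inactive + pleasant
    return "bored/sad"           # inactive + unpleasant
```

A calm player of the Aldean instrument would sit in the low-arousal, positive-valence quadrant; a sudden panic jumps to high arousal and negative valence.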

Image by Patricia Bota, 2019

I realised that, to make this instrument work the way it did in the show, the valence/arousal sensing could be much simpler than full emotion recognition. In the show, the little girl is calm (and the instrument plays some sparkly sound), and then she’s not (and the instrument emits a burst of noise). If this instrument just sensed arousal through how much it was being moved and valence through how hard it was being gripped, this creates an interaction space that still has a lot of possibility.

Playing the instrument requires calmness, and I could sense how much the player was moving around with an accelerometer, by calculating quantity of motion. If the instrument was moved suddenly or violently, it could make a burst of noise. For valence—pleasantness to unpleasantness—I could sense how hard the person was gripping the instrument using a Trill Bar sensor. The Trill Bar can sense up to five individual touches, as well as the size of those touches (in other words, how hard those fingers are pressing).
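"Quantity of motion" can be computed as a running average of how much successive accelerometer readings differ. Here is one way it might be sketched in Python (illustrative only—the actual project runs on a Bela Mini, and the window size here is an assumption, not a value from the build):

```python
import math
from collections import deque

class MotionEstimator:
    """Estimate quantity of motion as the mean magnitude of
    frame-to-frame change in 3-axis accelerometer readings
    over a short sliding window."""

    def __init__(self, window: int = 50):
        self.deltas = deque(maxlen=window)
        self.prev = None

    def update(self, x: float, y: float, z: float) -> float:
        # Accumulate the Euclidean distance between this reading
        # and the previous one, then return the windowed mean.
        if self.prev is not None:
            px, py, pz = self.prev
            self.deltas.append(
                math.sqrt((x - px)**2 + (y - py)**2 + (z - pz)**2))
        self.prev = (x, y, z)
        return sum(self.deltas) / len(self.deltas) if self.deltas else 0.0
```

A still instrument yields a value near zero; shaking it drives the value up, which could trigger the burst of noise.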

Both the touch sensing and the accelerometer data would be processed by a Bela Mini, a tiny but powerful computer that could process the sensor data, as well as provide the audio playback.

Making the body

I got to work first on the body of the instrument. I often prototype 3D shapes using layers of laser-cut paper sandwiched together, as it makes for a gradual, hands-on process with room for adjustments throughout. After a few days with a laser cutter and some cut-and-paste circuitry, I had something that lit up that I could attach the sensing system to.

Putting it together

I attached the Bela Mini to the underside of the instrument body, and embedded the Trill Bar sensor on the underside of the hand grip, so I could sense when someone’s hand was on the instrument. 

As I set out to recreate how the instrument looked and sounded in the show, I wanted to make a faithful reproduction of the sound design, basic as it was.

The sound is a four-part major chord harmony. I recreated the sound in Ableton Live, with each part of the harmony as a separate sample. I also made a burst of noise. 

When the instrument is being held gently and there are no sudden movements, it can play; this doesn’t mean stillness, just a lack of chaos. As the player places their fingers over the instrument’s edge, each of their four fingers will be sensed and trigger one part of the harmony. The harder that finger presses, the louder that voice is.

There’s a demo video of me playing it, above.

This process was just as interesting as I suspected, for a number of reasons.

Firstly, de-emphasising technology in the process of making a technological object presented a fresh way of thinking. Instead of worrying about what I could add, whether the interaction was enough, or what other sensors I had access to (and thereby making the design a product of those technical decisions), I was able instead to be led by the material and object factors in this design process. This is an inverse of what usually happens, and I certainly am going to consciously invert this process more often from now on.

Secondly, thinking about what this instrument needed to do, say and mean, and extracting the technological requirements from there, made the technical aspects much simpler. I found myself working artistic muscles that aren’t always active in designing technology, because there’s often some kind of pressure, real or imagined, to make the technical aspects more complex. In this situation, the most important thing was supporting what this was in the show, which was an object that told a story. When I thought along those lines, the two axes of sensing were an obvious, and refreshingly simple, direction to take.

Third, one of the difficult things about designing instruments is that, thanks to tiny and powerful computers, they can sound like anything you can imagine. There are no size limitations for sound, no physical bodies to resonate, no material factors that affect the acoustic physics that create a noise. This freedom is often overwhelming, and it’s hard to make sound design choices that make sense. However, because I was working backwards from how this instrument was presented in the plot of the episode, I had something to attach these decisions to. I recreated the show’s simplistic sound design, but I’ve since designed sound worlds for it that support the calm, gentle, but very much alive nature that the Aldean instrument would have, when I imagine it played in its normal context.

Physically recreating not just the shape of an instrument from Star Trek but also its function showed me that bringing imaginary things into reality offers the creator a fresh perspective, whether designing fantastical or earthly interfaces.
