The sounds of the past are coming to the instruments of the future

by Khalida Sarwari, October 29, 2019

Victor Zappi, a recently appointed assistant professor of music technology at Northeastern, designs novel interfaces for musical expression. Photo by Matthew Modoono/Northeastern University

Victor Zappi’s finger hovers over an elaborately designed circuit board that resembles a pizza. He points to a rectangular black chip in the center of the topmost layer. This chip comes from a 1982 computer. Remember the Commodore 64?

The circuit board is attached to a prototyping board that supports a medley of brightly colored, criss-crossed wires. Zappi, a recently appointed assistant professor of music technology at Northeastern, will use the wires to test the board’s various components.

Victor Zappi, assistant professor of music technology at Northeastern’s College of Arts, Media and Design, demonstrates a circuit board that, once completed, can be embedded into a new generation of digital musical instruments to produce unique sounds that until now could only be simulated. Photo by Matthew Modoono/Northeastern University

“I designed this system to interface this chip, which is nothing but a synthesizer, with modern digital technology,” he explains.

Once completed, this small electronic board will be able to control vintage synthesizer chips, enabling composers of electronic music to create authentic vintage sounds (think video game sound effects from the 1980s) and include them in contemporary compositions. The idea is that the board, a fusion of analog and digital technology, can then be embedded into a new generation of digital musical instruments to produce unique sounds that until now could only be simulated.

“Before, there was no way to actually include in new musical devices these old chips whose sound is so iconic,” Zappi says. “They’re vintage, like an old Fender guitar, or a Hammond organ. They have a very peculiar sound and it’s very hard to reproduce with digital technologies.”

This is not the first time Zappi has taken an experimental approach to designing innovative interfaces for musical expression. Two years ago, he took home the top prize at a musical design competition in Atlanta, Georgia, for designing (and performing!) the hyper drumhead, an instrument modeled after the way sound propagates within the human throat and mouth.

“It’s kind of weird; the way it makes music is actually based on the way the human vocal tract works,” Zappi says.

He stumbled upon that discovery almost by accident. He and a research colleague were designing algorithms that simulated how the vibrations of the vocal folds become utterances. Their research resulted in the development of the hyper drumhead, while also helping to shed light on vocal tract diseases and speech disorders.

“I’m trying to understand how to develop a system that can help patient care,” he says. “It would be a way to quickly model how a specific patient speaks and, in the case of surgery, foresee the impact that the operation would have on his or her vocal abilities. Right now we can do this, but it takes a lot of time, a lot of money, and technology that is extremely difficult to use.”
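Simulations of this kind typically solve the acoustic wave equation numerically on a two-dimensional grid, updating every point from its neighbors many thousands of times per second. The sketch below shows that core update in miniature; it is a generic, textbook illustration rather than Zappi’s research code, and the grid size, constants, and “virtual microphone” in it are invented for the example.

```c
/*
 * Illustrative sketch only: a minimal two-dimensional finite-difference
 * wave simulation, the general family of techniques used to model how
 * pressure waves travel through a vocal tract or across a drumhead.
 * Not Zappi's code; all names and constants are invented for the example.
 */
#include <stdio.h>

#define NX 64       /* grid width                                   */
#define NY 64       /* grid height                                  */
#define STEPS 200   /* number of time steps to simulate             */
#define C2 0.25     /* (c * dt / dx)^2, kept <= 0.5 for 2D stability */

static double prev[NY][NX], curr[NY][NX], next[NY][NX];

int main(void)
{
    /* Excite the field with a single impulse in the middle, roughly
     * analogous to the excitation produced by the vibrating vocal folds. */
    curr[NY / 2][NX / 2] = 1.0;

    for (int t = 0; t < STEPS; t++) {
        /* Second-order update of the wave equation on interior points;
         * the outermost ring of points is held at zero, a simple fixed
         * boundary. */
        for (int y = 1; y < NY - 1; y++) {
            for (int x = 1; x < NX - 1; x++) {
                double laplacian = curr[y + 1][x] + curr[y - 1][x]
                                 + curr[y][x + 1] + curr[y][x - 1]
                                 - 4.0 * curr[y][x];
                next[y][x] = 2.0 * curr[y][x] - prev[y][x] + C2 * laplacian;
            }
        }

        /* Rotate the time buffers: next becomes current, current becomes previous. */
        for (int y = 0; y < NY; y++) {
            for (int x = 0; x < NX; x++) {
                prev[y][x] = curr[y][x];
                curr[y][x] = next[y][x];
            }
        }

        /* "Listen" at one fixed grid point, the way a virtual microphone would. */
        printf("%d %f\n", t, curr[NY / 2][NX - 2]);
    }
    return 0;
}
```

In a research-grade model, the grid would follow the geometry of a real vocal tract and the excitation would mimic a particular speaker’s vocal folds, but this neighbor-to-neighbor update is the basic engine that turns those inputs into sound.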
In Zappi’s world, music and engineering are locked in an intimate dance, each discipline feeding the other. Trained as a computer scientist in Italy, he has spent his career using technology to gain a deeper understanding of music, a passion he’s maintained since boyhood.

“I started when I was a kid in my high school with rock and metal, and then I switched to electronic music because I wanted to play with a computer on my own,” he remembers. “I had a lot of fun with bandmates, but I had some kind of a push to make my own music that maybe wouldn’t fit in with other people. And then this basically blended with my work.”

His passion for finding new ways to merge music and technology has led him on a journey around the world. In Paris, he worked at a research center specializing in music composition and performance. In the U.K., he designed digital instruments at the Augmented Instruments Laboratory at Queen Mary University of London, and he continued that work as a research fellow at the University of British Columbia in Vancouver.

“It’s basically laid the foundation for the kind of research that I’m about to start here at Northeastern,” he says.

As a professor of music technology, he invites his students to play with interfaces while also encouraging them to learn music theory and the physics and acoustics behind musical instruments.

“Students are the engine of research,” he says. “I’m really excited about the idea of gathering a group of students that may work with me and we can apply for funding and continue to develop this instrument.”

For media inquiries, please contact media@northeastern.edu.