2015 (the 5th) Shanghai Conservatory of Music’s International Electronic Music Week Concert Series
New Sounds from USA
Time: 19:45, Oct. 23
Location: Shanghai Symphony Hall – Chamber Hall
1. Photographs for Cello, RadioBaton and Virtual Orchestra (2015)
Composer: Richard Boulanger
Cello: Kari Juusela
RadioBaton and Modular Synth: Richard Boulanger
iPad Processing: Michael Bierylo
Visuals* : Basil Simon
1. On the Road
2. In the Woods
3. From the Studio
4. Around the City
In “Photographs”, the acoustic cello is played, processed and spatialized with the Keith McMillen “K-Bow”. The amplified acoustic cello is further processed by a performer playing two iPads running the Csound apps “csSpectral” and “csGrain” by Boulanger Labs. In this “concerto”, the processed acoustic cello is accompanied by a Csound for Live “virtual orchestra” conducted and performed on the Mathews RadioBaton – a wireless 3D controller developed at Bell Labs for Richard Boulanger by the “father of computer music”, Max Mathews. The piece also features Csound-based 8-channel spatialization software written by Brandon Bell that immerses the audience in the center of the “virtual orchestra”, moves instruments and timbres around the concert hall under algorithmic control and under the direct control of the RadioBaton, and moves the hall around the listener!
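A minimal sketch of the kind of 8-channel ring panning such a spatializer might perform. This is only an illustration: the speaker layout, function names, and equal-power crossfade are assumptions, not Brandon Bell's actual Csound code.

```python
import math

NUM_SPEAKERS = 8  # assumed: eight speakers evenly spaced in a ring around the hall

def ring_gains(azimuth_deg):
    """Return per-speaker gains for a source at the given azimuth (degrees).

    Energy is split between the two speakers adjacent to the source angle
    using an equal-power (sine/cosine) crossfade; all other gains are zero.
    """
    spacing = 360.0 / NUM_SPEAKERS           # 45 degrees between adjacent speakers
    az = azimuth_deg % 360.0
    lower = int(az // spacing)               # speaker just "behind" the source angle
    upper = (lower + 1) % NUM_SPEAKERS       # next speaker around the ring
    frac = (az - lower * spacing) / spacing  # 0..1 position between the pair
    gains = [0.0] * NUM_SPEAKERS
    gains[lower] = math.cos(frac * math.pi / 2)
    gains[upper] = math.sin(frac * math.pi / 2)
    return gains
```

Sweeping the azimuth over time with such a function is one simple way a sound can be moved around the hall under algorithmic or RadioBaton control.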
In the third movement, “From the Studio”, the virtual orchestra “takes a little break”, and the improvising cellist is “live-sampled” and processed by a EuroRack modular synthesizer that features the Roland Scooper, Demora, Bitrazer and Torcido, the Mutable Warps and Clouds, the TipTop Z-DSP, the 4ms MultiBand Resonator, the Audio Damage GrainShift, the ModCan CV Record and Dual FreqShifter, the SoundMachines UL1, the Doepfer Ext. In/Env Follower and Voltage Controlled Bit Modifier, and the Make Noise Phonogene, Echophon, Erbe-Verb and RxMx. The music of all four movements is accompanied by a live visual performance that features a multi-camera mixing and processing system designed in Jitter and Resolume by Basil Simon.
*The unique, exquisite, beautiful, delicate, haunting, evocative, and powerful images that Basil uses to “underscore” the piece are by the American photographer Lori Whalen and the French photographer Cati Vaucelle.
2. Vox Voxel – for 3D Printer, Real-time Processing and Ambisonics (2014)
Composers: John Granzow and Fernando Lopez-Lezcano
From an IBM 720 line printer playing “Three Blind Mice” in 1954 to dot-matrix printers playing love songs and Queen, the mechanical noises of printers have slowly been tamed, domesticated and controlled, and countless unproductive hours of programming time have been spent figuring out how to turn those noises into musical notes, phrases and whole pieces for the enjoyment of the IT team. From deafening antique mainframe line printers to whisper-quiet inkjets, all have had their moment in the spotlight of a concert performance (or at least of a basement computer room).
Vox Voxel is “composed” by designing a suitably useless 3D shape and capturing the sound of the working 3D printer with piezoelectric sensors. Those sounds are amplified, modified and multiplied through live processing on a computer running Ardour with LV2/LADSPA plugins, and output in full, matching 3D sound.
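The “amplified, modified and multiplied” stage can be pictured as a block processor that layers delayed, attenuated copies of the piezo signal. This Python sketch is only an illustration of that idea; the tap times, gains, and function names are assumptions, not the actual Ardour/LV2 chain used in the piece.

```python
def multiply_taps(block, history, taps):
    """Mix `block` with earlier samples from `history` at the given taps.

    `taps` is a list of (delay_in_samples, gain) pairs; `history` holds
    previously received samples, oldest first, and is extended in place
    so the next block can read back into the past.
    """
    out = []
    for i, x in enumerate(block):
        y = x
        for delay, gain in taps:
            idx = len(history) + i - delay   # position of the delayed sample
            if 0 <= idx < len(history):
                y += gain * history[idx]     # add an attenuated echo
        out.append(y)
    history.extend(block)                    # remember the dry input for later blocks
    return out
```

Feeding the printer's stepper-motor clicks through a few such taps multiplies each mechanical event into a small rhythmic cluster.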
The piece is dedicated to our endangered wooden 3D printer, slowly declining with the rise of folded metal frames in entry-level machines. The wood, if fragile, is good for contact vibrations: it amplifies the rhythms of the tool-path and the frequencies of the stepper motors. This rare 3D printer takes six minutes to warm up its extruder. For this, it has also fabricated an array of extensions for its equally endangered human performer.
3. Sequentially Mutable Nebulae for EuroRack Modular Synthesizers (2015)
Live audio – synthesized, composed and performed by Richard Boulanger and Michael Bierylo
Live video – synthesized, composed and performed by Chatchon Srisomburananont
Live cameras – mixed and processed by Basil Simon
“Sequentially Mutable Nebulae” is a “composed” multi-channel soundscape realized live. Against a number of evolving and permuting ostinati, individual soundEvents and swarms of mutating soundObjects – literally designed on the fly by Bierylo and Boulanger – are projected around the concert hall. They are cast and recast as the energy of the piece ebbs and flows.
In the quieter sections, FM radio excerpts are presented using the ADDAC and Thonk FM radio modules, and live vocal processing is featured using the TipTop Z-DSP, Mutable Clouds, and Audio Damage GrainShift modules. All the moving sounds are synthesized by the Mutable Braids, Elements, and Edges; the Make Noise DPO, Mysteron, and Telharmonic; the Intellijel Shapeshifter and Atlantis; the ADDAC WavePlayer; and the Csound-based Qu-Bit Nebulae.
The sounds are spatialized by a collection of modules from ADDAC, 4ms, Toppobrillo and WMD. The spatial locations, movements, and distributions are controlled by Doepfer Theremins, Intellijel Quad Panners, Hackme Vectors, Make Noise Pressure Points, a SoundMachines LightPlane, and an ADDAC Lissajous Generator.
The ostinato melodic and bass patterns are produced and permuted by five sequencers: the Intellijel Metropolis, the Make Noise René, the Monome White Whale and Earthsea, and the Arturia BeatStep Pro. The rhythmic patterns are generated by the Mutable Grids, the TipTop Trigger Riot and Circadian Rhythms, and the ADDAC Heuristic Rhythmic Generator. Clocks are synchronized, divided, and multiplied by a number of 4ms modules. Patterns, paths, and parameters are randomized with the Make Noise Wogglebug, the Qu-Bit Nano Rand, and the ADDAC Probability Generator.
The visual component of “Sequentially Mutable Nebulae” is also realized “live”, using voltage-controlled EuroRack modules that support visual synthesis, such as the Bleep Labs 3Trins and the LZX Industries Visual Cortex, and modules that support voltage control of “Processing” sketches, such as the MiniGorille Geometry Synth. These visuals are controlled by the Doepfer Theremin, the MiniGorille CV Graphic, and the Meng Qi Music Voltage Memory.
In addition, the piece features multiple HD cameras, live video processing, and audio-reactive generative visuals. The audience will not only be surrounded by and immersed in this percolating live soundscape as it evolves; they will also be able to see, close up, all the live patching and real-time sound design that Bierylo, Boulanger, and Srisomburananont will be doing on the EuroRack modular systems to synthesize, control and mutate the sound and visual objects.
4. Five Pieces for Guitar and Live Electronics (2007) – 1. Echoes; 2. Lachrymal; 3. Brunete; 4. Saudade; 5. Stèle
Composer: Ronald Bruce Smith
Guitar: David Tanenbaum
Five Pieces for Guitar and Live Electronics was commissioned under a UC Discovery grant from the University of California’s Industry-University Cooperative Research Program (IUCRP) and the Gibson Guitar Corporation. It was made possible through the support of the Center for New Music and Audio Technologies (CNMAT), Department of Music, University of California, Berkeley. The aim of the commission was to incorporate into a concert piece for solo guitar the guitar-centered software technology that researchers there had developed. These effects were developed in the Max/MSP environment, and I have made use of several of them, along with a few other elements that are not in that specific package.
Much of the music in this composition requires a high level of guitar virtuosity. It was written with the outstanding technique and musicianship of David Tanenbaum in mind; he premiered the work in November 2007.
Following the San Francisco premiere of the composition, Jules Langert of the San Francisco Classical Voice wrote: “Ronald Bruce Smith’s Five Pieces for Guitar and Live Electronics (2007) is a kind of duet, in which the electronics amplify and react to whatever the guitar is playing, creating new material in the process. This partnership was most evident in the thoughtful, final piece, Stèle, where both elements seemed to shape the music in tandem, with guitar and electronics behaving almost as equals. Echoes, the first piece, is a brilliant toccata-like movement, while the other three pieces evoke flamenco, Brazilian, and Middle Eastern idioms.” Paul Hertelendy of Artssf noted after the same performance that “The contrasting sections spread over the varied personae of the solo instrument, sometimes Javanese, sometimes Brazilian. I especially relished the latter’s Saudade (nostalgic yearning), an emotional trip not quickly forgotten.”
5. Divertimento de Cocina
Composer: Fernando Lopez-Lezcano
A LaunchPad controller, a computer and a custom set of SuperCollider classes control the music-synthesis processes that use and transform raw kitchen-utensil samples recorded long ago. The extremely simple rhythms at the beginning of the piece become progressively more complicated as they are layered into increasingly thick textures. While the performer walks through different soundscapes, rhythms form the backbone and guide of the rest of the piece. The LaunchPad’s array of buttons manipulates multiple “virtual performers”, which can be queued, started, paused or stopped asynchronously. Through this very simple interface, the piece meanders through eight layers of material arranged in “scenes”. The control program also dynamically spatializes all sounds under the control of the performer, and the 3D soundscape can be diffused through an arbitrary number of speakers (the original sound stream is generated internally in Ambisonics, at at least third-order full-periphonic resolution).
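The “virtual performer” logic can be sketched roughly as follows. The class names, states, and button behavior here are assumptions made for illustration, not the actual SuperCollider classes used in the piece.

```python
class VirtualPerformer:
    """One layer of the texture, with a tiny asynchronous lifecycle."""

    def __init__(self, name):
        self.name = name
        self.state = "stopped"   # stopped -> queued -> playing <-> paused

    def queue(self):
        if self.state == "stopped":
            self.state = "queued"

    def start(self):
        if self.state in ("queued", "paused"):
            self.state = "playing"

    def pause(self):
        if self.state == "playing":
            self.state = "paused"

    def stop(self):
        self.state = "stopped"


class Scene:
    """A named group of performers that one row of pad buttons controls."""

    def __init__(self, performers):
        self.performers = performers

    def button_pressed(self, index):
        # A press queues a stopped performer; a second press starts a
        # queued one, so layers enter the texture asynchronously.
        p = self.performers[index]
        if p.state == "stopped":
            p.queue()
        elif p.state == "queued":
            p.start()
```

Eight such scenes, each holding its own set of performers, would give the kind of layered, button-driven walk through materials that the note describes.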
6. KOTI for Solo Cello, the Finnish Kantele, and a Chinese Singing Bowl
Composer: Kari Juusela
Cello: Kari Juusela
iPad and NanoKontrol: Michael Bierylo
iPad, LaunchPad and NanoKontrol: Richard Boulanger
Live Visual Performance: Basil Simon
KOTI means “home” in Finnish. The piece was written as an homage to the composer’s summer cottage and his childhood home in Lappeenranta, Finland. The main musical theme of KOTI is based on an ancient song from the Finnish national epic, The Kalevala. The piece is for solo cello, the Finnish kantele, and a Chinese singing bowl, accompanied by Csound-based samplers, synthesizers and signal processors triggered from a Novation LaunchPad and Korg NanoKontrol controllers. The cello is played and spatialized using a K-bow*. The sounds of the cello, kantele, and bowl are processed and transformed through two Csound-based iPad apps – “csGrain” and “csSpectral”. The real-time visuals that underscore the piece juxtapose, mix, mask, and chroma-key multi-camera live and processed images with “classic” images from The Kalevala by the great Akseli Gallen-Kallela.
*The Keith McMillen “K-bow” is a carbon-fiber sensor bow that is wirelessly connected to a laptop through a built-in Bluetooth transmitter. Accompanying software is programmed to read and follow a performer’s gestures, continuously tracking the speed, pressure, and XYZ location of the bow. These control signals can be used to trigger sampled sounds and control their playback speed, direction and pitch; to trigger and control synthetic sounds and modulate their parameters; or, via a number of built-in signal-processing algorithms, to modify and spatialize the sound of the amplified acoustic cello itself. In fact, custom algorithms have been written that allow the position of the bow to control the location of the cello sound as it is projected from the eight loudspeakers in tonight’s concert.
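A gesture-to-parameter mapping of the kind described above might be sketched like this. The ranges, formulas, and function name are illustrative assumptions, not Keith McMillen's actual algorithms.

```python
def map_gesture(speed, pressure):
    """Map normalized bow sensor values (0..1) to playback controls.

    Returns (playback_rate, amplitude): faster bowing raises the sample
    playback rate around 1.0 (original speed); heavier bow pressure
    raises the output level.
    """
    speed = min(max(speed, 0.0), 1.0)        # clamp noisy sensor input
    pressure = min(max(pressure, 0.0), 1.0)
    playback_rate = 0.5 + speed              # 0.5x .. 1.5x original speed
    amplitude = pressure ** 2                # squared for a perceptual-ish taper
    return playback_rate, amplitude
```

The same pattern extends naturally to the spatial mapping: the tracked XYZ bow position becomes an angle, which picks gains for the eight loudspeakers.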