DOI: 10.4312/mz.58.2.155-174
UDK 781.6:791:004

Transcoding as a Compositional Paradigm: The Association of Compositional Parameters and Computer Analysis of Moving Images in Outer Space

Javier Elipe-Gimeno, Charles de Paiva Santana
Aix Marseille University, CNRS, PRISM, Marseille

ABSTRACT

This paper focuses on a composition by one of the authors, Outer Space, conceived using transcoding techniques between musical and visual parameters, such as the relative percentage of black and white dots per frame and the difference between the positions of the dots from one frame to the next.

Keywords: algorithmic music, musical composition, experimental film, video analysis, Max/MSP, orchestration

Introduction

Composers and musicologists who work with computational tools deal, on a daily basis, with different types of association, at varying levels of sophistication, between musical, sonic and digital code.1 Be it through the MIDI protocol, the parameterization of audio processing (bit rate, sample rate, loudness), or the analysis of sound through descriptors and representation systems such as MEI,2 MusicXML3 and Humdrum,4 among others, the digital code plays the role of mediator between musical phenomena and computational processing. The imposition of communication in numerical or "digital" code comes from the computer: it is commonplace and applies to virtually any kind of operation we wish to perform, be it of a musical, textual or visual nature.5

1 Geraint A. Wiggins, "Computer Representation of Music in the Research Environment," in Modern Methods for Musicology: Prospects, Proposals, and Realities, eds. Tim Crawford and Lorna Gibson (London: Routledge, 2016), 27–42.
2 Perry Roland, "The Music Encoding Initiative (MEI)," paper presented at MAX2002: Proceedings of the First International Conference on Musical Applications Using XML, 55–59, http://xml.coverpages.org/MAX2002-PRoland.pdf.
3 Michael Good, "MusicXML for Notation and Analysis," in The Virtual Score: Representation, Retrieval, Restoration, eds. Walter B. Hewlett and Eleanor Selfridge-Field (Cambridge, MA: MIT Press, 2001), 113–124.
4 David Huron, "Music Information Processing Using the Humdrum Toolkit: Concepts, Examples, and Lessons," Computer Music Journal 26, no. 2 (2002): 11–26; David Huron, "Humdrum and Kern: Selective Feature Encoding," in Beyond MIDI: The Handbook of Musical Codes (Cambridge, MA: MIT Press, 1997), 375–401.
5 "To mediate an object, a digital or computational device requires that this object be translated into the digital code that it can understand. […] The key point is that without the possibility of discrete encoding there is no object for the computational device to process." David M. Berry, "The Computational Turn: Thinking About the Digital Humanities," Culture Machine 12 (2011).

In the humanities, art and musical composition, the use of the computer and the consequent omnipresence of the digital code favour a conception of the sensorial stimulus as something fluid, something which has plasticity.6 Musical forms and materials can, despite their many individual specificities, be transposed to, or adapted from, other representational manifestations, mediated by the digital code.7

6 "[…] sensory substitution operates by mapping an otherwise absent modality into an existing one; absent vision into existing hearing. […] However for most, audiovisual transcoding links two modalities, 'channels' already in perceptual use. Secondly, sensory substitution involves long-term integration and interaction with the environment; […] there are some striking parallels, and transcoded AV certainly hints at artificial synesthesia and a rewired sensorium, but as bounded aesthetic objects these works cannot realise that perceptual transformation." Margaret Schedel, "Colour Is the Keyboard: Case Studies in Transcoding Visual to Sonic," in The Oxford Handbook of Algorithmic Music, eds. Roger T. Dean and Alex McLean, Oxford Handbooks Online (Oxford: Oxford University Press, 2018), DOI: 10.1093/oxfordhb/9780190226992.013.8.
7 "[…] a computer requires that everything is transformed from the continuous flow of our everyday reality into a grid of numbers that can be stored as a representation of reality which can then be manipulated using algorithms. These subtractive methods of understanding reality (episteme) produce new knowledges and methods for the control of reality (techne). They do so through a digital mediation, which the digital humanities are starting to take seriously as their problematic." Berry, "The Computational Turn."
This kind of approach can be identified in what has been called multimodality.8 Works in the contemporary repertoire integrate, in a structured way, music, dance, scenography and light through sophisticated mappings. Examples abound of computer programs that specialize in generating new musical structures from an algorithmic mapping of extra-musical phenomena such as fractals,9 text10 or images.11 The same can be said for recent approaches to sonification.12

8 Meinard Müller et al., "A Multimodal Way of Experiencing and Exploring Music," Interdisciplinary Science Reviews 35, no. 2 (2010): 138–153; Marina Gall and Nick Breeze, "Music Composition Lessons: The Multimodal Affordances of Technology," Educational Review 57, no. 4 (2005): 415–433; Clemens Wöllner, ed., Body, Sound and Space in Music and Beyond: Multimodal Explorations, 1st ed. (London: Routledge, 2018).
9 See Omar López-Ortega and Shani Ioana López-Popa, "Fractals, Fuzzy Logic and Expert Systems to Assist in the Construction of Musical Pieces," Expert Systems with Applications 39, no. 15 (2012): 11911–11923; Eduardo Miranda, Composing Music with Computers (London: Routledge, 2001); David Little, "Composing with Chaos: Applications of a New Science for Music," Journal of New Music Research 22, no. 1 (1993): 23–51.
10 For instance, Hannah Davis and Saif Mohammad, "Generating Music from Literature," paper presented at Proceedings of the 3rd Workshop on Computational Linguistics for Literature (CLFL 2014), 1–10, DOI: 10.3115/v1/W14-0901.
11 See Luísa Ribas, "Sound and Image Relations: A History of Convergence and Divergence," Divergence Press 1, no. 2 (2014); Ernest Edmonds et al., "Audio-Visual Interfaces in Digital Art," paper presented at Proceedings of the 2004 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology (ACE 2004), 331–336, DOI: 10.1145/1067343.1067392; Esther Lemi, Anastasia Georgaki, and James Whitney, "Reviewing the Transformation of Sound to Image in New Computer Music Software," paper presented at Proceedings of the 4th Sound and Music Computing Conference (SMC 2007), 57–63, DOI: 10.5281/zenodo.849384.
12 For instance, Antonio Polo and Xavier Sevillano, "Musical Vision: An Interactive Bio-Inspired Sonification Tool to Convert Images into Music," Journal on Multimodal User Interfaces 13, no. 3 (2019): 231–243.

If the alliance between music and digital code is an obvious one, having been treated on numerous occasions in the literature, the study of specific transcoding strategies, sometimes called translational models in the context of algorithmic music,13 seems to have received less attention. In this paper, we adopt a definition of musical transcoding as the act of transferring patterns, forms or structures from a system of principles or rules (a code proper to an artistic or scientific domain) to the musical or compositional code, and vice versa. In practice, in the musical context, transcoding refers to a series of creative techniques that call upon extra-musical elements and may seek a certain unity of musical content by sharing the same structure between several distinct parameters of the musical discourse.

13 Margaret Schedel identifies the following synonyms for the word transcoding: "synaesthetic algorithm, sensory substitution, transcoding, crossmodal, intermedia, and so on." Schedel, "Colour Is the Keyboard."
Not only the association of music with the digital code, but also the use of extra-musical forms, patterns and structures as a compositional resource, precedes the invention of the computer and goes back at least to classical antiquity.14 If we think broadly, this definition of transcoding can encompass the Pythagorean association between mathematical proportions and musical intervals, the associations between modes and ethos in the ancient and medieval world, the association between musical metrics and poetics, or the music of the spheres.

14 For a history of related practices see Helmut Kirchmeyer, "On the Historical Constitution of a Rationalistic Music," Die Reihe 8 (1968): 11–24.

In the West, one of the first treatises on composition, the Micrologus (1026 A.D.) by Guido d'Arezzo,15 inaugurated a practice of transcoding that would last for centuries: the association of an alphabetic code with musical pitches (Chapter XVII). In the eleventh century, Guido introduced a strategy to algorithmically map the vowels of a given text onto musical pitches.

15 Claude Palisca, ed., Hucbald, Guido, and John on Music: Three Medieval Treatises (New Haven: Yale University Press, 1977).

After the method of composition combining text and melody introduced by d'Arezzo and taken up by Johannes Cotto,16 another incarnation of this technique would be at the origin of one of the most intriguing scores of the Renaissance, the Missa Hercules dux Ferrariae by Josquin des Prez.17 The work is one of five compositions by different composers dedicated to a Duke Ercole (Hercules) of Ferrara, each using the syllables of his name as its melodic basis, its cantus firmus.

16 Ibid.
17 Lewis Lockwood, "Soggetto cavato," in New Grove Dictionary of Music and Musicians, ed. Stanley Sadie (London: Macmillan, 1980), 17: 442–443.
This technique, which Gioseffo Zarlino's famous treatise on music theory calls "soggetto cavato dalle vocali di queste parole," would resonate well into the twentieth century, when the cellist Mstislav Rostropovich commissioned twelve composers, including Pierre Boulez, Luciano Berio and Henri Dutilleux, to write a tribute to the patron of the arts Paul Sacher.18 Boulez's contribution, Messagesquisse, uses not only the association between syllables and pitches, but also, through Morse code, an association between syllables and rhythm.19 In addition to its use as a tool of homage, the soggetto cavato has been used throughout the history of Western music as a means of introducing constraints into compositional work, as a challenge.

18 C. Samuel, works presentation by Claude Samuel (in the volume's order), in Pierre Boulez: Œuvres Complètes, CD collection (Universal Music Division Decca Records France, 2013).
19 Antoine Bonnet, "Ecriture and Perception: On Messagesquisse by Pierre Boulez," Contemporary Music Review 2, no. 1 (1987): 173–209.

If in the past the links between music, alphabetic code and literature shaped most transcoding approaches, in the modern and contemporary period composers have looked to other domains for structural sources. One of the most influential musical works of the 1950s was Iannis Xenakis's Metastaseis, in which the composer developed a framework of orchestral lines through a compositional plan that shared structural elements with the Philips Pavilion, the building commissioned for the 1958 Brussels Exposition Universelle in which Varèse's Poème électronique was premiered.20

20 Kurt Stone, "Reviews of Records: Xenakis: Metastaseis (1953–54); Pithoprakta (1955–56); Eonta for Piano, Two Trumpets, and Three Trombones (1963–64)," The Musical Quarterly 54, no. 3 (1968): 387–395, DOI: 10.1093/mq/LIV.3.387.

Since the beginning of the twentieth century, the interactions between visual and sound parameters have formed a cohesive and interdependent whole in many artistic practices. Visual art movements such as the expressionist Blue Rider (Der Blaue Reiter) group led by Kandinsky, or the visual music of Hans Richter and Oskar Fischinger, paved the way for the transcoding art of today.21

21 See Schedel, "Colour Is the Keyboard."

In the case of the music composed in 2019 by the Spanish composer Javier Elipe-Gimeno for Peter Tscherkassky's film Outer Space, a transcoding approach allowed an investigation associating parameters of musical composition with computer analysis of data extracted from film frames (data such as the relative percentage of black and white dots per frame, or the difference between the positions of the dots from one frame to the next). Other approaches to transcoding from experimental film to music are found in the work of artists such as Bernhard Lang and Norbert Pfaffenbichler.

Outer Space Projects

Film Project

Outer Space, by Peter Tscherkassky,22 is an Austrian experimental film made in 1999. It uses the technique of "found footage,"23 which reuses images from other films to create a new montage (and, with it, new sensations).

22 Outer Space, Index 8: Films From a Dark Room, DVD, directed by Peter Tscherkassky (Vienna: Index Edition, 2006).
23 Jonathan Rosenbaum, "Lost Material and Found Footage: Peter Tscherkassky's Dark Room – and Ours," Found Footage 4 (2018), published also at https://jonathanrosenbaum.net/2021/08/lost-material-and-found-footage-peter-tscherkasskys-dark-room-and-ours-tk/.
In this case, Tscherkassky drew on the American horror film The Entity24 (Sidney J. Furie, 1982) to create the films Outer Space (1999) and Dream Work (2000).25 To realize these films, the Austrian director copied several frames from the original film onto unexposed 35 mm film stock in a darkroom. For this he used several laser devices (of different sizes and shapes), which allowed him to copy the photograms in different ways.26 This way of working accentuates certain impressions and sequences of the film. In Outer Space, the new found-footage montage places additional emphasis on the scenes of aggression, creating a new sensory dimension. An example of this can be seen in the sequence that begins at 5'57'', where the same sequence of shots is repeated several times.27 This continuous repetition creates a new sensation in the viewer.

24 The Entity, DVD, directed by Sidney J. Furie (Los Angeles: Twentieth Century Fox, 1982).
25 Dream Work, Index 8: Films From a Dark Room, DVD, directed by Peter Tscherkassky (Vienna: Index Edition, 2006).
26 Matthew Levine, "Controlled Chaos: The Cinematic Unconscious of Peter Tscherkassky," Found Footage Magazine 4 (2018).
27 "Peter Tscherkassky – Outer Space," Vimeo video, 9:22, January 30, 2019, in Index 8: Films From a Dark Room, DVD, directed by Peter Tscherkassky (Vienna: Index Edition, 2006), https://vimeo.com/314251447.

For the original soundtrack, Tscherkassky used a similar technique applied to sound: he copied fragments of the soundtrack of the original film onto the unexposed film stock.28 This sound composition therefore consists of residual sounds from The Entity, produced by copying parts of the image into the soundtrack area of the film stock, a process that yields a characteristic kind of saturated sound. The soundtrack creates a perfect balance between moments of emptiness and moments of audiovisual saturation. The articulation between image and sound is organized organically, thanks to the use of structures common to the sound and the visual part (images used as waveforms, images used as musical motifs, etc.).

28 An example of this way of working can be seen at the following link: "Cinémas de traverse [excerpt]: Peter Tscherkassky Interview," YouTube video, 0:40, May 26, 2015, accessed November 19, 2022, https://www.youtube.com/watch?v=aitaaM-ZmHU.

The composition29 of a new soundtrack for Outer Space, scored for electric guitar, piano, saxophone, percussion and electronics, was a demanding exercise: Tscherkassky's film is very energetic, alternating between very dense scenes and empty ones (white screens, etc.). From a traditional film-scoring point of view, this could lead to an illustrative composition that merely accentuates the visual intention; conversely, a purely contrapuntal composition could introduce information unrelated to the images. Transcoding allows us to find new musical propositions adapted to the complexity of the images, and the information it provides shows us how to adapt to the constant evolution of the film.

29 Javier Elipe-Gimeno, "Outer Space," unpublished edition (Paris, 2019).
For this purpose, a search for common structures between the visual and sound planes is necessary (similar, in a way, to the work Peter Tscherkassky himself had done). The main motivation for using computer analysis in this work is precisely that search for structures common to the musical and cinematographic parts. It shows us how to explore new articulations between the musical/sound and visual parts. For instance, computer analysis lets us discover musical-visual gestures that cannot be appreciated in a simple viewing of the film, and thus find new links between film and music. The objective is then to explore a writing of the film/music relationship that evolves throughout the whole film.

Using Algorithms for the Analysis of the Film's Visual Data

In order to find common parameters between film and music, we carried out different types of analysis with the Jitter library in the Max/MSP30 software.31

30 Max/MSP is an object programming language for music and multimedia initiated at Ircam-Centre Pompidou in 1985 (originally called The Patcher) and currently developed by the Californian company Cycling '74. "What is Max?," Cycling '74, accessed November 19, 2022, https://cycling74.com/products/max.
31 The Jitter library is a collection of Max objects that can be used in the Max/MSP software for video processing and analysis.

For this, we considered several types of algorithms:

1. Predominance of blacks/whites: The first of these algorithms is based on the analysis of average brightness, establishing, for example, the value 0 when the frame is entirely black, 1 when it is entirely white, and 0.5 when the amounts of black and white are similar. As Outer Space is a film with rapidly changing events, this analysis generates luminosity gestures that can be interpreted in a musical way (a minimal sketch of this analysis is given after this list). In Figure 1 we can see an example of this kind of analysis, programmed at Ircam in 2018.
2. Number of oscillations per second: The second type of algorithm calculates the rhythm of the footage and the rate of change of shots in the film. The idea is to identify the predominant shots and obtain their rhythm and proportions, giving a general awareness of the thematic proportions of each shot.
3. Spatialization in the picture: The third type of algorithm calculates where in the frame the action takes place. In Figure 2, for example, one can see whether the action occurs in the centre, at the edges, or is distributed evenly over the whole frame.
4. The fourth type of algorithm calculates the intermediate shots that may exist between two main shots. This allows us to characterize the transitions and the type of permutation between them, and therefore to build a progression graph.
5. The last type of algorithm analyzes the disturbance, i.e. the number of elements per shot, which allows us to estimate the intelligibility of the shots: for example, the intelligibility of the images in relation to basic geometric figures such as the circle, the square or the triangle.
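The analyses themselves were realized as Max/MSP/Jitter patches, which cannot be reproduced here as text. As a minimal sketch of the first algorithm only, the following Python fragment computes an equivalent black/white predominance value per frame; the file name, the OpenCV toolchain and the 0–1 scaling are assumptions made for this example, not elements of the original patch.

```python
# Minimal sketch of the black/white predominance analysis (algorithm 1),
# assuming a local copy of the film and the OpenCV + NumPy libraries.
# The original analysis was a Max/MSP/Jitter patch, not this code.
import cv2
import numpy as np

def luminosity_curve(path):
    """Return one value per frame: 0.0 for an all-black frame, 1.0 for an
    all-white frame, around 0.5 when black and white are balanced."""
    cap = cv2.VideoCapture(path)
    curve = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        curve.append(float(np.mean(gray)) / 255.0)  # average brightness, scaled to 0-1
    cap.release()
    return curve

if __name__ == "__main__":
    curve = luminosity_curve("outer_space.mov")  # hypothetical file name
    print(f"{len(curve)} frames analysed, mean luminosity {np.mean(curve):.2f}")
```

Read at the film's frame rate, such a per-frame sequence is the "luminosity curve" that the following sections treat as a waveform.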
Figure 1: Analysis of the average brightness, from a Max/MSP/Jitter patch made at Ircam in 2018.

Figure 2: Analysis of the spatialization in the picture, from a Max/MSP/Jitter patch made at Ircam in 2018.

Application of the Concepts of Music in Film Analysis Using Computer Analysis

Among the different algorithms planned, we have studied in most depth the first one, based on the light analysis or predominance of blacks/whites (the other algorithms will be used in future stages of the work). We chose this algorithm because it was the one that, from the beginning, gave the clearest and most direct results. The luminosity curve produced by this analysis gives us a graphic that we use in four different ways: global structure; writing the relationship between film and music; complementarity relations with the visual part; and rhythmic counterpoint.

A) Analysis of the general structure

This first use of the analysis helps us to obtain a global structure of the film. The Max/MSP patch32 used for this analysis allowed us to record the curve and derive the global structure of the film. As we can see in Figure 3, the general brightness curve shows different sections. We have divided the curve into nine parts, coinciding with the nine sections of the musical composition. As mentioned in the section "Using Algorithms for the Analysis of the Film's Visual Data," when the waveform is at its lowest value it represents an abundance of black in the image; at its highest value, a predominance of white.

32 This Max patch was developed at Ircam in collaboration with the computer music designer (RIM) Carlo Laurenzi.

Figure 3: Analysis of the general structure using the luminosity curve.

The brightness curve, represented as a waveform, invites an analogy with music, and we can find several coincidences between the sound and the visual structure. In Figure 3, the different sections have been ordered according to different criteria: the size, the geometry and the temporality of the waveforms. The first two criteria show us a structure that is common to the visual and the sound plane. The temporal criterion allows us to find sections of similar duration, producing visual shapes that help us establish these nine divisions.

B) Writing the relationship between film and music

We can also use the brightness curve to make it dialogue with the other instruments. The visual form of this analysis allows us to draw a similarity between the luminosity curve and the instrumental lines.
In Figure 4 we can observe how the luminosity curve interacts with the other instrumental lines, this curve being treated as a fifth instrument.

Figure 4a: Use of the luminosity curve in Javier Elipe-Gimeno's Outer Space score (bars 30–34).

Figure 4b: Use of the luminosity curve in Javier Elipe-Gimeno's Outer Space score (bars 30–34).

In Figure 4, the design of the curve allows us to interact with the instruments in different ways: by doubling, by a sort of sonification of the luminosity, or by counterpoint (the brightness gesture does not correspond to that of the instrumental lines). This type of work offers different ways of interacting with the images:

1. From parallelism to non-synchronism: This results in a musical composition that constantly alternates between accompaniment and synchronization between image and music. We can appreciate a fairly regular balance between doubling and counterpoint.
2. Punctuation of phenomena: Different types of punctuation, such as anticipation, association, narrative counterpoint or recall, combine naturally with the audiovisual discourse.
3. Dialogue between instruments and film based on the black/white dominance curve: the luminosity, a visual parameter, is used as a sound parameter, which produces a relatively organic film-music interaction.

C) Principle of complementarity with the visual part: audiovisual orchestration

This third application of the luminosity curve shows the exchange of energy between the visual and sound parts; in other words, how the audiovisual action passes from the screen to the sound, and vice versa. At some moments the audiovisual activity is on the screen, at others in the sound part, and at others on the visual and sound levels at the same time. This gives us a sensation of audiovisual orchestration. In Figure 5 we can see how the brightness curve keeps a rather low value; at that moment the instrumental ensemble is active. When the brightness curve increases (two bars before the end), the other instruments stop, giving the sensation of an exchange of energy between the visual and the sound part.

Figure 5: Principle of complementarity in Javier Elipe-Gimeno's Outer Space score (bars 19–25).

This effect or sensation of audiovisual orchestration is also used elsewhere in a progressive way. Audiovisual orchestration is not always used in a compensatory way (i.e. more music when there is less action in the images, and vice versa); it can also act in a parallel way, for example when an increase or decrease of the action takes place at the same time in the sound and the visual part. An example of this parallel audiovisual orchestration is shown in Figure 6: the progressive decrease in the intensity of the images (and of the marked rhythm they provoke) is accompanied by a similar process in the music, whose dynamics decrease as the brightness curve reaches the value 0.

Figure 6: Principle of parallel audio-visual orchestration in Javier Elipe-Gimeno's Outer Space score (bars 153–160).

D) To give a rhythmic counterpoint to the energy of the images

The luminosity curve can also be used to obtain a rhythmic counterpoint, in this case created with the help of granular synthesis. To obtain it, we use a percussion sound (a symphonic bass drum in our case), which gives us a precise rhythm. This yields a rhythmic value that varies constantly: the value decreases when the white light increases (producing an acceleration effect) and increases (longer values) when black tones predominate.
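The exact mapping lives in the patch and the score; as a small illustrative sketch of the principle just described, the following Python fragment interpolates between a long and a short rhythmic value according to the luminosity. The duration limits (1.2 s and 0.08 s) and the function names are assumptions for the example, not values taken from the piece.

```python
# Illustrative sketch of the luminosity-to-rhythm mapping: bright frames give
# short values (acceleration), dark frames give long values. The duration
# limits are arbitrary choices for the example, not taken from the score.
def luminosity_to_duration(luminosity, longest=1.2, shortest=0.08):
    """Map a 0-1 luminosity value to a rhythmic value in seconds."""
    luminosity = min(max(luminosity, 0.0), 1.0)
    return longest - luminosity * (longest - shortest)

def curve_to_onsets(curve, fps=24.0):
    """Walk along the per-frame luminosity curve and return onset times (in
    seconds) for the percussive grains of the rhythmic counterpoint."""
    onsets, t = [], 0.0
    end = len(curve) / fps
    while t < end:
        onsets.append(t)
        frame = min(int(t * fps), len(curve) - 1)
        t += luminosity_to_duration(curve[frame])
    return onsets
```

Applied to the curve computed earlier, curve_to_onsets gives a stream of attack points whose density follows the light of the film, a textual counterpart of the granular module shown in Figure 7.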
Figure 7: Granular synthesis Max/MSP patch used to determine the rhythmic counterpoint of the Outer Space composition.

In Figure 7 we can observe a granular synthesis module at the output of the luminosity analysis. A Max object for beat analysis can be used to determine precisely the rhythm of this counterpoint. In this way we link the luminosity gesture to a time scale, obtaining a kind of rhythmic gesture.

Other Transcoding Parameters in Addition to Computer Analysis

The types of analysis shown in the previous section allow us to transport visual structures to the sound part, but also to analyze the images of a film with parameters of musical analysis, obtaining a set of parameters common to both planes (visual and sound). This analysis is not intended to give precise information that must be used to make the musical composition; the idea is to obtain a musical and visual analysis that gives us a series of gestures representing structures common to the visual and sound parts. Consequently, these analyses are proposed as a basis to be completed with another series of elements. In our first approach to this method, the analyses were completed with the following series of analogies:

• Image oscillations: trills, oscillating notes
• Flashing pictures: tremolos, alternating high and low notes
• Visual accents: accentuation of a sound gesture
• Symmetries in the picture: reversal of sound gestures
• Image superposition: superposition of different sound materials
• Concatenative synthesis: creation of complex instrumental gestures

This more intuitive way of proceeding is complemented by the parameters obtained in the analysis. For example, the image oscillations and the flashing moments analyzed by the luminosity curve give us a more or less regular curve (partly because of the limited resolution of the analysis), and the analysis cannot determine these oscillations in detail (a rough sketch of one way such passages might be flagged automatically is given at the end of this section). Consequently, the luminosity analysis was completed in this case by the addition of trills or tremolos between two notes. Another element that we have not yet been able to determine through the luminosity analysis is the interpretation of symmetries and overlapping planes; this was also completed with motifs and musical gestures superimposed in retrograde, inverted form, etc.
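As a hypothetical complement to this intuitive completion (not part of the patches described above), rapid luminosity oscillations could be located directly from the per-frame curve; the threshold and minimum span length below are arbitrary assumptions for the sketch.

```python
# Hypothetical helper for the "image oscillations / flashing pictures"
# analogies: flag spans of frames where the luminosity jumps strongly from
# one frame to the next, as candidates for trill or tremolo writing.
# Threshold and minimum span length are illustrative assumptions.
def flicker_regions(curve, threshold=0.25, min_frames=6):
    """Return (start_frame, end_frame) pairs of rapid luminosity oscillation."""
    regions, start = [], None
    for i in range(1, len(curve)):
        if abs(curve[i] - curve[i - 1]) >= threshold:
            if start is None:
                start = i - 1          # a new flicker span begins
        else:
            if start is not None and i - start >= min_frames:
                regions.append((start, i))
            start = None               # calm frame: close any open span
    if start is not None and len(curve) - start >= min_frames:
        regions.append((start, len(curve)))
    return regions
```

Each returned span can then be converted to bar positions at the film's frame rate and marked in the score as a candidate trill or tremolo passage.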
Future Steps of Work

The analysis and study of common structures between image and musical composition using computer analysis is a complex process: it is necessary to synthesize the possible algorithms and the different ways of applying the analyses. The use of analyses based on graphic elements (such as the luminosity curve explained above) can be equally complicated to interpret musically. These analyses therefore offer different possibilities of development for future stages of the work, which we can divide into three axes.

The first axis is the development and adaptation of further algorithms, such as those already outlined above:

• Analysis of the action in the frame of the image: centre / sides / whole frame
• Analysis of musical harmonic fields
• Analysis of the action within the framework of the image
• Analysis of image symmetries
• Image intelligibility: recognizable geometric shapes in the images
• Development of sound synthesis tools related to image analysis

A second axis would be to combine the algorithms used with computer-assisted composition tools such as the OpenMusic software33 or the bach library34 in Max/MSP, which would make it possible to associate the results of the computer analysis with musical intervals and rhythmic values. Another possibility is to apply the values obtained in real-time processing modules. For example, in Figure 8 a synthesizer programmed with Max/MSP/Jitter works with the parameters obtained from the film analysis. In this case, the division of the screen into vertical segments allows us to obtain a timeline of values. By adjusting different parameters, such as the application of different filters, noise parameters, beats per loop, etc., we can obtain a sound result from the analysis performed.

33 OpenMusic is an object-oriented visual programming language based on Common Lisp, created and developed at Ircam-Centre Pompidou. "OpenMusic," IRCAM, accessed November 19, 2022, https://forum.ircam.fr/projects/detail/openmusic/.
34 The bach library is a collection of Max objects for computer-assisted composition in a real-time context within the Max/MSP software. "Projects," Bach, accessed November 19, 2022, https://www.bachproject.net/#latest-news.

Figure 8: A Max/MSP synthesizer patch driven by the parameters obtained from the analyses, made by Carlo Laurenzi at Ircam.

The third axis would be the improvement of the graphical representation of the analyses, which in the examples shown was the luminosity curve represented as a waveform. This representation could be improved by adding other analyses to the luminosity curve, for example the spatialization in the picture or the rhythmic counterpoint proposition. The different types of analysis could be linked to the graphical representation by using, for example, different colours, or even a 3D representation, in order to avoid confusion between the analysis curves.
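As a minimal sketch of that third axis, and assuming the per-frame curves have already been computed (the spatialization curve below is only a placeholder standing in for that analysis), several analyses could simply be overlaid in different colours; a 3D variant would add one depth position per analysis.

```python
# Sketch of a combined graphic score: overlay several analysis curves in
# different colours. "luminosity" and "spatialization" are assumed to be
# lists of per-frame values in the 0-1 range produced by earlier analyses;
# the spatialization curve is only a placeholder here.
import matplotlib.pyplot as plt

def plot_graphic_score(luminosity, spatialization, fps=24.0, out="graphic_score.png"):
    t = [i / fps for i in range(len(luminosity))]
    n = min(len(t), len(spatialization))        # guard against unequal lengths
    plt.figure(figsize=(12, 3))
    plt.plot(t, luminosity, color="black",
             label="luminosity (black/white predominance)")
    plt.plot(t[:n], spatialization[:n], color="crimson",
             label="spatialization in the picture (placeholder)")
    plt.xlabel("time (s)")
    plt.ylabel("normalized value")
    plt.legend(loc="upper right")
    plt.tight_layout()
    plt.savefig(out, dpi=150)
```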
Conclusions

With this paper we have shown an application of the concept of transcoding, a concept that runs through virtually the entire history of Western music, from antiquity to the present day. Transcoding remains of great importance today thanks to the different multimodal tools made possible by computer music. Among the most representative applications of this approach is the alliance between musical composition and experimental cinema; at the other end of the modern spectrum, the work of artists such as Norbert Pfaffenbichler and Bernhard Lang continues this tradition of perpetual transcoding between sight and sound. In this case study we have analyzed how this relationship materializes in the soundtrack of Outer Space, taking into account several transcoding principles.

The computer analysis of the images of the film Outer Space allowed the composer to explore the structures common to the visual and sound planes. The algorithms used allowed us to explore different parameters of the images. Among them, the most used was the luminosity curve, which allowed us to explore the same visual gesture through different prisms: from a vertical point of view, with the doubling of instrumental gestures; from a horizontal point of view, with the analysis of the general structure of the film; and finally, from an intermediate point of view, with the analysis of the exchange of energies between the two planes. The luminosity analysis allowed us to represent these readings in a single analysis curve, and in this way we were able to synthesize the analyses used in a graphic score. This graphical score could be enriched by adding other algorithms to the computer analysis. One way to combine the different algorithms in a single graphical score would be to make it three-dimensional, thus gathering the different parameters common to the two planes in a single graphical representation.

References

Berry, David M. "The Computational Turn: Thinking About the Digital Humanities." Culture Machine 12 (2011): 1–22.
Bonnet, Antoine. "Ecriture and Perception: On Messagesquisse by Pierre Boulez." Contemporary Music Review 2, no. 1 (1987): 173–209.
Buelow, George J., and Hans Joachim Marx. New Mattheson Studies. Cambridge University Press, 2007.
Carpenter, Grégoire, Victor Cordero, and Éric Daubresse. "Cartographier le timbre." Dissonance 119 (2012): 48–63. Accessed June 2022. https://www.dissonance.ch/upload/pdf/diss119web_hb_daubresse_exemples.pdf.
Davis, Hannah, and Saif Mohammad. "Generating Music from Literature." Paper presented at Proceedings of the 3rd Workshop on Computational Linguistics for Literature (CLFL 2014), Gothenburg, Sweden, April 27, 2014, 1–10. DOI: 10.3115/v1/W14-0901.
Edmonds, Ernest, Andrew Martin, and Sandra Pauletto. "Audio-Visual Interfaces in Digital Art." Paper presented at Proceedings of the 2004 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology (ACE 2004), Singapore, 331–336. DOI: 10.1145/1067343.1067392.
Elipe-Gimeno, Javier. "Outer Space." Unpublished edition. Paris, 2019.
Gall, Marina, and Nick Breeze. "Music Composition Lessons: The Multimodal Affordances of Technology." Educational Review 57, no. 4 (2005): 415–433.
Good, Michael. "MusicXML for Notation and Analysis." In The Virtual Score: Representation, Retrieval, Restoration, edited by Walter B. Hewlett and Eleanor Selfridge-Field, 113–124. Cambridge, MA: MIT Press, 2001.
Huron, David. "Humdrum and Kern: Selective Feature Encoding." In Beyond MIDI: The Handbook of Musical Codes, 375–401. Cambridge, MA: MIT Press, 1997.
Huron, David. "Music Information Processing Using the Humdrum Toolkit: Concepts, Examples, and Lessons." Computer Music Journal 26, no. 2 (2002): 11–26.
Jacob, Bruce L. "Algorithmic Composition as a Model of Creativity." Organised Sound 1, no. 3 (1996): 157–165. DOI: 10.1017/S1355771896000222.
Kirchmeyer, Helmut. "On the Historical Constitution of a Rationalistic Music." Die Reihe 8 (1968): 11–24.
Lemi, Esther, Anastasia Georgaki, and James Whitney. "Reviewing the Transformation of Sound to Image in New Computer Music Software." Paper presented at Proceedings of the 4th Sound and Music Computing Conference (SMC 2007), 57–63. DOI: 10.5281/zenodo.849384.
Levine, Matthew. "Controlled Chaos: The Cinematic Unconscious of Peter Tscherkassky." Found Footage Magazine 4 (2018).
Little, David. "Composing with Chaos: Applications of a New Science for Music." Journal of New Music Research 22, no. 1 (1993): 23–51.
Lockwood, Lewis. "Soggetto cavato." In New Grove Dictionary of Music and Musicians, edited by Stanley Sadie, 17: 442–443. London: Macmillan, 1980.
López-Ortega, Omar, and Shani Ioana López-Popa. "Fractals, Fuzzy Logic and Expert Systems to Assist in the Construction of Musical Pieces." Expert Systems with Applications 39, no. 15 (2012): 11911–11923.
Macleod, Catriona. Elective Affinities: Testing Word and Image Relationships. Amsterdam: Rodopi, 2009.
Mattheson, Johann. Der vollkommene Capellmeister. Edited by Friederike Ramm. Bärenreiter-Verlag, 1999.
Miranda, Eduardo. Composing Music with Computers. London: Routledge, 2001.
Müller, Meinard, Verena Konz, Michael Clausen, Sebastian Ewert, and Christian Fremerey. "A Multimodal Way of Experiencing and Exploring Music." Interdisciplinary Science Reviews 35, no. 2 (2010): 138–153.
Nierhaus, Gerhard. Algorithmic Composition: Paradigms of Automated Music Generation. Berlin: Springer Science & Business Media, 2009.
Palisca, Claude V., ed. Hucbald, Guido, and John on Music: Three Medieval Treatises. New Haven: Yale University Press, 1977.
Parra, Hèctor. Pour une approche créatrice des interrelations structurelles entre les espaces acoustiques et visuels. Mémoire de DEA, Université Paris 8 Vincennes – Saint-Denis, 2005.
Polo, Antonio, and Xavier Sevillano. "Musical Vision: An Interactive Bio-Inspired Sonification Tool to Convert Images into Music." Journal on Multimodal User Interfaces 13, no. 3 (2019): 231–243.
Ribas, Luísa. "Sound and Image Relations: A History of Convergence and Divergence." Divergence Press 1, no. 2 (2014).
Roland, Perry. "The Music Encoding Initiative (MEI)." Paper presented at MAX2002: Proceedings of the First International Conference on Musical Applications Using XML, 55–59. http://xml.coverpages.org/MAX2002-PRoland.pdf.
Rosenbaum, Jonathan. "Lost Material and Found Footage: Peter Tscherkassky's Dark Room – and Ours." Found Footage 4 (2018). Published also at https://jonathanrosenbaum.net/2021/08/lost-material-and-found-footage-peter-tscherkasskys-dark-room-and-ours-tk/.
Schedel, Margaret. "Colour Is the Keyboard: Case Studies in Transcoding Visual to Sonic." In The Oxford Handbook of Algorithmic Music, edited by Roger T. Dean and Alex McLean. Oxford Handbooks Online. Oxford: Oxford University Press, 2018. DOI: 10.1093/oxfordhb/9780190226992.013.8.
Shenton, Andrew. Olivier Messiaen's System of Signs: Notes Towards Understanding His Music. London: Taylor & Francis, 2017.
Stone, Kurt. "Reviews of Records: Xenakis: Metastaseis (1953–54); Pithoprakta (1955–56); Eonta for Piano, Two Trumpets, and Three Trombones (1963–64)." The Musical Quarterly 54, no. 3 (1968): 387–395. DOI: 10.1093/mq/LIV.3.387.
Studemann, Frederick. "The Sounds of the Financial Crisis." Interview with Julian Anderson. The Financial Times. Podcast audio. December 21, 2010. https://www.ft.com/content/56f5a76f-bca1-4312-b234-0aebbea3eb19.
"What is Max?" Cycling '74. Accessed November 19, 2022. https://cycling74.com/products/max.
Wiggins, Geraint A. "Computer Representation of Music in the Research Environment." In Modern Methods for Musicology: Prospects, Proposals, and Realities, edited by Tim Crawford and Lorna Gibson, 27–42. London: Routledge, 2016.
Wöllner, Clemens. Body, Sound and Space in Music and Beyond: Multimodal Explorations. 1st ed. London: Routledge, 2018.

Videography

"Cinémas de traverse (excerpt): Peter Tscherkassky Interview." YouTube video, 0:40. May 26, 2015. Accessed November 19, 2022. https://www.youtube.com/watch?v=aitaaM-ZmHU.
Dream Work. Index 8: Films From a Dark Room. DVD. Directed by Peter Tscherkassky. Vienna: Index Edition, 2006.
Outer Space. Index 8: Films From a Dark Room. DVD. Directed by Peter Tscherkassky. Vienna: Index Edition, 2006.
"Peter Tscherkassky – Outer Space." Vimeo video, 9:22. January 30, 2019. In Index 8: Films From a Dark Room. DVD. Directed by Peter Tscherkassky. Vienna: Index Edition, 2006. https://vimeo.com/314251447.
The Entity. DVD. Directed by Sidney J. Furie. Los Angeles: Twentieth Century Fox, 1982.

SUMMARY

Transcoding as a Compositional Paradigm: The Association of Compositional Parameters and Computer Analysis of Moving Images in Outer Space

The New Oxford American Dictionary defines "transcoding" as the conversion of language or information from one coded form into another. Transcoding refers to a series of techniques that either draw on extra-musical elements or seek a certain unity within the musical content through shared structural elements of the musical discourse. Whereas in the past transcoding rested on the links between music, the alphabetic code and literature, in the contemporary period composers look for structural connections in other domains. In music written for contemporary experimental films, a transcoding approach makes it possible to associate, through computer analysis, the parameters of musical composition with data extracted from the moving images (data such as the relative percentage of black and white dots per frame, or the difference between the positions of the dots from one frame to the next). The history of musical practice shows that music and transcoding form an important pairing, one that offers a more detailed scholarly view of musical analysis. This paper focuses on a composition by one of the authors, Outer Space, conceived with transcoding techniques that link music and experimental film. The algorithms used allowed us to explore different parameters of the images. Among them, the most frequently used was the luminosity curve, which allowed us to explore the same visual gesture through different prisms: from a vertical point of view, with the doubling of instrumental lines; from a horizontal point of view, with the analysis of the overall structure of the film; and finally, from an intermediate point of view, with the analysis of the exchange of energies between the two materials.
ABOUT THE AUTHORS

JAVIER ELIPE-GIMENO (elipe-gimeno@prism.cnrs.fr) is a composer and the author of a thesis entitled "Composing from Silent Cinema: A Theoretical and Practical Approach." He is regularly in charge of projects combining music and image in collaboration with other artistic and scientific disciplines. He has also conducted research on sound composition and experimental cinema around the topic of "violence and society" and has participated in projects mixing new technologies and composition. His research follows three axes: sound composition and experimental cinema, musical composition and new technologies, and computer-assisted orchestration.

CHARLES DE PAIVA SANTANA (charles@prism.cnrs.fr) holds a PhD in music from the University of Campinas and in computer science from the University Pierre-et-Marie-Curie. His research focuses on the analysis, modelling and computer simulation of compositional strategies of the contemporary repertoire, as well as their impact on perception. He teaches twentieth-century and computer music at Aix Marseille University.