Saturday Sept 28 - Lectures and Workshops

IMPROTECH Paris - Αθήνα 2019



Detailed program




Saturday Sept 28 - Lectures

University of Athens, 09:30 - 13:30

Improvisation, Digital Intelligence and Cultural Heritage



Videos

Conference recordings on YouTube

09:30

Keynote talk: From Digital to Human Intelligence in Music Understanding Research

Xavier Serra (Universitat Pompeu Fabra, Sp.)

We are able to develop AI algorithms that solve complex musical tasks, yet we are unable to apply these powerful technologies to help understand and improve our own musical comprehension abilities. Our machines are rapidly becoming capable of “understanding” music, while we still use traditional and time-consuming educational methods for training people in the development of their basic musical skills, or, for that matter, in the development of most cognition-based human capabilities. To make sense of a particular music listening experience, we as listeners identify relevant auditory cues and then piece those cues together into patterns that can be retained long enough for brain mechanisms to examine them and create the impression of auditory objects. Music lovers who appreciate and comprehend a particular musical style are able to verbalize their cognitive experience after listening to a piece in that style. In this talk we propose that, by building on prior research from the fields of Music Cognition, Music Information Retrieval, and Music Education, we should be able to develop tools and perceptual training methodologies that help a naive listener understand and appreciate a musical tradition to which they have had no prior exposure. Given that computers will never be able to comprehend or feel for us, we should do our best to build systems that help us do so ourselves.

[CANCELED] Forms of presence in instrumental and electronic improvisation in relation to cultural contexts

Marc Chemillier (EHESS, Fr.)

10:15

" Jazz Mapping ” : Thematic Development and Story Telling in Jazz Improvisation

Dimitri Vassilakis (University of Athens, Gr.)

“Jazz mapping” is a multi-layer analytical approach to jazz improvisation based on hierarchical segmentation and the categorization of segments, or constituents, according to their function in the overall improvisation. In this way, higher-level semantics of transcribed and recorded jazz solos can be exposed. In this approach, the knowledge of the expert jazz performer is taken into account in all analytical decisions. We apply the method to two well-known solos, by Sonny Rollins and Charlie Parker, and discuss how improvisations resemble storytelling, employing a broad range of structural, expressive, technical and emotional tools usually associated with the production and experience of language and of linguistic meaning. We make explicit the choices of the experienced jazz improviser, who has developed a strong command over the language and unfolds a story in real time, much like prose composed on a given framework. He or she uses various mechanisms to communicate expressive intent, elicit emotional responses, and make the musical “story” memorable and enjoyable to fellow musicians and listeners. We also comment on potential application areas of this work related to music and artificial intelligence.
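To make the idea of hierarchical segmentation concrete, here is a minimal Python sketch (not the authors' actual analysis software) that represents a solo as nested, functionally labeled segments; the segment labels and beat numbers are invented for illustration.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Segment:
        """A span of an improvised solo, in beats, with a functional label."""
        start_beat: float
        end_beat: float
        function: str                      # e.g. "theme statement", "development"
        children: List["Segment"] = field(default_factory=list)

        def flatten(self, depth=0):
            """Yield (depth, segment) pairs for every constituent, top-down."""
            yield depth, self
            for child in self.children:
                yield from child.flatten(depth + 1)

    # Hypothetical top-level analysis of one 32-bar chorus (labels are invented).
    chorus = Segment(0, 128, "chorus 1", children=[
        Segment(0, 32, "theme statement"),
        Segment(32, 96, "development", children=[
            Segment(32, 64, "motivic variation"),
            Segment(64, 96, "rhythmic intensification"),
        ]),
        Segment(96, 128, "closing gesture"),
    ])

    for depth, seg in chorus.flatten():
        print("  " * depth + f"{seg.function}: beats {seg.start_beat}-{seg.end_beat}")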

10:45

Coffee Break

11:15

Metrical Polyrhythms and Polytemporality in Live Improvisation Settings

Sami Amiris (American College, Athens Big Band, Gr.), Antonis Ladopoulos (University of Athens, American College, Gr.)

Artistic explorations of two improvising musicians who dare to dwell on the fringe of polytemporality and metrical polyrhythmicity. The musicians, who form the critically acclaimed Phos Duo, expand their horizons by each playing on a distinct metrical surface - so that each has a different sense of the whole, like observers moving at different speeds in the theory of relativity - while still listening to each other and interacting. To make this possible in practice, a new type of metronomic sequencer - a kind of individual mechanical conductor - is used, capable of handling different timings simultaneously and independently from channel to channel while remaining controllable overall. The result is an exciting and challenging musical environment that the duo finds fruitful for the creation of new musical textures, both through-composed and improvised.
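As a rough illustration of the kind of metronomic sequencer described above (not the duo's actual tool), the following Python sketch merges click events from several channels running at independent tempi into one time-ordered stream driven by a single master clock; the channel names and tempi are invented.

    import heapq

    def polytempo_clicks(channel_bpms, duration_s):
        """Merge click times from channels running at independent tempi.

        channel_bpms: mapping of channel name -> tempo in beats per minute.
        Returns a time-ordered list of (time_in_seconds, channel, beat_index).
        """
        events = []
        for channel, bpm in channel_bpms.items():
            period = 60.0 / bpm
            beat, t = 0, 0.0
            while t <= duration_s:
                heapq.heappush(events, (t, channel, beat))
                beat += 1
                t = beat * period
        return [heapq.heappop(events) for _ in range(len(events))]

    # Two hypothetical channels: one player at 90 BPM, the other at 120 BPM.
    for t, channel, beat in polytempo_clicks({"piano": 90, "sax": 120}, 4.0):
        print(f"{t:5.2f}s  {channel:5s} beat {beat}")

In a live setting, each channel would feed a separate click to one performer, while the shared clock keeps the overall structure controllable.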

11:45

Polyphonic Conversations

Peter Nelson (University of Edinburgh, UK)

“... what’s intriguing about the ... improvisation I do is that it’s more polyphonic than your average conversation ...” (John Oswald, interview, 2009)

In recent years, the technologies used in making music and other time-based arts have changed radically, engaging with communication networks that have settled into every crevice of our social realities. This circumstance changes even the notion of what we consider to be the social, as well as the ways in which we engage with artefacts, processes and institutions. This talk will explore the implications of current technologies for improvisation strategies, and will interrogate the conversation as an expanding and transforming discourse whose agents may come to include more than simply human selves.

12:15

Disposable Music

Georg Hajdu (Hochschule für Musik und Theater Hamburg, DE)

My presentation introduces the concept of real-time composition, and of composition as a dispositif in the sense of Foucault and Deleuze, defining it as a heterogeneous ensemble of pieces which together form an apparatus. The introduction situates the dispositif in the context of cultural developments, most notably the slow but steady shift away from textualization in digital media. As musicians adapt to the ensuing cultural and, above all, economic changes, new musical forms emerge, such as comprovisation or laptop performance, which rely to a lesser degree on fully notated scores. Antithetically, the computer also allows the creation of “author-less” notated scores in real time, to be sight-read by capable musicians - a practice for which special software has been developed in recent years. Since these scores are not meant to be kept and distributed, they are ephemeral and, therefore, disposable. Examples are given to illustrate this interwoven approach, in which carefully selected narratives and dramaturgies make up for the inherent unpredictability of the outcome.


13:30

Lunch Break




Saturday Sept 28 - Workshops

Onassis STEGI, 16:00 - 19:00

Game, mobiles, transducers



16:00 - 17:00

The Dynamic Percussion System: A procedural music engine for video games

Daniel Brown (Intelligent Music Systems, USA)

The Dynamic Percussion System is software, used in commercial video games such as Rise of the Tomb Raider (2015), that composes procedural percussion music adapting to game action in real time.
While the use of procedural music in video games is an exciting development, there are many unanswered questions and issues surrounding it. How does it fit into the traditional workflow of professional composers and sound designers? What choices do such people have when “authoring” procedural music? How does it interact with precomposed game music?
The Dynamic Percussion System comprises both an in-game playback system and an authoring tool to be used by composers and sound designers. The design of the authoring tool (its user interface and functionality) addresses these questions. It offers one model of how the new techniques of generating procedural music can be integrated into the traditional methods used in commercial development. It has also motivated new techniques for generating, implementing, and interpreting game music.
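As a toy illustration of what procedural percussion adapting to game action can mean (this is not the actual Dynamic Percussion System or its authoring format), the following Python sketch fills a one-bar pattern whose density follows a game-supplied intensity value; all names and parameters are invented.

    import random

    def generate_bar(intensity, steps=16, seed=None):
        """Return a one-bar, 16-step percussion pattern whose density tracks
        a game-driven intensity value in [0, 1]. Purely illustrative."""
        rng = random.Random(seed)
        pattern = []
        for step in range(steps):
            on_downbeat = step % 4 == 0
            # Downbeats always sound; other steps fire more often as the
            # game action (intensity) increases.
            hit = on_downbeat or rng.random() < intensity * 0.8
            pattern.append("x" if hit else ".")
        return "".join(pattern)

    # The intensity value could be supplied by the game engine once per bar.
    for intensity in (0.1, 0.5, 0.9):
        print(f"intensity {intensity:.1f}: {generate_bar(intensity, seed=42)}")

In this picture, an authoring tool is what lets a composer decide which musical parameters such a generator exposes to the game and how they respond, rather than writing the mapping in code.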

17:00 - 18:00

Improvisation with Motion Sensors and Live Coding: Combining Dance and Instrumental Improvisation

Ioannis Zannos (Ionian University, Gr.)

This workshop introduces techniques of improvisation with wearable movement sensors combined with live coding. Movement sensors based on IMUs (Inertial Measurement Units) are used to measure the movement of a performer. The motion data is transmitted to a computer over WiFi, and live coding is used to control the generation of sound in SuperCollider and of graphics in the Godot game engine. The workshop shows how to use the sc-hacks library to program and modify the system's response to the sensor data. Techniques for sending the control data to remote locations over the internet are also shown; this enables joint performance from several different remote locations, where at each location the data sent by all performers is used to synthesize the audiovisual performance.
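A minimal sketch of the data flow described above, assuming the widely used python-osc package rather than the sc-hacks library: it receives hypothetical IMU messages over OSC, derives a single control value, and forwards it to a sound engine such as SuperCollider (whose language listens on port 57120 by default); the OSC addresses and the input port are assumptions.

    import math
    from pythonosc import dispatcher, osc_server, udp_client

    # Forward derived control values to the sound engine (address is invented).
    synth = udp_client.SimpleUDPClient("127.0.0.1", 57120)

    def on_accel(address, x, y, z):
        """Map the magnitude of a hypothetical /imu/accel message to one
        control value and pass it on to the synthesis engine."""
        magnitude = math.sqrt(x * x + y * y + z * z)
        synth.send_message("/control/energy", magnitude)

    router = dispatcher.Dispatcher()
    router.map("/imu/accel", on_accel)

    # Listen for sensor data arriving over WiFi on an assumed port.
    server = osc_server.ThreadingOSCUDPServer(("0.0.0.0", 9000), router)
    print("Listening for IMU data on port 9000 ...")
    server.serve_forever()

Forwarding the same messages to peers over the internet, as in the remote-performance scenario above, would amount to adding further SimpleUDPClient instances pointed at the remote locations.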

18:00 - 19:00

Composing “musiques mixtes”: acoustic spaces, improvisation and gestures

Lara Morciano (composer, It.), Jose-Miguel Fernandez (composer, Ircam, Fr.)

The workshop introduces some possibilities of real-time interaction between instruments and electronics, explored through a device that uses the hand movements of the performer to control and synchronize various sound processes. The presentation will mainly focus on Philiris, a composition for piano, motion capture and transducers. The different sections of the piece will be presented from a technical and compositional point of view, with examples of the interaction between the real and the virtual piano linked to motion capture, of sound synthesis and real-time treatments played back through transducers inside the piano, and of the notion of the “double”. The workshop will also show examples of Antescofo, a score-following and programming-language environment, used for the creation of real-time processes and interactions related to the notions of acoustic spaces, improvisation and gestures.