Mixing Modalities: Graphical and Text-Based Interaction in Music Notation Editing
Authors: Nowakowski, Matthias / Berndt, Axel / Plaksin, Anna / Şahin, Nevin / Hadjakos, Aristotelis
Date: Thursday, 7 September 2023, 4:15pm to 5:45pm
Location: Main Campus, L 1
Abstract
Music notation is edited in graphical as well as text-based interfaces. The latter is still common practice in the MEI community, while the former tends to be the more intuitive approach. In the context of the currently developed research and editing tools mei-friend (Goebl & Weigl, 2022) and VIBE (Nowakowski, Hadjakos & Stärk, 2022), we discuss possibilities to combine both worlds by enriching text-based music editors with specialized GUI widgets. The aim is not a WYSIWYG interface for the whole music notation but lightweight, context-sensitive graphical elements that complement the XML encoding.
Music notation is complex. Thanks to its high information density, a skilled reader can readily determine what an element means in its context and how it is to be played. But often there is room for interpretation, arising e.g. from historical distance or from changing purposes and contexts (Plaksin et al., 2022). So how does one map this complex sign system onto machine interactions?
Interaction with music notation usually happens through the mixed modality of mouse and keyboard, the latter of which is highly optimized towards editing textual information in a one-dimensional way. Common music notation programs use these modalities to accomplish their manifold tasks. Formats like MusicXML or MEI can textually represent this information density and complexity, as well as the hierarchy in which an element is placed, as the fragment below illustrates. A graphical method would have the advantage of translating this information density into text more quickly, via a single gesture or interaction with the user interface.
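For illustration, consider a schematic MEI fragment (attribute values chosen for this example, not taken from either tool): even a single note carries pitch, octave, duration, and accidental information, and it sits inside a layer, staff, and measure hierarchy:

<measure n="1">
  <staff n="1">
    <layer n="1">
      <note xml:id="note-1" pname="c" oct="4" dur="4" dots="1" accid="s"/>
    </layer>
  </staff>
</measure>

Editing even this small element by hand means keeping both the attribute vocabulary and the surrounding hierarchy in mind, which is exactly where context-sensitive widgets can help.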
Some examples could be:
- Preview of single MEI elements while typing (see the sketch after this list).
- Interactive navigation from text to musical score and vice versa.
- Templating and recommendations while writing text and music.
- Versioning overviews focused on distinct symbols or XML sequences.
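As a minimal sketch of the first idea, the following TypeScript fragment shows a preview widget that renders the MEI fragment under the text cursor. It assumes the Verovio JavaScript toolkit, which mei-friend also uses for rendering, is loaded as a global script; the wrapFragment helper and all option values are hypothetical choices for illustration, not part of either tool:

// TypeScript sketch of a "preview while typing" widget.
// Assumption: the Verovio toolkit is available as a global script.
declare const verovio: {
  toolkit: new () => {
    setOptions(options: object): void;
    renderData(mei: string, options: object): string; // returns SVG
  };
};

// Hypothetical helper: embed the fragment being edited (e.g. a single
// <note>) into a minimal MEI document so it can be rendered in isolation.
function wrapFragment(fragment: string): string {
  return `<music xmlns="http://www.music-encoding.org/ns/mei">
  <body><mdiv><score>
    <scoreDef><staffGrp>
      <staffDef n="1" lines="5" clef.shape="G" clef.line="2"/>
    </staffGrp></scoreDef>
    <section><measure n="1"><staff n="1"><layer n="1">
      ${fragment}
    </layer></staff></measure></section>
  </score></mdiv></body>
</music>`;
}

// Render the element currently being typed as a small SVG preview.
function renderPreview(fragment: string): string {
  const tk = new verovio.toolkit();
  tk.setOptions({ adjustPageHeight: true, scale: 50 });
  return tk.renderData(wrapFragment(fragment), {});
}

// Usage: preview a dotted quarter c-sharp as the user types it.
const previewSvg = renderPreview('<note pname="c" oct="4" dur="4" dots="1" accid="s"/>');

Such a widget stays lightweight: it does not replace the XML view but reacts to it, in line with the approach outlined above.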
In an effort to get in touch with a diverse audience and understand individual patterns of interaction with graphical modalities, we ask these open questions:
- What is the main interaction medium with music notation (XML, music graphics, LilyPond, …)?
- Which kinds of information and visual aids do different target groups need?
- How can an interface support the understanding of complex sign systems and their relation to the XML encoding?
- How can the interaction with multiple textual layers be supported during the editing of music?
- What visual aids could editors need to edit XML more quickly?
- As users need to understand how XML and music notation correspond: how can an interface support learnability in this respect?
Participants of the conference will have the opportunity to try out both editors and to leave ideas for widgets on the poster, which will then be discussed in the subsequent paper.
Bibliography
Goebl, W. and Weigl, D. M. Alleviating the Last Mile of Encoding: The mei-friend Package for the Atom Text Editor. In Münnich, S. and Rizo, D., editors, Music Encoding Conference Proceedings 2021, pages 31–39, 2022. Humanities Commons. http://doi.org/10.17613/45ag-v044.
Nowakowski, M., Hadjakos, A. and Stärk, A. Toward Interactive Music Notation for Learning Management Systems. International Journal on Innovations in Online Education, 5(3), 2022. Begell House Inc.
Plaksin, A., Lewis, D., Şahin, N. and Berndt, A. Sharing MEI: Common Semantics in Diverse Musics? In Music Encoding Conference 2022, Halifax, Canada: Dalhousie University, Music Encoding Initiative, 2022.
About the authors
Matthias Nowakowski studied Musicology and Philosophy at the University of Cologne (Germany) and Media Informatics at the University of Applied Sciences in Düsseldorf (Germany). After his thesis on automatic transcription of electroacoustic music, he concentrated his academic work on music information retrieval and is currently working at the Center of Music and Film Informatics in Detmold (Germany). He is writing his PhD thesis on interaction and user experience with digital music notations.
Axel Berndt is a computer scientist with a musical background. He currently works as a postdoctoral researcher at the Center of Music and Film Informatics in Detmold, Germany. His research covers various topics in music technology, such as musical human-computer interaction, music performance modeling and analysis, game scoring, and generative techniques.
Anna Plaksin is a postdoctoral research associate at the Institute of Art History and Musicology (IKM) at Johannes Gutenberg University Mainz. She currently works on support for editorial markup in the mei-friend Web application.
Nevin Şahin is an assistant professor of music theories at Hacettepe University Ankara State Conservatory. Focusing on non-Western music traditions that use different notation systems, such as Hampartsum notation and Byzantine neumes, she works to bring MEI and early music research together.
Aristotelis Hadjakos is a professor of music informatics and co-director of the Center of Music and Film Informatics. He conducts research in the area of musical human-machine interaction. His specific research interests are digital scores, digital humanities, and tangible musical interfaces.