Figure 2 - uploaded by Mark T. Marshall
DJ playing with a slider and force sensor on the turntables

Source publication
Article
Full-text available
This paper examines the creation of augmented musical instruments by a number of musicians. Equipped with a system called the Augmentalist, 10 musicians created new augmented instruments based on their traditional acoustic or electric instruments. This paper discusses the ways in which the musicians augmented their instruments, examines the similariti...

Context in source publication

Context 1
... However, one guitarist decided to detect strumming rhythm using an infrared distance sensor, which was mounted on the body of the guitar, under the strings. This sensor was set up so that when the guitarist strummed the strings their hand would pass over it. It was then configured as an on/off switch which triggered whenever the guitarist's hand passed over it. This switching between on and off then provided a measurement of the strumming rhythm. One other possibility that the guitarists examined for control gestures was the use of head and body movements. Suggestions included the use of head-mounted accelerometers to detect head tilting and the use of accelerometers on the body to detect weight shifting. However, these were found to be too cumbersome and/or restrictive for use when playing. For each of the guitarists, the control gestures just described were mapped to a number of audio effects in Apple's Logic Pro. The choice of gestures, effects and mappings was left to the individual guitarists. Logic Pro was chosen as it is a software package that many of the participants were familiar with and it also offers a large number of possible effects to control. All of the guitarists chose to use effects that they were already somewhat familiar with and that are commonly used by guitarists playing electric guitar. These effects included distortion, delay, chorus, flanger, and master volume. Example mappings included the control of delay using the tilt of the guitar, controlling distortion using the picking position and mapping strumming rhythm to master volume.

Two of the participants in the design process were bassists. The bass guitar is interesting due to its similarity in form factor to the electric guitar, while having a different playing style. This allows us to look at any similarities between the instruments developed by the guitarists and bassists. The sensing of the gestures for the bass was similar to the guitar. In particular, both bassists also used the tilt of the bass guitar as a control parameter, again detected using an accelerometer mounted to the headstock or body of the instrument. One of the bassists also tried to use body movements as a control parameter. As with the guitarists, he attempted to use head tilt (detected with an accelerometer on the head) as a control. While finding this somewhat difficult to control, he also found it extremely enjoyable and kept it as part of his instrument. The effect that the bassists had the most fun with was the wah effect mapped to the accelerometer measuring the tilt of the neck. This is essentially a band-pass filter whose cutoff frequency is set by the sensor. The wah has existed for many years as a foot pedal for guitarists and bassists alike, but transferring this concept to the angle of the neck proved quite difficult for one bassist who had little experience with effects. Instead he ended up playing the bass as normal, with a few slight body movements in time with the music. This created very subtle changes to the ambience of the bass as the wah moved in time with the music. Other effects tested by the bassists included distortion and filter effects. These were often mapped to the tilt of the instrument, allowing a subtle, graduated control of the effect.
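To make the tilt-to-wah mapping concrete, here is a minimal sketch of how an accelerometer reading might be turned into a filter cutoff and a MIDI controller value. This is a hypothetical illustration, not the Augmentalist implementation (which routed sensor data to effects in the host software): the axis convention, the ±45 degree tilt range, the 400-2200 Hz sweep and the 0-127 quantisation are all assumptions.

```python
# Hypothetical sketch of the tilt-to-wah mapping described above.
# Sensor axes, ranges and MIDI details are illustrative assumptions,
# not details taken from the Augmentalist itself.
import math


def tilt_from_accel(ax: float, az: float) -> float:
    """Estimate neck tilt in degrees from two gravity components
    (along the neck and perpendicular to it)."""
    return math.degrees(math.atan2(ax, az))


def tilt_to_cutoff(tilt_deg: float,
                   lo_hz: float = 400.0,
                   hi_hz: float = 2200.0) -> float:
    """Map a -45..+45 degree tilt range onto a wah-style cutoff sweep."""
    t = max(-45.0, min(45.0, tilt_deg))  # clamp to a comfortable range
    norm = (t + 45.0) / 90.0             # normalise to 0.0 .. 1.0
    return lo_hz + norm * (hi_hz - lo_hz)


def cutoff_to_midi_cc(cutoff_hz: float,
                      lo_hz: float = 400.0,
                      hi_hz: float = 2200.0) -> int:
    """Quantise the cutoff to a 0-127 controller value for the host."""
    norm = (cutoff_hz - lo_hz) / (hi_hz - lo_hz)
    return round(norm * 127)


# Example: neck tilted up by 30 degrees sweeps the filter near its top.
print(cutoff_to_midi_cc(tilt_to_cutoff(30.0)))  # -> 106
```

Clamping the tilt range means the full sweep is reachable with a modest neck movement, rather than requiring the instrument to be tipped vertically.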
Two DJs took part in the development sessions. One of the DJs primarily played Techno, while the other was a Drum and Bass DJ. These two styles of music make use of different beat styles, which may be suited to different audio effects. Although a DJ tends to have their hands full much of the time, we found that the DJs preferred to use their hands to control the sensors, rather than finding some unused performance gesture. This meant that they were often simply utilising the properties of the sensor directly, rather than attempting to use the sensor to sense a gesture. As such, the sensors often became extra controls for their mixer. In testing, one DJ who used the system made extensive use of 3 sensors: an accelerometer, a slider and a force sensor. The accelerometer was attached to the performer's hand and used as a tilt sensor. This allowed them to control effects by tilting their hand in 2 axes. This was the only sensor which the DJ used to sense movement, rather than as a direct control. Sliders are extremely common in DJ equipment and are used for volume, turntable speed and many effects. The DJ quickly picked up on the advantage of the slider: it retains its position, and its location next to the pitch control on the turntable allowed for quick and easy adjustments whilst mixing. When a DJ mixes, a large proportion of his time is spent focusing on the pitch control, which is located on the turntable next to the tone arm, and so the DJ was able to quickly switch to this slider to control effects. The force sensor was attached to the opposite turntable in the same place. The DJ activates the force sensor by pressing on it; the force of this pressing is then measured by the sensor. Figure 2 shows the system in use. The second DJ made use entirely of sliders, using them as additional effect controls on top of their turntable decks.

DJs can make use of a large number of effects during a performance, switching effects during a track or when changing tracks. As one of our DJs used only sliders as additional effect controls, that mapping is not particularly interesting. As such, we will concentrate on the other DJ, who made use of a number of sensors and effects. This DJ tried a large number of effects with each sensor before settling on several options for each. The result is that for each sensor the DJ has a number of effects, which can be controlled one at a time or even several at once. For the accelerometer, the effects chosen were a bandpass filter effect and a beat repeating effect called Beatmasher. While the bandpass filter resulted in controlled, predictable effects, controlling the Beatmasher produced interesting but more random results. Interestingly, the DJs found that the Beatmasher effect was more useful when used on Techno music than Drum and Bass. For the slider, the DJ chose the distortion drive level and the Transpose Stretch effect, which pitch shifts and time stretches the audio. Finally, for the force sensor the DJ chose to control a number of effect mix levels, including reverb mix, delay mix, flanger mix and phaser mix. The nature of the force sensor, whose output returns to zero as soon as the performer stops pressing on it, allowed the DJ to add an effect by pressing the sensor, increase it by applying more pressure, and then instantly stop it by releasing the sensor.
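The force sensor's return-to-zero behaviour makes for a very direct mapping. Below is a minimal sketch of the idea, assuming a raw 0-1000 sensor range and arbitrary controller numbers for the mix levels; none of these specifics come from the paper, which routed its sensor data through the Augmentalist rather than code like this.

```python
# Hypothetical sketch: pressure maps directly to effect mix levels,
# and the reading (hence the effect) snaps back to zero on release.
FSR_MAX = 1000          # assumed raw full-scale reading of the force sensor
EFFECT_MIX_CCS = {      # hypothetical controller assignments for mix levels
    "reverb": 91,
    "delay": 94,
}


def fsr_to_mix(raw: int) -> int:
    """Scale a raw force reading to a 0-127 mix level.
    No pressure (raw == 0) means a fully dry signal."""
    raw = max(0, min(FSR_MAX, raw))
    return round(raw / FSR_MAX * 127)


def on_fsr_reading(raw: int, send_cc) -> None:
    """Forward one force reading to every effect mix assigned to the sensor."""
    level = fsr_to_mix(raw)
    for name, cc in EFFECT_MIX_CCS.items():
        send_cc(cc, level)


# Example: print instead of sending real MIDI.
on_fsr_reading(750, lambda cc, val: print(f"CC{cc} -> {val}"))
# CC91 -> 95, CC94 -> 95; a reading of 0 after release sends 0 to both.
```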
A single vocalist also took part in the design process for the Augmentalist. Vocalists offer an interesting contrast to the other performers, as their hands are generally free when performing (except when holding a microphone or mic stand). This allows them to use a great range of finely controlled mappings utilising the dexterity of the hands, as well as other parts of the body. Note that in the case of the vocalist, the instrument could be considered to be their own body, or the microphone and/or mic stand. The vocalist (an MC who 'rapped' rather than sang) made use of hand gestures to control effects. This included sensing the tilt of his hand in two dimensions, accomplished through the use of an accelerometer strapped to the back of his hand. As with the other participants, the vocalist also considered the use of head movements, again sensing head tilt using an accelerometer. However, these movements were found to be too disconcerting to use in performance. The vocalist also examined the augmentation of the microphone. Gestures used included the sliding of the hand along the microphone (measured using a slider attached to the microphone body) and grip pressure on the microphone, detected using an FSR attached to the microphone body. Most MCs hold the microphone when performing rather than using a stand, so putting controls on the microphone itself proved intuitive. Furthermore, by mapping the tightness of the grip on the microphone to an effect mix, the mapping was a natural extension of emotive performance, as with the accelerometer on the guitar. This particular vocalist was not as well versed in the various effects and their parameters as, for example, the guitarist. This meant that effects were often discovered by accident as more experimentation took place, rather than by attempting to achieve a specific sound. Something that the vocalist was keen to try straight away was a pitch shifter. This effect simply changes the pitch of the input by an amount specified on a discrete bidirectional scale. After trying with the accelerometer and struggling to maintain a steady hand (i.e. keep the pitch normal), he requested that we limit the MIDI output at half so that he could keep the sensor at 0 more easily. After this he found it very intuitive to map a drop in pitch to the downwards movement of his hand and keep the pitch at 0 with his hand up (see the sketch after this excerpt). The pressure sensor, with its 'return to zero' style of operation, worked really well with effects that made the sound messy, as the sound would return to normal when the sensor was released. Delay mix and reverb mix, as well as flanger intensity, worked well to accent, and in some cases quite dramatically twist, the sound before snapping back to a dry signal when released.

The final participant in the development process for the Augmentalist was a saxophonist. In general, we would not expect saxophonists to be overly familiar with the use of electronic effects and audio processing. However, the interview stage revealed that this saxophonist was also a keen producer of electronic music and as such was very well versed and comfortable with standard studio-style controls. As will be seen, this may have influenced his choice of sensing and control gestures. In a similar way to the DJs that ...
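Returning to the vocalist's capped pitch-shift mapping mentioned above: a minimal sketch, assuming a hand-mounted accelerometer reporting tilt in degrees and a 0-127 MIDI-style output with 64 as "no shift". The tilt range and scaling are illustrative assumptions rather than details from the paper.

```python
# Hypothetical sketch of the vocalist's pitch-shift mapping: the output
# is capped at its midpoint so that a level (or raised) hand rests at
# "no shift", and only a downward tilt bends the pitch down.
def hand_tilt_to_pitch_cc(tilt_deg: float) -> int:
    """Map hand tilt to a pitch-shift value capped at the 64 midpoint.

    tilt_deg: 0 = hand level, negative = tilted downwards.
    Returns 64 (no shift) for any level or upward tilt, scaling
    towards 0 (maximum downward shift) as the hand drops to -90.
    """
    if tilt_deg >= 0.0:
        return 64                        # capped: hand up keeps pitch at 0
    t = max(-90.0, tilt_deg)
    return round(64 + t / 90.0 * 64)     # -90 degrees -> 0


print(hand_tilt_to_pitch_cc(10))    # 64: no shift, easy to hold steady
print(hand_tilt_to_pitch_cc(-45))   # 32: pitch bent halfway down
```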

Similar publications

Article
Full-text available
While the Ondes Martenot is one of the oldest electronic musical instruments, it is still played today. It is known for making extremely expressive electronic sounds. A major advantage of the Ondes Martenot is that it enables players to control the temporal dynamics of notes in a particularly sensitive manner by pressing an "intensity key". In this...

Citations

... The hacking of the horn, which resulted in an "augmented" horn, using terminology typical of the NIME (New Interfaces for Musical Expression) community, corresponds with the compositional material itself. 14 Lucier explained that in their pieces "there were no scores to follow; the scores were inherent in the circuitry." 15 The score was embedded in the instrument, which was the piece itself. ...
... The proposition that technology is a non-neutral mediator of human cognition has deep roots in philosophy (Ihde 1990;Verbeek 2005), and a practical mechanism for this influence can be seen by the extent to which musical instrument creation is not an abstract goal-directed exercise but rather one that explores and responds to the opportunities and constraints of the tools (Magnusson 2010;Newton and Marshall 2011;Lepri and McPherson 2019). ...
Article
Full-text available
It is widely accepted that acoustic and digital musical instruments shape the cognitive processes of the performer on both embodied and conceptual levels, ultimately influencing the structure and aesthetics of the resulting performance. In this article we examine the ways in which computer music languages might similarly influence the aesthetic decisions of the digital music practitioner, even when those languages are designed for generality and theoretically capable of implementing any sound-producing process. We examine the basis for querying the non-neutrality of tools with a particular focus on the concept of idiomaticity: patterns of instruments or languages which are particularly easy or natural to execute in comparison to others. We then present correspondence with the developers of several major music programming languages and a survey of digital musical instrument creators examining the relationship between idiomatic patterns of the language and the characteristics of the resulting instruments and pieces. In an open-ended creative domain, asserting causal relationships is difficult and potentially inappropriate, but we find a complex interplay between language, instrument, piece and performance that suggests that the creator of the music programming language should be considered one party to a creative conversation that occurs each time a new instrument is designed.
... These particular design values led the authors to mount non-invasive (temporary) controls on the body of the guitar to allow the performer to control the DSP within hand's reach and without having to reach out to a laptop (ibid). Likewise, Newton and Marshall observe that whilst AMIs enable new kinds of musical expression, at the same time they impose an additional cognitive load on the performer when interacting with the extra controls [26]. With this issue in mind, the authors' approach to AMI design (ibid) involved providing musicians with a kit for rapid instrument augmentation, on the assumption that they would know best how to augment their own instrument with new controls that fitted their personal performance practices. ...
... For example a graphical user interface (GUI) can map its inputs to transport controls and locators on Ableton Live via MIDI messages or OSC using TouchOSC (Figure 3). This setup allows us to quickly prototype in-instrument media controls by simply taping a mobile phone to the body of the guitar (similar to the non-invasive sensor placement in [14,26]). As shown in Figure 3, the prototype controls on the GUI are directly inspired by our findings, and allow musicians to navigate an audio track by jumping directly to a section, as well as play, pause, place anchor points in the track and navigate them (when multiple anchor points are placed) by using the touch screen. ...
... A supportive AMI should seek to intervene in the instrument only insofar as it supports the interaction with the instrument to achieve a particular goal. Previously cited examples of supportive augmentation include those that seek to overcome different kinds of physical and cognitive impediments caused by lack of experience with the instrument [31,36], disabilities [9,12], or by the other augmentations made to the instrument [14,26]. ...
Conference Paper
Full-text available
A substantial number of Digital Musical Instruments (DMIs) are built upon existing musical instruments by digitally and physically intervening in their design and functionality to augment their sonic and expressive capabilities. These are commonly known as Augmented Musical Instruments (AMIs). In this paper we survey different degrees of invasiveness and transformation within augmentations made to musical instruments across research and commercial settings. We also observe a common design rationale among various AMI projects, where augmentations are intended to support the performer's interaction and expression with the instrument. Consequently, we put forward a series of minimally-invasive supportive Guitar-based AMI designs that emerge from observational studies with a community of practicing musicians preparing to perform, which reveal different types of physical encumbrances that arise from the introduction of additional resources beyond their instrument. We then reflect on such designs and discuss how both academic and commercially-developed DMI technologies may be employed to facilitate the design of supportive AMIs.
... Here is an augmented saxophone [10]. The practice is common enough that there are metapapers on the topic [27]. ...
Conference Paper
Full-text available
In this paper we examine how the term 'Audio Augmented Reality' (AAR) is used in the literature, and how the concept is used in practice. In particular, AAR seems to refer to a variety of closely related concepts. In order to gain a deeper understanding of disparate work surrounding AAR, we present a taxonomy of these concepts and highlight both canonical examples in each category, as well as edge cases that help define the category boundaries.
... The second possible solution is to use another modality to interact, leaving the hands free to play. For example, gesturing with the instrument [38], gesturing with the arms, head or body [22,28], or explicit direction of gaze [39]. Alternatively, the musician could use their voice to interact, e.g. with spoken commands [17], or even harness the notes and phrases that they are playing [16]. ...
Conference Paper
Full-text available
Guitars are physical instruments that require skillful two-handed use. Their use is also supported by diverse digital and physical resources, such as videos and chord charts. To understand the challenges of interacting with supporting resources at the same time as playing we conducted an ethnographic study of the preparation activities of working musicians. We observe successive stages of individual and collaborative preparation, in which working musicians engage with a diverse range of digital and physical resources to support their preparation. Interaction with this complex ecology of digital and physical resources is finely interwoven into their embodied musical practices, which are usually encumbered by having their instrument in hand, and often by playing. We identify challenges for augmenting guitars within the rehearsal process by supporting interaction that is encumbered, contextual and connected, and suggest a range of possible responses. (Open Access: https://dl.acm.org/citation.cfm?id=3300706)
... Kirsh (2013), citing Gibson (1966), writes: "the by now classic position of embodied cognition is that the more actions you can perform the more affordances you register (e.g., if you can juggle you can see an object as affording juggling)." The importance of discovery has been noted in several tangible and reconfigurable musical interfaces (Newton and Marshall, 2011;Xambó et al., 2013;Zamborlin, 2015). ...
Article
Full-text available
This paper investigates the appropriation of digital musical instruments, wherein the performer develops a personal working relationship with an instrument that may differ from the designer's intent. Two studies are presented which explore different facets of appropriation. First, a highly restrictive instrument was designed to assess the effects of constraint on unexpected creative use. Second, a digital instrument was created which initially shared several constraints and interaction modalities with the first instrument, but which could be rewired by the performer to discover sounds not directly anticipated by the designers. Each instrument was studied with 10 musicians working individually to prepare public performances on the instrument. The results suggest that constrained musical interactions can promote the discovery of unusual and idiosyncratic playing techniques, and that tighter constraints may paradoxically lead to a richer performer experience. The diversity of ways in which the rewirable instrument was modified and used indicates that its design is open to interpretation by the performer, who may discover interaction modalities that were not anticipated by the designers.
... Several academic projects focused on extending the capabilities of conventional instruments, building on the expertise of trained performers [36]. We believe that co-design practices (e.g., [9]), learning all the lessons from AMI creation and evaluation (e.g., [30,39]), and taking into account the self-evaluations deriving from AMIs' autobiographical designs (e.g., [49]) will be the key for developing future SMIs that can fully exploit the potentialities offered by their technology, while at the same time reducing the impact of the issues inherent to the learning and usage of novel musical systems. ...
Conference Paper
Full-text available
Augmented musical instruments (AMIs) consist of the augmentation of conventional instruments by means of sensor or actuator technologies. Smart musical instruments (SMIs) are instruments embedding not only sensor and actuator technology, but also wireless connectivity, on-board processing, and possibly systems delivering electronically produced sounds, haptic stimuli, and visuals. This paper attempts to disambiguate the concept of SMIs from that of AMIs on the basis of existing instances of the two families. We counterpose the features of these two families of musical instruments, the processes to build them (i.e., augmentation and smartification), and the respective supported practices. From the analysis it emerges that SMIs are not a subcategory of AMIs, rather they share some of their features. It is suggested that smartification is a process that encompasses augmentation, as well as that the artistic and pedagogical practices supported by SMIs may extend those offered by AMIs. These comparisons suggest that SMIs have the potential to bring more benefits to musicians and composers than AMIs, but also that they may be much more difficult to create in terms of resources and competences to be involved. Shedding light on these differences is useful to avoid confusing the two families and the respective terms, as well as for organological classifications.
... Newton and Marshall's Augmentalist is not an AMI per se, but a platform for augmenting electric guitars using sensors (Phidgets) and microcontrollers [9]. The authors proposed to analyze how performers used the resources made available by the Augmentalist to create customized AMIs for each performance. ...
... From 2014, three GuitarAMI prototypes were built, which had been tested in performances and music education activities [21]. The second GuitarAMI prototype was also used in performance for the B.E.A.T. trio (drums, trumpet and guitar). ...
Conference Paper
Full-text available
This paper describes two augmented nylon-string guitar projects developed in different institutions. GuitarAMI uses sensors to modify the classical guitar's constraints while GuiaRT uses digital signal processing to create virtual guitarists that interact with the performer in real-time. After a bibliographic review of Augmented Musical Instruments (AMIs) based on guitars, we present the details of the two projects and compare them using an adapted dimensional space representation. Highlighting the complementarity and cross-influences between the projects, we propose avenues for future collaborative work.
... We also wanted to take advantage of the refined acoustic properties of the instrument; that is, we were interested in how the resonances of the trombone would shape the feedback sound. We were not interested in "augmenting" the trombone to allow for feedback sound in addition to normal playing technique (as in hyper-instruments or other augmented instrument designs [10]), but rather in harnessing the trombone as the starting platform for a different sound production technique. We envisioned a system in which (1) the trombone provided a unique and acoustically refined resonating body, (2) the Larsen effect provided oscillation to resonate the body, and (3) analog and digital hardware allowed the user to process the feedback in order to gain control of this resonance. ...
Conference Paper
Full-text available
The New Instrument Research Laboratory at Princeton University presents its research on control of audio feedback in brass instruments through the development of a new electroacoustic instrument, the Feedback Trombone. The Feedback Trombone (FBT) extends the traditional acoustic trombone with a speaker, microphone, and custom analog and digital hardware. Signal from a microphone on the bell of the trombone is processed by a microcontroller-based DSP system and fed back into the trombone through a speaker driver, inducing oscillation and resonating the body of the instrument. We gained control of the timbre and pitch of the resulting sound by (1) applying filtering and compression in the processing stage, and (2) controlling microphone and speaker placement (see Figure 1). In this paper, we describe the development of the Feedback Trombone and discuss what we learned about controlling the timbre and pitch of the instrument.
... Toolkits for creating DMIs have become increasingly common [18,21,19,2,23,13,5], with different projects aimed at a variety of musical contexts and technical skill levels. ...
Conference Paper
Full-text available
This position paper discusses toolkits for creating digital musical instruments. Musical interaction imposes stringent technical requirements on interactive systems, including high spatial and temporal precision and low latency. Social and community factors also play an important role in musical interface toolkits, including design sharing and the ability of performers and composers to count on the longevity of an instrument. This paper presents three examples of musical interface toolkits, including our own Bela, an open-source embedded platform for ultra-low-latency audio and sensor processing. The paper also discusses how the requirements of specialist musical interface toolkits relate to more general HCI toolkits.