

Here is an experiment in mappings, using the mouse as an example sensor. I am trying out some of the suggested mappings from 'Mapping performer parameters to synthesis engines' by Andy Hunt and Marcelo M. Wanderley.
In this example Max patch I am using the speed of mouse movement (rate of change), with the mouse button pressed, to control volume. I have taken the rate of change of both the vertical and horizontal mouse data and added them together (I suspect this is not the most appropriate method, but it seems to work well enough). Adding them means that both horizontal and vertical movements contribute to an increase in volume, removing the need to be exact in any one direction. This data is then fed into an [if] conditional statement, which ensures that the data passes through while the mouse button is pressed and won't when it is not.
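The control side of the patch can be sketched outside Max as plain arithmetic. This is a rough illustration only, with hypothetical function and parameter names (the actual patch uses Max objects, not Python); it shows the summed rates of change and the button gate described above.

```python
def mouse_speed_to_volume(prev_pos, curr_pos, button_pressed, scale=0.01):
    """Map mouse speed to a 0..1 volume, gated on the mouse button.

    prev_pos / curr_pos are (x, y) pixel positions from successive polls.
    `scale` is a hypothetical tuning constant, not a value from the patch.
    """
    # Rate of change on each axis, summed so that movement in any
    # direction raises the volume (no need to be exact with one axis).
    dx = abs(curr_pos[0] - prev_pos[0])
    dy = abs(curr_pos[1] - prev_pos[1])
    speed = dx + dy

    # The [if] gate: data only passes while the button is pressed.
    if not button_pressed:
        return 0.0
    return min(1.0, speed * scale)
```

For example, a fast diagonal drag with the button down gives a high volume, while the same movement with the button up gives silence.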
On the signal end of the patch I am using a binaural-type construct in which two [cycle~] objects are combined, using these expressions for either channel:
Channel 1 = x + (y/2)
Channel 2 = x - (y/2)
I have set the parameters so that, with the mouse button not pressed, the horizontal movement of the mouse controls the y value and the vertical movement controls the x value. This is a fairly simple mapping, but the use of movement, along with the matching of the two cycles through these expressions, results in interesting sounds and control.
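The two channel expressions above can be written out as a small helper, assuming x and y are frequency values in Hz (the function name and the example numbers are mine, not from the patch):

```python
def channel_frequencies(x, y):
    """Frequencies for the two [cycle~] oscillators.

    Channel 1 = x + (y/2)
    Channel 2 = x - (y/2)
    """
    return (x + y / 2.0, x - y / 2.0)

# e.g. x = 440 Hz, y = 4 Hz gives channels at 442 Hz and 438 Hz
left, right = channel_frequencies(440, 4)
```

One consequence of this pairing is that the two channels always differ by exactly y Hz, so x sets the centre pitch while y sets the detune between the ears, which is where the binaural-style beating character comes from.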











