LooMMapping

This page is mainly for my own reference, but may be of use to others, depending on how it progresses.

Past Approaches

Scaling all raw HID data between the highest and lowest values received from each sensor.
A good approach: it accounts for changes of lighting on the LDR sensors. Integrated into the first Harp.sc, and now into Harp2.sc.
(Harp.sc is old and used HIDDeviceService; Harp2.sc is new and uses GeneralHID.)
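The scaling itself is just a running min/max normalisation. A minimal sclang sketch of the idea (variable names are mine, not Harp2.sc's):

```supercollider
// Normalise a raw HID value to 0..1 between the lowest and highest
// values seen so far. The range keeps adapting as new extremes
// arrive, which is what compensates for lighting drift on the LDRs.
// (Sketch only; Harp2.sc keeps its own per-sensor state.)
~lo = inf;
~hi = -inf;
~scale = { |raw|
    ~lo = min(~lo, raw);
    ~hi = max(~hi, raw);
    if(~hi > ~lo) { (raw - ~lo) / (~hi - ~lo) } { 0 };
};
```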

String 4 is unreliable: sometimes it works nicely, other times it outputs random rubbish.

Sending all continuous sensors out on K-busses 0-12
Good for testing and quick patch-building (essentially the same as using MouseX/Y.kr), but unsatisfying to use for longer periods. I attempted to build synths which acted on audio feedback, aiming to make the parameters interact as feedback does in an audio mixer; this was not very successful, because of filter blow-up and an overly simplistic implementation.
The method .kbusses can be used with Harp2.sc.
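Reading the busses in a throwaway test patch then looks much like a MouseX.kr patch would. A sketch, assuming the sensors are already scaled to 0..1 and using bus numbers from the 0-12 layout above:

```supercollider
// Quick test synth: the sensor on control bus 0 drives pitch,
// the sensor on control bus 1 drives amplitude.
{
    var freq = In.kr(0).linexp(0, 1, 80, 800);
    var amp  = In.kr(1).lag(0.05);   // smooth out stepping
    SinOsc.ar(freq) * amp;
}.play;
```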

Detecting quick changes
This is a good method for the strings, specifically for detecting plucks. A string's output is always the same at rest, so for each string you can set a test: e.g. when the string's output drops from above 0.2 to below 0.2 in one step, you know the string has been plucked. The last value before the pluck shows how hard it was plucked, i.e. how far it was drawn back. This method is not good for the LDRs, as it is very difficult to predict how they react in different lighting, with different clothing, and so on.
String-pluck-actions are now in Harp2.sc.
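A sketch of the pluck test for a single string (the threshold is illustrative; Harp2.sc keeps its own per-string state):

```supercollider
// Pluck detection: a pluck is a one-step drop from above the
// threshold to below it. The value held just before the drop
// gives the pluck strength (how far the string was drawn back).
~thresh = 0.2;
~last = 0;
~checkPluck = { |val|
    if((~last > ~thresh) and: { val < ~thresh }) {
        ("plucked, strength: " ++ ~last).postln;
    };
    ~last = val;
};
```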

Finger Latches
The idea was that I wanted to focus on the strings as gestural sensors, and to use the finger-flute-holes on the palm-plate to set meta-parameters. I decided to do this by detecting pauses in their output, followed by a quick pull-off of the finger. The gesture would then be a quick hover at a certain height over the hole, then quickly lifting away, the output value staying at the resting level. This was again hampered by the LDRs' unpredictability, and I am also starting to think that it was maybe the wrong way around, seeing as the holes can capture much more nimble movements than the strings.

Mapping Matrices
MappingMatrix.sc was my attempt to understand the MNM project for MaxMSP. I do not really understand it, because I can't read the complex mathematical notation with which the project's paper is primarily concerned. My class makes a 3D matrix, e.g. 10*10*10, which can be imagined as a 3D grid of little cubes. One of these 1000 cubes is set to be the high value, and every other cube holds a value that falls closer to zero the further it is from this high-value cube. Each of your sensors gets a slice of the big cube, i.e. a 2D function-graph, so that you can read from the graph using, e.g., the sensor's current output and its average rate-of-change as X and Y values, and take the Z value from the function. Because the sensors all read from the same cube, they share characteristics in the way they are mapped. The main idea is that the central high-point in the cube can be moved around in 3D by some master controls, thereby changing the way all of the sensors behave relative to each other. This approach is analogous to timbre-space and ideas of spectromorphology, in that the behaviour of the parameters can be morphed, and they in turn control the timbre and other characteristics.
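The cube-filling itself can be sketched quite compactly; the fall-off function and names below are illustrative only, and MappingMatrix.sc differs in detail:

```supercollider
// Build the n*n*n cube: each cell's value falls off with its
// distance from the current high-value cell at ~peak. Moving
// ~peak and refilling re-shapes every sensor's mapping at once.
~n = 10;
~peak = [5, 5, 5];
~fill = {
    Array.fill3D(~n, ~n, ~n, { |x, y, z|
        var d = ([x, y, z] - ~peak).squared.sum.sqrt;
        1 / (1 + d);   // 1 at the peak, towards 0 with distance
    });
};
~cube = ~fill.value;
// A sensor then reads from its 2D slice, e.g. current output
// as x, average rate-of-change as y, on the slice z = 3:
// ~cube[x][y][3]
```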

This is an important area of data-collecting, and it also highlights one important aspect of the LooM sensors: when a sensor is resting at a particular value you get no output. This means that to keep track of sensor activity with respect to time, you need to store the latest value and poll it at a fixed rate. The rests in output can be quite handy for the string-pluck detection already mentioned, though.
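Keeping a time-based view of an on-change sensor then comes down to storing the last value in the device's action function and polling it from a routine. A sketch, with an illustrative 20 Hz poll rate:

```supercollider
// The HID action only fires on change, so stash the latest value
// and poll it at a fixed rate to get a regular stream, from which
// rate-of-change etc. can be derived.
~latest = 0;   // updated from the device's action function
~poller = Routine {
    loop {
        ~latest.postln;   // or push to a bus, a history array, ...
        0.05.wait;
    }
}.play;
```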

Harp2.sc does not differentiate between individual button-presses and held-down patterns of buttons. Or rather, it does: you get an array of buttons as output, but it is clever enough not to decide which combination you held down together until you start releasing buttons, and single button-presses work the same way, so everything happens on button release. Triggers rather than switches; there are enough combinations to make switches needless anyway. POSITIVE THINKING!! The patterns open up a larger range of possible settings and actions, whereas the quick presses can be used for more frequent things.
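The release-time decision can be sketched as follows (button numbering and state handling are illustrative, not Harp2.sc's actual code):

```supercollider
// Accumulate held buttons; the combination is only decided at the
// moment the first of them is released, so single presses and
// multi-button patterns go through the same path.
~held = Set.new;
~button = { |num, pressed|
    if(pressed == 1) {
        ~held.add(num);
    } {
        if(~held.notEmpty) {
            ("pattern: " ++ ~held.asArray.sort).postln;
            ~held = Set.new;   // remaining releases are ignored
        };
    };
};
```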

Current Idea  


Plucking a string makes a new synth-node on the server, the plucked sound can either die away, as a percussive event, or if the string is quickly grabbed up again (and visibly manipulated) then control of the sound passes to that string and the sound carries on.

The five strings are not all the same: the one nearest the frame is easiest to manipulate, so the five strings will control, in order, the most- to least-prominent sounds. E.g. the least prominent (or least gestural) sound might be a reverb, or some mad delay-mash hailstorm device.

We need the option to move synths around in the order-of-execution, to get feedback or to avoid one synth's output being processed by another.
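SC's node-ordering methods cover this directly. A sketch, assuming two SynthDefs (\source and \processor) already exist on the server:

```supercollider
// Reorder two running synths: putting ~fx after ~src lets ~fx
// process ~src's output within the same block; moving it before
// ~src introduces a one-block delay, the usual route to feedback.
~src = Synth(\source);      // assumed SynthDef
~fx  = Synth(\processor);   // assumed SynthDef
~fx.moveAfter(~src);        // ~fx now reads ~src's output directly
~fx.moveBefore(~src);       // ...and now reads it a block late
```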

Each synth should be able to move around the prominence-pecking-order. This is based on the crude idea that no-one will concentrate on more than two streams of gesture, so less prominent synths can move more slowly.

More than one synth can be assigned to a string, but we need to be able to get rid of them again: quickly if need be, or more gradually.
