Abstract Plain (Frank Black)

On January 29, 2014, in Uncategorized, by enemyin1

Reckon you’ve earned yourselves a musical interlude just for following that epic pluralism thread. Enjoy.

 


Invisible Clock: semi-algorithmic improvisation

On January 13, 2014, in Uncategorized, by enemyin1

Haven’t done this in a long while. I fired up my antique version of MAX MSP and Reaper and used my Java-based probabilistic sequencer jDelta (code here) to belt out this short improvisation. The sound is a marimba-like tuned percussion patch designed on the Native Instruments FM8 synthesizer. jDelta lets you take a short seed sequence (here a repeated cluster chord) and graphically set, in real time via multislider objects, the probability that individual notes in the sequence will play, be transposed, or undergo velocity or tempo changes. I then improvised a few ornaments over the algorithmic variations induced on the seed phrase.

 

 

Relevance to posthuman performance practice: jDelta is just a sequencer that allows a certain global control over event probabilities. It stores played note values in arrays, then decides whether to output an event derived from those values by imposing conditions on a random number. A smarter program might (for example) use Bayesian statistics or neural networks rather than raw random numbers to fix the probability of an event relative to a given musical context (a little beyond my programming ability at the moment). While the program is not remotely smart, it mediates performance by allowing one to conceive the distribution of events in a graphical way, delegating how the events actually fall to the machine.
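
To make the mechanism concrete, here is a minimal, self-contained sketch of that basic move (this is not the jDelta code itself, which is posted further down the page; the pitches and probabilities are illustrative only): each step of a seed sequence is gated, and optionally transposed, by testing a single random draw against per-step probabilities.

import java.util.Random;

// Sketch of probability-gated playback in the spirit of jDelta: a seed sequence of
// MIDI pitches, each step gated and optionally transposed by comparing a random
// draw against per-step probabilities (0.0 - 1.0). All values here are illustrative.
public class ProbabilisticSeed {
    public static void main(String[] args) {
        int[] seed       = {60, 63, 66, 70};        // a repeated cluster chord
        double[] notePD  = {1.0, 0.8, 0.5, 0.3};    // chance that each step sounds
        double[] transPD = {0.0, 0.2, 0.4, 0.1};    // chance that each step is transposed
        int[] trans      = {0, 12, -12, 7};         // transposition amounts in semitones
        Random rng = new Random();

        for (int pass = 0; pass < 4; pass++) {      // four variations on the seed
            for (int i = 0; i < seed.length; i++) {
                double r = rng.nextDouble();
                int pitch = seed[i];
                if (r < transPD[i]) pitch += trans[i];   // conditional transposition
                if (r < notePD[i]) {
                    System.out.println("play " + pitch); // stand-in for a MIDI note-on
                }                                        // otherwise the step falls silent
            }
            System.out.println("---");
        }
    }
}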


Xenakis and the Missing Structure

On April 20, 2013, in Uncategorized, by enemyin1

Loop

[A slightly edited extract, with audio, from my paper "Nature's Dark Domain: an Argument for a Naturalised Phenomenology", Royal Institute of Philosophy Supplement 72 (July 2013): 169-188.]

Most listeners will readily distinguish an eight-second sequence from Xenakis’ pioneering ‘granular’ composition Concret Ph.

ConcSequence

and a loop that repeats the first one-second slice of it for eight seconds.

ConLoop

This is discernible because of the obvious repetition in pitch and dynamics.
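
If you want to reconstruct the looped stimulus yourself, here is a minimal Java sketch: it takes the first second of a PCM WAV export of the excerpt and tiles it eight times. The file names, and the assumption of a WAV source, are purely illustrative.

import javax.sound.sampled.*;
import java.io.*;

// Sketch: take the first one-second slice of a PCM WAV file and repeat it eight times.
// "concret_ph.wav" is a hypothetical export of the excerpt, not a file supplied here.
public class LoopSlice {
    public static void main(String[] args) throws Exception {
        AudioInputStream in = AudioSystem.getAudioInputStream(new File("concret_ph.wav"));
        AudioFormat fmt = in.getFormat();
        int frameSize = fmt.getFrameSize();
        int framesPerSecond = (int) fmt.getFrameRate();

        // Read exactly one second of audio.
        byte[] slice = new byte[framesPerSecond * frameSize];
        int read = 0;
        while (read < slice.length) {
            int n = in.read(slice, read, slice.length - read);
            if (n < 0) break;
            read += n;
        }
        in.close();

        // Tile the one-second slice eight times.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (int i = 0; i < 8; i++) {
            out.write(slice, 0, read);
        }
        byte[] looped = out.toByteArray();

        AudioInputStream loopStream = new AudioInputStream(
                new ByteArrayInputStream(looped), fmt, looped.length / frameSize);
        AudioSystem.write(loopStream, AudioFileFormat.Type.WAVE, new File("concret_ph_loop.wav"));
    }
}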

Telling the looped sequence from the non-looped sequence is not the same as acquiring subjective identity conditions that would allow us to recognise the extra structure distinguishing the non-looped from the looped sequence in a different context (e.g. within the entirety of Concret Ph). What is discerned here is arguably a fact about the shortfall between type-identifiable phenomenology and non-type-identifiable phenomenology (“unintuitable” or “dark” phenomenology).

As an illustration of this, the mere awareness that there is missing structure in the loop does not help settle the issue between virtualist and occurrentist construals of that structure. It is plausible to suppose that the perceptual awareness of the missing structure in the Xenakis loop consists of virtual contents – a representation of tendencies in the developing sound rather than something like a constantly updated encoding of discrete sonic facts [1]. Indeed, the virtual model would be consistent with the widely held assumption that our representation of temporal structure is accomplished via recurrent neural architecture that modulates each current input by feeding back earlier input.[2] But whether the contents of representations of temporal structure are virtual or occurrent in nature has no direct bearing on their conceptual or intuitive accessibility.

 


[1]Tim Van Gelder, ‘Wooden Iron? Husserlian Phenomenology Meets Cognitive Science’, Electronic Journal of Analytic Philosophy, 4, 1996.

[2]Op. cit.


The Posthuman: Differences, Embodiments, Performativity
Call For Papers
September 11th – 14th 2013, University of Roma 3, Rome, Italy

The University of Roma 3, the University Erlangen-Nürnberg,
the University of the Aegean and Dublin City University
are pleased to announce:
The 5th Conference of the Beyond Humanism Conference Series

The specific focus of the Conference “The Posthuman: Differences, Embodiments, and Performativity” will be the posthuman, in its genealogies, as well as its theoretical, artistic and materialistic differences and possibilities. In order to guarantee a systematic treatment of the topic, we will particularly focus on the following themes:

1 What is the posthuman? Have humans always been posthuman? If so, in which sense? Is the posthuman a further evolutionary development of the human being? What are the implications of gender, sex and race, among other differential categories, for the embodied constitution of the posthuman? Do posthumans already exist? What is the difference between the posthuman, the transhuman, the antihuman and the cyborg?

2 Philosophical issues concerning the genealogies of the posthuman: Which traditions of thought are significant to the posthuman theoretical attempt to postulate a post-dualistic and post-essentialist standpoint? What are the differences between the genealogies of the posthuman and of the transhuman? What points do they hold in common? Is the posthuman a Western-centric notion? Could non-dualistic practices such as shamanism count as posthuman?

3 Bioarts, Body Art, Performance Art and the Posthuman: Which kind of art can be seen as leading towards the posthuman? Is the notion of the posthuman traceable in artistic traditions which precede the coining of the term “posthuman”? Can the posthuman be detected in cultures which have not been canonized by Western aesthetics?

4 Ethics, Bioethics, and the Moral Status of the Posthuman: Does the posthuman lead to a new, non-universalist, non-dualist understanding of ethics? Will posthumans have the moral status of a post-person, or will it be possible for them to have human dignity and personhood? Are human rights necessarily humanistic, or can they be re-enacted within a posthuman frame?

5 Emerging Technologies and the Posthuman: Which technologies represent the most significant challenge concerning the concept of the human/posthuman? Are restrictive national regulations concerning emerging technologies helpful in a globalized world? Do mind-uploading, plastic surgery, and cyborgian practices dissolve the border between human beings and machines? Human enhancement is already happening: should morphological freedom be regulated by social norms, or should it stand on individual choices?

6 Materialism and Posthuman Existence: The notion of matter as an active agent has been reinforced through Quantum Physics, on a scientific level, as well as by New Materialisms and Speculative Realism, on a philosophical level. Is the posthuman grounded in a materialist understanding of existence? What are the ontological, as well as the existential implications of the relationality of matter? Can it be related to a Posthuman Agency? What would a Posthuman Existentialism imply?

7 Posthuman Education: The notion of education in a posthumanist world; the transformation of the roles of teachers and learners in a posthuman social environment; what is the concept of a post- and transhumanist school? Which learning activities are central in a posthumanist educational system? Epistemological questions about knowledge construction in the posthumanist era also require further consideration.

Papers will be selected and arranged according to related topics. Equal voice will be given, if possible, to presentations from the arts, humanities, sciences, and technological fields.

Major areas of interest include (in alphabetic order):
Animal Studies, Antihumanism, Heritage and the Arts, Postmodernism, and Conceptual Art, Bioarts and Performance Art, Bioethics, Cosmology, Critical Race Studies, Cultural Studies, Cyborg Studies, Deconstructionism, Disability Studies, Ecology, Informatics, Emerging Technologies and Ethics, Enhancement, Evolution, Existentialism, Gender Studies, Intersectionality, New Materialisms, Philosophy, Physics, Posthumanism, Quantum Physics, Science and Technology Studies, Singularity, Spirituality, Speculative Realism, Transhumanism

Other possible topics include, but are not limited to:
· Bioethics, bioconservatism, bioliberalism, enhancement
· Posthumanist anthropology, aesthetics, ecology, feminism, critical theory
· Representation of human performance in technology and the arts
· Enhancement and political discourse, regulation, and human rights
· Humanism, posthumanism, transhumanism and antihumanism in philosophy
· Poststructuralism, postmodernism, and posthumanism
· New Materialisms, speculative realism and quantum physics
· Existentialism, relational ontology, posthuman agency
· Transhuman and posthuman impact on ethics and/or value formation
· Phenomenology and postphenomenology
· Embodiments and identity
· Transhumanism and/or posthumanism in science fiction and utopian/dystopian literature
· Non-dualism in spiritual practices, mysticism and shamanism
· Globalization and the spread of biomedicine and transhumanism
· Economic implications of transhumanist projects
· Popular culture and posthumanist representations
· Theology, enhancement, and the place of the posthuman
· Technology, robotics, and ethics
· Cybernetics, artificial intelligence, and virtual reality
· Cyborgs and democracy
· Humanity, human nature, biotechnology

SUBMISSIONS & DEADLINES

We invite abstracts of up to 500 words, to be sent in MS Word and PDF format to: posthuman.conference@gmail.com

Files should be named and submitted in the following manner:
Submission: First Name Last name.docx (or .doc) / .pdf
Example: “Submission: MaryAndy.docx”

Abstracts should be received by May 15th 2013.
Acceptance notifications will be sent out by June 15th.
All those accepted will receive information on the venue(s), local attractions, accommodations, restaurants, and planned receptions and events for participants.
*Presentations should be no longer than 20 minutes. Each presentation will be given 10 additional minutes for questions and discussions with the audience, for a total of 30 minutes.

FEES & REGISTRATION

A reduced registration fee of €50 (65USD) will apply to all participants.

SERIES “BEYOND HUMANISM” (site)

The Conference is part of the Series “Beyond Humanism”. The 1st Conference took place in April 2009 at the University of Belgrade (Humanism and Posthumanism), the 2nd Conference in September 2010 at the University of the Aegean (Audiovisual Posthumanism), the 3rd Conference in October 2011 at Dublin City University (Transforming Human Nature) and the 4th Conference in September 2012 at the IUC in Dubrovnik (Enhancement, Emerging Technologies and Social Challenges). This year, the conference “The Posthuman: Differences, Embodiments, and Performativity” will be held at the University of Roma 3, Department of Philosophy, Rome, Italy, from the 11th until the 14th of September 2013.


Ringer

On August 3, 2012, in Uncategorized, by enemyin1

Putting Monk through the ringer.

Too Monk


Kundera on Xenakis

On April 10, 2012, in Uncategorized, by enemyin1

Milan Kundera perfectly encapsulates what is great about Xenakis:

Even being a “prophet of unfeelingness,” Joyce was able to remain a novelist; Xenakis, on the other hand, had to leave music. His innovation was different in nature from that of Debussy or of Schoenberg. Those two never lost their ties to the history of music, they could always “go back” (and they often did). For Xenakis, the bridges had been burned. Olivier Messiaen said as much: Xenakis’s music is “not radically new but radically other.” Xenakis does not stand against some earlier phase of music; he turns away from all European music, from the whole of its legacy. He locates his starting point somewhere else: not in the artificial sound of a note separated from nature in order to express human subjectivity, but in the noise of the world, in a “mass of sound” that does not rise from inside the heart but instead comes to us from the outside, like the fall of the rain, the racket of a factory, or the shouts of a mob.

His experiments on sounds and noises that lie beyond notes and scales – can they become the basis of a new period in music history? Will his music live for long in music lovers’ memory? Not very likely. What will remain is the act of enormous rejection: for the first time someone dared to tell European music that it can all be abandoned. Forgotten. (Is it only chance that in his youth, Xenakis saw human nature as no other composer ever did? Living through the massacres of a civil war, being sentenced to death, having his handsome face forever scarred by a wound…) And I think of the necessity, of the deep meaning of this necessity, that led Xenakis to side with the objective sound of the world against the sound of a soul’s subjectivity.


Java Code for Delta: Midi Effect/Looper

On December 15, 2011, in Uncategorized, by enemyin1

Here’s the Java code for jDelta: a sequencing object for the MAX MSP software environment that I wrote a few years ago. jDelta stores note pitch, note duration, velocity and note delta times. It plays back each note according to its delta time with respect to the preceding notes in the sequence. The neat thing is that the probability of each stored note event, its transposition, velocity and length can all be manipulated via the PD arrays (notePD, etc.). So the stored sequence is a seed from which variable sequences can be generated on the fly.

This code is designed to be used in MAX MSP 5 (it should be compatible with 6, but I haven’t tried it). Currently, the clock source is MaxClock. There are other expressions here peculiar to code written for the Max environment, such as the declareInlets and declareOutlets statements, which determine the number of inputs and outputs of the graphical Max object. There are a lot of methods, but their purpose has been documented as far as I’m able. An example of a piece recorded using jDelta as a looping device is my Ring Modulated Piano improvisation here.

Anyway, feel free to tinker around. I think a better programmer than me ought to be able to adapt this to work much more smoothly and maybe to load sequences from MIDI files (!).

An example of the MAX interface I use with jDelta is given with the link to RM piano.

***************************************************************************************************************************

 

 

 

 

import com.cycling74.max.*;

import java.util.HashMap;
import java.util.Random;
import java.io.*;

public class jDelta extends MaxObject
{
    private HashMap<Integer, Integer> indexTable;   // pairs pitches with indices
    private HashMap<Integer, Double> timeTable;     // pairs pitches with their onset times

    private int min;
    private int max;
    private int index;              // index for array assigns; incremented with note-ons
    private int count;              // count outputs note data, cycling through the arrays
    private Random generator;       // random number generator
    private double onTime;          // time of note-on
    private double masterDuration;  // target duration supplied from an external source
    private boolean wrap;           // if true, sequence length is held at a constant duration
    private boolean Sticky;         // keeps max at its last value so the sequence length stays fixed on update
    private double lastDelta;       // value added to the final delta time to lengthen the gap between end and beginning of the sequence
    private double offTime;         // time of note-off
    private double noteDuration;    // holds note duration
    private double deltaTime;       // time between note-ons
    private double r;               // holds the next random value

    private int [] pBank;
    private int [] velBank;
    private double [] ndBank;
    private double [] deltaBank;
    // arrays storing pitch, velocity, note duration and delta times

    private int arraySize;          // bankSize argument, used for determining step time

    private int [] transBank;
    private int [] velChangeBank;
    // arrays storing transpose and velocity-change data

    private double [] notePD;       // probability of each note event
    private double [] transPD;      // probability of each transpose event
    private double [] velChangePD;  // probability of each velocity-change event

    private double [] tempoBank;    // tempo values
    private double [] tempoPD;      // probability of tempo changes

    private MaxClock clock;
    private PrintWriter pw;

    private int [] pitchBuffer1;
    private int [] velBuffer1;
    private double [] ndBuffer1;
    private double [] deltaBuffer1;

    private int [] pitchBuffer2;
    private int [] velBuffer2;
    private double [] ndBuffer2;
    private double [] deltaBuffer2;

    private int [] pitchBuffer3;
    private int [] velBuffer3;
    private double [] ndBuffer3;
    private double [] deltaBuffer3;

    private int [] pitchBuffer4;
    private int [] velBuffer4;
    private double [] ndBuffer4;
    private double [] deltaBuffer4;
    // buffers for on-the-fly storage

    private static final String[] INLET_ASSIST = new String[]{
        "inlet 1 help"
    };
    private static final String[] OUTLET_ASSIST = new String[]{
        "outlet 1 help"
    };

    // variables for zip method
    int zipMin;     // minimum index of array to which zip applies
    int zipMax;     // maximum index of array to which zip applies
    int zipAdd;     // number of steps by which indexed values are moved

    /** Creates a new instance of jDelta */
    public jDelta(int bankSize)
    {
        declareInlets(new int[]{DataTypes.ALL, DataTypes.ALL, DataTypes.ALL});
        declareOutlets(new int[]{DataTypes.ALL, DataTypes.ALL, DataTypes.ALL, DataTypes.ALL, DataTypes.ALL,
            DataTypes.ALL, DataTypes.ALL, DataTypes.ALL, DataTypes.ALL, DataTypes.ALL, DataTypes.ALL});

        setInletAssist(INLET_ASSIST);
        setOutletAssist(OUTLET_ASSIST);

        this.Sticky = false;
        this.wrap = false;
        this.masterDuration = 2000.0;   // initialise masterDuration to something!
        this.indexTable = new HashMap<Integer, Integer>();
        this.timeTable = new HashMap<Integer, Double>();
        this.min = 1;
        this.count = 1;
        this.onTime = 0.0;
        this.index = 0;
        this.lastDelta = 0.0;
        this.arraySize = bankSize;

        this.clock = new MaxClock(new Callback(this, "Play"));
        this.pBank = new int[bankSize];
        this.velBank = new int[bankSize];
        this.ndBank = new double[bankSize];
        this.deltaBank = new double[bankSize];

        this.transBank = new int[bankSize];
        this.velChangeBank = new int[bankSize];

        this.tempoBank = new double[bankSize];
        // set all tempo values to 1 by default
        for (int i = 0; i < bankSize; i++)
        {
            tempoBank[i] = 1.0;
        }
        this.tempoPD = new double[bankSize];

        this.notePD = new double[bankSize];
        this.transPD = new double[bankSize];
        this.velChangePD = new double[bankSize];

        this.generator = new Random();

        this.pBank[0] = 60;
        this.velBank[0] = 0;
        this.ndBank[0] = 0.0;
        this.deltaBank[0] = 0.0;

        pitchBuffer1 = new int[bankSize];
        velBuffer1 = new int[bankSize];
        ndBuffer1 = new double[bankSize];
        deltaBuffer1 = new double[bankSize];

        pitchBuffer2 = new int[bankSize];
        velBuffer2 = new int[bankSize];
        ndBuffer2 = new double[bankSize];
        deltaBuffer2 = new double[bankSize];

        pitchBuffer3 = new int[bankSize];
        velBuffer3 = new int[bankSize];
        ndBuffer3 = new double[bankSize];
        deltaBuffer3 = new double[bankSize];

        pitchBuffer4 = new int[bankSize];
        velBuffer4 = new int[bankSize];
        ndBuffer4 = new double[bankSize];
        deltaBuffer4 = new double[bankSize];
    }

    public void Reset()
    {
        index = 0;
        count = 1;
    }

    public void giveMin(int intMin)
    {
        min = intMin;
        count = intMin;
        outlet(4, count);
    }

    public void giveMax(int intMax)
    {
        max = intMax;
        outlet(4, max);
    }

    public void getWrapValue(int anInt)
    {
        // Determines whether wrap applies. Inputting 1 turns on wrap.
        if (anInt == 1)
        {
            wrap = true;
            outlet(10, 1);
        }
        else
        {
            wrap = false;
            outlet(10, 0);
        }
    }

    public void Sticky(int anInt)
    {
        // Determines whether Sticky applies. Inputting 1 turns on Sticky.
        if (anInt == 1)
        {
            Sticky = true;
            outlet(10, 1);
        }
        else
        {
            Sticky = false;
            outlet(10, 0);
        }
    }

    public double giveFinalDelta(double aDouble)
    {
        lastDelta = aDouble;
        return lastDelta;
    }

    public void setFinalDelta(double aDouble)
    {
        lastDelta = aDouble;
    }

    public void getDuration()
    {
        double durationSum = 0.0;
        for (int i = min; i <= max; i++)
        {
            durationSum = durationSum + deltaBank[i];
        }
        outlet(8, min);
        outlet(9, max);
        outlet(6, durationSum);
    }

    public double returnDuration()
    {
        double durationSum = 0.0;
        for (int i = 0; i <= max; i++)
        {
            durationSum = durationSum + deltaBank[i];
        }
        return durationSum;
    }

    // Get the running total of durations prior to the duration to be entered into deltaBank.
    public double getRunningDur()
    {
        double durationSum = 0.0;
        for (int i = min; i <= (index - 1); i++)
        {
            durationSum = durationSum + deltaBank[i];
        }
        outlet(11, durationSum);
        return durationSum;
    }

    public void getMasterDuration(double aDouble)
    {
        masterDuration = aDouble;
        outlet(6, masterDuration);
    }

    public void Start()
    {
        clock.delay(0.0);
    }

    public void Stop()
    {
        clock.unset();
        count = 1;
    }

    // Records pitch, velocity, note duration and delta time for a given array index.
    public void list(int[] aList)
    {
        if (getInlet() == 0)
        {
            if (aList[1] != 0)
            // If a note-on...
            {
                deltaTime = (clock.getTime() - onTime);
                // Calculate deltaTime by subtracting the last onTime from the current time.
                // On the first note, onTime is initialised to zero.
                onTime = clock.getTime();   // update onTime

                if (wrap == false)
                {
                    deltaBank[index] = deltaTime;
                    // Sends the delta to the index associated with the previous note, prior to incrementing the index.
                    if (index < (pBank.length - 1))
                        {index = index + 1;}
                    else
                        {index = 0;}
                    // A note-on increments the index if it is less than the array length and zeros it otherwise.
                    outlet(4, index);
                    if (Sticky == false)
                        {max = index;}
                    outlet(9, max);

                    pBank[index] = aList[0];
                    velBank[index] = aList[1];
                    // Assigns pitch and velocity.

                    this.indexTable.put(aList[0], index);
                    this.timeTable.put(aList[0], onTime);
                    // Associates the current index and current time with the input pitch.
                }
                else
                // If wrap is enabled...
                {
                    if (((getRunningDur() + deltaTime) < masterDuration) || (index == 0))
                    // If the running duration plus the delta time is less than the desired duration,
                    // or index == 0, the delta can safely be added to deltaBank.
                    {
                        deltaBank[index] = deltaTime;
                        // Sends the delta to the index associated with the previous note, prior to incrementing the index.
                        if (index < (pBank.length - 1))
                            {index = index + 1;}
                        else
                            {index = 0;}
                        outlet(4, index);
                        max = index;
                        outlet(9, max);
                        // A note-on increments the index if it is less than the array length and zeros it otherwise.
                        pBank[index] = aList[0];
                        velBank[index] = aList[1];
                        // Assigns pitch and velocity.

                        this.indexTable.put(aList[0], index);
                        this.timeTable.put(aList[0], onTime);
                        // Associates the current index and current time with the input pitch.
                    }
                }
            }
            else
            // If a note-off...
            {
                offTime = clock.getTime();   // records time of note-off
                noteDuration = (offTime - this.timeTable.get(aList[0]));
                // Calculates the note duration by subtracting the note-on time for the same pitch,
                // retrieved from the hash table.
                ndBank[this.indexTable.get(aList[0])] = noteDuration;

                this.timeTable.remove(aList[0]);
                this.indexTable.remove(aList[0]);
                // removes the mappings
            }
        }

        if (getInlet() == 1)
        {
            if (aList[0] == 3)
            {
                for (int i = 1; i < aList.length; i++)
                    {transBank[i] = aList[i];}
            }
            if (aList[0] == 4)
            {
                for (int i = 1; i < aList.length; i++)
                    {velChangeBank[i] = aList[i];}
            }
        }
    }

    public void Export() throws IOException
    {
        String fileName = MaxSystem.saveAsDialog("Save As", "Filename");
        DataOutputStream os = new DataOutputStream(new FileOutputStream(fileName));
        for (int i = 0; i < pBank.length; i++)
        {
            os.writeInt(pBank[i]);
        }
        for (int i = 0; i < velBank.length; i++)
        {
            os.writeInt(velBank[i]);
        }
        for (int i = 0; i < ndBank.length; i++)
        {
            os.writeDouble(ndBank[i]);
        }
        for (int i = 0; i < deltaBank.length; i++)
        {
            os.writeDouble(deltaBank[i]);
        }
        os.writeInt(min);
        os.writeInt(max);
        os.writeInt(index);
        os.close();
    }

    // Import with dialog
    public void Import() throws IOException
    {
        String fileName = MaxSystem.openDialog("Open File");
        outlet(10, fileName);
        DataInputStream is = new DataInputStream(new FileInputStream(fileName));
        for (int i = 0; i < pBank.length; i++)
        {
            pBank[i] = is.readInt();
        }
        for (int i = 0; i < velBank.length; i++)
        {
            velBank[i] = is.readInt();
        }
        for (int i = 0; i < ndBank.length; i++)
        {
            ndBank[i] = is.readDouble();
        }
        for (int i = 0; i < deltaBank.length; i++)
        {
            deltaBank[i] = is.readDouble();
        }
        min = is.readInt();
        max = is.readInt();
        index = is.readInt();
        is.close();

        if (wrap)
        {
            // If wrap is enabled, reduce the number of sequence steps (max) to fit within the duration.
            // The final duration is added on play.
            while ((returnDuration() > masterDuration) && max > 1)
            {
                max = max - 1;
            }
        }

        outlet(4, index);
        outlet(8, min);
        outlet(9, max);
    }

    // Import with prepend to fileName
    public void Insert(String fileName) throws IOException
    {
        fileName = "/Applications/Max5/patches/delta clips/" + fileName;
        File f = new File(fileName);
        outlet(10, fileName);
        DataInputStream is = new DataInputStream(new FileInputStream(f));

        for (int i = 0; i < pBank.length; i++)
        {
            pBank[i] = is.readInt();
        }
        for (int i = 0; i < velBank.length; i++)
        {
            velBank[i] = is.readInt();
        }
        for (int i = 0; i < ndBank.length; i++)
        {
            ndBank[i] = is.readDouble();
        }
        for (int i = 0; i < deltaBank.length; i++)
        {
            deltaBank[i] = is.readDouble();
        }
        min = is.readInt();
        max = is.readInt();
        index = is.readInt();
        is.close();

        if (wrap)
        {
            // If wrap is enabled, reduce the number of sequence steps (max) to fit within the duration.
            // The final duration is added on play.
            while ((returnDuration() > masterDuration) && max > 1)
            {
                max = max - 1;
            }
        }

        outlet(4, index);
        outlet(8, min);
        outlet(9, max);
    }

    public void write() throws FileNotFoundException
    {
        String fileName;
        fileName = MaxSystem.saveAsDialog("Save As", "Filename");
        pw = new PrintWriter(fileName);

        for (int i = 0; i < pBank.length; i++)
        {
            pw.print(pBank[i]);
        }
        for (int i = 0; i < velBank.length; i++)
        {
            pw.print(velBank[i]);
        }
        for (int i = 0; i < ndBank.length; i++)
        {
            pw.print(ndBank[i]);
        }
        for (int i = 0; i < deltaBank.length; i++)
        {
            pw.print(deltaBank[i]);
        }
        pw.close();
    }

    // Receives multislider values (0-100) prefixed with a selector and scales them to probabilities.
    public void list(double[] list)
    {
        if (getInlet() == 1)
        {
            if (list[0] == 0.0)
            {
                for (int i = 1; i < list.length; i++)
                    {notePD[i] = list[i] / 100.0;}
            }
            if (list[0] == 1.0)
            {
                for (int i = 1; i < list.length; i++)
                    {transPD[i] = list[i] / 100.0;}
            }
            if (list[0] == 2.0)
            {
                for (int i = 1; i < list.length; i++)
                    {velChangePD[i] = list[i] / 100.0;}
            }
            if (list[0] == 3.0)
            {
                for (int i = 1; i < list.length; i++)
                    {tempoBank[i] = list[i] / 100.0;}
            }
            if (list[0] == 4.0)
            {
                for (int i = 1; i < list.length; i++)
                    {tempoPD[i] = list[i] / 100.0;}
            }
        }
    }

    // Outputs pitch, velocity and note duration for step i if the current random value passes the note probability.
    public void beatOut(int i, int p, int v, double nd)
    {
        if (r < notePD[i])
        {
            outlet(0, p);
            outlet(1, v);
            outlet(2, nd);
        }
        outlet(5, count);
    }

    public void Play()
    {
        double delayTime = 200.0;
        if (wrap == false)
        {
            this.deltaBank[max] = (this.ndBank[max] + lastDelta);
            // The final delay defaults to the final note duration when lastDelta = 0.
        }
        else
        {
            setFinalDelta(masterDuration - getRunningDur());
            this.deltaBank[max] = lastDelta;
        }
        getDuration();
        // If wrap is on, the last delta is set to the difference between the running duration
        // prior to max and the desired duration. Just need to ensure the difference is always positive.

        this.r = generator.nextDouble();
        outlet(3, r);

        int p = pBank[count];
        int v = velBank[count];
        double nd = ndBank[count];
        // assign array values to local variables

        if (r < transPD[count])
            {p = pBank[count] + transBank[count];}
        if (r < velChangePD[count])
            {v = velBank[count] + velChangeBank[count];}

        beatOut(count, p, v, nd);

        if (r < tempoPD[count])
        {
            delayTime = (deltaBank[count] * tempoBank[count]);
        }
        else
        {
            delayTime = deltaBank[count];
        }

        clock.delay(delayTime);

        if (count < max)
        {
            r = generator.nextDouble();
            count = count + 1;
        }
        else
        {
            count = min;
        }
    }

    protected void notifyDeleted()
    {
        clock.release();
        post("Feck");
    }
}

Microsound and Time

On November 7, 2011, in Uncategorized, by enemyin1

In “Splice, Freeze, Stretch and Mutate: Digital rhythm as harbinger of the event”, Eleni Ikoniadou asks whether the manipulation of microsound in granular synthesis reveals a “rhythmic time” below the level of our awareness of temporal succession. More microsound here!


 

Laurie Anderson’s elegant proposal for a sound installation in which the audience’s own body is used simultaneously as conductive medium and speaker nicely illustrates a problem confronting the Located Event theory of sound (LET – see Roden 2010 – web-published version here). LET comes in two flavors. The first, due to Roberto Casati and Jerome Dokic, holds that sounds are resonance events in objects. The other, due to Casey O’Callaghan, holds that sounds are disturbances in a medium caused by vibrating objects. On the first theory, space ships really make sounds in a vacuum, since the sounds just are the vibrations induced in them by their propulsion systems. According to the second, they don’t, since there is no medium in which auditory pressure waves can occur.

The fact that both theories cohere more or less equally with folk psychoacoustics is a nice case of epistemic underdetermination. Casati and Dokic’s view implies that there is a sound located in a vibrating tuning fork contained in an evacuated jar, while O’Callaghan’s implies that there is none. Most folk would likely judge that there is no sound in the evacuated jar. However, were the air in a jar containing a vibrating tuning fork to be alternately evacuated and replenished, they would probably perceive this as an alteration in the conditions of audition of a continuous sound, rather than as an alternation of discrete sound events.

However, the LET is also subject to metaphysical indeterminacy or ‘slack’. The causal influence that eventually produces an auditory experience propagates through various stages of processing and transduction. In a digital audio system the information which eventually determines the vibratory behaviour may be an array of sample values stored in an mp3 audio file. These need to be converted into an analog electrical signal by a digital-to-analog converter (DAC), which in turn drives a speaker diaphragm that generates pressure waves in the air. A speaker diaphragm doesn’t resonate on its own – it needs electrical input.
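
A minimal sketch makes the division of labour vivid: the program’s entire contribution is a buffer of numbers handed to the audio hardware, which does the vibrating. Here a one-second sine tone stands in for the decoded mp3 stream, and the standard Java sound API is used purely for illustration.

import javax.sound.sampled.*;

// Sketch of the causal chain described above: the program computes nothing but a stream
// of sample values; the DAC and speaker diaphragm do the vibrating. The 440 Hz sine tone
// is an illustrative stand-in for the decoded contents of an mp3 file.
public class NumberStream {
    public static void main(String[] args) throws LineUnavailableException {
        float sampleRate = 44100f;
        AudioFormat fmt = new AudioFormat(sampleRate, 16, 1, true, false); // 16-bit mono PCM, little-endian
        SourceDataLine line = AudioSystem.getSourceDataLine(fmt);
        line.open(fmt);
        line.start();

        byte[] buffer = new byte[(int) sampleRate * 2]; // one second of 16-bit samples
        for (int i = 0; i < (int) sampleRate; i++) {
            short s = (short) (Math.sin(2 * Math.PI * 440 * i / sampleRate) * 8000);
            buffer[2 * i] = (byte) (s & 0xff);            // low byte
            buffer[2 * i + 1] = (byte) ((s >> 8) & 0xff); // high byte
        }
        line.write(buffer, 0, buffer.length); // hand the numbers over to the audio hardware
        line.drain();
        line.close();
    }
}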

Thus it might seem that a sound produced in such a setup is located in the whole system (computer-DAC-speaker) and not in the speaker diaphragm. On the other hand, the computer doesn’t vibrate so as to output the sound stored in the mp3: its activity simply consists in producing a stream of numerical values which tell the DAC what to do. So where is the sound event? If you cut off the digital stream the sound will stop, so it is tempting to view the computer-DAC-speaker system as a single resonating system, since it is the whole shebang which produces and maintains the sound. Then again, the computer does not vibrate so as to disturb the air; only the speaker diaphragm does that. So there are reasons for locating the sound in the speaker if you are Casati and Dokic, or at the interface between diaphragm and air if you are O’Callaghan.

In Anderson’s setup, however, the sound is produced by a tape source in the table (one candidate for the event’s location), but what you hear is also due to resonance in your own cranial cavities. So do we have one sound located in the system tape-table-screws-elbows-skull represented in the diagram, or a series of sonic events (including the one under the table and the one in your head)? Need there be any metaphysical fact of the matter about where the sound is? I think the claim that there need not be is quite supportable. The sound event occurs, but certain facts about its extent and location are inherently vague.

Interestingly, this does not imply that the sound event is some weird noumenal pulsion welling up beyond our representational capacities. Clearly, we do (in some sense) locate sounds, run Fourier transforms on them, record them, sample them, etc. Representing sounds is what our auditory systems are designed to do and what studio technicians are paid to do.

Roden, David (2010), ‘Sonic Art and the Nature of Sonic Events’, in Bullot, N. J. & Egré, P. (eds.), Objects and Sound Perception, special issue of Review of Philosophy and Psychology 1(1), pp. 141-156.


Radical Computer Music

On February 9, 2011, in Uncategorized, by enemyin1

SYGNOK & The War For Radical Computer Music

Here’s an intriguing extract from a film about Gæoudjiparl van den Dobbelsteen, originator of Radical Computer Music. RCM is not merely made with computers but designed for computers; or, more specifically, for prospective artificial intelligences/life forms that do not currently exist.

I need to research this further. RCM could be an exemplary aesthetic vector for the posthuman, an elaborate metafictional joke, or both.
