Drawing pixels with JavaScript: An animated Voronoi

For about 20 years I have managed to avoid JavaScript for interactive and animated content, using Flash, Director and Java instead. But now, with the coming of HTML5, I am starting to like this (old) option. Desktops are fast enough to run JavaScript that renders directly to the screen.

The example uses Canvas and the getContext() method. Here we see 350 x 350 = 122500 pixels being recalculated on every frame. The pixels are coloured depending on their distance to the moving points. The result is an animated Voronoi diagram.
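
The demo itself is written in JavaScript, but the idea fits in a few lines of Processing (the language used in the articles below). This is only a minimal sketch of the technique, not the actual demo code; the point count, movement and colours are arbitrary choices of mine.

// Minimal Processing sketch of the same idea: colour every pixel
// according to its nearest moving point.
int numPoints = 8;
float[] px = new float[numPoints];
float[] py = new float[numPoints];
color[] cols = new color[numPoints];

void setup() {
  size(350, 350);
  for (int i = 0; i < numPoints; i++) {
    px[i] = random(width);
    py[i] = random(height);
    cols[i] = color(random(255), random(255), random(255));
  }
}

void draw() {
  // jitter the points a little on every frame
  for (int i = 0; i < numPoints; i++) {
    px[i] = (px[i] + random(-2, 2) + width) % width;
    py[i] = (py[i] + random(-2, 2) + height) % height;
  }
  loadPixels();
  for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
      int nearest = 0;
      float minD = Float.MAX_VALUE;
      for (int i = 0; i < numPoints; i++) {
        float dx = x - px[i];
        float dy = y - py[i];
        float d = dx*dx + dy*dy;   // squared distance, no sqrt needed
        if (d < minD) { minD = d; nearest = i; }
      }
      pixels[y*width + x] = cols[nearest];
    }
  }
  updatePixels();
}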

Feel free to play with the settings below the animation. The source JavaScript is also available for download.


[Voronoi2Preview: interactive HTML5 canvas animation]



Markov Chain – the implementation

This article builds on the previous article Markov Chain – or how to predict the future. Assuming that one understands the basics of the Markov Chain, I will describe an implementation in the programming language Processing.

Generating the Markov Chain

To generate the Markov Chain, we need some source data from which to derive the Markov property. One can use, for instance, statistics on web browsing, musical notes, events of the day, letters or words. Any chain of events that exposes a certain predictability will do. In our example we will use words, since language is something we all understand.

We will implement the following strategy:

  1. Load a text and turn it into a single lowercase string.
  2. Extract sentences, using the dot (.), question mark (?) and exclamation mark (!) as sentence separators.
  3. Remove any special characters (those that are not letters or spaces).
  4. WordList: Create a list containing all words. If a word appears several times, this list will contain it several times.
  5. ComboList: Create another list containing all 1st degree word combinations (e.g. “hello there”).
  6. Generate your first word by picking a random word from the WordList (e.g. “and”).
  7. Make a sub list of all combinations that start with the previous word (e.g. “and then”, “and he”, etc.).
  8. Create a new word:
    a. If the sub list contains items, pick the last word of a random combo (e.g. “and then” gives “then”).
    b. If the sub list is empty, pick a new random word.

  9. Go to 7 using the new word (repeating as often as you want).

If we wanted to create proper sentences with a beginning and an ending, we would have to implement three more things (a rough sketch follows the list):

  • Create a list of words that appear only at the beginning of sentences, to be used in (6).
  • In (2), add the dot (.) at the end of each sentence, separated by a space, so that it is identified as a word of its own.
  • In (9), we would repeat until the last generated word is a dot.
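
A minimal sketch of those three additions, reusing the fullText string and the regular expressions from the implementation below (the variable names are mine, not part of the download):

// Keep the dot as a word of its own, and remember sentence starters.
String[] rawSentences = fullText.split("[\\.?!]");
ArrayList beginWordList = new ArrayList();
for (int i = 0; i < rawSentences.length; i++) {
  String s = rawSentences[i].replaceAll("[^a-zA-Z\\s]", "").trim();
  if (s.length() == 0) continue;
  String[] words = s.split(" +");
  beginWordList.add(words[0]);   // candidates for the first word in (6)
  rawSentences[i] = s + " .";    // the dot becomes an ordinary word, see (2)
}
// In (9), keep generating until newWord.equals(".")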

Implementation

1. Load text and turn into a single string:

String[] allLines = loadStrings("bbe_noChap.txt");
StringBuilder sb = new StringBuilder();
for (int i=0;i<allLines.length;i++) {
  sb.append(allLines[i].toLowerCase()+" ");
}
String fullText = sb.toString();

2. Extract sentences

String[] allSentences = fullText.split("[\\.?!]");

Ok, that looks simple, but what the fuck is "[\\.?!]"? The String.split() function splits a String into an array of Strings using a given text pattern. For instance, "1-2-3".split("-") will result in an array of Strings containing "1", "2" and "3".

Note that the text pattern is interpreted as a regular expression. Regular expressions are a powerful tool for matching text patterns. In our case the String "[\\.?!]" splits the text at any of the three characters inside the brackets. Since the dot (.) is a special character in regular expressions, we need to escape it using two backslashes.
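
A quick way to convince yourself in Processing's console:

// Splitting on any of the three sentence separators:
String demo = "no. way? yes!";
String[] parts = demo.split("[\\.?!]");
println(parts.length);  // prints 3
println(parts[1]);      // prints " way" (note the leading space)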

3. Remove any special characters.

If we don’t want to end up with words like “man,”, we need to remove these special characters from our sentences. Another regular expression is used to replace anything that is not a letter or whitespace with an empty string. The expression “[a-zA-Z]” stands for any letter, lower or upper case. “\\s” stands for whitespace. The “^” character negates the expression.

NOT ( a to z OR A to Z OR whitespace )
[^ a-z A-Z \\s ]

for (int i=0;i<allSentences.length;i++) {
  allSentences[i] = allSentences[i].replaceAll("[^a-zA-Z\\s]","");
}

4. Create a list containing all words.

Now we will create the ArrayList containing all words. Because splitting can leave an empty string here and there, we will not add those to our list.

ArrayList allWordList = new ArrayList();
for (int i=0;i<allSentences.length;i++) {
  String[] words = allSentences[i].split(" +");
  for (int j=0;j<words.length;j++) {
    if (!words[j].equals("")) {  // skip empty strings left by the split
      allWordList.add(words[j]);
    }
  }
}

5. Create a list containing all 1st degree combinations.

Very similar to 4, but instead we are extracting word pairs. A sentence like “Markov Chains are fun” would result in the array of Strings “Markov Chains”, “Chains are”, “are fun”.

ArrayList allComboList = new ArrayList();
for (int i=0;i<allSentences.length;i++) {
  String[] words = allSentences[i].split(" +");
  String lastWord = "";
  for (int j=0;j<words.length;j++) {
    if (!words[j].equals("")) {  // skip empty strings left by the split
      if (!lastWord.equals("")) {
        String combo = lastWord+" "+words[j];
        allComboList.add(combo);
      }
      lastWord = words[j];
    }
  }
}

6. Generate first word.

The word list contains one entry per occurrence, so frequent words appear in it frequently. Choosing random words from this list will therefore result in words exposing the same statistical occurrences as the source text.

String word = (String)allWordList.get((int)random(allWordList.size()));

[Figure: list of combinations starting with the word “but”]

7. Generate a sub list of word pairs to choose from.

We are now looking for word pairs that start with the previous word. To do so, we simply check whether each entry in the combination list starts with the previous word plus " ", and if so add it to a temporary list.

String toFind = word+" ";
ArrayList subComboList = new ArrayList();
for (int i=0;i<allComboList.size();i++) {
  String combo = (String)allComboList.get(i);
  if (combo.startsWith(toFind)) {
    subComboList.add(combo);
  }
}

8. Create a new word

If our sub list contains combinations, we just pick one at random and extract its last word. If the list is empty, we just pick another random word.

String newWord = "";
if (subComboList.size()>0) {
  String combo = (String)subComboList.get((int)random(subComboList.size()));
  newWord = combo.split(" ")[1];
} else {
  newWord = (String)allWordList.get((int)random(allWordList.size()));
}

9. Go to (7) using the newWord as word

word = newWord;

Feel free to download the Processing implementation and play around with it: Markov Chain Processing


Markov Chain – or how to predict the future

When stepping into the domains of artificial intelligence, one is soon confronted with the Markov Chain. In this article I will try to unravel the mysteries behind it without using too much mathematics.

1st order Markov Chain

The Markov Chain is a mathematical system in which future “events” can be statistically predicted by past events. Let’s look at a small example:

[Figure: a Markov Chain for Robert’s morning routine]

This chart represents a statistical distribution of events during the morning routine of a fictional character, Robert. It could have been created by recording the events and their chronological order each day.

Let’s say the day starts with [Alarm goes]. The chance that Robert will [Snooze] the alarm is 65%, and the chance that he will [Get up] is 35%. After this, the chance to [Go to bathroom] is 70%, and so on.

We could now create a computer program that models Robert's behaviour by generating a random number after each event to predict the next one. Although the predicted event might not be what Robert actually does, the overall chain of events will expose the same statistical properties as the Markov Chain.
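
A minimal sketch of such a program, in Processing. Only the 65/35, 70 and 75 percent figures come from this article; the remaining probabilities, the behaviour of [Snooze] and the event list itself are my assumptions.

// Simulate Robert's morning by walking a transition table.
String[] events = { "Alarm goes", "Snooze", "Get up", "Go to bathroom", "Go to kitchen", "Make coffee" };
// transition[i][j] = chance of event j following event i
float[][] transition = {
  { 0.00, 0.65, 0.35, 0.00, 0.00, 0.00 },  // Alarm goes (65/35 from the text)
  { 0.00, 0.65, 0.35, 0.00, 0.00, 0.00 },  // Snooze (assumed: behaves like the alarm)
  { 0.00, 0.00, 0.00, 0.70, 0.30, 0.00 },  // Get up (70% from the text, rest assumed)
  { 0.00, 0.00, 0.00, 0.00, 1.00, 0.00 },  // Go to bathroom (assumed)
  { 0.00, 0.00, 0.00, 0.00, 0.00, 0.75 },  // Go to kitchen (75% from the text)
  { 0.00, 0.00, 0.00, 0.00, 0.00, 0.00 }   // Make coffee (end of the morning)
};

int nextEvent(int current) {
  float r = random(1);  // uniform number in [0, 1)
  float sum = 0;
  for (int j = 0; j < transition[current].length; j++) {
    sum += transition[current][j];
    if (r < sum) return j;
  }
  return -1;  // if a row sums to less than 1, the remainder ends the chain
}

void setup() {
  int e = 0;  // start with [Alarm goes]
  while (e >= 0) {
    println(events[e]);
    e = nextEvent(e);
  }
}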

Application of the Markov Chain

Markov Chains become very interesting when used in the domain of art. Since each language has its own statistical distribution of words, they can be used to generate pseudo texts or poetry. Markov Chains are also popular in music.

If I were to record all my improvisations on a piano and see each note as an event, I could create a simple Markov Chain describing the distributional properties of notes of my personal style. When doing so, one quickly runs into the problem that these melodies would sound like aimless wandering. The same happens with language. Let’s look at another example.

I have taken the Bible as my source (because it’s a freely available text) where each letter represents an event. When applying this chain to generate new texts, I ended up with a bizarre conglomeration of characters that only remotely resembles the English language:

“Ayoto in he yo thok tomend irs tong i tounto wis tasnd s id bed tom thing t an isaw, aceeminthifremoudene he arofr wis, g ga je thas then f d simess s fa tt m s we ther ther ht car trofu fon y orat we masaseanghe be, puire whagr saweve t jace thing bll t.”

Interesting. Silly. Bizarre.

Higher order Markov Chain

So why is this not producing the results we want? We are getting results that expose similar properties to the English language, but we are missing the deeper structure.

In this case the chance for a transition from ‘t’ to ‘h’ is fairly high (3%). The chance for a transition from ‘h’ to ‘a’ is also fairly high (0.75%). This would result in a word beginning with ‘tha’, of which there are not many.
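
A sketch of one way to measure such chances, assuming the percentages are shares of all letter-to-letter transitions in the source text (the file name is borrowed from the implementation article above):

// Count every letter-to-letter transition in the source text.
int[][] counts = new int[26][26];
int total = 0;

void setup() {
  String[] lines = loadStrings("bbe_noChap.txt");
  for (int k = 0; k < lines.length; k++) {
    String text = lines[k].toLowerCase();
    char prev = 0;
    for (int i = 0; i < text.length(); i++) {
      char c = text.charAt(i);
      if (c < 'a' || c > 'z') { prev = 0; continue; }  // reset at non-letters
      if (prev != 0) { counts[prev - 'a'][c - 'a']++; total++; }
      prev = c;
    }
  }
  float pct = 100.0 * counts['t' - 'a']['h' - 'a'] / total;
  println("t -> h: " + pct + "% of all transitions");
}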

Instead of looking at the transition from one event to another, we can look at several previous events to determine a future event. Let's go back to the example of Robert and start with the event [Go to kitchen]. There would be a 75% chance of him proceeding to [Make coffee] at any time of the day, rendering him completely hyper. If we consider previous events as well, we can create a more sensible chain:

[Figure: example of a higher-order (2nd-order) Markov Chain]

Now Robert will very likely make coffee if he got up before going to the kitchen. If he came home before going to the kitchen, he will very likely get himself a beer. That sounds much more plausible. This is what we call a second-order Markov Chain.
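
To give an idea in code: a second-order chain simply keys its lookups on the last two events instead of one. This is just a sketch of mine using a HashMap; the implementation article stores plain combination lists instead.

// Order-2 lookup: the key is the pair of previous events.
HashMap<String, ArrayList<String>> chain = new HashMap<String, ArrayList<String>>();

void learn(String[] events) {
  for (int i = 2; i < events.length; i++) {
    String key = events[i - 2] + "|" + events[i - 1];
    if (!chain.containsKey(key)) chain.put(key, new ArrayList<String>());
    chain.get(key).add(events[i]);  // one observed follow-up event
  }
}

String next(String prev2, String prev1) {
  ArrayList<String> options = chain.get(prev2 + "|" + prev1);
  if (options == null) return null;  // pair never seen in the source
  return options.get((int)random(options.size()));
}

void setup() {
  learn(new String[] { "got up", "go to kitchen", "make coffee" });
  println(next("got up", "go to kitchen"));  // prints "make coffee"
}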

Applying this to our Bible, we can start producing more intelligible results:

2nd order (still a bit weird):

“and hime ame to for frearaohshamillor for of and our dim, is is of eas said”
“and words ween to haver lacoving whour orter thent”

3rd order (much better):

“them on wher have this placing of jacob said the was go circumcision their for the his prod after, god.”
“and even offertain, and ish, went on him.”
“and before in the worden yountry, which for he withom your fath joseph, the cattle are had and hough padditing they dawn i with to your of the how and field to years gone taken this people.”

4th order (haha, awesome!):

“and said, it cain, and son, which he is which had not kept jacobs daughters of the god of them.”
“every daughter face the sent wide a hundred and her vessel of the land i have to what evil,”

Different approaches

For generating “Biblish” words, it seems that I have succeeded. But for generating Biblish sentences, I don’t think I did. When thinking of language, we can approach texts from a greater distance: instead of interpreting single characters as events, we can use entire words. We will then have chains that tell us something about the relationships between words, not characters, and thus build proper sentences.

1st order (weird, but they seem sentences):

“and he said, and it, causing its value of all on israel did you, and see that it, i have to you are to destruction. ”
“and in the servant ; and a dream by the cover of one will be as moses , sent him.”

2nd order (getting better):

“and enoch went on living with me if i will take away the sense of all god had given his journey with all he has given all he was with his father said to moses: and when moses’ hand was stretched out to the child the name of the fruit of the mountain the lord gave orders to put me to rest there for some time.”

3rd order (sensible text, yay!):

“and there will be a wanderer in flight over the earth.”
“and god said to him, truly, i who am only dust, have undertaken to put my honour to the test”

Where to go from here

Now that the mysteries of the Markov Chain have been cleared up (I hope), we can start implementing this in code. In the next article I will show how to go about it. For now, I will leave you with a non-Markov-Chain generated composition.
