Markov Chains
Markov chains are a way of representing stochastic processes as probabilities of state changes. This is easiest to see with a small example: suppose we are currently in state `A`, which has a 60% probability of staying in state `A` and a 40% probability of moving to state `E`. State `E` has a 70% chance of returning to `A` and a 30% chance of staying in state `E`.
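The transition table above can be sketched in plain Scala. This is a generic illustration only, not the DSL described below; the `TwoStateChain` object, its state representation, and the sampling helper are made up for the example:

```scala
object TwoStateChain {
  // Transition probabilities from the example: A -> A 0.6, A -> E 0.4;
  // E -> A 0.7, E -> E 0.3.
  val transitions: Map[Char, List[(Char, Double)]] = Map(
    'A' -> List(('A', 0.6), ('E', 0.4)),
    'E' -> List(('A', 0.7), ('E', 0.3))
  )

  // Sample the next state: draw r in [0, 1) and walk the weighted outcomes,
  // subtracting each weight until r goes negative.
  def step(state: Char, rng: scala.util.Random): Char = {
    var r = rng.nextDouble()
    transitions(state)
      .find { case (_, p) => r -= p; r < 0 }
      .map(_._1)
      .getOrElse(transitions(state).last._1)
  }

  def main(args: Array[String]): Unit = {
    val rng = new scala.util.Random(1) // fixed seed for repeatability
    val walk = Iterator.iterate('A')(step(_, rng)).take(10).mkString
    println(walk) // a 10-step random walk over A and E
  }
}
```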
Markov chains are useful for generating stochastic musical streams with weighted values. For example, we could model chord changes as arrows between chords (`I => V`) and weight them depending on which changes should happen more frequently.
Markov syntax is enabled by pulling in the `markov._` module, and we can create new chains using the `MarkovChain` class. The arguments to `MarkovChain` are the source, the destination, and the probability as a value between `0.0` and `1.0`. Keep in mind that the source and destination of a `MarkovChain` need to be of the same type; for example, it would not make sense for a `PitchClass` to change into a `Beat`.
```scala
import rc.dsl.gen.markov._
val chain = MarkovChain(C, G, 0.5) // C -> G, 50% chance
```
This module also adds a sugary version of chain generation:
```scala
val chain2 = C.markov(G)(0.5)
val chain3 = "Hello".markov("World!")(1.0)
```
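For the curious, sugar like this is typically built with an implicit class in Scala. The sketch below is only a guess at the shape of such an extension; the `Chain` case class and the `MarkovSyntax` and `MarkovOps` names are invented for illustration and are not the module's actual definitions:

```scala
object MarkovSyntax {
  // Hypothetical stand-in for the DSL's chain type.
  final case class Chain[A](from: A, to: A, p: Double)

  // An implicit value class adds `.markov` to any value without boxing.
  implicit class MarkovOps[A](private val from: A) extends AnyVal {
    def markov(to: A)(p: Double): Chain[A] = Chain(from, to, p)
  }
}

import MarkovSyntax._
val c = "Hello".markov("World!")(1.0) // Chain("Hello", "World!", 1.0)
```

Because the type parameter is inferred from the receiver, the same sugar works for pitch classes, strings, or any other value.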
To get values out of Markov chains, we put them into a "universe" wrapper with an initial state. This will give us a stream of values generated by the chains:
```scala
val universe = MarkovUniverse(C, MarkovChain(C, G, 1.0), MarkovChain(G, C, 1.0))
universe.stream take 10 foreach println
```
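Under the hood, a universe-style stream amounts to repeatedly sampling a transition from the current state. The sketch below is a plain-Scala approximation, not the real `MarkovUniverse` implementation; the `Chain` and `Universe` names are invented here:

```scala
// Hypothetical stand-ins for the DSL's chain and universe types.
final case class Chain[A](from: A, to: A, p: Double)

final class Universe[A](initial: A, chains: Chain[A]*) {
  private val byState: Map[A, Seq[Chain[A]]] = chains.groupBy(_.from)
  private val rng = new scala.util.Random()

  // Sample an outgoing chain from `state`, weighted by its probability.
  private def next(state: A): A = {
    var r = rng.nextDouble()
    byState(state)
      .find { c => r -= c.p; r < 0 }
      .map(_.to)
      .getOrElse(state)
  }

  def stream: LazyList[A] = LazyList.iterate(initial)(next)
}

val universe = new Universe("C", Chain("C", "G", 1.0), Chain("G", "C", 1.0))
universe.stream.take(4).toList // List(C, G, C, G): both chains are certain
```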
Advice for non-Scala folks:
It would make sense that you might have a list of chains to pass into `MarkovUniverse`, rather than defining all of them inline when you define the universe. This requires a varargs expansion (`: _*`), as seen below:
```scala
val chains = List(MarkovChain(C, G, 0.5), MarkovChain(C, C, 0.5), MarkovChain(G, C, 1.0))
val universe = MarkovUniverse(C, chains:_*)
```
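The `: _*` ascription is standard Scala, not something specific to this library: it expands a sequence into a varargs argument list. A minimal illustration with an ordinary varargs method (the `largest` function here is made up for the example):

```scala
// A varargs method: `xs` is received as a Seq[Int].
def largest(xs: Int*): Int = xs.max

val nums = List(3, 1, 4)
largest(nums: _*) // expands to largest(3, 1, 4) and returns 4
```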