Coupled Oscillators Systems and Auditory Percepts of Synchrony

*Read the paper: Phase Coherence as a Measure of Perceptual Synchrony*

Coupled-oscillator models are useful for describing synchronous behaviors found in a broad array of biological and chemical systems, including firefly synchronization, pacemaker-cell interactions, and circadian rhythms.

Within music cognition, much research has drawn on this class of dynamical systems to describe the auditory processing involved in detecting rhythmic periodicity. Neural Resonance Theory and Dynamic Attending Theory assert that neural rhythms account for the interplay between attentional coordination and the stimuli that make up our external phenomenological world. Both theories use a variety of coupled-oscillator models to describe the complex interactions among attention, temporal expectancy, and sensory-motor coordination. From these oscillatory interactions, percepts of pulse and meter arise in the context of complex musical rhythms.

I’ve been interested in applying the strange synchronistic behaviors of coupled oscillators to sound and music, namely as a generative, controllable compositional device. From an auditory perspective, I’m interested in the perceptual coherence of sound mass, timbre, and rhythmic regularity, and in synchronization as an auditory percept. A few guiding questions:


  • At what point does an overall timbre cohere into a composite sound mass that may induce a pulse percept or entrain a rhythm?
  • To what extent does a listener perceive a ‘continuity of synchrony’ over time, as compared to the phase-coherence parameter (see below) inherent in the model?
  • How does this subjective measure of ‘synchrony’ change with respect to parameterization, such as the number of sounding events, periodic frequency, etc.?

I wrote a collective-synchronization program in ChucK that allows for the instantiation of different networks of phase- and frequency-coupled oscillators. I then interfaced this program with a Raspberry Pi to create a motor-syncing device that uses several prepared, event-triggered DC motors. What follows is background on the dynamical system and an outline of one implementation of this interface in a physical system.

Kuramoto Model

The Kuramoto Model describes a non-linear dynamical system of phase-coupled oscillators.

System behavior is ultimately a function of:

  1. Limit-cycle oscillators whose initial frequencies are drawn from a Gaussian distribution
  2. A coupling strength, K, that determines whether the system can change phase state (e.g. phase transitions, bifurcations, etc.)

Conceptually, this is a “swarm of points” moving around a circle at different speeds (Strogatz 2000):

$$ \dot{\theta}_i = \omega_i + \frac{K}{N} \sum_{j=1}^{N} \sin(\theta_j - \theta_i), \qquad i = 1, \ldots, N \tag{1} $$
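As a concrete illustration (a minimal Python sketch with made-up parameters, not the ChucK implementation described below), the model can be integrated with a simple forward-Euler step and the resulting phase coherence read off at the end:

```python
import math
import random

def simulate_kuramoto(n=30, coupling=2.0, dt=0.02, steps=1500, seed=1):
    """Forward-Euler integration of the Kuramoto model.

    Natural frequencies are drawn from a Gaussian, as in the system
    described here. Returns the final phase-coherence magnitude r
    (0 = fully incoherent, 1 = fully synchronized).
    """
    rng = random.Random(seed)
    omega = [rng.gauss(0.0, 0.5) for _ in range(n)]            # natural frequencies
    theta = [rng.uniform(0.0, 2 * math.pi) for _ in range(n)]  # initial phases
    for _ in range(steps):
        # d(theta_i)/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)
        dtheta = [omega[i] + coupling / n *
                  sum(math.sin(theta[j] - theta[i]) for j in range(n))
                  for i in range(n)]
        theta = [(t + d * dt) % (2 * math.pi) for t, d in zip(theta, dtheta)]
    # phase coherence r = |(1/N) * sum_j exp(i * theta_j)|
    re = sum(math.cos(t) for t in theta) / n
    im = sum(math.sin(t) for t in theta) / n
    return math.hypot(re, im)
```

With the coupling well above the critical value, the final r approaches 1; with K = 0 the phases drift independently and r hovers near 1/√N.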


Phase Coherence and Synchrony

The equation above can be rewritten in terms of a complex order parameter, r(t), also called the phase coherence:

$$ r e^{i\psi} = \frac{1}{N} \sum_{j=1}^{N} e^{i\theta_j} $$

This can be thought of as the “collective rhythm” produced by the entire oscillator population. In terms of our swarming points, $ r $ is the length of the radius vector pointing at the swarm’s centroid, and that vector rotates with the average phase $ \psi $.


r(t) is an indication of phase coherence and synchrony (for more details on this connection, see my paper: Phase Coherence as a Measure of Perceptual Synchrony in Coupled-Oscillator Systems).  $ \psi\, (t) $ is the group’s average phase; its time derivative gives the group’s average velocity.
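For illustration, here is a small helper (my own, not part of the ChucK program) that reads r and ψ off a snapshot of oscillator phases:

```python
import cmath

def order_parameter(phases):
    """Return (r, psi): the magnitude and angle of the centroid of the
    oscillators' unit phasors on the complex plane."""
    z = sum(cmath.exp(1j * p) for p in phases) / len(phases)
    return abs(z), cmath.phase(z)
```

Identical phases give r = 1, while phases spread evenly around the circle give r ≈ 0.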

Some basic properties of this system are illustrated in this video.


Self-Syncing Motor Program Overview

This hardware interface uses the coupled-oscillator model to drive several prepared DC motors that are event-triggered in concert with the system.

This flow chart shows a high-level view of the program → hardware signal path.



My ChucK program uses OSC messaging to deliver the phase positions of each oscillator, along with the complex order parameter, to a Python listener script (on a Raspberry Pi) that parses the messages. Because of the complex timing routines needed to service these independent signals, the message handlers are multi-threaded using Python’s `thread` module. The resulting control signals are transmitted over I2C (via the GPIO pins) to a 16-channel PWM driver (PCA9685). This driver is actually designed for controlling LED arrays, so it can’t sink enough current to drive the 12 V DC motors I’m trying to control; its PWM outputs are therefore sent to simple H-bridge drivers (TB6612) that can easily drive the small DC motors. This program flow is a little strange in that it uses ChucK as the controller (in an MVC sense) even though ChucK isn’t producing any output audio (although it can).

The ChucK program is long and slightly arduous, so I won’t detail it here, but in the end it implements the basic phase-coupled model highlighted above (see Equation 1). The user has control over the coupled-oscillator system’s instantiation: the number of oscillators, the initial frequency distribution (I use a Gaussian), the phase-coupling matrix, the frequency-coupling matrix, and the sound source applied to each oscillator. In this physical implementation I’m not using the audio generated by the program; ChucK just provides control signals in the form of OSC messages.

The OSC messages are passed as an array (up to 200 oscillators per instantiated system) to the RPi, which runs an OSC-listening script. Here are a couple of code snippets. The first sets up an OSC listener on the server that spawns separate execution threads to control the motors.

#!/usr/bin/env python
import thread
import RPi.GPIO as GPIO
import OSC

# ... a bunch of instantiation stuff ...
# (assumes the GPIO pins have been initialized and set as outputs)
delay_time = 0.1  # how long each triggered motor stays on, in seconds

receive_address = ('', 8888)  # listen on port 8888
# initialize OSC server
s = OSC.OSCServer(receive_address)  # basic
s.addDefaultHandlers()  # for unmatched messages

# define a msg handler function for the server to call
def printing_handler(addr, tags, phases, source):
    for x, num in enumerate(phases):  # go through the OSC phase array from ChucK
        if num == 0.0:  # ChucK provides the cycle-limiting function, so each oscillator's phase angle is 0.0 at the start of every cycle
            print 'oscillator %r bang!' % (x)  # print when a cycle starts over
            # call triggerMotor() in its own thread upon each cycle
            thread.start_new_thread(triggerMotor, ("thread-" + str(x), delay_time, x))

s.addMsgHandler("/phases", printing_handler)  # address must match the ChucK sender

For the example provided, the PWM clocks are running at a base frequency of 60 Hz.

Here’s the simple triggerMotor() function that makes calls to the I2C PWM driver and gets called in separate threads by the OSC-listener function above. It uses a nice little library the folks at Adafruit put together to streamline communication with the PWM driver. Among other things, this assumes the GPIO pins are initialized and set up as outputs; I use them to toggle the polarity of the H-bridge motor drivers (not shown in the code). The triggerMotor() function simply turns the PWM channel on to some value (0–4095), which determines how quickly the motor spins (it actually adjusts the duty cycle, which is effectively integrated by the DC motor itself). In the end, this value was adjusted by trial and error.

import Adafruit_PCA9685
import time

# ... a bunch of other setup ...
servo_ON_value = 2048  # PWM on-value (0-4095); tuned by trial and error

# initialize the PCA9685 using the default address (0x40)
pwm = Adafruit_PCA9685.PCA9685()

# triggerMotor() takes a thread name, the length of the delay in seconds,
# and the oscillator number (PWM channel) that will get triggered
def triggerMotor(threadName, delay, oscillator_num):
    pwm.set_pwm(oscillator_num, 0, servo_ON_value)  # turn on motor
    time.sleep(delay)  # stay on for a while
    pwm.set_pwm(oscillator_num, 0, 0)  # turn off motor
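To make the “trial and error” tuning concrete, here is a small helper of the kind I ended up with; the function name and the deadband value are hypothetical, but it sketches how a 0–1 speed fraction might be mapped onto the PCA9685’s 12-bit on-value:

```python
def speed_to_ticks(fraction, deadband=0.15):
    """Map a 0.0-1.0 speed fraction onto the PCA9685's 12-bit on-value (0-4095).

    Small DC motors stall below some duty cycle, so anything under the
    deadband is clamped to zero.  The deadband here is a placeholder; in
    practice the usable range was found by trial and error.
    """
    fraction = max(0.0, min(1.0, fraction))
    if fraction < deadband:
        return 0
    return int(round(fraction * 4095))
```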

You can view the entire code along with the ChucK coupled oscillator program here:


Future Work

I’m particularly interested in scaling this physical system up to include a massive array of sync-able kinetic-sonic objects. Because the PWM driver is connected over I2C, it is chainable, such that it could technically drive 992 separate outputs. The main limitations at that scale would probably lie on the ChucK + Raspberry Pi side of things. Also, this implementation only looked at phase-coupled oscillator systems; adding another state variable, frequency, into the mix would allow for far more unusual phase states. For instance, I’m interested in exploring the topology of the system’s mode-locking states. By virtue of their non-linearity, these systems can be induced/entrained into various states (bifurcations) by modifying their parameters over time. See the regime diagram below.
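The 992 figure comes from the PCA9685’s addressing: up to 62 boards can share the I2C bus, each with 16 channels. A tiny (hypothetical) helper makes the mapping from a global oscillator index to a (board, channel) pair explicit:

```python
def channel_for(oscillator_num, channels_per_board=16):
    """Map a global oscillator index onto a chained PCA9685 bus as a
    (board, channel) pair.  With the 62 usable I2C addresses, that gives
    62 * 16 = 992 independent PWM outputs."""
    return divmod(oscillator_num, channels_per_board)
```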


Simply put, this diagram represents the dynamics of a simple two-oscillator system, where one oscillator is the driver and the other is driven. The dark zones represent mode-locked states (also called attractors or resonances) that the system’s oscillators become entrained to. As the number of oscillators increases, the map shows a marked increase in complexity. This means the system can be pushed in and out of different behavioral states by changing variables of the underlying system (namely coupling strength). I’m interested in exploring to what extent these system states can induce interesting acoustic outcomes by navigating this phase-state topology.
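As a toy version of this regime diagram (a sketch under simplifying assumptions, not the model behind the figure), a single driven phase oscillator, dθ/dt = ω + K sin(φ_drive − θ), already exhibits 1:1 mode-locking: when the coupling exceeds the detuning, the oscillator’s average frequency snaps to the driver’s.

```python
import math

def winding_ratio(omega, drive_freq=1.0, coupling=0.0, dt=0.001, steps=200000):
    """Average frequency of a driven phase oscillator, relative to its driver.

    Integrates d(theta)/dt = omega + K * sin(phi - theta), where the drive
    phase is phi = drive_freq * t.  A return value of 1.0 means the
    oscillator has mode-locked 1:1 to the driver.
    """
    theta = 0.0
    for k in range(steps):
        phi = drive_freq * k * dt  # driver advances at a fixed rate
        theta += (omega + coupling * math.sin(phi - theta)) * dt
    return (theta / (steps * dt)) / drive_freq
```

With K = 0 the winding ratio is simply ω/Ω; with K larger than the detuning it locks to 1.0, which amounts to one horizontal slice through the 1:1 tongue of the diagram.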

More broadly, I imagine there are many interesting possibilities for this type of generative, neural-network-based system. What interests me about it is how it manages to create hypnotic sounds from the simple periodic nature of its constituent oscillators. Generative music can often sound dispassionate, lifeless, repetitive, and undynamic. Built into these systems, by contrast, are heuristics of persistence and learning (hence their use in neural networks). The fact that they have a mind of their own gives them the potential for surprise, and their ability to be controlled, albeit limitedly, creates a feedback loop between ‘performer’ and ‘machine’ that I’ve found to be ripe with aesthetic potential. The gestalt of ‘coming together’ is such a basic way to orient our awareness of time-varying sound, and an elegant way to explore the dynamic polyphony of collectively sounding events.




Large, E. W., Herrera, J. A., & Velasco, M. J. (2015). Neural Networks for Beat Perception in Musical Rhythm. Frontiers in Systems Neuroscience, 9, 1–14. doi: 10.3389/fnsys.2015.00159

Large, E. W., & Kolen, J. F. (1994). Resonance and the Perception of Musical Meter. Connection Science, 6, 177–208.

Jones, M. R. (1976). Time, Our Lost Dimension: Toward a New Theory of Perception, Attention, and Memory. Psychological Review, 83, 323–355.

Large, E. W., & Jones, M. R. (1999). The Dynamics of Attending: How People Track Time-Varying Events. Psychological Review, 106, 119–159.

Povel, D. J., & Essens, P. (1985). Perception of Temporal Patterns. Music Perception, 2, 411–440.

Strogatz, S. H. (2000). From Kuramoto to Crawford: Exploring the Onset of Synchronization in Populations of Coupled Oscillators. Physica D, 143, 1–20.

Strogatz, S. H., & Stewart, I. (1993). Coupled Oscillators and Biological Synchronization. Scientific American, December 1993, 102–109.