An Australian PhD researcher from Queensland University of Technology (QUT) has created a jamming robot that can improvise rhythms and melodies in response to the music it hears. Jambot, as it is named, is a musical robot, but it's not made of metal. Here is how the researcher describes it: 'It is a piece of computer software that could play along to a soloist, providing a backing track, or improvising a conga drum solo over recorded music being played by a DJ, for example.' Jambot is still at the prototype stage, and the computer scientist plans to commercialize it when he completes his PhD next year. But read more...
You can see above a photograph of PhD researcher Toby Gifford, from QUT's Faculty of Creative Industries. (Credit: QUT) Gifford belongs to the Computational Arts Research Group, led by Associate Professor Andrew Brown. He was named a runner-up in the Ideas Competition run by QUT's commercialization company bluebox, which won him $7,500 to help commercialize his concept. You'll find the other winners in this press release (PDF format, 2 pages, 592 KB, September 22, 2008).
Here are some additional quotes from the QUT news article. 'It could also allow a live band to have a fuller sound when they can't have as many band members as they would like on stage.' Mr Gifford said Jambot utilised some recent discoveries in audio music perception, combined with his own clever algorithms, to keep up with musicians as they played. 'Under the hood, I have created new algorithms which quickly identify what's going on in the music as it's being played,' he said. 'It keeps track of tempo changes as they occur during a performance. It also judges pitch, and users will be able to choose the particular instruments they wish to jam with.'
There are not many technical details available about this software. However, Gifford and Brown co-authored Listening for noise: An approach to percussive onset detection (PDF format, 5 pages, 659 KB), a paper presented at the Australasian Computer Music Conference held in Sydney on July 10-12, 2008. You'll find more information about the event on the Australasian Computer Music Association website.
Here is the abstract of this paper. "This paper introduces a new onset detection algorithm for the extraction of percussive attack times from a musical audio signal. The crux of the technique is to search for patterns of increasing noise in the signal. We therefore refer to it as the Stochastic Onset Detection (SOD) technique. This technique is designed for use with complex audio signals consisting of both pitched and percussive instrumental sounds together, and aims to report solely on the timing of percussive attacks. In contrast to most onset detection algorithms it operates in the time domain and is very efficient; suiting our requirements for real-time detection. In this paper we describe our approach to onset detection, compare this with other approaches, outline our detection algorithm and provide preliminary results from musical trials to validate the algorithm’s effectiveness."
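For readers curious what "searching for patterns of increasing noise in the time domain" can look like in practice, here is a minimal Python sketch. To be clear, this is not the authors' SOD algorithm -- the paper is the place to go for that. This is my own simplified illustration of the general idea: measure short-term "noisiness" (here, the energy of the first-difference signal, which favours broadband percussive content over steady pitched tones) and flag frames where it jumps well above a running baseline. The frame sizes, threshold factor and refractory gap are arbitrary choices, not values from the paper.

```python
import numpy as np

def detect_noise_onsets(signal, sr, frame=512, hop=256, k=3.0, min_gap=0.1):
    """Flag times where short-term 'noisiness' jumps above a running baseline.

    Noisiness is measured as the mean energy of the first-difference signal,
    which emphasises broadband (noise-like) content over steady pitched tones.
    Returns a list of onset times in seconds.
    """
    diff = np.diff(signal, prepend=signal[0])  # crude high-pass filter
    energies, onsets = [], []
    last_onset = -1.0
    for start in range(0, len(diff) - frame, hop):
        e = float(np.mean(diff[start:start + frame] ** 2))
        if len(energies) >= 4:  # wait until a baseline exists
            baseline = float(np.median(energies[-16:]))
            t = start / sr
            # Fire when energy jumps k-fold above the baseline, with a
            # refractory gap so one attack is not reported several times.
            if e > k * (baseline + 1e-12) and t - last_onset > min_gap:
                onsets.append(t)
                last_onset = t
        energies.append(e)
    return onsets

# Synthetic demo: a steady 220 Hz tone with decaying noise bursts
# (stand-ins for percussive hits) at 0.5 s and 1.5 s.
sr = 8000
t = np.arange(2 * sr) / sr
sig = 0.3 * np.sin(2 * np.pi * 220 * t)
rng = np.random.default_rng(0)
for when in (0.5, 1.5):
    i = int(when * sr)
    sig[i:i + 400] += rng.normal(0, 0.5, 400) * np.exp(-np.arange(400) / 120)

print(detect_noise_onsets(sig, sr))  # two onsets, near 0.5 s and 1.5 s
```

Because it is purely time-domain and touches each sample only once, a detector of this shape is cheap enough for real-time use, which is the efficiency argument the abstract makes for operating outside the frequency domain.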
And here is an excerpt from the conclusions. "We have developed an algorithm based on this approach and provided preliminary test results that indicate that it is efficient and effective. The algorithm seems to be particularly good at detecting high pitched percussive sounds such as high-hats, which could be useful for tempo tracking of dance/rock tracks as the high-hat is often used to keep the pulse. [...] We have plans to undertake future developments of this approach that include the addition of predictive assistance based on regularities and psychoacoustic models of expectation that we anticipate will particularly allow for variations in transient attack rates and allow the algorithm to have more sense of syncopated or irregular rhythms."
Sources: Queensland University of Technology News, October 22, 2008; and various websites
You'll find related stories by following the links below.