2005:Audio Drum Detection Results
Goal: To detect the occurrences of drum events in polyphonic audio.
Dataset: At least 50 files of both live and sequenced music, spanning many genres and containing varying amounts of drum audio. Three collections of music were used: Christian Dittmar (CD), Koen Tanghe (KT) and Masataka Goto (MG). Participants were evaluated against music from each individual collection, and the three collection scores were then averaged to produce a composite score. Scores are reported per drum class: bass drum (BD), hi-hat (HH) and snare drum (SD).
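
The onset columns in the tables are related in the usual way: the Total Overall Onset F-measure is the harmonic mean of the listed precision and recall, F = 2PR/(P+R); for example, P = 63.30% and R = 71.19% give F ≈ 0.670, consistent with the table. The sketch below illustrates this kind of per-class onset scoring; the greedy matching strategy and the 50 ms tolerance window are illustrative assumptions, not the official MIREX 2005 evaluation procedure.

```python
# Minimal sketch of onset-based scoring, assuming a greedy one-to-one
# match between detected and reference onsets within a fixed tolerance
# window. The 50 ms tolerance is an illustrative assumption, not the
# official MIREX 2005 setting.

def onset_scores(reference, detected, tolerance=0.05):
    """Return (precision, recall, F-measure) for one drum class in one file.

    `reference` and `detected` are lists of onset times in seconds.
    """
    used = [False] * len(reference)
    true_positives = 0
    for d in detected:
        # Match this detection to the closest unused reference onset
        # that lies within the tolerance window.
        best = None
        for k, r in enumerate(reference):
            if not used[k] and abs(r - d) <= tolerance:
                if best is None or abs(r - d) < abs(reference[best] - d):
                    best = k
        if best is not None:
            used[best] = True
            true_positives += 1

    precision = true_positives / len(detected) if detected else 0.0
    recall = true_positives / len(reference) if reference else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
    return precision, recall, f_measure


if __name__ == "__main__":
    # The overall onset F-measure is the harmonic mean of precision and
    # recall, e.g. P = 63.30%, R = 71.19% from the OVERALL table:
    p, r = 0.6330, 0.7119
    print(f"{2 * p * r / (p + r):.3f}")  # -> 0.670

    # Toy example with made-up onset times (seconds):
    print(onset_scores(reference=[0.50, 1.00, 1.50, 2.00],
                       detected=[0.52, 1.04, 1.90, 2.01]))
```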
OVERALL

| Rank | Participant | Total Average Classification F-measure | Total Overall Onset Precision | Total Overall Onset Recall | Total Overall Onset F-measure | BD Average F-measure | HH Average F-measure | SD Average F-measure | Runtime (s) | Machine |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | Yoshii, Goto, & Okuno | 0.670 | 64.92% | 67.02% | 0.659 | 0.728 | 0.574 | 0.702 | 8534 | B 0 |
| 2 | Tanghe, Degroeve, & De Baets 3 | 0.611 | 63.30% | 71.19% | 0.670 | 0.688 | 0.601 | 0.555 | 1337 | Y |
| 3 | Tanghe, Degroeve, & De Baets 4 | 0.609 | 62.57% | 71.09% | 0.666 | 0.686 | 0.590 | 0.562 | 1342 | Y |
| 4 | Tanghe, Degroeve, & De Baets 1 | 0.599 | 60.02% | 72.45% | 0.657 | 0.677 | 0.588 | 0.542 | 1350 | Y |
| 5 | Dittmar, C. | 0.588 | 65.68% | 63.38% | 0.645 | 0.606 | 0.585 | 0.581 | 673 | R |
| 6 | Paulus, J. | 0.499 | 59.61% | 64.86% | 0.621 | 0.527 | 0.587 | 0.430 | 1137 | L |
| 7 | Gillet & Richard 2 | 0.443 | 77.09% | 40.63% | 0.532 | 0.598 | 0.334 | 0.428 | 21248 | F |
| 8 | Gillet & Richard 1 | 0.391 | 69.84% | 37.98% | 0.492 | 0.533 | 0.343 | 0.317 | 21997 | F |
CHRISTIAN DITTMAR COLLECTION

| Rank | Participant | Total Average Classification F-measure | Total Overall Onset Precision | Total Overall Onset Recall | Total Overall Onset F-measure | BD Average F-measure | HH Average F-measure | SD Average F-measure |
|---|---|---|---|---|---|---|---|---|
| 1 | Dittmar, C. | 0.753 | 77.73% | 72.56% | 0.751 | 0.783 | 0.696 | 0.790 |
| 2 | Yoshii, Goto, & Okuno | 0.690 | 64.25% | 62.75% | 0.660 | 0.714 | 0.533 | 0.811 |
| 3 | Tanghe, Degroeve, & De Baets 3 | 0.595 | 61.85% | 64.85% | 0.633 | 0.685 | 0.568 | 0.548 |
| 4 | Tanghe, Degroeve, & De Baets 4 | 0.589 | 62.45% | 64.22% | 0.628 | 0.668 | 0.555 | 0.559 |
| 5 | Tanghe, Degroeve, & De Baets 1 | 0.580 | 57.78% | 65.94% | 0.616 | 0.669 | 0.553 | 0.533 |
| 6 | Paulus, J. | 0.440 | 55.82% | 54.36% | 0.551 | 0.430 | 0.497 | 0.424 |
| 7 | Gillet & Richard 2 | 0.401 | 77.22% | 30.16% | 0.434 | 0.658 | 0.156 | 0.364 |
| 8 | Gillet & Richard 1 | 0.339 | 66.33% | 30.57% | 0.418 | 0.466 | 0.279 | 0.265 |
KOEN TANGHE COLLECTION

| Rank | Participant | Total Average Classification F-measure | Total Overall Onset Precision | Total Overall Onset Recall | Total Overall Onset F-measure | BD Average F-measure | HH Average F-measure | SD Average F-measure | Runtime (s) | Machine |
|---|---|---|---|---|---|---|---|---|---|---|
MASATAKA GOTO COLLECTION (50 songs from RWC Music Database: RWC-MDB-P-2001)

| Rank | Participant | Total Average Classification F-measure | Total Overall Onset Precision | Total Overall Onset Recall | Total Overall Onset F-measure | BD Average F-measure | HH Average F-measure | SD Average F-measure | Runtime (s) | Machine |
|---|---|---|---|---|---|---|---|---|---|---|