2017:Multiple Fundamental Frequency Estimation & Tracking Results - MIREX Dataset
Introduction
These are the results for the 2017 running of the Multiple Fundamental Frequency Estimation and Tracking task on the MIREX dataset. For background information about this task, please refer to the 2017:Multiple Fundamental Frequency Estimation & Tracking page.
General Legend
Sub code | Submission name | Contributors | |
---|---|---|---|
CB1 | Silvet | Chris Cannam, Emmanouil Benetos | |
CB2 | Silvet Live | Chris Cannam, Emmanouil Benetos | |
KD1 | multiF0_sampled | Karin Dressler | |
KD2 | multiF0_midi | Karin Dressler | |
MHMTM1 | End-to-End Multi-instrumental ConvNet | Gaku Hatanaka, Shinjiro Mita, Alexis Meneses, Daiki Miura, Nattapong Thammasan | |
MHMTM2 | Ensemble category ConvNet to F0 ConvNet | Gaku Hatanaka, Shinjiro Mita, Alexis Meneses, Daiki Miura, Nattapong Thammasan | |
PR1 | LPCR | Leonid Pogorelyuk, Clarence Rowley | |
PRGR1 | SOT MFFE&T 901 | Katarzyna Rokicka, Adam Pluta, Rafal Rokicki, Marcin Gawrysz | |
PRGR2 | SOT MFFE&T 902 | Katarzyna Rokicka, Adam Pluta, Rafal Rokicki, Marcin Gawrysz | |
THK1 | Spectral Convolutions | John Thickstun, Zaid Harchaoui, Dean Foster, Sham Kakade | |
WCS1 | Piano_Transcription | Li Su, Derek Wu, Berlin Chen | |
ZCY2 | Multiple pitch estimation | Fuliang Yin, Weiwei Zhang, Zhe Chen | |
CT1 | convlstm | Carl Thomé | |
SL1 | samuel-li-onset-detector | Samuel Li |
Task 1: Multiple Fundamental Frequency Estimation (MF0E)
MF0E Overall Summary Results
Below are the average scores across 40 test files. The files come from three sources: a woodwind quintet recording of bassoon, clarinet, horn, flute, and oboe (UIUC); rendered MIDI from the RWC database, donated by IRCAM; and a quartet recording of bassoon, clarinet, violin, and sax donated by Dr. Bryan Pardo's Interactive Audio Lab (IAL). 20 files come from 5 sections of the woodwind recording, each section contributing 4 files ranging from 2 to 5 polyphony. 12 files come from IAL, drawn from 4 different songs ranging from 2 to 4 polyphony, and 8 files come from 2 different songs of the synthesized RWC MIDI, ranging from 2 to 5 polyphony.
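The tables below report frame-level multi-F0 scores. As a rough sketch of how such columns relate to each other, the following assumes the standard Poliner & Ellis-style frame-level definitions (the actual MIREX evaluation code may differ in details such as frame hop and pitch tolerance):

```python
# Sketch of frame-level multi-F0 error metrics, assuming Poliner & Ellis-style
# definitions: per frame, count reference pitches, estimated pitches, and
# correct matches, then normalize error counts by the total reference count.

def frame_errors(ref_frames, est_frames):
    """ref_frames, est_frames: lists of per-frame sets of MIDI pitches.
    Assumes at least one reference pitch overall."""
    n_ref = n_sys = n_corr = subs = miss = fa = 0
    for ref, est in zip(ref_frames, est_frames):
        nr, ns = len(ref), len(est)
        nc = len(ref & est)           # correctly returned pitches this frame
        n_ref += nr
        n_sys += ns
        n_corr += nc
        subs += min(nr, ns) - nc      # wrong pitch where one was expected
        miss += max(0, nr - ns)       # reference pitches with no estimate
        fa += max(0, ns - nr)         # extra estimates beyond the reference
    precision = n_corr / n_sys if n_sys else 0.0
    recall = n_corr / n_ref
    accuracy = n_corr / (n_ref + n_sys - n_corr)   # TP / (TP + FP + FN)
    e_subs, e_miss, e_fa = subs / n_ref, miss / n_ref, fa / n_ref
    e_tot = e_subs + e_miss + e_fa
    return precision, recall, accuracy, e_tot, e_subs, e_miss, e_fa
```

Under these definitions Etot decomposes exactly into Esubs + Emiss + Efa, which is consistent with the rows below.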
Detailed Results
Submission | Precision | Recall | Accuracy | Etot | Esubs | Emiss | Efa | |
---|---|---|---|---|---|---|---|---|
CB1.results.task1 | 0.804 | 0.519 | 0.498 | 0.529 | 0.093 | 0.389 | 0.047 | |
CB2.results.task1 | 0.656 | 0.460 | 0.420 | 0.636 | 0.174 | 0.367 | 0.095 | |
KD1.results.task1 | 0.724 | 0.811 | 0.669 | 0.419 | 0.104 | 0.085 | 0.231 | |
KD2.results.task1 | 0.724 | 0.811 | 0.669 | 0.419 | 0.104 | 0.085 | 0.231 | |
MHMTM1.results.task1 | 0.727 | 0.782 | 0.655 | 0.441 | 0.119 | 0.099 | 0.223 | |
MHMTM2.results.task1 | 0.391 | 0.200 | 0.190 | 0.860 | 0.288 | 0.512 | 0.060 | |
PR1.results.task1 | 0.487 | 0.584 | 0.418 | 0.926 | 0.237 | 0.178 | 0.511 | |
PRGR1.results.task1 | 0.603 | 0.443 | 0.408 | 0.646 | 0.225 | 0.332 | 0.089 | |
PRGR2.results.task1 | 0.514 | 0.681 | 0.476 | 0.781 | 0.216 | 0.103 | 0.462 | |
THK1.results.task1 | 0.822 | 0.789 | 0.720 | 0.316 | 0.076 | 0.136 | 0.104 | |
WCS1.results.task1 | 0.640 | 0.806 | 0.593 | 0.569 | 0.101 | 0.094 | 0.375 | |
ZCY2.results.task1 | 0.627 | 0.562 | 0.506 | 0.601 | 0.236 | 0.203 | 0.163 |
Detailed Chroma Results
Here, accuracy is assessed on chroma results (i.e., all F0s are mapped to a single octave before evaluation).
Submission | Precision | Recall | Accuracy | Etot | Esubs | Emiss | Efa | |
---|---|---|---|---|---|---|---|---|
CB1.results.task1 | 0.851 | 0.551 | 0.527 | 0.497 | 0.062 | 0.389 | 0.047 | |
CB2.results.task1 | 0.747 | 0.527 | 0.479 | 0.568 | 0.106 | 0.367 | 0.095 | |
KD1.results.task1 | 0.751 | 0.842 | 0.694 | 0.389 | 0.074 | 0.085 | 0.231 | |
KD2.results.task1 | 0.751 | 0.842 | 0.694 | 0.389 | 0.074 | 0.085 | 0.231 | |
MHMTM1.results.task1 | 0.758 | 0.817 | 0.682 | 0.406 | 0.084 | 0.099 | 0.223 | |
MHMTM2.results.task1 | 0.602 | 0.314 | 0.295 | 0.746 | 0.174 | 0.512 | 0.060 | |
PR1.results.task1 | 0.569 | 0.689 | 0.489 | 0.822 | 0.132 | 0.178 | 0.511 | |
PRGR1.results.task1 | 0.712 | 0.522 | 0.479 | 0.567 | 0.146 | 0.332 | 0.089 | |
PRGR2.results.task1 | 0.595 | 0.783 | 0.547 | 0.680 | 0.115 | 0.103 | 0.462 | |
THK1.results.task1 | 0.833 | 0.800 | 0.730 | 0.304 | 0.064 | 0.136 | 0.104 | |
WCS1.results.task1 | 0.660 | 0.831 | 0.612 | 0.544 | 0.075 | 0.094 | 0.375 | |
ZCY2.results.task1 | 0.740 | 0.670 | 0.600 | 0.493 | 0.128 | 0.203 | 0.163 |
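The octave folding used for the chroma scores can be sketched as follows. The helper names and the quarter-tone tolerance are illustrative assumptions based on the task description, not the official evaluation code:

```python
import math

def hz_to_midi(f_hz):
    """Convert a frequency in Hz to a (fractional) MIDI pitch number."""
    return 69.0 + 12.0 * math.log2(f_hz / 440.0)

def chroma_match(f_est, f_ref, tol_semitones=0.5):
    """True if the estimate matches the reference after folding both pitches
    into a single octave (pitch class), within a quarter tone (0.5 semitones)."""
    pc_est = hz_to_midi(f_est) % 12.0
    pc_ref = hz_to_midi(f_ref) % 12.0
    d = abs(pc_est - pc_ref)
    return min(d, 12.0 - d) <= tol_semitones  # distance wraps around the octave
```

Under this folding an octave error (e.g. 880 Hz against a 440 Hz reference) counts as correct, which is why chroma scores are uniformly at least as high as the exact-pitch scores above.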
Individual Results Files for Task 1
CB1= Chris Cannam, Emmanouil Benetos
CB2= Chris Cannam, Emmanouil Benetos
KD1= Karin Dressler
KD2= Karin Dressler
MHMTM1= Gaku Hatanaka, Shinjiro Mita, Alexis Meneses, Daiki Miura, Nattapong Thammasan
MHMTM2= Gaku Hatanaka, Shinjiro Mita, Alexis Meneses, Daiki Miura, Nattapong Thammasan
PR1= Leonid Pogorelyuk, Clarence Rowley
PRGR1= Katarzyna Rokicka, Adam Pluta, Rafal Rokicki, Marcin Gawrysz
PRGR2= Katarzyna Rokicka, Adam Pluta, Rafal Rokicki, Marcin Gawrysz
THK1= John Thickstun, Zaid Harchaoui, Dean Foster, Sham Kakade
WCS1= Li Su, Derek Wu, Berlin Chen
ZCY2= Fuliang Yin, Weiwei Zhang, Zhe Chen
Info about the filenames
Filenames starting with part* come from the acoustic woodwind recording; those starting with RWC are synthesized. The instrument abbreviations are:
bs = bassoon, cl = clarinet, fl = flute, hn = horn, ob = oboe, vl = violin, cel = cello, gtr = guitar, sax = saxophone, bass = electric bass guitar
Friedman tests for Multiple Fundamental Frequency Estimation (MF0E)
The Friedman test was run in MATLAB to test for significant differences among systems with respect to per-file performance (accuracy).
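The same test can be reproduced outside MATLAB, for example with SciPy. The per-file scores below are made-up placeholders, not the actual MIREX data; the real test used each submission's accuracy on the 40 individual files:

```python
from scipy.stats import friedmanchisquare

# Hypothetical per-file accuracies for three systems over six files
# (placeholder values for illustration only).
sys_a = [0.72, 0.68, 0.75, 0.70, 0.66, 0.74]
sys_b = [0.65, 0.61, 0.70, 0.64, 0.60, 0.69]
sys_c = [0.50, 0.48, 0.55, 0.52, 0.47, 0.53]

# Friedman ranks the systems within each file, so it compares systems
# without assuming the per-file scores are normally distributed.
stat, p = friedmanchisquare(sys_a, sys_b, sys_c)
print(f"chi-square = {stat:.2f}, p = {p:.4f}")  # small p => systems differ
```

A significant Friedman result only says that some systems differ; the pairwise Tukey-Kramer HSD comparison below identifies which pairs differ.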
Tukey-Kramer HSD Multi-Comparison
TeamID | TeamID | Lowerbound | Mean | Upperbound | Significance |
---|---|---|---|---|---|
THK1.results.task1 | KD2.results.task1 | -0.9423 | 1.6875 | 4.3173 | FALSE |
THK1.results.task1 | KD1.results.task1 | -0.9423 | 1.6875 | 4.3173 | FALSE |
THK1.results.task1 | MHMTM1.results.task1 | -0.5298 | 2.1000 | 4.7298 | FALSE |
THK1.results.task1 | WCS1.results.task1 | 1.3452 | 3.9750 | 6.6048 | TRUE |
THK1.results.task1 | ZCY2.results.task1 | 3.1702 | 5.8000 | 8.4298 | TRUE |
THK1.results.task1 | CB1.results.task1 | 3.0202 | 5.6500 | 8.2798 | TRUE |
THK1.results.task1 | PRGR2.results.task1 | 3.7702 | 6.4000 | 9.0298 | TRUE |
THK1.results.task1 | CB2.results.task1 | 5.1202 | 7.7500 | 10.3798 | TRUE |
THK1.results.task1 | PR1.results.task1 | 4.8952 | 7.5250 | 10.1548 | TRUE |
THK1.results.task1 | PRGR1.results.task1 | 5.2702 | 7.9000 | 10.5298 | TRUE |
THK1.results.task1 | MHMTM2.results.task1 | 7.7952 | 10.4250 | 13.0548 | TRUE |
KD2.results.task1 | KD1.results.task1 | -2.6298 | 0.0000 | 2.6298 | FALSE |
KD2.results.task1 | MHMTM1.results.task1 | -2.2173 | 0.4125 | 3.0423 | FALSE |
KD2.results.task1 | WCS1.results.task1 | -0.3423 | 2.2875 | 4.9173 | FALSE |
KD2.results.task1 | ZCY2.results.task1 | 1.4827 | 4.1125 | 6.7423 | TRUE |
KD2.results.task1 | CB1.results.task1 | 1.3327 | 3.9625 | 6.5923 | TRUE |
KD2.results.task1 | PRGR2.results.task1 | 2.0827 | 4.7125 | 7.3423 | TRUE |
KD2.results.task1 | CB2.results.task1 | 3.4327 | 6.0625 | 8.6923 | TRUE |
KD2.results.task1 | PR1.results.task1 | 3.2077 | 5.8375 | 8.4673 | TRUE |
KD2.results.task1 | PRGR1.results.task1 | 3.5827 | 6.2125 | 8.8423 | TRUE |
KD2.results.task1 | MHMTM2.results.task1 | 6.1077 | 8.7375 | 11.3673 | TRUE |
KD1.results.task1 | MHMTM1.results.task1 | -2.2173 | 0.4125 | 3.0423 | FALSE |
KD1.results.task1 | WCS1.results.task1 | -0.3423 | 2.2875 | 4.9173 | FALSE |
KD1.results.task1 | ZCY2.results.task1 | 1.4827 | 4.1125 | 6.7423 | TRUE |
KD1.results.task1 | CB1.results.task1 | 1.3327 | 3.9625 | 6.5923 | TRUE |
KD1.results.task1 | PRGR2.results.task1 | 2.0827 | 4.7125 | 7.3423 | TRUE |
KD1.results.task1 | CB2.results.task1 | 3.4327 | 6.0625 | 8.6923 | TRUE |
KD1.results.task1 | PR1.results.task1 | 3.2077 | 5.8375 | 8.4673 | TRUE |
KD1.results.task1 | PRGR1.results.task1 | 3.5827 | 6.2125 | 8.8423 | TRUE |
KD1.results.task1 | MHMTM2.results.task1 | 6.1077 | 8.7375 | 11.3673 | TRUE |
MHMTM1.results.task1 | WCS1.results.task1 | -0.7548 | 1.8750 | 4.5048 | FALSE |
MHMTM1.results.task1 | ZCY2.results.task1 | 1.0702 | 3.7000 | 6.3298 | TRUE |
MHMTM1.results.task1 | CB1.results.task1 | 0.9202 | 3.5500 | 6.1798 | TRUE |
MHMTM1.results.task1 | PRGR2.results.task1 | 1.6702 | 4.3000 | 6.9298 | TRUE |
MHMTM1.results.task1 | CB2.results.task1 | 3.0202 | 5.6500 | 8.2798 | TRUE |
MHMTM1.results.task1 | PR1.results.task1 | 2.7952 | 5.4250 | 8.0548 | TRUE |
MHMTM1.results.task1 | PRGR1.results.task1 | 3.1702 | 5.8000 | 8.4298 | TRUE |
MHMTM1.results.task1 | MHMTM2.results.task1 | 5.6952 | 8.3250 | 10.9548 | TRUE |
WCS1.results.task1 | ZCY2.results.task1 | -0.8048 | 1.8250 | 4.4548 | FALSE |
WCS1.results.task1 | CB1.results.task1 | -0.9548 | 1.6750 | 4.3048 | FALSE |
WCS1.results.task1 | PRGR2.results.task1 | -0.2048 | 2.4250 | 5.0548 | FALSE |
WCS1.results.task1 | CB2.results.task1 | 1.1452 | 3.7750 | 6.4048 | TRUE |
WCS1.results.task1 | PR1.results.task1 | 0.9202 | 3.5500 | 6.1798 | TRUE |
WCS1.results.task1 | PRGR1.results.task1 | 1.2952 | 3.9250 | 6.5548 | TRUE |
WCS1.results.task1 | MHMTM2.results.task1 | 3.8202 | 6.4500 | 9.0798 | TRUE |
ZCY2.results.task1 | CB1.results.task1 | -2.7798 | -0.1500 | 2.4798 | FALSE |
ZCY2.results.task1 | PRGR2.results.task1 | -2.0298 | 0.6000 | 3.2298 | FALSE |
ZCY2.results.task1 | CB2.results.task1 | -0.6798 | 1.9500 | 4.5798 | FALSE |
ZCY2.results.task1 | PR1.results.task1 | -0.9048 | 1.7250 | 4.3548 | FALSE |
ZCY2.results.task1 | PRGR1.results.task1 | -0.5298 | 2.1000 | 4.7298 | FALSE |
ZCY2.results.task1 | MHMTM2.results.task1 | 1.9952 | 4.6250 | 7.2548 | TRUE |
CB1.results.task1 | PRGR2.results.task1 | -1.8798 | 0.7500 | 3.3798 | FALSE |
CB1.results.task1 | CB2.results.task1 | -0.5298 | 2.1000 | 4.7298 | FALSE |
CB1.results.task1 | PR1.results.task1 | -0.7548 | 1.8750 | 4.5048 | FALSE |
CB1.results.task1 | PRGR1.results.task1 | -0.3798 | 2.2500 | 4.8798 | FALSE |
CB1.results.task1 | MHMTM2.results.task1 | 2.1452 | 4.7750 | 7.4048 | TRUE |
PRGR2.results.task1 | CB2.results.task1 | -1.2798 | 1.3500 | 3.9798 | FALSE |
PRGR2.results.task1 | PR1.results.task1 | -1.5048 | 1.1250 | 3.7548 | FALSE |
PRGR2.results.task1 | PRGR1.results.task1 | -1.1298 | 1.5000 | 4.1298 | FALSE |
PRGR2.results.task1 | MHMTM2.results.task1 | 1.3952 | 4.0250 | 6.6548 | TRUE |
CB2.results.task1 | PR1.results.task1 | -2.8548 | -0.2250 | 2.4048 | FALSE |
CB2.results.task1 | PRGR1.results.task1 | -2.4798 | 0.1500 | 2.7798 | FALSE |
CB2.results.task1 | MHMTM2.results.task1 | 0.0452 | 2.6750 | 5.3048 | TRUE |
PR1.results.task1 | PRGR1.results.task1 | -2.2548 | 0.3750 | 3.0048 | FALSE |
PR1.results.task1 | MHMTM2.results.task1 | 0.2702 | 2.9000 | 5.5298 | TRUE |
PRGR1.results.task1 | MHMTM2.results.task1 | -0.1048 | 2.5250 | 5.1548 | FALSE |
Task 2: Note Tracking (NT)
NT Mixed Set Overall Summary Results
This subtask is evaluated in two ways. In the first setup, a returned note is counted as correct if its onset is within ±50 ms of a reference note's onset and its F0 is within ± a quarter tone of that reference note, ignoring the returned offset values. In the second setup, in addition to the above requirements, the returned note's offset must lie within 20% of the reference note's duration around the reference offset, or within 50 ms, whichever is larger.
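The matching rule above can be sketched as follows. The note representation and function names are illustrative assumptions, not the official evaluation code:

```python
import math

def note_match(est, ref, check_offset=False):
    """est, ref: (onset_s, offset_s, f0_hz) tuples.
    First setup: onset within +-50 ms and pitch within a quarter tone.
    Second setup (check_offset=True): additionally, the offset must fall
    within max(50 ms, 20% of the reference duration) of the reference offset."""
    on_e, off_e, f_e = est
    on_r, off_r, f_r = ref
    if abs(on_e - on_r) > 0.05:
        return False                               # onset tolerance: +-50 ms
    if abs(12.0 * math.log2(f_e / f_r)) > 0.5:
        return False                               # pitch tolerance: quarter tone
    if check_offset:
        tol = max(0.05, 0.2 * (off_r - on_r))
        if abs(off_e - off_r) > tol:
            return False                           # offset tolerance
    return True
```

Because the second setup only adds a constraint, the Onset-Offset F-measures below are never higher than the corresponding Onset Only F-measures.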
A total of 34 files were used in this subtask: 16 from the woodwind recording, 8 from the IAL quartet recording, and 6 piano recordings.
Measure | CB1 | CB2 | CT1 | KD2 | PR1 | PRGR1 | PRGR2 | SL1 | ZCY2 | |
---|---|---|---|---|---|---|---|---|---|
Ave. F-Measure Onset-Offset | 0.3047 | 0.2064 | 0.5292 | 0.4357 | 0.2489 | 0.1955 | 0.2826 | 0.0011 | 0.0854 |
Ave. F-Measure Onset Only | 0.5029 | 0.3742 | 0.7675 | 0.6969 | 0.4497 | 0.4982 | 0.4908 | 0.3593 | 0.2256 |
Ave. F-Measure Chroma | 0.3207 | 0.2365 | 0.5381 | 0.4465 | 0.3036 | 0.2219 | 0.3319 | 0.0014 | 0.0996 |
Ave. F-Measure Onset Only Chroma | 0.5343 | 0.4276 | 0.7785 | 0.7020 | 0.5099 | 0.5595 | 0.5489 | 0.4080 | 0.2717 |
Detailed Results
Submission | Precision | Recall | Ave. F-measure | Ave. Overlap | |
---|---|---|---|---|
CB1 | 0.312 | 0.304 | 0.305 | 0.865 |
CB2 | 0.201 | 0.230 | 0.206 | 0.862 |
CT1 | 0.513 | 0.550 | 0.529 | 0.863 |
KD2 | 0.412 | 0.468 | 0.436 | 0.886 |
PR1 | 0.202 | 0.340 | 0.249 | 0.863 |
PRGR1 | 0.229 | 0.176 | 0.195 | 0.882 |
PRGR2 | 0.264 | 0.346 | 0.283 | 0.903 |
SL1 | 0.001 | 0.001 | 0.001 | -0.039 |
ZCY2 | 0.080 | 0.095 | 0.085 | 0.780 |
Detailed Chroma Results
Here, accuracy is assessed on chroma results (i.e., all F0s are mapped to a single octave before evaluation).
Submission | Precision | Recall | Ave. F-measure | Ave. Overlap | |
---|---|---|---|---|
CB1 | 0.329 | 0.320 | 0.321 | 0.860 |
CB2 | 0.229 | 0.265 | 0.237 | 0.858 |
CT1 | 0.522 | 0.560 | 0.538 | 0.861 |
KD2 | 0.423 | 0.479 | 0.447 | 0.883 |
PR1 | 0.245 | 0.415 | 0.304 | 0.862 |
PRGR1 | 0.261 | 0.199 | 0.222 | 0.883 |
PRGR2 | 0.308 | 0.411 | 0.332 | 0.904 |
SL1 | 0.001 | 0.002 | 0.001 | -0.064 |
ZCY2 | 0.093 | 0.111 | 0.100 | 0.826 |
Results Based on Onset Only
Submission | Precision | Recall | Ave. F-measure | Ave. Overlap | |
---|---|---|---|---|
CB1 | 0.525 | 0.493 | 0.503 | 0.720 |
CB2 | 0.375 | 0.403 | 0.374 | 0.677 |
CT1 | 0.741 | 0.802 | 0.767 | 0.728 |
KD2 | 0.666 | 0.741 | 0.697 | 0.767 |
PR1 | 0.363 | 0.619 | 0.450 | 0.696 |
PRGR1 | 0.604 | 0.444 | 0.498 | -2.108 |
PRGR2 | 0.488 | 0.582 | 0.491 | 0.731 |
SL1 | 0.301 | 0.477 | 0.359 | -0.023 |
ZCY2 | 0.214 | 0.246 | 0.226 | 0.656 |
Chroma Results Based on Onset Only
Submission | Precision | Recall | Ave. F-measure | Ave. Overlap | |
---|---|---|---|---|
CB1 | 0.558 | 0.524 | 0.534 | 0.700 |
CB2 | 0.426 | 0.465 | 0.428 | 0.652 |
CT1 | 0.751 | 0.814 | 0.778 | 0.700 |
KD2 | 0.671 | 0.746 | 0.702 | 0.760 |
PR1 | 0.410 | 0.704 | 0.510 | 0.689 |
PRGR1 | 0.678 | 0.498 | 0.559 | -2.246 |
PRGR2 | 0.540 | 0.659 | 0.549 | 0.728 |
SL1 | 0.342 | 0.544 | 0.408 | -0.025 |
ZCY2 | 0.257 | 0.297 | 0.272 | 0.634 |
Friedman Tests for Note Tracking
The Friedman test was run in MATLAB to test for significant differences among systems with respect to per-file F-measure.
Tukey-Kramer HSD Multi-Comparison for Task2
TeamID | TeamID | Lowerbound | Mean | Upperbound | Significance |
---|---|---|---|---|---|
CT1 | KD2 | -1.2803 | 0.7794 | 2.8391 | FALSE |
CT1 | CB1 | 1.4109 | 3.4706 | 5.5303 | TRUE |
CT1 | PRGR1 | 0.9844 | 3.0441 | 5.1038 | TRUE |
CT1 | PRGR2 | 0.9403 | 3.0000 | 5.0597 | TRUE |
CT1 | PR1 | 2.0285 | 4.0882 | 6.1479 | TRUE |
CT1 | CB2 | 3.6168 | 5.6765 | 7.7362 | TRUE |
CT1 | SL1 | 3.2197 | 5.2794 | 7.3391 | TRUE |
CT1 | ZCY2 | 4.8962 | 6.9559 | 9.0156 | TRUE |
KD2 | CB1 | 0.6315 | 2.6912 | 4.7509 | TRUE |
KD2 | PRGR1 | 0.2050 | 2.2647 | 4.3244 | TRUE |
KD2 | PRGR2 | 0.1609 | 2.2206 | 4.2803 | TRUE |
KD2 | PR1 | 1.2491 | 3.3088 | 5.3685 | TRUE |
KD2 | CB2 | 2.8374 | 4.8971 | 6.9568 | TRUE |
KD2 | SL1 | 2.4403 | 4.5000 | 6.5597 | TRUE |
KD2 | ZCY2 | 4.1168 | 6.1765 | 8.2362 | TRUE |
CB1 | PRGR1 | -2.4862 | -0.4265 | 1.6332 | FALSE |
CB1 | PRGR2 | -2.5303 | -0.4706 | 1.5891 | FALSE |
CB1 | PR1 | -1.4421 | 0.6176 | 2.6773 | FALSE |
CB1 | CB2 | 0.1462 | 2.2059 | 4.2656 | TRUE |
CB1 | SL1 | -0.2509 | 1.8088 | 3.8685 | FALSE |
CB1 | ZCY2 | 1.4256 | 3.4853 | 5.5450 | TRUE |
PRGR1 | PRGR2 | -2.1038 | -0.0441 | 2.0156 | FALSE |
PRGR1 | PR1 | -1.0156 | 1.0441 | 3.1038 | FALSE |
PRGR1 | CB2 | 0.5727 | 2.6324 | 4.6921 | TRUE |
PRGR1 | SL1 | 0.1756 | 2.2353 | 4.2950 | TRUE |
PRGR1 | ZCY2 | 1.8521 | 3.9118 | 5.9715 | TRUE |
PRGR2 | PR1 | -0.9715 | 1.0882 | 3.1479 | FALSE |
PRGR2 | CB2 | 0.6168 | 2.6765 | 4.7362 | TRUE |
PRGR2 | SL1 | 0.2197 | 2.2794 | 4.3391 | TRUE |
PRGR2 | ZCY2 | 1.8962 | 3.9559 | 6.0156 | TRUE |
PR1 | CB2 | -0.4715 | 1.5882 | 3.6479 | FALSE |
PR1 | SL1 | -0.8685 | 1.1912 | 3.2509 | FALSE |
PR1 | ZCY2 | 0.8079 | 2.8676 | 4.9273 | TRUE |
CB2 | SL1 | -2.4568 | -0.3971 | 1.6626 | FALSE |
CB2 | ZCY2 | -0.7803 | 1.2794 | 3.3391 | FALSE |
SL1 | ZCY2 | -0.3832 | 1.6765 | 3.7362 | FALSE |
NT Piano-Only Overall Summary Results
This subtask is evaluated in two ways. In the first setup, a returned note is counted as correct if its onset is within ±50 ms of a reference note's onset and its F0 is within ± a quarter tone of that reference note, ignoring the returned offset values. In the second setup, in addition to the above requirements, the returned note's offset must lie within 20% of the reference note's duration around the reference offset, or within 50 ms, whichever is larger. The 6 piano recordings are evaluated separately for this subtask.
Measure | CB1 | CB2 | CT1 | KD2 | PR1 | PRGR1 | PRGR2 | SL1 | ZCY2 | |
---|---|---|---|---|---|---|---|---|---|
Ave. F-Measure Onset-Offset | 0.2380 | 0.1750 | 0.6201 | 0.1549 | 0.2104 | 0.0462 | 0.1250 | 0.0052 | 0.0131 |
Ave. F-Measure Onset Only | 0.6681 | 0.4970 | 0.8139 | 0.7196 | 0.5568 | 0.4242 | 0.5529 | 0.3122 | 0.1482 |
Ave. F-Measure Chroma | 0.2538 | 0.1863 | 0.6260 | 0.1370 | 0.2204 | 0.0462 | 0.1205 | 0.0061 | 0.0227 |
Ave. F-Measure Onset Only Chroma | 0.6787 | 0.5145 | 0.8150 | 0.6614 | 0.5276 | 0.4311 | 0.5333 | 0.3485 | 0.1846 |
Detailed Results
Submission | Precision | Recall | Ave. F-measure | Ave. Overlap | |
---|---|---|---|---|
CB1 | 0.275 | 0.211 | 0.238 | 0.813 |
CB2 | 0.209 | 0.153 | 0.175 | 0.797 |
CT1 | 0.622 | 0.621 | 0.620 | 0.786 |
KD2 | 0.150 | 0.161 | 0.155 | 0.844 |
PR1 | 0.192 | 0.249 | 0.210 | 0.818 |
PRGR1 | 0.070 | 0.036 | 0.046 | 0.844 |
PRGR2 | 0.163 | 0.107 | 0.125 | 0.860 |
SL1 | 0.005 | 0.005 | 0.005 | -0.056 |
ZCY2 | 0.012 | 0.016 | 0.013 | 0.440 |
Detailed Chroma Results
Here, accuracy is assessed on chroma results (i.e., all F0s are mapped to a single octave before evaluation).
Submission | Precision | Recall | Ave. F-measure | Ave. Overlap | |
---|---|---|---|---|
CB1 | 0.292 | 0.226 | 0.254 | 0.801 |
CB2 | 0.221 | 0.164 | 0.186 | 0.796 |
CT1 | 0.627 | 0.627 | 0.626 | 0.782 |
KD2 | 0.132 | 0.142 | 0.137 | 0.835 |
PR1 | 0.199 | 0.263 | 0.220 | 0.817 |
PRGR1 | 0.070 | 0.036 | 0.046 | 0.837 |
PRGR2 | 0.157 | 0.103 | 0.121 | 0.859 |
SL1 | 0.006 | 0.006 | 0.006 | -0.195 |
ZCY2 | 0.020 | 0.028 | 0.023 | 0.697 |
Results Based on Onset Only
Submission | Precision | Recall | Ave. F-measure | Ave. Overlap | |
---|---|---|---|---|
CB1 | 0.744 | 0.613 | 0.668 | 0.585 |
CB2 | 0.561 | 0.454 | 0.497 | 0.576 |
CT1 | 0.814 | 0.817 | 0.814 | 0.694 |
KD2 | 0.690 | 0.756 | 0.720 | 0.559 |
PR1 | 0.488 | 0.687 | 0.557 | 0.606 |
PRGR1 | 0.639 | 0.334 | 0.424 | -0.438 |
PRGR2 | 0.689 | 0.493 | 0.553 | 0.544 |
SL1 | 0.325 | 0.324 | 0.312 | -0.053 |
ZCY2 | 0.142 | 0.161 | 0.148 | 0.453 |
Chroma Results Based on Onset Only
Submission | Precision | Recall | Ave. F-measure | Ave. Overlap | |
---|---|---|---|---|
CB1 | 0.755 | 0.623 | 0.679 | 0.585 |
CB2 | 0.580 | 0.470 | 0.514 | 0.577 |
CT1 | 0.815 | 0.819 | 0.815 | 0.676 |
KD2 | 0.634 | 0.696 | 0.661 | 0.557 |
PR1 | 0.461 | 0.652 | 0.528 | 0.606 |
PRGR1 | 0.643 | 0.341 | 0.431 | -0.557 |
PRGR2 | 0.665 | 0.477 | 0.533 | 0.541 |
SL1 | 0.368 | 0.361 | 0.348 | -0.060 |
ZCY2 | 0.176 | 0.203 | 0.185 | 0.449 |
Individual Results Files for Task 2
CB1= Chris Cannam, Emmanouil Benetos
CB2= Chris Cannam, Emmanouil Benetos
CT1= Carl Thomé
KD2= Karin Dressler
PR1= Leonid Pogorelyuk, Clarence Rowley
PRGR1= Katarzyna Rokicka, Adam Pluta, Rafal Rokicki, Marcin Gawrysz
PRGR2= Katarzyna Rokicka, Adam Pluta, Rafal Rokicki, Marcin Gawrysz
SL1= Samuel Li
ZCY2= Fuliang Yin, Weiwei Zhang, Zhe Chen