Introduction
These are the results for the 2008 running of the Multiple Fundamental Frequency Estimation and Tracking task. For background information about this task set, please refer to the 2013:Multiple Fundamental Frequency Estimation & Tracking page.
General Legend
Task 1: Multiple Fundamental Frequency Estimation (MF0E)
MF0E Overall Summary Results
Below are the average scores across 40 test files. These files come from three sources: a woodwind quintet recording of bassoon, clarinet, horn, flute and oboe (UIUC); rendered MIDI from the RWC database, donated by IRCAM; and a quartet recording of bassoon, clarinet, violin and saxophone, donated by Dr. Bryan Pardo's Interactive Audio Lab (IAL). Twenty files come from 5 sections of the woodwind recording, each section contributing 4 files ranging from 2 to 5 polyphony. Twelve files come from IAL, drawn from 4 different songs ranging from 2 to 4 polyphony, and 8 files come from the synthesized RWC MIDI, drawn from 2 different songs ranging from 2 to 5 polyphony.
|                 | BW1   | CDM1  | CDM2  |
| Accuracy        | 0.662 | 0.620 | 0.620 |
| Accuracy Chroma | 0.694 | 0.660 | 0.660 |
Detailed Results
|      | Precision | Recall | Accuracy | Etot  | Esubs | Emiss | Efa   |
| BW1  | 0.771     | 0.735  | 0.662    | 0.392 | 0.112 | 0.153 | 0.128 |
| CDM1 | 0.741     | 0.694  | 0.620    | 0.443 | 0.128 | 0.178 | 0.137 |
| CDM2 | 0.741     | 0.694  | 0.620    | 0.443 | 0.128 | 0.178 | 0.137 |
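For reference, these frame-level scores follow the standard MIREX multi-F0 definitions: Precision = TP/(TP+FP), Recall = TP/(TP+FN), Accuracy = TP/(TP+FP+FN), and Etot is the sum of substitution, miss and false-alarm errors normalized by the number of reference pitches. A minimal sketch, assuming pitches are compared frame by frame as sets of MIDI note numbers (function and variable names are illustrative, not the actual evaluation code):

```python
def frame_metrics(ref_frames, est_frames):
    """Frame-level multi-F0 scores (standard MIREX-style definitions).

    ref_frames, est_frames: one set of pitches (e.g. MIDI numbers) per frame.
    """
    tp = fp = fn = n_ref = 0
    e_subs = e_miss = e_fa = 0
    for ref, est in zip(ref_frames, est_frames):
        tp += len(ref & est)
        fp += len(est - ref)
        fn += len(ref - est)
        n_ref += len(ref)
        # Error decomposition: a miss paired with a false alarm in the
        # same frame counts as one substitution.
        subs = min(len(ref - est), len(est - ref))
        e_subs += subs
        e_miss += len(ref - est) - subs
        e_fa += len(est - ref) - subs
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = tp / (tp + fp + fn)
    e_tot = (e_subs + e_miss + e_fa) / n_ref
    return precision, recall, accuracy, e_tot

# Toy three-frame example (not MIREX data):
ref = [{60, 64}, {60}, {62, 65}]
est = [{60, 64}, {61}, {62}]
p, r, a, e = frame_metrics(ref, est)  # 0.75, 0.6, 0.5, 0.4
```

Note that Etot can exceed 1 when a system returns many spurious pitches, which is why it is reported alongside Accuracy rather than in place of it.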
Detailed Chroma Results
Here, accuracy is assessed on chroma results (i.e., all F0s are mapped to a single octave before evaluation).
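The octave-folding step can be sketched as follows; a minimal illustration assuming pitches are given as MIDI note numbers (the function name is ours, not the evaluation code's):

```python
def to_chroma(pitches):
    """Fold MIDI pitches onto a single octave (12 pitch classes), so that
    octave errors are forgiven before the scores are computed."""
    return {p % 12 for p in pitches}

# C4 (MIDI 60) and C5 (MIDI 72) disagree as raw F0 estimates,
# but both map to pitch class {0} after the chroma mapping.
same = to_chroma({72}) == to_chroma({60})
```

This is why the chroma scores above are uniformly higher than the raw scores: any estimate that is off by exactly one or more octaves becomes a hit.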
|      | Precision | Recall | Accuracy | Etot  | Esubs | Emiss | Efa   |
| BW1  | 0.808     | 0.774  | 0.694    | 0.354 | 0.073 | 0.153 | 0.128 |
| CDM1 | 0.789     | 0.742  | 0.660    | 0.395 | 0.080 | 0.178 | 0.137 |
| CDM2 | 0.789     | 0.742  | 0.660    | 0.395 | 0.080 | 0.178 | 0.137 |
Individual Results Files for Task 1
BW1 = Emmanouil Benetos, Tillman Weyde
CDM1 = Tian Cheng, Simon Dixon, Matthias Mauch
CDM2 = Tian Cheng, Simon Dixon, Matthias Mauch
Info about the filenames
Filenames starting with part* come from the acoustic woodwind recording; those starting with RWC are synthesized. The instrument abbreviations are:
bs = bassoon,
cl = clarinet,
fl = flute,
hn = horn,
ob = oboe,
vl = violin,
cel = cello,
gtr = guitar,
sax = saxophone,
bass = electric bass guitar
Run Times
| sub_ID | Runtime (sec) |
| BW1    | 3374          |
| CDM1   | 62027         |
| CDM2   | 61992         |
Friedman tests for Multiple Fundamental Frequency Estimation (MF0E)
The Friedman test was run in MATLAB to test significant differences amongst systems with regard to the performance (accuracy) on individual files.
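The page states the test was run in MATLAB (presumably friedman followed by multcompare). An equivalent sketch in Python with SciPy, using made-up per-file accuracies rather than the actual MIREX scores (which are distributed as CSV):

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Hypothetical per-file accuracies (40 files x 3 systems), for
# illustration only; not the real results.
rng = np.random.default_rng(2008)
bw1 = rng.uniform(0.55, 0.75, 40)
cdm1 = rng.uniform(0.50, 0.70, 40)
cdm2 = rng.uniform(0.50, 0.70, 40)

# Friedman test: a non-parametric check for differences among the
# systems, treating each test file as a block (repeated measure).
stat, p = friedmanchisquare(bw1, cdm1, cdm2)
```

A post-hoc Tukey-Kramer HSD comparison (MATLAB's multcompare) then identifies which pairs of systems differ; pairs whose confidence interval on the mean rank difference excludes zero are marked TRUE in the table below.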
Tukey-Kramer HSD Multi-Comparison
| TeamID | TeamID | Lowerbound | Mean   | Upperbound | Significance |
| BW1    | CDM2   | 0.8211     | 1.2750 | 1.7289     | TRUE         |
| BW1    | CDM1   | 0.8211     | 1.2750 | 1.7289     | TRUE         |
| CDM2   | CDM1   | -0.4539    | 0.0000 | 0.4539     | FALSE        |
Task 2: Note Tracking (NT)
NT Mixed Set Overall Summary Results
This subtask is evaluated in two different ways. In the first setup, a returned note is counted as correct if its onset is within ±50 ms of a reference note's onset and its F0 is within ±quarter tone of that reference note's F0, ignoring the returned offset values. In the second setup, in addition to the above requirements, a correct returned note must have an offset within 20% of the reference note's duration around the reference note's offset, or within 50 ms, whichever is larger.
A total of 34 files were used in this subtask: 16 from the woodwind recording, 8 from the IAL quintet recording, and 6 piano recordings.
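The two matching criteria described above can be sketched as follows; a minimal illustration with notes represented as (onset, offset, MIDI pitch) tuples (names and representation are ours, not the MIREX evaluation code):

```python
def onset_match(ref, est, onset_tol=0.05):
    """First setup: onset within +/-50 ms and F0 within a quarter tone
    (+/-0.5 semitone); the returned offset is ignored."""
    ref_on, _, ref_pitch = ref
    est_on, _, est_pitch = est
    return (abs(est_on - ref_on) <= onset_tol
            and abs(est_pitch - ref_pitch) <= 0.5)

def onset_offset_match(ref, est):
    """Second setup: additionally, the offset must lie within 20% of the
    reference note's duration around its offset, or within 50 ms,
    whichever is larger."""
    ref_on, ref_off, _ = ref
    offset_tol = max(0.2 * (ref_off - ref_on), 0.05)
    return onset_match(ref, est) and abs(est[1] - ref_off) <= offset_tol

ref_note = (1.00, 2.00, 60.0)   # a one-second C4 reference note
est_note = (1.03, 1.85, 60.2)   # onset off by 30 ms, offset off by 150 ms
# Offset tolerance here is max(0.2 * 1.0 s, 50 ms) = 200 ms,
# so this estimate satisfies both criteria.
```

Each returned note may match at most one reference note and vice versa; the F-measures below are computed from the resulting precision and recall.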
|                                  | BW2    | BW3    | CDM3   |
| Ave. F-Measure Onset-Offset      | 0.3264 | 0.2745 | 0.2880 |
| Ave. F-Measure Onset Only        | 0.5533 | 0.4754 | 0.5071 |
| Ave. F-Measure Chroma            | 0.3383 | 0.3142 | 0.3058 |
| Ave. F-Measure Onset Only Chroma | 0.5751 | 0.5512 | 0.5439 |
Detailed Results
|      | Precision | Recall | Ave. F-measure | Ave. Overlap |
| BW2  | 0.335     | 0.334  | 0.326          | 0.879        |
| BW3  | 0.237     | 0.342  | 0.274          | 0.879        |
| CDM3 | 0.268     | 0.328  | 0.288          | 0.857        |
Detailed Chroma Results
Here, accuracy is assessed on chroma results (i.e., all F0s are mapped to a single octave before evaluation).
|      | Precision | Recall | Ave. F-measure | Ave. Overlap |
| BW2  | 0.347     | 0.347  | 0.338          | 0.875        |
| BW3  | 0.270     | 0.395  | 0.314          | 0.877        |
| CDM3 | 0.283     | 0.350  | 0.306          | 0.854        |
Results Based on Onset Only
|      | Precision | Recall | Ave. F-measure | Ave. Overlap |
| BW2  | 0.592     | 0.546  | 0.553          | 0.726        |
| BW3  | 0.419     | 0.576  | 0.475          | 0.720        |
| CDM3 | 0.479     | 0.562  | 0.507          | 0.702        |
Chroma Results Based on Onset Only
|      | Precision | Recall | Ave. F-measure | Ave. Overlap |
| BW2  | 0.615     | 0.570  | 0.575          | 0.712        |
| BW3  | 0.483     | 0.675  | 0.551          | 0.684        |
| CDM3 | 0.513     | 0.606  | 0.544          | 0.682        |
Run Times
| sub_ID | Runtime (sec) |
| BW2    | 3706          |
| BW3    | 2203          |
| CDM3   | 68470         |
Friedman Tests for Note Tracking
The Friedman test was run in MATLAB to test significant differences amongst systems with regard to the F-measure on individual files.
Tukey-Kramer HSD Multi-Comparison for Task2
| TeamID | TeamID | Lowerbound | Mean   | Upperbound | Significance |
| BW2    | CDM3   | 0.0786     | 0.6471 | 1.2155     | TRUE         |
| BW2    | BW3    | 0.2845     | 0.8529 | 1.4214     | TRUE         |
| CDM3   | BW3    | -0.3625    | 0.2059 | 0.7743     | FALSE        |
NT Piano-Only Overall Summary Results
This subtask is evaluated in the same two ways as the mixed set above: an onset-only setup (onset within ±50 ms and F0 within ±quarter tone of a reference note, offsets ignored) and an onset-offset setup that additionally requires the offset to fall within 20% of the reference note's duration, or within 50 ms, whichever is larger.
Six piano recordings are evaluated separately for this subtask.
|                                  | BW2    | BW3    | CDM3   |
| Ave. F-Measure Onset-Offset      | 0.1596 | 0.1940 | 0.1551 |
| Ave. F-Measure Onset Only        | 0.5359 | 0.6110 | 0.5723 |
| Ave. F-Measure Chroma            | 0.1653 | 0.2055 | 0.1692 |
| Ave. F-Measure Onset Only Chroma | 0.5455 | 0.6274 | 0.5904 |
Detailed Results
|      | Precision | Recall | Ave. F-measure | Ave. Overlap |
| BW2  | 0.186     | 0.141  | 0.160          | 0.831        |
| BW3  | 0.193     | 0.195  | 0.194          | 0.831        |
| CDM3 | 0.158     | 0.153  | 0.155          | 0.788        |
Detailed Chroma Results
Here, accuracy is assessed on chroma results (i.e., all F0s are mapped to a single octave before evaluation).
|      | Precision | Recall | Ave. F-measure | Ave. Overlap |
| BW2  | 0.192     | 0.146  | 0.165          | 0.824        |
| BW3  | 0.205     | 0.207  | 0.205          | 0.819        |
| CDM3 | 0.172     | 0.167  | 0.169          | 0.780        |
Results Based on Onset Only
|      | Precision | Recall | Ave. F-measure | Ave. Overlap |
| BW2  | 0.610     | 0.483  | 0.536          | 0.559        |
| BW3  | 0.608     | 0.616  | 0.611          | 0.562        |
| CDM3 | 0.575     | 0.574  | 0.572          | 0.538        |
Chroma Results Based on Onset Only
|      | Precision | Recall | Ave. F-measure | Ave. Overlap |
| BW2  | 0.622     | 0.491  | 0.546          | 0.555        |
| BW3  | 0.624     | 0.633  | 0.627          | 0.557        |
| CDM3 | 0.593     | 0.592  | 0.590          | 0.538        |
Individual Results Files for Task 2
BW2 = Emmanouil Benetos, Tillman Weyde
BW3 = Emmanouil Benetos, Tillman Weyde
CDM3 = Tian Cheng, Simon Dixon, Matthias Mauch