2009:Multiple Fundamental Frequency Estimation & Tracking Results

From MIREX Wiki
Revision as of 15:53, 23 July 2010 by Singh14 (talk | contribs) (Individual Results Files for Task 1)

Introduction

These are the results for the 2009 running of the Multiple Fundamental Frequency Estimation and Tracking task. For background information about this task set, please refer to the 2009:Multiple Fundamental Frequency Estimation & Tracking page.


General Legend

Team ID

BVB = Nancy Bertin, Emmanuel Vincent, Roland Badeau
DHP1 = Zhiyao Duan, Jinyu Han, Bryan Pardo
DHP2 = Zhiyao Duan, Jinyu Han, Bryan Pardo
NEOS1 = Masahiro Nakano, Koji Egashira, Nobutaka Ono, Shigeki Sagayama
NEOS2 = Masahiro Nakano, Koji Egashira, Nobutaka Ono, Shigeki Sagayama
NPA1 = Paolo Nesi, Gianni Pantaleo, Fabrizio Argenti
NPA2 = Paolo Nesi, Gianni Pantaleo, Fabrizio Argenti
RS1 = S. A. Raczyński, S. Sagayama
RS2 = S. A. Raczyński, S. Sagayama
RS3 = S. A. Raczyński, S. Sagayama
RS4 = S. A. Raczyński, S. Sagayama
RS5 = S. A. Raczyński, S. Sagayama
RS6 = S. A. Raczyński, S. Sagayama
YR1 = Chunghsin Yeh, Axel Roebel
YR2 = Chunghsin Yeh, Axel Roebel
ZL = Xueliang Zhang, Wenju Liu

Task 1: Multiple Fundamental Frequency Estimation (MF0E)

MF0E Overall Summary Results

Below are the average scores across 40 test files. These files come from three sources: a woodwind quintet recording of bassoon, clarinet, horn, flute and oboe (UIUC); MIDI rendered from the RWC database, donated by IRCAM; and a quartet recording of bassoon, clarinet, violin and saxophone donated by Dr. Bryan Pardo's Interactive Audio Lab (IAL). 20 files come from 5 sections of the woodwind recording, each section contributing 4 files ranging from 2 to 5 polyphony; 12 files come from IAL, drawn from 4 different songs ranging from 2 to 4 polyphony; and 8 files come from the RWC-synthesized MIDI, drawn from 2 different songs ranging from 2 to 5 polyphony.
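The tables below report frame-level Precision, Recall, Accuracy and error scores. As a hedged sketch of how such frame-level scores are typically derived from per-frame counts of true positives, false positives and false negatives (standard MIREX-style definitions assumed; the actual evaluation code may differ in details):

```python
def frame_metrics(tp, fp, fn):
    """Frame-level scores from per-frame counts (assumed MIREX-style):
    tp = correctly returned F0s, fp = spurious F0s, fn = missed F0s."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    # Accuracy penalizes both spurious and missed F0s in one ratio,
    # so it is always <= min(precision, recall).
    accuracy = tp / (tp + fp + fn)
    return precision, recall, accuracy

# e.g. 80 correct F0s, 20 spurious, 20 missed:
p, r, a = frame_metrics(80, 20, 20)
```

Here precision and recall both come out to 0.8, while accuracy is lower because it counts both error types in the denominator.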


BVB DHP1 DHP2 NEOS1 NEOS2 NPA1 NPA2 RS1 RS2 YR1 YR2 ZL
Accuracy 0.03 0.55 0.57 0.35 0.49 0.44 0.48 0.26 0.27 0.65 0.69 0.49
Accuracy Chroma 0.15 0.59 0.61 0.46 0.57 0.5 0.52 0.29 0.3 0.68 0.71 0.55


Detailed Results

Precision Recall Accuracy Etot Esubs Emiss Efa
BVB 0.057 0.044 0.033 1.270 0.384 0.572 0.314
DHP1 0.735 0.614 0.548 0.510 0.113 0.273 0.124
DHP2 0.648 0.702 0.569 0.562 0.169 0.129 0.264
NEOS1 0.369 0.579 0.354 1.146 0.361 0.060 0.725
NEOS2 0.604 0.570 0.488 0.623 0.203 0.227 0.193
NPA1 0.457 0.713 0.437 1.016 0.224 0.063 0.729
NPA2 0.506 0.760 0.483 0.904 0.179 0.061 0.664
RS1 0.733 0.256 0.261 0.721 0.067 0.613 0.040
RS2 0.718 0.272 0.272 0.718 0.070 0.595 0.053
YR1 0.740 0.752 0.646 0.432 0.119 0.129 0.183
YR2 0.766 0.808 0.688 0.395 0.091 0.101 0.203
ZL 0.717 0.517 0.492 0.532 0.167 0.317 0.048


Detailed Chroma Results

Here, accuracy is assessed on chroma results (i.e., all F0s are mapped to a single octave before evaluation).
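One hedged way to realize this octave-folding (a sketch only; A440 reference tuning and rounding to the nearest semitone are assumptions, not necessarily what the evaluation code did):

```python
import math

def to_chroma(f0_hz, ref_hz=440.0):
    """Map an F0 in Hz to a pitch class 0-11, collapsing octave errors.
    Sketch of the chroma-evaluation idea; exact MIREX code may differ."""
    midi = 69 + 12 * math.log2(f0_hz / ref_hz)  # fractional MIDI number
    return round(midi) % 12                      # fold into one octave

# 220 Hz, 440 Hz and 880 Hz all collapse to pitch class 9 (A),
# so an octave error no longer counts against the system.
assert to_chroma(220) == to_chroma(440) == to_chroma(880)
```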

Precision Recall Accuracy Etot Esubs Emiss Efa
BVB 0.264 0.194 0.147 1.120 0.234 0.572 0.314
DHP1 0.788 0.656 0.585 0.468 0.071 0.273 0.124
DHP2 0.698 0.757 0.612 0.507 0.114 0.129 0.264
NEOS1 0.479 0.771 0.461 0.953 0.169 0.059 0.724
NEOS2 0.705 0.663 0.567 0.531 0.108 0.229 0.194
NPA1 0.521 0.824 0.499 0.905 0.112 0.063 0.729
NPA2 0.540 0.817 0.516 0.847 0.122 0.061 0.664
RS1 0.798 0.282 0.286 0.695 0.042 0.613 0.040
RS2 0.780 0.298 0.297 0.692 0.044 0.595 0.053
YR1 0.780 0.793 0.681 0.391 0.078 0.129 0.183
YR2 0.785 0.830 0.705 0.373 0.069 0.101 0.203
ZL 0.801 0.577 0.548 0.472 0.107 0.317 0.048


Individual Results Files for Task 1

BVB = Nancy Bertin, Emmanuel Vincent, Roland Badeau
DHP1 = Zhiyao Duan, Jinyu Han, Bryan Pardo
DHP2 = Zhiyao Duan, Jinyu Han, Bryan Pardo
NEOS1 = Masahiro Nakano, Koji Egashira, Nobutaka Ono, Shigeki Sagayama
NEOS2 = Masahiro Nakano, Koji Egashira, Nobutaka Ono, Shigeki Sagayama
NPA1 = Paolo Nesi, Gianni Pantaleo, Fabrizio Argenti
NPA2 = Paolo Nesi, Gianni Pantaleo, Fabrizio Argenti
RS1 = S. A. Raczyński, S. Sagayama
RS2 = S. A. Raczyński, S. Sagayama
RS3 = S. A. Raczyński, S. Sagayama
RS4 = S. A. Raczyński, S. Sagayama
RS5 = S. A. Raczyński, S. Sagayama
RS6 = S. A. Raczyński, S. Sagayama
YR1 = Chunghsin Yeh, Axel Roebel
YR2 = Chunghsin Yeh, Axel Roebel
ZL = Xueliang Zhang, Wenju Liu

Info about the filenames

The filenames starting with part* come from the acoustic woodwind recording; the ones starting with RWC are synthesized. The legend for the instrument abbreviations is:

bs = bassoon, cl = clarinet, fl = flute, hn = horn, ob = oboe, vl = violin, cel = cello, gtr = guitar, sax = saxophone, bass = electric bass guitar

Run Times


TBA

Task 2A: Mixed Set Note Tracking (NT)

NT Mixed Set Overall Summary Results

This subtask is evaluated in two different ways. In the first setup, a returned note is counted as correct if its onset is within ±50 ms of a reference note's onset and its F0 is within ± a quarter tone of that reference note's F0; the returned offset values are ignored. In the second setup, in addition to the above requirements, a correct returned note must also have an offset within 20% of the reference note's duration around the reference note's offset, or within 50 ms of it, whichever is larger.
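The two matching criteria above can be sketched as follows (a hedged illustration only: the `(onset, offset, f0)` tuple layout and the helper name `note_matches` are assumptions, not the actual evaluation code):

```python
import math

def note_matches(ret, ref, with_offset=False):
    """Check a returned note against a reference note.
    Each note is an assumed (onset_sec, offset_sec, f0_hz) tuple."""
    on_ok = abs(ret[0] - ref[0]) <= 0.05               # onset within ±50 ms
    # A quarter tone is 50 cents; compare F0s on the cent scale.
    f0_ok = abs(1200 * math.log2(ret[2] / ref[2])) <= 50
    if not (on_ok and f0_ok):
        return False
    if with_offset:                                    # second, stricter setup
        dur = ref[1] - ref[0]
        tol = max(0.2 * dur, 0.05)   # 20% of ref duration or 50 ms, larger
        return abs(ret[1] - ref[1]) <= tol
    return True
```

For example, a note returned at (1.03 s, 2.1 s, 445 Hz) would match a reference note (1.0 s, 2.0 s, 440 Hz) under both setups, while shifting its onset to 1.1 s would fail the ±50 ms test.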

A total of 34 files were used in this subtask: 16 from the woodwind recording, 8 from the IAL quartet recording and 6 piano recordings.

BVB DHP1 NEOS1 NEOS2 NPA1 NPA2 RS1 RS2 RS3 RS4 RS5 RS6 YR
Ave. F-Measure Onset-Offset 0.0026 0.2219 0.0366 0.3187 0.1364 0.2275 0.1577 0.1447 0.1591 0.1505 0.1441 0.1490 0.3077
Ave. F-Measure Onset Only 0.0111 0.4259 0.2765 0.5329 0.3087 0.3664 0.2984 0.2838 0.3057 0.2851 0.2920 0.2838 0.5003
Ave. F-Measure Chroma 0.0124 0.2413 0.0570 0.3607 0.1913 0.2787 0.1794 0.1640 0.1826 0.1704 0.1666 0.1663 0.3246
Ave. F-Measure Onset Only Chroma 0.0622 0.4722 0.3629 0.5906 0.3950 0.4500 0.3323 0.3138 0.3394 0.3189 0.3232 0.3182 0.5216


Detailed Results

Precision Recall Ave. F-measure Ave. Overlap
BVB 0.002 0.003 0.003 0.251
DHP1 0.196 0.268 0.222 0.846
NEOS1 0.037 0.038 0.037 0.855
NEOS2 0.316 0.328 0.319 0.876
NPA1 0.096 0.246 0.136 0.860
NPA2 0.171 0.357 0.228 0.881
RS1 0.214 0.140 0.158 0.881
RS2 0.195 0.130 0.145 0.878
RS3 0.209 0.145 0.159 0.888
RS4 0.192 0.139 0.151 0.830
RS5 0.191 0.133 0.144 0.878
RS6 0.199 0.134 0.149 0.853
YR 0.270 0.377 0.308 0.888


Detailed Chroma Results

Here, accuracy is assessed on chroma results (i.e., all F0s are mapped to a single octave before evaluation).

Precision Recall Ave. F-measure Ave. Overlap
BVB 0.012 0.013 0.012 0.661
DHP1 0.215 0.290 0.241 0.844
NEOS1 0.056 0.060 0.057 0.854
NEOS2 0.358 0.371 0.361 0.874
NPA1 0.135 0.349 0.191 0.855
NPA2 0.210 0.434 0.279 0.875
RS1 0.238 0.161 0.179 0.879
RS2 0.216 0.149 0.164 0.877
RS3 0.235 0.168 0.183 0.883
RS4 0.215 0.159 0.170 0.829
RS5 0.216 0.155 0.167 0.878
RS6 0.219 0.151 0.166 0.851
YR 0.284 0.399 0.325 0.885


Results Based on Onset Only

Precision Recall Ave. F-measure Ave. Overlap
BVB 0.011 0.012 0.011 0.394
DHP1 0.406 0.490 0.426 0.650
NEOS1 0.293 0.274 0.276 0.481
NEOS2 0.527 0.554 0.533 0.717
NPA1 0.220 0.551 0.309 0.622
NPA2 0.278 0.562 0.366 0.730
RS1 0.421 0.265 0.298 0.620
RS2 0.403 0.253 0.284 0.617
RS3 0.417 0.277 0.306 0.625
RS4 0.387 0.258 0.285 0.618
RS5 0.401 0.266 0.292 0.609
RS6 0.400 0.252 0.284 0.616
YR 0.431 0.630 0.500 0.714


Chroma Results Based on Onset Only

Precision Recall Ave. F-measure Ave. Overlap
BVB 0.062 0.065 0.062 0.451
DHP1 0.448 0.546 0.472 0.626
NEOS1 0.379 0.364 0.363 0.473
NEOS2 0.586 0.612 0.591 0.706
NPA1 0.280 0.718 0.395 0.607
NPA2 0.342 0.691 0.450 0.683
RS1 0.468 0.298 0.332 0.604
RS2 0.446 0.281 0.314 0.597
RS3 0.463 0.311 0.339 0.608
RS4 0.430 0.293 0.319 0.582
RS5 0.444 0.296 0.323 0.598
RS6 0.447 0.286 0.318 0.593
YR 0.450 0.656 0.522 0.672


Task 2B: Piano-Only Note Tracking (NT)

NT Piano-Only Overall Summary Results

This subtask is evaluated in two different ways. In the first setup, a returned note is counted as correct if its onset is within ±50 ms of a reference note's onset and its F0 is within ± a quarter tone of that reference note's F0; the returned offset values are ignored. In the second setup, in addition to the above requirements, a correct returned note must also have an offset within 20% of the reference note's duration around the reference note's offset, or within 50 ms of it, whichever is larger. 6 piano recordings are evaluated separately for this subtask.

BVB DHP1 NEOS1 NEOS2 NPA1 NPA2 RS1 RS2 RS3 RS4 RS5 RS6 YR
Ave. F-Measure Onset-Offset 0.0048 0.1430 0.0274 0.1386 0.1236 0.1715 0.1194 0.0987 0.1099 0.1050 0.0860 0.1110 0.1489
Ave. F-Measure Onset Only 0.0205 0.3756 0.3864 0.5271 0.4416 0.4058 0.3397 0.3594 0.3396 0.2775 0.3702 0.2795 0.4560
Ave. F-Measure Chroma 0.0142 0.1487 0.0299 0.1517 0.1903 0.2475 0.1233 0.1017 0.1140 0.1085 0.0881 0.1124 0.1395
Ave. F-Measure Onset Only Chroma 0.0724 0.3861 0.4042 0.5251 0.5150 0.4967 0.3462 0.3655 0.3466 0.2845 0.3768 0.2869 0.4431


Detailed Results

Precision Recall Ave. F-measure Ave. Overlap
BVB 0.006 0.004 0.005 0.395
DHP1 0.144 0.146 0.143 0.790
NEOS1 0.029 0.027 0.027 0.822
NEOS2 0.146 0.138 0.139 0.788
NPA1 0.097 0.176 0.124 0.799
NPA2 0.137 0.231 0.171 0.822
RS1 0.240 0.080 0.119 0.844
RS2 0.193 0.067 0.099 0.821
RS3 0.210 0.075 0.110 0.844
RS4 0.208 0.071 0.105 0.687
RS5 0.157 0.060 0.086 0.826
RS6 0.226 0.074 0.111 0.825
YR 0.114 0.215 0.149 0.804


Detailed Chroma Results

Here, accuracy is assessed on chroma results (i.e., all F0s are mapped to a single octave before evaluation).

Precision Recall Ave. F-measure Ave. Overlap
BVB 0.016 0.014 0.014 0.733
DHP1 0.148 0.154 0.149 0.785
NEOS1 0.032 0.029 0.030 0.827
NEOS2 0.156 0.155 0.152 0.783
NPA1 0.150 0.267 0.190 0.792
NPA2 0.198 0.334 0.247 0.802
RS1 0.247 0.083 0.123 0.845
RS2 0.198 0.069 0.102 0.821
RS3 0.217 0.078 0.114 0.839
RS4 0.214 0.073 0.108 0.685
RS5 0.161 0.061 0.088 0.827
RS6 0.229 0.075 0.112 0.825
YR 0.107 0.204 0.140 0.796


Results Based on Onset Only

Precision Recall Ave. F-measure Ave. Overlap
BVB 0.022 0.020 0.020 0.491
DHP1 0.365 0.402 0.376 0.561
NEOS1 0.439 0.362 0.386 0.369
NEOS2 0.541 0.539 0.527 0.550
NPA1 0.345 0.635 0.442 0.558
NPA2 0.323 0.552 0.406 0.609
RS1 0.650 0.233 0.340 0.595
RS2 0.689 0.246 0.359 0.561
RS3 0.628 0.236 0.340 0.591
RS4 0.514 0.192 0.278 0.573
RS5 0.679 0.257 0.370 0.562
RS6 0.535 0.191 0.279 0.599
YR 0.345 0.680 0.456 0.543


Chroma Results Based on Onset Only

Precision Recall Ave. F-measure Ave. Overlap
BVB 0.074 0.074 0.072 0.423
DHP1 0.373 0.416 0.386 0.552
NEOS1 0.459 0.379 0.404 0.364
NEOS2 0.539 0.537 0.525 0.546
NPA1 0.403 0.738 0.515 0.548
NPA2 0.395 0.678 0.497 0.577
RS1 0.661 0.237 0.346 0.596
RS2 0.700 0.250 0.366 0.563
RS3 0.641 0.241 0.347 0.593
RS4 0.526 0.198 0.285 0.553
RS5 0.691 0.262 0.377 0.565
RS6 0.548 0.197 0.287 0.602
YR 0.334 0.666 0.443 0.509



Individual Results Files for Task 2

BVB = Nancy Bertin, Emmanuel Vincent, Roland Badeau
DHP1 = Zhiyao Duan, Jinyu Han, Bryan Pardo
DHP2 = Zhiyao Duan, Jinyu Han, Bryan Pardo
NEOS1 = Masahiro Nakano, Koji Egashira, Nobutaka Ono, Shigeki Sagayama
NEOS2 = Masahiro Nakano, Koji Egashira, Nobutaka Ono, Shigeki Sagayama
NPA1 = Paolo Nesi, Gianni Pantaleo, Fabrizio Argenti
NPA2 = Paolo Nesi, Gianni Pantaleo, Fabrizio Argenti
RS1 = S. A. Raczyński, S. Sagayama
RS2 = S. A. Raczyński, S. Sagayama
RS3 = S. A. Raczyński, S. Sagayama
RS4 = S. A. Raczyński, S. Sagayama
RS5 = S. A. Raczyński, S. Sagayama
RS6 = S. A. Raczyński, S. Sagayama
YR = Chunghsin Yeh, Axel Roebel

Info About Filenames

The filenames starting with part* come from the acoustic woodwind recording; the ones starting with RWC are synthesized. The piano files are: RA_C030_align.wav, bach_847TESTp.wav, beet_pathetique_3TESTp.wav, mz_333_1TESTp.wav, scn_4TESTp.wav.note, ty_januarTESTp.wav.note. The filenames starting with 01*, 03*, 07*, 09* come from the quartet recording.

Run Times

TBA

Task 3: Instrument Tracking

The same dataset was used as in Task 1. The evaluations were performed by first matching the detected contours one-to-one to the ground-truth contours, selecting the best-scoring pairs of detected and ground-truth contours. Any detected contour left unmatched to a ground-truth contour contributes all of its returned F0s to the false positives; any ground-truth contour left unmatched to a detected contour contributes all of its F0s to the false negatives.
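The one-to-one matching step can be sketched as a greedy pairing by descending score (an illustration under assumptions: the actual evaluation may use an optimal assignment rather than a greedy one, and `match_contours` is a hypothetical name):

```python
def match_contours(scores):
    """Greedy one-to-one pairing of detected and ground-truth contours.
    scores[(d, g)] is the match score of detected contour d against
    ground-truth contour g. Returns the selected (d, g) pairs; any
    contour left out of the pairing counts wholly as false positives
    (detected side) or false negatives (ground-truth side)."""
    pairs, used_d, used_g = [], set(), set()
    for (d, g), s in sorted(scores.items(), key=lambda kv: -kv[1]):
        if d not in used_d and g not in used_g:
            pairs.append((d, g))
            used_d.add(d)
            used_g.add(g)
    return pairs

# Detected contour 2 finds no free ground-truth partner here, so all
# of its F0s would be counted as false positives.
scores = {(0, 0): 0.9, (0, 1): 0.5, (1, 1): 0.8, (2, 0): 0.7}
```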


MF0It Detailed Results

Precision Recall Accuracy F-measure
DHP1 0.407 0.289 0.206 0.335


Detailed Chroma Results

Here, accuracy is assessed on chroma results (i.e., all F0s are mapped to a single octave before evaluation).

Precision Recall Accuracy F-measure
DHP1 0.446 0.309 0.225 0.361


Individual Results Files for Task 3

DHP1 = Zhiyao Duan, Jinyu Han, Bryan Pardo

Info About Filenames

The filenames starting with part* come from the acoustic woodwind recording; the ones starting with RWC are synthesized. The piano files are: RA_C030_align.wav, bach_847TESTp.wav, beet_pathetique_3TESTp.wav, mz_333_1TESTp.wav, scn_4TESTp.wav.note, ty_januarTESTp.wav.note. The filenames starting with 01*, 03*, 07*, 09* come from the quartet recording.

Friedman's Test for Significant Differences

Task 1: Multiple Fundamental Frequency Estimation (MF0E)

The Friedman test was run in MATLAB to test for significant differences among systems with respect to accuracy on individual files.
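The MATLAB run itself is not shown. As a minimal sketch of the Friedman statistic that underlies the comparison (systems are ranked within each file and the spread of mean ranks is tested; tie correction is omitted here for brevity):

```python
def friedman_statistic(scores):
    """Friedman chi-square over a matrix: rows = files, columns = systems.
    Minimal sketch: ranks each system within each file (1 = lowest score),
    then measures how far the mean ranks stray from the expected (k+1)/2.
    No tie correction, unlike a full statistical implementation."""
    n, k = len(scores), len(scores[0])
    rank_sums = [0.0] * k
    for row in scores:
        order = sorted(range(k), key=lambda j: row[j])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    mean_ranks = [rs / n for rs in rank_sums]
    chi2 = (12 * n / (k * (k + 1))) * sum(
        (mr - (k + 1) / 2) ** 2 for mr in mean_ranks)
    return chi2, mean_ranks

# If one system always wins and one always loses, the mean ranks
# separate fully and the statistic is large relative to chance.
chi2, ranks = friedman_statistic([[0.1, 0.5, 0.9]] * 4)
```

The Tukey-Kramer HSD comparisons in the table below then decide, pair by pair, whether the difference in mean ranks is significant (the "TRUE"/"FALSE" column).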

Tukey-Kramer HSD Multi-Comparison
TeamID TeamID Lowerbound Mean Upperbound Significance
YR2 YR1 -1.4847 1.1500 3.7847 FALSE
YR2 DHP2 -0.0597 2.5750 5.2097 FALSE
YR2 DHP1 0.3653 3.0000 5.6347 TRUE
YR2 ZL 1.7653 4.4000 7.0347 TRUE
YR2 NEOS2 1.8903 4.5250 7.1597 TRUE
YR2 NPA2 1.4903 4.1250 6.7597 TRUE
YR2 NPA1 2.9903 5.6250 8.2597 TRUE
YR2 NEOS1 4.5653 7.2000 9.8347 TRUE
YR2 RS2 5.6153 8.2500 10.8847 TRUE
YR2 RS1 6.2903 8.9250 11.5597 TRUE
YR2 BVB 7.8903 10.5250 13.1597 TRUE
YR1 DHP2 -1.2097 1.4250 4.0597 FALSE
YR1 DHP1 -0.7847 1.8500 4.4847 FALSE
YR1 ZL 0.6153 3.2500 5.8847 TRUE
YR1 NEOS2 0.7403 3.3750 6.0097 TRUE
YR1 NPA2 0.3403 2.9750 5.6097 TRUE
YR1 NPA1 1.8403 4.4750 7.1097 TRUE
YR1 NEOS1 3.4153 6.0500 8.6847 TRUE
YR1 RS2 4.4653 7.1000 9.7347 TRUE
YR1 RS1 5.1403 7.7750 10.4097 TRUE
YR1 BVB 6.7403 9.3750 12.0097 TRUE
DHP2 DHP1 -2.2097 0.4250 3.0597 FALSE
DHP2 ZL -0.8097 1.8250 4.4597 FALSE
DHP2 NEOS2 -0.6847 1.9500 4.5847 FALSE
DHP2 NPA2 -1.0847 1.5500 4.1847 FALSE
DHP2 NPA1 0.4153 3.0500 5.6847 TRUE
DHP2 NEOS1 1.9903 4.6250 7.2597 TRUE
DHP2 RS2 3.0403 5.6750 8.3097 TRUE
DHP2 RS1 3.7153 6.3500 8.9847 TRUE
DHP2 BVB 5.3153 7.9500 10.5847 TRUE
DHP1 ZL -1.2347 1.4000 4.0347 FALSE
DHP1 NEOS2 -1.1097 1.5250 4.1597 FALSE
DHP1 NPA2 -1.5097 1.1250 3.7597 FALSE
DHP1 NPA1 -0.0097 2.6250 5.2597 FALSE
DHP1 NEOS1 1.5653 4.2000 6.8347 TRUE
DHP1 RS2 2.6153 5.2500 7.8847 TRUE
DHP1 RS1 3.2903 5.9250 8.5597 TRUE
DHP1 BVB 4.8903 7.5250 10.1597 TRUE
ZL NEOS2 -2.5097 0.1250 2.7597 FALSE
ZL NPA2 -2.9097 -0.2750 2.3597 FALSE
ZL NPA1 -1.4097 1.2250 3.8597 FALSE
ZL NEOS1 0.1653 2.8000 5.4347 TRUE
ZL RS2 1.2153 3.8500 6.4847 TRUE
ZL RS1 1.8903 4.5250 7.1597 TRUE
ZL BVB 3.4903 6.1250 8.7597 TRUE
NEOS2 NPA2 -3.0347 -0.4000 2.2347 FALSE
NEOS2 NPA1 -1.5347 1.1000 3.7347 FALSE
NEOS2 NEOS1 0.0403 2.6750 5.3097 TRUE
NEOS2 RS2 1.0903 3.7250 6.3597 TRUE
NEOS2 RS1 1.7653 4.4000 7.0347 TRUE
NEOS2 BVB 3.3653 6.0000 8.6347 TRUE
NPA2 NPA1 -1.1347 1.5000 4.1347 FALSE
NPA2 NEOS1 0.4403 3.0750 5.7097 TRUE
NPA2 RS2 1.4903 4.1250 6.7597 TRUE
NPA2 RS1 2.1653 4.8000 7.4347 TRUE
NPA2 BVB 3.7653 6.4000 9.0347 TRUE
NPA1 NEOS1 -1.0597 1.5750 4.2097 FALSE
NPA1 RS2 -0.0097 2.6250 5.2597 FALSE
NPA1 RS1 0.6653 3.3000 5.9347 TRUE
NPA1 BVB 2.2653 4.9000 7.5347 TRUE
NEOS1 RS2 -1.5847 1.0500 3.6847 FALSE
NEOS1 RS1 -0.9097 1.7250 4.3597 FALSE
NEOS1 BVB 0.6903 3.3250 5.9597 TRUE
RS2 RS1 -1.9597 0.6750 3.3097 FALSE
RS2 BVB -0.3597 2.2750 4.9097 FALSE
RS1 BVB -1.0347 1.6000 4.2347 FALSE


https://music-ir.org/mirex/results/2009/mf0/est/summary/small.Accuracy_Per_Song_Friedman_Mean_Rankstask1.friedman.Friedman_Mean_Ranks.png

Task 2: Note Tracking

The Friedman test was run in MATLAB to test for significant differences among systems with respect to F-measure on individual files.

Tukey-Kramer HSD Multi-Comparison for Task2A
TeamID TeamID Lowerbound Mean Upperbound Significance
NEOS2 YR -2.3320 0.7941 3.9202 FALSE
NEOS2 NPA2 -1.4496 1.6765 4.8026 FALSE
NEOS2 DHP1 -0.3026 2.8235 5.9496 FALSE
NEOS2 RS3 1.8445 4.9706 8.0967 TRUE
NEOS2 RS1 1.4768 4.6029 7.7290 TRUE
NEOS2 RS4 2.2562 5.3824 8.5085 TRUE
NEOS2 RS6 2.2857 5.4118 8.5379 TRUE
NEOS2 RS2 2.7710 5.8971 9.0232 TRUE
NEOS2 RS5 2.5210 5.6471 8.7732 TRUE
NEOS2 NPA1 2.3592 5.4853 8.6114 TRUE
NEOS2 NEOS1 5.7857 8.9118 12.0379 TRUE
NEOS2 BVB 7.5945 10.7206 13.8467 TRUE
YR NPA2 -2.2438 0.8824 4.0085 FALSE
YR DHP1 -1.0967 2.0294 5.1555 FALSE
YR RS3 1.0504 4.1765 7.3026 TRUE
YR RS1 0.6827 3.8088 6.9349 TRUE
YR RS4 1.4621 4.5882 7.7143 TRUE
YR RS6 1.4915 4.6176 7.7438 TRUE
YR RS2 1.9768 5.1029 8.2290 TRUE
YR RS5 1.7268 4.8529 7.9790 TRUE
YR NPA1 1.5651 4.6912 7.8173 TRUE
YR NEOS1 4.9915 8.1176 11.2438 TRUE
YR BVB 6.8004 9.9265 13.0526 TRUE
NPA2 DHP1 -1.9790 1.1471 4.2732 FALSE
NPA2 RS3 0.1680 3.2941 6.4202 TRUE
NPA2 RS1 -0.1996 2.9265 6.0526 FALSE
NPA2 RS4 0.5798 3.7059 6.8320 TRUE
NPA2 RS6 0.6092 3.7353 6.8614 TRUE
NPA2 RS2 1.0945 4.2206 7.3467 TRUE
NPA2 RS5 0.8445 3.9706 7.0967 TRUE
NPA2 NPA1 0.6827 3.8088 6.9349 TRUE
NPA2 NEOS1 4.1092 7.2353 10.3614 TRUE
NPA2 BVB 5.9180 9.0441 12.1702 TRUE
DHP1 RS3 -0.9790 2.1471 5.2732 FALSE
DHP1 RS1 -1.3467 1.7794 4.9055 FALSE
DHP1 RS4 -0.5673 2.5588 5.6849 FALSE
DHP1 RS6 -0.5379 2.5882 5.7143 FALSE
DHP1 RS2 -0.0526 3.0735 6.1996 FALSE
DHP1 RS5 -0.3026 2.8235 5.9496 FALSE
DHP1 NPA1 -0.4643 2.6618 5.7879 FALSE
DHP1 NEOS1 2.9621 6.0882 9.2143 TRUE
DHP1 BVB 4.7710 7.8971 11.0232 TRUE
RS3 RS1 -3.4938 -0.3676 2.7585 FALSE
RS3 RS4 -2.7143 0.4118 3.5379 FALSE
RS3 RS6 -2.6849 0.4412 3.5673 FALSE
RS3 RS2 -2.1996 0.9265 4.0526 FALSE
RS3 RS5 -2.4496 0.6765 3.8026 FALSE
RS3 NPA1 -2.6114 0.5147 3.6408 FALSE
RS3 NEOS1 0.8151 3.9412 7.0673 TRUE
RS3 BVB 2.6239 5.7500 8.8761 TRUE
RS1 RS4 -2.3467 0.7794 3.9055 FALSE
RS1 RS6 -2.3173 0.8088 3.9349 FALSE
RS1 RS2 -1.8320 1.2941 4.4202 FALSE
RS1 RS5 -2.0820 1.0441 4.1702 FALSE
RS1 NPA1 -2.2438 0.8824 4.0085 FALSE
RS1 NEOS1 1.1827 4.3088 7.4349 TRUE
RS1 BVB 2.9915 6.1176 9.2438 TRUE
RS4 RS6 -3.0967 0.0294 3.1555 FALSE
RS4 RS2 -2.6114 0.5147 3.6408 FALSE
RS4 RS5 -2.8614 0.2647 3.3908 FALSE
RS4 NPA1 -3.0232 0.1029 3.2290 FALSE
RS4 NEOS1 0.4033 3.5294 6.6555 TRUE
RS4 BVB 2.2121 5.3382 8.4643 TRUE
RS6 RS2 -2.6408 0.4853 3.6114 FALSE
RS6 RS5 -2.8908 0.2353 3.3614 FALSE
RS6 NPA1 -3.0526 0.0735 3.1996 FALSE
RS6 NEOS1 0.3739 3.5000 6.6261 TRUE
RS6 BVB 2.1827 5.3088 8.4349 TRUE
RS2 RS5 -3.3761 -0.2500 2.8761 FALSE
RS2 NPA1 -3.5379 -0.4118 2.7143 FALSE
RS2 NEOS1 -0.1114 3.0147 6.1408 FALSE
RS2 BVB 1.6974 4.8235 7.9496 TRUE
RS5 NPA1 -3.2879 -0.1618 2.9643 FALSE
RS5 NEOS1 0.1386 3.2647 6.3908 TRUE
RS5 BVB 1.9474 5.0735 8.1996 TRUE
NPA1 NEOS1 0.3004 3.4265 6.5526 TRUE
NPA1 BVB 2.1092 5.2353 8.3614 TRUE
NEOS1 BVB -1.3173 1.8088 4.9349 FALSE


https://music-ir.org/mirex/results/2009/mf0/nt/summary/small.Accuracy_Per_Song_Friedman_Mean_Rankstask2.friedman.Friedman_Mean_Ranks.png

Tukey-Kramer HSD Multi-Comparison for Task2B (Piano)
TeamID TeamID Lowerbound Mean Upperbound Significance
NPA2 YR -6.5930 0.8333 8.2597 FALSE
NPA2 DHP1 -6.0930 1.3333 8.7597 FALSE
NPA2 NEOS2 -6.5930 0.8333 8.2597 FALSE
NPA2 NPA1 -5.8430 1.5833 9.0097 FALSE
NPA2 RS1 -4.0930 3.3333 10.7597 FALSE
NPA2 RS6 -4.3430 3.0833 10.5097 FALSE
NPA2 RS3 -2.7597 4.6667 12.0930 FALSE
NPA2 RS4 -3.4263 4.0000 11.4263 FALSE
NPA2 RS2 -2.9263 4.5000 11.9263 FALSE
NPA2 RS5 -2.8430 4.5833 12.0097 FALSE
NPA2 NEOS1 0.0737 7.5000 14.9263 TRUE
NPA2 BVB 1.8237 9.2500 16.6763 TRUE
YR DHP1 -6.9263 0.5000 7.9263 FALSE
YR NEOS2 -7.4263 0.0000 7.4263 FALSE
YR NPA1 -6.6763 0.7500 8.1763 FALSE
YR RS1 -4.9263 2.5000 9.9263 FALSE
YR RS6 -5.1763 2.2500 9.6763 FALSE
YR RS3 -3.5930 3.8333 11.2597 FALSE
YR RS4 -4.2597 3.1667 10.5930 FALSE
YR RS2 -3.7597 3.6667 11.0930 FALSE
YR RS5 -3.6763 3.7500 11.1763 FALSE
YR NEOS1 -0.7597 6.6667 14.0930 FALSE
YR BVB 0.9903 8.4167 15.8430 TRUE
DHP1 NEOS2 -7.9263 -0.5000 6.9263 FALSE
DHP1 NPA1 -7.1763 0.2500 7.6763 FALSE
DHP1 RS1 -5.4263 2.0000 9.4263 FALSE
DHP1 RS6 -5.6763 1.7500 9.1763 FALSE
DHP1 RS3 -4.0930 3.3333 10.7597 FALSE
DHP1 RS4 -4.7597 2.6667 10.0930 FALSE
DHP1 RS2 -4.2597 3.1667 10.5930 FALSE
DHP1 RS5 -4.1763 3.2500 10.6763 FALSE
DHP1 NEOS1 -1.2597 6.1667 13.5930 FALSE
DHP1 BVB 0.4903 7.9167 15.3430 TRUE
NEOS2 NPA1 -6.6763 0.7500 8.1763 FALSE
NEOS2 RS1 -4.9263 2.5000 9.9263 FALSE
NEOS2 RS6 -5.1763 2.2500 9.6763 FALSE
NEOS2 RS3 -3.5930 3.8333 11.2597 FALSE
NEOS2 RS4 -4.2597 3.1667 10.5930 FALSE
NEOS2 RS2 -3.7597 3.6667 11.0930 FALSE
NEOS2 RS5 -3.6763 3.7500 11.1763 FALSE
NEOS2 NEOS1 -0.7597 6.6667 14.0930 FALSE
NEOS2 BVB 0.9903 8.4167 15.8430 TRUE
NPA1 RS1 -5.6763 1.7500 9.1763 FALSE
NPA1 RS6 -5.9263 1.5000 8.9263 FALSE
NPA1 RS3 -4.3430 3.0833 10.5097 FALSE
NPA1 RS4 -5.0097 2.4167 9.8430 FALSE
NPA1 RS2 -4.5097 2.9167 10.3430 FALSE
NPA1 RS5 -4.4263 3.0000 10.4263 FALSE
NPA1 NEOS1 -1.5097 5.9167 13.3430 FALSE
NPA1 BVB 0.2403 7.6667 15.0930 TRUE
RS1 RS6 -7.6763 -0.2500 7.1763 FALSE
RS1 RS3 -6.0930 1.3333 8.7597 FALSE
RS1 RS4 -6.7597 0.6667 8.0930 FALSE
RS1 RS2 -6.2597 1.1667 8.5930 FALSE
RS1 RS5 -6.1763 1.2500 8.6763 FALSE
RS1 NEOS1 -3.2597 4.1667 11.5930 FALSE
RS1 BVB -1.5097 5.9167 13.3430 FALSE
RS6 RS3 -5.8430 1.5833 9.0097 FALSE
RS6 RS4 -6.5097 0.9167 8.3430 FALSE
RS6 RS2 -6.0097 1.4167 8.8430 FALSE
RS6 RS5 -5.9263 1.5000 8.9263 FALSE
RS6 NEOS1 -3.0097 4.4167 11.8430 FALSE
RS6 BVB -1.2597 6.1667 13.5930 FALSE
RS3 RS4 -8.0930 -0.6667 6.7597 FALSE
RS3 RS2 -7.5930 -0.1667 7.2597 FALSE
RS3 RS5 -7.5097 -0.0833 7.3430 FALSE
RS3 NEOS1 -4.5930 2.8333 10.2597 FALSE
RS3 BVB -2.8430 4.5833 12.0097 FALSE
RS4 RS2 -6.9263 0.5000 7.9263 FALSE
RS4 RS5 -6.8430 0.5833 8.0097 FALSE
RS4 NEOS1 -3.9263 3.5000 10.9263 FALSE
RS4 BVB -2.1763 5.2500 12.6763 FALSE
RS2 RS5 -7.3430 0.0833 7.5097 FALSE
RS2 NEOS1 -4.4263 3.0000 10.4263 FALSE
RS2 BVB -2.6763 4.7500 12.1763 FALSE
RS5 NEOS1 -4.5097 2.9167 10.3430 FALSE
RS5 BVB -2.7597 4.6667 12.0930 FALSE
NEOS1 BVB -5.6763 1.7500 9.1763 FALSE


https://www.music-ir.org/mirex/results/2009/mf0/nt/summary_piano_subtask/small.Accuracy_Per_Song_Friedman_Mean_Rankstask2_piano.friedman.Friedman_Mean_Ranks.png