2016:Audio Downbeat Estimation Results
Submitted Algorithms
Submission code | Submission name | Abstract | Contributors |
---|---|---|---|
DBDR1 | DB1_no_beatles | | Simon Durand, Juan Bello, Bertrand David, Gael Richard |
DBDR2 | DB2_no_ballroom | | Simon Durand, Juan Bello, Bertrand David, Gael Richard |
KB1 | beats_2013 | | Florian Krebs, Sebastian Böck |
KB2 | beats_2015 | | Florian Krebs, Sebastian Böck |
BK4 | joint_tracker | | Sebastian Böck, Florian Krebs |
DSR1 | downbeater | | Matthew Davies, Adam Stark, Andrew Robertson |
CD4 | qm-barbeattracker | [PDF](https://www.music-ir.org/mirex/abstracts/2016/CD4.pdf) | Matthew Davies, Chris Cannam |
Results
Algorithm | F-Measure | Precision | Recall |
---|---|---|---|
DBDR1* | 0.838 | 0.874 | 0.846 |
DBDR2 | 0.783 | 0.808 | 0.804 |
BK4* | 0.908 | 0.906 | 0.917 |
CD4 | 0.412 | 0.416 | 0.419 |
DSR1 | 0.463 | 0.476 | 0.468 |
KB1* | 0.898 | 0.888 | 0.917 |
KB2* | 0.860 | 0.853 | 0.890 |
Algorithm | F-Measure | Precision | Recall |
---|---|---|---|
DBDR1 | 0.849 | 0.861 | 0.868 |
DBDR2* | 0.872 | 0.861 | 0.909 |
BK4* | 0.865 | 0.872 | 0.876 |
CD4 | 0.604 | 0.586 | 0.642 |
DSR1 | 0.665 | 0.646 | 0.708 |
KB1 | 0.803 | 0.776 | 0.859 |
KB2* | 0.818 | 0.799 | 0.870 |
Algorithm | F-Measure | Precision | Recall |
---|---|---|---|
DBDR1 | 0.201 | 0.199 | 0.240 |
DBDR2 | 0.231 | 0.194 | 0.330 |
BK4* | 0.369 | 0.290 | 0.566 |
CD4 | 0.186 | 0.154 | 0.258 |
DSR1 | 0.184 | 0.155 | 0.251 |
KB1 | 0.269 | 0.221 | 0.380 |
KB2* | 0.330 | 0.263 | 0.487 |
Algorithm | F-Measure | Precision | Recall |
---|---|---|---|
DBDR1 | 0.306 | 0.292 | 0.379 |
DBDR2 | 0.415 | 0.360 | 0.554 |
BK4* | 0.537 | 0.468 | 0.729 |
CD4 | 0.218 | 0.186 | 0.291 |
DSR1 | 0.317 | 0.281 | 0.411 |
KB1 | 0.352 | 0.301 | 0.498 |
KB2* | 0.336 | 0.269 | 0.513 |
Algorithm | F-Measure | Precision | Recall |
---|---|---|---|
DBDR1 | 0.426 | 0.715 | 0.308 |
DBDR2 | 0.418 | 0.637 | 0.311 |
BK4* | 0.635 | 0.951 | 0.476 |
CD4 | 0.250 | 0.377 | 0.188 |
DSR1 | 0.265 | 0.398 | 0.199 |
KB1 | 0.433 | 0.641 | 0.328 |
KB2* | 0.443 | 0.661 | 0.334 |
Algorithm | F-Measure | Precision | Recall |
---|---|---|---|
DBDR1 | 0.578 | 0.613 | 0.561 |
DBDR2 | 0.629 | 0.628 | 0.638 |
BK4* | 0.970 | 0.970 | 0.970 |
CD4 | 0.334 | 0.341 | 0.329 |
DSR1 | 0.208 | 0.232 | 0.196 |
KB1 | 0.690 | 0.693 | 0.688 |
KB2* | 0.851 | 0.854 | 0.848 |
Algorithm | F-Measure | Precision | Recall |
---|---|---|---|
DBDR1 | 0.527 | 0.570 | 0.529 |
DBDR2 | 0.532 | 0.539 | 0.574 |
BK4* | 0.599 | 0.659 | 0.598 |
CD4 | 0.174 | 0.189 | 0.185 |
DSR1 | 0.251 | 0.260 | 0.279 |
KB1 | 0.436 | 0.475 | 0.447 |
KB2* | 0.428 | 0.459 | 0.444 |
Algorithm | F-Measure | Precision | Recall |
---|---|---|---|
DBDR1 | 0.615 | 0.651 | 0.631 |
DBDR2 | 0.619 | 0.628 | 0.666 |
BK4 | 0.638 | 0.636 | 0.669 |
CD4 | 0.460 | 0.461 | 0.482 |
DSR1 | 0.397 | 0.397 | 0.423 |
KB1 | 0.630 | 0.647 | 0.634 |
KB2 | 0.647 | 0.665 | 0.653 |
*) Rows marked with an asterisk should be treated with caution: in those cases the training and test sets overlapped, which may lead to overestimated metrics.
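The F-measure, precision, and recall figures above come from matching estimated downbeat times against reference annotations within a small tolerance window. The sketch below is a minimal, unofficial illustration of that kind of computation; the ±70 ms window and the greedy one-to-one matching are assumptions (common choices in beat- and downbeat-tracking evaluation), not necessarily the exact procedure used by the MIREX evaluation code.

```python
"""Minimal sketch of tolerance-window precision/recall/F-measure for
downbeat estimates. NOT the official MIREX evaluation; the 70 ms tolerance
and the greedy matching are assumptions made for illustration only."""


def downbeat_prf(estimated, reference, tolerance=0.07):
    """Match each reference downbeat to at most one estimate within
    `tolerance` seconds and return (f_measure, precision, recall)."""
    est = sorted(estimated)
    ref = sorted(reference)
    matched = set()  # indices of estimates already paired with a reference
    hits = 0
    for r in ref:
        # pick the closest still-unmatched estimate inside the window
        best, best_dist = None, tolerance
        for i, e in enumerate(est):
            if i in matched:
                continue
            dist = abs(e - r)
            if dist <= best_dist:
                best, best_dist = i, dist
        if best is not None:
            matched.add(best)
            hits += 1
    precision = hits / len(est) if est else 0.0
    recall = hits / len(ref) if ref else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall > 0 else 0.0)
    return f_measure, precision, recall


if __name__ == "__main__":
    # toy example: three reference downbeats, third estimate 100 ms off
    ref = [0.50, 2.50, 4.50]
    est = [0.52, 2.48, 4.60]
    print(downbeat_prf(est, ref))
```

In the toy example the third estimate falls outside the assumed window, so precision and recall are both 2/3 and the F-measure is about 0.67, illustrating how a single missed downbeat affects all three columns.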
Runtime
All submissions finished their computations in less than 24 hours.