2017:Audio Chord Estimation Results
Latest revision as of 04:18, 14 October 2020
Introduction
This page contains the results of the 2017 edition of the MIREX automatic chord estimation tasks. This edition was the fifth since the reorganization of the evaluation procedure in 2013, so the results can be compared directly with those of the previous four years. Chord labels are evaluated according to five different chord vocabularies, and the segmentation is assessed as well. Additional information about the measures used can be found on the page of the 2013 edition.
What’s new?
- This year the algorithms have additionally been evaluated on the "RWC-Popular" and "USPOP2002Chords" datasets, annotated at the Music and Audio Research Lab of NYU, whose annotations are publicly available. The RWC-Popular dataset contains 100 pop songs recorded specifically for music information retrieval research. The USPOP2002Chords set is the 195-file subset of the USPOP2002 dataset that has been annotated with chord sequences.
Software
All software used for the evaluation has been made open-source. The evaluation framework is described by Pauwels and Peeters (2013). The corresponding binaries and code repository can be found on GitHub, and the measures used are available as presets. The raw algorithmic output provided below makes it possible to calculate the additional measures from the paper (separate results for tetrads, etc.) on top of those presented here. More help can be found in the readme.
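As an illustration of the core idea behind the label-matching measures (a minimal sketch, not the MusOOEvaluator implementation, which additionally maps chords into the chosen vocabulary before comparing), a duration-weighted chord symbol recall can be computed as:

```python
def csr(ref, est):
    """Duration-weighted chord symbol recall, scaled to 0-100 like
    the figures below: the fraction of the reference duration on
    which the estimated label matches.
    ref, est: lists of (start, end, label) segments."""
    # Cut the timeline at the union of all segment boundaries, so
    # every resulting slice carries a single label on each side.
    boundaries = sorted({t for s, e, _ in ref + est for t in (s, e)})

    def label_at(segments, t):
        for s, e, lab in segments:
            if s <= t < e:
                return lab
        return None  # t falls outside the annotated timeline

    correct = total = 0.0
    for s, e in zip(boundaries, boundaries[1:]):
        mid = (s + e) / 2
        if label_at(ref, mid) is not None:  # only score annotated time
            total += e - s
            if label_at(est, mid) == label_at(ref, mid):
                correct += e - s
    return 100.0 * correct / total
```

For example, an estimate that mislabels one out of four annotated seconds scores 75.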
The statistical comparison between the different submissions is explained in Burgoyne et al. (2014). The software is available on Bitbucket and uses the detailed results provided below as input.
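As a generic illustration of how per-song results feed such a comparison (this is only a first, rank-based step and not the specific statistical model of Burgoyne et al., which the linked software implements; ties are ignored for simplicity), per-song rankings can be aggregated as:

```python
def mean_ranks(scores):
    """scores: dict mapping algorithm name -> list of per-song scores,
    with lists aligned by song. Returns each algorithm's mean rank
    across songs (1 = best)."""
    algos = list(scores)
    n_songs = len(next(iter(scores.values())))
    totals = {a: 0.0 for a in algos}
    for i in range(n_songs):
        # Rank algorithms on this song, highest score first.
        ordered = sorted(algos, key=lambda a: -scores[a][i])
        for rank, a in enumerate(ordered, start=1):
            totals[a] += rank
    return {a: totals[a] / n_songs for a in algos}
```

Aggregating ranks rather than raw scores is what makes the comparison robust to songs that are uniformly hard or easy for all submissions.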
Submissions
Submission | Contributors
---|---
CM2 | Chris Cannam, Matthias Mauch
JLW1, JLW2 | Junyan Jiang, Wei Li, Yiming Wu
KBK1, KBK2 | Filip Korzeniowski, Sebastian Böck, Florian Krebs
WL1 | Yiming Wu, Wei Li
Results
Summary
All figures can be interpreted as percentages and range from 0 (worst) to 100 (best).
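The three segmentation columns (MeanSeg, UnderSeg, OverSeg) are based on the directional Hamming divergence between the reference and estimated segmentations; the scores are of the form 100·(1 − divergence), computed in opposite directions for under- and over-segmentation. A minimal sketch of the divergence itself (the exact direction naming and per-song combination follow Pauwels and Peeters (2013)):

```python
def directional_hamming(a, b):
    """Directional Hamming divergence from segmentation a to b:
    for each segment of a, the part of its duration not covered by
    its single best-overlapping segment of b, summed over segments
    and normalised by the total annotated time.
    a, b: lists of (start, end) tuples covering the same timeline."""
    total = a[-1][1] - a[0][0]
    miss = 0.0
    for s, e in a:
        best = max(max(0.0, min(e, e2) - max(s, s2)) for s2, e2 in b)
        miss += (e - s) - best
    return miss / total
```

Intuitively, a single giant segment is never "missed" by finer segmentations in one direction but heavily penalised in the other, which is why both directions are reported.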
Isophonics2009
Algorithm | MirexRoot | MirexMajMin | MirexMajMinBass | MirexSevenths | MirexSeventhsBass | MeanSeg | UnderSeg | OverSeg |
---|---|---|---|---|---|---|---|---|
CM2 | 78.66 | 75.51 | 72.58 | 54.78 | 52.36 | 85.87 | 87.22 | 85.98 |
JLW1 | 81.45 | 78.97 | 77.18 | 67.64 | 66.11 | 86.35 | 83.73 | 91.08 |
JLW2 | 80.93 | 78.49 | 76.71 | 67.22 | 65.70 | 85.59 | 82.22 | 91.33 |
KBK1 | 83.62 | 82.24 | 79.27 | 72.06 | 69.35 | 87.11 | 83.82 | 92.48 |
KBK2 | 87.38 | 86.80 | 83.43 | 75.56 | 72.60 | 89.29 | 87.24 | 92.35 |
WL1 | 81.38 | 79.70 | 74.33 | 69.01 | 64.11 | 84.03 | 79.19 | 91.90 |
Billboard2012
Algorithm | MirexRoot | MirexMajMin | MirexMajMinBass | MirexSevenths | MirexSeventhsBass | MeanSeg | UnderSeg | OverSeg |
---|---|---|---|---|---|---|---|---|
CM2 | 74.23 | 72.31 | 70.25 | 55.44 | 53.48 | 83.60 | 85.33 | 83.31 |
JLW1 | 78.00 | 76.42 | 75.08 | 62.06 | 60.93 | 84.23 | 82.33 | 87.86 |
JLW2 | 78.66 | 77.14 | 75.79 | 62.49 | 61.32 | 84.52 | 81.88 | 89.22 |
KBK1 | 79.81 | 79.19 | 76.71 | 56.90 | 54.84 | 85.78 | 82.61 | 90.72 |
KBK2 | 86.30 | 86.02 | 83.12 | 61.12 | 58.75 | 88.38 | 86.78 | 90.97 |
WL1 | 79.49 | 78.54 | 74.67 | 62.54 | 59.57 | 83.61 | 79.64 | 89.80 |
Billboard2013
Algorithm | MirexRoot | MirexMajMin | MirexMajMinBass | MirexSevenths | MirexSeventhsBass | MeanSeg | UnderSeg | OverSeg |
---|---|---|---|---|---|---|---|---|
CM2 | 71.23 | 67.36 | 65.28 | 49.07 | 47.25 | 81.50 | 83.13 | 82.53 |
JLW1 | 75.00 | 72.30 | 70.93 | 57.11 | 55.87 | 81.28 | 79.15 | 87.35 |
JLW2 | 75.40 | 72.77 | 71.41 | 57.49 | 56.25 | 81.58 | 78.57 | 89.02 |
KBK1 | 75.17 | 72.31 | 69.88 | 52.27 | 50.13 | 81.25 | 77.78 | 89.40 |
KBK2 | 80.60 | 78.37 | 75.85 | 55.81 | 53.57 | 83.59 | 83.01 | 87.54 |
WL1 | 75.22 | 72.53 | 69.06 | 57.87 | 55.14 | 79.69 | 75.54 | 88.89 |
JayChou29
Algorithm | MirexRoot | MirexMajMin | MirexMajMinBass | MirexSevenths | MirexSeventhsBass | MeanSeg | UnderSeg | OverSeg |
---|---|---|---|---|---|---|---|---|
CM2 | 72.82 | 72.15 | 65.55 | 54.46 | 49.05 | 86.58 | 86.94 | 86.84 |
JLW1 | 76.26 | 75.85 | 72.47 | 58.10 | 55.42 | 89.71 | 88.79 | 90.95 |
JLW2 | 76.06 | 75.70 | 72.36 | 57.67 | 55.08 | 89.72 | 88.23 | 91.56 |
KBK1 | 78.11 | 77.41 | 66.03 | 50.86 | 41.66 | 86.93 | 83.46 | 91.47 |
KBK2 | 81.42 | 80.48 | 69.66 | 51.77 | 43.18 | 89.08 | 87.67 | 90.87 |
WL1 | 83.04 | 82.34 | 78.57 | 62.02 | 58.60 | 89.01 | 86.38 | 91.93 |
RobbieWilliams
Algorithm | MirexRoot | MirexMajMin | MirexMajMinBass | MirexSevenths | MirexSeventhsBass | MeanSeg | UnderSeg | OverSeg |
---|---|---|---|---|---|---|---|---|
CM2 | 81.94 | 78.29 | 76.09 | 57.97 | 55.94 | 87.95 | 89.00 | 87.39 |
JLW1 | 83.72 | 80.48 | 78.81 | 70.08 | 68.40 | 88.80 | 87.26 | 91.12 |
JLW2 | 84.00 | 80.68 | 79.04 | 70.07 | 68.41 | 89.01 | 86.97 | 91.91 |
KBK1 | 84.36 | 81.98 | 79.38 | 77.98 | 75.66 | 88.33 | 85.61 | 92.14 |
KBK2 | 89.38 | 88.12 | 85.06 | 83.38 | 80.68 | 90.96 | 89.52 | 92.84 |
WL1 | 83.34 | 80.65 | 77.55 | 71.23 | 68.38 | 86.38 | 82.19 | 92.14 |
RWC-Popular
Algorithm | MirexRoot | MirexMajMin | MirexMajMinBass | MirexSevenths | MirexSeventhsBass | MeanSeg | UnderSeg | OverSeg |
---|---|---|---|---|---|---|---|---|
CM2 | 79.12 | 77.95 | 74.30 | 63.33 | 59.89 | 88.62 | 88.11 | 89.67 |
JLW1 | 81.84 | 81.40 | 79.12 | 69.76 | 67.50 | 89.20 | 86.65 | 92.75 |
JLW2 | 82.22 | 81.76 | 79.44 | 70.06 | 67.79 | 89.56 | 86.54 | 93.72 |
KBK1 | 81.32 | 80.48 | 74.79 | 55.46 | 50.10 | 87.85 | 84.67 | 91.84 |
KBK2 | 87.20 | 86.58 | 80.92 | 58.04 | 52.78 | 89.94 | 87.94 | 92.37 |
WL1 | 84.05 | 82.87 | 79.08 | 70.12 | 66.91 | 87.48 | 83.54 | 92.51 |
USPOP2002Chords
Algorithm | MirexRoot | MirexMajMin | MirexMajMinBass | MirexSevenths | MirexSeventhsBass | MeanSeg | UnderSeg | OverSeg |
---|---|---|---|---|---|---|---|---|
CM2 | 78.41 | 76.24 | 72.74 | 59.84 | 56.67 | 85.81 | 86.81 | 86.02 |
JLW1 | 80.85 | 79.33 | 77.34 | 67.30 | 65.54 | 86.53 | 84.50 | 89.97 |
JLW2 | 81.47 | 80.06 | 78.12 | 68.00 | 66.29 | 87.11 | 84.48 | 91.32 |
KBK1 | 79.56 | 77.19 | 72.47 | 62.52 | 58.30 | 85.49 | 82.25 | 90.52 |
KBK2 | 82.34 | 80.72 | 76.04 | 64.50 | 60.30 | 86.59 | 85.49 | 89.17 |
WL1 | 81.97 | 81.20 | 77.30 | 68.91 | 65.48 | 85.49 | 81.41 | 91.70 |
Detailed Results
More details about the performance of the algorithms, including per-song performance and supplementary statistics, are available from this repository.
Algorithmic Output
The raw output of the algorithms is available in this repository. It can be used to experiment with alternative evaluation measures and statistics.
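The output files follow the common MIREX convention of one `start end label` line per segment (an assumption here; check the repository itself for the exact layout). A minimal parser for such files:

```python
def read_lab(path):
    """Parse a chord annotation in the 'start end label' format
    (assumed layout) into a list of (start, end, label) segments."""
    segments = []
    with open(path) as f:
        for line in f:
            if line.strip():  # skip blank lines
                start, end, label = line.split(None, 2)
                segments.append((float(start), float(end), label.strip()))
    return segments
```

The resulting segment lists are directly usable as input to alternative evaluation measures.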