2005:Audio Artist Identification Results

==Introduction==
These are the results for the 2005 running of the Audio Artist Identification task.
===Goal===
To identify the performing artist from music audio (in PCM format).
===Datasets===
Two datasets were used: Magnatune and USPOP. The audio was sampled at either 44.1 kHz or 22.05 kHz (mono). Further details are given in the following table.
{| border="1" cellspacing="0"
|- style="background: yellow;"
! Dataset !! Size (@ 44.1 kHz) !! Number of Training Files !! Number of Testing Files
|-
| Magnatune || 35.2 GB || 1158 || 642
|-
| USPOP || 37.3 GB || 1158 || 653
|}
==Results==
===Overall===
{| border="1" cellspacing="0"
|- style="background: yellow; text-align: center;"
! colspan="3" | OVERALL
|- style="background: yellow;"
! Rank !! Participant !! Mean of Magnatune Raw Classification Accuracy <br> and USPOP Raw Classification Accuracy
|-
| 1 || [https://www.music-ir.org/mirex/abstracts/2005/mandel.pdf Mandel & Ellis] || 72.45%
|-
| 2 || Bergstra, Casagrande, & Eck (1) || 68.57%
|-
| 3 || Bergstra, Casagrande, & Eck (2) || 66.71%
|-
| 4 || Pampalk, E. || 61.28%
|-
| 5 || West & Lamere || 47.24%
|-
| 6 || Tzanetakis, G. || 42.05%
|-
| 7 || Logan, B. || 25.95%
|}
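The overall figure for each participant is the arithmetic mean of that participant's raw classification accuracies on the two datasets. For example, for Mandel & Ellis:

<math>\frac{76.60\% + 68.30\%}{2} = 72.45\%</math>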
===Magnatune Dataset===
{| border="1" cellspacing="0"
|- style="background: yellow; text-align: center;"
! colspan="7" | Magnatune Dataset
|- style="background: yellow;"
! Rank !! Participant !! Raw Classification Accuracy !! Normalized Raw Classification Accuracy !! Runtime (s) !! Machine !! Confusion Matrix Files
|-
| 1 || Bergstra, Casagrande, & Eck (1) || 77.26% || 79.64% || 24 hours || B0 || [https://www.music-ir.org/mirex/results/2005/audio-artist/BCE_1_MTeval.txt BCE_1_MTeval.txt]
|-
| 2 || Mandel & Ellis || 76.60% || 76.62% || 11073 || R || ME_MTeval.txt
|-
| 3 || Bergstra, Casagrande, & Eck (2) || 74.45% || 74.51% || -- || -- || BCE_2_MTeval.txt
|-
| 4 || Pampalk, E. || 66.36% || 66.48% || 4272 || B1 || P_MTeval.txt
|-
| 5 || Tzanetakis, G. || 55.45% || 55.59% || 2632 || B0 || T_MTeval.txt
|-
| 6 || West & Lamere || 53.43% || 53.48% || 27480 || B3 || WL_MTeval.txt
|-
| 7 || Logan, B. || 37.07% || 37.10% || N/A || B3 || L_MTeval.txt
|-
| 8 || Lidy & Rauber (SSD+RH) || TO * || -- || -- || -- || --
|-
| 8 || Lidy & Rauber (RP+SSD) || TO * || -- || -- || -- || --
|-
| 8 || Lidy & Rauber (RP+SSD+RH) || TO * || -- || -- || -- || --
|}
===USPOP Dataset===
{| border="1" cellspacing="0"
|- style="background: yellow; text-align: center;"
! colspan="7" | USPOP Dataset
|- style="background: yellow;"
! Rank !! Participant !! Raw Classification Accuracy !! Normalized Raw Classification Accuracy !! Runtime (s) !! Machine !! Confusion Matrix Files
|-
| 1 || Mandel & Ellis || 68.30% || 67.96% || 10240 || R || [https://www.music-ir.org/mirex/results/2005/audio-artist/ME_USeval.txt ME_USeval.txt]
|-
| 2 || Bergstra, Casagrande, & Eck (1) || 59.88% || 60.90% || 24 hours || B0 || ME_USeval.txt
|-
| 3 || Bergstra, Casagrande, & Eck (2) || 58.96% || 58.96% || -- || -- || BCE_2_USeval.txt
|-
| 4 || Pampalk, E. || 56.20% || 56.03% || 4321 || B1 || P_USeval.txt
|-
| 5 || West & Lamere || 41.04% || 41.00% || 26871 || B3 || WL_USeval.txt
|-
| 6 || Tzanetakis, G. || 28.64% || 28.48% || 2443 || B0 || T_USeval.txt
|-
| 7 || Logan, B. || 14.83% || 14.76% || N/A || B3 || L_USeval.txt
|-
| 8 || Lidy & Rauber (SSD+RH) || TO * || -- || -- || -- || --
|-
| 8 || Lidy & Rauber (RP+SSD) || TO * || -- || -- || -- || --
|-
| 8 || Lidy & Rauber (RP+SSD+RH) || TO * || -- || -- || -- || --
|}
Note:
* DNC: did not complete (error in execution).
* TO: timed out (did not complete within 24 hours).
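The exact layout of the confusion matrix files linked above, and the precise definition of the normalized measure used in these tables, are not spelled out on this page. The following is a minimal sketch of the two accuracy figures, assuming the confusion matrix is a square array of counts with rows as the true artist and columns as the predicted artist, and assuming the normalized figure is the mean of per-artist recall; the function name and the toy matrix are illustrative only, not MIREX data.

<pre>
import numpy as np

def raw_and_normalized_accuracy(confusion):
    """Compute raw and per-artist-normalized accuracy from a confusion matrix.

    confusion[i, j] is assumed to be the number of test tracks whose true
    artist is i and whose predicted artist is j (rows = truth, columns = prediction).
    """
    confusion = np.asarray(confusion, dtype=float)
    # Raw accuracy: correctly classified tracks over all tracks.
    raw = np.trace(confusion) / confusion.sum()
    # Normalized accuracy (assumed definition): mean of per-artist recall,
    # so every artist contributes equally regardless of test-track count.
    per_artist_recall = np.diag(confusion) / confusion.sum(axis=1)
    normalized = per_artist_recall.mean()
    return raw, normalized

# Toy example: 3 artists with unbalanced numbers of test tracks.
toy = np.array([[8, 1, 1],
                [2, 6, 2],
                [0, 1, 4]])
print(raw_and_normalized_accuracy(toy))  # -> (0.72, ~0.733)
</pre>

Under these assumptions the raw figure weights every test track equally, while the normalized figure weights every artist equally, which is why the two columns can differ when artists have unequal numbers of test tracks.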