2007:Audio Classical Composer Identification Results
Introduction
These are the results for the inaugural running of the Audio Classical Composer Identification task at MIREX 2007. For background information about the task, please refer to the 2007:Audio Classical Composer Identification page.
The data set consisted of 2772 thirty-second audio clips. The composers represented were:
- Bach
- Beethoven
- Brahms
- Chopin
- Dvorak
- Handel
- Haydn
- Mendelssohn
- Mozart
- Schubert
- Vivaldi
The goal was to correctly identify the composer of each clip.
General Legend
Team ID
ME = Michael I. Mandel, Daniel P. W. Ellis (abstract: https://www.music-ir.org/mirex/abstracts/2007/AI_CC_GC_MC_AS_mandel.pdf)
TL = Thomas Lidy, Andreas Rauber, Antonio Pertusa, José Manuel Iñesta (abstract: https://www.music-ir.org/mirex/abstracts/2007/AI_CC_GC_MC_AS_lidy.pdf)
GT = George Tzanetakis (abstract: https://www.music-ir.org/mirex/abstracts/2007/AI_CC_GC_MC_AS_tzanetakis.pdf)
KL = Kyogu Lee
IM = IMIRSEL M2K
Overall Summary Results
MIREX 2007 Audio Classical Composer Classification Summary Results - Raw Classification Accuracy Averaged Over Three Train/Test Folds
Participant | Avg. Raw Classification Accuracy
--- | ---
IM_knn | 48.38%
IM_svm | 53.72%
KL | 19.70%
TL | 47.26%
ME | 47.84%
ME_spec | 52.02%
GT | 44.59%
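The summary figure for each system is the raw classification accuracy (the fraction of clips assigned to the correct composer) computed on each of the three test folds and then averaged. Below is a minimal sketch of that computation; it is not the actual MIREX evaluation code, and the per-fold labels are made up for illustration.

```python
# A minimal sketch, not the MIREX evaluation code: raw classification accuracy
# is the fraction of clips whose predicted composer matches the ground truth,
# and the summary figure is the mean of that fraction over the three folds.

def raw_accuracy(ground_truth, predictions):
    """Fraction of clips labelled with the correct composer."""
    correct = sum(1 for truth, pred in zip(ground_truth, predictions) if truth == pred)
    return correct / len(ground_truth)

# Made-up per-fold labels; the real task used 2772 clips split over three folds.
folds = [
    (["Bach", "Mozart", "Chopin"],    ["Bach", "Haydn", "Chopin"]),      # fold 1
    (["Brahms", "Vivaldi", "Handel"], ["Brahms", "Vivaldi", "Bach"]),    # fold 2
    (["Schubert", "Dvorak", "Haydn"], ["Schubert", "Dvorak", "Haydn"]),  # fold 3
]

avg_accuracy = sum(raw_accuracy(gt, pred) for gt, pred in folds) / len(folds)
print(f"Average raw classification accuracy: {avg_accuracy:.2%}")
```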
MIREX 2007 Audio Classical Composer Classification Evaluation Logs and Confusion Matrices
- IM_knn
- IM_svm
- KL
- TL
- ME
- ME_spec
- GT
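A confusion matrix tabulates, for each true composer, how often clips were assigned to each predicted composer; the raw accuracies above correspond to the sum of the matrix diagonal divided by the number of clips. The following is a small sketch of how such a matrix can be built, using the task's eleven composers and made-up predictions; it is not the MIREX evaluation code.

```python
from collections import Counter

# The eleven composers from the data set description above.
COMPOSERS = ["Bach", "Beethoven", "Brahms", "Chopin", "Dvorak", "Handel",
             "Haydn", "Mendelssohn", "Mozart", "Schubert", "Vivaldi"]

def confusion_matrix(ground_truth, predictions):
    """Row = true composer, column = predicted composer, cell = clip count."""
    counts = Counter(zip(ground_truth, predictions))
    return [[counts[(true, pred)] for pred in COMPOSERS] for true in COMPOSERS]

# Made-up predictions for illustration only.
truth = ["Haydn", "Haydn", "Mozart", "Bach"]
preds = ["Haydn", "Mozart", "Mozart", "Bach"]

matrix = confusion_matrix(truth, preds)
correct = sum(matrix[i][i] for i in range(len(COMPOSERS)))
print(f"Raw accuracy from the matrix diagonal: {correct / len(truth):.2%}")
```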
MIREX 2007 Audio Classical Composer Classification Run Times
Participant | Runtime (sec) per Fold
--- | ---
IM_knn | Feature Extraction: 2457; Train/Classify: 199
IM_svm | Feature Extraction: 2457; Train/Classify: 30
KL | Feature Extraction: 1664; Train/Classify: 1320
TL | Feature Extraction: 20790; Train/Classify: 101
ME | Feature Extraction: 3863; Train/Classify: 31
ME_spec | Feature Extraction: 2807; Train/Classify: 31
GT | Feature Extraction/Train/Classify (combined): 557