2007:Audio Classical Composer Identification Results

Revision as of 22:50, 13 May 2010

Introduction

These are the results for the inaugural 2007 running of the Audio Classical Composer Identification task. For background information about this task, please refer to the 2007:Audio Classical Composer Identification page.

The data set consisted of 2772 30-second audio clips. The composers represented were:

  1. Bach
  2. Beethoven
  3. Brahms
  4. Chopin
  5. Dvorak
  6. Handel
  7. Haydn
  8. Mendelssohn
  9. Mozart
  10. Schubert
  11. Vivaldi

The goal was to correctly identify the composer who wrote each of the pieces represented.

General Legend

Team ID

ME = Michael I. Mandel, Daniel P. W. Ellis (abstract: https://www.music-ir.org/mirex/abstracts/2007/AI_CC_GC_MC_AS_mandel.pdf)
TL = Thomas Lidy, Andreas Rauber, Antonio Pertusa, José Manuel Iñesta (abstract: https://www.music-ir.org/mirex/abstracts/2007/AI_CC_GC_MC_AS_lidy.pdf)
GT = George Tzanetakis (abstract: https://www.music-ir.org/mirex/abstracts/2007/AI_CC_GC_MC_AS_tzanetakis.pdf)
KL = Kyogu Lee
IM = IMIRSEL M2K

Overall Summary Results

MIREX 2007 Audio Classical Composer Classification Summary Results - Raw Classification Accuracy Averaged Over Three Train/Test Folds

Participant   Avg. Raw Classification Accuracy
IM_knn        48.38%
IM_svm        53.72%
KL            19.70%
TL            47.26%
ME            47.84%
ME_spec       52.02%
GT            44.59%

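The summary metric above, raw classification accuracy averaged over three train/test folds, can be sketched as follows. This is a minimal illustration, not the actual MIREX evaluation code; the fold data below are made-up toy values.

```python
# Sketch of the summary metric: per-fold raw classification accuracy
# (fraction of clips whose predicted composer matches the ground truth),
# averaged over the train/test folds. Fold contents are illustrative only.

def raw_accuracy(predicted, actual):
    """Fraction of clips for which the predicted composer is correct."""
    assert len(predicted) == len(actual)
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)

def mean_fold_accuracy(folds):
    """Average raw accuracy over a list of (predicted, actual) fold pairs."""
    return sum(raw_accuracy(p, a) for p, a in folds) / len(folds)

# Toy example with three folds over a handful of clips:
folds = [
    (["Bach", "Mozart", "Haydn"], ["Bach", "Mozart", "Brahms"]),
    (["Chopin", "Chopin"],        ["Chopin", "Schubert"]),
    (["Vivaldi", "Handel"],       ["Vivaldi", "Handel"]),
]
print(f"{mean_fold_accuracy(folds):.2%}")  # prints "72.22%"
```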

MIREX 2007 Audio Classical Composer Classification Evaluation Logs and Confusion Matrices

IM_knn
IM_svm
KL
TL
ME
ME_spec
GT
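The per-system confusion matrices referenced above tabulate, for each true composer, how often each composer was predicted. A minimal sketch of how such a matrix is built (with made-up labels and predictions, not the actual evaluation logs):

```python
# Hedged sketch of a confusion matrix: rows are true composers, columns
# are predicted composers. The clip labels below are illustrative only.
from collections import Counter

def confusion_matrix(actual, predicted, labels):
    """Count (true, predicted) pairs into a labels x labels matrix."""
    counts = Counter(zip(actual, predicted))
    return [[counts[(t, p)] for p in labels] for t in labels]

labels = ["Bach", "Mozart", "Haydn"]
actual    = ["Bach", "Bach", "Mozart", "Haydn", "Haydn"]
predicted = ["Bach", "Haydn", "Mozart", "Haydn", "Mozart"]
for label, row in zip(labels, confusion_matrix(actual, predicted, labels)):
    print(label.ljust(8), row)
```

The diagonal holds correct classifications, so raw accuracy is the diagonal sum divided by the total number of clips.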

MIREX 2007 Audio Classical Composer Classification Run Times

Participant   Feat Ex (sec/fold)   Train/Classify (sec/fold)
IM_knn        2457                 199
IM_svm        2457                 30
KL            1664                 1320
TL            20790                101
ME            3863                 31
ME_spec       2807                 31
GT            557 (combined Feat Ex/Train/Classify)
