2009:Audio Tag Classification Results
Revision as of 19:57, 13 May 2010
Introduction
General Legend
Team ID	Participants
BP1 2009:Juan José Burred, Geoffroy Peeters
BP2 2009:Juan José Burred, Geoffroy Peeters
CL1 2009:Chuan Cao, Ming Li
CL2 2009:Chuan Cao, Ming Li
CL3 2009:Chuan Cao, Ming Li
CL4 2009:Chuan Cao, Ming Li
GP 2009:Geoffroy Peeters
GT1 2009:George Tzanetakis
GT2 2009:George Tzanetakis
HBC 2009:Matt Hoffman, David Blei, Perry Cook
LWW1 2009:Hung-Yi Lo, Ju-Chiang Wang, Hsin-Min Wang
LWW2 2009:Hung-Yi Lo, Ju-Chiang Wang, Hsin-Min Wang
Overall Summary Results
(Results table unavailable: /nema-raid/www/mirex/results/2009/tag/tag.grand.summary.show.csv not found.)
Summary Positive Example Accuracy (Average Across All Folds)
(Results table unavailable: /nema-raid/www/mirex/results/2009/tag/rounded/tag.binary_avg_positive_example_Accuracy.csv not found.)
Summary Negative Example Accuracy (Average Across All Folds)
(Results table unavailable: /nema-raid/www/mirex/results/2009/tag/rounded/tag.binary_avg_negative_example_Accuracy.csv not found.)
Summary Binary Relevance F-Measure (Average Across All Folds)
(Results table unavailable: /nema-raid/www/mirex/results/2009/tag/rounded/tag.binary_avg_Fmeasure.csv not found.)
Summary Binary Accuracy (Average Across All Folds)
(Results table unavailable: /nema-raid/www/mirex/results/2009/tag/rounded/tag.binary_avg_Accuracy.csv not found.)
Summary AUC-ROC Tag (Average Across All Folds)
(Results table unavailable: /nema-raid/www/mirex/results/2009/tag/rounded/tag.affinity_tag_AUC_ROC.csv not found.)
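Since the result tables themselves could not be rendered, the following is a minimal sketch of how the per-tag metrics named above (positive example accuracy, negative example accuracy, binary accuracy, F-measure, and AUC-ROC) are conventionally defined. The toy data and the plain-Python implementation are illustrative assumptions; this is not the actual MIREX/NEMA evaluation code.

```python
def tag_metrics(y_true, y_pred, scores):
    """Standard per-tag summary metrics for one binary tag.

    y_true: 0/1 ground truth per clip; y_pred: binarized predictions;
    scores: real-valued affinity scores used for AUC-ROC.
    """
    pos = [i for i, t in enumerate(y_true) if t == 1]
    neg = [i for i, t in enumerate(y_true) if t == 0]

    # Positive example accuracy: fraction of positive clips predicted positive.
    pos_acc = sum(y_pred[i] for i in pos) / len(pos)
    # Negative example accuracy: fraction of negative clips predicted negative.
    neg_acc = sum(1 - y_pred[i] for i in neg) / len(neg)
    # Binary accuracy over all clips.
    binary_acc = sum(p == t for p, t in zip(y_pred, y_true)) / len(y_true)

    # F-measure from true positives / false positives / false negatives.
    tp = sum(y_pred[i] for i in pos)
    fp = sum(y_pred[i] for i in neg)
    fn = len(pos) - tp
    f_measure = 2 * tp / (2 * tp + fp + fn) if tp else 0.0

    # AUC-ROC via the Wilcoxon/Mann-Whitney formulation: the probability
    # that a positive clip outscores a negative one (ties count half).
    wins = sum((scores[i] > scores[j]) + 0.5 * (scores[i] == scores[j])
               for i in pos for j in neg)
    auc = wins / (len(pos) * len(neg))
    return pos_acc, neg_acc, binary_acc, f_measure, auc

# Toy example: one tag, eight clips.
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 0, 1, 0, 1, 0, 0]
scores = [0.9, 0.8, 0.4, 0.7, 0.2, 0.6, 0.1, 0.3]
print(tag_metrics(y_true, y_pred, scores))  # (0.75, 0.75, 0.75, 0.75, 0.9375)
```

In the tables above, each metric is averaged across all cross-validation folds (and, for the per-tag variants, across tags) to produce a single summary number per submission.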