2008: Real-time Audio to Score Alignment (a.k.a. Score Following) Results
From MIREX Wiki
Latest revision as of 13:54, 7 June 2010
Introduction
These are the results for the 2008 running of the Real-time Audio to Score Alignment (a.k.a. Score Following) task. For background information about this task, please refer to the 2008:Real-time Audio to Score Alignment (a.k.a. Score Following) page.
General Legend
Team ID
MO1 = N. Montecchio & Orio 1
MO2 = N. Montecchio & Orio 2
RM1 = R. Macrae
RM2 = R. Macrae
Summary Results
| Metric | MO1 | MO2 | RM1 | RM2 |
| --- | --- | --- | --- | --- |
| Piecewise Precision (MO GT) | 84.45% | 68.84% | 17.10% | 19.50% |
| Piecewise Precision (RM GT) | 48.55% | 41.67% | 25.34% | 26.19% |
| Ave. Piecewise Precision | 66.50% | 55.26% | 21.22% | 22.85% |
Individual Results
MO = N. Montecchio & Orio (results: https://www.music-ir.org/mirex/results/2008/scofo/MOResults.zip)
RM = R. Macrae (results: https://www.music-ir.org/mirex/results/2008/scofo/RMResults.zip)
Summary Results w.r.t. R. Macrae's Evaluation Script
| Metric | MO1 | MO2 | RM1 | RM2 |
| --- | --- | --- | --- | --- |
| Piecewise Precision (MO GT) | 81.59% | 65.63% | 24.27% | 17.03% |
| Piecewise Precision (RM GT) | 25.93% | 25.36% | 44.77% | 28.80% |
| Ave. Piecewise Precision | 53.76% | 45.50% | 34.52% | 22.92% |
Individual Results w.r.t. R. Macrae's Evaluation Script
MO = N. Montecchio & Orio (results: https://www.music-ir.org/mirex/results/2008/scofo/MOresults_withRobsEvalScript.zip)
RM = R. Macrae (results: https://www.music-ir.org/mirex/results/2008/scofo/RMresults_withRobsEvalScript.zip)
The systems are evaluated against ground truth prepared by parsing the score files with each system's own MIDI parser (MO GT, RM GT).
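For clarity, the "Ave. Piecewise Precision" row in each summary table is simply the mean of that system's scores over the two ground truths. A minimal Python sketch (not the official MIREX evaluation code; values transcribed from the first summary table):

```python
# Per-system piecewise precision (%) under each ground truth,
# transcribed from the first Summary Results table.
precision = {
    "MO1": {"MO GT": 84.45, "RM GT": 48.55},
    "MO2": {"MO GT": 68.84, "RM GT": 41.67},
    "RM1": {"MO GT": 17.10, "RM GT": 25.34},
    "RM2": {"MO GT": 19.50, "RM GT": 26.19},
}

# Average over the two ground truths; matches the
# "Ave. Piecewise Precision" row (up to rounding).
averages = {
    system: sum(scores.values()) / len(scores)
    for system, scores in precision.items()
}

for system, avg in sorted(averages.items()):
    print(system, avg)
```

The reported averages round these values to two decimal places.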