2009:Music Structure Segmentation Results
Introduction
This task concerns analyzing the structure of music audio files and labeling the resulting segments, e.g. {verse, chorus, bridge, ...} or {A, B, C, ...}. A more detailed description can be found on the task page 2009:Structural_Segmentation. The dataset consists of 297 popular music songs.
General Legend
Team ID
ANO1 = Anonymous
ANO2 = Anonymous
MND = Matthias Mauch, Katy Noland, Simon Dixon
PK = Jouni Paulus, Anssi Klapuri
GP = Geoffroy Peeters
Evaluation Measures
overSegScore - normalised conditional-entropy-based over-segmentation score, S_o, from Lukashevich, ISMIR 2008
underSegScore - normalised conditional-entropy-based under-segmentation score, S_u, from Lukashevich, ISMIR 2008
pwF - frame-pair clustering F-measure from Levy & Sandler, TASLP 2008
pwPrecision - frame-pair clustering precision rate from Levy & Sandler, TASLP 2008
pwRecall - frame-pair clustering recall rate from Levy & Sandler, TASLP 2008
R - Rand clustering index from Hubert & Arabie, "Comparing partitions", Journal of Classification, 1985
Fmeasure@[0.5, 3]s - overall F-measure for segment boundary recovery; a claimed boundary is accepted if it lies within the specified window (0.5 s or 3 s) of a true boundary
precRate@[0.5, 3]s - segment boundary recovery precision rate
recRate@[0.5, 3]s - segment boundary recovery recall rate
medianTrue2claim - median distance, in seconds, from an annotated segment boundary to the closest found boundary
medianClaim2true - median distance, in seconds, from a found segment boundary to the closest annotated boundary
The calculation of the measures is described in 2009:Structural_Segmentation#Evaluation_Measures. Illustrative sketches of the main measures are given below.
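As a quick reference for the two entropy scores, the sketch below gives the usual frame-level formulation. The notation (A and E for the annotated and estimated frame-label sequences, N_A and N_E for the numbers of distinct labels) is this sketch's own; the Lukashevich paper and the task page linked above remain the authoritative definitions.

```latex
% Sketch of the over-/under-segmentation scores, assuming the standard
% frame-level conditional-entropy formulation (Lukashevich, ISMIR 2008).
\begin{align}
  H(E \mid A) &= -\sum_{a} p(a) \sum_{e} p(e \mid a)\, \log_2 p(e \mid a) \\
  H(A \mid E) &= -\sum_{e} p(e) \sum_{a} p(a \mid e)\, \log_2 p(a \mid e) \\
  S_o &= 1 - \frac{H(E \mid A)}{\log_2 N_E}
  \qquad
  S_u   = 1 - \frac{H(A \mid E)}{\log_2 N_A}
\end{align}
```

Here p(a), p(e) and the conditional distributions are estimated from frame counts; S_o drops when annotated segments are fragmented into many estimated labels, and S_u drops when distinct annotated segments are merged under one estimated label.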
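The pairwise clustering measures (pwPrecision, pwRecall, pwF) can be illustrated with a short Python sketch. This is not the official MIREX evaluation code; the function name, the frame representation and the toy labels below are invented for the example, and a real evaluation samples frames at a fixed rate rather than enumerating every pair of a full-length song.

```python
from itertools import combinations


def pairwise_prf(reference, estimate):
    """Frame-pair clustering precision, recall and F-measure.

    `reference` and `estimate` are equal-length sequences of frame labels
    (e.g. one label per fixed-length frame).  Illustrative sketch only.
    """
    assert len(reference) == len(estimate)
    idx_pairs = combinations(range(len(reference)), 2)

    ref_same, est_same = set(), set()
    for i, j in idx_pairs:
        if reference[i] == reference[j]:
            ref_same.add((i, j))          # pair shares a label in the annotation
        if estimate[i] == estimate[j]:
            est_same.add((i, j))          # pair shares a label in the estimate

    hits = len(ref_same & est_same)       # pairs agreeing in both
    precision = hits / len(est_same) if est_same else 0.0
    recall = hits / len(ref_same) if ref_same else 0.0
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)
    return precision, recall, f


# Toy example: the second half of the last section gets a spurious new label.
ref = list("AAAABBBBAAAA")
est = list("AAAABBBBAACC")
print(pairwise_prf(ref, est))
```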
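A similar sketch for the boundary measures: a claimed boundary counts as a hit if it lies within the tolerance window (±0.5 s or ±3 s) of an unmatched true boundary, and the two median distances correspond to medianTrue2claim and medianClaim2true. The greedy one-to-one matching and the function name are assumptions made for brevity; the official evaluation may pair boundaries differently.

```python
import statistics


def boundary_scores(ref_bounds, est_bounds, window=0.5):
    """Boundary recovery F/P/R within +/- `window` seconds, plus the two
    median boundary-to-boundary distances.  Assumes both lists are
    non-empty and given in seconds.  Illustrative sketch only."""
    matched_ref = set()
    hits = 0
    for e in est_bounds:
        # Greedily pair each claimed boundary with the closest unmatched
        # true boundary inside the tolerance window.
        candidates = [(abs(e - r), j) for j, r in enumerate(ref_bounds)
                      if j not in matched_ref and abs(e - r) <= window]
        if candidates:
            matched_ref.add(min(candidates)[1])
            hits += 1

    precision = hits / len(est_bounds)
    recall = hits / len(ref_bounds)
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)

    median_true2claim = statistics.median(
        min(abs(r - e) for e in est_bounds) for r in ref_bounds)
    median_claim2true = statistics.median(
        min(abs(e - r) for r in ref_bounds) for e in est_bounds)
    return f, precision, recall, median_true2claim, median_claim2true


# Toy example: boundary times in seconds.
print(boundary_scores([0.0, 12.3, 45.6, 80.0],
                      [0.2, 13.0, 44.9, 60.0, 79.7], window=0.5))
```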
MIREX 2009 Music Structure Summary Results - Mean of all Measures
| EvalMeasure | ANO1 | ANO2 | GP | MND | PK |
|---|---|---|---|---|---|
| overSegScore | 0.6371 | 0.65438 | 0.60119 | 0.7389 | 0.59335 |
| underSegScore | 0.63701 | 0.57517 | 0.67681 | 0.61756 | 0.78957 |
| pwF | 0.58183 | 0.57653 | 0.53254 | 0.59988 | 0.54032 |
| pwPrecision | 0.59655 | 0.54254 | 0.62689 | 0.56112 | 0.74148 |
| pwRecall | 0.61363 | 0.67024 | 0.50493 | 0.71034 | 0.46172 |
| R | 0.76223 | 0.73546 | 0.75931 | 0.74778 | 0.79232 |
| Fmeasure@0.5s | 0.18312 | 0.12762 | 0.1836 | 0.20951 | 0.27055 |
| precRate@0.5s | 0.16013 | 0.12538 | 0.1458 | 0.15754 | 0.2429 |
| recRate@0.5s | 0.22211 | 0.13462 | 0.25972 | 0.35956 | 0.32318 |
| Fmeasure@3s | 0.5896 | 0.58447 | 0.49952 | 0.39911 | 0.5305 |
| precRate@3s | 0.51527 | 0.57485 | 0.39969 | 0.29931 | 0.47665 |
| recRate@3s | 0.71422 | 0.61539 | 0.69782 | 0.69249 | 0.63155 |
| medianTrue2claim | 1.6264 | 2.4716 | 2.1145 | 2.2301 | 2.4354 |
| medianClaim2true | 3.5477 | 3.2091 | 3.52 | 3.4404 | 3.5089 |
MIREX 2009 Music Structure Summary Runtime Data
| Participant | Machine | Runtime (hh:mm) |
|---|---|---|
| ANO1 | ALE | 01:42 |
| ANO2 | ALE | 01:38 |
| GP | CWIN | 04:12 |
| MND | ALE | 09:12 |
| PK | ALE | 09:53 |