<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://music-ir.org/mirex/w/index.php?action=history&amp;feed=atom&amp;title=2012%3AAudio_Chord_Estimation</id>
	<title>2012:Audio Chord Estimation - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://music-ir.org/mirex/w/index.php?action=history&amp;feed=atom&amp;title=2012%3AAudio_Chord_Estimation"/>
	<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2012:Audio_Chord_Estimation&amp;action=history"/>
	<updated>2026-04-29T16:07:22Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.31.1</generator>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2012:Audio_Chord_Estimation&amp;diff=8868&amp;oldid=prev</id>
		<title>J. Ashley Burgoyne at 16:48, 27 August 2012</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2012:Audio_Chord_Estimation&amp;diff=8868&amp;oldid=prev"/>
		<updated>2012-08-27T16:48:35Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;Revision as of 16:48, 27 August 2012&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l14&quot; &gt;Line 14:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 14:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Data ==&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Data ==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Two &lt;/del&gt;datasets are used to evaluate chord transcription accuracy:&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Three &lt;/ins&gt;datasets are used to evaluate chord transcription accuracy:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;=== Beatles dataset ===&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;=== Beatles dataset ===&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l23&quot; &gt;Line 23:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 23:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;=== Queen and Zweieck dataset ===&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;=== Queen and Zweieck dataset ===&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Matthias Mauch's Queen and Zweieck dataset consisting of 38 songs from Queen and Zweieck.&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Matthias Mauch's Queen and Zweieck dataset consisting of 38 songs from Queen and Zweieck.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;=== Billboard dataset (abridged) ===&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;An abridged version of Ashley Burgoyne's Billboard dataset [9], consisting of about 200 songs for training (previously published) and 200 songs for testing (to be published for the first time at ISMIR).&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;===Example ground-truth file ===&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;===Example ground-truth file ===&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l69&quot; &gt;Line 69:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 72:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;X:sus4&amp;lt;br&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;X:sus4&amp;lt;br&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;For comparison of tetrad (quad) chords:&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;For comparison of tetrad (quad) chords &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;(currently only for the Beatles and Queen and Zweieck datasets)&lt;/ins&gt;:&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;N &amp;lt;br&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;N &amp;lt;br&amp;gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l327&quot; &gt;Line 327:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 330:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Bibliography ==&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Bibliography ==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;1. Harte,C.A. and Sandler,M.B.(2005). '''Automatic chord identification using a quantised chromagram.''' Proceedings of 118th Audio Engineering Society's Convention.&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;1. Harte, C.A. and Sandler, M.B. (2005). '''Automatic chord identification using a quantised chromagram.''' Proceedings of 118th Audio Engineering Society's Convention&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;.&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;#160;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;2. Sailer, C. and Rosenbauer K. (2006). '''A bottom-up approach to chord detection.''' Proceedings of International Computer Music Conference 2006&lt;/ins&gt;.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;2&lt;/del&gt;. &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Sailer&lt;/del&gt;,&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;C&lt;/del&gt;. and &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Rosenbauer K&lt;/del&gt;.(&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;2006&lt;/del&gt;). '''&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;A bottom-up approach to &lt;/del&gt;chord &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;detection&lt;/del&gt;.''' &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Proceedings of International &lt;/del&gt;Computer Music &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Conference 2006&lt;/del&gt;.&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;3&lt;/ins&gt;. &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Shenoy&lt;/ins&gt;, &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;A&lt;/ins&gt;. and &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Wang, Y&lt;/ins&gt;. (&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;2005&lt;/ins&gt;). 
'''&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Key, &lt;/ins&gt;chord&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;, and rhythm tracking of popular music recordings&lt;/ins&gt;.''' Computer Music &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Journal 29(3), 75-86&lt;/ins&gt;.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;3&lt;/del&gt;. &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Shenoy&lt;/del&gt;,A. and &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Wang&lt;/del&gt;,&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Y&lt;/del&gt;.(&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;2005&lt;/del&gt;). '''&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Key, chord, &lt;/del&gt;and &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;rhythm tracking of popular music recordings&lt;/del&gt;.''' &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Computer &lt;/del&gt;Music &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Journal 29(3), 75-86&lt;/del&gt;.&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;4&lt;/ins&gt;. &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Sheh&lt;/ins&gt;, A. and &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Ellis&lt;/ins&gt;, &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;D.P.W&lt;/ins&gt;. (&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;2003&lt;/ins&gt;). 
'''&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Chord segmentation &lt;/ins&gt;and &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;recognition using em-trained hidden markov models&lt;/ins&gt;.''' &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Proceedings of 4th International Conference on &lt;/ins&gt;Music &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Information Retrieval&lt;/ins&gt;.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;4&lt;/del&gt;. &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Sheh&lt;/del&gt;,&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;A&lt;/del&gt;. &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;and Ellis,D.P.W&lt;/del&gt;.(&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;2003&lt;/del&gt;). '''Chord &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;segmentation &lt;/del&gt;and &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;recognition using em-trained hidden markov models&lt;/del&gt;.''' Proceedings of &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;4th &lt;/del&gt;International Conference on Music Information Retrieval.&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;5&lt;/ins&gt;. &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Yoshioka&lt;/ins&gt;, &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;T&lt;/ins&gt;. &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;et al&lt;/ins&gt;. (&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;2004&lt;/ins&gt;). 
'''&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Automatic &lt;/ins&gt;Chord &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Transcription with concurrent recognition of chord symbols &lt;/ins&gt;and &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;boundaries&lt;/ins&gt;.''' Proceedings of &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;5th &lt;/ins&gt;International Conference on Music Information Retrieval.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;5&lt;/del&gt;. &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Yoshioka&lt;/del&gt;,&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;T&lt;/del&gt;. et al.(&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;2004&lt;/del&gt;). '''&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Automatic Chord Transcription with concurrent recognition &lt;/del&gt;of &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;chord symbols and boundaries&lt;/del&gt;.''' Proceedings of &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;5th &lt;/del&gt;International Conference on Music Information Retrieval.&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;6&lt;/ins&gt;. &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Harte&lt;/ins&gt;, &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;C&lt;/ins&gt;. et al. (&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;2005&lt;/ins&gt;). '''&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Symbolic representation &lt;/ins&gt;of &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;musical chords: a proposed syntax for text annotations&lt;/ins&gt;.''' Proceedings of &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;6th &lt;/ins&gt;International Conference on Music Information Retrieval.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;6&lt;/del&gt;. &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Harte&lt;/del&gt;,&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;C&lt;/del&gt;. and &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Sandler&lt;/del&gt;,&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;M. and Abdallah,S. and Gómez,E&lt;/del&gt;.(&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;2005&lt;/del&gt;). '''&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Symbolic &lt;/del&gt;representation &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;of musical chords: a proposed syntax for text annotations&lt;/del&gt;.''' Proceedings of &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;6th &lt;/del&gt;International Conference on &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Music Information Retrieval&lt;/del&gt;.&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;7&lt;/ins&gt;. &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Papadopoulos&lt;/ins&gt;, &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;H&lt;/ins&gt;. and &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Peeters&lt;/ins&gt;, &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;G&lt;/ins&gt;. (&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;2007&lt;/ins&gt;). 
'''&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Large-scale study of chord estimation algorithms based on chroma &lt;/ins&gt;representation &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;and HMM&lt;/ins&gt;.''' Proceedings of &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;5th &lt;/ins&gt;International Conference on &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Content-Based Multimedia Indexing&lt;/ins&gt;.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;7&lt;/del&gt;. &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Papadopoulos&lt;/del&gt;,&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;H&lt;/del&gt;. &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;and Peeters,G&lt;/del&gt;.(&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;2007&lt;/del&gt;). '''&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Large-scale study &lt;/del&gt;of &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;chord estimation algorithms based on chroma representation and HMM.&lt;/del&gt;''' &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Proceedings of 5th &lt;/del&gt;International Conference on &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Content-Based Multimedia Indexing&lt;/del&gt;.&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;8&lt;/ins&gt;. &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Abdallah&lt;/ins&gt;, &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;S&lt;/ins&gt;. &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;et al&lt;/ins&gt;. (&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;2005&lt;/ins&gt;). 
'''&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Theory and Evaluation &lt;/ins&gt;of &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;a Bayesian Music Structure Extractor&lt;/ins&gt;''' &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;(pp. 420-425) Proc. 6th &lt;/ins&gt;International Conference on &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Music Information Retrieval, ISMIR 2005&lt;/ins&gt;.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;8&lt;/del&gt;. &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Samer Abdallah, Katy Noland, Mark Sandler, Michael Casey &amp;amp; Christophe Rhodes: &lt;/del&gt;'''&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Theory &lt;/del&gt;and &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Evaluation of a Bayesian Music Structure Extractor&lt;/del&gt;''' (pp. &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;420-425&lt;/del&gt;) Proc. &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;6th &lt;/del&gt;International &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;Conference on &lt;/del&gt;Music Information Retrieval, ISMIR &lt;del class=&quot;diffchange diffchange-inline&quot;&gt;2005&lt;/del&gt;.&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;9. John Ashley Burgoyne et al. (2011)&lt;/ins&gt;. '''&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;An expert ground-truth set for audio chord recognition &lt;/ins&gt;and &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;music analysis&lt;/ins&gt;''' (pp. &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;633–638&lt;/ins&gt;) Proc. 
&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;12th &lt;/ins&gt;International &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Society for &lt;/ins&gt;Music Information Retrieval &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;Conference&lt;/ins&gt;, ISMIR &lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;2006. [http://ismir2011.ismir&lt;/ins&gt;.&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;net/papers/OS8-1.pdf (PDF)]&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>J. Ashley Burgoyne</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2012:Audio_Chord_Estimation&amp;diff=8852&amp;oldid=prev</id>
		<title>Matthew Davies: /* Discussion */</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2012:Audio_Chord_Estimation&amp;diff=8852&amp;oldid=prev"/>
		<updated>2012-08-09T08:32:48Z</updated>

		<summary type="html">&lt;p&gt;‎&lt;span dir=&quot;auto&quot;&gt;&lt;span class=&quot;autocomment&quot;&gt;Discussion&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;Revision as of 08:32, 9 August 2012&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l317&quot; &gt;Line 317:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 317:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Discussion ==&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Discussion ==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Please write your comments below with your name and date.&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Please write your comments below with your name and date.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;Somewhere in the email discussion on the MIREX list, there was a mention that the recent systems run on the Beatles/Queen/Zweieck dataset might have over-learnt the properties of this dataset. I just wondered whether, during or post-MIREX, there was any way to formally/experimentally demonstrate this? I mean, beyond making the observation that there is a &amp;quot;drop&amp;quot; in performance from an open dataset to a closed one. The issue would seem particularly pertinent with regard to this dataset since it's been public for sometime.&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;(Matthew Davies, 9th August)&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Potential Participants ==&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Potential Participants ==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Matthew Davies</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2012:Audio_Chord_Estimation&amp;diff=8851&amp;oldid=prev</id>
		<title>MertBay: /* Time and hardware limits */</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2012:Audio_Chord_Estimation&amp;diff=8851&amp;oldid=prev"/>
		<updated>2012-08-09T07:12:31Z</updated>

		<summary type="html">&lt;p&gt;‎&lt;span dir=&quot;auto&quot;&gt;&lt;span class=&quot;autocomment&quot;&gt;Time and hardware limits&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;Revision as of 07:12, 9 August 2012&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l314&quot; &gt;Line 314:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 314:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;#160; &amp;#160;&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;#160; &amp;#160;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;A hard limit of 24 hours will be imposed on runs (total feature extraction and querying times). Submissions that exceed this runtime may not receive a result.&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;A hard limit of 24 hours will be imposed on runs (total feature extraction and querying times). Submissions that exceed this runtime may not receive a result.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;== Discussion ==&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;Please write your comments below with your name and date.&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Potential Participants ==&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Potential Participants ==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>MertBay</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2012:Audio_Chord_Estimation&amp;diff=8740&amp;oldid=prev</id>
		<title>AndreasEhmann at 21:34, 7 June 2012</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2012:Audio_Chord_Estimation&amp;diff=8740&amp;oldid=prev"/>
		<updated>2012-06-07T21:34:33Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;Revision as of 21:34, 7 June 2012&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l314&quot; &gt;Line 314:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 314:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;#160; &amp;#160;&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;#160; &amp;#160;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;A hard limit of 24 hours will be imposed on runs (total feature extraction and querying times). Submissions that exceed this runtime may not receive a result.&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;A hard limit of 24 hours will be imposed on runs (total feature extraction and querying times). Submissions that exceed this runtime may not receive a result.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;== Submission opening date ==&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;Friday August 5th 2012&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;== Submission closing date ==&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;Friday September 2nd 2012&lt;/del&gt;&lt;/div&gt;&lt;/td&gt;&lt;td colspan=&quot;2&quot;&gt;&amp;#160;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Potential Participants ==&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;&amp;#160;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;== Potential Participants ==&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>AndreasEhmann</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2012:Audio_Chord_Estimation&amp;diff=8698&amp;oldid=prev</id>
		<title>Kahyun Choi: Created page with &quot;The Utrecht Agreement on Chord Evaluation  ===Evaluation of Chord Transcriptions===  Before the final description of the chord evaluation goes live here, please see the discu...&quot;</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2012:Audio_Chord_Estimation&amp;diff=8698&amp;oldid=prev"/>
		<updated>2012-05-14T19:41:08Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;quot;&lt;a href=&quot;/mirex/wiki/The_Utrecht_Agreement_on_Chord_Evaluation&quot; title=&quot;The Utrecht Agreement on Chord Evaluation&quot;&gt;The Utrecht Agreement on Chord Evaluation&lt;/a&gt;  ===Evaluation of Chord Transcriptions===  Before the final description of the chord evaluation goes live here, please see the discu...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;[[The Utrecht Agreement on Chord Evaluation]]&lt;br /&gt;
&lt;br /&gt;
===Evaluation of Chord Transcriptions===&lt;br /&gt;
&lt;br /&gt;
Before the final description of the chord evaluation goes live here, please see the discussion based on the [[The Utrecht Agreement on Chord Evaluation]].&lt;br /&gt;
&lt;br /&gt;
== Description ==&lt;br /&gt;
This task requires participants to extract or transcribe a sequence of chords from an audio music recording. For many applications in music information retrieval, extracting the harmonic structure of an audio track is highly desirable, for example for segmenting pieces into characteristic sections, finding similar pieces, or performing semantic analysis of music.&lt;br /&gt;
&lt;br /&gt;
Extracting the harmonic structure requires detecting as many chords as possible in a piece. This includes characterising each chord by its root and type, as well as placing the chords in chronological order with their onsets and durations.&lt;br /&gt;
&lt;br /&gt;
Although several publications are available on this topic [1,2,3,4,5], comparing their results is difficult because different measures are used to assess performance. Overcoming this problem requires a precisely defined methodology, including a repertory of the chords to be detected, a defined test set with ground truth, and unambiguous rules for calculating performance.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Data ==&lt;br /&gt;
Two datasets are used to evaluate chord transcription accuracy:&lt;br /&gt;
&lt;br /&gt;
=== Beatles dataset ===&lt;br /&gt;
Christopher Harte's Beatles dataset, consisting of chord annotations for 12 Beatles albums.&lt;br /&gt;
&lt;br /&gt;
The text annotation procedure of musical chords that was used to produce this dataset is presented in [6]. &lt;br /&gt;
&lt;br /&gt;
=== Queen and Zweieck dataset ===&lt;br /&gt;
Matthias Mauch's Queen and Zweieck dataset, consisting of 38 songs by Queen and Zweieck.&lt;br /&gt;
&lt;br /&gt;
===Example ground-truth file ===&lt;br /&gt;
The ground-truth files take the form:&lt;br /&gt;
&lt;br /&gt;
 ...&lt;br /&gt;
 41.2631021 44.2456460 B&lt;br /&gt;
 44.2456460 45.7201230 E&lt;br /&gt;
 45.7201230 47.2061900 E:7/3&lt;br /&gt;
 47.2061900 48.6922670 A&lt;br /&gt;
 48.6922670 50.1551240 A:min/b3&lt;br /&gt;
 ...&lt;br /&gt;
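Each line of such a ground-truth file can be read as a (start, end, label) triple. A minimal parsing sketch (the function name is illustrative, not part of the task specification):

```python
def parse_chord_lab(text):
    """Parse ground-truth lines of the form 'start end label' into triples."""
    segments = []
    for line in text.strip().splitlines():
        parts = line.split()
        if len(parts) == 3:
            start, end, label = parts
            segments.append((float(start), float(end), label))
    return segments

example = """41.2631021 44.2456460 B
44.2456460 45.7201230 E
45.7201230 47.2061900 E:7/3"""
segments = parse_chord_lab(example)
```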
&lt;br /&gt;
&lt;br /&gt;
== Evaluation ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Segmentation Score ===&lt;br /&gt;
&lt;br /&gt;
The segmentation score will be calculated using the directional Hamming distance described in [8]. An over-segmentation value (m) and an under-segmentation value (f) will be computed, and the final segmentation score will use the worse of the two, i.e.:&lt;br /&gt;
&lt;br /&gt;
segmentation score = 1 - max(m,f)&lt;br /&gt;
&lt;br /&gt;
m and f are not independent of each other, so combining them this way ensures that a good score in one does not hide a bad score in the other. The combined segmentation score takes values between 0 and 1, with 0 being the worst and 1 the best result. -- Chrish 17:05, 9 September 2009 (UTC)&lt;br /&gt;
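A rough sketch of this calculation, assuming each transcription is reduced to a list of (start, end) segment boundaries covering the same total duration (the directional Hamming distance itself is defined in [8]; this is an illustration, not the official evaluation code):

```python
def overlap(a, b):
    """Length of the overlap between two (start, end) intervals."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def directional_hamming(seg_a, seg_b):
    """For each segment of seg_a, the duration not covered by its single
    best-overlapping segment of seg_b, summed and normalised by duration."""
    total = sum(e - s for s, e in seg_a)
    missed = 0.0
    for seg in seg_a:
        best = max(overlap(seg, other) for other in seg_b)
        missed += (seg[1] - seg[0]) - best
    return missed / total

def segmentation_score(annotation, estimate):
    m = directional_hamming(annotation, estimate)  # one direction
    f = directional_hamming(estimate, annotation)  # the other direction
    return 1.0 - max(m, f)
```

With identical segmentations the score is 1; an estimate that splits one annotated segment in half scores 0.5.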
&lt;br /&gt;
=== Frame-based recall ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
For recall evaluation, we may define a different chord dictionary for each level of evaluation (dyads, triads, tetrads, etc.). Each dictionary is a text file containing chord shorthands / interval lists of the chords that will be considered in that evaluation. The following dictionaries are proposed:&lt;br /&gt;
&lt;br /&gt;
For dyad comparison of major/minor chords only:&lt;br /&gt;
&lt;br /&gt;
N&amp;lt;br&amp;gt;&lt;br /&gt;
X:maj&amp;lt;br&amp;gt;&lt;br /&gt;
X:min&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For comparison of standard triad chords:&lt;br /&gt;
&lt;br /&gt;
N&amp;lt;br&amp;gt;&lt;br /&gt;
X:maj&amp;lt;br&amp;gt;&lt;br /&gt;
X:min&amp;lt;br&amp;gt;&lt;br /&gt;
X:aug&amp;lt;br&amp;gt;&lt;br /&gt;
X:dim&amp;lt;br&amp;gt;&lt;br /&gt;
X:sus2&amp;lt;br&amp;gt;&lt;br /&gt;
X:sus4&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For comparison of tetrad (quad) chords:&lt;br /&gt;
&lt;br /&gt;
N &amp;lt;br&amp;gt;&lt;br /&gt;
X:maj &amp;lt;br&amp;gt;&lt;br /&gt;
X:min&amp;lt;br&amp;gt;&lt;br /&gt;
X:aug&amp;lt;br&amp;gt;&lt;br /&gt;
X:dim&amp;lt;br&amp;gt;&lt;br /&gt;
X:sus2&amp;lt;br&amp;gt;&lt;br /&gt;
X:sus4&amp;lt;br&amp;gt;&lt;br /&gt;
X:maj7&amp;lt;br&amp;gt;&lt;br /&gt;
X:7&amp;lt;br&amp;gt;&lt;br /&gt;
X:maj(9)&amp;lt;br&amp;gt;&lt;br /&gt;
X:aug(7)	&amp;lt;br&amp;gt;&lt;br /&gt;
X:min(7)&amp;lt;br&amp;gt;&lt;br /&gt;
X:min7&amp;lt;br&amp;gt;&lt;br /&gt;
X:min(9)&amp;lt;br&amp;gt;&lt;br /&gt;
X:dim(7)&amp;lt;br&amp;gt;&lt;br /&gt;
X:hdim7	&amp;lt;br&amp;gt;&lt;br /&gt;
X:sus4(7)&amp;lt;br&amp;gt;&lt;br /&gt;
X:sus4(b7)&amp;lt;br&amp;gt;&lt;br /&gt;
X:dim7&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
For each evaluation level, the ground truth annotation is compared against the dictionary. Any chord label not belonging to the current dictionary will be replaced with an &amp;quot;X&amp;quot; in a local copy of the annotation and will not be included in the recall calculation.&lt;br /&gt;
&lt;br /&gt;
Note that the level of comparison in terms of intervals can be varied. For example, in a triad evaluation we can consider the first three component intervals in the chord so that a major (1,3,5) and a major7 (1,3,5,7) will be considered the same chord. For a tetrad (quad) evaluation, we would consider the first 4 intervals so major and major7 would then be considered to be different chords.&lt;br /&gt;
&lt;br /&gt;
For the maj/min evaluation (using the first example dictionary), using an interval comparison of 2 (dyad) will compare only the first two intervals of each chord label. This would map augmented and diminished chords to major and minor respectively (and any other symbols that had a major 3rd or minor 3rd as their first interval). Using an interval comparison of 3 with the same dictionary would keep only those chords that have major and minor triads as their first 3 intervals so augmented and diminished chords would be removed from the evaluation.&lt;br /&gt;
&lt;br /&gt;
After the annotation has been &quot;filtered&quot; using a given dictionary, it can be compared against the machine-generated estimates output by the algorithm under test. The chord sequences described in the annotation and estimate text files are sampled at a given frame rate (in this case 10 ms per frame) to give two sequences of chord frames that can be compared directly with each other. To decide a hit or a miss, the chord labels from the current frame in each sequence are compared. Chord comparison is done by converting each chord label into an ordered list of pitch classes and then comparing the two lists element by element. If the lists match to the required number of intervals, a hit is recorded; otherwise the estimate is considered a miss. Note that, by converting to pitch classes for the comparison, this evaluation ignores enharmonic pitch and interval spellings, so the following chords (a slightly silly example, just for illustration) all evaluate as identical:&lt;br /&gt;
&lt;br /&gt;
C:maj = Dbb:maj = C#:(b1,b3,#4)&lt;br /&gt;
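This enharmonic collapse can be illustrated with a small sketch (hypothetical helper names; the official evaluation code may differ in detail):&lt;br /&gt;

```python
# Reduce chord spellings to semitone pitch classes modulo 12, so enharmonic
# spellings compare as equal.
NATURALS = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}
DEGREES = {1: 0, 2: 2, 3: 4, 4: 5, 5: 7, 6: 9, 7: 11}  # major-scale semitones

def root_pc(root):
    """Pitch class of a root such as 'C', 'Dbb' or 'C#'."""
    pc = NATURALS[root[0]] + root.count("#") - root.count("b")
    return pc % 12

def interval_semitones(interval):
    """Semitones for an interval token such as '3', 'b3' or '#4'."""
    degree = int(interval.lstrip("#b"))
    semis = DEGREES[(degree - 1) % 7 + 1]
    return (semis + interval.count("#") - interval.count("b")) % 12

def chord_pitch_classes(root, intervals):
    base = root_pc(root)
    return [(base + interval_semitones(i)) % 12 for i in intervals]

# The 'slightly silly' example from the text: all three spell the same notes.
a = chord_pitch_classes("C", ["1", "3", "5"])
b = chord_pitch_classes("Dbb", ["1", "3", "5"])
c = chord_pitch_classes("C#", ["b1", "b3", "#4"])
```

All three spellings reduce to the same ordered pitch-class list, so they compare as equal.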
&lt;br /&gt;
&lt;br /&gt;
Basic recall calculation algorithm:&lt;br /&gt;
&lt;br /&gt;
1) filter annotated transcription using chord dictionary for a defined number of intervals&lt;br /&gt;
&lt;br /&gt;
2) sample annotated transcription and machine estimated transcription at 10ms intervals to create a sequence of annotation frames and estimate frames&lt;br /&gt;
&lt;br /&gt;
3) start at the first frame&lt;br /&gt;
&lt;br /&gt;
4) get chord label for current annotation frame and estimate frame&lt;br /&gt;
&lt;br /&gt;
5) check annotation label:&lt;br /&gt;
&lt;br /&gt;
IF symbol is 'X' (i.e. non-dictionary)&lt;br /&gt;
&lt;br /&gt;
THEN ignore frame (record number of ignored frames)&lt;br /&gt;
&lt;br /&gt;
ELSE compare annotated/estimated chords for the predefined number of intervals and&lt;br /&gt;
increment hit count if chords match&lt;br /&gt;
&lt;br /&gt;
ENDIF&lt;br /&gt;
&lt;br /&gt;
6) increment frame count&lt;br /&gt;
&lt;br /&gt;
7) go back to 4 until the final chord frame&lt;br /&gt;
--[[User:Chrish|Chrish]] 17:05, 9 September 2009 (UTC)&lt;br /&gt;
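The steps above can be sketched in Python (an illustrative implementation with assumed data structures, not the official evaluator); transcriptions are lists of (start, end, label) segments whose labels have already been converted to ordered pitch-class tuples, with 'X' marking non-dictionary chords:&lt;br /&gt;

```python
def label_at(segments, t):
    """Return the label of the segment covering time t, or None."""
    for start, end, label in segments:
        if start <= t < end:
            return label
    return None

def frame_recall(annotation, estimate, n_intervals, hop=0.01):
    """Fraction of scored 10ms frames whose first n_intervals match."""
    duration = max(end for _, end, _ in annotation)
    n_frames = int(round(duration / hop))
    hits = scored = 0
    for i in range(n_frames):                    # steps 3-4 and 7: frame loop
        t = i * hop
        ann = label_at(annotation, t)
        if ann is None or ann == "X":            # step 5: skip 'X' frames
            continue
        scored += 1
        est = label_at(estimate, t)
        if est is not None and est != "X" and est[:n_intervals] == ann[:n_intervals]:
            hits += 1                            # chords match to n_intervals
    return hits / scored if scored else 0.0

# Toy example: annotation vs. an estimate that is late on the chord change,
# giving a recall of roughly 0.75.
ann = [(0.0, 1.0, (0, 4, 7)), (1.0, 2.0, (9, 0, 4))]
est = [(0.0, 1.5, (0, 4, 7)), (1.5, 2.0, (9, 0, 4))]
```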
&lt;br /&gt;
&lt;br /&gt;
== Submission Format ==&lt;br /&gt;
&lt;br /&gt;
=== Audio Format ===&lt;br /&gt;
Audio tracks will be encoded as 44.1 kHz 16-bit mono WAV files.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== I/O Format ===&lt;br /&gt;
The expected output for participating algorithms is a chord transcription file in the format proposed by Christopher Harte [6].&lt;br /&gt;
&lt;br /&gt;
Hence, algorithms should output text files in the same format as the ground-truth transcriptions: flat text files with chord segment labels and times arranged thus:&lt;br /&gt;
&lt;br /&gt;
 start_time end_time chord_label&lt;br /&gt;
&lt;br /&gt;
with elements separated by white spaces, times given in seconds, chord labels corresponding to the syntax described in [6] and one chord segment per line. &lt;br /&gt;
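A minimal reader for this flat-text format might look as follows (a sketch, not the official parser):&lt;br /&gt;

```python
# Parse whitespace-separated "start_time end_time chord_label" lines into
# (float, float, str) segment tuples; blank lines are skipped.
def parse_transcription(text):
    segments = []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        start, end, label = line.split()
        segments.append((float(start), float(end), label))
    return segments

# Hypothetical excerpt in the expected format:
example = """\
0.000 2.612 N
2.612 11.459 B
11.459 12.921 E:min7
"""
segs = parse_transcription(example)
```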
&lt;br /&gt;
The chord root is given as a natural (A|B|C|D|E|F|G) followed by optional sharp or flat modifiers (#|b). For the evaluation process we may assume enharmonic equivalence for chord roots. For a given chord type on root X, the chord labels can be given as a list of intervals or as a shorthand notation as shown in the following table:&lt;br /&gt;
&lt;br /&gt;
{|border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;5&amp;quot; cellspacing=&amp;quot;0&amp;quot; align=&amp;quot;center&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
!NAME&lt;br /&gt;
!INTERVALS&lt;br /&gt;
!SHORTHAND&lt;br /&gt;
|-&lt;br /&gt;
|*Triads:&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|major&lt;br /&gt;
|X:(1,3,5)&lt;br /&gt;
|X or X:maj&lt;br /&gt;
|-&lt;br /&gt;
|minor&lt;br /&gt;
|X:(1,b3,5)&lt;br /&gt;
|X:min&lt;br /&gt;
|-&lt;br /&gt;
|diminished&lt;br /&gt;
|X:(1,b3,b5)&lt;br /&gt;
|X:dim&lt;br /&gt;
|-&lt;br /&gt;
|augmented&lt;br /&gt;
|X:(1,3,#5)&lt;br /&gt;
|X:aug&lt;br /&gt;
|-&lt;br /&gt;
|suspended4&lt;br /&gt;
|X:(1,4,5)&lt;br /&gt;
|X:sus4&lt;br /&gt;
|-&lt;br /&gt;
|possible 6th triad:&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|suspended2&lt;br /&gt;
|X:(1,2,5)&lt;br /&gt;
|X:sus2&lt;br /&gt;
|-&lt;br /&gt;
|*Quads:&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|major-major7&lt;br /&gt;
|X:(1,3,5,7)&lt;br /&gt;
|X:maj7&lt;br /&gt;
|-&lt;br /&gt;
|major-minor7&lt;br /&gt;
|X:(1,3,5,b7)&lt;br /&gt;
|X:7&lt;br /&gt;
|-&lt;br /&gt;
|major-add9&lt;br /&gt;
|X:(1,3,5,9)&lt;br /&gt;
|X:maj(9)&lt;br /&gt;
|-&lt;br /&gt;
|major-major7-#5&lt;br /&gt;
|X:(1,3,#5,7)&lt;br /&gt;
|X:aug(7)&lt;br /&gt;
|-&lt;br /&gt;
|minor-major7&lt;br /&gt;
|X:(1,b3,5,7)&lt;br /&gt;
|X:min(7)&lt;br /&gt;
|-&lt;br /&gt;
|minor-minor7&lt;br /&gt;
|X:(1,b3,5,b7)&lt;br /&gt;
|X:min7&lt;br /&gt;
|-&lt;br /&gt;
|minor-add9&lt;br /&gt;
|X:(1,b3,5,9)&lt;br /&gt;
|X:min(9)&lt;br /&gt;
|-&lt;br /&gt;
|minor 7/b5 (ambiguous - could be either of the following)&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|minor-major7-b5&lt;br /&gt;
|X:(1,b3,b5,7)&lt;br /&gt;
|X:dim(7)&lt;br /&gt;
|-&lt;br /&gt;
|minor-minor7-b5 (a half-diminished 7th)&lt;br /&gt;
|X:(1,b3,b5,b7)&lt;br /&gt;
|X:hdim7&lt;br /&gt;
|-&lt;br /&gt;
|sus4-major7&lt;br /&gt;
|X:(1,4,5,7)&lt;br /&gt;
|X:sus4(7)&lt;br /&gt;
|-&lt;br /&gt;
|sus4-minor7&lt;br /&gt;
|X:(1,4,5,b7)&lt;br /&gt;
|X:sus4(b7)&lt;br /&gt;
|-&lt;br /&gt;
|omitted from list on wiki:&lt;br /&gt;
|&lt;br /&gt;
|&lt;br /&gt;
|-&lt;br /&gt;
|diminished7&lt;br /&gt;
|X:(1,b3,b5,bb7)&lt;br /&gt;
|X:dim7&lt;br /&gt;
|-&lt;br /&gt;
|No Chord&lt;br /&gt;
|N&lt;br /&gt;
|&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Please note that two things have changed in the syntax since it was originally described in [6]. First, the root is no longer implied as a voiced element of a chord, so a C major chord (notes C, E and G) should be written C:(1,3,5) rather than just C:(3,5) when using the interval-list representation. As before, the labels C and C:maj are equivalent to C:(1,3,5). Second, the shorthand label &amp;quot;sus2&amp;quot; (intervals 1,2,5) has been added to the available shorthand list.--[[User:Chrish|Chrish]] 17:05, 9 September 2009 (UTC)&lt;br /&gt;
&lt;br /&gt;
We still accept participants who would like to be evaluated only on major/minor chords and prefer the number format: an integer chord id in the range 0-24, where values 0-11 denote C major, C# major, ..., B major; values 12-23 denote C minor, C# minor, ..., B minor; and 24 denotes silence or no-chord segments. '''Please note that the line format is otherwise the same:'''&lt;br /&gt;
&lt;br /&gt;
 start_time end_time chord_number&lt;br /&gt;
&lt;br /&gt;
Systems should output both onset and offset times, in contrast to the MIREX 2008 chord output format, in which only onsets were used.&lt;br /&gt;
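The 0-24 numbering can be encoded and decoded as follows (a sketch with assumed helper names, not an official tool):&lt;br /&gt;

```python
# 0-11 = C..B major, 12-23 = C..B minor, 24 = silence / no chord.
PITCH_NAMES = ["C", "C#", "D", "D#", "E", "F",
               "F#", "G", "G#", "A", "A#", "B"]

def chord_number_to_label(n):
    """Map a 0-24 chord id to a chord label ('N' for no chord)."""
    if n == 24:
        return "N"
    root = PITCH_NAMES[n % 12]
    return root if n < 12 else root + ":min"

def label_to_chord_number(root_pc, is_minor):
    """Map a root pitch class (0-11) and mode to the 0-23 chord id."""
    return root_pc + (12 if is_minor else 0)
```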
&lt;br /&gt;
=== Command line calling format ===&lt;br /&gt;
&lt;br /&gt;
Submissions must conform to the calling format specified below:&lt;br /&gt;
&lt;br /&gt;
 ''extractFeaturesAndTrain  &amp;quot;/path/to/trainFileList.txt&amp;quot;  &amp;quot;/path/to/scratch/dir&amp;quot; '' &lt;br /&gt;
&lt;br /&gt;
where trainFileList.txt contains the path to each WAV file. Features extracted at this stage can be stored under &amp;quot;/path/to/scratch/dir&amp;quot;.&lt;br /&gt;
The ground-truth files for supervised learning will be at the same paths with a &amp;quot;.txt&amp;quot; extension appended. For example, for &amp;quot;/path/to/trainFile1.wav&amp;quot; there will be a corresponding ground-truth file called &amp;quot;/path/to/trainFile1.wav.txt&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
For testing:&lt;br /&gt;
&lt;br /&gt;
 ''doChordID.sh &amp;quot;/path/to/testFileList.txt&amp;quot;  &amp;quot;/path/to/scratch/dir&amp;quot; &amp;quot;/path/to/results/dir&amp;quot; '' &lt;br /&gt;
&lt;br /&gt;
If there is no training stage, you can ignore the second argument. The results directory should contain one file for each test file, with the same name as the test file plus a .txt extension.&lt;br /&gt;
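The two file-naming conventions above can be sketched as follows (illustrative helper names; the exact rule for the results directory is an assumption):&lt;br /&gt;

```python
import os

def ground_truth_path(wav_path):
    """Ground truth lives at the same path with '.txt' appended."""
    return wav_path + ".txt"

def result_path(test_file, results_dir):
    """One result file per test file: <results_dir>/<testfile name>.txt."""
    return os.path.join(results_dir, os.path.basename(test_file) + ".txt")
```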
&lt;br /&gt;
Programs can use their working directory if they need to keep temporary cache files or internal debugging info. Stdout and stderr will be logged.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Packaging submissions ===&lt;br /&gt;
All submissions should be statically linked to all libraries (the presence of dynamically linked libraries cannot be guaranteed).&lt;br /&gt;
&lt;br /&gt;
All submissions should include a README file including the following information:&lt;br /&gt;
&lt;br /&gt;
* Command line calling format for all executables and an example formatted set of commands&lt;br /&gt;
* Number of threads/cores used or whether this should be specified on the command line&lt;br /&gt;
* Expected memory footprint&lt;br /&gt;
* Expected runtime&lt;br /&gt;
* Any required environments (and versions), e.g. python, java, bash, matlab.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Time and hardware limits ==&lt;br /&gt;
Due to the potentially high number of participants in this and other audio tasks, hard limits on the runtime of submissions are specified.&lt;br /&gt;
 &lt;br /&gt;
A hard limit of 24 hours will be imposed on runs (total feature extraction and querying times). Submissions that exceed this runtime may not receive a result.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Submission opening date ==&lt;br /&gt;
&lt;br /&gt;
Friday August 5th 2012&lt;br /&gt;
&lt;br /&gt;
== Submission closing date ==&lt;br /&gt;
Friday September 2nd 2012&lt;br /&gt;
&lt;br /&gt;
== Potential Participants ==&lt;br /&gt;
name / email&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Bibliography ==&lt;br /&gt;
&lt;br /&gt;
1. Harte,C.A. and Sandler,M.B.(2005). '''Automatic chord identification using a quantised chromagram.''' Proceedings of the 118th Audio Engineering Society Convention.&lt;br /&gt;
&lt;br /&gt;
2. Sailer,C. and Rosenbauer K.(2006). '''A bottom-up approach to chord detection.''' Proceedings of International Computer Music Conference 2006.&lt;br /&gt;
&lt;br /&gt;
3. Shenoy,A. and Wang,Y.(2005). '''Key, chord, and rhythm tracking of popular music recordings.''' Computer Music Journal 29(3), 75-86.&lt;br /&gt;
&lt;br /&gt;
4. Sheh,A. and Ellis,D.P.W.(2003). '''Chord segmentation and recognition using EM-trained hidden Markov models.''' Proceedings of 4th International Conference on Music Information Retrieval.&lt;br /&gt;
&lt;br /&gt;
5. Yoshioka,T. et al.(2004). '''Automatic Chord Transcription with concurrent recognition of chord symbols and boundaries.''' Proceedings of 5th International Conference on Music Information Retrieval.&lt;br /&gt;
&lt;br /&gt;
6. Harte,C., Sandler,M., Abdallah,S. and Gómez,E.(2005). '''Symbolic representation of musical chords: a proposed syntax for text annotations.''' Proceedings of 6th International Conference on Music Information Retrieval.&lt;br /&gt;
&lt;br /&gt;
7. Papadopoulos,H. and Peeters,G.(2007). '''Large-scale study of chord estimation algorithms based on chroma representation and HMM.''' Proceedings of 5th International Conference on Content-Based Multimedia Indexing.&lt;br /&gt;
&lt;br /&gt;
8. Abdallah,S., Noland,K., Sandler,M., Casey,M. and Rhodes,C.(2005). '''Theory and evaluation of a Bayesian music structure extractor.''' Proceedings of 6th International Conference on Music Information Retrieval, 420-425.&lt;/div&gt;</summary>
		<author><name>Kahyun Choi</name></author>
		
	</entry>
</feed>