<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://music-ir.org/mirex/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=CameronJones</id>
	<title>MIREX Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://music-ir.org/mirex/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=CameronJones"/>
	<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/wiki/Special:Contributions/CameronJones"/>
	<updated>2026-05-08T05:35:25Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.31.1</generator>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010:MIREX2010_Results&amp;diff=7989</id>
		<title>2010:MIREX2010 Results</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010:MIREX2010_Results&amp;diff=7989"/>
		<updated>2011-06-06T21:59:18Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Reverted edits by Maldinii (Talk) to last revision by Kriswest&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==OVERALL RESULTS POSTERS (First Version: Will need updating as last runs are completed)==&lt;br /&gt;
[https://www.music-ir.org/mirex/results/2010/mirex_2010_poster.pdf MIREX 2010 Overall Results Posters (PDF)]&lt;br /&gt;
&lt;br /&gt;
==Results by Task ==&lt;br /&gt;
This year we ran many MIREX 2010 Tasks using the new [http://nema.lis.uiuc.edu/drupal/?q=nema/architecture NEMA MIREX DIY] infrastructure. Task results with &amp;quot;(DIY)&amp;quot; appended are those generated using the NEMA MIREX DIY system. Where appropriate, do explore the various new outputs that help visualize both individual and task-wide comparative performances. A demonstration [https://www.music-ir.org/diy-demo/ video] of the NEMA MIREX DIY system is also available.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Train-Test Task Set===&lt;br /&gt;
* [https://nema.lis.illinois.edu/nema_out/4ffcb482-b83c-4ba6-bc42-9b538b31143c/results/evaluation/ Audio Classical Composer Identification Results ]&amp;amp;nbsp;&amp;amp;nbsp; (DIY)&lt;br /&gt;
* [https://nema.lis.illinois.edu/nema_out/6731c97a-240c-4d3d-8be9-90d715ea04e1/results/evaluation/ Audio Latin Genre Classification Results ]&amp;amp;nbsp;&amp;amp;nbsp; (DIY)&lt;br /&gt;
* [https://nema.lis.illinois.edu/nema_out/9b11a5c8-9fcf-4029-95eb-51ed561cfb5f/results/evaluation/ Audio Music Mood Classification Results ]&amp;amp;nbsp;&amp;amp;nbsp; (DIY)&lt;br /&gt;
* [https://nema.lis.illinois.edu/nema_out/2b5839b3-3012-4f76-8807-31823588ae25/results/evaluation/ Audio Mixed Popular Genre Classification Results ]&amp;amp;nbsp;&amp;amp;nbsp; (DIY)&lt;br /&gt;
&lt;br /&gt;
===Other Tasks===&lt;br /&gt;
&lt;br /&gt;
* Audio Beat Tracking Results &lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/abt/mck/ MCK Dataset] &amp;amp;nbsp;(DIY)&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/abt/maz/ MAZ Dataset] &amp;amp;nbsp;(DIY)&lt;br /&gt;
* [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ace/ Audio Chord Detection]  &amp;amp;nbsp;(DIY)&lt;br /&gt;
* [[2010:Audio_Cover_Song_Identification_Results | Audio Cover Song Identification Results]] &lt;br /&gt;
* [https://nema.lis.illinois.edu/nema_out/mirex2010/results/akd/ Audio Key Detection Results] &amp;amp;nbsp;(DIY)&lt;br /&gt;
* Audio Melody Extraction Results&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ame/adc04/  ADC04 Dataset] &amp;amp;nbsp;(DIY)&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ame/mirex05/ MIREX05 Dataset] &amp;amp;nbsp;(DIY)&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ame/indian08/ INDIAN08 Dataset] &amp;amp;nbsp;(DIY)&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ame/mirex09_0dB/ MIREX09 0dB Dataset] &amp;amp;nbsp;(DIY)&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ame/mirex09_m5dB/ MIREX09 -5dB Dataset] &amp;amp;nbsp;(DIY)&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ame/mirex09_p5dB/ MIREX09 +5dB Dataset] &amp;amp;nbsp;(DIY)&lt;br /&gt;
* [[2010:Audio_Music_Similarity_and_Retrieval_Results | Audio Music Similarity and Retrieval Results]]&lt;br /&gt;
* [https://nema.lis.illinois.edu/nema_out/mirex2010/results/aod/ Audio Onset Detection Results] &amp;amp;nbsp;(DIY)&lt;br /&gt;
* Audio Tag Classification Results&lt;br /&gt;
** Major Miner Tag dataset&lt;br /&gt;
*** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/atg/subtask1_report/bin/ Binary relevance (classification evaluation)] &amp;amp;nbsp;(DIY)&lt;br /&gt;
*** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/atg/subtask1_report/aff/ Affinity estimation evaluation] &amp;amp;nbsp;(DIY)&lt;br /&gt;
** Mood Tag dataset&lt;br /&gt;
*** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/atg/subtask2_report/bin/ Binary relevance (classification evaluation)] &amp;amp;nbsp;(DIY)&lt;br /&gt;
*** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/atg/subtask2_report/aff/ Affinity estimation evaluation] &amp;amp;nbsp;(DIY)&lt;br /&gt;
* [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ate/ Audio Tempo Estimation Results] &amp;amp;nbsp;(DIY)&lt;br /&gt;
* [[2010:Multiple_Fundamental_Frequency_Estimation_&amp;amp;_Tracking_Results | Multiple Fundamental Frequency Estimation &amp;amp; Tracking Results]]&lt;br /&gt;
* Music Structure Segmentation Results&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/struct/mirex09/ MIREX09 dataset] &amp;amp;nbsp;(DIY)&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/struct/mirex10/ MIREX10 dataset] &amp;amp;nbsp;(DIY)&lt;br /&gt;
* [[2010:Query-by-Singing/Humming_Results | Query-by-Singing/Humming Results]] &lt;br /&gt;
* [[2010:Query-by-Tapping_Results | Query-by-Tapping Results]]&lt;br /&gt;
*[[2010:Real-time_Audio_to_Score_Alignment_(a.k.a._Score_Following)_Results | Real-time Audio to Score Alignment (a.k.a. Score Following) Results ]]&lt;br /&gt;
* [[2010:Symbolic_Melodic_Similarity_Results | Symbolic Melodic Similarity Results]]&lt;br /&gt;
&lt;br /&gt;
== Machine Specifications ==&lt;br /&gt;
&lt;br /&gt;
== Runtime for Submissions Run by NEMA DIY ==&lt;br /&gt;
&lt;br /&gt;
* [[2010:Runtime | Runtime]]&lt;br /&gt;
&lt;br /&gt;
[[Category:Results]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2011:MIREX_Home&amp;diff=7988</id>
		<title>2011:MIREX Home</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2011:MIREX_Home&amp;diff=7988"/>
		<updated>2011-06-06T21:59:11Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Reverted edits by Maldinii (Talk) to last revision by Jdownie&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Welcome to MIREX 2011==&lt;br /&gt;
This is the main page for the seventh running of the Music Information Retrieval Evaluation eXchange (MIREX 2011). The International Music Information Retrieval Systems Evaluation Laboratory ([https://music-ir.org/evaluation IMIRSEL]) at the Graduate School of Library and Information Science ([http://www.lis.illinois.edu GSLIS]), University of Illinois at Urbana-Champaign ([http://www.illinois.edu UIUC]) is the principal organizer of MIREX 2011. &lt;br /&gt;
&lt;br /&gt;
The MIREX 2011 community will hold its annual meeting as part of [http://ismir2011.ismir.net/ The 12th International Conference on Music Information Retrieval], ISMIR 2011, which will be held in Miami, Florida, during the week of October 23rd, 2011. The MIREX plenary (working lunch) and poster sessions will be held at a time to be determined during the conference.&lt;br /&gt;
&lt;br /&gt;
J. Stephen Downie&amp;lt;br&amp;gt;&lt;br /&gt;
Director, IMIRSEL&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===MIREX 2011 Evaluation Tasks===&lt;br /&gt;
&lt;br /&gt;
* [[2011:Audio Classification (Train/Test) Tasks]], incorporating:&lt;br /&gt;
** Audio Artist Identification&lt;br /&gt;
** Audio US Pop Genre Classification&lt;br /&gt;
** Audio Latin Genre Classification&lt;br /&gt;
** Audio Music Mood Classification&lt;br /&gt;
** Audio Classical Composer Identification&lt;br /&gt;
* [[2011:Audio Cover Song Identification]]&lt;br /&gt;
* [[2011:Audio Tag Classification]] &lt;br /&gt;
* [[2011:Audio Music Similarity and Retrieval]]&lt;br /&gt;
* [[2011:Symbolic Melodic Similarity]]&lt;br /&gt;
* [[2011:Audio Onset Detection]]&lt;br /&gt;
* [[2011:Audio Key Detection]]&lt;br /&gt;
* [[2011:Real-time Audio to Score Alignment (a.k.a Score Following)]]&lt;br /&gt;
* [[2011:Query by Singing/Humming]]&lt;br /&gt;
* [[2011:Audio Melody Extraction]]&lt;br /&gt;
* [[2011:Multiple Fundamental Frequency Estimation &amp;amp; Tracking]]&lt;br /&gt;
* [[2011:Audio Chord Estimation]]&lt;br /&gt;
* [[2011:Query by Tapping]]&lt;br /&gt;
* [[2011:Audio Beat Tracking]]&lt;br /&gt;
* [[2011:Structural Segmentation]]&lt;br /&gt;
* [[2011:Audio Tempo Estimation]]&lt;br /&gt;
&lt;br /&gt;
===Note to New Participants===&lt;br /&gt;
Please take the time to read the following review article that explains the history and structure of MIREX.&lt;br /&gt;
&lt;br /&gt;
Downie, J. Stephen (2008). The Music Information Retrieval Evaluation Exchange (2005-2007):&amp;lt;br&amp;gt;&lt;br /&gt;
A window into music information retrieval research. ''Acoustical Science and Technology 29'' (4): 247-255. &amp;lt;br&amp;gt;&lt;br /&gt;
Available at: [http://dx.doi.org/10.1250/ast.29.247 http://dx.doi.org/10.1250/ast.29.247]&lt;br /&gt;
&lt;br /&gt;
===Note to All Participants===&lt;br /&gt;
Because MIREX is premised upon the sharing of ideas and results, '''ALL''' MIREX participants are expected to:&lt;br /&gt;
&lt;br /&gt;
# submit, at submission time, a DRAFT 2-3 page extended abstract PDF in the ISMIR format describing the submitted programme(s), to help us and the community better understand how each algorithm works.&lt;br /&gt;
# submit a FINALIZED 2-3 page extended abstract PDF in the ISMIR format prior to ISMIR 2011 for posting on the respective results pages (sometimes the same abstract can be used for multiple submissions; in many cases the DRAFT and FINALIZED abstracts are the same)&lt;br /&gt;
# present a poster at the MIREX 2011 poster session at ISMIR 2011 (date and time to be determined)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Software Dependency Requests===&lt;br /&gt;
If you have not submitted to MIREX before or are unsure whether IMIRSEL/NEMA currently supports some of the software/architecture dependencies for your submission a [https://spreadsheets.google.com/embeddedform?formkey=dDltRjc4NDBDdkZiaF9qZXV0bU5ScUE6MA dependency request form is available]. Please submit details of your dependencies on this form and the IMIRSEL team will attempt to satisfy them for you. &lt;br /&gt;
&lt;br /&gt;
Due to the high volume of submissions expected at MIREX 2011, submissions with difficult-to-satisfy dependencies for which the team has not been given sufficient notice may be rejected.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Finally, you will also be expected to detail your software/architecture dependencies in a README file to be provided to the submission system.&lt;br /&gt;
&lt;br /&gt;
==Getting Involved in MIREX 2011==&lt;br /&gt;
MIREX is a community-based endeavour. Be a part of the community and help make MIREX 2011 the best yet.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Mailing List Participation===&lt;br /&gt;
If you are interested in formal MIR evaluation, you should also subscribe to the &amp;quot;MIREX&amp;quot; (aka &amp;quot;EvalFest&amp;quot;) mailing list and participate in the community discussions about defining and running MIREX 2011 tasks. Subscription information is available at: &lt;br /&gt;
[https://mail.lis.illinois.edu/mailman/listinfo/evalfest EvalFest Central]. &lt;br /&gt;
&lt;br /&gt;
If you are participating in MIREX 2011, it is VERY IMPORTANT that you are subscribed to EvalFest. Deadlines, task updates and other important information will be announced via this mailing list. Please use EvalFest for discussion of MIREX task proposals and other MIREX-related issues. This wiki (the MIREX 2011 wiki) will be used to embody and disseminate task proposals; however, task-related discussions should be conducted on the MIREX organization mailing list (EvalFest) rather than on this wiki, and then summarized here. &lt;br /&gt;
&lt;br /&gt;
Where possible, definitions or example code for new evaluation metrics or tasks should be provided to the IMIRSEL team, who will embody them in software as part of the NEMA analytics framework. The framework will be released to the community at or before ISMIR 2011, providing a standardised set of interfaces and outputs for disciplined evaluation procedures across a great many MIR tasks.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Wiki Participation===&lt;br /&gt;
Please create an account via: [[Special:Userlogin]].&lt;br /&gt;
&lt;br /&gt;
Please note that because of &amp;quot;spam-bots&amp;quot;, MIREX wiki registration requests may be moderated by IMIRSEL members. It might take up to 24 hours for approval (Thank you for your patience!).&lt;br /&gt;
&lt;br /&gt;
==MIREX 2005 - 2010 Wikis==&lt;br /&gt;
Content from MIREX 2005 - 2010 is available at:&lt;br /&gt;
&lt;br /&gt;
'''[[2010:Main_Page|MIREX 2010]]''' &lt;br /&gt;
'''[[2009:Main_Page|MIREX 2009]]''' &lt;br /&gt;
'''[[2008:Main_Page|MIREX 2008]]''' &lt;br /&gt;
'''[[2007:Main_Page|MIREX 2007]]''' &lt;br /&gt;
'''[[2006:Main_Page|MIREX 2006]]''' &lt;br /&gt;
'''[[2005:Main_Page|MIREX 2005]]'''&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7801</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7801"/>
		<updated>2010-12-02T18:47:15Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* mirex by year&lt;br /&gt;
** 2011:Main_Page|MIREX 2011&lt;br /&gt;
** 2010:Main_Page|MIREX 2010&lt;br /&gt;
** 2009:Main_Page|MIREX 2009&lt;br /&gt;
** 2008:Main_Page|MIREX 2008&lt;br /&gt;
** 2007:Main_Page|MIREX 2007&lt;br /&gt;
** 2006:Main_Page|MIREX 2006&lt;br /&gt;
** 2005:Main_Page|MIREX 2005&lt;br /&gt;
&lt;br /&gt;
*results by year&lt;br /&gt;
**2010:MIREX2010_Results| MIREX 2010 Results&lt;br /&gt;
**2009:MIREX2009_Results| MIREX 2009 Results &lt;br /&gt;
**2008:MIREX2008_Results| MIREX 2008 Results &lt;br /&gt;
**2007:MIREX2007_Results| MIREX 2007 Results &lt;br /&gt;
**2006:MIREX2006_Results| MIREX 2006 Results &lt;br /&gt;
**2005:MIREX2005_Results| MIREX 2005 Results &lt;br /&gt;
&lt;br /&gt;
* SEARCH&lt;br /&gt;
&lt;br /&gt;
* navigation&lt;br /&gt;
** mainpage|MIREX CENTRAL HOME&lt;br /&gt;
** portal-url|portal&lt;br /&gt;
** currentevents-url|currentevents&lt;br /&gt;
** recentchanges-url|recentchanges&lt;br /&gt;
** randompage-url|randompage&lt;br /&gt;
** helppage|help&lt;br /&gt;
&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MIREX_HOME&amp;diff=7800</id>
		<title>MIREX HOME</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MIREX_HOME&amp;diff=7800"/>
		<updated>2010-12-02T17:14:46Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Undo revision 7799 by CoryDAllen (Talk)&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;__TOC__&lt;br /&gt;
&lt;br /&gt;
== Current MIREX Wiki (2010) ==&lt;br /&gt;
You can view the current 2010 content here: [[2010:MIREX HOME]]&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
The Music Information Retrieval Evaluation eXchange (MIREX) is an annual evaluation campaign for Music Information Retrieval (MIR) algorithms, coupled to the [http://www.ismir.net International Society (and Conference) for Music Information Retrieval (ISMIR)]. MIREX is hosted by the [https://www.music-ir.org/evaluation/ International Music Information Retrieval Systems Evaluation Laboratory (IMIRSEL)] at the [http://www.lis.illinois.edu/ Graduate School of Library and Information Science (GSLIS)], which is part of the [http://www.illinois.edu/ University of Illinois at Urbana-Champaign (UIUC)].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Current and future MIREXs are in part facilitated by the work of the [https://nema.lis.illinois.edu/ Networked Environment for Music Analysis (NEMA) project]. The NEMA project aims to automate and expose the workings of MIREX and MIR experimentation/evaluation to the MIR community. It helps address collection/data/code sharing within the community by handling copyright and IP restrictions, allowing MIR researchers to work (remotely) with resources without having to obtain licenses to the content/code.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
MIR tasks evaluated at past MIREXs include:&lt;br /&gt;
*  [[Audio Train/Test Tasks]]&lt;br /&gt;
** Audio Artist Identification&lt;br /&gt;
** Audio Genre Classification&lt;br /&gt;
** Audio Music Mood Classification&lt;br /&gt;
** Audio Classical Composer Identification&lt;br /&gt;
* [[Symbolic Genre Classification]]&lt;br /&gt;
* [[Audio Onset Detection]]&lt;br /&gt;
* [[Audio Key Detection]]&lt;br /&gt;
* [[Symbolic Key Detection]]&lt;br /&gt;
* [[Audio Tag Classification]]&lt;br /&gt;
* [[Audio Cover Song Identification]]&lt;br /&gt;
* [[Real-time Audio to Score Alignment (a.k.a Score Following)]]&lt;br /&gt;
* [[Query by Singing/Humming]]&lt;br /&gt;
* [[Multiple Fundamental Frequency Estimation &amp;amp; Tracking]]&lt;br /&gt;
* [[Audio Chord Estimation]]&lt;br /&gt;
* [[Audio Melody Extraction]]&lt;br /&gt;
* [[Query by Tapping]]&lt;br /&gt;
* [[Audio Beat Tracking]]&lt;br /&gt;
* [[Audio Music Similarity and Retrieval]]&lt;br /&gt;
* [[Symbolic Melodic Similarity]]&lt;br /&gt;
* [[Structural Segmentation]]&lt;br /&gt;
* [[Audio Drum Detection]]&lt;br /&gt;
* [[Audio Tempo Extraction]]&lt;br /&gt;
&lt;br /&gt;
== Recent Changes ==&lt;br /&gt;
We have recently merged all current and previous iterations of the MIREX wiki into a single wiki installation to make it easier to manage. All the pages, images, and abstracts have been migrated, but some links and images may still be broken. We're currently manually inspecting all pages, but would appreciate your help in correcting any errors you see.&lt;br /&gt;
&lt;br /&gt;
Content on the wiki is now organized into MediaWiki namespaces, one for each year. You can view the current 2010 content here: [[2010:MIREX HOME]]&lt;br /&gt;
&lt;br /&gt;
Similarly for previous content.&lt;br /&gt;
* [[2009:Main_Page]]&lt;br /&gt;
* [[2008:Main_Page]]&lt;br /&gt;
* [[2007:Main_Page]]&lt;br /&gt;
* [[2006:Main_Page]]&lt;br /&gt;
* [[2005:Main_Page]]&lt;br /&gt;
&lt;br /&gt;
All links to older wiki content will be redirected to this new wiki and should take you to the correct page on the new installation, but please update any bookmarks or links you may have that point to current or old wiki content.&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010:MIREX2010_Results&amp;diff=7773</id>
		<title>2010:MIREX2010 Results</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010:MIREX2010_Results&amp;diff=7773"/>
		<updated>2010-08-26T04:49:44Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: fixed poster link&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==OVERALL RESULTS POSTERS (NOT READY YET)==&lt;br /&gt;
[https://www.music-ir.org/mirex/results/2010/mirex_2010_poster.pdf MIREX 2010 Overall Results Posters (PDF)] &lt;br /&gt;
&lt;br /&gt;
==Results by Task ==&lt;br /&gt;
&lt;br /&gt;
===Train-Test Task Set===&lt;br /&gt;
* [http://nema.lis.uiuc.edu/nema_out/664ccbda-d5b6-48ae-8c47-c27e7c2372fe/results/evaluation/ Audio Classical Composer Identification Results ]&lt;br /&gt;
* [http://nema.lis.uiuc.edu/nema_out/d97c8282-883e-4e71-93b7-55283829ad21/results/evaluation/ Audio Latin Genre Classification Results ]&lt;br /&gt;
* [http://nema.lis.uiuc.edu/nema_out/0e2212ca-2c1a-4c4e-b164-de74974afe43/results/evaluation/ Audio Music Mood Classification Results ]&lt;br /&gt;
* [[2010:Audio_Mixed_Popular_Genre_Classification_Results | Audio Mixed Popular Genre Classification Results]]&lt;br /&gt;
&lt;br /&gt;
===Other Tasks===&lt;br /&gt;
&lt;br /&gt;
* Audio Beat Tracking Results&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/abt/mck/ MCK Dataset]&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/abt/maz/ MAZ Dataset]&lt;br /&gt;
* [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ace/ Audio Chord Detection]&lt;br /&gt;
* [[2010:Audio_Cover_Song_Identification_Results | Audio Cover Song Identification Results]] &lt;br /&gt;
* [https://nema.lis.illinois.edu/nema_out/mirex2010/results/akd/ Audio Key Detection Results]&lt;br /&gt;
* Audio Melody Extraction Results&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ame/adc04/  ADC04 Dataset]&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ame/mirex05/ MIREX05 Dataset]&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ame/indian08/ INDIAN08 Dataset]&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ame/mirex09_0dB/ MIREX09 0dB Dataset]&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ame/mirex09_m5dB/ MIREX09 -5dB Dataset]&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ame/mirex09_p5dB/ MIREX09 +5dB Dataset]&lt;br /&gt;
* [[2010:Audio_Music_Similarity_and_Retrieval_Results | Audio Music Similarity and Retrieval Results]]&lt;br /&gt;
* [https://nema.lis.illinois.edu/nema_out/mirex2010/results/aod/ Audio Onset Detection Results] &lt;br /&gt;
* Audio Tag Classification Results&lt;br /&gt;
** Major Miner Tag dataset&lt;br /&gt;
*** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/atg/subtask1_report/bin/ Binary relevance (classification evaluation)]&lt;br /&gt;
*** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/atg/subtask1_report/aff/ Affinity estimation evaluation]&lt;br /&gt;
** Mood Tag dataset&lt;br /&gt;
*** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/atg/subtask2_report/bin/ Binary relevance (classification evaluation)]&lt;br /&gt;
*** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/atg/subtask2_report/aff/ Affinity estimation evaluation]&lt;br /&gt;
* [https://nema.lis.illinois.edu/nema_out/mirex2010/results/ate/ Audio Tempo Estimation Results] &lt;br /&gt;
* [[2010:Multiple_Fundamental_Frequency_Estimation_&amp;amp;_Tracking_Results | Multiple Fundamental Frequency Estimation &amp;amp; Tracking Results]]&lt;br /&gt;
* Music Structure Segmentation Results&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/struct/mirex09/ MIREX09 dataset]&lt;br /&gt;
** [https://nema.lis.illinois.edu/nema_out/mirex2010/results/struct/mirex10/ MIREX10 dataset]&lt;br /&gt;
* [[2010:Query-by-Singing/Humming_Results | Query-by-Singing/Humming Results]] &lt;br /&gt;
* [[2010:Query-by-Tapping_Results | Query-by-Tapping Results]]&lt;br /&gt;
*[[2010:Real-time_Audio_to_Score_Alignment_(a.k.a._Score_Following)_Results | Real-time Audio to Score Alignment (a.k.a. Score Following) Results ]]&lt;br /&gt;
* [[2010:Symbolic_Melodic_Similarity_Results | Symbolic Melodic Similarity Results]]&lt;br /&gt;
&lt;br /&gt;
== Machine Specifications ==&lt;br /&gt;
&lt;br /&gt;
== Runtime for Submissions Run by NEMA DIY ==&lt;br /&gt;
&lt;br /&gt;
* [[2010:Runtime | Runtime]]&lt;br /&gt;
&lt;br /&gt;
[[Category:Results]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2011&amp;diff=7748</id>
		<title>2011</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2011&amp;diff=7748"/>
		<updated>2010-08-19T16:57:48Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: redirect&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[2011:MIREX Home]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2011:Main_Page&amp;diff=7747</id>
		<title>2011:Main Page</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2011:Main_Page&amp;diff=7747"/>
		<updated>2010-08-19T16:57:28Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Redirect&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[2011:MIREX Home]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7746</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7746"/>
		<updated>2010-08-19T16:56:51Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* mirex by year&lt;br /&gt;
** 2011:Main_Page|MIREX 2011&lt;br /&gt;
** 2010:Main_Page|MIREX 2010&lt;br /&gt;
** 2009:Main_Page|MIREX 2009&lt;br /&gt;
** 2008:Main_Page|MIREX 2008&lt;br /&gt;
** 2007:Main_Page|MIREX 2007&lt;br /&gt;
** 2006:Main_Page|MIREX 2006&lt;br /&gt;
** 2005:Main_Page|MIREX 2005&lt;br /&gt;
&lt;br /&gt;
*results by year&lt;br /&gt;
**2010:MIREX2010_Results| MIREX 2010 Not Ready&lt;br /&gt;
**2009:MIREX2009_Results| MIREX 2009 Results &lt;br /&gt;
**2008:MIREX2008_Results| MIREX 2008 Results &lt;br /&gt;
**2007:MIREX2007_Results| MIREX 2007 Results &lt;br /&gt;
**2006:MIREX2006_Results| MIREX 2006 Results &lt;br /&gt;
**2005:MIREX2005_Results| MIREX 2005 Results &lt;br /&gt;
&lt;br /&gt;
* SEARCH&lt;br /&gt;
&lt;br /&gt;
* navigation&lt;br /&gt;
** mainpage|MIREX CENTRAL HOME&lt;br /&gt;
** portal-url|portal&lt;br /&gt;
** currentevents-url|currentevents&lt;br /&gt;
** recentchanges-url|recentchanges&lt;br /&gt;
** randompage-url|randompage&lt;br /&gt;
** helppage|help&lt;br /&gt;
&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2011:MIREX_Home&amp;diff=7745</id>
		<title>2011:MIREX Home</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2011:MIREX_Home&amp;diff=7745"/>
		<updated>2010-08-19T16:56:12Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Welcome to MIREX 2011==&lt;br /&gt;
This is the main page for the seventh running of the Music Information Retrieval Evaluation eXchange (MIREX 2011). The International Music Information Retrieval Systems Evaluation Laboratory ([https://music-ir.org/evaluation IMIRSEL]) at the Graduate School of Library and Information Science ([http://www.lis.illinois.edu GSLIS]), University of Illinois at Urbana-Champaign ([http://www.illinois.edu UIUC]) is the principal organizer of MIREX 2011. &lt;br /&gt;
&lt;br /&gt;
The MIREX 2011 community will hold its annual meeting as part of [http://ismir2011.ismir.net/ The 11th International Conference on Music Information Retrieval], ISMIR 2011, which will be held in Utrecht, Netherlands, from August 9th to 13th, 2011. The MIREX plenary (working lunch) and poster sessions will be held Wednesday, 11 August 2011.&lt;br /&gt;
&lt;br /&gt;
J. Stephen Downie&amp;lt;br&amp;gt;&lt;br /&gt;
Director, IMIRSEL&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===MIREX 2011 Evaluation Tasks===&lt;br /&gt;
&lt;br /&gt;
* [[2011:Audio Classification (Train/Test) Tasks]], incorporating:&lt;br /&gt;
** Audio Artist Identification&lt;br /&gt;
** Audio US Pop Genre Classification&lt;br /&gt;
** Audio Latin Genre Classification&lt;br /&gt;
** Audio Music Mood Classification&lt;br /&gt;
** Audio Classical Composer Identification&lt;br /&gt;
* [[2011:Audio Cover Song Identification]]&lt;br /&gt;
* [[2011:Audio Tag Classification]] &lt;br /&gt;
* [[2011:Audio Music Similarity and Retrieval]]&lt;br /&gt;
* [[2011:Symbolic Melodic Similarity]]&lt;br /&gt;
* [[2011:Audio Onset Detection]]&lt;br /&gt;
* [[2011:Audio Key Detection]]&lt;br /&gt;
* [[2011:Real-time Audio to Score Alignment (a.k.a Score Following)]]&lt;br /&gt;
* [[2011:Query by Singing/Humming]]&lt;br /&gt;
* [[2011:Audio Melody Extraction]]&lt;br /&gt;
* [[2011:Multiple Fundamental Frequency Estimation &amp;amp; Tracking]]&lt;br /&gt;
* [[2011:Audio Chord Estimation]]&lt;br /&gt;
* [[2011:Query by Tapping]]&lt;br /&gt;
* [[2011:Audio Beat Tracking]]&lt;br /&gt;
* [[2011:Structural Segmentation]]&lt;br /&gt;
* [[2011:Audio Tempo Estimation]]&lt;br /&gt;
&lt;br /&gt;
===Note to New Participants===&lt;br /&gt;
Please take the time to read the following review article that explains the history and structure of MIREX.&lt;br /&gt;
&lt;br /&gt;
Downie, J. Stephen (2008). The Music Information Retrieval Evaluation Exchange (2005-2007):&amp;lt;br&amp;gt;&lt;br /&gt;
A window into music information retrieval research. ''Acoustical Science and Technology 29'' (4): 247-255. &amp;lt;br&amp;gt;&lt;br /&gt;
Available at: [http://dx.doi.org/10.1250/ast.29.247 http://dx.doi.org/10.1250/ast.29.247]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Note to All Participants===&lt;br /&gt;
Because MIREX is premised upon the sharing of ideas and results, '''ALL''' MIREX participants are expected to:&lt;br /&gt;
&lt;br /&gt;
# submit, at submission time, a DRAFT 2-3 page extended abstract PDF in the ISMIR format describing the submitted programme(s), to help us and the community better understand how each algorithm works.&lt;br /&gt;
# submit a FINALIZED 2-3 page extended abstract PDF in the ISMIR format prior to ISMIR 2011 for posting on the respective results pages (sometimes the same abstract can be used for multiple submissions; in many cases the DRAFT and FINALIZED abstracts are the same)&lt;br /&gt;
# present a poster at the MIREX 2011 poster session at ISMIR 2011 (Wednesday, 11 August 2011)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Software Dependency Requests===&lt;br /&gt;
If you have not submitted to MIREX before or are unsure whether IMIRSEL/NEMA currently supports some of the software/architecture dependencies for your submission a [https://spreadsheets.google.com/embeddedform?formkey=dDltRjc4NDBDdkZiaF9qZXV0bU5ScUE6MA dependency request form is available]. Please submit details of your dependencies on this form and the IMIRSEL team will attempt to satisfy them for you. &lt;br /&gt;
&lt;br /&gt;
Due to the high volume of submissions expected at MIREX 2011, submissions with difficult-to-satisfy dependencies for which the team has not been given sufficient notice may be rejected.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Finally, you will also be expected to detail your software/architecture dependencies in a README file to be provided to the submission system.&lt;br /&gt;
&lt;br /&gt;
==Getting Involved in MIREX 2011==&lt;br /&gt;
MIREX is a community-based endeavour. Be a part of the community and help make MIREX 2011 the best yet.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Mailing List Participation===&lt;br /&gt;
If you are interested in formal MIR evaluation, you should also subscribe to the &amp;quot;MIREX&amp;quot; (aka &amp;quot;EvalFest&amp;quot;) mailing list and participate in the community discussions about defining and running MIREX 2011 tasks. Subscription information is available at: &lt;br /&gt;
[https://mail.lis.illinois.edu/mailman/listinfo/evalfest EvalFest Central]. &lt;br /&gt;
&lt;br /&gt;
If you are participating in MIREX 2011, it is VERY IMPORTANT that you are subscribed to EvalFest. Deadlines, task updates and other important information will be announced via this mailing list. Please use EvalFest for discussion of MIREX task proposals and other MIREX-related issues. This wiki (the MIREX 2011 wiki) will be used to record and disseminate task proposals; however, task-related discussions should be conducted on the MIREX organization mailing list (EvalFest) rather than on this wiki, and then summarized here. &lt;br /&gt;
&lt;br /&gt;
Where possible, definitions or example code for new evaluation metrics or tasks should be provided to the IMIRSEL team, who will embody them in software as part of the NEMA analytics framework. The framework will be released to the community at or before ISMIR 2011, providing a standardised set of interfaces and outputs for disciplined evaluation procedures across a great many MIR tasks.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Wiki Participation===&lt;br /&gt;
Please create an account via: [[Special:Userlogin]].&lt;br /&gt;
&lt;br /&gt;
Please note that because of &amp;quot;spam-bots&amp;quot;, MIREX wiki registration requests may be moderated by IMIRSEL members. It might take up to 24 hours for approval (Thank you for your patience!).&lt;br /&gt;
&lt;br /&gt;
==MIREX 2005 - 2010 Wikis==&lt;br /&gt;
Content from MIREX 2005 - 2010 is available at:&lt;br /&gt;
&lt;br /&gt;
'''[[2010:Main_Page|MIREX 2010]]''' &lt;br /&gt;
'''[[2009:Main_Page|MIREX 2009]]''' &lt;br /&gt;
'''[[2008:Main_Page|MIREX 2008]]''' &lt;br /&gt;
'''[[2007:Main_Page|MIREX 2007]]''' &lt;br /&gt;
'''[[2006:Main_Page|MIREX 2006]]''' &lt;br /&gt;
'''[[2005:Main_Page|MIREX 2005]]'''&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2011:MIREX_Home&amp;diff=7744</id>
		<title>2011:MIREX Home</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2011:MIREX_Home&amp;diff=7744"/>
		<updated>2010-08-19T16:55:40Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Created page with ' ==Welcome to MIREX 2011== This is the main page for the sixth running of the Music Information Retrieval Evaluation eXchange (MIREX 2011). The International Music Information Re…'&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
==Welcome to MIREX 2011==&lt;br /&gt;
This is the main page for the sixth running of the Music Information Retrieval Evaluation eXchange (MIREX 2011). The International Music Information Retrieval Systems Evaluation Laboratory ([https://music-ir.org/evaluation IMIRSEL]) at the Graduate School of Library and Information Science ([http://www.lis.illinois.edu GSLIS]), University of Illinois at Urbana-Champaign ([http://www.illinois.edu UIUC]) is the principal organizer of MIREX 2011. &lt;br /&gt;
&lt;br /&gt;
The MIREX 2011 community will hold its annual meeting as part of [http://ismir2011.ismir.net/ The 11th International Conference on Music Information Retrieval], ISMIR 2011, which will be held in Utrecht, Netherlands, from August 9th to 13th, 2011. The MIREX plenary (working lunch) and poster sessions will be held Wednesday, 11 August 2011.&lt;br /&gt;
&lt;br /&gt;
J. Stephen Downie&amp;lt;br&amp;gt;&lt;br /&gt;
Director, IMIRSEL&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===MIREX 2011 Evaluation Tasks===&lt;br /&gt;
&lt;br /&gt;
* [[2011:Audio Classification (Train/Test) Tasks]], incorporating:&lt;br /&gt;
** Audio Artist Identification&lt;br /&gt;
** Audio US Pop Genre Classification&lt;br /&gt;
** Audio Latin Genre Classification&lt;br /&gt;
** Audio Music Mood Classification&lt;br /&gt;
** Audio Classical Composer Identification&lt;br /&gt;
* [[2011:Audio Cover Song Identification]]&lt;br /&gt;
* [[2011:Audio Tag Classification]] &lt;br /&gt;
* [[2011:Audio Music Similarity and Retrieval]]&lt;br /&gt;
* [[2011:Symbolic Melodic Similarity]]&lt;br /&gt;
* [[2011:Audio Onset Detection]]&lt;br /&gt;
* [[2011:Audio Key Detection]]&lt;br /&gt;
* [[2011:Real-time Audio to Score Alignment (a.k.a Score Following)]]&lt;br /&gt;
* [[2011:Query by Singing/Humming]]&lt;br /&gt;
* [[2011:Audio Melody Extraction]]&lt;br /&gt;
* [[2011:Multiple Fundamental Frequency Estimation &amp;amp; Tracking]]&lt;br /&gt;
* [[2011:Audio Chord Estimation]]&lt;br /&gt;
* [[2011:Query by Tapping]]&lt;br /&gt;
* [[2011:Audio Beat Tracking]]&lt;br /&gt;
* [[2011:Structural Segmentation]]&lt;br /&gt;
* [[2011:Audio Tempo Estimation]]&lt;br /&gt;
&lt;br /&gt;
===Note to New Participants===&lt;br /&gt;
Please take the time to read the following review article that explains the history and structure of MIREX.&lt;br /&gt;
&lt;br /&gt;
Downie, J. Stephen (2008). The Music Information Retrieval Evaluation Exchange (2005-2007):&amp;lt;br&amp;gt;&lt;br /&gt;
A window into music information retrieval research. ''Acoustical Science and Technology 29'' (4): 247-255. &amp;lt;br&amp;gt;&lt;br /&gt;
Available at: [http://dx.doi.org/10.1250/ast.29.247 http://dx.doi.org/10.1250/ast.29.247]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Note to All Participants===&lt;br /&gt;
Because MIREX is premised upon the sharing of ideas and results, '''ALL''' MIREX participants are expected to:&lt;br /&gt;
&lt;br /&gt;
# submit, along with their programme(s), a DRAFT 2-3 page extended abstract PDF in the ISMIR format that helps us and the community better understand how the algorithm works&lt;br /&gt;
# submit a FINALIZED 2-3 page extended abstract PDF in the ISMIR format prior to ISMIR 2011 for posting on the respective results pages (sometimes the same abstract can be used for multiple submissions; in many cases the DRAFT and FINALIZED abstracts are the same)&lt;br /&gt;
# present a poster at the MIREX 2011 poster session at ISMIR 2011 (Wednesday, 11 August 2011)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Software Dependency Requests===&lt;br /&gt;
If you have not submitted to MIREX before, or are unsure whether IMIRSEL/NEMA currently supports some of the software/architecture dependencies for your submission, a [https://spreadsheets.google.com/embeddedform?formkey=dDltRjc4NDBDdkZiaF9qZXV0bU5ScUE6MA dependency request form is available]. Please submit details of your dependencies on this form and the IMIRSEL team will attempt to satisfy them for you. &lt;br /&gt;
&lt;br /&gt;
Due to the high volume of submissions expected at MIREX 2011, a submission with difficult-to-satisfy dependencies of which the team has not been given sufficient notice may be rejected.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Finally, you will also be expected to detail your software/architecture dependencies in a README file to be provided to the submission system.&lt;br /&gt;
&lt;br /&gt;
==Getting Involved in MIREX 2011==&lt;br /&gt;
MIREX is a community-based endeavour. Be a part of the community and help make MIREX 2011 the best yet.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Mailing List Participation===&lt;br /&gt;
If you are interested in formal MIR evaluation, you should also subscribe to the &amp;quot;MIREX&amp;quot; (aka &amp;quot;EvalFest&amp;quot;) mailing list and participate in the community discussions about defining and running MIREX 2011 tasks. Subscription information is available at: &lt;br /&gt;
[https://mail.lis.illinois.edu/mailman/listinfo/evalfest EvalFest Central]. &lt;br /&gt;
&lt;br /&gt;
If you are participating in MIREX 2011, it is VERY IMPORTANT that you are subscribed to EvalFest. Deadlines, task updates and other important information will be announced via this mailing list. Please use EvalFest for discussion of MIREX task proposals and other MIREX-related issues. This wiki (the MIREX 2011 wiki) will be used to record and disseminate task proposals; however, task-related discussions should be conducted on the MIREX organization mailing list (EvalFest) rather than on this wiki, and then summarized here. &lt;br /&gt;
&lt;br /&gt;
Where possible, definitions or example code for new evaluation metrics or tasks should be provided to the IMIRSEL team, who will embody them in software as part of the NEMA analytics framework. The framework will be released to the community at or before ISMIR 2011, providing a standardised set of interfaces and outputs for disciplined evaluation procedures across a great many MIR tasks.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Wiki Participation===&lt;br /&gt;
'''''Please note that you may need to create a NEW login for this wiki even if you have a login that you previously used for editing the MIREX 2005, 2006, 2007, 2008 or 2009 wikis.'''''&lt;br /&gt;
&lt;br /&gt;
However, starting in 2011 the MIREX wikis have been merged so that logins will persist for future iterations of MIREX.&lt;br /&gt;
&lt;br /&gt;
Please create an account via: [[Special:Userlogin]].&lt;br /&gt;
&lt;br /&gt;
Please note that because of &amp;quot;spam-bots&amp;quot;, MIREX wiki registration requests may be moderated by IMIRSEL members. It might take up to 24 hours for approval (Thank you for your patience!).&lt;br /&gt;
&lt;br /&gt;
==MIREX 2005 - 2010 Wikis==&lt;br /&gt;
Content from MIREX 2005 - 2010 is available at:&lt;br /&gt;
&lt;br /&gt;
'''[[2010:Main_Page|MIREX 2010]]''' &lt;br /&gt;
'''[[2009:Main_Page|MIREX 2009]]''' &lt;br /&gt;
'''[[2008:Main_Page|MIREX 2008]]''' &lt;br /&gt;
'''[[2007:Main_Page|MIREX 2007]]''' &lt;br /&gt;
'''[[2006:Main_Page|MIREX 2006]]''' &lt;br /&gt;
'''[[2005:Main_Page|MIREX 2005]]'''&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010:Audio_Music_Similarity_and_Retrieval_Results&amp;diff=7624</id>
		<title>2010:Audio Music Similarity and Retrieval Results</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010:Audio_Music_Similarity_and_Retrieval_Results&amp;diff=7624"/>
		<updated>2010-08-03T01:39:01Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: /* FINE Scores */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Introduction ==&lt;br /&gt;
These are the results for the 2010 running of the Audio Music Similarity and Retrieval task set. For background information about this task set please refer to the Audio Music Similarity and Retrieval page.&lt;br /&gt;
&lt;br /&gt;
Each system was given 7000 songs chosen from IMIRSEL's &amp;quot;uspop&amp;quot;, &amp;quot;uscrap&amp;quot;, &amp;quot;american&amp;quot;, &amp;quot;classical&amp;quot; and &amp;quot;sundry&amp;quot; collections. Each system then returned a 7000x7000 distance matrix. 100 songs were randomly selected from the 10 genre groups (10 per genre) as queries, and the 5 most highly ranked songs out of the 7000 were extracted for each query (after filtering out the query itself; returned results from the same artist were also omitted). Then, for each query, the returned results (candidates) from all participants were grouped and evaluated by human graders using the Evalutron 6000 grading system. Each individual query/candidate set was evaluated by a single grader. For each query/candidate pair, graders provided two scores: one categorical '''BROAD''' score with 3 categories (NS, SS, VS), as explained below, and one '''FINE''' score in the range from 0 to 100. A description and analysis are provided below.&lt;br /&gt;
&lt;br /&gt;
The systems read in 30-second audio clips as their raw data. The same 30-second clips were used in the grading stage. &lt;br /&gt;
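The candidate-extraction step described above can be sketched as follows. This is a minimal illustration only, not the actual IMIRSEL code; the function `top_candidates` and its arguments are hypothetical:

```python
def top_candidates(distances, artists, query, n=5):
    """Return indices of the n nearest tracks to `query`, skipping the
    query itself and any track by the query's artist (artist filtering).
    distances: square matrix (list of lists); artists: one label per track."""
    ranked = sorted(range(len(distances[query])),
                    key=lambda j: distances[query][j])
    picks = []
    for j in ranked:
        if j == query or artists[j] == artists[query]:
            continue  # filter the query itself and same-artist results
        picks.append(j)
        if len(picks) == n:
            break
    return picks
```

For example, with a 4x4 distance matrix and artist labels, the query track and any same-artist track are dropped before the top n are returned.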
&lt;br /&gt;
&lt;br /&gt;
=== General Legend ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Team ID ====&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellspacing=&amp;quot;0&amp;quot; style=&amp;quot;text-align: left; width: 800px;&amp;quot;&lt;br /&gt;
	|- style=&amp;quot;background: yellow;&amp;quot;&lt;br /&gt;
	! width=&amp;quot;80&amp;quot; | Sub code &lt;br /&gt;
	! width=&amp;quot;200&amp;quot; | Submission name &lt;br /&gt;
	! width=&amp;quot;80&amp;quot; style=&amp;quot;text-align: center;&amp;quot; | Abstract &lt;br /&gt;
	! width=&amp;quot;440&amp;quot; | Contributors&lt;br /&gt;
	|-&lt;br /&gt;
	! BWL1&lt;br /&gt;
	| MTG-AMS ||  style=&amp;quot;text-align: center;&amp;quot; | [https://www.music-ir.org/mirex/abstracts/2010/BWL1.pdf PDF] || [http://mtg.upf.edu Dmitry Bogdanov], [http://mtg.upf.edu Nicolas Wack], [http://mtg.upf.edu Cyril Laurier]&lt;br /&gt;
	|-&lt;br /&gt;
	! PS1&lt;br /&gt;
	| PS09 ||  style=&amp;quot;text-align: center;&amp;quot; | [https://www.music-ir.org/mirex/abstracts/2010/PS1.pdf PDF] || [http://www.cp.jku.at/ Tim Pohle], [http://www.cp.jku.at/ Dominik Schnitzer]&lt;br /&gt;
	|-&lt;br /&gt;
	! PSS1&lt;br /&gt;
	| PSS10 ||  style=&amp;quot;text-align: center;&amp;quot; | [https://www.music-ir.org/mirex/abstracts/2010/PSS1.pdf PDF] || [http://www.cp.jku.at/ Tim Pohle], [http://www.cp.jku.at Klaus Seyerlehner], [http://www.cp.jku.at/ Dominik Schnitzer]&lt;br /&gt;
	|-&lt;br /&gt;
	! RZ1&lt;br /&gt;
	| RND ||  style=&amp;quot;text-align: center;&amp;quot; | [https://www.music-ir.org/mirex/abstracts/2010/RZ1.pdf PDF] || [http://www.cp.jku.at Rainer Zufall]&lt;br /&gt;
	|-&lt;br /&gt;
	! SSPK2&lt;br /&gt;
	| cbmr_sim ||  style=&amp;quot;text-align: center;&amp;quot; | [https://www.music-ir.org/mirex/abstracts/2010/SSPK2.pdf PDF] || [http://www.cp.jku.at Klaus Seyerlehner], [http://www.cp.jku.at Markus Schedl], [http://www.cp.jku.at Tim Pohle], [http://www.cp.jku.at Peter Knees]&lt;br /&gt;
	|-&lt;br /&gt;
	! TLN1&lt;br /&gt;
	| Post-Processing 1 of Marsyas similarity results ||  style=&amp;quot;text-align: center;&amp;quot; | [https://www.music-ir.org/mirex/abstracts/2010/TLN1.pdf PDF] || [http://www.cs.uvic.ca/~gtzan George Tzanetakis], [http://recherche.ircam.fr/equipes/analyse-synthese/home.html Mathieu Lagrange], [http://sness.net Steven Ness]&lt;br /&gt;
	|-&lt;br /&gt;
	! TLN2&lt;br /&gt;
	| Post-Processing 2 of Marsyas similarity results ||  style=&amp;quot;text-align: center;&amp;quot; | [https://www.music-ir.org/mirex/abstracts/2010/TLN2.pdf PDF] || [http://www.cs.uvic.ca/~gtzan George Tzanetakis], [http://recherche.ircam.fr/equipes/analyse-synthese/home.html Mathieu Lagrange], [http://sness.net Steven Ness]&lt;br /&gt;
	|-&lt;br /&gt;
	! TNL1&lt;br /&gt;
	| MarsyasSimilarity ||  style=&amp;quot;text-align: center;&amp;quot; | [https://www.music-ir.org/mirex/abstracts/2010/TNL1.pdf PDF] || [http://www.cs.uvic.ca/~gtzan George Tzanetakis], [http://sness.net Steven Ness], [http://recherche.ircam.fr/equipes/analyse-synthese/home.html Mathieu Lagrange]&lt;br /&gt;
	|}&lt;br /&gt;
&lt;br /&gt;
====Broad Categories====&lt;br /&gt;
'''NS''' = Not Similar&amp;lt;br /&amp;gt;&lt;br /&gt;
'''SS''' = Somewhat Similar&amp;lt;br /&amp;gt;&lt;br /&gt;
'''VS''' = Very Similar&amp;lt;br /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=====Understanding Summary Measures=====&lt;br /&gt;
'''Fine''' = Has a range from 0 (failure) to 100 (perfection). &amp;lt;br /&amp;gt;&lt;br /&gt;
'''Broad''' = Has a range from 0 (failure) to 2 (perfection) as each query/candidate pair is scored with either NS=0, SS=1 or VS=2. &amp;lt;br /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Human Evaluation==&lt;br /&gt;
===Overall Summary Results===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;csv p=3&amp;gt;2010/ams/AMS2010summary_evalutron.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&amp;lt;br /&amp;gt;&lt;br /&gt;
'''Note:''' RZ1 is a random baseline included for comparison purposes. &lt;br /&gt;
&lt;br /&gt;
===Friedman's Tests===&lt;br /&gt;
====Friedman's Test (FINE Scores)====&lt;br /&gt;
The Friedman test was run in MATLAB against the '''Fine''' summary data over the 100 queries.&amp;lt;br /&amp;gt;&lt;br /&gt;
Command: [c,m,h,gnames] = multcompare(stats, 'ctype', 'tukey-kramer','estimate', 'friedman', 'alpha', 0.05);&lt;br /&gt;
&lt;br /&gt;
&amp;lt;csv p=3&amp;gt;2010/ams/evalutron.fine.friedman.tukeyKramerHSD.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:2010AMS.evalutron.fine.friedman.tukeyKramerHSD.png|500px]]&lt;br /&gt;
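The official results above were produced with MATLAB's multcompare command as shown. For readers without MATLAB, the underlying Friedman statistic can be sketched in a few lines of pure Python (illustrative only; `friedman_statistic` is a hypothetical helper, with ties handled by average ranks):

```python
def friedman_statistic(scores):
    """Friedman chi-square statistic for a score table:
    one row per query, one column per system; ties get average ranks."""
    n, k = len(scores), len(scores[0])
    rank_sums = [0.0] * k
    for row in scores:
        ordered = sorted(row)
        for j, v in enumerate(row):
            first = ordered.index(v) + 1          # lowest 1-based rank of v
            last = first + ordered.count(v) - 1   # highest 1-based rank of v
            rank_sums[j] += (first + last) / 2.0  # average rank on ties
    return (12.0 * sum(r * r for r in rank_sums) / (n * k * (k + 1))
            - 3.0 * n * (k + 1))
```

The statistic is then compared against a chi-square distribution with k-1 degrees of freedom, followed by a Tukey-Kramer HSD multiple comparison as in the MATLAB command.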
&lt;br /&gt;
====Friedman's Test (BROAD Scores)====&lt;br /&gt;
The Friedman test was run in MATLAB against the '''BROAD''' summary data over the 100 queries.&amp;lt;br /&amp;gt;&lt;br /&gt;
Command: [c,m,h,gnames] = multcompare(stats, 'ctype', 'tukey-kramer','estimate', 'friedman', 'alpha', 0.05);&lt;br /&gt;
&lt;br /&gt;
&amp;lt;csv p=3&amp;gt;2010/ams/evalutron.cat.friedman.tukeyKramerHSD.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:2010AMS.evalutron.cat.friedman.tukeyKramerHSD.png|500px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Summary Results by Query===&lt;br /&gt;
====FINE Scores====&lt;br /&gt;
These are the mean FINE scores per query assigned by Evalutron graders. The FINE scores for the 5 candidates returned per algorithm, per query, have been averaged. Values are bounded between 0 and 100. A perfect score would be 100. Genre labels have been included for reference. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;csv p=3&amp;gt;2010/ams/fine_scores.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
====BROAD Scores====&lt;br /&gt;
These are the mean BROAD scores per query assigned by Evalutron graders. The BROAD scores for the 5 candidates returned per algorithm, per query, have been averaged. Values are bounded between 0 (not similar) and 2 (very similar). A perfect score would be 2. Genre labels have been included for reference. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;csv p=3&amp;gt;2010/ams/cat_scores.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Raw Scores===&lt;br /&gt;
The raw data derived from the Evalutron 6000 human evaluations are located on the [[2010:Audio Music Similarity and Retrieval Raw Data]] page.&lt;br /&gt;
&lt;br /&gt;
==Metadata and Distance Space Evaluation==&lt;br /&gt;
The following reports provide evaluation statistics based on analysis of the distance space and metadata matches and  include:&lt;br /&gt;
* Neighbourhood clustering by artist, album and genre&lt;br /&gt;
* Artist-filtered genre clustering&lt;br /&gt;
* How often the triangle inequality holds&lt;br /&gt;
* Statistics on 'hubs' (tracks similar to many tracks) and 'orphans' (tracks not similar to any other track within N results).&lt;br /&gt;
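As an illustration of the kinds of statistics these reports contain, here is a minimal sketch (the function `distance_space_stats` is hypothetical, not the code used to generate the reports) that checks the triangle inequality over all ordered triples and counts each track's appearances in other tracks' top-N lists; high counts indicate hubs, and a count of zero indicates an orphan:

```python
def distance_space_stats(d, n_results=5):
    """d: square distance matrix (list of lists).
    Returns (fraction of ordered triples of distinct tracks for which
    d[i][k] is at most d[i][j] + d[j][k], i.e. the triangle inequality
    holds, and a per-track count of appearances in other tracks' top-N
    result lists)."""
    m = len(d)
    triples = held = 0
    for i in range(m):
        for j in range(m):
            for k in range(m):
                if len({i, j, k}) == 3:  # distinct tracks only
                    triples += 1
                    if d[i][j] + d[j][k] + 1e-9 >= d[i][k]:
                        held += 1
    appearances = [0] * m
    for i in range(m):
        ranked = sorted((j for j in range(m) if j != i),
                        key=lambda j: d[i][j])
        for j in ranked[:n_results]:
            appearances[j] += 1
    return held / triples, appearances
```

The brute-force triple loop is cubic in the number of tracks, so for a 7000x7000 matrix the real reports would sample triples rather than enumerate them all.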
&lt;br /&gt;
=== Reports ===&lt;br /&gt;
	&lt;br /&gt;
'''BWL1''' = [https://music-ir.org/mirex/results/2010/ams/statistics/BWL1/report.txt Dmitry Bogdanov, Nicolas Wack, Cyril Laurier]&amp;lt;br /&amp;gt;&lt;br /&gt;
'''PS1''' = [https://music-ir.org/mirex/results/2010/ams/statistics/PS1/report.txt Tim Pohle, Dominik Schnitzer]&amp;lt;br /&amp;gt;&lt;br /&gt;
'''PSS1''' = [https://music-ir.org/mirex/results/2010/ams/statistics/PSS1/report.txt Tim Pohle, Klaus Seyerlehner, Dominik Schnitzer]&amp;lt;br /&amp;gt;&lt;br /&gt;
'''RZ1''' = [https://music-ir.org/mirex/results/2010/ams/statistics/RZ1/report.txt Rainer Zufall]&amp;lt;br /&amp;gt;&lt;br /&gt;
'''SSPK2''' = [https://music-ir.org/mirex/results/2010/ams/statistics/SSPK2/report.txt Klaus Seyerlehner, Markus Schedl, Tim Pohle, Peter Knees]&amp;lt;br /&amp;gt;&lt;br /&gt;
'''TLN1''' = [https://music-ir.org/mirex/results/2010/ams/statistics/TLN1/report.txt George Tzanetakis, Mathieu Lagrange, Steven Ness]&amp;lt;br /&amp;gt;&lt;br /&gt;
'''TLN2''' = [https://music-ir.org/mirex/results/2010/ams/statistics/TLN2/report.txt George Tzanetakis, Mathieu Lagrange, Steven Ness]&amp;lt;br /&amp;gt;&lt;br /&gt;
'''TLN3''' = [https://music-ir.org/mirex/results/2010/ams/statistics/TLN3/report.txt George Tzanetakis, Mathieu Lagrange, Steven Ness]&amp;lt;br /&amp;gt;&lt;br /&gt;
== Run Times ==&lt;br /&gt;
&amp;lt;csv&amp;gt;2010/ams/audiosim.runtime.csv&amp;lt;/csv&amp;gt;&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=Sandbox&amp;diff=7399</id>
		<title>Sandbox</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=Sandbox&amp;diff=7399"/>
		<updated>2010-07-24T05:46:11Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;math&amp;gt;\alpha + 5 * \beta&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
é&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010:Evalutron6000_Walkthrough&amp;diff=7303</id>
		<title>2010:Evalutron6000 Walkthrough</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010:Evalutron6000_Walkthrough&amp;diff=7303"/>
		<updated>2010-07-15T19:31:15Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: /* Evaluate */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==UPDATE 2010==&lt;br /&gt;
The 2010 Evalutron has a new look and implementation. It is easier to use than the original implementation. However, it is necessary to read through this document before you start using the system. Enjoy!&lt;br /&gt;
&lt;br /&gt;
== Four Important Issues Before Proceeding ==&lt;br /&gt;
&lt;br /&gt;
# This system is NOT open to the general public. Due to the legalities imposed upon us by the University of Illinois and the US Federal Government, we are allowed to accept only those graders that have a stake in Music Information Retrieval and Music Digital Library research. Acceptable persons include MIREX participants, subscribers to the music-ir[AT]ircam.fr list, ISMIR attendees, computational musicologists, music technology researchers, etc. If you are in doubt about your acceptability, please contact Prof. Downie at [mailto:jdownie@illinois.edu].&lt;br /&gt;
# Do NOT click the &amp;quot;Get Assignments&amp;quot; link if you are merely curious to see what the Evalutron 6000 is set up to do. The assignment distribution process scientifically distributes Query and Candidate sets to each account asking for assignments.&lt;br /&gt;
# The Evalutron is set up for both the Audio Music Similarity (AMS) and Symbolic Melodic Similarity (SMS) tasks. Please do NOT click the &amp;quot;Get Assignments&amp;quot; link under a task you do not intend to participate in. Otherwise, you will be given assignments for that task, preventing other graders from receiving them.&lt;br /&gt;
# Please do NOT create multiple accounts for yourself on the Evalutron. The system is designed to track individuals based upon unique sign-in IDs. The creation of multiple accounts for one person causes the system to improperly distribute assignments to your fellow graders. If you have difficulties, such as forgetting your sign-in password, please contact us at [mailto:mirproject@lists.lis.illinois.edu] or try the &amp;quot;Forgot Password&amp;quot; link found at the left side of the homepage (Fig. 1).&lt;br /&gt;
&lt;br /&gt;
==Welcome to the Evalutron 6000==&lt;br /&gt;
&lt;br /&gt;
In order to use the Evalutron 6000 you will need a modern web browser (e.g., Firefox, Internet Explorer, Safari, Mozilla, etc.) that supports JavaScript, Flash, and Cookies. Evalutron has been tested on Windows XP, Windows 7, MacOS X, and Ubuntu Linux. If you are using a different platform and having trouble, please try accessing Evalutron 6000 from another machine. If you are still having difficulty, contact us at [mailto:mirex@imirsel.org mirex@imirsel.org].&lt;br /&gt;
&lt;br /&gt;
When you first visit the Evalutron 6000, you will see its home page (Fig. 1). &lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_home.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 1. Evalutron 6000 home page.'''&lt;br /&gt;
&lt;br /&gt;
==Register to the Evalutron 6000==&lt;br /&gt;
&lt;br /&gt;
If you have an account with the MIREX submission system, then you can use the same account for Evalutron 6000. Otherwise, you must register a new account. Click on the &amp;quot;Register&amp;quot; link on the left side of the page to create an account.&lt;br /&gt;
&lt;br /&gt;
The registration page is fairly straightforward (Fig. 2). All fields are required. You can create any username and password you wish. Usernames must be at least 5 characters long; passwords must be at least 8 characters long and are case-sensitive. After you register, you will receive an email with an activation link. Clicking that link will complete your registration. &lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_register.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 2. Evalutron 6000 registration page.'''&lt;br /&gt;
&lt;br /&gt;
==Agree to the Informed Consent ==&lt;br /&gt;
&lt;br /&gt;
Before starting the evaluation, every evaluator must read and agree to the terms of the Informed Consent document; otherwise, you will be redirected to the informed consent page when you try to get evaluation assignments. Clicking the &amp;quot;Informed Consent&amp;quot; link on the left-side menu will show the form (Fig. 3). The evaluation, because it is using human judgments of similarity, is considered a human-subjects research project and the Evalutron is a survey instrument. To indicate your consent to participate in the evaluation, scroll down the page and check the checkbox below the informed consent document.&lt;br /&gt;
&lt;br /&gt;
If you have questions about your rights as a subject in this research project, you should contact the UIUC IRB office (http://www.irb.uiuc.edu) for more information. The research protocol for this project is IRB# 07066.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_consent.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 3. Evalutron 6000 informed consent page.'''&lt;br /&gt;
&lt;br /&gt;
==Get Your Assignments==&lt;br /&gt;
&lt;br /&gt;
To start the evaluation process, click the &amp;quot;My Assignments&amp;quot; link on the left-side menu. The assignment page is similar to Fig. 4. This page shows all tasks available in the Evalutron 6000; initially, no assignments are given. Be careful to select the task you intend to participate in, and click the &amp;quot;Get Assignment&amp;quot; button under that task. The system will assign you a number of queries to evaluate.&lt;br /&gt;
&lt;br /&gt;
Please note that once assignments are made, they cannot be changed or removed. Therefore, please NEVER click &amp;quot;Get Assignment&amp;quot; for a task you do not intend to participate in.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_getAssignment.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 4. Evalutron 6000 get assignment page.'''&lt;br /&gt;
&lt;br /&gt;
==Evaluate==&lt;br /&gt;
Clicking the &amp;quot;Evaluate Query&amp;quot; button under a query will lead you to the evaluation page of that query (Fig. 5). This page consists of instructions at the top and a list of query-candidate pairs. Please read the instructions carefully. The query is the same for all candidates on a single page; it is included with each candidate so that you can easily compare the two. For the Audio Music Similarity (AMS) task, each query and candidate is 30 seconds long. For the Symbolic Melodic Similarity (SMS) task, the length varies. Clicking the player button beside each query or candidate will load the song into the player and begin playing. Clicking the player button again will pause it. We recommend you listen to the entire query at least once before evaluating any candidate files.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_eval_page.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 5. Sample evaluation page.'''&lt;br /&gt;
&lt;br /&gt;
Please note there are more candidates than may be immediately visible on the page. Please scroll to the bottom of the candidate list to make sure you've evaluated each song.&lt;br /&gt;
&lt;br /&gt;
Once you have a feeling for whether or not the candidate is similar to the query, click the &amp;quot;Not Similar&amp;quot;, &amp;quot;Somewhat Similar&amp;quot; or &amp;quot;Very Similar&amp;quot; radio button to the right of the query-candidate pair. Each grader will also need to assign a fine-grained score for the similarity of the candidate to the query on a scale of 0-100 (0 indicating completely different and 100 perfectly similar or identical). To input the fine score, move the slider; once you let go of the slider, the system will automatically record the score.&lt;br /&gt;
 &lt;br /&gt;
[[Image:2010_e6k_qcq_detail.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 6. Close up image of a query-candidate pair and evaluation buttons.'''&lt;br /&gt;
&lt;br /&gt;
Only after you input BOTH the broad category and the fine score can a query-candidate pair be marked as green, indicating you have completed evaluating this query-candidate pair.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_qcq_done_ detail.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 7. Close up image of a completed query-candidate pair.'''&lt;br /&gt;
&lt;br /&gt;
You can always change your evaluation for any candidate by toggling the radio buttons and adjusting the Fine Score slider. Once an evaluation has been made, however, it cannot be retracted, only changed (i.e., you cannot &amp;quot;unvote&amp;quot;).&lt;br /&gt;
&lt;br /&gt;
==Work on Another Query==&lt;br /&gt;
&lt;br /&gt;
At the bottom of the evaluation page, there is a &amp;quot;View All Assignments&amp;quot; button (Fig. 8). At any time, clicking this button will direct you to your assignment page similar to the one shown in Fig. 4. When you have completed evaluating all of the candidates for this query, you may click this button to continue on another query.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_eval_page_bottom.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 8. Close up image of &amp;quot;View All Assignments&amp;quot; button which loads the assignment page. '''&lt;br /&gt;
&lt;br /&gt;
You may also click the &amp;quot;My Assignments&amp;quot; link on the left-side menu to go to the My Assignments page.&lt;br /&gt;
&lt;br /&gt;
On the My Assignments page, you may click another &amp;quot;Evaluate Query&amp;quot; button to load evaluation page (like Fig. 5) of that query. &lt;br /&gt;
&lt;br /&gt;
You will see a list of all of the queries you have evaluated (or are evaluating) on the My Assignments page. You can return to any query by clicking the &amp;quot;Evaluate Query&amp;quot; button underneath it (Fig. 4). You can re-evaluate any candidate for any query at any time, up to the closing of the evaluation system.&lt;br /&gt;
&lt;br /&gt;
==Monitor Progress==&lt;br /&gt;
&lt;br /&gt;
At any time a grader may monitor his/her progress on the My Assignment page (Fig. 9). Each query has a status bar where the completed portion will be marked as green and the unfinished part red. When all the bars become green, the assignments are all completed. &lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_progress.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 9. Progress indicated on the My Assignment Page '''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Grading Expectations and &amp;quot;Reasonableness&amp;quot;==&lt;br /&gt;
&lt;br /&gt;
For each query-candidate pair, we need you to assign BOTH a Broad Category score AND a Fine Score (i.e., a numeric grade between 0 and 100, where 0 represents completely different and 100 perfectly similar or identical). You have the freedom to make whatever associations you desire between a particular Broad Category score and its related Fine Score. In fact, we expect to see variations across evaluators with regard to the relationships between Broad Categories and Fine Scores as this is a normal part of human subjectivity. However, we will be using the two different types of scores to do important inter-related post-Evalutron calculations so, please, do be thoughtful in selecting your Broad Categories and related Fine Scores. What we are really asking here is that you apply a level of &amp;quot;reasonableness&amp;quot; to both your scores and your associations. For example, if you score a candidate in the VERY SIMILAR category, a Fine Score of 21 would not be, by most standards, &amp;quot;reasonable&amp;quot;. The same applies at the other extreme. For example, a Broad Category score of NOT SIMILAR should not be associated with a Fine Score of, say, 72 or 84, etc.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Coming and Going: Avoiding Grader Fatigue==&lt;br /&gt;
&lt;br /&gt;
Listening to, and then comparing, hundreds of audio files is very tiring. We have built into the back end of the Evalutron 6000 a rather robust database system that records your grading scores in near real time. These scores are saved along with information about which queries and candidates you have yet to review. All this information is stored in association with your personal sign-in ID. This means you can break up your grading over several days, at times convenient and productive for you. In fact, we recommend that you not try to tackle your &amp;quot;assignment&amp;quot; in one big chunk, as fresh ears are happy ears and happy ears make for better evaluations.&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010:Evalutron6000_Walkthrough&amp;diff=7302</id>
		<title>2010:Evalutron6000 Walkthrough</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010:Evalutron6000_Walkthrough&amp;diff=7302"/>
		<updated>2010-07-15T19:26:56Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: /* Agree to the Informed Consent */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==UPDATE 2010==&lt;br /&gt;
The 2010 Evalutron has a new look and implementation and is easier to use than the original. However, please read through this document before you start using the system. Enjoy!&lt;br /&gt;
&lt;br /&gt;
== Four Important Issues Before Proceeding ==&lt;br /&gt;
&lt;br /&gt;
# This system is NOT open to the general public. Due to the legalities imposed upon us by the University of Illinois and the US Federal Government, we are allowed to accept only those graders who have a stake in Music Information Retrieval and Music Digital Library research. Acceptable persons include MIREX participants, subscribers to the music-ir[AT]ircam.fr list, ISMIR attendees, computational musicologists, music technology researchers, etc. If you are in doubt about your eligibility, please contact Prof. Downie at [mailto:jdownie@illinois.edu].&lt;br /&gt;
# Do NOT click the &amp;quot;Get Assignments&amp;quot; link if you are merely curious to see what the Evalutron 6000 is set up to do. The assignment distribution process scientifically distributes Query and Candidate sets to each account asking for assignments.&lt;br /&gt;
# The Evalutron is set up for both the Audio Music Similarity (AMS) task and the Symbolic Melodic Similarity (SMS) task. Please do NOT click the &amp;quot;Get Assignments&amp;quot; link under a task you do not intend to participate in. Otherwise, you will receive assignments for that task, preventing other graders from receiving them.&lt;br /&gt;
# Please do NOT create multiple Evalutron accounts for yourself. The system is designed to track individuals based upon unique sign-in IDs. Creating multiple accounts for one person causes the system to improperly distribute assignments to your fellow graders. If you have difficulties, such as forgetting your sign-in password, please contact us at [mailto:mirproject@lists.lis.illinois.edu] or try the &amp;quot;Forgot Password&amp;quot; link found at the left side of the homepage (Fig. 1).&lt;br /&gt;
&lt;br /&gt;
==Welcome to the Evalutron 6000==&lt;br /&gt;
&lt;br /&gt;
In order to use the Evalutron 6000 you will need a modern web browser (e.g., Firefox, Internet Explorer, Safari, or Mozilla) that supports JavaScript, Flash, and cookies. The Evalutron has been tested on Windows XP, Windows 7, Mac OS X, and Ubuntu Linux. If you are using a different platform and having trouble, please try accessing the Evalutron 6000 from another machine. If you are still having difficulty, contact us at [mailto:mirex@imirsel.org mirex@imirsel.org].&lt;br /&gt;
&lt;br /&gt;
When you first visit the Evalutron 6000, you will see its home page (Fig. 1).&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_home.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 1. Evalutron 6000 home page.'''&lt;br /&gt;
&lt;br /&gt;
==Register to the Evalutron 6000==&lt;br /&gt;
&lt;br /&gt;
If you have an account with the MIREX submission system, then you can use the same account for Evalutron 6000. Otherwise, you must register a new account. Click on the &amp;quot;Register&amp;quot; link on the left side of the page to create an account.&lt;br /&gt;
&lt;br /&gt;
The registration page is fairly straightforward (Fig. 2). All fields are required. You can create any username and password you wish. Usernames must be at least 5 characters long; passwords must be at least 8 characters long and are case-sensitive. After you submit the form, you will receive an email with an activation link. Clicking that link will complete your registration.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_register.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 2. Evalutron 6000 registration page.'''&lt;br /&gt;
&lt;br /&gt;
==Agree to the Informed Consent ==&lt;br /&gt;
&lt;br /&gt;
Before starting evaluation, every evaluator must read and agree to the terms of the Informed Consent document; otherwise, you will be redirected to the informed consent page when you try to get evaluation assignments. Clicking the &amp;quot;Informed Consent&amp;quot; link on the left-side menu will show the form (Fig. 3). Because the evaluation relies on human judgments of similarity, it is considered a human-subjects research project, and the Evalutron is a survey instrument. To indicate your consent to participate in the evaluation, scroll down the page and check the checkbox below the informed consent document.&lt;br /&gt;
&lt;br /&gt;
If you have questions about your rights as a subject in this research project, you should contact the UIUC IRB office (http://www.irb.uiuc.edu) for more information. The research protocol for this project is IRB# 07066.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_consent.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 3. Evalutron 6000 informed consent page.'''&lt;br /&gt;
&lt;br /&gt;
==Get Your Assignments==&lt;br /&gt;
&lt;br /&gt;
To start the evaluation process, click the &amp;quot;My Assignments&amp;quot; link on the left-side menu. The assignment page is similar to Fig. 4. This page shows all tasks available in the Evalutron 6000; initially, no assignments have been given. Be careful to select the task you intend to participate in, and click the &amp;quot;Get Assignment&amp;quot; button under that task. The system will assign you a number of queries to evaluate.&lt;br /&gt;
&lt;br /&gt;
Please note that once assignments are made, they cannot be changed or removed. Therefore, please NEVER click &amp;quot;Get Assignment&amp;quot; for a task you do not intend to participate in.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_getAssignment.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 4. Evalutron 6000 get assignment page.'''&lt;br /&gt;
&lt;br /&gt;
==Evaluate==&lt;br /&gt;
Clicking the &amp;quot;Evaluate Query&amp;quot; button under a query will lead you to the evaluation page for that query (Fig. 5). This page consists of instructions at the top and a list of query-candidate pairs. Please read the instructions carefully. The query in each of the query-candidate pairs is the same, and it is aligned with each candidate so that you can replay it at any time while you evaluate the candidate. For the Audio Music Similarity (AMS) task, each query and candidate is 30 seconds long. For the Symbolic Melodic Similarity (SMS) task, the length varies. Clicking the player button beside each query or candidate will load the song into the player and begin playing; clicking the player button again will pause it. We recommend you listen to the entire query at least once before evaluating any candidate files.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_eval_page.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 5. Sample evaluation page.'''&lt;br /&gt;
&lt;br /&gt;
Please note that there may be more candidates than are immediately visible on the page. Scroll to the bottom of the candidate list to make sure you've evaluated each song.&lt;br /&gt;
&lt;br /&gt;
Once you have a feeling for whether or not the candidate is similar to the query, click the &amp;quot;Not Similar&amp;quot;, &amp;quot;Somewhat Similar&amp;quot; or &amp;quot;Very Similar&amp;quot; radio button to the right of the query-candidate pair. Each grader will also need to assign a fine-grained score for the similarity of the candidate to the query on a scale of 0-100, with 0 indicating completely different and 100 perfectly similar or identical. To input the fine score, move the slider; once you release the slider, the system will automatically record the score.&lt;br /&gt;
 &lt;br /&gt;
[[Image:2010_e6k_qcq_detail.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 6. Close up image of a query-candidate pair and evaluation buttons.'''&lt;br /&gt;
&lt;br /&gt;
Only after you input BOTH the broad category and the fine score will a query-candidate pair be marked as green, indicating you have completed evaluating it.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_qcq_done_ detail.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 7. Close up image of a completed query-candidate pair.'''&lt;br /&gt;
&lt;br /&gt;
You can always change your evaluation for any candidate by toggling the radio buttons and adjusting the Fine Score selection scale. Once an evaluation has been made, however, it cannot be retracted, only changed (i.e., you cannot &amp;quot;unvote&amp;quot;).&lt;br /&gt;
&lt;br /&gt;
==Work on Another Query==&lt;br /&gt;
&lt;br /&gt;
At the bottom of the evaluation page, there is a &amp;quot;View All Assignments&amp;quot; button (Fig. 8). At any time, clicking this button will direct you to your assignment page, similar to the one shown in Fig. 4. When you have completed evaluating all of the candidates for a query, you may click this button to continue on to another query.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_eval_page_bottom.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 8. Close up image of &amp;quot;View All Assignments&amp;quot; button which loads the assignment page. '''&lt;br /&gt;
&lt;br /&gt;
You may also click the &amp;quot;My Assignments&amp;quot; link on the left-side menu to go to the My Assignments page.&lt;br /&gt;
&lt;br /&gt;
On the My Assignments page, you may click another &amp;quot;Evaluate Query&amp;quot; button to load the evaluation page for that query (like Fig. 5).&lt;br /&gt;
&lt;br /&gt;
You will see a list of all of the queries you have evaluated (or are evaluating) on the My Assignments page. You can return to any query by clicking the &amp;quot;Evaluate Query&amp;quot; button underneath it (Fig. 4). You can re-evaluate any candidate for any query at any time, up to the closing of the evaluation system.&lt;br /&gt;
&lt;br /&gt;
==Monitor Progress==&lt;br /&gt;
&lt;br /&gt;
At any time, a grader may monitor his/her progress on the My Assignments page (Fig. 9). Each query has a status bar where the completed portion is marked in green and the unfinished portion in red. When all the bars are fully green, your assignments are complete.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_progress.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 9. Progress indicated on the My Assignments page.'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Grading Expectations and &amp;quot;Reasonableness&amp;quot;==&lt;br /&gt;
&lt;br /&gt;
For each query-candidate pair, we need you to assign BOTH a Broad Category score AND a Fine Score (i.e., a numeric grade between 0 and 100, where 0 represents completely different and 100 perfectly similar or identical). You have the freedom to make whatever associations you desire between a particular Broad Category score and its related Fine Score. In fact, we expect to see variations across evaluators with regard to the relationships between Broad Categories and Fine Scores, as this is a normal part of human subjectivity. However, we will be using the two different types of scores to do important, inter-related post-Evalutron calculations, so please be thoughtful in selecting your Broad Categories and related Fine Scores. What we are really asking is that you apply a level of &amp;quot;reasonableness&amp;quot; to both your scores and your associations. For example, if you score a candidate in the VERY SIMILAR category, a Fine Score of 21 would not be, by most standards, &amp;quot;reasonable&amp;quot;. The same applies at the other extreme: a Broad Category score of NOT SIMILAR should not be associated with a Fine Score of, say, 72 or 84.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Coming and Going: Avoiding Grader Fatigue==&lt;br /&gt;
&lt;br /&gt;
Listening to, and then comparing, hundreds of audio files is very tiring. We have built into the back end of the Evalutron 6000 a rather robust database system that records your grading scores in near real time. These scores are saved along with information about which queries and candidates you have yet to review. All this information is stored in association with your personal sign-in ID. This means you can break up your grading over several days, at times convenient and productive for you. In fact, we recommend that you not try to tackle your &amp;quot;assignment&amp;quot; in one big chunk, as fresh ears are happy ears and happy ears make for better evaluations.&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010:Evalutron6000_Walkthrough&amp;diff=7301</id>
		<title>2010:Evalutron6000 Walkthrough</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010:Evalutron6000_Walkthrough&amp;diff=7301"/>
		<updated>2010-07-15T19:25:13Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: /* Register to the Evalutron 6000 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==UPDATE 2010==&lt;br /&gt;
The 2010 Evalutron has a new look and implementation and is easier to use than the original. However, please read through this document before you start using the system. Enjoy!&lt;br /&gt;
&lt;br /&gt;
== Four Important Issues Before Proceeding ==&lt;br /&gt;
&lt;br /&gt;
# This system is NOT open to the general public. Due to the legalities imposed upon us by the University of Illinois and the US Federal Government, we are allowed to accept only those graders who have a stake in Music Information Retrieval and Music Digital Library research. Acceptable persons include MIREX participants, subscribers to the music-ir[AT]ircam.fr list, ISMIR attendees, computational musicologists, music technology researchers, etc. If you are in doubt about your eligibility, please contact Prof. Downie at [mailto:jdownie@illinois.edu].&lt;br /&gt;
# Do NOT click the &amp;quot;Get Assignments&amp;quot; link if you are merely curious to see what the Evalutron 6000 is set up to do. The assignment distribution process scientifically distributes Query and Candidate sets to each account asking for assignments.&lt;br /&gt;
# The Evalutron is set up for both the Audio Music Similarity (AMS) task and the Symbolic Melodic Similarity (SMS) task. Please do NOT click the &amp;quot;Get Assignments&amp;quot; link under a task you do not intend to participate in. Otherwise, you will receive assignments for that task, preventing other graders from receiving them.&lt;br /&gt;
# Please do NOT create multiple Evalutron accounts for yourself. The system is designed to track individuals based upon unique sign-in IDs. Creating multiple accounts for one person causes the system to improperly distribute assignments to your fellow graders. If you have difficulties, such as forgetting your sign-in password, please contact us at [mailto:mirproject@lists.lis.illinois.edu] or try the &amp;quot;Forgot Password&amp;quot; link found at the left side of the homepage (Fig. 1).&lt;br /&gt;
&lt;br /&gt;
==Welcome to the Evalutron 6000==&lt;br /&gt;
&lt;br /&gt;
In order to use the Evalutron 6000 you will need a modern web browser (e.g., Firefox, Internet Explorer, Safari, or Mozilla) that supports JavaScript, Flash, and cookies. The Evalutron has been tested on Windows XP, Windows 7, Mac OS X, and Ubuntu Linux. If you are using a different platform and having trouble, please try accessing the Evalutron 6000 from another machine. If you are still having difficulty, contact us at [mailto:mirex@imirsel.org mirex@imirsel.org].&lt;br /&gt;
&lt;br /&gt;
When you first visit the Evalutron 6000, you will see its home page (Fig. 1).&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_home.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 1. Evalutron 6000 home page.'''&lt;br /&gt;
&lt;br /&gt;
==Register to the Evalutron 6000==&lt;br /&gt;
&lt;br /&gt;
If you have an account with the MIREX submission system, then you can use the same account for Evalutron 6000. Otherwise, you must register a new account. Click on the &amp;quot;Register&amp;quot; link on the left side of the page to create an account.&lt;br /&gt;
&lt;br /&gt;
The registration page is fairly straightforward (Fig. 2). All fields are required. You can create any username and password you wish. Usernames must be at least 5 characters long; passwords must be at least 8 characters long and are case-sensitive. After you submit the form, you will receive an email with an activation link. Clicking that link will complete your registration.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_register.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 2. Evalutron 6000 registration page.'''&lt;br /&gt;
&lt;br /&gt;
==Agree to the Informed Consent ==&lt;br /&gt;
&lt;br /&gt;
Before starting evaluation, every evaluator must read and agree to the terms of the Informed Consent document; otherwise, you will be redirected to the informed consent page when you try to get evaluation assignments. Clicking the &amp;quot;Informed Consent&amp;quot; link on the left-side menu will show the form (Fig. 3). Because the evaluation relies on human judgments of similarity, it is considered a human-subjects research project, and the Evalutron is essentially a survey instrument. To indicate your consent to participate in the evaluation, scroll down the page and check the &amp;quot;I Agree&amp;quot; checkbox below the informed consent document.&lt;br /&gt;
&lt;br /&gt;
If you have questions about your rights as a subject in this research project, you should contact the UIUC IRB office (http://www.irb.uiuc.edu) for more information. The research protocol for this project is IRB# 07066.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_consent.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 3. Evalutron 6000 informed consent page.'''&lt;br /&gt;
&lt;br /&gt;
==Get Your Assignments==&lt;br /&gt;
&lt;br /&gt;
To start the evaluation process, click the &amp;quot;My Assignments&amp;quot; link on the left-side menu. The assignment page is similar to Fig. 4. This page shows all tasks available in the Evalutron 6000; initially, no assignments have been given. Be careful to select the task you intend to participate in, and click the &amp;quot;Get Assignment&amp;quot; button under that task. The system will assign you a number of queries to evaluate.&lt;br /&gt;
&lt;br /&gt;
Please note that once assignments are made, they cannot be changed or removed. Therefore, please NEVER click &amp;quot;Get Assignment&amp;quot; for a task you do not intend to participate in.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_getAssignment.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 4. Evalutron 6000 get assignment page.'''&lt;br /&gt;
&lt;br /&gt;
==Evaluate==&lt;br /&gt;
Clicking the &amp;quot;Evaluate Query&amp;quot; button under a query will lead you to the evaluation page for that query (Fig. 5). This page consists of instructions at the top and a list of query-candidate pairs. Please read the instructions carefully. The query in each of the query-candidate pairs is the same, and it is aligned with each candidate so that you can replay it at any time while you evaluate the candidate. For the Audio Music Similarity (AMS) task, each query and candidate is 30 seconds long. For the Symbolic Melodic Similarity (SMS) task, the length varies. Clicking the player button beside each query or candidate will load the song into the player and begin playing; clicking the player button again will pause it. We recommend you listen to the entire query at least once before evaluating any candidate files.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_eval_page.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 5. Sample evaluation page.'''&lt;br /&gt;
&lt;br /&gt;
Please note that there may be more candidates than are immediately visible on the page. Scroll to the bottom of the candidate list to make sure you've evaluated each song.&lt;br /&gt;
&lt;br /&gt;
Once you have a feeling for whether or not the candidate is similar to the query, click the &amp;quot;Not Similar&amp;quot;, &amp;quot;Somewhat Similar&amp;quot; or &amp;quot;Very Similar&amp;quot; radio button to the right of the query-candidate pair. Each grader will also need to assign a fine-grained score for the similarity of the candidate to the query on a scale of 0-100, with 0 indicating completely different and 100 perfectly similar or identical. To input the fine score, move the slider; once you release the slider, the system will automatically record the score.&lt;br /&gt;
 &lt;br /&gt;
[[Image:2010_e6k_qcq_detail.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 6. Close up image of a query-candidate pair and evaluation buttons.'''&lt;br /&gt;
&lt;br /&gt;
Only after you input BOTH the broad category and the fine score will a query-candidate pair be marked as green, indicating you have completed evaluating it.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_qcq_done_ detail.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 7. Close up image of a completed query-candidate pair.'''&lt;br /&gt;
&lt;br /&gt;
You can always change your evaluation for any candidate by toggling the radio buttons and adjusting the Fine Score selection scale. Once an evaluation has been made, however, it cannot be retracted, only changed (i.e., you cannot &amp;quot;unvote&amp;quot;).&lt;br /&gt;
&lt;br /&gt;
==Work on Another Query==&lt;br /&gt;
&lt;br /&gt;
At the bottom of the evaluation page, there is a &amp;quot;View All Assignments&amp;quot; button (Fig. 8). At any time, clicking this button will direct you to your assignment page, similar to the one shown in Fig. 4. When you have completed evaluating all of the candidates for a query, you may click this button to continue on to another query.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_eval_page_bottom.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 8. Close up image of &amp;quot;View All Assignments&amp;quot; button which loads the assignment page. '''&lt;br /&gt;
&lt;br /&gt;
You may also click the &amp;quot;My Assignments&amp;quot; link on the left-side menu to go to the My Assignments page.&lt;br /&gt;
&lt;br /&gt;
On the My Assignments page, you may click another &amp;quot;Evaluate Query&amp;quot; button to load the evaluation page for that query (like Fig. 5).&lt;br /&gt;
&lt;br /&gt;
You will see a list of all of the queries you have evaluated (or are evaluating) on the My Assignments page. You can return to any query by clicking the &amp;quot;Evaluate Query&amp;quot; button underneath it (Fig. 4). You can re-evaluate any candidate for any query at any time, up to the closing of the evaluation system.&lt;br /&gt;
&lt;br /&gt;
==Monitor Progress==&lt;br /&gt;
&lt;br /&gt;
At any time, a grader may monitor his/her progress on the My Assignments page (Fig. 9). Each query has a status bar where the completed portion is marked in green and the unfinished portion in red. When all the bars are fully green, your assignments are complete.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_progress.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 9. Progress indicated on the My Assignments page.'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Grading Expectations and &amp;quot;Reasonableness&amp;quot;==&lt;br /&gt;
&lt;br /&gt;
For each query-candidate pair, we need you to assign BOTH a Broad Category score AND a Fine Score (i.e., a numeric grade between 0 and 100, where 0 represents completely different and 100 perfectly similar or identical). You have the freedom to make whatever associations you desire between a particular Broad Category score and its related Fine Score. In fact, we expect to see variations across evaluators with regard to the relationships between Broad Categories and Fine Scores, as this is a normal part of human subjectivity. However, we will be using the two different types of scores to do important, inter-related post-Evalutron calculations, so please be thoughtful in selecting your Broad Categories and related Fine Scores. What we are really asking is that you apply a level of &amp;quot;reasonableness&amp;quot; to both your scores and your associations. For example, if you score a candidate in the VERY SIMILAR category, a Fine Score of 21 would not be, by most standards, &amp;quot;reasonable&amp;quot;. The same applies at the other extreme: a Broad Category score of NOT SIMILAR should not be associated with a Fine Score of, say, 72 or 84.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Coming and Going: Avoiding Grader Fatigue==&lt;br /&gt;
&lt;br /&gt;
Listening to, and then comparing, hundreds of audio files is very tiring. We have built into the back end of the Evalutron 6000 a rather robust database system that records your grading scores in near real time. These scores are saved along with information about which queries and candidates you have yet to review. All this information is stored in association with your personal sign-in ID. This means you can break up your grading over several days, at times convenient and productive for you. In fact, we recommend that you not try to tackle your &amp;quot;assignment&amp;quot; in one big chunk, as fresh ears are happy ears and happy ears make for better evaluations.&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010:Evalutron6000_Walkthrough&amp;diff=7300</id>
		<title>2010:Evalutron6000 Walkthrough</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010:Evalutron6000_Walkthrough&amp;diff=7300"/>
		<updated>2010-07-15T19:24:48Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: /* Welcome to the Evalutron 6000 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==UPDATE 2010==&lt;br /&gt;
The 2010 Evalutron has a new look and implementation and is easier to use than the original. However, please read through this document before you start using the system. Enjoy!&lt;br /&gt;
&lt;br /&gt;
== Four Important Issues Before Proceeding ==&lt;br /&gt;
&lt;br /&gt;
# This system is NOT open to the general public. Due to the legalities imposed upon us by the University of Illinois and the US Federal Government, we are allowed to accept only those graders who have a stake in Music Information Retrieval and Music Digital Library research. Acceptable persons include MIREX participants, subscribers to the music-ir[AT]ircam.fr list, ISMIR attendees, computational musicologists, music technology researchers, etc. If you are in doubt about your eligibility, please contact Prof. Downie at [mailto:jdownie@illinois.edu].&lt;br /&gt;
# Do NOT click the &amp;quot;Get Assignments&amp;quot; link if you are merely curious to see what the Evalutron 6000 is set up to do. The assignment distribution process scientifically distributes Query and Candidate sets to each account asking for assignments.&lt;br /&gt;
# The Evalutron is set up for both the Audio Music Similarity (AMS) task and the Symbolic Melodic Similarity (SMS) task. Please do NOT click the &amp;quot;Get Assignments&amp;quot; link under a task you do not intend to participate in. Otherwise, you will receive assignments for that task, preventing other graders from receiving them.&lt;br /&gt;
# Please do NOT create multiple Evalutron accounts for yourself. The system is designed to track individuals based upon unique sign-in IDs. Creating multiple accounts for one person causes the system to improperly distribute assignments to your fellow graders. If you have difficulties, such as forgetting your sign-in password, please contact us at [mailto:mirproject@lists.lis.illinois.edu] or try the &amp;quot;Forgot Password&amp;quot; link found at the left side of the homepage (Fig. 1).&lt;br /&gt;
&lt;br /&gt;
==Welcome to the Evalutron 6000==&lt;br /&gt;
&lt;br /&gt;
In order to use the Evalutron 6000 you will need a modern web browser (e.g., Firefox, Internet Explorer, Safari, or Mozilla) that supports JavaScript, Flash, and cookies. The Evalutron has been tested on Windows XP, Windows 7, Mac OS X, and Ubuntu Linux. If you are using a different platform and having trouble, please try accessing the Evalutron 6000 from another machine. If you are still having difficulty, contact us at [mailto:mirex@imirsel.org mirex@imirsel.org].&lt;br /&gt;
&lt;br /&gt;
When you first visit the Evalutron 6000, you will see its home page (Fig. 1).&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_home.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 1. Evalutron 6000 home page.'''&lt;br /&gt;
&lt;br /&gt;
==Register to the Evalutron 6000==&lt;br /&gt;
&lt;br /&gt;
If you have an account with the submission system, then you can use the same account for Evalutron 6000. Otherwise, you must register a new account. Click on the &amp;quot;Register&amp;quot; link on the left side of the page to create an account.&lt;br /&gt;
&lt;br /&gt;
The registration page is fairly straightforward (Fig. 2). All fields are required. You can create any username and password you wish. Usernames must be at least 5 characters long; passwords must be at least 8 characters long and are case-sensitive. After you submit the form, you will receive an email with an activation link. Clicking that link will complete your registration.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_register.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 2. Evalutron 6000 registration page.'''&lt;br /&gt;
&lt;br /&gt;
==Agree to the Informed Consent ==&lt;br /&gt;
&lt;br /&gt;
Before starting evaluation, every evaluator must read and agree to the terms of the Informed Consent document; otherwise, you will be redirected to the informed consent page when you try to get evaluation assignments. Clicking the &amp;quot;Informed Consent&amp;quot; link on the left-side menu will show the form (Fig. 3). Because the evaluation relies on human judgments of similarity, it is considered a human-subjects research project, and the Evalutron is essentially a survey instrument. To indicate your consent to participate in the evaluation, scroll down the page and check the &amp;quot;I Agree&amp;quot; checkbox below the informed consent document.&lt;br /&gt;
&lt;br /&gt;
If you have questions about your rights as a subject in this research project, you should contact the UIUC IRB office (http://www.irb.uiuc.edu) for more information. The research protocol for this project is IRB# 07066.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_consent.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 3. Evalutron 6000 informed consent page.'''&lt;br /&gt;
&lt;br /&gt;
==Get Your Assignments==&lt;br /&gt;
&lt;br /&gt;
To start the evaluation process, click the &amp;quot;My Assignments&amp;quot; link on the left-side menu. The assignment page is similar to Fig. 4. This page shows all tasks available in the Evalutron 6000; initially, no assignments have been given. Be careful to select the task you intend to participate in, and click the &amp;quot;Get Assignment&amp;quot; button under that task. The system will assign you a number of queries to evaluate.&lt;br /&gt;
&lt;br /&gt;
Please note that once assignments are made, they cannot be changed or removed. Therefore, NEVER click &amp;quot;Get Assignment&amp;quot; for a task you do not intend to participate in.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_getAssignment.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 4. Evalutron 6000 get assignment page.'''&lt;br /&gt;
&lt;br /&gt;
==Evaluate==&lt;br /&gt;
Clicking the &amp;quot;Evaluate Query&amp;quot; button under a query will lead you to the evaluation page for that query (Fig. 5). This page consists of instructions at the top and a list of query-candidate pairs. Please read the instructions carefully. The query in each of the query-candidate pairs is the same, and it is aligned with each candidate so that you can replay it at any time while you evaluate the candidate. For the Audio Music Similarity (AMS) task, each query and candidate is 30 seconds long. For the Symbolic Melodic Similarity (SMS) task, the length varies. Clicking the player button beside each query or candidate will load the song into the player and begin playing; clicking the player button again will pause it. We recommend you listen to the entire query at least once before evaluating any candidate files.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_eval_page.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 5. Sample evaluation page.'''&lt;br /&gt;
&lt;br /&gt;
Please note there are more candidates than may be immediately visible on the page. Please scroll to the bottom of the candidate list to make sure you've evaluated each song.&lt;br /&gt;
&lt;br /&gt;
Once you have a feeling for whether or not the candidate is similar to the query, click the &amp;quot;Not Similar&amp;quot;, &amp;quot;Somewhat Similar&amp;quot; or &amp;quot;Very Similar&amp;quot; radio button to the right of the query-candidate pair. Each grader must also assign a fine-grained score for the similarity of the candidate to the query on a scale of 0-100 (0 indicating completely different and 100 perfectly similar or identical). To input the fine score, drag the slider; once you release it, the system will automatically record the score.&lt;br /&gt;
 &lt;br /&gt;
[[Image:2010_e6k_qcq_detail.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 6. Close up image of a query-candidate pair and evaluation buttons.'''&lt;br /&gt;
&lt;br /&gt;
A query-candidate pair is marked green, indicating you have completed evaluating it, only after you have entered BOTH the broad category and the fine score.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_qcq_done_detail.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 7. Close up image of a completed query-candidate pair.'''&lt;br /&gt;
&lt;br /&gt;
You can change your evaluation for any candidate at any time by toggling the radio buttons and adjusting the Fine Score slider. Once an evaluation has been made, however, it cannot be retracted, only changed (i.e., you cannot &amp;quot;unvote&amp;quot;).&lt;br /&gt;
&lt;br /&gt;
==Work on Another Query==&lt;br /&gt;
&lt;br /&gt;
At the bottom of the evaluation page there is a &amp;quot;View All Assignments&amp;quot; button (Fig. 8). Clicking this button at any time will return you to your assignment page (similar to Fig. 4). When you have finished evaluating all of the candidates for a query, you may click this button to continue with another query.&lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_eval_page_bottom.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 8. Close up image of &amp;quot;View All Assignments&amp;quot; button which loads the assignment page. '''&lt;br /&gt;
&lt;br /&gt;
You may also click the &amp;quot;My Assignments&amp;quot; link on the left-side menu to go to the My Assignments page.&lt;br /&gt;
&lt;br /&gt;
On the My Assignments page, you may click another &amp;quot;Evaluate Query&amp;quot; button to load the evaluation page (like Fig. 5) for that query. &lt;br /&gt;
&lt;br /&gt;
The My Assignments page lists all of the queries you have evaluated (or are evaluating). You can return to any query by clicking the &amp;quot;Evaluate Query&amp;quot; button underneath it (Fig. 4), and you can re-evaluate any candidate for any query at any time, up to the closing of the evaluation system.&lt;br /&gt;
&lt;br /&gt;
==Monitor Progress==&lt;br /&gt;
&lt;br /&gt;
At any time a grader may monitor his/her progress on the My Assignments page (Fig. 9). Each query has a status bar in which the completed portion is marked green and the unfinished portion red. When all the bars are green, your assignments are complete. &lt;br /&gt;
&lt;br /&gt;
[[Image:2010_e6k_progress.png|border]]&lt;br /&gt;
&lt;br /&gt;
'''Figure 9. Progress indicated on the My Assignment Page '''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Grading Expectations and &amp;quot;Reasonableness&amp;quot;==&lt;br /&gt;
&lt;br /&gt;
For each query-candidate pair, we need you to assign BOTH a Broad Category score AND a Fine Score (i.e., a numeric grade between 0 and 100, where 0 represents completely different and 100 perfectly similar or identical). You are free to make whatever associations you wish between a particular Broad Category score and its related Fine Score. In fact, we expect to see variation across evaluators in the relationships between Broad Categories and Fine Scores, as this is a normal part of human subjectivity. However, we will be using the two types of scores in important inter-related post-Evalutron calculations, so please be thoughtful in selecting your Broad Categories and related Fine Scores. What we are really asking is that you apply a level of &amp;quot;reasonableness&amp;quot; to both your scores and your associations. For example, if you place a candidate in the VERY SIMILAR category, a Fine Score of 21 would not, by most standards, be &amp;quot;reasonable&amp;quot;. The same applies at the other extreme: a Broad Category score of NOT SIMILAR should not be associated with a Fine Score of, say, 72 or 84.&lt;br /&gt;
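The reasonableness constraint above can be sketched as a small consistency check. Note that the category codes and the fine-score ranges below are illustrative assumptions for this sketch, not official Evalutron 6000 rules; the overlapping ranges reflect the expected subjectivity between graders.&lt;br /&gt;

```python
# Hypothetical sanity check on grader input: category codes and score
# ranges are assumptions for illustration, not the official Evalutron rules.
REASONABLE_RANGES = {
    "NS": range(0, 41),    # Not Similar: low fine scores only
    "SS": range(20, 81),   # Somewhat Similar: wide, overlapping middle band
    "VS": range(60, 101),  # Very Similar: high fine scores only
}

def is_reasonable(category: str, fine_score: int) -> bool:
    """Return True if the fine score plausibly matches the broad category."""
    if not 0 <= fine_score <= 100:
        return False  # fine scores must lie on the 0-100 scale
    return fine_score in REASONABLE_RANGES[category]
```

Under these assumed ranges, the examples from the text behave as expected: a VERY SIMILAR rating with a fine score of 21 fails the check, as does a NOT SIMILAR rating with a fine score of 72.&lt;br /&gt;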
&lt;br /&gt;
&lt;br /&gt;
==Coming and Going: Avoiding Grader Fatigue==&lt;br /&gt;
&lt;br /&gt;
Listening to, and then comparing, hundreds of audio files is very tiring. The back-end of the Evalutron 6000 includes a robust database that records your grading scores in near real time, along with information about which queries and candidates you have yet to review. All of this information is stored in association with your personal sign-in ID, which means you can break up your grading over several days, at times convenient and productive for you. In fact, we recommend that you not try to tackle your &amp;quot;assignment&amp;quot; in one big chunk: fresh ears are happy ears, and happy ears make for better evaluations.&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7223</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7223"/>
		<updated>2010-07-09T07:55:04Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* mirex by year&lt;br /&gt;
** 2010:Main_Page|MIREX 2010&lt;br /&gt;
** 2009:Main_Page|MIREX 2009&lt;br /&gt;
** 2008:Main_Page|MIREX 2008&lt;br /&gt;
** 2007:Main_Page|MIREX 2007&lt;br /&gt;
** 2006:Main_Page|MIREX 2006&lt;br /&gt;
** 2005:Main_Page|MIREX 2005&lt;br /&gt;
&lt;br /&gt;
*results by year&lt;br /&gt;
**2010:MIREX2010_Results| MIREX 2010 Not Ready&lt;br /&gt;
**2009:MIREX2009_Results| MIREX 2009 Results &lt;br /&gt;
**2008:MIREX2008_Results| MIREX 2008 Results &lt;br /&gt;
**2007:MIREX2007_Results| MIREX 2007 Results &lt;br /&gt;
**2006:MIREX2006_Results| MIREX 2006 Results &lt;br /&gt;
**2005:MIREX2005_Results| MIREX 2005 Results &lt;br /&gt;
&lt;br /&gt;
* SEARCH&lt;br /&gt;
&lt;br /&gt;
* navigation&lt;br /&gt;
** mainpage|MIREX CENTRAL HOME&lt;br /&gt;
** portal-url|portal&lt;br /&gt;
** currentevents-url|currentevents&lt;br /&gt;
** recentchanges-url|recentchanges&lt;br /&gt;
** randompage-url|randompage&lt;br /&gt;
** helppage|help&lt;br /&gt;
&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7222</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7222"/>
		<updated>2010-07-09T07:51:01Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Reverted edits by CameronJones (Talk) to last revision by Jdownie&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* mirex by year&lt;br /&gt;
&lt;br /&gt;
** 2010:Main_Page|MIREX 2010&lt;br /&gt;
** 2009:Main_Page|MIREX 2009&lt;br /&gt;
** 2008:Main_Page|MIREX 2008&lt;br /&gt;
** 2007:Main_Page|MIREX 2007&lt;br /&gt;
** 2006:Main_Page|MIREX 2006&lt;br /&gt;
** 2005:Main_Page|MIREX 2005&lt;br /&gt;
&lt;br /&gt;
*results by year&lt;br /&gt;
***2010:MIREX2010_Results| MIREX 2010 Not Ready&lt;br /&gt;
***2009:MIREX2009_Results| MIREX 2009 Results &lt;br /&gt;
***2008:MIREX2008_Results| MIREX 2008 Results &lt;br /&gt;
***2007:MIREX2007_Results| MIREX 2007 Results &lt;br /&gt;
***2006:MIREX2006_Results| MIREX 2006 Results &lt;br /&gt;
***2005:MIREX2005_Results| MIREX 2005 Results &lt;br /&gt;
&lt;br /&gt;
* SEARCH&lt;br /&gt;
* navigation&lt;br /&gt;
** mainpage|MIREX CENTRAL HOME&lt;br /&gt;
** portal-url|portal&lt;br /&gt;
** currentevents-url|currentevents&lt;br /&gt;
** recentchanges-url|recentchanges&lt;br /&gt;
** randompage-url|randompage&lt;br /&gt;
** helppage|help&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7221</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7221"/>
		<updated>2010-07-09T07:50:36Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* mirex by year&lt;br /&gt;
** 2010:Main_Page|MIREX 2010 ([[2010:MIREX2010_Results|Not Ready]])&lt;br /&gt;
** 2009:Main_Page|MIREX 2009&lt;br /&gt;
** 2008:Main_Page|MIREX 2008&lt;br /&gt;
** 2007:Main_Page|MIREX 2007&lt;br /&gt;
** 2006:Main_Page|MIREX 2006&lt;br /&gt;
** 2005:Main_Page|MIREX 2005&lt;br /&gt;
&lt;br /&gt;
* results by year&lt;br /&gt;
** &lt;br /&gt;
** 2009:MIREX2009_Results| MIREX 2009 Results &lt;br /&gt;
** 2008:MIREX2008_Results| MIREX 2008 Results &lt;br /&gt;
** 2007:MIREX2007_Results| MIREX 2007 Results &lt;br /&gt;
** 2006:MIREX2006_Results| MIREX 2006 Results &lt;br /&gt;
** 2005:MIREX2005_Results| MIREX 2005 Results &lt;br /&gt;
&lt;br /&gt;
* SEARCH&lt;br /&gt;
* navigation&lt;br /&gt;
** mainpage|MIREX HOME&lt;br /&gt;
** recentchanges-url|recentchanges&lt;br /&gt;
** randompage-url|randompage&lt;br /&gt;
** helppage|help&lt;br /&gt;
&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MIREX_2010_Submission_Instructions&amp;diff=7211</id>
		<title>MIREX 2010 Submission Instructions</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MIREX_2010_Submission_Instructions&amp;diff=7211"/>
		<updated>2010-06-25T06:22:20Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Redirected page to 2010:MIREX 2010 Submission Instructions&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#Redirect [[2010:MIREX 2010 Submission Instructions]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=Sandbox&amp;diff=7131</id>
		<title>Sandbox</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=Sandbox&amp;diff=7131"/>
		<updated>2010-06-05T09:01:59Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: testing math extension after upgrade&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;math&amp;gt;\alpha + 5 * \beta&amp;lt;/math&amp;gt;&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=File:Korean_keyboard_layout.png&amp;diff=7130</id>
		<title>File:Korean keyboard layout.png</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=File:Korean_keyboard_layout.png&amp;diff=7130"/>
		<updated>2010-06-05T09:01:20Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Testing file upload after upgrade&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Testing file upload after upgrade&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7061</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7061"/>
		<updated>2010-06-04T15:43:41Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* mirex by year&lt;br /&gt;
** 2010:Main_Page|MIREX 2010&lt;br /&gt;
** 2009:Main_Page|MIREX 2009&lt;br /&gt;
** 2008:Main_Page|MIREX 2008&lt;br /&gt;
** 2007:Main_Page|MIREX 2007&lt;br /&gt;
** 2006:Main_Page|MIREX 2006&lt;br /&gt;
** 2005:Main_Page|MIREX 2005&lt;br /&gt;
&lt;br /&gt;
* results by year&lt;br /&gt;
** 2010:MIREX2010_Results| MIREX 2010 Not Ready&lt;br /&gt;
** 2009:MIREX2009_Results| MIREX 2009 Results &lt;br /&gt;
** 2008:MIREX2008_Results| MIREX 2008 Results &lt;br /&gt;
** 2007:MIREX2007_Results| MIREX 2007 Results &lt;br /&gt;
** 2006:MIREX2006_Results| MIREX 2006 Results &lt;br /&gt;
** 2005:MIREX2005_Results| MIREX 2005 Results &lt;br /&gt;
&lt;br /&gt;
* SEARCH&lt;br /&gt;
* navigation&lt;br /&gt;
** mainpage|MIREX HOME&lt;br /&gt;
** recentchanges-url|recentchanges&lt;br /&gt;
** randompage-url|randompage&lt;br /&gt;
** helppage|help&lt;br /&gt;
&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7060</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7060"/>
		<updated>2010-06-04T15:41:48Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* mirex by year&lt;br /&gt;
&lt;br /&gt;
** 2010:Main_Page|MIREX 2010&lt;br /&gt;
** 2009:Main_Page|MIREX 2009&lt;br /&gt;
** 2008:Main_Page|MIREX 2008&lt;br /&gt;
** 2007:Main_Page|MIREX 2007&lt;br /&gt;
** 2006:Main_Page|MIREX 2006&lt;br /&gt;
** 2005:Main_Page|MIREX 2005&lt;br /&gt;
&lt;br /&gt;
*results by year&lt;br /&gt;
***2010:MIREX2010_Results| MIREX 2010 Not Ready&lt;br /&gt;
***2009:MIREX2009_Results| MIREX 2009 Results &lt;br /&gt;
***2008:MIREX2008_Results| MIREX 2008 Results &lt;br /&gt;
***2007:MIREX2007_Results| MIREX 2007 Results &lt;br /&gt;
***2006:MIREX2006_Results| MIREX 2006 Results &lt;br /&gt;
***2005:MIREX2005_Results| MIREX 2005 Results &lt;br /&gt;
&lt;br /&gt;
* SEARCH&lt;br /&gt;
* navigation&lt;br /&gt;
** mainpage|MIREX HOME&lt;br /&gt;
** portal-url|portal&lt;br /&gt;
** currentevents-url|currentevents&lt;br /&gt;
** recentchanges-url|recentchanges&lt;br /&gt;
** randompage-url|randompage&lt;br /&gt;
** helppage|help&lt;br /&gt;
&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=Audio_Drum_Detection&amp;diff=7050</id>
		<title>Audio Drum Detection</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=Audio_Drum_Detection&amp;diff=7050"/>
		<updated>2010-06-04T10:37:11Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Created page with '* 2005:Audio Drum Det'&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* [[2005:Audio Drum Det]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=Audio_Classification&amp;diff=7049</id>
		<title>Audio Classification</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=Audio_Classification&amp;diff=7049"/>
		<updated>2010-06-04T09:53:47Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Created page with '* 2005:Audio Artist Identification * 2005:Audio Genre Classification * 2007:Audio Genre Classification * 2007:Audio Music Mood Classification * [[2007:Audio Class…'&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* [[2005:Audio Artist Identification]]&lt;br /&gt;
* [[2005:Audio Genre Classification]]&lt;br /&gt;
* [[2007:Audio Genre Classification]]&lt;br /&gt;
* [[2007:Audio Music Mood Classification]]&lt;br /&gt;
* [[2007:Audio Classical Composer Identification]]&lt;br /&gt;
* [[2008:Audio Genre Classification]]&lt;br /&gt;
* [[2008:Audio Music Mood Classification]]&lt;br /&gt;
* [[2009:Audio Genre Classification]]&lt;br /&gt;
* [[2009:Audio Music Mood Classification]]&lt;br /&gt;
* [[2010:Audio_Classification_(Train/Test)_Tasks]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=Audio_Chord_Estimation&amp;diff=7048</id>
		<title>Audio Chord Estimation</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=Audio_Chord_Estimation&amp;diff=7048"/>
		<updated>2010-06-04T09:48:10Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* [[2008:Audio Chord Detection]]&lt;br /&gt;
* [[2009:Audio Chord Detection]]&lt;br /&gt;
* [[2010:Audio Chord Estimation]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=Audio_Chord_Estimation&amp;diff=7047</id>
		<title>Audio Chord Estimation</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=Audio_Chord_Estimation&amp;diff=7047"/>
		<updated>2010-06-04T09:47:12Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Created page with '* 2010:Audio Chord Estimation'&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* [[2010:Audio Chord Estimation]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=Audio_Beat_Tracking&amp;diff=7046</id>
		<title>Audio Beat Tracking</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=Audio_Beat_Tracking&amp;diff=7046"/>
		<updated>2010-06-04T09:46:27Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Created page with '* 2006:Audio Beat Tracking * 2009:Audio Beat Tracking * 2010:Audio Beat Tracking'&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* [[2006:Audio Beat Tracking]]&lt;br /&gt;
* [[2009:Audio Beat Tracking]]&lt;br /&gt;
* [[2010:Audio Beat Tracking]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7045</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7045"/>
		<updated>2010-06-04T08:59:29Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* navigation&lt;br /&gt;
** mainpage|mainpage-description&lt;br /&gt;
** portal-url|portal&lt;br /&gt;
** currentevents-url|currentevents&lt;br /&gt;
** recentchanges-url|recentchanges&lt;br /&gt;
** randompage-url|randompage&lt;br /&gt;
** helppage|help&lt;br /&gt;
* mirex&lt;br /&gt;
** 2005:Main_Page|MIREX 2005&lt;br /&gt;
** 2006:Main_Page|MIREX 2006&lt;br /&gt;
** 2007:Main_Page|MIREX 2007&lt;br /&gt;
** 2008:Main_Page|MIREX 2008&lt;br /&gt;
** 2009:Main_Page|MIREX 2009&lt;br /&gt;
** 2010:Main_Page|MIREX 2010&lt;br /&gt;
* SEARCH&lt;br /&gt;
* tasks&lt;br /&gt;
** Audio Beat Tracking|Audio Beat Tracking&lt;br /&gt;
** Audio Chord Estimation|Au. Chord Estimation&lt;br /&gt;
** Audio Classification|Au. Classification&lt;br /&gt;
** Audio Cover Song Identification|Au. Cover Song ID&lt;br /&gt;
** Audio Drum Detection|Au. Drum Detection&lt;br /&gt;
** Audio Key Detection|Au. Key Detection&lt;br /&gt;
** Audio Melody Extraction|Au. Melody Extraction&lt;br /&gt;
** Audio Music Similarity|Au. Music Similarity&lt;br /&gt;
** Audio Onset Detection|Au. Onset Detection&lt;br /&gt;
** Audio Tag Classification|Au. Tag Classification&lt;br /&gt;
** Audio Tempo Extraction|Au. Tempo Extraction&lt;br /&gt;
** Multi F0 Estimation|Multi F0 Estimation&lt;br /&gt;
** Query by Singing/Humming|Query by Singing / Humming&lt;br /&gt;
** Query by Tapping|Query by Tapping&lt;br /&gt;
** Score Following|Score Following&lt;br /&gt;
** Structural Segmentation|Structural Seg.&lt;br /&gt;
** Symbolic Classification|Symbolic Classification&lt;br /&gt;
** Symbolic Key Detection|Sym. Key Detection&lt;br /&gt;
** Symbolic Melodic Similarity|Sym. Melodic Similarity&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7044</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7044"/>
		<updated>2010-06-04T08:48:43Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* navigation&lt;br /&gt;
** mainpage|mainpage-description&lt;br /&gt;
** portal-url|portal&lt;br /&gt;
** currentevents-url|currentevents&lt;br /&gt;
** recentchanges-url|recentchanges&lt;br /&gt;
** randompage-url|randompage&lt;br /&gt;
** helppage|help&lt;br /&gt;
* mirex&lt;br /&gt;
** 2005:Main_Page|MIREX 2005&lt;br /&gt;
** 2006:Main_Page|MIREX 2006&lt;br /&gt;
** 2007:Main_Page|MIREX 2007&lt;br /&gt;
** 2008:Main_Page|MIREX 2008&lt;br /&gt;
** 2009:Main_Page|MIREX 2009&lt;br /&gt;
** 2010:Main_Page|MIREX 2010&lt;br /&gt;
* SEARCH&lt;br /&gt;
* tasks&lt;br /&gt;
** Audio Beat Tracking|Audio Beat Tracking&lt;br /&gt;
** Audio Chord Estimation|Audio Chord Estimation&lt;br /&gt;
** Audio Classification|Audio Classification&lt;br /&gt;
** Audio Cover Song ID|Audio Cover Song ID&lt;br /&gt;
** Audio Drum Detection|Audio Drum Detection&lt;br /&gt;
** Audio Key Detection|Audio Key Detection&lt;br /&gt;
** Audio Melody Extraction|Audio Melody Extraction&lt;br /&gt;
** Audio Music Similarity|Audio Music Similarity&lt;br /&gt;
** Audio Onset Detection|Audio Onset Detection&lt;br /&gt;
** Audio Tag Classification|Audio Tag Classification&lt;br /&gt;
** Audio Tempo Extraction|Audio Tempo Extraction&lt;br /&gt;
** Multi F0|Multi F0&lt;br /&gt;
** Query by Singing/Humming|Query by Singing/Humming&lt;br /&gt;
** Query by Tapping|Query by Tapping&lt;br /&gt;
** Score Following|Score Following&lt;br /&gt;
** Structural Segmentation|Structural Segmentation&lt;br /&gt;
** Symbolic Classification|Symbolic Classification&lt;br /&gt;
** Symbolic Key Detection|Symbolic Key Detection&lt;br /&gt;
** Symbolic Melodic Similarity|Symbolic Melodic Similarity&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7043</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7043"/>
		<updated>2010-06-04T08:47:08Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* navigation&lt;br /&gt;
** mainpage|mainpage-description&lt;br /&gt;
** portal-url|portal&lt;br /&gt;
** currentevents-url|currentevents&lt;br /&gt;
** recentchanges-url|recentchanges&lt;br /&gt;
** randompage-url|randompage&lt;br /&gt;
** helppage|help&lt;br /&gt;
* mirex&lt;br /&gt;
** 2005:Main_Page|MIREX 2005&lt;br /&gt;
** 2006:Main_Page|MIREX 2006&lt;br /&gt;
** 2007:Main_Page|MIREX 2007&lt;br /&gt;
** 2008:Main_Page|MIREX 2008&lt;br /&gt;
** 2009:Main_Page|MIREX 2009&lt;br /&gt;
** 2010:Main_Page|MIREX 2010&lt;br /&gt;
* SEARCH&lt;br /&gt;
* tasks&lt;br /&gt;
** Audio Beat Tracking&lt;br /&gt;
** Audio Chord Estimation&lt;br /&gt;
** Audio Classification&lt;br /&gt;
** Audio Cover Song ID&lt;br /&gt;
** Audio Drum Detection&lt;br /&gt;
** Audio Key Detection&lt;br /&gt;
** Audio Melody Extraction&lt;br /&gt;
** Audio Music Similarity&lt;br /&gt;
** Audio Onset Detection&lt;br /&gt;
** Audio Tag Classification&lt;br /&gt;
** Audio Tempo Extraction&lt;br /&gt;
** Multi F0&lt;br /&gt;
** Query by Singing/Humming&lt;br /&gt;
** Query by Tapping&lt;br /&gt;
** Score Following&lt;br /&gt;
** Structural Segmentation&lt;br /&gt;
** Symbolic Classification&lt;br /&gt;
** Symbolic Key Detection&lt;br /&gt;
** Symbolic Melodic Similarity&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7042</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7042"/>
		<updated>2010-06-04T08:29:46Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* navigation&lt;br /&gt;
** mainpage|mainpage-description&lt;br /&gt;
** portal-url|portal&lt;br /&gt;
** currentevents-url|currentevents&lt;br /&gt;
** recentchanges-url|recentchanges&lt;br /&gt;
** randompage-url|randompage&lt;br /&gt;
** helppage|help&lt;br /&gt;
* mirex&lt;br /&gt;
** 2005:Main_Page|MIREX 2005&lt;br /&gt;
** 2006:Main_Page|MIREX 2006&lt;br /&gt;
** 2007:Main_Page|MIREX 2007&lt;br /&gt;
** 2008:Main_Page|MIREX 2008&lt;br /&gt;
** 2009:Main_Page|MIREX 2009&lt;br /&gt;
** 2010:Main_Page|MIREX 2010&lt;br /&gt;
* SEARCH&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7041</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MediaWiki:Sidebar&amp;diff=7041"/>
		<updated>2010-06-04T08:28:50Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Created page with '* navigation ** mainpage|mainpage-description ** portal-url|portal ** currentevents-url|currentevents ** recentchanges-url|recentchanges ** randompage-url|randompage ** helppage|…'&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* navigation&lt;br /&gt;
** mainpage|mainpage-description&lt;br /&gt;
** portal-url|portal&lt;br /&gt;
** currentevents-url|currentevents&lt;br /&gt;
** recentchanges-url|recentchanges&lt;br /&gt;
** randompage-url|randompage&lt;br /&gt;
** helppage|help&lt;br /&gt;
* SEARCH&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* mirex&lt;br /&gt;
** 2005|2005:Main_Page&lt;br /&gt;
** 2006|2006:Main_Page&lt;br /&gt;
** 2007|2007:Main_Page&lt;br /&gt;
** 2008|2008:Main_Page&lt;br /&gt;
** 2009|2009:Main_Page&lt;br /&gt;
** 2010|2010:Main_Page&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MediaWiki:Mainpage&amp;diff=7040</id>
		<title>MediaWiki:Mainpage</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MediaWiki:Mainpage&amp;diff=7040"/>
		<updated>2010-06-04T08:25:13Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Created page with 'MIREX HOME'&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;MIREX HOME&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010:MIREX_HOME&amp;diff=7039</id>
		<title>2010:MIREX HOME</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010:MIREX_HOME&amp;diff=7039"/>
		<updated>2010-06-04T04:59:56Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: /* MIREX 2010 Evaluation Tasks */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Welcome to MIREX 2010==&lt;br /&gt;
This is the main page for the sixth running of the Music Information Retrieval Evaluation eXchange (MIREX 2010). The International Music Information Retrieval Systems Evaluation Laboratory ([https://music-ir.org/evaluation IMIRSEL]) at the Graduate School of Library and Information Science ([http://www.lis.illinois.edu GSLIS]), University of Illinois at Urbana-Champaign ([http://www.illinois.edu UIUC]) is the principal organizer of MIREX 2010. &lt;br /&gt;
&lt;br /&gt;
The MIREX 2010 community will hold its annual meeting as part of [http://ismir2010.ismir.net/ The 11th International Conference on Music Information Retrieval], ISMIR 2010, which will be held in Utrecht, Netherlands, from August 9th to 13th, 2010. The MIREX plenary (working lunch) and poster sessions will be held Wednesday, 11 August 2010.&lt;br /&gt;
&lt;br /&gt;
J. Stephen Downie&amp;lt;br&amp;gt;&lt;br /&gt;
Director, IMIRSEL&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===MIREX 2010 Evaluation Tasks===&lt;br /&gt;
&lt;br /&gt;
The IMIRSEL team at UIUC solicited proposals for evaluation tasks to be performed at the Music Information Retrieval Evaluation eXchange 2010 (MIREX 2010) and polled the community on their likelihood of participation in each task. A summary of the responses from the community is given below:&lt;br /&gt;
&lt;br /&gt;
Results as of Monday 24th May 2010:&lt;br /&gt;
&lt;br /&gt;
Total individual responses = 74&lt;br /&gt;
&lt;br /&gt;
&amp;lt;csv p=0&amp;gt;2010/poll/MIREX_Task_Participation_Poll.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Hence, the IMIRSEL team has decided to attempt the running of the following tasks at MIREX 2010:&lt;br /&gt;
* [[2010:Audio Classification (Train/Test) Tasks]], incorporating:&lt;br /&gt;
** Audio Artist Identification&lt;br /&gt;
** Audio US Pop Genre Classification&lt;br /&gt;
** Audio Latin Genre Classification&lt;br /&gt;
** Audio Music Mood Classification&lt;br /&gt;
** Audio Classical Composer Identification&lt;br /&gt;
* [[2010:Audio Cover Song Identification]]&lt;br /&gt;
* [[2010:Audio Tag Classification]] &lt;br /&gt;
* [[2010:Audio Music Similarity and Retrieval]]&lt;br /&gt;
* [[2010:Symbolic Melodic Similarity]]&lt;br /&gt;
* [[2010:Audio Onset Detection]]&lt;br /&gt;
* [[2010:Audio Key Detection]]&lt;br /&gt;
* [[2010:Real-time Audio to Score Alignment (a.k.a Score Following)]]&lt;br /&gt;
* [[2010:Query by Singing/Humming]]&lt;br /&gt;
* [[2010:Audio Melody Extraction]]&lt;br /&gt;
* [[2010:Multiple Fundamental Frequency Estimation &amp;amp; Tracking]]&lt;br /&gt;
* [[2010:Audio Chord Estimation]]&lt;br /&gt;
* &amp;lt;strike&amp;gt;[[2010:Query by Tapping]]&amp;lt;/strike&amp;gt;&lt;br /&gt;
* [[2010:Audio Beat Tracking]]&lt;br /&gt;
* [[2010:Structural Segmentation]]&lt;br /&gt;
&lt;br /&gt;
==== New 2010 Proposals ====&lt;br /&gt;
* &amp;lt;strike&amp;gt;[[2010:Harmonic Analysis]]&amp;lt;/strike&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Projected dates===&lt;br /&gt;
* 1st June 2010: MIREX submission system opens (target date)&lt;br /&gt;
* 22nd June - 1st July 2010: Rolling MIREX submission system closures (dates to be announced)&lt;br /&gt;
* 15th July 2010: MIREX results posting begins&lt;br /&gt;
* 1st August 2010: All MIREX results posted (somewhat hopeful target date)&lt;br /&gt;
* 2-6th August 2010: USMIR Summer School&lt;br /&gt;
* 9-13th August 2010: ISMIR conference&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Note to New Participants===&lt;br /&gt;
Please take the time to read the following review article that explains the history and structure of MIREX.&lt;br /&gt;
&lt;br /&gt;
Downie, J. Stephen (2008). The Music Information Retrieval Evaluation Exchange (2005-2007):&amp;lt;br&amp;gt;&lt;br /&gt;
A window into music information retrieval research. ''Acoustical Science and Technology 29'' (4): 247-255. &amp;lt;br&amp;gt;&lt;br /&gt;
Available at: [http://dx.doi.org/10.1250/ast.29.247 http://dx.doi.org/10.1250/ast.29.247]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Note to All Participants===&lt;br /&gt;
Because MIREX is premised upon the sharing of ideas and results, '''ALL''' MIREX participants are expected to:&lt;br /&gt;
&lt;br /&gt;
# submit, along with their programme(s), a DRAFT 2-3 page extended abstract PDF in the ISMIR format that helps us and the community better understand how the algorithm works.&lt;br /&gt;
# submit a FINALIZED 2-3 page extended abstract PDF in the ISMIR format prior to ISMIR 2010 for posting on the respective results pages (sometimes the same abstract can be used for multiple submissions; in many cases the DRAFT and FINALIZED abstracts are the same)&lt;br /&gt;
# present a poster at the MIREX 2010 poster session at ISMIR 2010 (exact date to be announced)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Software Dependency Requests===&lt;br /&gt;
If you have not submitted to MIREX before or are unsure whether IMIRSEL/NEMA currently supports some of the software/architecture dependencies for your submission a [https://spreadsheets.google.com/embeddedform?formkey=dDltRjc4NDBDdkZiaF9qZXV0bU5ScUE6MA dependency request form is available]. Please submit details of your dependencies on this form and the IMIRSEL team will attempt to satisfy them for you. &lt;br /&gt;
&lt;br /&gt;
Due to the high volume of submissions expected at MIREX 2010, a submission with difficult-to-satisfy dependencies of which the team has not been given sufficient notice may be rejected.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Finally, you will also be expected to detail your software/architecture dependencies in a README file to be provided to the submission system.&lt;br /&gt;
&lt;br /&gt;
==Getting Involved in MIREX 2010==&lt;br /&gt;
MIREX is a community-based endeavour. Be a part of the community and help make MIREX 2010 the best yet.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Mailing List Participation===&lt;br /&gt;
If you are interested in formal MIR evaluation, you should also subscribe to the &amp;quot;MIREX&amp;quot; (aka &amp;quot;EvalFest&amp;quot;) mail list and participate in the community discussions about defining and running MIREX 2010 tasks. Subscription information at: &lt;br /&gt;
[https://mail.lis.illinois.edu/mailman/listinfo/evalfest]. &lt;br /&gt;
&lt;br /&gt;
If you are participating in MIREX 2010, it is VERY IMPORTANT that you are subscribed to EvalFest. Deadlines, task updates and other important information will be announced via this mailing list. Please use EvalFest for discussion of MIREX task proposals and other MIREX-related issues. This wiki (the MIREX 2010 wiki) will be used to embody and disseminate task proposals; task-related discussions, however, should be conducted on the MIREX organization mailing list (EvalFest) rather than on this wiki, and then summarized here. &lt;br /&gt;
&lt;br /&gt;
Where possible, definitions or example code for new evaluation metrics or tasks should be provided to the IMIRSEL team, who will embody them in software as part of the NEMA analytics framework. The framework will be released to the community at or before ISMIR 2010, providing a standardised set of interfaces and outputs for disciplined evaluation procedures across a great many MIR tasks.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Wiki Participation===&lt;br /&gt;
'''''Please note that you may need to create a NEW login for this wiki even if you have a login that you previously used for editing the MIREX 2005, 2006, 2007, 2008 or 2009 wikis.'''''&lt;br /&gt;
&lt;br /&gt;
However, starting in 2010 the MIREX wikis have been merged so that logins will persist for future iterations of MIREX.&lt;br /&gt;
&lt;br /&gt;
Please create an account via: [[Special:Userlogin]].&lt;br /&gt;
&lt;br /&gt;
Please note that because of &amp;quot;spam-bots&amp;quot;, MIREX wiki registration requests may be moderated by IMIRSEL members. It might take up to 24 hours for approval (Thank you for your patience!).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==MIREX 2010 Submission Instructions==&lt;br /&gt;
&lt;br /&gt;
* Be sure to follow the [[2009:Best Coding Practices for MIREX | Best Coding Practices for MIREX]]&lt;br /&gt;
* Be sure to follow the [[MIREX 2010 Submission Instructions]]&lt;br /&gt;
&lt;br /&gt;
==MIREX 2005 - 2009 Wikis==&lt;br /&gt;
This is the new wiki for MIREX 2010. The wikis for MIREX 2005 - 2009 are available at:&lt;br /&gt;
&lt;br /&gt;
'''[[2009:Main_Page|MIREX 2009]]''' &lt;br /&gt;
https://www.music-ir.org/mirex/2009/&lt;br /&gt;
&lt;br /&gt;
'''[[2008:Main_Page|MIREX 2008]]''' &lt;br /&gt;
https://www.music-ir.org/mirex/2008/&lt;br /&gt;
&lt;br /&gt;
'''[[2007:Main_Page|MIREX 2007]]''' &lt;br /&gt;
https://www.music-ir.org/mirex/2007/&lt;br /&gt;
&lt;br /&gt;
'''[[2006:Main_Page|MIREX 2006]]''' &lt;br /&gt;
https://www.music-ir.org/mirex/2006/&lt;br /&gt;
&lt;br /&gt;
'''[[2005:Main_Page|MIREX 2005]]''' &lt;br /&gt;
https://www.music-ir.org/mirex/2005/&lt;br /&gt;
&lt;br /&gt;
You can interlink between this wiki and the previous wikis by prefixing links with '''2005:''', '''2006:''', '''2007:''', '''2008:''' or '''2009:''' to connect to pages in the corresponding MIREX wiki.&lt;br /&gt;
&lt;br /&gt;
===ISMIR 2004 Audio Description Contest===&lt;br /&gt;
The Audio Description Contest held at ISMIR 2004 was a precursor to MIREX. Details of the ISMIR 2004 Audio Description Contest can be found at:&lt;br /&gt;
&lt;br /&gt;
'''[http://ismir2004.ismir.net/ISMIR_Contest.html ISMIR 2004 Audio Description Contest]''' &lt;br /&gt;
http://ismir2004.ismir.net/ISMIR_Contest.html&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010:Structural_Segmentation&amp;diff=6971</id>
		<title>2010:Structural Segmentation</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010:Structural_Segmentation&amp;diff=6971"/>
		<updated>2010-05-29T00:44:31Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: /* Frame clustering */ mathified the equations&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Description ==&lt;br /&gt;
&lt;br /&gt;
The aim of the MIREX structural segmentation evaluation is to identify the key structural sections in musical audio. The segment structure (or form) is one of the most important musical parameters. It is furthermore special because musical structure -- especially the conventional sections of popular music (verse, chorus, etc.) -- is accessible to everybody: it needs no particular musical knowledge. This task was first run in 2009.&lt;br /&gt;
&lt;br /&gt;
== Data == &lt;br /&gt;
&lt;br /&gt;
=== Collections ===&lt;br /&gt;
The final MIREX data set for structural segmentation comprises 297 songs. The majority come from the Beatles collection; works from other artists round out the evaluation dataset.&lt;br /&gt;
&lt;br /&gt;
=== Audio Formats ===&lt;br /&gt;
&lt;br /&gt;
* CD-quality (PCM, 16-bit, 44100 Hz)&lt;br /&gt;
* single channel (mono)&lt;br /&gt;
&lt;br /&gt;
== Submission Format ==&lt;br /&gt;
&lt;br /&gt;
Submissions to this task must conform to the format detailed below. Submissions should be packaged and contain at least two files: the algorithm itself and a README containing contact information and detailing, in full, the use of the algorithm.&lt;br /&gt;
&lt;br /&gt;
=== Input Data ===&lt;br /&gt;
Participating algorithms will have to read audio in the following format:&lt;br /&gt;
&lt;br /&gt;
* Sample rate: 44.1 kHz&lt;br /&gt;
* Sample size: 16 bit&lt;br /&gt;
* Number of channels: 1 (mono)&lt;br /&gt;
* Encoding: WAV &lt;br /&gt;
&lt;br /&gt;
=== Output Data ===&lt;br /&gt;
&lt;br /&gt;
The structural segmentation algorithms will return the segmentation in an ASCII text file for each input .wav audio file. The specification of this output file is immediately below.&lt;br /&gt;
&lt;br /&gt;
=== Output File Format (Structural Segmentation) ===&lt;br /&gt;
&lt;br /&gt;
The Structural Segmentation output file format is a tab-delimited ASCII text format. This is the same as Chris Harte's chord labelling files (.lab), and so is the same format as the ground truth as well. Onset and offset times are given in seconds, and the labels are simply letters: 'A', 'B', ... with segments referring to the same structural element having the same label.&lt;br /&gt;
&lt;br /&gt;
Three column text file of the format&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;onset_time(sec)&amp;gt;\t&amp;lt;offset_time(sec)&amp;gt;\t&amp;lt;label&amp;gt;\n&lt;br /&gt;
 &amp;lt;onset_time(sec)&amp;gt;\t&amp;lt;offset_time(sec)&amp;gt;\t&amp;lt;label&amp;gt;\n&lt;br /&gt;
 ...&lt;br /&gt;
&lt;br /&gt;
where \t denotes a tab, \n denotes the end of line. The &amp;lt; and &amp;gt; characters are not included. An example output file would look something like:&lt;br /&gt;
&lt;br /&gt;
 0.000    5.223    A&lt;br /&gt;
 5.223    15.101   B&lt;br /&gt;
 15.101   20.334   A&lt;br /&gt;
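As an informal illustration, the three-column format above can be read with a few lines of code. This is a sketch only; the function name read_segments is our own and is not part of any MIREX tooling. It splits on runs of whitespace, so both tab-delimited and space-padded .lab-style files parse.&lt;br /&gt;

```python
def read_segments(path):
    """Read a segmentation file in the three-column .lab-style format
    (onset seconds, offset seconds, label per line).
    Returns a list of (onset, offset, label) tuples."""
    segments = []
    with open(path) as fh:
        for line in fh:
            fields = line.split()  # tabs or runs of spaces both work
            if not fields:
                continue  # skip blank lines
            onset, offset, label = fields
            segments.append((float(onset), float(offset), label))
    return segments
```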
&lt;br /&gt;
=== Algorithm Calling Format ===&lt;br /&gt;
&lt;br /&gt;
The submitted algorithm must take as arguments a SINGLE .wav file to perform the structural segmentation on as well as the full output path and filename of the output file. The ability to specify the output path and file name is essential. Denoting the input .wav file path and name as %input and the output file path and name as %output, a program called foobar could be called from the command-line as follows:&lt;br /&gt;
&lt;br /&gt;
 foobar %input %output&lt;br /&gt;
 foobar -i %input -o %output&lt;br /&gt;
&lt;br /&gt;
Moreover, if your submission takes additional parameters, foobar could be called like:&lt;br /&gt;
&lt;br /&gt;
 foobar .1 %input %output&lt;br /&gt;
 foobar -param1 .1 -i %input -o %output  &lt;br /&gt;
&lt;br /&gt;
If your submission is in MATLAB, it should be submitted as a function. Once again, the function must contain String inputs for the full path and names of the input and output files. Parameters could also be specified as input arguments of the function. For example: &lt;br /&gt;
&lt;br /&gt;
 foobar('%input','%output')&lt;br /&gt;
 foobar(.1,'%input','%output')&lt;br /&gt;
&lt;br /&gt;
=== README File ===&lt;br /&gt;
&lt;br /&gt;
A README file accompanying each submission should contain explicit instructions on how to run the program (as well as contact information, etc.). In particular, each command line to run should be specified, using %input for the input sound file and %output for the resulting text file.&lt;br /&gt;
&lt;br /&gt;
For instance, to test the program foobar with a specific value for parameter param1, the README file would look like:&lt;br /&gt;
&lt;br /&gt;
 foobar -param1 .1 -i %input -o %output&lt;br /&gt;
&lt;br /&gt;
For a submission using MATLAB, the README file could look like:&lt;br /&gt;
&lt;br /&gt;
 matlab -r &amp;quot;foobar(.1,'%input','%output');quit;&amp;quot;&lt;br /&gt;
&lt;br /&gt;
== Evaluation Procedures ==&lt;br /&gt;
At ISMIR 2008, [http://ismir2008.ismir.net/papers/ISMIR2008_219.pdf Lukashevich] proposed a measure for segmentation evaluation. Because of the complexity of the structural segmentation task definition, several different evaluation measures will be employed to address different aspects. It should be noted that none of the evaluation measures cares about the true labels of the sections: they only denote the clustering. This means that it does not matter whether the systems produce true labels such as &amp;quot;chorus&amp;quot; and &amp;quot;verse&amp;quot;, or arbitrary labels such as &amp;quot;A&amp;quot; and &amp;quot;B&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
=== Boundary retrieval ===&lt;br /&gt;
'''Hit rate''' A found segment boundary is accepted as correct if it is within 0.5s ([http://ismir2007.ismir.net/proceedings/ISMIR2007_p051_turnbull.pdf Turnbull et al. ISMIR2007]) or 3s ([http://dx.doi.org/10.1109/TASL.2007.910781 Levy &amp;amp; Sandler TASLP2008]) of a boundary in the ground truth. Based on the matched hits, ''boundary retrieval recall rate'', ''boundary retrieval precision rate'', and ''boundary retrieval F-measure'' are calculated.&lt;br /&gt;
&lt;br /&gt;
'''Median deviation''' Two median deviation measures between boundaries in the result and the ground truth are calculated: ''median true-to-guess'' is the median time from boundaries in the ground truth to the closest boundaries in the result, and ''median guess-to-true'' is similarly the median time from boundaries in the result to the closest boundaries in the ground truth. ([http://ismir2007.ismir.net/proceedings/ISMIR2007_p051_turnbull.pdf Turnbull et al. ISMIR2007])&lt;br /&gt;
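The boundary measures above can be sketched as follows. This is a simplified illustration with names of our own choosing; the hit counting here ignores any one-to-one matching refinements an official evaluator may apply.&lt;br /&gt;

```python
import statistics

def boundary_hit_scores(guess, truth, tolerance=0.5):
    """Boundary retrieval precision, recall and F-measure: a guessed
    boundary is a hit if it lies within `tolerance` seconds of some
    ground-truth boundary (0.5s or 3s in the task definition)."""
    precision = sum(any(tolerance >= abs(g - t) for t in truth) for g in guess) / len(guess)
    recall = sum(any(tolerance >= abs(t - g) for g in guess) for t in truth) / len(truth)
    f = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f

def median_deviations(guess, truth):
    """Median true-to-guess and guess-to-true boundary deviations (seconds)."""
    true_to_guess = statistics.median(min(abs(t - g) for g in guess) for t in truth)
    guess_to_true = statistics.median(min(abs(g - t) for t in truth) for g in guess)
    return true_to_guess, guess_to_true
```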
&lt;br /&gt;
=== Frame clustering ===&lt;br /&gt;
Both the result and the ground truth are handled in short frames (e.g., beat or fixed 100ms). All frame pairs in a structure description are handled. The pairs in which both frames are assigned to the same cluster (i.e., have the same label) form the sets &amp;lt;math&amp;gt;P_E&amp;lt;/math&amp;gt; (for the system result) and &amp;lt;math&amp;gt;P_A&amp;lt;/math&amp;gt; (for the ground truth). The ''pairwise precision rate'' can be calculated by &amp;lt;math&amp;gt;P = \frac{|P_E \cap P_A|}{|P_E|}&amp;lt;/math&amp;gt;, ''pairwise recall rate'' by &amp;lt;math&amp;gt;R = \frac{|P_E \cap P_A|}{|P_A|}&amp;lt;/math&amp;gt;, and ''pairwise F-measure'' by &amp;lt;math&amp;gt;F=\frac{2 P R}{P + R}&amp;lt;/math&amp;gt;. ([http://dx.doi.org/10.1109/TASL.2007.910781 Levy &amp;amp; Sandler TASLP2008])&lt;br /&gt;
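These set definitions admit a direct (if quadratic) rendering in code; the function and variable names are ours, not part of any official evaluator.&lt;br /&gt;

```python
from itertools import combinations

def pairwise_clustering_scores(est_labels, ref_labels):
    """Pairwise precision, recall and F-measure over frame labels.
    P_E / P_A are the sets of frame-index pairs sharing a label in the
    estimate / the ground-truth annotation, as defined above."""
    frames = range(len(ref_labels))
    P_E = {pair for pair in combinations(frames, 2) if est_labels[pair[0]] == est_labels[pair[1]]}
    P_A = {pair for pair in combinations(frames, 2) if ref_labels[pair[0]] == ref_labels[pair[1]]}
    agree = len(P_E.intersection(P_A))
    precision = agree / len(P_E)
    recall = agree / len(P_A)
    f = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f
```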
&lt;br /&gt;
=== Normalised conditional entropies ===&lt;br /&gt;
Over- and under-segmentation based evaluation measures were proposed in [http://ismir2008.ismir.net/papers/ISMIR2008_219.pdf Lukashevich ISMIR2008].&lt;br /&gt;
Structure descriptions are represented as frame sequences with the associated cluster information (similar to the Frame clustering measure). A confusion matrix between the labels in the ground truth and the result is calculated. The matrix C is of size |L_A| * |L_E|, i.e., the number of unique labels in the ground truth times the number of unique labels in the result. From the confusion matrix, the joint distribution is calculated by normalising the values by the total number of frames F:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_{i,j} = C_{i,j} / F&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Similarly, the two marginals are calculated:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_i^a = \sum_{j=1}^{|L_E|} C_{i,j}/F&amp;lt;/math&amp;gt;, and&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_j^e = \sum_{i=1}^{|L_A|} C_{i,j}/F&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Conditional distributions:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_{i,j}^{a|e} = C_{i,j} / \sum_{i=1}^{|L_A|} C_{i,j}&amp;lt;/math&amp;gt;, and&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_{i,j}^{e|a} = C_{i,j} / \sum_{j=1}^{|L_E|} C_{i,j}&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The conditional entropies will then be&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;H(E|A) = - \sum_{i=1}^{|L_A|} p_i^a \sum_{j=1}^{|L_E|} p_{i,j}^{e|a} \log_2(p_{i,j}^{e|a})&amp;lt;/math&amp;gt;, and&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;H(A|E) = - \sum_{j=1}^{|L_E|} p_j^e \sum_{i=1}^{|L_A|} p_{i,j}^{a|e} \log_2(p_{i,j}^{a|e})&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The final evaluation measures will then be the oversegmentation score&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;S_O = 1 - \frac{H(E|A)}{\log_2(|L_E|)}&amp;lt;/math&amp;gt; , and the undersegmentation score&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;S_U = 1 - \frac{H(A|E)}{\log_2(|L_A|)}&amp;lt;/math&amp;gt;&lt;br /&gt;
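Putting the definitions above together, both scores can be computed from parallel frame-label sequences. This is a sketch under naming of our own; joint label counts play the role of the confusion matrix C, and the handling of degenerate single-label descriptions (where the log normaliser would vanish) is an assumption of this sketch.&lt;br /&gt;

```python
import math
from collections import Counter

def over_under_segmentation(ref_labels, est_labels):
    """Oversegmentation S_O and undersegmentation S_U from the
    normalised conditional entropies H(E|A) and H(A|E) defined above."""
    F = len(ref_labels)
    C = Counter(zip(ref_labels, est_labels))  # C[(i, j)]: joint frame counts
    row = Counter(ref_labels)                 # row sums:    sum over j of C_{i,j}
    col = Counter(est_labels)                 # column sums: sum over i of C_{i,j}
    # Each joint cell contributes p_{i,j} times log2 of the conditional probability.
    H_E_given_A = -sum(c / F * math.log2(c / row[i]) for (i, j), c in C.items())
    H_A_given_E = -sum(c / F * math.log2(c / col[j]) for (i, j), c in C.items())
    # Single-label descriptions are scored 1.0 here to avoid dividing by log2(1).
    S_O = 1 - H_E_given_A / math.log2(len(col)) if len(col) > 1 else 1.0
    S_U = 1 - H_A_given_E / math.log2(len(row)) if len(row) > 1 else 1.0
    return S_O, S_U
```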
&lt;br /&gt;
== Relevant Development Collections == &lt;br /&gt;
*Jouni Paulus's [http://www.cs.tut.fi/sgn/arg/paulus/structure.html structure analysis page] links to a corpus of 177 Beatles songs ([http://www.cs.tut.fi/sgn/arg/paulus/beatles_sections_TUT.zip zip file]). The Beatles annotations are not a part of the TUTstructure07 dataset. That dataset contains 557 songs, a list of which is available [http://www.cs.tut.fi/sgn/arg/paulus/TUTstructure07_files.html here].&lt;br /&gt;
&lt;br /&gt;
*Ewald Peiszer's [http://www.ifs.tuwien.ac.at/mir/audiosegmentation.html thesis page] links to a portion of the corpus he used: 43 non-Beatles pop songs (including 10 J-pop songs) ([http://www.ifs.tuwien.ac.at/mir/audiosegmentation/dl/ep_groundtruth_excl_Paulus.zip zip file]).&lt;br /&gt;
&lt;br /&gt;
These public corpora give a combined 220 songs.&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010:Structural_Segmentation&amp;diff=6970</id>
		<title>2010:Structural Segmentation</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010:Structural_Segmentation&amp;diff=6970"/>
		<updated>2010-05-29T00:43:28Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: /* Normalised conditional entropies */ Hopefully TeX is working now...&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Description ==&lt;br /&gt;
&lt;br /&gt;
The aim of the MIREX structural segmentation evaluation is to identify the key structural sections in musical audio. The segment structure (or form) is one of the most important musical parameters. It is furthermore special because musical structure -- especially the conventional sections of popular music (verse, chorus, etc.) -- is accessible to everybody: it needs no particular musical knowledge. This task was first run in 2009.&lt;br /&gt;
&lt;br /&gt;
== Data == &lt;br /&gt;
&lt;br /&gt;
=== Collections ===&lt;br /&gt;
The final MIREX data set for structural segmentation comprises 297 songs. The majority come from the Beatles collection; works from other artists round out the evaluation dataset.&lt;br /&gt;
&lt;br /&gt;
=== Audio Formats ===&lt;br /&gt;
&lt;br /&gt;
* CD-quality (PCM, 16-bit, 44100 Hz)&lt;br /&gt;
* single channel (mono)&lt;br /&gt;
&lt;br /&gt;
== Submission Format ==&lt;br /&gt;
&lt;br /&gt;
Submissions to this task must conform to the format detailed below. Submissions should be packaged and contain at least two files: the algorithm itself and a README containing contact information and detailing, in full, the use of the algorithm.&lt;br /&gt;
&lt;br /&gt;
=== Input Data ===&lt;br /&gt;
Participating algorithms will have to read audio in the following format:&lt;br /&gt;
&lt;br /&gt;
* Sample rate: 44.1 kHz&lt;br /&gt;
* Sample size: 16 bit&lt;br /&gt;
* Number of channels: 1 (mono)&lt;br /&gt;
* Encoding: WAV &lt;br /&gt;
&lt;br /&gt;
=== Output Data ===&lt;br /&gt;
&lt;br /&gt;
The structural segmentation algorithms will return the segmentation in an ASCII text file for each input .wav audio file. The specification of this output file is immediately below.&lt;br /&gt;
&lt;br /&gt;
=== Output File Format (Structural Segmentation) ===&lt;br /&gt;
&lt;br /&gt;
The Structural Segmentation output file format is a tab-delimited ASCII text format. This is the same as Chris Harte's chord labelling files (.lab), and so is the same format as the ground truth as well. Onset and offset times are given in seconds, and the labels are simply letters: 'A', 'B', ... with segments referring to the same structural element having the same label.&lt;br /&gt;
&lt;br /&gt;
Three column text file of the format&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;onset_time(sec)&amp;gt;\t&amp;lt;offset_time(sec)&amp;gt;\t&amp;lt;label&amp;gt;\n&lt;br /&gt;
 &amp;lt;onset_time(sec)&amp;gt;\t&amp;lt;offset_time(sec)&amp;gt;\t&amp;lt;label&amp;gt;\n&lt;br /&gt;
 ...&lt;br /&gt;
&lt;br /&gt;
where \t denotes a tab, \n denotes the end of line. The &amp;lt; and &amp;gt; characters are not included. An example output file would look something like:&lt;br /&gt;
&lt;br /&gt;
 0.000    5.223    A&lt;br /&gt;
 5.223    15.101   B&lt;br /&gt;
 15.101   20.334   A&lt;br /&gt;
&lt;br /&gt;
=== Algorithm Calling Format ===&lt;br /&gt;
&lt;br /&gt;
The submitted algorithm must take as arguments a SINGLE .wav file to perform the structural segmentation on as well as the full output path and filename of the output file. The ability to specify the output path and file name is essential. Denoting the input .wav file path and name as %input and the output file path and name as %output, a program called foobar could be called from the command-line as follows:&lt;br /&gt;
&lt;br /&gt;
 foobar %input %output&lt;br /&gt;
 foobar -i %input -o %output&lt;br /&gt;
&lt;br /&gt;
Moreover, if your submission takes additional parameters, foobar could be called like:&lt;br /&gt;
&lt;br /&gt;
 foobar .1 %input %output&lt;br /&gt;
 foobar -param1 .1 -i %input -o %output  &lt;br /&gt;
&lt;br /&gt;
If your submission is in MATLAB, it should be submitted as a function. Once again, the function must contain String inputs for the full path and names of the input and output files. Parameters could also be specified as input arguments of the function. For example: &lt;br /&gt;
&lt;br /&gt;
 foobar('%input','%output')&lt;br /&gt;
 foobar(.1,'%input','%output')&lt;br /&gt;
&lt;br /&gt;
=== README File ===&lt;br /&gt;
&lt;br /&gt;
A README file accompanying each submission should contain explicit instructions on how to run the program (as well as contact information, etc.). In particular, each command line to run should be specified, using %input for the input sound file and %output for the resulting text file.&lt;br /&gt;
&lt;br /&gt;
For instance, to test the program foobar with a specific value for parameter param1, the README file would look like:&lt;br /&gt;
&lt;br /&gt;
 foobar -param1 .1 -i %input -o %output&lt;br /&gt;
&lt;br /&gt;
For a submission using MATLAB, the README file could look like:&lt;br /&gt;
&lt;br /&gt;
 matlab -r &amp;quot;foobar(.1,'%input','%output');quit;&amp;quot;&lt;br /&gt;
&lt;br /&gt;
== Evaluation Procedures ==&lt;br /&gt;
At ISMIR 2008, [http://ismir2008.ismir.net/papers/ISMIR2008_219.pdf Lukashevich] proposed a measure for segmentation evaluation. Because of the complexity of the structural segmentation task definition, several different evaluation measures will be employed to address different aspects. It should be noted that none of the evaluation measures cares about the true labels of the sections: they only denote the clustering. This means that it does not matter whether the systems produce true labels such as &amp;quot;chorus&amp;quot; and &amp;quot;verse&amp;quot;, or arbitrary labels such as &amp;quot;A&amp;quot; and &amp;quot;B&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
=== Boundary retrieval ===&lt;br /&gt;
'''Hit rate''' A found segment boundary is accepted as correct if it is within 0.5s ([http://ismir2007.ismir.net/proceedings/ISMIR2007_p051_turnbull.pdf Turnbull et al. ISMIR2007]) or 3s ([http://dx.doi.org/10.1109/TASL.2007.910781 Levy &amp;amp; Sandler TASLP2008]) of a boundary in the ground truth. Based on the matched hits, ''boundary retrieval recall rate'', ''boundary retrieval precision rate'', and ''boundary retrieval F-measure'' are calculated.&lt;br /&gt;
&lt;br /&gt;
'''Median deviation''' Two median deviation measures between boundaries in the result and the ground truth are calculated: ''median true-to-guess'' is the median time from boundaries in the ground truth to the closest boundaries in the result, and ''median guess-to-true'' is similarly the median time from boundaries in the result to the closest boundaries in the ground truth. ([http://ismir2007.ismir.net/proceedings/ISMIR2007_p051_turnbull.pdf Turnbull et al. ISMIR2007])&lt;br /&gt;
&lt;br /&gt;
=== Frame clustering ===&lt;br /&gt;
Both the result and the ground truth are handled in short frames (e.g., beat or fixed 100ms). All frame pairs in a structure description are handled. The pairs in which both frames are assigned to the same cluster (i.e., have the same label) form the sets &amp;lt;math&amp;gt;P_E&amp;lt;/math&amp;gt; (for the system result) and &amp;lt;math&amp;gt;P_A&amp;lt;/math&amp;gt; (for the ground truth). The ''pairwise precision rate'' can be calculated by &amp;lt;math&amp;gt;P = \frac{|P_E \cap P_A|}{|P_E|}&amp;lt;/math&amp;gt;, ''pairwise recall rate'' by &amp;lt;math&amp;gt;R = \frac{|P_E \cap P_A|}{|P_A|}&amp;lt;/math&amp;gt;, and ''pairwise F-measure'' by &amp;lt;math&amp;gt;F=\frac{2 P R}{P + R}&amp;lt;/math&amp;gt;. ([http://dx.doi.org/10.1109/TASL.2007.910781 Levy &amp;amp; Sandler TASLP2008])&lt;br /&gt;
&lt;br /&gt;
=== Normalised conditional entropies ===&lt;br /&gt;
Over- and under-segmentation based evaluation measures were proposed in [http://ismir2008.ismir.net/papers/ISMIR2008_219.pdf Lukashevich ISMIR2008].&lt;br /&gt;
Structure descriptions are represented as frame sequences with the associated cluster information (similar to the Frame clustering measure). A confusion matrix between the labels in the ground truth and the result is calculated. The matrix C is of size |L_A| * |L_E|, i.e., the number of unique labels in the ground truth times the number of unique labels in the result. From the confusion matrix, the joint distribution is calculated by normalising the values by the total number of frames F:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_{i,j} = C_{i,j} / F&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Similarly, the two marginals are calculated:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_i^a = \sum_{j=1}^{|L_E|} C_{i,j}/F&amp;lt;/math&amp;gt;, and&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_j^e = \sum_{i=1}^{|L_A|} C_{i,j}/F&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Conditional distributions:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_{i,j}^{a|e} = C_{i,j} / \sum_{i=1}^{|L_A|} C_{i,j}&amp;lt;/math&amp;gt;, and&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_{i,j}^{e|a} = C_{i,j} / \sum_{j=1}^{|L_E|} C_{i,j}&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The conditional entropies will then be&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;H(E|A) = - \sum_{i=1}^{|L_A|} p_i^a \sum_{j=1}^{|L_E|} p_{i,j}^{e|a} \log_2(p_{i,j}^{e|a})&amp;lt;/math&amp;gt;, and&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;H(A|E) = - \sum_{j=1}^{|L_E|} p_j^e \sum_{i=1}^{|L_A|} p_{i,j}^{a|e} \log_2(p_{i,j}^{a|e})&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The final evaluation measures will then be the oversegmentation score&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;S_O = 1 - \frac{H(E|A)}{\log_2(|L_E|)}&amp;lt;/math&amp;gt; , and the undersegmentation score&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;S_U = 1 - \frac{H(A|E)}{\log_2(|L_A|)}&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Relevant Development Collections == &lt;br /&gt;
*Jouni Paulus's [http://www.cs.tut.fi/sgn/arg/paulus/structure.html structure analysis page] links to a corpus of 177 Beatles songs ([http://www.cs.tut.fi/sgn/arg/paulus/beatles_sections_TUT.zip zip file]). The Beatles annotations are not a part of the TUTstructure07 dataset. That dataset contains 557 songs, a list of which is available [http://www.cs.tut.fi/sgn/arg/paulus/TUTstructure07_files.html here].&lt;br /&gt;
&lt;br /&gt;
*Ewald Peiszer's [http://www.ifs.tuwien.ac.at/mir/audiosegmentation.html thesis page] links to a portion of the corpus he used: 43 non-Beatles pop songs (including 10 J-pop songs) ([http://www.ifs.tuwien.ac.at/mir/audiosegmentation/dl/ep_groundtruth_excl_Paulus.zip zip file]).&lt;br /&gt;
&lt;br /&gt;
These public corpora give a combined 220 songs.&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=Sandbox&amp;diff=6969</id>
		<title>Sandbox</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=Sandbox&amp;diff=6969"/>
		<updated>2010-05-29T00:38:34Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;math&amp;gt;\alpha&amp;lt;/math&amp;gt;&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=Sandbox&amp;diff=6968</id>
		<title>Sandbox</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=Sandbox&amp;diff=6968"/>
		<updated>2010-05-29T00:28:11Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=Sandbox&amp;diff=6966</id>
		<title>Sandbox</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=Sandbox&amp;diff=6966"/>
		<updated>2010-05-28T20:23:17Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Created page with '&amp;lt;math&amp;gt;\alpha&amp;lt;/math&amp;gt;'&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;math&amp;gt;\alpha&amp;lt;/math&amp;gt;&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010:Structural_Segmentation&amp;diff=6965</id>
		<title>2010:Structural Segmentation</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010:Structural_Segmentation&amp;diff=6965"/>
		<updated>2010-05-28T20:22:02Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: /* Normalised conditional entropies */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Description ==&lt;br /&gt;
&lt;br /&gt;
The aim of the MIREX structural segmentation evaluation is to identify the key structural sections in musical audio. The segment structure (or form) is one of the most important musical parameters. It is furthermore special because musical structure -- especially the conventional sections of popular music (verse, chorus, etc.) -- is accessible to everybody: it needs no particular musical knowledge. This task was first run in 2009.&lt;br /&gt;
&lt;br /&gt;
== Data == &lt;br /&gt;
&lt;br /&gt;
=== Collections ===&lt;br /&gt;
The final MIREX data set for structural segmentation comprises 297 songs. The majority come from the Beatles collection; works from other artists round out the evaluation dataset.&lt;br /&gt;
&lt;br /&gt;
=== Audio Formats ===&lt;br /&gt;
&lt;br /&gt;
* CD-quality (PCM, 16-bit, 44100 Hz)&lt;br /&gt;
* single channel (mono)&lt;br /&gt;
&lt;br /&gt;
== Submission Format ==&lt;br /&gt;
&lt;br /&gt;
Submissions to this task must conform to the format detailed below. Submissions should be packaged and contain at least two files: the algorithm itself and a README containing contact information and detailing, in full, the use of the algorithm.&lt;br /&gt;
&lt;br /&gt;
=== Input Data ===&lt;br /&gt;
Participating algorithms will have to read audio in the following format:&lt;br /&gt;
&lt;br /&gt;
* Sample rate: 44.1 kHz&lt;br /&gt;
* Sample size: 16 bit&lt;br /&gt;
* Number of channels: 1 (mono)&lt;br /&gt;
* Encoding: WAV &lt;br /&gt;
&lt;br /&gt;
=== Output Data ===&lt;br /&gt;
&lt;br /&gt;
The structural segmentation algorithms will return the segmentation in an ASCII text file for each input .wav audio file. The specification of this output file is immediately below.&lt;br /&gt;
&lt;br /&gt;
=== Output File Format (Structural Segmentation) ===&lt;br /&gt;
&lt;br /&gt;
The Structural Segmentation output file format is a tab-delimited ASCII text format. This is the same as Chris Harte's chord labelling files (.lab), and so is the same format as the ground truth as well. Onset and offset times are given in seconds, and the labels are simply letters: 'A', 'B', ... with segments referring to the same structural element having the same label.&lt;br /&gt;
&lt;br /&gt;
Three column text file of the format&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;onset_time(sec)&amp;gt;\t&amp;lt;offset_time(sec)&amp;gt;\t&amp;lt;label&amp;gt;\n&lt;br /&gt;
 &amp;lt;onset_time(sec)&amp;gt;\t&amp;lt;offset_time(sec)&amp;gt;\t&amp;lt;label&amp;gt;\n&lt;br /&gt;
 ...&lt;br /&gt;
&lt;br /&gt;
where \t denotes a tab, \n denotes the end of line. The &amp;lt; and &amp;gt; characters are not included. An example output file would look something like:&lt;br /&gt;
&lt;br /&gt;
 0.000    5.223    A&lt;br /&gt;
 5.223    15.101   B&lt;br /&gt;
 15.101   20.334   A&lt;br /&gt;
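For illustration, the format above can be parsed with a short sketch. Python is used here only as an example (submissions may be in any language), and the helper name is hypothetical, not part of the submission requirements.&lt;br /&gt;

```python
# Hypothetical helper: parse a structural segmentation output file in
# the three-column <onset>\t<offset>\t<label> format described above.
def read_segments(path):
    """Return a list of (onset_sec, offset_sec, label) tuples."""
    segments = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue  # skip blank lines
            # split() tolerates both tabs and aligned spaces
            onset, offset, label = line.split()
            segments.append((float(onset), float(offset), label))
    return segments
```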
&lt;br /&gt;
=== Algorithm Calling Format ===&lt;br /&gt;
&lt;br /&gt;
The submitted algorithm must take as arguments a SINGLE .wav file to perform the structural segmentation on as well as the full output path and filename of the output file. The ability to specify the output path and file name is essential. Denoting the input .wav file path and name as %input and the output file path and name as %output, a program called foobar could be called from the command-line as follows:&lt;br /&gt;
&lt;br /&gt;
 foobar %input %output&lt;br /&gt;
 foobar -i %input -o %output&lt;br /&gt;
&lt;br /&gt;
Moreover, if your submission takes additional parameters, foobar could be called like:&lt;br /&gt;
&lt;br /&gt;
 foobar .1 %input %output&lt;br /&gt;
 foobar -param1 .1 -i %input -o %output  &lt;br /&gt;
&lt;br /&gt;
If your submission is in MATLAB, it should be submitted as a function. Once again, the function must accept string arguments for the full paths and names of the input and output files. Parameters may also be specified as input arguments of the function. For example: &lt;br /&gt;
&lt;br /&gt;
 foobar('%input','%output')&lt;br /&gt;
 foobar(.1,'%input','%output')&lt;br /&gt;
&lt;br /&gt;
=== README File ===&lt;br /&gt;
&lt;br /&gt;
A README file accompanying each submission should contain explicit instructions on how to run the program (as well as contact information, etc.). In particular, each command line to be run should be specified, using %input for the input sound file and %output for the resulting text file.&lt;br /&gt;
&lt;br /&gt;
For instance, to test the program foobar with a specific value for parameter param1, the README file would look like:&lt;br /&gt;
&lt;br /&gt;
 foobar -param1 .1 -i %input -o %output&lt;br /&gt;
&lt;br /&gt;
For a submission using MATLAB, the README file could look like:&lt;br /&gt;
&lt;br /&gt;
 matlab -r &amp;quot;foobar(.1,'%input','%output');quit;&amp;quot;&lt;br /&gt;
&lt;br /&gt;
== Evaluation Procedures ==&lt;br /&gt;
At ISMIR 2008, [http://ismir2008.ismir.net/papers/ISMIR2008_219.pdf Lukashevich] proposed a measure for segmentation evaluation. Because of the complexity of the structural segmentation task definition, several different evaluation measures are employed to address its different aspects. Note that none of the evaluation measures considers the actual section labels: labels only denote the clustering, so it does not matter whether a system produces true labels such as &amp;quot;chorus&amp;quot; and &amp;quot;verse&amp;quot; or arbitrary labels such as &amp;quot;A&amp;quot; and &amp;quot;B&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
=== Boundary retrieval ===&lt;br /&gt;
'''Hit rate''' Found segment boundaries are accepted as correct if they are within 0.5 s ([http://ismir2007.ismir.net/proceedings/ISMIR2007_p051_turnbull.pdf Turnbull et al. ISMIR2007]) or 3 s ([http://dx.doi.org/10.1109/TASL.2007.910781 Levy &amp;amp; Sandler TASLP2008]) of a boundary in the ground truth. Based on the matched hits, the ''boundary retrieval recall rate'', ''boundary retrieval precision rate'', and ''boundary retrieval F-measure'' are calculated.&lt;br /&gt;
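The hit-rate computation can be sketched as follows. This is an illustrative Python sketch, not the official evaluation code; the function name and the greedy one-to-one matching are assumptions.&lt;br /&gt;

```python
# Illustrative sketch: boundary hit rate with a tolerance window.
# Each estimated boundary may match at most one ground-truth boundary.
def boundary_f_measure(est, ref, window=0.5):
    """Precision, recall and F-measure of estimated boundary times `est`
    against reference boundary times `ref`, matched within +/- `window` s."""
    matched = 0
    unused = list(ref)  # ground-truth boundaries not yet matched
    for b in est:
        hits = [r for r in unused if abs(r - b) <= window]
        if hits:
            # consume the closest unmatched ground-truth boundary
            unused.remove(min(hits, key=lambda r: abs(r - b)))
            matched += 1
    precision = matched / len(est) if est else 0.0
    recall = matched / len(ref) if ref else 0.0
    f = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f
```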
&lt;br /&gt;
'''Median deviation''' Two median deviation measures between boundaries in the result and the ground truth are calculated: ''median true-to-guess'' is the median time from boundaries in the ground truth to the closest boundaries in the result, and ''median guess-to-true'' is, conversely, the median time from boundaries in the result to the closest boundaries in the ground truth. ([http://ismir2007.ismir.net/proceedings/ISMIR2007_p051_turnbull.pdf Turnbull et al. ISMIR2007])&lt;br /&gt;
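A minimal sketch of the two median deviation measures (hypothetical helper, Python for illustration only):&lt;br /&gt;

```python
import statistics

# Illustrative sketch: median true-to-guess and guess-to-true deviations.
def median_deviations(est, ref):
    """est, ref: non-empty lists of boundary times in seconds."""
    # for each ground-truth boundary, distance to its closest estimate
    true_to_guess = statistics.median(min(abs(r - b) for b in est) for r in ref)
    # for each estimated boundary, distance to its closest ground truth
    guess_to_true = statistics.median(min(abs(b - r) for r in ref) for b in est)
    return true_to_guess, guess_to_true
```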
&lt;br /&gt;
=== Frame clustering ===&lt;br /&gt;
Both the result and the ground truth are represented as short frames (e.g., beats or fixed 100 ms frames). All pairs of frames within a structure description are considered. The pairs in which both frames are assigned to the same cluster (i.e., have the same label) form the sets P_E (for the system result) and P_A (for the ground truth). The ''pairwise precision rate'' is then &amp;lt;math&amp;gt;P = \frac{|P_E \cap P_A|}{|P_E|}&amp;lt;/math&amp;gt;, the ''pairwise recall rate'' &amp;lt;math&amp;gt;R = \frac{|P_E \cap P_A|}{|P_A|}&amp;lt;/math&amp;gt;, and the ''pairwise F-measure'' &amp;lt;math&amp;gt;F = \frac{2 P R}{P + R}&amp;lt;/math&amp;gt;. ([http://dx.doi.org/10.1109/TASL.2007.910781 Levy &amp;amp; Sandler TASLP2008])&lt;br /&gt;
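The pairwise measures can be sketched as below (illustrative Python only, not the official implementation; frames are assumed to be given as equal-length label sequences for estimate and ground truth):&lt;br /&gt;

```python
from itertools import combinations

# Illustrative sketch: pairwise precision/recall/F over frame labels.
def pairwise_f(est_labels, ref_labels):
    """est_labels, ref_labels: per-frame label sequences of equal length."""
    pairs = list(combinations(range(len(est_labels)), 2))
    # P_E / P_A: frame pairs sharing a label in the estimate / ground truth
    p_e = {(i, j) for i, j in pairs if est_labels[i] == est_labels[j]}
    p_a = {(i, j) for i, j in pairs if ref_labels[i] == ref_labels[j]}
    inter = len(p_e & p_a)
    precision = inter / len(p_e) if p_e else 0.0
    recall = inter / len(p_a) if p_a else 0.0
    f = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f
```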
&lt;br /&gt;
=== Normalised conditional entropies ===&lt;br /&gt;
Over- and undersegmentation based evaluation measures were proposed in [http://ismir2008.ismir.net/papers/ISMIR2008_219.pdf Lukashevich ISMIR2008].&lt;br /&gt;
Structure descriptions are represented as frame sequences with the associated cluster information (as for the frame clustering measure). A confusion matrix between the labels in the ground truth and the result is calculated. The matrix C is of size |L_A| * |L_E|, i.e., the number of unique labels in the ground truth times the number of unique labels in the result. From the confusion matrix, the joint distribution is calculated by normalising the values by the total number of frames F:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_{i,j} = C_{i,j} / F&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Similarly, the two marginals are calculated:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_i^a = \sum_{j=1}^{|L_E|} C_{i,j} / F&amp;lt;/math&amp;gt; , and&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_j^e = \sum_{i=1}^{|L_A|} C_{i,j} / F&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Conditional distributions:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_{i,j}^{a|e} = C_{i,j} / \sum_{i=1}^{|L_A|} C_{i,j}&amp;lt;/math&amp;gt; , and&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;p_{i,j}^{e|a} = C_{i,j} / \sum_{j=1}^{|L_E|} C_{i,j}&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The conditional entropies will then be&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;H(E|A) = - \sum_{i=1}^{|L_A|} p_i^a \sum_{j=1}^{|L_E|} p_{i,j}^{e|a} \log_2(p_{i,j}^{e|a})&amp;lt;/math&amp;gt; , and&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;H(A|E) = - \sum_{j=1}^{|L_E|} p_j^e \sum_{i=1}^{|L_A|} p_{i,j}^{a|e} \log_2(p_{i,j}^{a|e})&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The final evaluation measures will then be the oversegmentation score&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;S_O = 1 - \frac{H(E|A)}{\log_2(|L_E|)}&amp;lt;/math&amp;gt; , and the undersegmentation score&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;S_U = 1 - \frac{H(A|E)}{\log_2(|L_A|)}&amp;lt;/math&amp;gt;&lt;br /&gt;
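The oversegmentation and undersegmentation scores can be computed from the confusion matrix as sketched below (illustrative Python, not the official NEMA code; the function name and list-of-lists matrix representation are assumptions):&lt;br /&gt;

```python
import math

# Illustrative sketch: S_O and S_U from a label confusion matrix C,
# with rows indexing ground-truth labels and columns estimated labels.
def over_under_segmentation(C):
    F = sum(sum(row) for row in C)       # total number of frames
    n_a, n_e = len(C), len(C[0])         # |L_A|, |L_E|
    # H(E|A): entropy of estimated labels given ground-truth labels
    h_ea = 0.0
    for i in range(n_a):
        row_sum = sum(C[i])
        p_i = row_sum / F                # marginal p_i^a
        for j in range(n_e):
            if C[i][j]:
                p_cond = C[i][j] / row_sum   # p_{i,j}^{e|a}
                h_ea -= p_i * p_cond * math.log2(p_cond)
    # H(A|E): entropy of ground-truth labels given estimated labels
    h_ae = 0.0
    for j in range(n_e):
        col_sum = sum(C[i][j] for i in range(n_a))
        p_j = col_sum / F                # marginal p_j^e
        for i in range(n_a):
            if C[i][j]:
                p_cond = C[i][j] / col_sum   # p_{i,j}^{a|e}
                h_ae -= p_j * p_cond * math.log2(p_cond)
    s_o = 1.0 - h_ea / math.log2(n_e) if n_e > 1 else 1.0
    s_u = 1.0 - h_ae / math.log2(n_a) if n_a > 1 else 1.0
    return s_o, s_u
```
A perfectly matching description yields S_O = S_U = 1, while labels spread uniformly across clusters drive both scores toward 0.&lt;br /&gt;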
&lt;br /&gt;
== Relevant Development Collections == &lt;br /&gt;
*Jouni Paulus's [http://www.cs.tut.fi/sgn/arg/paulus/structure.html structure analysis page] links to a corpus of 177 Beatles songs ([http://www.cs.tut.fi/sgn/arg/paulus/beatles_sections_TUT.zip zip file]). The Beatles annotations are not a part of the TUTstructure07 dataset. That dataset contains 557 songs, a list of which is available [http://www.cs.tut.fi/sgn/arg/paulus/TUTstructure07_files.html here].&lt;br /&gt;
&lt;br /&gt;
*Ewald Peiszer's [http://www.ifs.tuwien.ac.at/mir/audiosegmentation.html thesis page] links to a portion of the corpus he used: 43 non-Beatles pop songs (including 10 J-pop songs) ([http://www.ifs.tuwien.ac.at/mir/audiosegmentation/dl/ep_groundtruth_excl_Paulus.zip zip file]).&lt;br /&gt;
&lt;br /&gt;
These public corpora give a combined 220 songs.&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2005&amp;diff=6765</id>
		<title>2005</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2005&amp;diff=6765"/>
		<updated>2010-05-19T17:06:37Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Redirected page to 2005:Main Page&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[2005:Main_Page]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2006&amp;diff=6764</id>
		<title>2006</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2006&amp;diff=6764"/>
		<updated>2010-05-19T17:06:09Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Redirected page to 2006:Main Page&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[2006:Main_Page]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2007&amp;diff=6763</id>
		<title>2007</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2007&amp;diff=6763"/>
		<updated>2010-05-19T17:05:50Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Redirected page to 2007:Main Page&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[2007:Main_Page]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2008&amp;diff=6762</id>
		<title>2008</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2008&amp;diff=6762"/>
		<updated>2010-05-19T17:05:35Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Redirected page to 2008:Main Page&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[2008:Main_Page]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2009&amp;diff=6761</id>
		<title>2009</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2009&amp;diff=6761"/>
		<updated>2010-05-19T16:53:55Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Redirected page to 2009:Main Page&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[2009:Main_Page]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010&amp;diff=6760</id>
		<title>2010</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010&amp;diff=6760"/>
		<updated>2010-05-19T16:53:35Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: Redirected page to 2010:Main Page&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[2010:Main_Page]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=MIREX_HOME&amp;diff=6759</id>
		<title>MIREX HOME</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=MIREX_HOME&amp;diff=6759"/>
		<updated>2010-05-17T22:16:28Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We recently merged all current and previous iterations of the MIREX wiki into a single wiki installation to make it easier to manage. All the pages, images, and abstracts have been migrated, but some links and images may still be broken. We are currently manually inspecting all pages, but would appreciate your help in correcting any errors you see.&lt;br /&gt;
&lt;br /&gt;
Content on the wiki is now organized into mediawiki namespaces, one for each year. You can view the current 2010 content here: [[2010:Main_Page]]&lt;br /&gt;
&lt;br /&gt;
Content from previous years is available similarly:&lt;br /&gt;
* [[2009:Main_Page]]&lt;br /&gt;
* [[2008:Main_Page]]&lt;br /&gt;
* [[2007:Main_Page]]&lt;br /&gt;
* [[2006:Main_Page]]&lt;br /&gt;
* [[2005:Main_Page]]&lt;br /&gt;
&lt;br /&gt;
All links to older wiki content will be redirected to this new wiki, and should take you to the correct page on the new installation, but please update any bookmarks or links you may have which point into current or old wiki content.&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2009:Query-by-Tapping_Results&amp;diff=6674</id>
		<title>2009:Query-by-Tapping Results</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2009:Query-by-Tapping_Results&amp;diff=6674"/>
		<updated>2010-05-14T03:39:06Z</updated>

		<summary type="html">&lt;p&gt;CameronJones: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
These are the results for the 2009 running of the Query-by-Tapping task. For background information about this task set please refer to the [[2009:Query by Tapping]] page. &lt;br /&gt;
&lt;br /&gt;
===Task Descriptions===&lt;br /&gt;
&lt;br /&gt;
'''Task 1 [[#Task 1 Results|Goto Task 1 Results]]''': The first subtask is the same as last year. In this subtask, submitted systems take a symbolic sung query as input and return a list of songs from the test database. Mean reciprocal rank (MRR) of the ground truth, as well as the simple hit(1)/miss(0) counting, is calculated over the top 10 returns. Two data sets are used:&lt;br /&gt;
&lt;br /&gt;
* [[#Task 1a, Jang's Dataset Results|Jang's Dataset]] Roger Jang's [http://neural.cs.nthu.edu.tw/jang2/dataSet/qbt4public/MIR-QBT.rar MIR-QBT]: This dataset contains both wav files (recorded via microphone) and onset files (human-labeled onset time).  136 ground truth songs with 890 queries. &lt;br /&gt;
* [[#Task 1b, Hsiao's Dataset Results|Hsiao's Dataset]] Show Hsiao's [http://neural.cs.nthu.edu.tw/jang2/dataSet/qbt4public/QBT_symbolic.rar QBT_symbolic]: This dataset contains only onset files (obtained from the user's tapping on keyboard). 143 ground truth songs with 410 queries. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Task 2 [[#Task 2 Results|Goto Task 2 Results]]''': The second subtask is also the same as last year. In this subtask, submitted systems take a wave-file query as input and return a list of songs from the test database. Mean reciprocal rank (MRR) of the ground truth, as well as the simple hit(1)/miss(0) counting, is calculated over the top 10 returns. Only Roger Jang's [http://neural.cs.nthu.edu.tw/jang2/dataSet/qbt4public/MIR-QBT.rar MIR-QBT] is used, as the other dataset has no wave files.&lt;br /&gt;
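The MRR and hit/miss statistics used in both subtasks can be sketched as follows (illustrative Python only; the function and variable names are hypothetical, not the evaluation scripts actually used):&lt;br /&gt;

```python
# Illustrative sketch: mean reciprocal rank and hit rate over the
# top-10 returns. `results` holds one ranked candidate list per
# query; `truth` holds the correct song for each query.
def mrr_and_hits(results, truth, top_n=10):
    rr_sum = 0.0
    hits = 0
    for ranked, correct in zip(results, truth):
        top = ranked[:top_n]
        if correct in top:
            rr_sum += 1.0 / (top.index(correct) + 1)  # reciprocal rank
            hits += 1                                 # hit(1)/miss(0)
    n = len(truth)
    return rr_sum / n, hits / n
```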
&lt;br /&gt;
===General Legend===&lt;br /&gt;
====Team ID====&lt;br /&gt;
'''CSJ''' = [https://www.music-ir.org/mirex/2009/results/qbt/QbtChenJang.pdf Chun-Ta Chen and Jyh-Shing Roger Jang]&amp;lt;br/&amp;gt;&lt;br /&gt;
'''HAFR''' = [https://www.music-ir.org/mirex/2009/results/qbt/HAFR.pdf Pierre Hanna, Julien Allali, Pascal Ferraro, Matthias Robine]&amp;lt;br/&amp;gt;&lt;br /&gt;
'''HL''' = [https://www.music-ir.org/mirex/2009/results/qbt/QBT_Show.pdf Shu-Jen Show Hsiao, Tyne Liang]&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Task 1 Results===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Task 1a, Jang's Dataset Results===&lt;br /&gt;
&lt;br /&gt;
====Task 1a Overall Results====&lt;br /&gt;
&amp;lt;csv&amp;gt;2009/qbt/QbtFinalTask1a.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
====Task 1a Friedman's Test for Significant Differences====&lt;br /&gt;
The Friedman test was run in MATLAB against the QBT Task 1a MRR data over the 48 ground truth song groups.&lt;br /&gt;
Command: [c,m,h,gnames] = multcompare(stats, 'ctype', 'tukey-kramer','estimate', 'friedman', 'alpha', 0.05);&lt;br /&gt;
&lt;br /&gt;
Simple Hit/Miss Count:&lt;br /&gt;
&amp;lt;csv&amp;gt;2009/qbt/friedman/QbtTask1aSimple_friedman_tukeyKramerHSD.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Image:2009_sqbttask1asimple_friedman_mean_ranks.png]]&lt;br /&gt;
&lt;br /&gt;
MRR Method:&lt;br /&gt;
&amp;lt;csv&amp;gt;2009/qbt/friedman/Qbt1aTask2Mrr_friedman_tukeyKramerHSD.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Image:2009_sqbt1atask2mrr_friedman_mean_ranks.png]]&lt;br /&gt;
&lt;br /&gt;
====Task 1a Summary Results by Query Group====&lt;br /&gt;
Simple Hit/Miss Counting&lt;br /&gt;
&amp;lt;csv p=2&amp;gt;2009/qbt/QbtTask1aSimpleByGroup.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
MRR Method&lt;br /&gt;
&amp;lt;csv p=2&amp;gt;2009/qbt/QbtTask1aMrrByGroup.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
====Task 1a Summary Results by Query ====&lt;br /&gt;
Simple Hit/Miss Counting&lt;br /&gt;
[https://www.music-ir.org/mirex/2009/results/qbt/QbtTask1aSimpleByQuery.csv]&lt;br /&gt;
&lt;br /&gt;
MRR Method&lt;br /&gt;
[https://www.music-ir.org/mirex/2009/results/qbt/QbtTask1aMrrByQuery.csv]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Task 1b, Hsiao's Dataset Results===&lt;br /&gt;
&lt;br /&gt;
====Task 1b Overall Results====&lt;br /&gt;
&amp;lt;csv p=2&amp;gt;2009/qbt/QbtFinalTask1b.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
====Task 1b Friedman's Test for Significant Differences====&lt;br /&gt;
The Friedman test was run in MATLAB against the QBT Task 1b MRR data over the 48 ground truth song groups.&lt;br /&gt;
Command: [c,m,h,gnames] = multcompare(stats, 'ctype', 'tukey-kramer','estimate', 'friedman', 'alpha', 0.05);&lt;br /&gt;
&lt;br /&gt;
Simple Hit/Miss Count:&lt;br /&gt;
&amp;lt;csv&amp;gt;2009/qbt/friedman/QbtTask1bSimple_friedman_tukeyKramerHSD.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Image:2009_sqbttask1bsimple_friedman_mean_ranks.png]]&lt;br /&gt;
&lt;br /&gt;
MRR Method:&lt;br /&gt;
&amp;lt;csv&amp;gt;2009/qbt/friedman/QbtTask1bMrr_friedman_tukeyKramerHSD.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Image:2009_sqbttask1bmrr_friedman_mean_ranks.png ]]&lt;br /&gt;
&lt;br /&gt;
====Task 1b Summary Results by Query Group====&lt;br /&gt;
Simple Hit/Miss Counting&lt;br /&gt;
&amp;lt;csv p=2&amp;gt;2009/qbt/QbtTask1bSimpleByGroup.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
MRR Method&lt;br /&gt;
&amp;lt;csv p=2&amp;gt;2009/qbt/QbtTask1bMrrByGroup.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
====Task 1b Summary Results by Query ====&lt;br /&gt;
Simple Hit/Miss Counting&lt;br /&gt;
[https://www.music-ir.org/mirex/2009/results/qbt/QbtTask1bSimpleByQuery.csv]&lt;br /&gt;
&lt;br /&gt;
MRR Method&lt;br /&gt;
[https://www.music-ir.org/mirex/2009/results/qbt/QbtTask1bMrrByQuery.csv]&lt;br /&gt;
&lt;br /&gt;
===Task 2 Results===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
====Task 2 Overall Results====&lt;br /&gt;
&amp;lt;csv&amp;gt;2009/qbt/QbtFinalTask2.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
====Task 2 Friedman's Test for Significant Differences====&lt;br /&gt;
The Friedman test was run in MATLAB against the QBT Task 2 MRR data over the 48 ground truth song groups.&lt;br /&gt;
Command: [c,m,h,gnames] = multcompare(stats, 'ctype', 'tukey-kramer','estimate', 'friedman', 'alpha', 0.05);&lt;br /&gt;
&lt;br /&gt;
Simple Hit/Miss Count:&lt;br /&gt;
&amp;lt;csv&amp;gt;2009/qbt/friedman/QbtTask2Simple_friedman_tukeyKramerHSD.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Image:2009_sqbttask2simple_friedman_mean_ranks.png]]&lt;br /&gt;
&lt;br /&gt;
MRR Method:&lt;br /&gt;
&amp;lt;csv&amp;gt;2009/qbt/friedman/QbtTask2Mrr_friedman_tukeyKramerHSD.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Image:2009_sqbttask2mrr_friedman_mean_ranks.png]]&lt;br /&gt;
&lt;br /&gt;
====Task 2 Summary Results by Query Group====&lt;br /&gt;
Simple Hit/Miss Counting&lt;br /&gt;
&amp;lt;csv p=2&amp;gt;2009/qbt/QbtTask2SimpleByGroup.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
MRR Method&lt;br /&gt;
&amp;lt;csv p=2&amp;gt;2009/qbt/QbtTask2MrrByGroup.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
====Task 2 Summary Results by Query ====&lt;br /&gt;
Simple Hit/Miss Counting&lt;br /&gt;
[https://www.music-ir.org/mirex/2009/results/qbt/QbtTask2SimpleByQuery.csv]&lt;br /&gt;
&lt;br /&gt;
MRR Method&lt;br /&gt;
[https://www.music-ir.org/mirex/2009/results/qbt/QbtTask2MrrByQuery.csv]&lt;br /&gt;
&lt;br /&gt;
===Runtime Results===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;csv&amp;gt;2009/qbt/qbtRunTime.csv&amp;lt;/csv&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[Category: Results]]&lt;/div&gt;</summary>
		<author><name>CameronJones</name></author>
		
	</entry>
</feed>