<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://music-ir.org/mirex/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Karishma</id>
	<title>MIREX Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://music-ir.org/mirex/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Karishma"/>
	<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/wiki/Special:Contributions/Karishma"/>
	<updated>2026-04-13T20:12:09Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.31.1</generator>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010:Symbolic_Similarity_2010_Graders&amp;diff=7226</id>
		<title>2010:Symbolic Similarity 2010 Graders</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010:Symbolic_Similarity_2010_Graders&amp;diff=7226"/>
		<updated>2010-07-12T16:47:05Z</updated>

		<summary type="html">&lt;p&gt;Karishma: Created page with '=SMS 2010 Graders=  Welcome to the SMS grader sign-up page. Please give us your name and email contact information. If you obscure your email, please make it relatively obvious t…'&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=SMS 2010 Graders=&lt;br /&gt;
&lt;br /&gt;
Welcome to the SMS grader sign-up page. Please give us your name and email contact information. If you obscure your email, please make it relatively obvious to us how to parse the address.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Template:&amp;lt;/b&amp;gt; Name. Location. &amp;lt;Email&amp;gt; &lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Sample:&amp;lt;/b&amp;gt; J. Stephen Downie. Illinois, USA. &amp;lt;jdownie@illinois.edu&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Sign Up Area==&lt;/div&gt;</summary>
		<author><name>Karishma</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010:Audio_Similarity_2010_Graders&amp;diff=7225</id>
		<title>2010:Audio Similarity 2010 Graders</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010:Audio_Similarity_2010_Graders&amp;diff=7225"/>
		<updated>2010-07-12T16:44:30Z</updated>

		<summary type="html">&lt;p&gt;Karishma: /* AMS 2009 Graders */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=AMS 2010 Graders=&lt;br /&gt;
&lt;br /&gt;
Welcome to the AMS grader sign-up page. Please give us your name and email contact information. If you obscure your email, please make it relatively obvious to us how to parse the address.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Template:&amp;lt;/b&amp;gt; Name. Location. &amp;lt;Email&amp;gt; &lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Sample:&amp;lt;/b&amp;gt; J. Stephen Downie. Illinois, USA. &amp;lt;jdownie@illinois.edu&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Sign Up Area==&lt;/div&gt;</summary>
		<author><name>Karishma</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2010:Audio_Similarity_2010_Graders&amp;diff=7224</id>
		<title>2010:Audio Similarity 2010 Graders</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2010:Audio_Similarity_2010_Graders&amp;diff=7224"/>
		<updated>2010-07-12T16:44:06Z</updated>

		<summary type="html">&lt;p&gt;Karishma: Created page with '=AMS 2009 Graders=  Welcome to the AMS grader sign-up page. Please give us your name and email contact information. If you obscure your email, please make it relatively obvious t…'&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=AMS 2009 Graders=&lt;br /&gt;
&lt;br /&gt;
Welcome to the AMS grader sign-up page. Please give us your name and email contact information. If you obscure your email, please make it relatively obvious to us how to parse the address.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Template:&amp;lt;/b&amp;gt; Name. Location. &amp;lt;Email&amp;gt; &lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Sample:&amp;lt;/b&amp;gt; J. Stephen Downie. Illinois, USA. &amp;lt;jdownie@illinois.edu&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Sign Up Area==&lt;/div&gt;</summary>
		<author><name>Karishma</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2006:Audio_Beat_Tracking&amp;diff=6785</id>
		<title>2006:Audio Beat Tracking</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2006:Audio_Beat_Tracking&amp;diff=6785"/>
		<updated>2010-05-20T19:34:32Z</updated>

		<summary type="html">&lt;p&gt;Karishma: /* References */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Results ==&lt;br /&gt;
&lt;br /&gt;
Results are on [[2006:Audio Beat Tracking Results]] page. &lt;br /&gt;
&lt;br /&gt;
'''NOTE:''' Due to an evaluation error, the results of the Audio Beat Tracking task have been updated as of 17 July 2007, and differ from those presented at ISMIR 2006.&lt;br /&gt;
&lt;br /&gt;
== Proposers ==&lt;br /&gt;
&lt;br /&gt;
* Paul M. Brossier (Queen Mary, University of London) &amp;lt;piem at altern.org&amp;gt;&lt;br /&gt;
* Matthew Davies (Queen Mary, University of London) &amp;lt;matthew.davies at elec.qmul.ac.uk&amp;gt;&lt;br /&gt;
* Martin F. McKinney (Philips) &amp;lt;mckinney at alum.mit.edu&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Description ==&lt;br /&gt;
&lt;br /&gt;
The aim of the automatic beat tracking task is to track the location of every beat in a collection of sound files. Unlike the [[2006:Audio Tempo Extraction]] task, whose aim is to detect the tempo of each file, the beat tracking task aims at detecting all beat locations in the recordings. The algorithms will be evaluated in terms of their accuracy in predicting beat locations annotated by a group of listeners. &lt;br /&gt;
&lt;br /&gt;
=== Input data ===&lt;br /&gt;
&lt;br /&gt;
''Audio Format'':&lt;br /&gt;
&lt;br /&gt;
The sound files are the same 160 30-second excerpts (WAV format) used for the Audio Tempo contest.  Beat locations have been annotated in each excerpt by 40 different listeners (39 listeners for a few excerpts).&lt;br /&gt;
&lt;br /&gt;
''Audio Content'':&lt;br /&gt;
&lt;br /&gt;
The audio recordings were selected to provide a stable tempo value, a wide distribution of tempi, and a large variety of instrumentation and musical styles. About 20% of the files contain non-binary meters, and a small number of examples contain changing meters.  One disadvantage of using this set for beat tracking is that the tempi are rather stable, so the set will not test beat-tracking algorithms' ability to track tempo changes.&lt;br /&gt;
&lt;br /&gt;
=== Output data ===&lt;br /&gt;
&lt;br /&gt;
Submitted programs should output one beat location per line, with a newline character (\n) at the end of each line. The results should be saved to a text file.&lt;br /&gt;
&lt;br /&gt;
Example of possible output:&lt;br /&gt;
&lt;br /&gt;
0.0123156&amp;lt;br&amp;gt;&lt;br /&gt;
1.9388662&amp;lt;br&amp;gt;&lt;br /&gt;
3.8777323&amp;lt;br&amp;gt;&lt;br /&gt;
5.8165980&amp;lt;br&amp;gt;&lt;br /&gt;
7.7554634&amp;lt;br&amp;gt;&lt;br /&gt;
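As a sketch of this output format, the following hypothetical Python snippet writes detected beat times one per line, each terminated by a newline (the function name and the seven-digit precision are illustrative choices, not part of the task specification):&lt;br /&gt;

```python
# Hypothetical sketch, not official MIREX code: write detected beat
# times (in seconds) to a text file, one per line, newline-terminated.
def write_beats(beat_times, path):
    with open(path, "w") as f:
        for t in beat_times:
            f.write(f"{t:.7f}\n")
```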
&lt;br /&gt;
Each submission should be accompanied with a README file describing how the program should be used. For instance:&lt;br /&gt;
&lt;br /&gt;
To run the program ''foobar'' on the file input.wav and store the results in the file output.txt, the following command should be used:&lt;br /&gt;
&lt;br /&gt;
  foobar -i input.wav &amp;gt; output.txt&lt;br /&gt;
&lt;br /&gt;
== Participants ==&lt;br /&gt;
&lt;br /&gt;
* Miguel Alonso and Gaël Richard (ENST, Paris), &amp;lt;miguel.alonso at enst.fr&amp;gt;, &amp;lt;gael.richard at enst.fr&amp;gt; ''(to be confirmed)''&lt;br /&gt;
* Paul Brossier (Queen Mary, University of London), &amp;lt;piem at altern.org&amp;gt;&lt;br /&gt;
* Matthew Davies (Queen Mary, University of London), &amp;lt;matthew.davies at elec.qmul.ac.uk&amp;gt;&lt;br /&gt;
* Douglas Eck (University of Montreal), &amp;lt;eckdoug at iro.umontreal.ca&amp;gt;&lt;br /&gt;
* Geoffroy Peeters (IRCAM, Paris), &amp;lt;peeters at ircam.fr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Other potential participants:&lt;br /&gt;
&lt;br /&gt;
* Fabien Gouyon (University Pompeu Fabra) and Simon Dixon (OFAI), &amp;lt;fabien.gouyon at iua.upf.es&amp;gt;, &amp;lt;simon at oefai.at&amp;gt;&lt;br /&gt;
* Anssi Klapuri (Tampere International Center for Signal Processing, Finland), &amp;lt;klap at cs.tut.fi&amp;gt;&lt;br /&gt;
* Martin F. McKinney (Philips) &amp;lt;mckinney at alum.mit.edu&amp;gt;&lt;br /&gt;
* Dirk Moelants (IPEM, Ghent University) &amp;lt;dirk at moelants.net&amp;gt;&lt;br /&gt;
* Bill Sethares (University of Wisconsin-Madison), &amp;lt;sethares at ece.wisc.edu&amp;gt;&lt;br /&gt;
* George Tzanetakis (University of Victoria), &amp;lt;gtzan at cs.uvic.ca&amp;gt;&lt;br /&gt;
* Christian Uhle (Fraunhofer Institut), &amp;lt;uhle at idmt.fhg.de&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Evaluation Procedures ==&lt;br /&gt;
&lt;br /&gt;
''This is a major re-write by Martin McKinney and is open to suggestions.''&lt;br /&gt;
&lt;br /&gt;
Evaluation of beat tracking includes an implicit evaluation of tempo accuracy; however, the focus here will be on the proper time position of beats.  We propose the following evaluation method, which is quite simple in nature and accounts for ambiguity in the perception of the most salient metrical level.  For each excerpt, an impulse train will be created from each of the 40 annotated ground truth beat vectors as well as from the algorithm output.  The impulse trains will be 25 seconds long (ignoring tapped beats at times less than 5 seconds), constructed with a 100-Hz sampling rate, and have unit impulses at beat times.  Each impulse train of annotations will be denoted by &amp;lt;math&amp;gt;a_s[n]&amp;lt;/math&amp;gt;, where the subscript &amp;lt;math&amp;gt;s&amp;lt;/math&amp;gt; is the annotator number (1-40), and the impulse train from the algorithm will be denoted by &amp;lt;math&amp;gt;y[n]&amp;lt;/math&amp;gt;.  The performance, &amp;lt;math&amp;gt;P&amp;lt;/math&amp;gt;, of the beat-tracking algorithm for a single excerpt will be measured by calculating the cross-correlation function of &amp;lt;math&amp;gt;a_s[n]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;y[n]&amp;lt;/math&amp;gt; within a small delay window, &amp;lt;math&amp;gt;W&amp;lt;/math&amp;gt;, around zero, averaged across the number of annotators (&amp;lt;math&amp;gt;S&amp;lt;/math&amp;gt;):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;P = \frac{1}{S}\sum_{s=1}^{S}\frac{1}{NP}\sum_{m=-W}^{+W}{\sum_{n=1}^{N}{y[n] \cdot a_s[n-m]}}&amp;lt;/math&amp;gt;,&lt;br /&gt;
&lt;br /&gt;
where &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; is the sample-length of pulse trains &amp;lt;math&amp;gt;y[n]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;a_s[n]&amp;lt;/math&amp;gt;, and NP is a normalization factor defined by the maximum number of impulses in either impulse train:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;NP = \mbox{max}(\sum{y[n]},\sum{a_s[n]})&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The &amp;quot;error&amp;quot; window, W, is proportional to the beat period in the annotated taps (1/5 of the median inter-beat interval) and is defined (in Matlab notation ;-) as:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;math&amp;gt;W&amp;lt;/math&amp;gt; = round(0.2 * median(diff(find(a_s[n])))).&lt;br /&gt;
&lt;br /&gt;
The algorithm with the best average P-score (across excerpts) will win.&lt;br /&gt;
&lt;br /&gt;
''The choice of 1/5 of the beat was somewhat arbitrarily chosen and is open for discussion.  I've used this method to examine correlations between taps of different subjects and it works quite well.  Comments please.  -Martin''&lt;br /&gt;
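To make the proposed procedure concrete, here is a hypothetical NumPy sketch of the P-score computation described above. The 100-Hz sampling rate, 25-second trains, 5-second skip, and the W and NP definitions follow the text; the function itself is an illustration, not the official MIREX evaluation code.&lt;br /&gt;

```python
import numpy as np

# Hypothetical sketch of the proposed P-score, not the official code.
# beat_times: algorithm output (seconds); annotations: list of per-annotator
# beat-time sequences (seconds).
def p_score(beat_times, annotations, fs=100, length=25.0, skip=5.0):
    n = int(length * fs)

    def pulse_train(times):
        # unit impulses at beat times, ignoring beats tapped before 5 s
        train = np.zeros(n)
        for t in times:
            idx = int(round((t - skip) * fs))
            if idx in range(n):
                train[idx] = 1.0
        return train

    y = pulse_train(beat_times)
    scores = []
    for ann in annotations:
        a = pulse_train(ann)
        # error window W: 1/5 of the median inter-beat interval (samples)
        w = int(round(0.2 * np.median(np.diff(np.flatnonzero(a)))))
        # normalization NP: the larger pulse count of the two trains
        np_norm = max(y.sum(), a.sum())
        # cross-correlation of y and a_s, summed over lags -W..+W
        corr = np.correlate(y, a, mode="full")
        windowed = corr[n - 1 - w : n + w].sum()
        scores.append(windowed / np_norm)
    # average across annotators
    return float(np.mean(scores))
```

For a perfectly matching tapper (or a small offset within W), this returns 1.0; misses and spurious beats lower the score through the NP normalization.&lt;br /&gt;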
&lt;br /&gt;
== Evaluation Database ==&lt;br /&gt;
&lt;br /&gt;
A collection of 160 musical excerpts will be used for the evaluation procedure, the same collection used for the [[2006:Audio Tempo Extraction]] contest. Each recording has been annotated by 40 different listeners (39 in a few cases). The annotation procedures are described in [2] and [3].&lt;br /&gt;
&lt;br /&gt;
Twenty excerpts will be provided to the participants for training; the remaining 140 excerpts, novel to all participants, will be used for the contest.&lt;br /&gt;
&lt;br /&gt;
== References == &lt;br /&gt;
&lt;br /&gt;
# Masataka Goto and Yoichi Muraoka. Issues in evaluating beat tracking systems. In Working Notes of IJCAI-97 Workshop on Issues in AI and Music - Evaluation and Assessment, pages 9–16, 1997 [http://staff.aist.go.jp/m.goto/PAPER/AIM97.300dpi.ps postscript]&lt;br /&gt;
# McKinney, M.F. and Moelants, D. (2004), Deviations from the resonance theory of tempo induction, Conference on Interdisciplinary Musicology, Graz. [http://gewi.kfunigraz.ac.at/~cim04/CIM04_paper_pdf/McKinney_Moelants_CIM04_proceedings_t.pdf pdf]&lt;br /&gt;
# Moelants, D. and McKinney, M.F. (2004), Tempo perception and musical content: What makes a piece slow, fast, or temporally ambiguous? International Conference on Music Perception &amp;amp; Cognition, Evanston, IL. [http://www.northwestern.edu/icmpc/proceedings/ICMPC8/PDF/AUTHOR/MP040237.pdf pdf]&lt;br /&gt;
&lt;br /&gt;
== Comments ==&lt;br /&gt;
&lt;br /&gt;
Paul's comments: as noted off-list by Matthew Davies, the evaluation metrics proposed by Goto [1] for assessing beat tracking accuracy are somewhat difficult to apply to all beat tracking algorithms without modification, since they assume the algorithms stabilise on a robust tempo value only after 45 seconds. Even after removing this 45-second constraint, the four different metrics obtained by this method are somewhat difficult to interpret.&lt;br /&gt;
&lt;br /&gt;
==Practice Data==&lt;br /&gt;
You can find it here:&lt;br /&gt;
&lt;br /&gt;
https://www.music-ir.org/evaluation/MIREX/data/2006/beat/&lt;br /&gt;
&lt;br /&gt;
User: beattrack Password: b34trx&lt;br /&gt;
&lt;br /&gt;
https://www.music-ir.org/evaluation/MIREX/data/2006/tempo/&lt;br /&gt;
&lt;br /&gt;
User: tempo Password: t3mp0&lt;br /&gt;
&lt;br /&gt;
Data has been uploaded in both .tgz and .zip format.&lt;/div&gt;</summary>
		<author><name>Karishma</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2009:MIREX_2009_Submission_Instructions&amp;diff=6784</id>
		<title>2009:MIREX 2009 Submission Instructions</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2009:MIREX_2009_Submission_Instructions&amp;diff=6784"/>
		<updated>2010-05-20T18:37:52Z</updated>

		<summary type="html">&lt;p&gt;Karishma: /* Extended Abstract Details */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
This is the official set of submission guidelines for MIREX 2009. The URL for the MIREX 2009 Submission System can be found at the bottom of this page. We are deliberately placing the Submission URL at the bottom of this page to force folks to read thoroughly through this MIREX 2009 instruction set.&lt;br /&gt;
&lt;br /&gt;
==Step-By-Step Instructions==&lt;br /&gt;
# Read this wiki page from top to bottom.&lt;br /&gt;
# Go read Andreas Ehmann's [[2009:Best Coding Practices for MIREX]].&lt;br /&gt;
# Go read the task-specific instructions for each task to which you will be submitting. Links to each task are available off the [[2009:Main Page|MIREX 2009]] main wiki page.&lt;br /&gt;
# For each submission, be sure you understand the Input/Output format issues as laid down by the task organizers.&lt;br /&gt;
# For each submission, be absolutely sure you have constructed a README.txt file that:&lt;br /&gt;
## Begins with submitter name(s) and contact information (email, etc.).&lt;br /&gt;
## Meets whatever task-specific requirements were laid down by the task organizers.&lt;br /&gt;
## Provides the team at IMIRSEL with a set of''' clear and complete''' instructions on how to run your program.&lt;br /&gt;
# Create a first-draft extended abstract that outlines the basic methods and techniques being deployed in your submission. See Extended Abstract Details below.&lt;br /&gt;
# Review, one last time, all the various submission requirements.&lt;br /&gt;
# Go to the MIREX 2009 Submission System page (URL at bottom of this page).&lt;br /&gt;
# Follow the instructions provided on the MIREX 2009 Submission System page.&lt;br /&gt;
# Submit your spiffy code bundle (don't forget the README.txt file) and your first-pass extended abstract!&lt;br /&gt;
&lt;br /&gt;
==Extended Abstract Details==&lt;br /&gt;
The extended abstracts provide the outside world with a general understanding of what each submission is trying to accomplish. The extended abstracts need NOT be cutting edge/never-before-published materials. The extended abstracts will be revised by the authors after the data has been collected (to allow for commentary on results data); however, we at IMIRSEL still need the first-pass drafts at submission time to help us understand what is happening in the submission. Like last year, we will post the final versions of the extended abstracts as part of the MIREX 2009 results page (see MIREX 2005 Results; https://www.music-ir.org/evaluation/mirex-results/).&lt;br /&gt;
&lt;br /&gt;
The MIREX 2009 extended abstracts:&lt;br /&gt;
# Are two to four pages long (think ISMIR poster/demo papers).&lt;br /&gt;
# Must be created using the ISMIR 2009 template set: http://ismir2009.ismir.net/submit .&lt;br /&gt;
# Must be submitted in PDF format.&lt;br /&gt;
# Should include, if they exist, references to other publications about your work (yes, self-reference is encouraged!)&lt;br /&gt;
# Should have the same general look and feel as these examples from last year: &lt;br /&gt;
&lt;br /&gt;
* https://www.music-ir.org/evaluation/mirex-results/articles/audio_genre/west.pdf&lt;br /&gt;
* https://www.music-ir.org/evaluation/mirex-results/articles/audio_genre/pampalk.pdf&lt;br /&gt;
&lt;br /&gt;
=MIREX 2009 Submission System URL=&lt;br /&gt;
&lt;br /&gt;
[https://www.music-ir.org/evaluation/MIREX/submission/ https://www.music-ir.org/evaluation/MIREX/submission/]&lt;/div&gt;</summary>
		<author><name>Karishma</name></author>
		
	</entry>
	<entry>
		<id>https://music-ir.org/mirex/w/index.php?title=2009:MIREX_2009_Submission_Instructions&amp;diff=6783</id>
		<title>2009:MIREX 2009 Submission Instructions</title>
		<link rel="alternate" type="text/html" href="https://music-ir.org/mirex/w/index.php?title=2009:MIREX_2009_Submission_Instructions&amp;diff=6783"/>
		<updated>2010-05-20T18:36:38Z</updated>

		<summary type="html">&lt;p&gt;Karishma: /* Extended Abstract Details */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Introduction==&lt;br /&gt;
This is the official set of submission guidelines for MIREX 2009. The URL for the MIREX 2009 Submission System can be found at the bottom of this page. We are deliberately placing the Submission URL at the bottom of this page to force folks to read thoroughly through this MIREX 2009 instruction set.&lt;br /&gt;
&lt;br /&gt;
==Step-By-Step Instructions==&lt;br /&gt;
# Read this wiki page from top to bottom.&lt;br /&gt;
# Go read Andreas Ehmann's [[2009:Best Coding Practices for MIREX]].&lt;br /&gt;
# Go read the task-specific instructions for each task to which you will be submitting. Links to each task are available off the [[2009:Main Page|MIREX 2009]] main wiki page.&lt;br /&gt;
# For each submission, be sure you understand the Input/Output format issues as laid down by the task organizers.&lt;br /&gt;
# For each submission, be absolutely sure you have constructed a README.txt file that:&lt;br /&gt;
## Begins with submitter name(s) and contact information (email, etc.).&lt;br /&gt;
## Meets whatever task-specific requirements were laid down by the task organizers.&lt;br /&gt;
## Provides the team at IMIRSEL with a set of''' clear and complete''' instructions on how to run your program.&lt;br /&gt;
# Create a first-draft extended abstract that outlines the basic methods and techniques being deployed in your submission. See Extended Abstract Details below.&lt;br /&gt;
# Review, one last time, all the various submission requirements.&lt;br /&gt;
# Go to the MIREX 2009 Submission System page (URL at bottom of this page).&lt;br /&gt;
# Follow the instructions provided on the MIREX 2009 Submission System page.&lt;br /&gt;
# Submit your spiffy code bundle (don't forget the README.txt file) and your first-pass extended abstract!&lt;br /&gt;
&lt;br /&gt;
==Extended Abstract Details==&lt;br /&gt;
The extended abstracts provide the outside world with a general understanding of what each submission is trying to accomplish. The extended abstracts need NOT be cutting edge/never-before-published materials. The extended abstracts will be revised by the authors after the data has been collected (to allow for commentary on results data); however, we at IMIRSEL still need the first-pass drafts at submission time to help us understand what is happening in the submission. Like last year, we will post the final versions of the extended abstracts as part of the MIREX 2009 results page (see MIREX 2005 Results; https://www.music-ir.org/evaluation/mirex-results/).&lt;br /&gt;
&lt;br /&gt;
The MIREX 2009 extended abstracts:&lt;br /&gt;
# Are two to four pages long (think ISMIR poster/demo papers).&lt;br /&gt;
# Must be created using the ISMIR 2009 template set: http://www.ismir2009.ismir.net/submit .&lt;br /&gt;
# Must be submitted in PDF format.&lt;br /&gt;
# Should include, if they exist, references to other publications about your work (yes, self-reference is encouraged!)&lt;br /&gt;
# Should have the same general look and feel as these examples from last year: &lt;br /&gt;
&lt;br /&gt;
* https://www.music-ir.org/evaluation/mirex-results/articles/audio_genre/west.pdf&lt;br /&gt;
* https://www.music-ir.org/evaluation/mirex-results/articles/audio_genre/pampalk.pdf&lt;br /&gt;
&lt;br /&gt;
=MIREX 2009 Submission System URL=&lt;br /&gt;
&lt;br /&gt;
[https://www.music-ir.org/evaluation/MIREX/submission/ https://www.music-ir.org/evaluation/MIREX/submission/]&lt;/div&gt;</summary>
		<author><name>Karishma</name></author>
		
	</entry>
</feed>