2012:MIREX Home

From MIREX Wiki

Latest revision as of 04:31, 14 March 2013

Welcome to MIREX 2012

This is the main page for the eighth running of the Music Information Retrieval Evaluation eXchange (MIREX 2012). The International Music Information Retrieval Systems Evaluation Laboratory (IMIRSEL) at the Graduate School of Library and Information Science (GSLIS), University of Illinois at Urbana-Champaign (UIUC) is the principal organizer of MIREX 2012.

The MIREX 2012 community will hold its annual meeting as part of the 13th International Society for Music Information Retrieval Conference, ISMIR 2012, which will be held in Porto, Portugal, 8-12 October 2012. The MIREX plenary and poster sessions will be held on Friday, October 12, during the conference.

J. Stephen Downie
Director, IMIRSEL

Announcing the KETI/Illinois K-MIREX Collaboration

The MIREX team at the University of Illinois is very proud to announce its new K-MIREX Collaboration with the research team led by Dr. Seok-Pil Lee and Chai-Jong Song of the Digital Media Research Center at the Korea Electronics Technology Institute (KETI) http://www.keti.re.kr/e-keti/. Song and his KETI colleagues will be taking the lead on running the 2012 Query by Singing/Humming (QBSH) and Audio Melody Extraction (AME) Tasks. We do not foresee any special deviations from traditional MIREX submission procedures for these two tasks. Should they arise, participants will be informed.

MIREX and the Million Song Dataset Challenge

The MIREX team at the University of Illinois is also proud to announce its co-operative engagement with the Million Song Dataset (MSD) Challenge team of Brian McFee, Thierry Bertin-Mahieux, Daniel P.W. Ellis, and Gert Lanckriet. Below is the recent email announcement of the MSD Challenge.

The Million Song Dataset Challenge
DEADLINE: 2012-08-09 23:59 UTC.
URL: http://www.kaggle.com/c/msdchallenge
We are happy to announce the Million Song Dataset Challenge: a large-scale, open evaluation of personalized music recommendation algorithms.
We provide listening history data for 1.1 million users (1 million train, 110 thousand validation and test) and over 380 thousand songs from the Million Song Dataset. Given partial historical data for each test user, the goal is to produce a ranking over the remaining songs for that user.
What makes this challenge unique? Openness! The songs in the dataset are equipped with metadata (e.g., artist and title), as well as audio content analysis, semantic annotations, lyrics, etc. Because the data is open, participants are free to construct and include any additional features, or ignore the features altogether. The field is wide open!
Details:
The challenge ends 2012-08-09 23:59 UTC.
Evaluation is handled through Kaggle.com, and the details can be found at http://www.kaggle.com/c/msdchallenge.
A post-analysis of the leading submissions will be performed by the Music Information Retrieval Evaluation eXchange (MIREX), and the results presented at the 13th International Society for Music Information Retrieval (ISMIR) conference in October, 2012.
For details on the Million Song Dataset, please see http://labrosa.ee.columbia.edu/millionsong/.
For background information about the contest and the organizing team, please see http://labrosa.ee.columbia.edu/millionsong/challenge.
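The challenge described above reduces to: given a user's partial listening history, rank the songs that user has not yet played. As a rough illustration of that task shape only (not the organizers' method or data format; the triplets, user/song names, and the popularity heuristic are all invented for this sketch), a trivial popularity baseline in Python:

```python
from collections import Counter

# Hypothetical (user, song, play_count) triplets standing in for
# training listening-history data; names are purely illustrative.
train = [
    ("u1", "sA", 3), ("u1", "sB", 1),
    ("u2", "sA", 2), ("u2", "sC", 5),
    ("u3", "sB", 4), ("u3", "sC", 1),
]
test_history = {"u4": {"sC"}}  # partial history for one test user

# Popularity baseline: score each song by how many distinct users played it.
popularity = Counter(song for _user, song, _plays in train)

def recommend(user, history, k=2):
    """Rank the songs the user has NOT yet played, most popular first."""
    candidates = [s for s in popularity if s not in history]
    candidates.sort(key=lambda s: (-popularity[s], s))  # tie-break by song id
    return candidates[:k]

print(recommend("u4", test_history["u4"]))  # → ['sA', 'sB']
```

Any real submission would replace the popularity score with a personalized model, but the input/output contract — partial history in, a ranking over unseen songs out — stays the same.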

MIREX 2012 Deadline Dates

We have two sets of deadlines for submissions. We have to stagger the deadlines because of runtime and human evaluation considerations. The submission system will open July 30, 2012.

Tasks with a 20 August 2012 deadline:

  1. Audio Classification (Train/Test) Tasks
  2. Audio Music Similarity and Retrieval
  3. Symbolic Melodic Similarity

Tasks with a 27 August 2012 deadline:

  1. All remaining MIREX 2012 tasks.

Nota Bene: In the past we have been rather flexible about deadlines. This year, however, we simply do not have that flexibility, sorry.

Please, please, please, let's start getting those submissions made. The sooner we have the code, the sooner we can start running the evaluations.

PS: If you have a slower-running algorithm, help us help you by getting your code in ASAP. Please do pay attention to runtime limits.

MIREX 2012 Submission Instructions

MIREX 2012 Task Participation Poll

Please answer the MIREX 2012 Task Participation Poll on your likelihood of participation in each task. The poll will close on Sunday, July 29, 2012.

Also, please add your name and email address to the bottom of each task page for those tasks in which you plan to participate.

MIREX 2012 Possible Evaluation Tasks

Note to New Participants

Please take the time to read the following review articles that explain the history and structure of MIREX.

Downie, J. Stephen (2008). The Music Information Retrieval Evaluation Exchange (2005-2007):
A window into music information retrieval research. Acoustical Science and Technology 29 (4): 247-255.
Available at: http://dx.doi.org/10.1250/ast.29.247

Downie, J. Stephen, Andreas F. Ehmann, Mert Bay and M. Cameron Jones. (2010).
The Music Information Retrieval Evaluation eXchange: Some Observations and Insights.
Advances in Music Information Retrieval, Vol. 274, pp. 93-115.
Available at: http://bit.ly/KpM5u5

Note to All Participants

Because MIREX is premised upon the sharing of ideas and results, ALL MIREX participants are expected to:

  1. when submitting their programme(s), submit a DRAFT 2-3 page extended abstract PDF in the ISMIR format about the submitted programme(s), to help us and the community better understand how the algorithm works.
  2. submit a FINALIZED 2-3 page extended abstract PDF in the ISMIR format prior to ISMIR 2012 for posting on the respective results pages (sometimes the same abstract can be used for multiple submissions; in many cases the DRAFT and FINALIZED abstracts are the same)
  3. present a poster at the MIREX 2012 poster session at ISMIR 2012

Software Dependency Requests

If you have not submitted to MIREX before, or are unsure whether IMIRSEL/NEMA currently supports some of the software/architecture dependencies for your submission, a dependency request form is available. Please submit details of your dependencies on this form and the IMIRSEL team will attempt to satisfy them for you.

Due to the high volume of submissions expected at MIREX 2012, submissions with difficult-to-satisfy dependencies of which the team has not been given sufficient notice may be rejected.


Finally, you will also be expected to detail your software/architecture dependencies in a README file to be provided to the submission system.

Getting Involved in MIREX 2012

MIREX is a community-based endeavour. Be a part of the community and help make MIREX 2012 the best yet.


Mailing List Participation

If you are interested in formal MIR evaluation, you should also subscribe to the "MIREX" (aka "EvalFest") mail list and participate in the community discussions about defining and running MIREX 2012 tasks. Subscription information at: EvalFest Central.

If you are participating in MIREX 2012, it is VERY IMPORTANT that you are subscribed to EvalFest. Deadlines, task updates and other important information will be announced via this mailing list. Please use EvalFest for discussion of MIREX task proposals and other MIREX-related issues. This wiki (the MIREX 2012 wiki) will be used to embody and disseminate task proposals; however, task-related discussions should be conducted on the EvalFest mailing list rather than on this wiki, and then summarized here.

Where possible, definitions or example code for new evaluation metrics or tasks should be provided to the IMIRSEL team, who will implement them in software as part of the NEMA analytics framework. NEMA will be released to the community at or before ISMIR 2012, providing a standardised set of interfaces and outputs for disciplined evaluation procedures across a great many MIR tasks.


Wiki Participation

If you find that you cannot edit a MIREX wiki page, you will need to create a new account via: Special:Userlogin.

Please note that because of "spam-bots", MIREX wiki registration requests may be moderated by IMIRSEL members. It might take up to 24 hours for approval (Thank you for your patience!).

MIREX 2005 - 2011 Wikis

Content from MIREX 2005 - 2011 is available at:

MIREX 2011, MIREX 2010, MIREX 2009, MIREX 2008, MIREX 2007, MIREX 2006, MIREX 2005