2017:Main Page


Welcome to MIREX 2017

This is the main page for the thirteenth running of the Music Information Retrieval Evaluation eXchange (MIREX 2017). The International Music Information Retrieval Systems Evaluation Laboratory (IMIRSEL) at the School of Information Sciences, University of Illinois at Urbana-Champaign (UIUC), is the principal organizer of MIREX 2017.

The MIREX 2017 community will hold its annual meeting as part of the 18th International Society for Music Information Retrieval Conference (ISMIR 2017), which will be held in Suzhou, China, October 23-28, 2017.

J. Stephen Downie
Director, IMIRSEL

Task Leadership Model

As with MIREX 2016, we aim to improve the distribution of the work of organizing and running tasks for the upcoming MIREX 2017. To do so, we need volunteer leaders for each task.

To volunteer to lead a task, please complete the form here. Current information about task captains can be found on the 2017:Task Captains page. Please direct any communication to the EvalFest mailing list.

What does it mean to lead a task?

  • Updating wiki pages as needed
  • Communicating with submitters and troubleshooting submissions
  • Executing and evaluating submissions
  • Publishing final results

Due to the proprietary nature of much of the data, the submission system, evaluation framework, and most of the datasets will continue to be hosted by IMIRSEL. However, we are prepared to give task organizers access to manage and run submissions on the IMIRSEL systems.

We really need leaders to help us this year!

MIREX 2017 Deadline Dates

  • August 21st 2017
    • Audio Classification (Train/Test) Tasks <TC: IMIRSEL>
    • Audio K-POP Genre Classification <TC: IMIRSEL>
    • Audio K-POP Mood Classification <TC: IMIRSEL>
    • Audio Fingerprinting <TC: Chung-Che Wang>
  • August 31st 2017
    • Structural Segmentation <TC: Matthew McCallum>
    • Multiple Fundamental Frequency Estimation & Tracking <TC: Yun Hao (IMIRSEL)>
    • Audio Tempo Estimation <TC: Aggelos Gkiokas, Sebastian Böck, Paul Brossier, Matthew McCallum>
    • Set List Identification <TC: Ming-Chi Yen>
  • September 4th 2017
    • Audio Onset Detection <TC: Sebastian Böck, Paul Brossier, Matthew McCallum>
    • Audio Beat Tracking <TC: Sebastian Böck, Paul Brossier, Matthew McCallum>
    • Audio Key Detection <TC: Johan Pauwels>
    • Audio Downbeat Detection <TC: Sebastian Böck, Paul Brossier, Matthew McCallum>
    • Real-time Audio to Score Alignment (a.k.a. Score Following) <TC: Julio Carabias>
    • Audio Cover Song Identification <TC: Chris Tralie, Ming-Chi Yen>
    • Discovery of Repeated Themes & Sections <TC: Tom Collins, Anja Volk, Berit Janssen, Iris Ren>
    • Audio Chord Estimation <TC: Johan Pauwels>
    • Automatic Lyrics-to-Audio Alignment <TC: Georgi Dzhambazov>
    • Drum Transcription <TC: Richard Vogl>
    • Query by Singing/Humming <TC: Cheng-Tse Wu>

MIREX 2017 Submission Instructions

MIREX 2017 Possible Evaluation Tasks


Note to New Participants

Please take the time to read the following review articles that explain the history and structure of MIREX.

Downie, J. Stephen (2008). The Music Information Retrieval Evaluation Exchange (2005-2007):
A window into music information retrieval research. Acoustical Science and Technology 29 (4): 247-255.
Available at: http://dx.doi.org/10.1250/ast.29.247

Downie, J. Stephen, Andreas F. Ehmann, Mert Bay, and M. Cameron Jones (2010).
The Music Information Retrieval Evaluation eXchange: Some Observations and Insights.
Advances in Music Information Retrieval, Vol. 274, pp. 93-115.
Available at: http://bit.ly/KpM5u5


Runtime Limits

We reserve the right to stop any process that exceeds runtime limits for each task. We will do our best to notify you in enough time to allow revisions, but this may not be possible in some cases. Please respect the published runtime limits.


Note to All Participants

Because MIREX is premised upon the sharing of ideas and results, ALL MIREX participants are expected to:

  1. submit a DRAFT 2-3 page extended abstract PDF in the ISMIR format at the time of submitting their programme(s), describing how the algorithm works, to help us and the community better understand the submission
  2. submit a FINALIZED 2-3 page extended abstract PDF in the ISMIR format prior to ISMIR 2017 for posting on the respective results pages (sometimes the same abstract can be used for multiple submissions; in many cases the DRAFT and FINALIZED abstracts are the same)
  3. present a poster at the MIREX 2017 poster session at ISMIR 2017


Software Dependency Requests

If you have not submitted to MIREX before, or are unsure whether IMIRSEL currently supports some of the software/architecture dependencies for your submission, a dependency request form is available. Please submit details of your dependencies on this form and the IMIRSEL team will attempt to satisfy them for you.

Due to the high volume of submissions expected at MIREX 2017, submissions with difficult-to-satisfy dependencies for which the team has not been given sufficient notice may be rejected.


Finally, you will also be expected to detail your software/architecture dependencies in a README file to be provided to the submission system.
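
For illustration only, the dependency details in such a README might be listed along these lines; the platform, packages, versions and hardware figures below are hypothetical placeholders, not actual requirements:

 Operating system: Ubuntu 16.04 (64-bit)
 Language/runtime: Python 3.5
 Libraries:        numpy, scipy
 Hardware:         approx. 8 GB RAM, no GPU required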

Getting Involved in MIREX 2017

MIREX is a community-based endeavour. Be a part of the community and help make MIREX 2017 the best yet.


Mailing List Participation

If you are interested in formal MIR evaluation, you should also subscribe to the "MIREX" (aka "EvalFest") mailing list and participate in the community discussions about defining and running MIREX 2017 tasks. Subscription information is available at: EvalFest Central.

If you are participating in MIREX 2017, it is VERY IMPORTANT that you are subscribed to EvalFest. Deadlines, task updates and other important information will be announced via this mailing list. Please use EvalFest for discussion of MIREX task proposals and other MIREX-related issues. This wiki (the MIREX 2017 wiki) will be used to embody and disseminate task proposals; however, task-related discussions should be conducted on the EvalFest mailing list rather than on this wiki, and then summarized here.

Where possible, definitions or example code for new evaluation metrics or tasks should be provided to the IMIRSEL team, who will embody them in software as part of the NEMA analytics framework. NEMA will be released to the community at or before ISMIR 2017, providing a standardised set of interfaces and outputs for disciplined evaluation procedures across a great many MIR tasks.
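
As a purely illustrative sketch of what such "example code" for a metric might look like, consider the following self-contained Python function. It is not NEMA or official MIREX evaluation code; the 50 ms tolerance window and the greedy matching rule are assumptions made only for this example.

 # Hypothetical example only: a self-contained metric definition of the kind a task
 # proposer might send to IMIRSEL along with a proposal. It is NOT official MIREX or
 # NEMA evaluation code; the tolerance value and greedy matching rule are assumptions.

 def onset_f_measure(reference, estimated, tolerance=0.05):
     """Precision, recall and F-measure for onset detection.

     reference, estimated -- lists of onset times in seconds.
     tolerance -- maximum deviation (seconds) for an estimate to count as a hit.
     """
     used = [False] * len(reference)
     matched = 0
     for est in estimated:
         # Match each estimated onset to the closest unused reference onset in range.
         best_i, best_dist = None, None
         for i, ref in enumerate(reference):
             if used[i]:
                 continue
             dist = abs(est - ref)
             if dist <= tolerance and (best_dist is None or dist < best_dist):
                 best_i, best_dist = i, dist
         if best_i is not None:
             used[best_i] = True
             matched += 1
     precision = matched / len(estimated) if estimated else 0.0
     recall = matched / len(reference) if reference else 0.0
     f = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
     return precision, recall, f

 if __name__ == "__main__":
     ref = [0.50, 1.00, 1.50, 2.00]   # ground-truth onsets (seconds)
     est = [0.52, 1.04, 1.90, 2.40]   # estimated onsets (seconds)
     print(onset_f_measure(ref, est)) # -> (0.5, 0.5, 0.5)

A concrete, runnable definition like this removes ambiguity about tolerances and matching behaviour when the metric is re-implemented inside the evaluation framework.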


Wiki Participation

If you find that you cannot edit a MIREX wiki page, you will need to create a new account via: Special:Userlogin.

Please note that because of "spam-bots", MIREX wiki registration requests may be moderated by IMIRSEL members. It might take up to 24 hours for approval (Thank you for your patience!).

MIREX 2005 - 2016 Wikis

Content from MIREX 2005 - 2016 is available at: MIREX 2016 MIREX 2015 MIREX 2014 MIREX 2013 MIREX 2012 MIREX 2011 MIREX 2010 MIREX 2009 MIREX 2008 MIREX 2007 MIREX 2006 MIREX 2005