2010:Symbolic Music Similarity and Retrieval

Task suggestion: Symbolic Melodic Similarity

Description

Retrieve the most similar items from a collection of symbolic documents, given a query, and rank them by melodic similarity. The following tasks could be defined this year:

Task 1: Monophonic to monophonic. Both the query and the documents in the collection will be monophonic.

Task 2: Monophonic to polyphonic. The documents will be polyphonic (i.e. can have simultaneous notes), but the query will still be monophonic.

Task 3: Polyphonic to polyphonic. Both the query and documents will be polyphonic.

Data

Task 1

  • 5,274 tunes belonging to the Essen folksong collection. The tunes are in standard MIDI file format. Download (< 1 MB)

Tasks 2 and 3

a.  1,000 polyphonic Karaoke files. Download (9.5 MB)
b.  10,000 mostly polyphonic MIDI files crawled from the Web. Download (69 MB)

Here are versions of the above collections consisting of equivalent MIDI files without tempo changes. These will be used in the evaluation.

a.  ~1,000 polyphonic Karaoke files. Download (9.6 MB)
b.  ~10,000 mostly polyphonic MIDI files crawled from the Web. Download (68.7 MB)

MD5 sum
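
For example, a downloaded archive can be verified against the published sum as follows; the archive name here is only a placeholder:

 import hashlib

 # Compute the MD5 sum of a downloaded archive and compare it with the
 # value published on this page. The file name is a placeholder.
 with open("karaoke_no_tempo_changes.zip", "rb") as f:
     print(hashlib.md5(f.read()).hexdigest())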

You can check the program that was used to "untempify" the MIDI files.
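
The linked program is authoritative. Purely as an illustration of the idea, the sketch below (which assumes the third-party mido Python library, not anything the actual tool uses) rewrites a file so that its audible timing is preserved but all tempo changes are replaced by one fixed tempo:

 import mido

 # Illustrative sketch only: flatten a MIDI file's tempo map. Iterating
 # a mido.MidiFile yields all messages in playback order with times in
 # seconds, so the original tempo changes can be dropped and tick times
 # re-derived under one fixed tempo (500000 us/beat = 120 BPM).
 def untempify(in_path, out_path, tempo=500000):
     src = mido.MidiFile(in_path)
     out = mido.MidiFile(ticks_per_beat=src.ticks_per_beat)
     track = mido.MidiTrack()
     out.tracks.append(track)
     track.append(mido.MetaMessage('set_tempo', tempo=tempo, time=0))
     now = 0.0  # absolute playback time in seconds
     for msg in src:
         now += msg.time
         if msg.is_meta and msg.type == 'set_tempo':
             continue  # discard the original tempo changes
         tick = round(mido.second2tick(now, src.ticks_per_beat, tempo))
         track.append(msg.copy(time=tick))  # absolute ticks for now
     prev = 0
     for m in track[1:]:  # convert absolute ticks to delta times
         m.time, prev = m.time - prev, m.time
     out.save(out_path)

Note that this sketch also flattens the file to a single track; the actual tool may well preserve the original track structure.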


Evaluation

Human Evaluation

The primary evaluation will involve subjective judgments of the retrieved sets by human evaluators, using IMIRSEL's Evalutron 6000 system.

  • Evaluator question: Given a search based on track A, the following set of results was returned by all systems. Please place each returned track into one of three classes (not similar, somewhat similar, very similar) and provide an indication, on a continuous scale of 0-10, of how similar the track is to the query.
  • 6/5/6 queries for the Essen/Karaoke/Mixed datasets, 10 results per query, one set of eyes, ~10 participating labs
  • A higher number of queries is preferred, as IR research indicates that most of the variance lies in the queries
  • It will be possible for researchers to use this data for other types of system comparisons after the MIREX 2010 results have been finalized.
  • Human evaluation to be designed and led by IMIRSEL, following a format similar to that used at MIREX 2006
  • Human evaluators will be drawn from the participating labs (and any volunteers from IMIRSEL or on the MIREX lists)


Submission Format

Inputs/Outputs

Task 1:

Input parameters:

  • the name of a directory containing the MIDI files
  • the name of one MIDI file containing a monophonic query.

The program will be called six times. Three of the queries will be quantized (produced from symbolic notation) and three will be produced by humming or whistling, and will thus contain slight rhythmic and pitch deviations.

Output: a list of the names of the 10 most similar MIDI files, ordered by melodic similarity. Write each file name on a separate line, without empty lines in between.
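
The following skeleton illustrates this calling convention. The structure and the function rank_by_similarity are placeholders for a participant's own code, not a required API:

 import os
 import sys

 def rank_by_similarity(query_path, candidate_paths):
     # Placeholder: return candidate_paths ordered from most to least
     # melodically similar to the query. This is participant-specific.
     raise NotImplementedError

 def main():
     collection_dir, query_path = sys.argv[1], sys.argv[2]
     candidates = [os.path.join(collection_dir, name)
                   for name in sorted(os.listdir(collection_dir))]
     # Print the names of the 10 best matches, one per line, no blanks.
     for path in rank_by_similarity(query_path, candidates)[:10]:
         print(os.path.basename(path))

 if __name__ == "__main__":
     main()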

Tasks 2 and 3

Input: the same interface as for Task 1, i.e. the name of the directory with the files to be searched and the name of the query file. However, the directory will contain either the ~10,000 mostly polyphonic MIDI files or the ~1,000 Karaoke files.

Output: a list of 10 different MIDI file names that contain melodically similar musical material, ordered by similarity, plus, for each file, the time (offset from the beginning, in seconds) where the query match starts and where it ends. If the query matches at more than one position, return the position of the most similar match (or any one of them if several matches are equally similar). If the algorithm does not align the query with the MIDI file at any particular position, return 0 as the start time and the duration of the MIDI file as the end time.

Sample output line:

somefile.mid,0,2.3

(meaning that somefile.mid matches the query, and the matching passage starts at the very beginning of the file and ends 2.3 seconds later). The most similar match should be returned first. During evaluation, the human graders will hear only that part of the song, with a ±10-second buffer at both ends.
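
As an illustration, emitting results in this format could look like the following, where matches is a hypothetical, already ranked list of (file name, start, end) tuples:

 # Hypothetical ranked results: (file name, start and end in seconds).
 matches = [("somefile.mid", 0.0, 2.3), ("another.mid", 12.5, 19.1)]
 for name, start, end in matches[:10]:
     # "g" formatting prints 0.0 as "0" and 2.3 as "2.3".
     print(f"{name},{start:g},{end:g}")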

Measures

Use the same measures as 2006:Symbolic_Melodic_Similarity_Results to compare the search results of the various algorithms.
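
One of those measures is Average Dynamic Recall (ADR). The sketch below assumes the definition by Typke et al. that underlies the 2006 results: the ground truth is a partially ordered list of groups (earlier groups are more similar; documents within a group are interchangeable), and at each rank i recall is computed against all groups up to and including the group containing position i:

 def average_dynamic_recall(groups, ranking):
     # groups: ground truth as a list of lists of document ids.
     # ranking: the system's result list, most similar first.
     n = sum(len(g) for g in groups)
     total = 0.0
     for i in range(1, n + 1):
         # Documents allowed within the first i results.
         allowed, count = set(), 0
         for g in groups:
             allowed.update(g)
             count += len(g)
             if count >= i:
                 break
         hits = sum(1 for doc in ranking[:i] if doc in allowed)
         total += hits / i
     return total / n

For example, average_dynamic_recall([["a", "b"], ["c"]], ["b", "c", "a"]) evaluates to (1/1 + 1/2 + 3/3) / 3 = 5/6: "c" counts as an error at rank 2 because both "a" and "b" should have appeared first.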

Packaging submissions

All submissions should be statically linked to all libraries (the presence of dynamically linked libraries cannot be guaranteed).

All submissions should include a README file containing the following information (a sample excerpt follows the list):

  • Command line calling format for all executables and an example formatted set of commands
  • Number of threads/cores used or whether this should be specified on the command line
  • Expected memory footprint
  • Expected runtime
  • Any required environments (and versions), e.g. python, java, bash, matlab.
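
For example, a README might contain entries like these (all values are purely illustrative):

 Command:     ./search <collection_dir> <query_file>
 Example:     ./search ./essen ./queries/query1.mid > results.txt
 Threads:     single-threaded
 Memory:      ~500 MB peak
 Runtime:     ~1 minute per query on the 10,000-file collection
 Environment: Java 1.6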