MSR Image Recognition Challenge (IRC)
Microsoft is happy to continue hosting this series of image recognition (retrieval) grand challenges. What does it take to build one of the best image recognition systems? Enter these MSR Image Recognition Challenges at ACM Multimedia and/or IEEE ICME to develop your image recognition system based on real-world, large-scale data.

Current challenge: MS-Celeb-1M: Recognizing One Million Celebrities in the Real World

Details: MSR Image Recognition Challenge @ ACM MM

- 5/27/2016: (New!) Sample code/guides/test tool was released to each team; details.
- 5/9/2016: Development data set was released for download, to be used during the dry run.
- 5/9/2016: Competition/paper registration is open here; please provide your team name (as the paper title), organization (as the paper abstract), and team members and contact information (as the paper authors).
- 4/29/2016: Entity list is released for download.
- 4/5/2016: Cropped and aligned faces are ready for download.
- 4/4/2016: More data is available for download: samples.
- 4/1/2016: MS-Celeb-V1 image thumbnails are ready for download!

Last challenge: MSR IRC @ IEEE ICME
We just finished the evaluation! More details here.

Important dates: The data sets for this challenge are described here and can be downloaded here.

- Feb 23, 2016: Registration web site is opened.
- Feb 2016: ICME site is open; please register a placeholder for your final report: select Track = "Grand Challenges" and select Subject Area = "MSR Grand Challenge: Image Recognition Challenge".
- Feb 2016: Update about data sets, and FAQ.
- Feb 2016: Update about sample codes, and FAQ.
- March 3, 2016: Update about test tool, team keys, and FAQ.
- March 7–10, 2016: Dry-run traffic sent to your system for testing/verification, and FAQ.
- March 2016: Update about final evaluation and FAQ.
- March 14, 2016: Evaluation started; please keep your system running stably.
- March 2016: Evaluation ends (0:00am PDT).
- March 2016: Evaluation results announced (see the rank table below).
- April 3, 2016: Grand challenge paper and data submission.
- April 28, 2016: Paper acceptance notification.
- May 13, 2016: Paper camera-ready version due.
| Rank | Team ID | Team Name | Precision@5 | Used External Data |
|------|---------|---------------|--------|-----|
| 1 | 30 | NLPR_CASIA | 89.65% | Yes |
| 2 | 16 | YBT_BJ | 86.9% | No |
| 3 | 5 | NFS2016 | 85% | Yes |
| 4 | 20 | WestMountain | 84.75% | Yes |
| 5 | 3 | RUCMM | 84.55% | Yes |
| 6 | 17 | CASIIE-Asgard | 83.4% | Yes |
| 7 | 31 | GoRocketsGo | 81.85% | Yes |
| 8 | 2 | CDL-USTC | 73.1% | Yes |
| 9 | 4 | LYG | 71.35% | No |
| 10 | 10 | FrenchBulldog | 71.25% | No |
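The table above ranks teams by Precision@5. As a rough illustration (not the organizers' official evaluation code, whose exact protocol is defined by the challenge), precision@k for a ranked recognition system is typically computed like this:

```python
def precision_at_k(ranked_predictions, relevant, k=5):
    """Fraction of the top-k predicted labels that are relevant.

    ranked_predictions: labels ordered by decreasing confidence.
    relevant: set of ground-truth labels for the query image.
    """
    top_k = ranked_predictions[:k]
    if not top_k:
        return 0.0
    hits = sum(1 for label in top_k if label in relevant)
    # Divide by k (not by len(top_k)), so returning fewer than
    # k predictions is penalized.
    return hits / k


def mean_precision_at_k(results, k=5):
    """Average precision@k over a list of (predictions, relevant_set) pairs."""
    return sum(precision_at_k(p, r, k) for p, r in results) / len(results)
```

For example, if a system's top-5 guesses for an image are `["a", "b", "c", "d", "e"]` and the ground-truth labels are `{"a", "c"}`, its precision@5 for that image is 2/5 = 0.4; the leaderboard number is the mean of this quantity over the evaluation set.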
Past challenge: MSR-Bing IRC @ ACM MM
We have finished the challenge in ACM MM 2015. More details here.

Important dates:

- Data set available for download (Clickture-Lite) and hard-disk delivery (Clickture-Full).
- June 2015: Trial set available for download and test.
- June 2015: Final evaluation set for Task #1 available for download (encrypted).
- June 2015: Evaluation starts (0:00am PDT).
- June 2015: Evaluation ends (0:00am PDT).
- June 2015: Evaluation results announced.
- July 7, 2015: Paper submission deadline.
- July 2015: Notification of acceptance.
- August 15, 2015: Camera-ready submission deadline.
- October 28, 2015: Grand challenge workshop.
Latest updates:

- May 2015: Pre-registration form available at http://1drv.ms/1k9aaxo.
- June 2015: Training data set ready for download: details.
- June 2015: Trial set for Task #1 available for download (the same as ACM MM): http://1drv.ms/1pq08Wq
- June 2015: Trial code samples for Task #2 were delivered by email. Contact us if you haven't received them.
- June 2015: Test tool for Task #2 delivered by email. Contact us if you haven't received it.
- June 2015: Evaluation set for Task #1 available here (encrypted); please download and unzip it.
- June 24–June 25, 2015: For Task #2, dry-run traffic will be sent to your recognition service; please keep your recognition service running!
- June 2015: Password to decrypt the Task #1 evaluation data delivered to all participants by email at 0:00am PST; please let us know if you haven't received it.
- June 2015: Evaluation results sent back to teams.
- July 1, 2015: Evaluation result summary:
| Team ID | Team Name | Run1-Master (Task 1) | Run2 (Task 1) | Run3 (Task 1) | Rank, Task 1: Image Retrieval | Accuracy@1 (Task 2) | Accuracy@5 (Task 2) | Rank, Task 2: Image Recognition |
|---------|-----------|------|------|------|------|------|------|------|
| 1 | TINA | | | | | | | |
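The Task #2 columns above report Accuracy@1 and Accuracy@5. As a hedged sketch (again, not the organizers' official scoring code), accuracy@k for a recognition task counts a query as correct when the ground-truth label appears anywhere in the system's top-k predictions:

```python
def accuracy_at_k(ranked_predictions, true_label, k):
    """1.0 if the ground-truth label is among the top-k predictions, else 0.0."""
    return 1.0 if true_label in ranked_predictions[:k] else 0.0


def mean_accuracy_at_k(results, k):
    """Average accuracy@k over a list of (predictions, true_label) pairs."""
    return sum(accuracy_at_k(p, t, k) for p, t in results) / len(results)
```

Accuracy@1 is the familiar top-1 accuracy; Accuracy@5 is more forgiving, since any of the five highest-confidence guesses may match.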