
The Camelyon16 ISBI challenge took place on Wednesday, 13 April 2016. The submission page reopened on 14 April and new submissions are accepted. The presentations from the organizing team are now available below.

Title of presentation | Presenter
How computers shape the future of pathology | Jeroen van der Laak
Camelyon16: Aim, dataset, and evaluation | Babak Ehteshami Bejnordi
Statistics, Leaderboards, Results and Comparison to Pathologist | Babak Ehteshami Bejnordi

 

Public Leaderboard 1 - Whole-slide-image classification

  • The results are computed on the independent test set.
  • Evaluation 1: Teams are ranked by the area under the ROC curve (AUC) for slide-level classification.
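The slide-level ranking metric can be sketched in a few lines of Python. The labels and scores below are made-up placeholders, not challenge data, and the organizers' actual evaluation code may differ; this just illustrates what the AUC measures.

```python
# Sketch of Evaluation 1: rank submissions by area under the ROC curve (AUC).
# Labels and scores below are illustrative placeholders, not challenge data.

def auc(labels, scores):
    """AUC via the Mann-Whitney U statistic: the fraction of
    (positive, negative) slide pairs ranked correctly, counting ties as 1/2."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# 1 = slide contains metastases, 0 = normal slide
labels = [1, 0, 1, 1, 0, 0, 1, 0]
# a submission's per-slide tumor probability
scores = [0.9, 0.2, 0.7, 0.6, 0.65, 0.1, 0.8, 0.3]
print(auc(labels, scores))  # → 0.9375
```

An AUC of 1.0 would mean every metastatic slide received a higher probability than every normal slide; 0.5 corresponds to random ranking.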

Top-five ranked teams until the challenge event deadline (Apr 1, 2016):

Rank Team AUC Submission date
01 Harvard Medical School and MIT, Method 1 0.9234 01 Apr 2016
02 EXB Research and Development co., Germany 0.9156 01 Apr 2016
03 Independent participant, Germany 0.8654 01 Apr 2016
04 Middle East Technical University, Departments of EEE, NSNT and HS, Turkey 0.8642 01 Apr 2016
05 NLP LOGIX co., USA 0.8298 01 Apr 2016

Leaderboard including all submissions (updated after each new entry):

* Indicates that the team has achieved an AUC value that surpasses the AUC of the pathologist in our study.

Rank Team AUC Submission date
01 * Harvard Medical School and MIT, Method 2 (updated) 0.9935 06 Nov 2016
02 * Harvard Medical School, Gordon Center for Medical Imaging, MGH, Method 3 0.9763 24 Oct 2016
03 Harvard Medical School, Gordon Center for Medical Imaging, MGH, Method 1 0.9650 07 Sep 2016
04 The Chinese University of Hong Kong (CU lab, Hong Kong), Method 3 0.9415 29 Aug 2016
05 Harvard Medical School and MIT, Method 1 0.9234 01 Apr 2016
06 EXB Research and Development co., Germany 0.9156 01 Apr 2016
07 The Chinese University of Hong Kong (CU lab), Hong Kong, Method 1 0.9086 08 Jun 2016
08 Harvard Medical School, Gordon Center for Medical Imaging, MGH, Method 2 0.9082 24 Oct 2016
09 The Chinese University of Hong Kong (CU lab), Hong Kong, Method 2 0.9056 20 Jul 2016
10 DeepCare Inc, China 0.8833 05 Nov 2016
11 Independent participant, Germany 0.8654 01 Apr 2016
12 Middle East Technical University, Departments of EEE, NSNT and HS, Turkey 0.8642 01 Apr 2016
13 NLP LOGIX co., USA 0.8298 01 Apr 2016
14 Smart Imaging Technologies co., USA 0.8207 14 May 2016
15 University of Toronto, Electrical and Computer Engineering, Canada 0.8149 01 Apr 2016
16 The Warwick-QU Team, United Kingdom 0.7958 01 Apr 2016
17 Radboud University Medical Center (DIAG), Netherlands 0.7786 01 Apr 2016
18 HTW-BERLIN, Germany 0.7676 01 Apr 2016
19 University of Toronto, Electrical and Computer Engineering, Canada 0.7621 01 Apr 2016
20 BioMediTech, University of Tampere, Finland 0.7612 01 Apr 2016
21 Smart Imaging Technologies co., USA 0.7574 01 Apr 2016
22 Technical University of Munich (CAMP), Germany - Method 2 0.7367 30 Aug 2016
23 Osaka University, Department of Bioinformatic Engineering, Japan 0.7319 01 Apr 2016
24 University of South Florida, Computer Science and Engineering, USA 0.7270 01 Apr 2016
25 NSS College of Engineering, India 0.7269 01 Apr 2016
26 BioMediTech, University of Tampere, Finland 0.7132 01 Apr 2016
27 Technical University of Munich (CAMP), Germany 0.6910 01 Apr 2016
28 United Institute of Informatics Problems, Belarus 0.6890 01 Apr 2016
29 VISILAB, University of Castilla-La Mancha, Spain 0.6531 01 Apr 2016
30 VISILAB, University of Castilla-La Mancha, Spain 0.6513 01 Apr 2016
31 Mines ParisTech, France 0.6277 01 Apr 2016
32 Sorbonne Universités, Laboratoire d'Imagerie Biomédicale, France 0.5561 01 Apr 2016

 

Public Leaderboard 2 - Tumor localization

  • The results are computed on the independent test set.
  • Evaluation 2: Detection/localization performance is summarized using a free-response receiver operating characteristic (FROC) curve. The final score is the average sensitivity at six predefined false-positive rates: 1/4, 1/2, 1, 2, 4, and 8 false positives per whole-slide image.
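Given a computed FROC curve, the final score reduces to averaging six sensitivity readings. The sketch below assumes an interpolation convention (take the highest sensitivity reached at or below each target rate); the curve points are hypothetical and the organizers' exact evaluation code may interpolate differently.

```python
# Sketch of Evaluation 2: the final FROC score is the mean sensitivity at
# 1/4, 1/2, 1, 2, 4, and 8 false positives per whole-slide image.
# Curve points below are illustrative, not challenge data.

def froc_score(fp_rates, sensitivities):
    """Average the sensitivity at each predefined FP rate, taking the
    highest sensitivity attained at or below that rate (an assumed
    interpolation convention)."""
    targets = [0.25, 0.5, 1, 2, 4, 8]
    picked = []
    for t in targets:
        attained = [s for f, s in zip(fp_rates, sensitivities) if f <= t]
        picked.append(max(attained) if attained else 0.0)
    return sum(picked) / len(targets)

# (FPs per slide, sensitivity) points along a hypothetical FROC curve
fp_rates = [0.1, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0]
sensitivities = [0.40, 0.55, 0.62, 0.70, 0.76, 0.81, 0.85]
print(froc_score(fp_rates, sensitivities))
```

Unlike AUC, this score rewards high lesion-level sensitivity while tolerating only a handful of false detections per slide, which is why the two leaderboards rank teams differently.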

Top-five ranked teams until the challenge event deadline (Apr 1, 2016):

Rank Team Score Submission date
01 Harvard Medical School and MIT, Method 1 0.6933 01 Apr 2016
02 Radboud University Medical Center (DIAG), Netherlands 0.5748 01 Apr 2016
03 EXB Research and Development co., Germany 0.5111 01 Apr 2016
04 Middle East Technical University, Departments of EEE, NSNT and HS, Turkey 0.3889 01 Apr 2016
05 NLP LOGIX co., USA 0.3859 01 Apr 2016

Leaderboard including all submissions (updated after each new entry):

Rank Team Score Submission date
01 Harvard Medical School and MIT, Method 2 (updated) 0.8074 06 Nov 2016
02 Harvard Medical School, Gordon Center for Medical Imaging, MGH, Method 3 0.7600 24 Oct 2016
03 Harvard Medical School, Gordon Center for Medical Imaging, MGH, Method 2 0.7289 24 Oct 2016
04 The Chinese University of Hong Kong (CU lab), Method 3 0.7030 29 Aug 2016
05 Harvard Medical School and MIT, Method 1 0.6933 01 Apr 2016
06 Harvard Medical School, Gordon Center for Medical Imaging, MGH, Method 1 0.5963 07 Sep 2016
07 Radboud University Medical Center (DIAG), Netherlands 0.5748 01 Apr 2016
08 The Chinese University of Hong Kong (CU lab), Method 1 0.5444 08 Jun 2016
09 The Chinese University of Hong Kong (CU lab), Method 2 0.5274 20 Jul 2016
10 EXB Research and Development co., Germany 0.5111 01 Apr 2016
11 Middle East Technical University, Departments of EEE, NSNT and HS, Turkey 0.3889 01 Apr 2016
12 NLP LOGIX co., USA 0.3859 01 Apr 2016
13 University of Toronto, Electrical and Computer Engineering, Canada 0.3822 01 Apr 2016
14 Independent participant, Germany 0.3667 01 Apr 2016
15 University of Toronto, Electrical and Computer Engineering, Canada 0.3519 01 Apr 2016
16 Osaka University, Department of Bioinformatic Engineering, Japan 0.3467 01 Apr 2016
17 Smart Imaging Technologies, USA 0.3385 14 May 2016
18 The Warwick-QU Team, United Kingdom 0.3052 01 Apr 2016
19 Technical University of Munich (CAMP), Germany, Method 2 0.2733 30 Aug 2016
20 BioMediTech, University of Tampere, Finland 0.2570 01 Apr 2016
21 BioMediTech, University of Tampere, Finland 0.2519 01 Apr 2016
22 DeepCare Inc, China 0.2430 05 Nov 2016
23 United Institute of Informatics Problems, Belarus 0.2267 01 Apr 2016
24 Smart Imaging Technologies, USA 0.2081 01 Apr 2016
25 HTW-BERLIN, Germany 0.1867 01 Apr 2016
26 Technical University of Munich (CAMP), Germany 0.1837 01 Apr 2016
27 University of South Florida, Computer Science and Engineering, USA 0.1793 01 Apr 2016
28 NSS College of Engineering, India 0.1652 01 Apr 2016
29 VISILAB, University of Castilla-La Mancha, Spain 0.1422 01 Apr 2016
30 Sorbonne Universités, Laboratoire d'Imagerie Biomédicale, France 0.1200 01 Apr 2016
31 VISILAB, University of Castilla-La Mancha, Spain 0.1163 01 Apr 2016
32 Mines ParisTech, France 0.0970 01 Apr 2016


Consortium for Open Medical Image Computing © 2012-