PhaKIR - Challenge:
Phase, Keypoint and Instrument Recognition

part of EndoVis 2024

ABOUT THE CHALLENGE

Accurate and reliable recognition and localization of surgical instruments in endoscopic video recordings is the basis for a variety of applications in computer- and robot-assisted minimally invasive surgery (RAMIS) [1]. Robust handling of real-world conditions, such as varying illumination, motion blur from instruments and camera, sudden severe bleeding that obscures the field of view, or unexpected smoke development, is an important prerequisite for such procedures. To make the best possible use of the information extracted from the endoscopic images, the surgical context is a promising additional cue, which can be captured, for example, by knowing the current phase of the intervention.

In our EndoVis 2024 subchallenge, we present a dataset on which three tasks are to be solved: instance segmentation of surgical instruments, keypoint estimation, and surgical phase recognition. The following annotations are available: pixel-accurate instance segmentations of surgical instruments together with their instrument types (20 categories in total), coordinates of relevant instrument keypoints, and a classification of each intervention into seven phase categories. Our dataset consists of 13 real-world videos of human cholecystectomies, ranging from 23 to 60 minutes in duration. The procedures were performed by experienced physicians, and the videos were recorded in three hospitals. In addition to the complete video sequences, we provide annotations at a one-frame-per-second interval, resulting in approximately 30,000 annotated and 838,000 unannotated frames, the latter of which can still be exploited by taking temporal information into account. In contrast to existing datasets, ours combines instance segmentations of surgical instruments, relevant keypoints, and intervention phases in a single dataset and thus comprehensively covers both instrument localization and the context of the operation.
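To illustrate the one-frame-per-second annotation scheme described above, the sketch below maps a video's frame indices to the subset that would carry annotations. The file layout and the assumption that annotations fall on the first frame of each second are hypothetical, for illustration only; they are not the official PhaKIR data format.

```python
# Sketch: which frame indices carry annotations under a one-per-second scheme.
# Assumption (not from the challenge description): annotations fall on the
# first frame of every second of video.

def annotated_frame_indices(num_frames: int, fps: float) -> list[int]:
    """Return frame indices annotated at a one-frame-per-second interval."""
    step = round(fps)  # frames per annotated second
    return list(range(0, num_frames, step))

# Example: a 25 fps clip of 100 frames yields annotations at 0, 25, 50, 75;
# the remaining frames are unannotated but usable as temporal context.
print(annotated_frame_indices(100, 25.0))  # [0, 25, 50, 75]
```

A loader built this way can pair each annotated frame with its preceding unannotated frames, which is one way to exploit the temporal information the dataset provides.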

[1] T. Rueckert, D. Rueckert, and C. Palm, "Methods and datasets for segmentation of minimally invasive surgical instruments in endoscopic images and videos: A review of the state of the art", Computers in Biology and Medicine, vol. 169, Art. 107929, 2024, DOI: https://doi.org/10.1016/j.compbiomed.2024.107929.

 

TIMELINE

April 2024 Challenge Registration Opens
May 2024 Release of Training Data
1st August 2024 Release of Docker Submission Guide and Evaluation Instructions
8th September 2024 Start of Docker Submissions to Verify Functionality
15th September 2024 Submission Deadline and Registration Closing
15th September 2024 Submission Deadline for Methodology Report
Day of EndoVis 2024 Challenge Day and Presentation of Results

CHALLENGE RULES

Challenge rules will be published soon.
