Paper Title

A New Window Loss Function for Bone Fracture Detection and Localization in X-ray Images with Point-based Annotation

Paper Authors

Xinyu Zhang, Yirui Wang, Chi-Tung Cheng, Le Lu, Adam P. Harrison, Jing Xiao, Chien-Hung Liao, Shun Miao

Paper Abstract

Object detection methods are widely adopted for computer-aided diagnosis using medical images. Anomalous findings are usually treated as objects that are described by bounding boxes. Yet, many pathological findings, e.g., bone fractures, cannot be clearly defined by bounding boxes, owing to considerable instance, shape and boundary ambiguities. This makes bounding box annotations, and their associated losses, highly ill-suited. In this work, we propose a new bone fracture detection method for X-ray images, based on a labor effective and flexible annotation scheme suitable for abnormal findings with no clear object-level spatial extents or boundaries. Our method employs a simple, intuitive, and informative point-based annotation protocol to mark localized pathology information. To address the uncertainty in the fracture scales annotated via point(s), we convert the annotations into pixel-wise supervision that uses lower and upper bounds with positive, negative, and uncertain regions. A novel Window Loss is subsequently proposed to only penalize the predictions outside of the uncertain regions. Our method has been extensively evaluated on 4410 pelvic X-ray images of unique patients. Experiments demonstrate that our method outperforms previous state-of-the-art image classification and object detection baselines by healthy margins, with an AUROC of 0.983 and FROC score of 89.6%.
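
The abstract describes the Window Loss only at a high level, so the snippet below is a minimal sketch of that idea rather than the authors' implementation. It assumes PyTorch, a squared penalty, and per-pixel lower/upper bounds that encode the positive, negative, and uncertain regions (e.g., windows of [1, 1], [0, 0], and [0, 1], respectively); the function name `window_loss` and these particular bound values are illustrative assumptions.

```python
import torch


def window_loss(pred, lower, upper):
    """Penalize predictions only where they fall outside the per-pixel
    [lower, upper] window.

    pred, lower, upper: float tensors of shape (N, H, W), values in [0, 1].
    Positive pixels could use the window [1, 1], negative pixels [0, 0],
    and uncertain pixels [0, 1] so that any prediction there is accepted.
    """
    # Undershoot below the lower bound (e.g., a missed response in a
    # positive region around an annotated fracture point).
    under = torch.clamp(lower - pred, min=0.0)
    # Overshoot above the upper bound (e.g., a spurious response in a
    # clearly negative region far from any annotation).
    over = torch.clamp(pred - upper, min=0.0)
    # Squared penalty outside the window; exactly zero inside it.
    return (under.pow(2) + over.pow(2)).mean()


# Toy usage: a negative background, an uncertain band, and a positive core.
pred = torch.rand(1, 64, 64)
lower = torch.zeros(1, 64, 64)
upper = torch.zeros(1, 64, 64)  # background window: [0, 0]
upper[:, 16:48, 16:48] = 1.0    # uncertain band: window [0, 1]
lower[:, 28:36, 28:36] = 1.0    # positive core: window [1, 1]
loss = window_loss(pred, lower, upper)
```

With this construction, pixels in the uncertain band carry the full [0, 1] window and contribute no loss or gradient, so only predictions in the positive and negative regions are penalized, matching the behavior stated in the abstract.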
