
Authors

Fan Sun, Zhiming Luo, Shaozi Li

Abstract

Medical image segmentation is crucial for clinical diagnosis. However, current losses for medical image segmentation mainly focus on overall segmentation results, with fewer losses proposed to guide boundary segmentation. Those that do exist often need to be used in combination with other losses and produce ineffective results. To address this issue, we have developed a simple and effective loss called the Boundary Difference over Union Loss (Boundary DoU Loss) to guide boundary region segmentation. It is obtained by calculating the ratio of the difference set of prediction and ground truth to the union of the difference set and the partial intersection set. Our loss relies only on region calculation, making it easy to implement and stable to train without requiring any additional losses. Additionally, we use the target size to adaptively adjust the attention applied to the boundary regions. Experimental results using UNet, TransUNet, and Swin-UNet on two datasets (ACDC and Synapse) demonstrate the effectiveness of our proposed loss function. Code is available at https://github.com/sunfan-bvb/BoundaryDoULoss.
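The ratio described in the abstract can be sketched in a few lines of NumPy. This is an illustration only, not the authors' implementation: it assumes binary masks and a fixed weighting factor `alpha` (< 1) on the intersection term, whereas the paper adapts `alpha` to the target size.

```python
import numpy as np

def boundary_dou_loss(pred, gt, alpha=0.8):
    """Sketch of the described ratio: |difference set| over
    (|difference set| + alpha * |intersection|).

    pred, gt : binary masks (0/1 arrays).
    alpha    : weight (< 1) on the intersection; the paper adapts it
               to the target size, here it is a fixed illustrative value.
    """
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    diff = union - inter  # symmetric difference of prediction and ground truth
    return diff / (diff + alpha * inter + 1e-8)
```

Perfect overlap drives the loss toward 0, while a smaller `alpha` shrinks the discounted intersection term and thereby increases the relative penalty on boundary disagreement.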

Link to paper

DOI: https://doi.org/10.1007/978-3-031-43901-8_28

SharedIt: https://rdcu.be/dnwDc

Link to the code repository

https://github.com/sunfan-bvb/BoundaryDoULoss

Link to the dataset(s)

https://www.synapse.org/#!Synapse:syn3193805/wiki/217789

https://www.creatis.insa-lyon.fr/Challenge/acdc/


Reviews

Review #1

  • Please describe the contribution of the paper

The paper introduces a novel loss function for image segmentation that focuses on accurately identifying boundary areas of the object of interest. The authors provide a clear explanation of the loss function, highlighting its differences from the commonly used Dice loss. The proposed loss function includes a hyperparameter, for which the authors propose a method of automatic adaptation according to the target size. The method is evaluated on two public datasets, and both quantitative and qualitative results demonstrate superior performance compared to other segmentation losses. Overall, the paper is well-written, and the proposed method is well-motivated and explained.

  • Please list the main strengths of the paper; you should write about a novel formulation, an original way to use data, demonstration of clinical feasibility, a novel application, a particularly strong evaluation, or anything else that is a strong aspect of this work. Please provide details, for instance, if a method is novel, explain what aspect is novel and why this is interesting.
    • The paper is generally well-written and easy-to-follow. The new loss function is well-explained and figures are relevant and clear.
    • The proposed loss function is novel, simple, easy-to-implement, and well-motivated. Although it can be seen as a variation of Dice loss, the authors provide a clear explanation of the differences between the two losses and results show that the proposed loss is superior to Dice loss across the two datasets.
    • It is also interesting that the authors provide a method for automatically adapting the hyperparameter of the loss according to the target size.
    • Extensive experiments are carried out on two public datasets and the results are compared to other segmentation losses. The results show that the proposed loss is superior to other losses. The authors also provide qualitative results which show that the proposed loss can better segment the boundary areas of the object of interest.
  • Please list the main weaknesses of the paper. Please provide details, for instance, if you think a method is not novel, explain why and provide a reference to prior work.
    • I only have comments on some of the results. I am wondering if a ~1% improvement in Dice score (in some of your results) has any clinical advantage. Do the authors have more insights on the clinical implications of their results? It would also be helpful to report the statistical significance of the results, compared to the Dice loss for example.
  • Please rate the clarity and organization of this paper

    Excellent

  • Please comment on the reproducibility of the paper. Note, that authors have filled out a reproducibility checklist upon submission. Please be aware that authors are not required to meet all criteria on the checklist - for instance, providing code and data is a plus, but not a requirement for acceptance

    The authors provide links to the two datasets used in their study and mention several hyperparameters used for their model and baselines. However, some important hyperparameters are not explicitly stated, such as the learning rate, batch size, and optimizer. The authors do note that their code will be made publicly available, which will allow interested readers to access this information.

  • Please provide detailed and constructive comments for the authors. Please also refer to our Reviewer’s guide on what makes a good review: https://conferences.miccai.org/2023/en/REVIEWER-GUIDELINES.html
    • Table 2 – UNet model: the Dice + CE loss performed better in terms of HD, while the authors indicate (in bold) that the proposed loss is better; please fix this.
    • The value of $d$ used for the B-IoU metric is not mentioned. The authors should add this information.
    • More implementation details such as learning rate, batch size, optimizer, etc. should be added, along with an explanation of how the hyperparameters were chosen. This could be added as an appendix to the paper.
  • Rate the paper on a scale of 1-8, 8 being the strongest (8-5: accept; 4-1: reject). Spreading the score helps create a distribution for decision-making

    7

  • Please justify your recommendation. What were the major factors that led you to your overall score for this paper?

    The paper is well-written, figures are clear and relevant. The loss function is simple, easy-to-implement, and well-motivated. It is clearly-explained and a theoretical comparison with Dice loss is reported. Results show improvement over the other losses on two datasets. The authors also provide a method for automatically adapting the hyperparameter of the loss according to the target size. Overall, the paper is well-motivated and the results are convincing. I think it is a good contribution to the community.

  • Reviewer confidence

    Very confident

  • [Post rebuttal] After reading the author’s rebuttal, state your overall opinion of the paper if it has been changed

    N/A

  • [Post rebuttal] Please justify your decision

    N/A



Review #2

  • Please describe the contribution of the paper

    This paper introduces a loss function based on the recently-proposed metric Boundary IoU. It then analyzes the proposed loss function, illustrates how it works, and conducts experiments on three neural networks and two datasets showing that it outperforms other loss functions.
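For readers unfamiliar with the metric the reviewer mentions, Boundary IoU computes the IoU restricted to inner boundary bands of width d inside each mask. The following is a minimal NumPy sketch under simplifying assumptions (4-connected binary erosion with zero padding stands in for the original distance-d band), not the metric's reference implementation:

```python
import numpy as np

def erode(m):
    """One step of binary erosion with a 4-connected cross, zero-padded."""
    e = m.copy()
    e[1:, :] &= m[:-1, :]
    e[:-1, :] &= m[1:, :]
    e[:, 1:] &= m[:, :-1]
    e[:, :-1] &= m[:, 1:]
    e[0, :] = e[-1, :] = False  # image border touches the zero padding
    e[:, 0] = e[:, -1] = False
    return e

def boundary_iou(gt, pred, d=1):
    """IoU of the width-d inner boundary bands of two boolean masks."""
    gb, pb = gt.copy(), pred.copy()
    for _ in range(d):
        gb, pb = erode(gb), erode(pb)
    gb, pb = gt & ~gb, pred & ~pb  # band = mask minus its d-fold erosion
    union = (gb | pb).sum()
    return (gb & pb).sum() / union if union else 1.0
```

Identical masks give a Boundary IoU of 1.0, and masks whose boundary bands do not overlap give 0.0; the band width d controls how tolerant the metric is to boundary misplacement.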

  • Please list the main strengths of the paper; you should write about a novel formulation, an original way to use data, demonstration of clinical feasibility, a novel application, a particularly strong evaluation, or anything else that is a strong aspect of this work. Please provide details, for instance, if a method is novel, explain what aspect is novel and why this is interesting.

    1) Novelty. The loss function proposed in this paper is based on a recent CVPR paper that introduced a new metric (Boundary IoU).

    2) Clarity. This paper provides a very detailed explanation of how this loss function works. Specifically:

    • A very intuitive illustration is provided to aid understanding of the aforementioned metric and the proposed loss function (Fig. 1).
    • The proposed loss function is compared with Dice loss graphically/intuitively (Fig. 2) and in their formulation (Section 2.3).
    • This method has a hyperparameter (alpha) that the authors compute automatically based on the data. The intuition behind this hyperparameter is also explained (paragraph above Section 2.3).

    3) Experiments. Very complete comparison. The proposed loss function is compared with previous loss functions on three neural networks (two of them very recent transformer-based networks) on two datasets.

  • Please list the main weaknesses of the paper. Please provide details, for instance, if you think a method is not novel, explain why and provide a reference to prior work.

    1) Small contradiction with previous work. My only concern is that, in the experiments, Boundary loss achieved a worse Hausdorff distance than Dice loss, which contradicts the original Boundary loss paper and the recent Region-Wise loss paper that also utilized the ACDC dataset.

  • Please rate the clarity and organization of this paper

    Very Good

  • Please comment on the reproducibility of the paper. Note, that authors have filled out a reproducibility checklist upon submission. Please be aware that authors are not required to meet all criteria on the checklist - for instance, providing code and data is a plus, but not a requirement for acceptance

    The reproducibility checklist agrees with what can be seen in the paper.

  • Please provide detailed and constructive comments for the authors. Please also refer to our Reviewer’s guide on what makes a good review: https://conferences.miccai.org/2023/en/REVIEWER-GUIDELINES.html
    • I think that the bottom-right figure does not really represent what is happening in reality with the denominator of the proposed Boundary DoU loss. My understanding of this figure is that, with \alpha < 1, there is “less” intersect(G, P) area. But I think that, in reality, the area is the same, and \alpha just has an effect on the magnitude of the gradients. I would then suggest visualizing the gradients for different values of alpha.
    • I recommend adding some discussion regarding the fact that Boundary loss achieved a worse Hausdorff distance than Dice loss.
    • Can alpha be negative? If not, instead of defining \alpha < 1, define it more precisely as $\alpha \in [0, 1)$.

    Minor suggestion to improve the paper

    • I think that the first sentence of Section 2.2. reads a bit odd. I would add a verb: “[…] union of the two boundaries […] actually highly correlated” -> “[…] union of the two boundaries […] are actually highly correlated” (or “correlate”).
  • Rate the paper on a scale of 1-8, 8 being the strongest (8-5: accept; 4-1: reject). Spreading the score helps create a distribution for decision-making

    7

  • Please justify your recommendation. What were the major factors that led you to your overall score for this paper?

    In my view, the number and importance of the main strengths of this paper outweigh the weaknesses. Specifically, the novelty, the clarity, and the number of experiments. Additionally, I would like to emphasize the honesty of the authors in showing that the proposed loss function looks similar to Dice loss when both are reformulated.

  • Reviewer confidence

    Very confident

  • [Post rebuttal] After reading the author’s rebuttal, state your overall opinion of the paper if it has been changed

    N/A

  • [Post rebuttal] Please justify your decision

    N/A



Review #3

  • Please describe the contribution of the paper

    This paper proposes a Boundary DoU loss for medical image segmentation. It handles the boundary without an erosion operation; instead, it takes the difference between the union and the intersection and divides it by a weighted difference to form the loss. This method brings improvements, especially on boundaries and small targets.

  • Please list the main strengths of the paper; you should write about a novel formulation, an original way to use data, demonstration of clinical feasibility, a novel application, a particularly strong evaluation, or anything else that is a strong aspect of this work. Please provide details, for instance, if a method is novel, explain what aspect is novel and why this is interesting.
    1. The performances are good compared with the baselines.
    2. The discussion about DoU loss and Dice loss is adequate.
  • Please list the main weaknesses of the paper. Please provide details, for instance, if you think a method is not novel, explain why and provide a reference to prior work.
    1. The name “difference over union” is inappropriate. Eq. 2 is closer to difference over weighted difference, and Eq. 4 can be called a weighted Dice loss.
    2. It’s weird to use the numerator and denominator with the same meaning (difference). In other words, it makes things complex. Compared with it, Eq. 4 is more fundamental but too simple.
    3. The key to this loss is C/S in Eq. 3. It reflects the degree of regularity of the object (a circle gives the smallest value, and irregular objects give larger values). So this loss is in fact a reweighting strategy according to shape. But this key reason is not discussed or explained. The explanation in terms of size is not correct, because if the shape is the same, large and small objects have the same α.
    4. Some typos lead to confusion, e.g., “Ground Truth with Boundary Area” in Fig. 1 should be “Ground Truth of Boundary Area”.
  • Please rate the clarity and organization of this paper

    Poor

  • Please comment on the reproducibility of the paper. Note, that authors have filled out a reproducibility checklist upon submission. Please be aware that authors are not required to meet all criteria on the checklist - for instance, providing code and data is a plus, but not a requirement for acceptance

    It’s easy to reproduce.

  • Please provide detailed and constructive comments for the authors. Please also refer to our Reviewer’s guide on what makes a good review: https://conferences.miccai.org/2023/en/REVIEWER-GUIDELINES.html

    This paper makes a simple thing complex and lacks an accurate explanation. It would be better to improve the paper from the perspective of sampling or reweighting.

  • Rate the paper on a scale of 1-8, 8 being the strongest (8-5: accept; 4-1: reject). Spreading the score helps create a distribution for decision-making

    3

  • Please justify your recommendation. What were the major factors that led you to your overall score for this paper?

    The name is inaccurate, and the numerator and denominator of the loss carry the same meaning (difference), which is weird. Besides, this loss can be simplified to a weighted Dice loss, but the adaptive weight is too simple and is explained incorrectly.

  • Reviewer confidence

    Very confident

  • [Post rebuttal] After reading the author’s rebuttal, state your overall opinion of the paper if it has been changed

    N/A

  • [Post rebuttal] Please justify your decision

    N/A




Primary Meta-Review

  • Please provide your assessment of this work, taking into account all reviews. Summarize the key strengths and weaknesses of the paper and justify your recommendation. In case you deviate from the reviewers’ recommendations, explain in detail the reasons why. In case of an invitation for rebuttal, clarify which points are important to address in the rebuttal.

    While this paper has received mixed scores, the general comments are mainly positive. R1 and R2 acknowledge the novelty of the proposed approach and its proper motivation, and stress the extensive experiments evaluating the presented loss function. Furthermore, the reviewers have raised some concerns that can further improve the manuscript. In particular, R2 questions the results reported for Boundary loss (which seem to contradict the original and other prior works) and requests a discussion of these results, R1 believes that the inclusion of the missing hyperparameters could improve reproducibility, and R3 has several concerns about the presentation of the formulation. Given the overall positive comments, and the fact that most of the raised concerns could be addressed in a minor revision, I recommend the acceptance of this work. Last, I strongly encourage the authors to consider the constructive feedback provided by the reviewers when preparing the camera-ready version of this work, as well as to share the code from this work with the community.




Author Feedback

N/A
