Publication | Closed Access
MM-GAN: 3D MRI Data Augmentation for Medical Image Segmentation via Generative Adversarial Networks
Citations: 42
References: 28
Year: 2020
Venue: Unknown
Artificial Intelligence, Medical Image Segmentation, Engineering, Machine Learning, Tumor Segmentation, Image Analysis, Data Science, MRI Augmentation, Generative Model, MRI Data Augmentation, Radiology, Synthetic Image Generation, Health Sciences, Data Augmentation, Medical Imaging, Neuroimaging, Computer Science, Medical Image Computing, Deep Learning, Generative Adversarial Network, Biomedical Imaging, Generative Adversarial Networks, Medical Image Analysis, Image Segmentation, 3D Imaging
The limited amount of labelled data hampers the training of deep architectures in medical imaging. Data augmentation is an effective way to extend the training set for medical image processing, but subjective intervention is inevitable in this process, in both pertinent and non-pertinent augmentation. In this paper, to simulate the distribution of real data and sample new data from the distribution of limited data to populate the training set, we propose a generative adversarial network based architecture for MRI augmentation and segmentation (MM-GAN), which translates label maps into 3D MR images without violating the pathology. Through a series of tumor segmentation experiments on the BRATS17 dataset, we validate the effectiveness of MM-GAN in data augmentation and anonymization. Our approach improves the Dice scores of the whole tumor and the tumor core by 0.17 and 0.16, respectively. With our method, only 29 real samples are needed to fine-tune a model trained on purely fake data to a performance comparable to training on real data, which demonstrates its potential for patient privacy protection. Furthermore, to verify the extensibility of MM-GAN, we collected the LIVER100 dataset. Experimental results on LIVER100 are similar to those on BRATS17, further validating the performance of our model.
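The core idea described above is a generator conditioned on a segmentation label map that emits a synthetic 3D MR volume. The sketch below is a minimal toy illustration of that conditioning step only, not the paper's actual MM-GAN architecture: the one-hot label volume is concatenated with noise and mixed by random weights standing in for a trained 3D CNN generator. All function names are hypothetical.

```python
import numpy as np

def one_hot(labels, num_classes):
    """Convert a 3D integer label map (D, H, W) to one-hot channels (C, D, H, W)."""
    return np.eye(num_classes, dtype=np.float32)[labels].transpose(3, 0, 1, 2)

def toy_generator(label_map, num_classes, noise_dim=4, seed=0):
    """Map a label volume plus noise to a synthetic single-channel volume.
    Placeholder for the GAN generator; a real model would be a trained 3D CNN
    optimized adversarially against a discriminator."""
    rng = np.random.default_rng(seed)
    cond = one_hot(label_map, num_classes)                      # (C, D, H, W)
    noise = rng.standard_normal((noise_dim,) + label_map.shape).astype(np.float32)
    x = np.concatenate([cond, noise], axis=0)                   # (C+noise_dim, D, H, W)
    # 1x1x1 "convolution": random per-channel mixing weights (untrained stand-in)
    w = rng.standard_normal(x.shape[0]).astype(np.float32)
    return np.tanh(np.tensordot(w, x, axes=1))                  # (D, H, W), values in [-1, 1]

# Toy label map with a small "tumor" region
label_map = np.zeros((8, 8, 8), dtype=np.int64)
label_map[2:5, 2:5, 2:5] = 1
fake_volume = toy_generator(label_map, num_classes=2)
```

Conditioning the generator on the label map is what lets new training pairs (synthetic image, known label) be sampled without altering the pathology encoded in the labels.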