JOIG 2022 Vol.10(4): 166-171
doi: 10.18178/joig.10.4.166-171

A Comparison of Applying Image Processing and Deep Learning in Acne Region Extraction

Chengrui Zhang 1, Guangyao Huang 1, Kai Yao 1, Mark Leach 1, Jie Sun 1, Kaizhu Huang 2, Xiaoyun Zhou 3, and Liqiong Yuan 3
1. School of Advanced Technology, Xi’an Jiaotong-Liverpool University, Suzhou, China
2. Institute of Applied Physical Sciences and Engineering, Duke Kunshan University, Kunshan, China
3. Suzhou Hospital of Traditional Chinese Medicine, Suzhou, China

Abstract—Quantifying acne in face images is considered a challenging task due to complex skin surfaces, irregular edges, and the diverse appearance of acne lesions. A key step is to precisely segment lesion areas in captured images under varying imaging conditions, e.g., illumination, skin condition, and imaging angle. To process such acne data, either theory-driven image processing methods or data-driven deep learning (DL) methods are commonly used in practice. To investigate the advantages and shortcomings of these two approaches to acne quantification, we develop an image processing pipeline and compare it with state-of-the-art DL methods, including SegFormer, UNETR, Swin-UNet, and TransUNet, on a small dataset. The quantitative comparison shows that TransUNet performs better in terms of precision, recall, F1-score, and accuracy, whilst image processing methods retain practical potential because they are annotation-free.
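
For readers unfamiliar with the evaluation terms in the abstract, the following is a minimal sketch of how pixel-wise precision, recall, F1-score, and accuracy can be computed for binary segmentation masks. It uses generic definitions and illustrative dummy masks; the function name, mask shapes, and the evaluation protocol are assumptions for illustration, not the authors' exact implementation.

    import numpy as np

    def segmentation_metrics(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-8):
        """Pixel-wise metrics for binary masks (1 = acne lesion, 0 = background)."""
        pred = pred.astype(bool)
        gt = gt.astype(bool)

        tp = np.logical_and(pred, gt).sum()      # lesion pixels correctly segmented
        fp = np.logical_and(pred, ~gt).sum()     # background pixels marked as lesion
        fn = np.logical_and(~pred, gt).sum()     # lesion pixels that were missed
        tn = np.logical_and(~pred, ~gt).sum()    # background correctly left out

        precision = tp / (tp + fp + eps)
        recall = tp / (tp + fn + eps)
        f1 = 2 * precision * recall / (precision + recall + eps)
        accuracy = (tp + tn) / (tp + fp + fn + tn + eps)
        return precision, recall, f1, accuracy

    if __name__ == "__main__":
        # Hypothetical ground truth and a slightly shifted prediction.
        gt = np.zeros((64, 64), dtype=np.uint8)
        gt[20:40, 20:40] = 1
        pred = np.zeros_like(gt)
        pred[22:42, 22:42] = 1
        p, r, f1, acc = segmentation_metrics(pred, gt)
        print(f"precision={p:.3f} recall={r:.3f} f1={f1:.3f} accuracy={acc:.3f}")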

Index Terms—deep learning, image processing, neural network, image segmentation method

Cite: Chengrui Zhang, Guangyao Huang, Kai Yao, Mark Leach, Jie Sun, Kaizhu Huang, Xiaoyun Zhou, and Liqiong Yuan, "A Comparison of Applying Image Processing and Deep Learning in Acne Region Extraction," Journal of Image and Graphics, Vol. 10, No. 4, pp. 166-171, December 2022.

Copyright © 2022 by the authors. This is an open access article distributed under the Creative Commons Attribution-NonCommercial-NoDerivatives License (CC BY-NC-ND 4.0), which permits use, distribution, and reproduction in any medium, provided that the article is properly cited, the use is non-commercial, and no modifications or adaptations are made.