Defective sewing stitch semantic segmentation using DeeplabV3+ and EfficientNet
DOI: https://doi.org/10.4114/intartif.vol25iss70pp64-76

Keywords: Computer Vision, Semantic Segmentation, Convolutional Neural Networks, DeepLabV3+, EfficientNet

Abstract
Defective stitch inspection is an essential part of quality assurance in garment manufacturing. Traditional mechanical defect detection systems are effective, but they typically rely on handcrafted features and require human operation. Deep learning approaches have recently demonstrated exceptional performance across a wide range of computer vision applications. The need for precise detail evaluation, combined with the small size of the stitch patterns, makes identification particularly difficult; image segmentation (semantic segmentation) was therefore employed for this task. Semantic segmentation is a vital research topic in computer vision and is indispensable in many real-world applications. It labels every pixel in an image, in direct contrast to classification, which assigns a single label to the entire image; multiple objects of the same class are treated as a single entity. The proposed technique is the DeepLabV3+ encoder-decoder architecture, with EfficientNet models (B0-B2) applied as encoders in the experiments. The encoder extracts feature maps from the input image, and the decoder uses this information to upsample and reconstruct the output. The best model, DeepLabV3+ with EfficientNetB1, segments defective sewing stitches with superior performance (Mean IoU: 94.14%).
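The Mean IoU figure reported above averages, over all classes, the per-class Intersection-over-Union between the predicted and ground-truth pixel labels. As a minimal illustrative sketch (not the authors' evaluation code), the metric can be computed over flattened label arrays like this; the function name and inputs are hypothetical:

```python
def mean_iou(y_true, y_pred, num_classes):
    """Mean Intersection-over-Union over per-class IoU scores.

    y_true, y_pred: flat sequences of per-pixel class indices.
    Classes absent from both prediction and ground truth are skipped.
    """
    ious = []
    for c in range(num_classes):
        # Per-class confusion counts: true/false positives, false negatives.
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        denom = tp + fp + fn  # union of predicted and true pixels for class c
        if denom:
            ious.append(tp / denom)
    return sum(ious) / len(ious)


# Toy example: 4 pixels, 2 classes (e.g. background vs. defective stitch).
print(mean_iou([0, 0, 1, 1], [0, 1, 1, 1], num_classes=2))  # → 0.5833...
```

Here class 0 has IoU 1/2 and class 1 has IoU 2/3, so the mean is 7/12. In practice, frameworks such as Keras (`tf.keras.metrics.MeanIoU`) provide an equivalent metric computed from a confusion matrix.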
Copyright (c) 2022 Iberamia & The Authors
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Inteligencia Artificial (Ed. IBERAMIA)
ISSN: 1988-3064 (on line).