Multi-Class Classification of Skin Cancer Images Using a Deep Learning-Based Convolutional Neural Network (CNN)
DOI:
https://doi.org/10.63318/waujpasv3i2_29
Keywords:
Convolutional neural network, Deep learning, Multi-class classification, Random oversampling, Skin cancer
Abstract
Skin cancer is one of the most common and serious types of cancer, often resulting from the transformation of melanocytes due to excessive exposure to ultraviolet radiation. Early diagnosis significantly improves survival rates and reduces mortality. However, traditional visual diagnosis faces challenges in distinguishing between the seven main categories of skin conditions due to visual similarities and data imbalance. This study aims to develop an automated diagnostic model based on a Convolutional Neural Network (CNN) to enhance classification accuracy and reduce reliance on human evaluation. The HAM10000 dataset (Human Against Machine with 10,000 Training Images), comprising 10,015 images across seven categories, was used, with class imbalance addressed through Random Oversampling. The data were split into 80% for training and 20% for testing, with 20% of the training set reserved for validation. The model was constructed using sequential Conv2D and MaxPooling layers for feature extraction, followed by Flatten and Dense layers for final classification. Training was performed with the Adam optimizer for adaptive weight adjustment and the Categorical Crossentropy loss function suitable for multi-class classification. The model achieved an overall accuracy of 95.45% and demonstrated strong performance across all categories based on Precision, Recall, and F1-Score, with Precision ranging from 84% to 99% and a weighted average of 96% for both Precision and F1-Score. These results highlight the proposed model’s clinical significance as an intelligent decision-support tool, enhancing early diagnosis, reducing human error, and improving healthcare quality.
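The pipeline described above (random oversampling of the minority classes, an 80/20 train/test split with 20% of the training data held out for validation, a sequential Conv2D/MaxPooling feature extractor followed by Flatten and Dense layers, Adam optimization, and categorical cross-entropy loss) can be illustrated with a minimal Keras sketch. This is not the authors' code: the input resolution (64x64x3), filter counts, layer depths, and the use of imbalanced-learn's RandomOverSampler are assumptions made for illustration only.

```python
# Minimal sketch of the training pipeline described in the abstract.
# Assumptions (not from the paper): 64x64x3 inputs, 32/64 filters, 128-unit
# dense layer, imblearn's RandomOverSampler, and placeholder data arrays.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
from sklearn.model_selection import train_test_split
from imblearn.over_sampling import RandomOverSampler

NUM_CLASSES = 7  # seven HAM10000 lesion categories

def build_model(input_shape=(64, 64, 3)):
    """Sequential Conv2D/MaxPooling feature extractor + Flatten/Dense classifier."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    # Adam optimizer and categorical cross-entropy, as stated in the abstract.
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Placeholder data standing in for the 10,015 HAM10000 images.
X = np.random.rand(200, 64, 64, 3).astype("float32")
labels = np.random.randint(0, NUM_CLASSES, 200)

# 80% training / 20% testing split.
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, stratify=labels, random_state=42)

# Random oversampling of the training set to balance the seven classes.
# RandomOverSampler expects a 2-D feature matrix, so images are flattened
# for resampling and reshaped back afterwards (one common approach).
ros = RandomOverSampler(random_state=42)
X_res, y_res = ros.fit_resample(X_train.reshape(len(X_train), -1), y_train)
X_train = X_res.reshape(-1, 64, 64, 3)

# One-hot encode labels for categorical cross-entropy.
y_train = tf.keras.utils.to_categorical(y_res, NUM_CLASSES)
y_test = tf.keras.utils.to_categorical(y_test, NUM_CLASSES)

# Train with 20% of the training data reserved for validation, then evaluate.
model = build_model()
model.fit(X_train, y_train, validation_split=0.2, epochs=1, batch_size=32)
print(model.evaluate(X_test, y_test))
```

In practice the placeholder arrays would be replaced by the preprocessed HAM10000 images and their seven-class labels, and the network would be trained for many more epochs before reporting precision, recall, and F1-score per class.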
License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
This journal uses the Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0), which permits use, sharing, adaptation, distribution, and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. To view a copy of this license, visit https://creativecommons.org/licenses/by-nc/4.0/.
Copyright of articles
Authors retain copyright of their articles published in this journal.