Jasman Pardede

Work place: Department of Informatics Engineering, Institut Teknologi Nasional Bandung, Bandung, Indonesia

E-mail: jasman@itenas.ac.id


Biography

Jasman Pardede received a Bachelor of Science degree in Mathematics from Universitas Andalas (Unand), Indonesia, in 2001, and an M.Eng. degree in Informatics Engineering from Institut Teknologi Bandung (ITB), Bandung, Indonesia, in 2005. Since 2005, he has been a lecturer at Institut Teknologi Nasional Bandung. He has been pursuing doctoral studies at Institut Teknologi Bandung (ITB), Bandung, Indonesia, since 2016, in the field of image retrieval.

Author Articles
Implementation of Transfer Learning Using VGG16 on Fruit Ripeness Detection

By Jasman Pardede, Benhard Sitohang, Saiful Akbar, Masayu Leylia Khodra

DOI: https://doi.org/10.5815/ijisa.2021.02.04, Pub. Date: 8 Apr. 2021

In previous studies, researchers determined fruit ripeness classification using feature descriptors based on color features (RGB, GSL, HSV, and L*a*b*). However, the experimental performance remained suboptimal, with a maximum accuracy of only 76%. Today, transfer learning techniques have been applied successfully in many real-world applications. For this reason, the researchers propose a transfer learning technique using the VGG16 model. The proposed architecture uses VGG16 without its top layer; the top layer is replaced by a Multilayer Perceptron (MLP) block consisting of a Flatten layer, a Dense layer, and a regularizer, with a softmax activation function at the output. Three regularizers are considered in the MLP block: Dropout, Batch Normalization, and kernel regularizers, all chosen to reduce overfitting. The proposed architecture was evaluated on a fruit ripeness dataset created by the researchers. The experimental results show that the proposed architecture performs better, and that the choice of regularizer strongly influences system performance. The best performance was obtained with the MLP block using Dropout of 0.5, which increased accuracy by 18.42%; Batch Normalization and kernel regularizers increased accuracy by 10.52% and 2.63%, respectively. This study shows that deep learning with transfer learning consistently outperforms machine learning with traditional feature extraction for fruit ripeness detection, and indicates that Dropout is the most effective technique for reducing overfitting in this transfer learning setting.
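As a rough illustration of the kind of architecture the abstract describes (a frozen VGG16 base with its top layer replaced by an MLP block ending in softmax), a minimal Keras sketch is given below. This is not the authors' exact configuration: the input size, the width of the Dense layer, and the number of ripeness classes are assumptions made for the example; only the use of VGG16 without the top layer, the Flatten/Dense structure, the Dropout rate of 0.5, and the softmax output follow the abstract.

# Minimal sketch (assumed parameters): VGG16 without its top layer,
# followed by an MLP block with a Dropout regularizer and a softmax output.
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

NUM_CLASSES = 3  # assumed number of ripeness classes (e.g., unripe, ripe, overripe)

# Pre-trained convolutional base; include_top=False removes VGG16's original classifier.
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # transfer learning: freeze the convolutional layers

model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),       # assumed width of the Dense layer
    layers.Dropout(0.5),                        # the Dropout rate reported as best
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

The same MLP block could swap the Dropout layer for BatchNormalization or a kernel_regularizer on the Dense layer, which correspond to the other two regularization options compared in the paper.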
