Workplace: Widyatama University, Bandung, 40125, Indonesia
E-mail: ilham.gustisyaf@widyatama.ac.id
Research Interests: Computer systems and computational processes, Information Systems, Data Structures and Algorithms, Information-Theoretic Security
Biography
Ahmad Ilham Gustisyaf was born in Balikpapan on March 10, 1998, and lives in Bandung, West Java. He attended SDS Tamansiswa Jakarta (elementary school), SMPN 14 Jakarta (junior high school), and SMAN 1 Mimika (senior high school).
After graduating from high school in 2016, he enrolled at Widyatama University, majoring in Informatics. His current study concentration is information and technology.
By Ahmad Ilham Gustisyaf, Ardiles Sinaga
DOI: https://doi.org/10.5815/ijmecs.2021.04.05, Pub. Date: 8 Aug. 2021
Gender is a vital piece of information for identifying a person. If we can determine with confidence whether an individual is male or female, the search list is narrowed and the search time shortened. Fingerprint identification is an important and easy-to-apply collection method: the cost is low, and a dactyloscopy expert verifies the final result. Image classification is a problem in computer vision in which a computer mimics a person's ability to understand the information in an image. Image classification can be performed with deep learning, which resembles the way the brain works by reproducing part of its function with interconnected units analogous to neurons. A convolutional neural network (CNN) is one type of deep learning. In this research, gender is classified from fingerprints using a convolutional neural network. Three models are built to determine gender, trained and tested on a total of 49,270 images labeled with two categories, male and female. Of the three models, the one with the highest accuracy is used to build the application. The result of this research is that Model2 is selected as the CNN model, with an accuracy of 99.9667%.
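The abstract describes a CNN that classifies grayscale fingerprint images into two gender categories. Below is a minimal sketch of such a binary-classification pipeline, assuming Keras/TensorFlow; the layer sizes, input resolution, and directory layout are illustrative assumptions and do not reproduce the paper's Model2 architecture or its reported accuracy.

# Minimal binary-classification CNN sketch for fingerprint images
# (assumed architecture and data layout, not the paper's Model2).
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (96, 96)  # assumed input resolution

def build_model():
    # Simple convolution/pooling stack ending in a sigmoid unit
    # that outputs the probability of one of the two classes.
    model = models.Sequential([
        layers.Input(shape=(*IMG_SIZE, 1)),      # grayscale fingerprint
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(1, activation="sigmoid"),   # male / female
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Hypothetical directory layout: data/train/{male,female}, data/test/{male,female}
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, color_mode="grayscale",
    label_mode="binary", batch_size=32)
test_ds = tf.keras.utils.image_dataset_from_directory(
    "data/test", image_size=IMG_SIZE, color_mode="grayscale",
    label_mode="binary", batch_size=32)

model = build_model()
model.fit(train_ds, validation_data=test_ds, epochs=10)

In this sketch the two classes are inferred from the subdirectory names; the paper's approach of training three candidate models and keeping the most accurate one would correspond to repeating this training with different architectures and comparing their validation accuracy.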