Work place: Department of Information Science & Engineering, NMAM Institute of Technology, Nitte, India
E-mail: abhishekrao@nitte.edu.in
Website:
Research Interests: Computational Learning Theory, Computer Vision, Computer Architecture and Organization, Data Structures and Algorithms
Biography
Abhishek S. Rao received his B.E. degree in Information Science & Engineering from Canara Engineering College, Bantwal, his M.Tech. degree in Computer Science and Engineering from NMAM Institute of Technology, Nitte (Visvesvaraya Technological University), India, and his M.B.A. degree in Operations Management from MIT, Pune. His major research interests are in the fields of Machine Learning, Deep Learning, and Computer Vision. He has 2 years of industrial experience and 9 years of teaching experience. He is currently working as an Assistant Professor in the Department of Information Science & Engineering at NMAM Institute of Technology, Nitte. He is a member of ISTE and IAENG.
By Rajgopal K T, Abhishek S. Rao, Ramaprasad Poojary, Deepak D
DOI: https://doi.org/10.5815/ijmecs.2023.06.04, Pub. Date: 8 Dec. 2023
In the recent era, demand for cloud computing has surged significantly because of its versatile applications in real-time scenarios. Cloud computing efficiently tackles large-scale computing challenges and offers a cost-effective, energy-efficient solution for cloud service providers (CSPs). However, the growth in task requests has overloaded cloud servers, degrading performance. Load balancing has emerged as a favorable remedy, in which incoming tasks are allocated to the most appropriate virtual machine (VM) according to their specific needs. Finding the optimal VM, however, is an NP-hard problem, and recent research has therefore widely adopted meta-heuristic approaches to solve it. This research introduces a novel hybrid optimization approach that integrates the particle swarm optimization (PSO) algorithm to handle optimization, the gravitational search algorithm (GSA) to improve the search process, and fuzzy logic to build an effective rule base for selecting VMs efficiently. The integration of PSO and GSA streamlines the updating of particle velocity and position, while fuzzy logic helps discern the optimal solution for each task. We assess the efficacy of the proposed method through metrics including throughput, makespan, and execution time. The proposed method performs commendably, with average load, turnaround time, and response time of 0.168, 18.20 milliseconds, and 11.26 milliseconds, respectively. It further achieves an average makespan of 92.5 milliseconds and an average throughput of 85.75. Compared with existing techniques, its performance improves by 90.5%, 64.9%, 36.11%, 24.72%, 18.27%, 11.36%, and 5.21%. The results demonstrate the efficacy of this approach through significant improvements in execution time, CPU utilization, makespan, and throughput, providing a valuable contribution to the field of cloud computing load balancing.
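As an illustration only of how PSO and GSA are commonly hybridized (the exact update rule, coefficients, and fuzzy VM-selection rules used in this paper are not reproduced here, so every name and constant below is an assumption), a minimal Python sketch of one velocity/position update:

import numpy as np

def hybrid_pso_gsa_step(positions, velocities, fitness, gbest, t, max_iter,
                        w=0.7, c1=0.5, c2=1.5, g0=100.0, eps=1e-9):
    """One illustrative hybrid update: a GSA acceleration term drives exploration,
    while the PSO global-best term drives exploitation (minimization assumed)."""
    n, d = positions.shape
    g = g0 * np.exp(-20.0 * t / max_iter)            # decaying gravitational constant
    worst, best = fitness.max(), fitness.min()
    m = (fitness - worst) / (best - worst + eps)     # raw masses from fitness
    masses = m / (m.sum() + eps)                     # normalized masses
    accel = np.zeros_like(positions)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            diff = positions[j] - positions[i]
            dist = np.linalg.norm(diff) + eps
            accel[i] += np.random.rand() * g * masses[j] * diff / dist
    velocities = (w * velocities
                  + c1 * np.random.rand(n, d) * accel
                  + c2 * np.random.rand(n, d) * (gbest - positions))
    positions = positions + velocities
    return positions, velocities

In this style of hybrid, each candidate VM assignment is encoded as a particle position, and a separate rule base (here, the paper's fuzzy logic) would score the resulting solutions.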
By Nagesh Shenoy H, K. R. Anil Kumar, Suchitra N Shenoy, Abhishek S. Rao, Rajgopal K T
DOI: https://doi.org/10.5815/ijwmt.2021.05.02, Pub. Date: 8 Oct. 2021
The demand for cloud computing systems has increased tremendously in the IT sector and in various business applications because of their high computational power and cost-effective solutions to many computing problems. This increased demand has raised several challenges in cloud systems, such as load balancing and security. Numerous approaches have been presented for load balancing, but providing security and maintaining integrity and privacy remain less explored research areas. Intrusion detection systems have emerged as a promising solution for predicting attacks. In this work, we develop a deep learning-based scheme comprising data pre-processing, convolution operations, a BiLSTM model, an attention layer, and CRF modeling. The study detects intrusions based on attackers' historical behavior: deep learning layers extract features from the image representation, determine the significance of dense packets, and generate salient fine-grained features for detecting malicious traffic, and the final classification is produced from the fused features.
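A minimal, self-contained sketch of a convolution-BiLSTM-attention pipeline of the kind described above, assuming PyTorch; the pre-processing and CRF stages of the paper are omitted, and all layer sizes and names below are hypothetical:

import torch
import torch.nn as nn

class TrafficClassifier(nn.Module):
    """Illustrative conv -> BiLSTM -> attention -> classifier pipeline."""
    def __init__(self, in_features, conv_channels=64, hidden=128, num_classes=2):
        super().__init__()
        self.conv = nn.Conv1d(in_features, conv_channels, kernel_size=3, padding=1)
        self.bilstm = nn.LSTM(conv_channels, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)          # scores each time step
        self.classifier = nn.Linear(2 * hidden, num_classes)

    def forward(self, x):                              # x: (batch, seq_len, in_features)
        h = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)   # local packet features
        h, _ = self.bilstm(h)                          # bidirectional context per time step
        weights = torch.softmax(self.attn(h), dim=1)   # attention over time steps
        context = (weights * h).sum(dim=1)             # fused, attention-weighted feature
        return self.classifier(context)                # class logits per flow

# Hypothetical usage: a batch of 8 flows, 50 packets each, 40 features per packet
model = TrafficClassifier(in_features=40)
logits = model(torch.randn(8, 50, 40))

In the paper's scheme, a CRF layer would sit on top of the per-step features to model label dependencies; here a simple attention-pooled classifier stands in for that stage.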
DOI: https://doi.org/10.5815/ijitcs.2017.12.03, Pub. Date: 8 Dec. 2017
Overcharging passengers who are unfamiliar with the city for services at railway stations and airports is a serious problem, which can be addressed by installing prepaid counters. Prepaid counters, however, face high passenger frequency, which increases waiting time. The only practical alternative is to form a queue for effective service. In most cases only one counter is available, so the queue length grows; passengers may lose patience and leave, causing a loss to the counter. In this paper, a queuing model is used to solve this real-world scenario in an optimal way. The experimental data were obtained from the counter and analyzed using Little's Theorem and the single-server queuing model (M/M/1). An attempt was thus made to study the benefits of the queuing model and to optimize it so as to minimize waiting time and thereby increase profit. This approach will help a busy prepaid autorickshaw counter serve passengers in the most feasible way.
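For readers unfamiliar with the model, a minimal sketch of the standard M/M/1 and Little's Theorem relationships the analysis relies on; the arrival and service rates in the example are made up, not the counter's measured data:

def mm1_metrics(arrival_rate, service_rate):
    """Standard M/M/1 measures; requires arrival_rate < service_rate for stability."""
    if arrival_rate >= service_rate:
        raise ValueError("M/M/1 queue is unstable: lambda must be < mu")
    rho = arrival_rate / service_rate            # server utilization
    L = rho / (1 - rho)                          # mean customers in the system
    W = 1 / (service_rate - arrival_rate)        # mean time in system (Little: L = lambda * W)
    Lq = rho * L                                 # mean customers waiting in the queue
    Wq = Lq / arrival_rate                       # mean waiting time in the queue
    return {"utilization": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

# Hypothetical example: 20 passengers/hour arriving, counter serving 25/hour
print(mm1_metrics(20, 25))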