IJITCS Vol. 8, No. 4, Apr. 2016
Cover page and Table of Contents: PDF (size: 190KB)
REGULAR PAPERS
Cloud computing is a new-era technology that relies entirely on the internet to host large applications, where data is shared over a single platform to provide better services to clients belonging to different organizations. It ensures maximum utilization of computational resources by making data, software and infrastructure available at lower cost in a secure, reliable and flexible manner. Although cloud computing offers many advantages, it also suffers from certain limitations: during load balancing of data in cloud data centers, the network faces problems of congestion, poor bandwidth utilization, fault tolerance and security. To address these issues, a new computing model called Fog Computing has been introduced, which transfers sensitive data to distributed devices without delay. Fog is similar to the cloud; the only difference is that it is located closer to end users, so it can process requests and respond to clients in less time. It also benefits real-time streaming applications, sensor networks and the Internet of Things, which need high-speed and reliable internet connectivity. Our proposed architecture introduces a new scheduling policy for load balancing in a Fog Computing environment, which completes real-time tasks within their deadlines, increases throughput and network utilization, and maintains data consistency with less complexity, meeting the present-day demands of end users.
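The abstract does not specify the scheduling policy in detail; the minimal Python sketch below illustrates one deadline-aware dispatch scheme of the kind described, assigning each task to the fog node that can finish it earliest and rejecting tasks whose deadlines cannot be met. All names, fields and numbers (Task, FogNode, mips, the sample workload) are illustrative assumptions, not the authors' implementation.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Task:
    deadline: float                        # latest acceptable completion time (s)
    length: float = field(compare=False)   # work in million instructions

@dataclass
class FogNode:
    mips: float                            # processing capacity
    queued: float = 0.0                    # work already assigned (million instructions)

def schedule(tasks, nodes):
    """Earliest-deadline-first dispatch: assign each task to the node that
    can finish it soonest; reject tasks whose deadline cannot be met."""
    assigned, rejected = [], []
    heapq.heapify(tasks)                   # order tasks by deadline
    while tasks:
        task = heapq.heappop(tasks)
        # estimated completion time on each node = (backlog + task) / speed
        node = min(nodes, key=lambda n: (n.queued + task.length) / n.mips)
        finish = (node.queued + task.length) / node.mips
        if finish <= task.deadline:
            node.queued += task.length
            assigned.append((task, node))
        else:
            rejected.append(task)
    return assigned, rejected

# tiny usage example with made-up numbers
tasks = [Task(deadline=1.0, length=400), Task(deadline=0.5, length=100)]
nodes = [FogNode(mips=1000), FogNode(mips=500)]
print(schedule(tasks, nodes))
```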
Sandboxing is a mechanism to monitor and control the execution of malicious or untrusted programs. The memory overhead incurred by sandbox solutions is one of the bottlenecks preventing most applications in a system from being sandboxed. Memory reclamation techniques proposed for traditional full virtualization do not suit the sandbox environment, because a sandbox lacks a full-scale guest operating system. In this paper, we propose a memory reclamation technique for sandboxed applications. The proposed technique works natively in the virtual machine monitor layer, without installing any driver in VMX non-root mode and without a new communication channel with the host kernel. The proposed page reclamation algorithm is a simple modification of the Least Recently Used and Working Set page reclamation algorithms. To efficiently collect the working set of an application, we use a hardware virtualization extension, Page Modification Logging, introduced by Intel. We implemented the proposed technique in an open-source sandbox to show the effectiveness of the proposed memory reclamation method. Experimental results show that the proposed technique reclaims up to 11% of memory from sandboxed applications with negligible CPU overhead.
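As a rough illustration of the described approach (not the authors' code), the sketch below approximates a working-set/LRU reclaimer driven by a per-epoch write log, which stands in for the page numbers reported by Intel's Page Modification Logging buffer. The class and parameter names (WorkingSetReclaimer, resident_limit, pml_log) are hypothetical.

```python
from collections import OrderedDict

class WorkingSetReclaimer:
    """Sketch of working-set/LRU page reclamation driven by a write log."""

    def __init__(self, resident_limit):
        self.resident_limit = resident_limit
        self.pages = OrderedDict()          # page number -> last epoch touched

    def record_epoch(self, epoch, pml_log):
        # Pages that appear in the write log were modified this epoch,
        # so move them to the most-recently-used end of the ordering.
        for pfn in pml_log:
            self.pages.pop(pfn, None)
            self.pages[pfn] = epoch

    def pages_to_reclaim(self):
        # Reclaim least-recently-written pages beyond the resident limit.
        excess = len(self.pages) - self.resident_limit
        victims = []
        for _ in range(max(0, excess)):
            pfn, _epoch = self.pages.popitem(last=False)   # oldest first
            victims.append(pfn)
        return victims

# usage with made-up page numbers
r = WorkingSetReclaimer(resident_limit=2)
r.record_epoch(1, [0x10, 0x11, 0x12])
r.record_epoch(2, [0x12])
print(r.pages_to_reclaim())   # -> [16], i.e. page 0x10, the least recently written
```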
In an e-learning environment, learning in a collaborative way is not always easy, partly because e-learning contents and learning paths are often not adapted to this type of learning. Online courses are frequently constructed in a way that does not stimulate interaction, cooperation and collaborative learning, which is why e-learning is often seen as individual and lonely. One way to reduce these problems and promote collaborative e-learning and learner-learner and learner-teacher interaction is to model these contents in the form of collaborative educational games.
The primary aim of this work is to exploit the potential of educational games to improve students' collaboration in e-learning environments. Thus, this paper presents a framework for designing, implementing and building collaborative educational games targeted at e-learning. The proposed framework is composed of two main phases: a game design phase, which proposes a collaborative design process for educational games, and a game development phase, which implements, packages, describes and delivers the games using the IMS-LD standard. The paper describes the steps followed for modeling the games, the framework architecture and the technical choices adopted. The final framework supports the creation and use of such games through one of the most popular learning tools of the web era: the LMS (Learning Management System).
Computational intelligence techniques such as artificial neural networks (ANN) and fuzzy inference systems (FIS) are strong tools for prediction and simulation in engineering applications. In this paper, a radial basis function (RBF) network and an adaptive neuro-fuzzy inference system (ANFIS) are used to predict IC50 (the 50% inhibitory concentration) values evaluated by the MTT assay in human cancer cell lines. In the proposed models, the input parameters are the drug concentration and the cell-line type, and the output is the IC50 value in the A549, H157, H460 and H1975 cell lines. The IC50 values predicted by the proposed RBF and ANFIS models are compared with the experimental data. The results show that both the RBF and ANFIS models achieve good agreement with the experimental data. Therefore, the proposed RBF and ANFIS models are useful, reliable, fast and inexpensive tools for predicting IC50 values determined by the MTT assay in human cancer cell lines.
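For readers unfamiliar with RBF networks, the following minimal sketch fits a Gaussian RBF regressor by least squares on synthetic stand-in data shaped like the described inputs (a drug concentration and a cell-line index). It is illustrative only; the paper's actual training procedure, centers and data are not reproduced here.

```python
import numpy as np

def rbf_features(X, centers, width):
    """Gaussian radial basis activations for each sample/center pair."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

def train_rbf(X, y, n_centers=10, width=1.0, seed=0):
    """Fit output weights by least squares; centers are sampled from the data
    (a common simplification; the paper may select centers differently)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_centers, replace=False)]
    H = rbf_features(X, centers, width)
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return centers, width, w

def predict_rbf(X, centers, width, w):
    return rbf_features(X, centers, width) @ w

# synthetic stand-in data: [drug concentration, cell-line index] -> response
X = np.column_stack([np.random.rand(50) * 10, np.random.randint(0, 4, 50)])
y = 5.0 + 0.3 * X[:, 0] + 0.5 * X[:, 1] + 0.1 * np.random.randn(50)
model = train_rbf(X, y, n_centers=10, width=2.0)
print(predict_rbf(X[:3], *model))
```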
Identifying strongly associated clusters in large complex networks has received increasing interest over the past decade. Community detection in complex networks is an NP-complete problem that requires clustering a network into communities of compactly linked nodes, such that the connections within a community are denser than the connections between communities. In this paper, different approaches proposed in the field of community detection are described, with each methodology classified according to its algorithm type, along with a comparative analysis of these approaches on the basis of NMI and Modularity for four real-world networks.
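The two evaluation measures mentioned, Modularity and NMI, can be computed as in the sketch below, which runs one representative detection algorithm (greedy modularity optimisation from networkx) on the Zachary karate-club benchmark and scores it with scikit-learn's NMI. The choice of algorithm and benchmark here is illustrative and not taken from the surveyed paper.

```python
import networkx as nx
from networkx.algorithms import community
from sklearn.metrics import normalized_mutual_info_score

# Zachary's karate club: a standard real-world benchmark with known ground truth.
G = nx.karate_club_graph()
true_labels = [G.nodes[n]["club"] for n in G]          # ground-truth split

# One representative detection algorithm (greedy modularity optimisation).
detected = community.greedy_modularity_communities(G)
pred_labels = [0] * G.number_of_nodes()
for cid, nodes in enumerate(detected):
    for n in nodes:
        pred_labels[n] = cid

# The two evaluation measures used in the comparison.
Q = community.modularity(G, detected)
nmi = normalized_mutual_info_score(true_labels, pred_labels)
print(f"modularity Q = {Q:.3f}, NMI = {nmi:.3f}")
```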
There have been many attempts to make authentication processes more robust, and biometric techniques are one of them. Biometric traits are unique to an individual, so their use can overcome many of the issues in conventional authentication processes. This paper makes a scrutinizing study of existing biometric techniques, their usage and the limitations pertaining to their deployment in real-time cases. It also deals with the motivation behind adopting biometrics in present-day scenarios. The paper also attempts to throw light on the technical and security-related issues pertaining to biometric systems.
The basic rough set theory introduced by Pawlak as a model to capture imprecision in data has been extended in many directions, and covering-based rough set models are among them. From the granular computing point of view, basic rough sets are unigranular by nature. Two extensions to the multigranular setting, called optimistic and pessimistic multigranulation, were introduced by Qian et al. in 2006 and 2010 respectively. Combining the two concepts of covering and multigranulation, covering-based multigranular models were introduced by Liu et al. in 2012. Extending the stringent concept of mathematical equality of sets, rough equalities were introduced by Novotny and Pawlak in 1985, and three more types of such approximate equalities were introduced by Tripathy in 2011. In this paper we study the approximate equalities introduced by Novotny and Pawlak from the pessimistic multigranular computing point of view and establish several of their properties. These concepts and properties are shown to be useful in approximate reasoning.
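For reference, the standard single-granulation Pawlak notions underlying these approximate equalities can be written as follows (the paper itself works with their pessimistic multigranular counterparts):

\[
\underline{R}X = \{x \in U : [x]_R \subseteq X\}, \qquad
\overline{R}X = \{x \in U : [x]_R \cap X \neq \emptyset\}
\]
\[
X =_B Y \iff \underline{R}X = \underline{R}Y, \qquad
X =_T Y \iff \overline{R}X = \overline{R}Y, \qquad
X \approx Y \iff (X =_B Y) \wedge (X =_T Y)
\]

where the three relations are read as bottom rough equality, top rough equality and (total) rough equality respectively.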
There is no fully automated way to completely understand the malicious intent of zero-day malware, and no single best approach to malware analysis, so existing static, dynamic and manual malware analysis techniques must be combined in a single unit. In this paper, a hybrid real-time analysis and reporting system is presented. The proposed system integrates various malware analysis tools and utilities in a component-based architecture and automatically provides detailed results about a zero-day malware's behavior. The ultimate goal of this analysis and reporting is to gain a quick and concise understanding of the malicious activity performed by zero-day malware while minimizing the time between the detection of a zero-day attack and the generation of a security solution. The results are of great value to a malware analyst performing zero-day malware detection and containment.
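A component-based pipeline of the kind described can be sketched as below, with each analysis component contributing one section of the report. The static component (hashing plus printable-string extraction) is a runnable toy; the dynamic component is only a placeholder, since a real system would execute the sample in an isolated sandbox. None of this reflects the authors' actual components or tool integrations.

```python
import hashlib, re, json, sys

def static_analysis(path):
    """Very small static component: cryptographic hash and printable strings."""
    data = open(path, "rb").read()
    strings = re.findall(rb"[ -~]{6,}", data)            # ASCII runs >= 6 chars
    return {"sha256": hashlib.sha256(data).hexdigest(),
            "strings": [s.decode() for s in strings[:20]]}

def dynamic_analysis(path):
    """Placeholder for a sandboxed run; a real system would launch the sample
    in an isolated VM and collect API/network traces here."""
    return {"behaviour": "not executed in this sketch"}

def analyse(path):
    # Component-based pipeline: each analyser contributes one report section.
    report = {"sample": path}
    for name, component in (("static", static_analysis),
                            ("dynamic", dynamic_analysis)):
        report[name] = component(path)
    return report

if __name__ == "__main__":
    print(json.dumps(analyse(sys.argv[1]), indent=2))
```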
This paper presents a secure and high-capacity watermarking technique using a novel Image Partitioning-Merging Scheme (IPMS). The IPMS is used as a reduction method to logically reduce the size of the watermark and to increase the security level of the proposed watermarking technique. The technique effectively uses special properties of the Discrete Wavelet Transform (DWT), Fast Walsh-Hadamard Transform (FWHT) and Singular Value Decomposition (SVD), and proves strongly robust against 14 noise addition and filtering attacks. The Fibonacci-Lucas Transform (FLT) is used as an effective scrambling technique to scramble the watermark, providing additional security in the embedding process. Many researchers have failed to achieve imperceptibility and robustness under high-capacity watermark embedding with strong security provision, because these quality parameters conflict with each other. The novel technique presented here achieves imperceptibility, high-capacity watermark embedding, security and robustness against 14 noise addition and filtering attacks. The technique is non-blind and is tested with grayscale cover images of size 512x512 and watermark images of size 512x512. The experimental results show that for the Lena image the technique gives an imperceptibility of 75.8446 dB, measured in terms of peak signal-to-noise ratio (PSNR), and robustness measured in terms of normalized correlation (NC) equal to 1, indicating exact recovery of the watermark. The method is found to be strongly robust against noise addition and filtering attacks compared to the existing watermarking methods under consideration.
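The two quality measures quoted above are standard; a minimal sketch of how they are typically computed is shown below (PSNR against the cover image, and one common normalized-correlation definition for the recovered watermark). The paper may use a slightly different NC formula, and the images here are random stand-ins.

```python
import numpy as np

def psnr(original, processed, peak=255.0):
    """Peak signal-to-noise ratio between cover and watermarked image (dB)."""
    mse = np.mean((original.astype(np.float64) - processed.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

def normalized_correlation(w, w_rec):
    """One common NC definition: correlation of the extracted watermark with
    the original, normalised by their energies (NC = 1 means exact recovery)."""
    w = w.astype(np.float64).ravel()
    w_rec = w_rec.astype(np.float64).ravel()
    return np.dot(w, w_rec) / (np.linalg.norm(w) * np.linalg.norm(w_rec))

# usage with random stand-in images
cover = np.random.randint(0, 256, (512, 512), dtype=np.uint8)
marked = np.clip(cover + np.random.randint(-2, 3, cover.shape), 0, 255).astype(np.uint8)
wm = np.random.randint(0, 2, (512, 512), dtype=np.uint8)
print(psnr(cover, marked), normalized_correlation(wm, wm))
```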
Web applications are used for various online services and are becoming ubiquitous in our daily lives. They serve multiple purposes such as e-commerce, financial services, email, healthcare and many other critical services. However, the presence of vulnerabilities in a web application can seriously compromise its security. A web application may contain different types of vulnerabilities; cross-site scripting (XSS) is one type of code injection attack, and according to the OWASP Top 10 vulnerability report it is among the top five vulnerabilities. This research work therefore aims to implement an effective solution for the prevention of cross-site scripting vulnerabilities. In this paper, we implement a novel client-side XSS sanitizer that protects web applications from XSS attacks. Our sanitizer is able to detect cross-site scripting vulnerabilities at the client side. It strengthens the web browser, because modern web browsers do not provide any specific notification, alert or indication of security holes or vulnerabilities present in a web application.
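The paper's sanitizer runs client-side inside the browser; as a language-neutral illustration of the basic idea (strip obvious script vectors, then escape what remains), here is a small Python sketch. The patterns and function names are illustrative assumptions, not the authors' filter rules.

```python
import html
import re

# Patterns commonly neutralised by XSS filters; this is only an illustrative sketch.
SCRIPT_TAG = re.compile(r"<\s*/?\s*script[^>]*>", re.IGNORECASE)
EVENT_HANDLER = re.compile(r"\son\w+\s*=", re.IGNORECASE)
JS_URI = re.compile(r"javascript\s*:", re.IGNORECASE)

def sanitize(user_input: str) -> str:
    """Strip obvious script vectors, then HTML-escape what remains."""
    cleaned = SCRIPT_TAG.sub("", user_input)
    cleaned = EVENT_HANDLER.sub(" ", cleaned)
    cleaned = JS_URI.sub("", cleaned)
    return html.escape(cleaned, quote=True)

print(sanitize('<script>alert(1)</script><img src=x onerror=alert(1)>'))
# -> escaped markup with the script tag and event handler removed
```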