International Journal of Modern Education and Computer Science (IJMECS)

IJMECS Vol. 6, No. 6, Jun. 2014

Cover page and Table of Contents: PDF (size: 590KB)

Table Of Contents

REGULAR PAPERS

An Investigation into Mobile Application Development Processes: Challenges and Best Practices

By Harleen K. Flora, Xiaofeng Wang, Swati V. Chande

DOI: https://doi.org/10.5815/ijmecs.2014.06.01, Pub. Date: 8 Jun. 2014

The mobile device market has witnessed swift industrial growth over the last decade. The rapid expansion of this new computing platform has almost outpaced the software engineering processes tailored to mobile application development. However, there is still a lack of novel research initiatives around the mobile application development process. There remains a deficiency in development standards and best practices, which exposes mobile devices to potential attacks. This deficiency needs to be addressed promptly and requires further work.
The objective of this research is to better understand the methodologies currently adopted and to investigate the challenges faced during mobile application development that differ from traditional enterprise application development. For this purpose, an online survey was conducted among the mobile research and development community. The survey questions covered the entire mobile application development lifecycle, starting with requirements and ending with bringing a complete mobile application to life.
The study contributes towards a greater understanding of the mobile application development process, examines the real challenges confronted, and investigates the best practices that can be implemented to enhance, evaluate, and improve the performance of the mobile application development process. These findings also indicate the breadth of research requirements and prospects in mobile computing and can serve as possible research topics.

Studies on ICT Usage in the Academic Campus Using Educational Data Mining

By Ajay Auddy, Sripati Mukhopadhyay

DOI: https://doi.org/10.5815/ijmecs.2014.06.02, Pub. Date: 8 Jun. 2014

In the era of competition, change and complexity, innovation in teaching and learning practices in the higher education sector has become an unavoidable criterion. One of the biggest challenges that the higher education system faces today is to assess the services provided through the Information and Communication Technology (ICT) facilities installed on campus. This paper studies the responses collected through a survey on ICT, conducted on the campus of the University of Burdwan among students and research scholars, with the help of an effective data mining methodology, the Variable Consistency Dominance-based Rough Set Approach (VC-DRSA), to extract meaningful knowledge and improve the quality of managerial decisions in this sphere. VC-DRSA is an extended version of the Dominance-based Rough Set Approach (DRSA) and is applied here to generate a set of recommendations that can help the university improve the existing services and extend the boundaries of ICT in future development.
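To give a flavour of the dominance-based machinery behind VC-DRSA (not the paper's actual analysis), the minimal sketch below computes a variable-consistency lower approximation over tiny made-up ordinal survey records; the criteria, class labels and consistency level are all assumptions for illustration.

```python
# Each record: ordinal criterion scores (higher is better) and an overall
# satisfaction class (2 = satisfied, 1 = unsatisfied). Hypothetical data.
records = [
    # (wifi, lab_access, e_resources) -> class
    ((3, 2, 3), 2),
    ((2, 2, 2), 2),
    ((1, 1, 2), 1),
    ((3, 3, 3), 2),
    ((1, 2, 1), 1),
]

def dominates(x, y):
    """x dominates y if x scores at least as well on every criterion."""
    return all(a >= b for a, b in zip(x, y))

def vc_lower_approx(records, cls, consistency=1.0):
    """Variable-consistency lower approximation of the union 'class >= cls'.

    With consistency = 1.0 this is plain DRSA; values below 1.0 tolerate a
    fraction of inconsistent dominating objects, as in VC-DRSA.
    """
    result = []
    for x, c in records:
        if c < cls:
            continue
        dominating = [cy for y, cy in records if dominates(y, x)]
        ratio = sum(cy >= cls for cy in dominating) / len(dominating)
        if ratio >= consistency:
            result.append(x)
    return result

print(vc_lower_approx(records, cls=2, consistency=0.9))
```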

Mathematical Modeling and Analysis of Network Service Failure in Data Centre

By Malik Usman Dilawar, Faiza Ayub Syed

DOI: https://doi.org/10.5815/ijmecs.2014.06.04, Pub. Date: 8 Jun. 2014

The world has become a global village. With the advent of technology, the concept of cloud computing has evolved considerably. Cloud computing offers various benefits in terms of storage, computation, cost and flexibility. It focuses on delivering a combination of technological components, such as applications, platforms, infrastructure, security and web-hosted services, over the internet. One of the major elements of cloud computing infrastructure is the data centre. Companies host their applications and services online through data centres, whose probability of downtime is expected to be very low. Since a data centre consists of a large number of servers, the rate of service failure is usually high. In this paper we analyse the service failure rate of a conventional data centre. A fault trend of network failures was derived by assuming their occurrence to be a Poisson process. Accurate prediction of the fault rate helps in managing the upgrading, replacement and other administrative issues of data centre components.
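As a rough illustration of the modelling idea described above (not the paper's analysis), the sketch below fits a Poisson rate to hypothetical monthly failure counts and predicts the chance of a quiet month; the counts and the one-month interval are assumptions.

```python
import math

def poisson_pmf(k, lam):
    """Probability of exactly k failures in one interval at rate lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Hypothetical monthly network-failure counts observed in a data centre.
monthly_failures = [3, 5, 2, 4, 6, 3, 4, 5]

# The maximum-likelihood estimate of the Poisson rate is the sample mean.
lam = sum(monthly_failures) / len(monthly_failures)

# Predicted probability of at most 2 failures in the next month.
p_quiet = sum(poisson_pmf(k, lam) for k in range(3))
print(f"estimated rate = {lam:.2f} failures/month")
print(f"P(<= 2 failures next month) = {p_quiet:.3f}")
```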

An Automatic Approach to Detect Software Anomalies in Cloud Computing Using Pragmatic Bayes Approach

By Nethaji V, Chandrasekar C

DOI: https://doi.org/10.5815/ijmecs.2014.06.05, Pub. Date: 8 Jun. 2014

Software anomaly detection is a vital element of operations in data centres and service clouds. Statistical Process Control (SPC) cloud charts sense routine anomalies, and their root causes are identified based on a differential profiling strategy. By automating these tasks, most of the manual overhead incurred in detecting software anomalies, as well as the analysis time, is greatly reduced, but a detailed analysis of the profiling data is not performed in most cases. On the other hand, the cloud scheduler weighs both the requirements of the user and the available infrastructure in order to match them. The OpenStack prototype works on cloud trust management, which provides the scheduler, but complexity arises when hosting the cloud system. At the same time, the Trusted Computing Base (TCB) of a computing node does not achieve the scalability measure. This unique paradigm brings about many software anomalies, which have not been well studied. In this work, a Pragmatic Bayes (PB) approach studies the problem of detecting software anomalies and ensures scalability by comparing information at the current time to historical data. In particular, the PB approach uses a two-component Gaussian mixture to model deviations at the current time in the cloud environment. The introduction of the Gaussian mixture in the PB approach achieves a higher scalability measure, which involves supervising a massive number of cells, and is fast enough to be potentially useful in many streaming scenarios. Whereas previous work on scheduling often lacks scalability, this paper shows the superiority of the method using a Bayes per-section error rate procedure through simulation, and provides a detailed analysis of the profiling data in the marginal distributions using the Amazon EC2 dataset. Extensive performance analysis shows that the PB approach is highly efficient in terms of runtime, scalability, software anomaly detection ratio, CPU utilization, density rate, and computational complexity.
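To make the two-component mixture idea concrete, here is a minimal sketch (not the authors' implementation): it fits a Gaussian mixture to a hypothetical baseline metric and flags current observations whose likelihood falls below a historical percentile. The metric, sample sizes and threshold are all assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical per-interval metric (say, response latency in ms) from a
# cloud node: a historical baseline plus a few injected anomalous spikes.
baseline = rng.normal(loc=100.0, scale=5.0, size=(500, 1))
current = np.vstack([rng.normal(100.0, 5.0, size=(50, 1)),
                     rng.normal(160.0, 5.0, size=(3, 1))])

# Fit a two-component Gaussian mixture to the historical data.
gmm = GaussianMixture(n_components=2, random_state=0).fit(baseline)

# Score current observations and flag those whose log-likelihood falls
# below the 1st percentile of the historical scores.
threshold = np.percentile(gmm.score_samples(baseline), 1)
scores = gmm.score_samples(current)
print("anomalous intervals:", np.where(scores < threshold)[0])
```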

Dynamic Effort Allocation Problem Using Genetic Algorithm Approach

By Md. Nasar, Prashant Johri, Udayan Chanda

DOI: https://doi.org/10.5815/ijmecs.2014.06.06, Pub. Date: 8 Jun. 2014

Effort distribution plays a major role in the software engineering field. Because fixed-price projects are becoming common today, the process of effort estimation becomes crucial to keeping within the agreed budget. In the last 10 years, numerous software reliability growth models (SRGMs) have been developed, but the majority of them rest on static assumptions. The basic goal of this article is to explore an optimal resource allocation plan that minimizes the software cost throughout the testing phase and the operational phase under dynamic conditions, using a genetic algorithm technique. This article also studies the resource allocation problem optimally for various conditions by investigating the behaviour of the model parameters, and suggests policies for the optimal release time of the software in the marketplace.
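The toy sketch below illustrates the genetic-algorithm style of search described above; it is not the paper's model. It distributes a fixed testing effort across hypothetical modules so as to minimize an assumed exponential residual-fault cost; the fault densities, decay rate and GA parameters are all illustrative.

```python
import math
import random

random.seed(42)

# Hypothetical per-module expected fault counts; more testing effort on a
# module drives its residual faults down exponentially (an assumed
# SRGM-like decay, not the paper's model).
fault_density = [30, 45, 25, 60]
TOTAL_EFFORT = 100.0          # person-days of testing effort to distribute

def residual_cost(alloc):
    return sum(d * math.exp(-0.05 * e) for d, e in zip(fault_density, alloc))

def normalise(alloc):
    s = sum(alloc)
    return [TOTAL_EFFORT * a / s for a in alloc]

# Initial population of random effort allocations.
pop = [normalise([random.random() for _ in fault_density]) for _ in range(40)]

for _ in range(200):                      # generations
    pop.sort(key=residual_cost)           # lower residual cost is fitter
    parents = pop[:10]                    # truncation selection
    children = []
    while len(children) < 30:
        a, b = random.sample(parents, 2)
        child = [(x + y) / 2 for x, y in zip(a, b)]    # arithmetic crossover
        i = random.randrange(len(child))
        child[i] *= random.uniform(0.8, 1.2)           # mutation
        children.append(normalise(child))
    pop = parents + children

best = min(pop, key=residual_cost)
print("best effort split:", [round(e, 1) for e in best],
      "residual cost:", round(residual_cost(best), 2))
```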

Identification and Classification of Adenovirus Particles in Digital Microscopic Images using Active Contours

By Manjunatha Hiremath

DOI: https://doi.org/10.5815/ijmecs.2014.06.07, Pub. Date: 8 Jun. 2014

Medical imaging is the technique and process of creating images of the human body for clinical purposes or medical science. Digital image processing is the use of computer algorithms to perform image processing on digital images. Microscope image processing dates back half a century, when it was realized that some of the techniques of image capture and manipulation, first developed for television, could also be applied to images captured through the microscope. This paper presents semi-automated segmentation and identification of adenovirus particles using an active contour with a multi-grid segmentation model. Geometric features are employed to identify the adenovirus particles in digital microscopic images. The min-max, 3 rules are used for the recognition of adenovirus particles. The results are compared with the manual method used by a microbiologist.
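As a rough illustration of active-contour segmentation followed by geometric feature extraction (not the paper's multi-grid model), the sketch below runs scikit-image's snake on a synthetic particle; the image, contour parameters and circularity feature are assumptions.

```python
import numpy as np
from skimage import draw, filters, measure, segmentation

# Hypothetical synthetic micrograph: one bright, roughly circular particle.
img = np.zeros((200, 200))
rr, cc = draw.disk((100, 100), 40)
img[rr, cc] = 1.0
img = filters.gaussian(img, sigma=3)

# Initial contour: a circle placed loosely around the candidate particle.
t = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([100 + 60 * np.sin(t), 100 + 60 * np.cos(t)])

# The active contour ("snake") shrinks onto the particle boundary.
snake = segmentation.active_contour(img, init,
                                    alpha=0.015, beta=10, gamma=0.001)

# Geometric features (area, circularity) of the enclosed region can then
# be used to decide whether the segmented object looks like a particle.
mask = measure.grid_points_in_poly(img.shape, snake)
props = measure.regionprops(mask.astype(int))[0]
circularity = 4 * np.pi * props.area / props.perimeter ** 2
print(f"area = {props.area:.0f} px, circularity = {circularity:.2f}")
```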

A Harmony Search Algorithm with Multi-pitch Adjustment Rate for Symbolic Time Series Data Representation

By Almahdi M. Ahmed, Azuraliza Abu Bakar, Abdul Razak Hamdan

DOI: https://doi.org/10.5815/ijmecs.2014.06.08, Pub. Date: 8 Jun. 2014

The representation task in time series data mining has been a critical issue because the direct manipulation of continuous, high-dimensional data is extremely difficult to complete efficiently. One symbolic time series representation approach is the Symbolic Aggregate Approximation (SAX). The main function of SAX is to find the appropriate number of alphabet symbols and the word size that represent the time series. The aim is to achieve the largest alphabet size and maximum word length with the minimum error rate. The purpose of this study is to propose an integrated approach for symbolic time series data representation that attempts to improve SAX by improving the alphabet and word sizes. The Relative Frequency (RF) binning method is employed to obtain the alphabet size and is integrated with the proposed Multi-pitch Harmony Search (HSMPAR) algorithm to calculate the optimal alphabet and word sizes. RF is used because of its ability to obtain a sufficient number of intervals with a low error rate compared to other related techniques. The HSMPAR algorithm is an optimization algorithm that randomly generates solutions for alphabet and word sizes and selects the best solutions. Harmony Search algorithms are compatible with multi-pitch adjustment. The integration of the RF and HSMPAR algorithms is developed to maximize information rather than merely to improve the error rate. The algorithms are tested on 20 standard time series datasets and compared with the meta-heuristic algorithm GENEBLA and the original SAX algorithm. The experimental results show that the proposed method generates larger alphabet and word sizes and achieves a lower error rate than the compared methods. With larger alphabet and word sizes, the proposed method is capable of preserving important information.
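For readers unfamiliar with SAX, the baseline this paper improves on, here is a minimal sketch of the standard transform (z-normalisation, PAA averaging, breakpoint binning); the series, alphabet size and word size are illustrative assumptions, not the paper's optimized values.

```python
import numpy as np

def sax(series, word_size, breakpoints):
    """Standard SAX: z-normalise, PAA-average, then map to symbols."""
    x = (series - series.mean()) / series.std()
    segments = np.array_split(x, word_size)            # PAA segments
    paa = np.array([seg.mean() for seg in segments])
    symbols = np.searchsorted(breakpoints, paa)        # bin each PAA mean
    return "".join(chr(ord("a") + s) for s in symbols)

# Breakpoints for a 4-symbol alphabet under a standard normal assumption
# (the quartiles of N(0, 1)).
bp4 = np.array([-0.6745, 0.0, 0.6745])

rng = np.random.default_rng(1)
ts = np.cumsum(rng.normal(size=64))                    # illustrative series
print(sax(ts, word_size=8, breakpoints=bp4))
```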
