International Journal of Information Technology and Computer Science (IJITCS)

IJITCS Vol. 9, No. 3, Mar. 2017

Cover page and Table of Contents: PDF (size: 218KB)

Table of Contents

REGULAR PAPERS

A Concave Hull Based Algorithm for Object Shape Reconstruction

By Zahrah Yahya Rahmita W Rahmat Fatimah Khalid Amir Rizaan Ahmad Rizal

DOI: https://doi.org/10.5815/ijitcs.2017.03.01, Pub. Date: 8 Mar. 2017

Hull algorithms are among the most efficient methods for connecting vertices in geometric shape reconstruction, where the vertices are the input points representing the original object's shape. Our objective is to reconstruct the shape and its edges; with no information about any underlying pattern, however, it is challenging to connect the lines so that they resemble the original shape. By comparing our results to recent concave hull based algorithms, two performance measures were used to evaluate the accuracy and time complexity of the proposed method. Besides achieving 100% accuracy, the proposed algorithm runs in O(wn) time. All results show a competitive and more effective algorithm than the most efficient similar ones, demonstrating that the new approach solves the vertex-connection problem efficiently.
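The abstract does not give the authors' algorithm; as a point of reference, the convex hull that concave-hull methods refine can be computed with Andrew's monotone chain, sketched here in Python (an illustration only, not the paper's method):

```python
# Andrew's monotone chain convex hull: the baseline that concave hull
# algorithms refine by "digging" into concavities of the point set.
# Illustrative sketch, not the paper's reconstruction algorithm.

def cross(o, a, b):
    """Z-component of the cross product OA x OB (> 0 means left turn)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Return hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:                      # build lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]     # drop duplicated endpoints

square_with_interior = [(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)]
print(convex_hull(square_with_interior))  # interior point (1, 1) is dropped
```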

ATAM-based Architecture Evaluation Using LOTOS Formal Method

By Muhammad Usman Ashraf Wajdi Aljedaibi

DOI: https://doi.org/10.5815/ijitcs.2017.03.02, Pub. Date: 8 Mar. 2017

System architecture evaluation and formal specification are significant processes and practical endeavors in all domains. Many methods and formal description techniques have been proposed for the comprehensive analysis and formal representation of a system architecture. This paper consists of two main parts. In the first, we evaluate performance, a key quality attribute, in a Remote Temperature Sensor client-server architecture by applying the ATAM model, which provides comprehensive support for evaluating architecture designs by considering design quality attributes and how they are represented in the architecture. In the second part, we specify the selected system architecture in LOTOS, the ISO-standard formal description technique, in which a system is specified by the temporal relations between its interactions and behavior. Our proposed approach reduces ambiguity, inconsistency, and incompleteness in the current system architecture.

A Real-Time 6DOF Computational Model to Simulate Ram-Air Parachute Dynamics

By Sandaruwan Gunasinghe GKA Dias Damitha Sandaruwan Maheshya Weerasinghe

DOI: https://doi.org/10.5815/ijitcs.2017.03.03, Pub. Date: 8 Mar. 2017

Computer simulations are used in many disciplines to mimic the behavior of a physical system or process. In our study we have developed a real-time six-degree-of-freedom (6DOF) computational model that replicates the dynamics of a ram-air parachute system, the parachute design widely used by militaries worldwide for parachute jumps. The proposed model is expected to be adapted into a real-time visual simulator for training parachute jumpers. A statistical evaluation of the proposed model is performed using a dataset from a NASA ram-air parachute wind-tunnel test.
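The abstract does not state the equations of motion; purely to illustrate what a 6DOF Euler-integration skeleton looks like, the following sketch advances a rigid-body state under a hypothetical gravity-plus-linear-drag force model (placeholders, not the paper's ram-air aerodynamics):

```python
from dataclasses import dataclass, field

# Generic 6DOF state skeleton with explicit Euler integration.
# The forces here (gravity plus linear drag) are illustrative
# placeholders, not the paper's ram-air parachute aerodynamics.

@dataclass
class State6DOF:
    pos: list = field(default_factory=lambda: [0.0, 0.0, 1000.0])  # x, y, altitude (m)
    vel: list = field(default_factory=lambda: [0.0, 0.0, 0.0])     # m/s
    att: list = field(default_factory=lambda: [0.0, 0.0, 0.0])     # roll, pitch, yaw (rad)
    rate: list = field(default_factory=lambda: [0.0, 0.0, 0.0])    # body rates (rad/s)

def step(s, dt, mass=100.0, drag_coeff=12.0):
    """Advance the state one Euler step under gravity and linear drag."""
    g = 9.81
    forces = [-drag_coeff * v for v in s.vel]  # drag opposes velocity
    forces[2] -= mass * g                      # gravity acts downward
    for i in range(3):
        s.vel[i] += forces[i] / mass * dt
        s.pos[i] += s.vel[i] * dt
        s.att[i] += s.rate[i] * dt             # attitude from body rates
    return s

s = State6DOF()
for _ in range(1000):   # simulate 10 s at 100 Hz
    step(s, 0.01)
print(s.pos[2], s.vel[2])  # altitude lost; descent rate tends toward m*g/drag_coeff
```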

Study of Context Modelling Criteria in Information Retrieval

By Melyara Mezzi Nadjia Benblidia

DOI: https://doi.org/10.5815/ijitcs.2017.03.04, Pub. Date: 8 Mar. 2017

Whereas most work on context-awareness in ubiquitous computing provides context models that use context features in a particular application, one of the main challenges in recent years has been to arrive at a prospective standardization of context models. For Information Retrieval, the lack of consensual context models is the biggest issue. In this paper, we investigate the importance of good context modelling for overcoming some of the issues surrounding a search task. After identifying those issues and listing and categorizing the modelling requirements, the objective of our research is to find correlations between the appreciations of context quality criteria, taking the user dimension into account. The results of a previous survey on search habits were used across several socio-demographic categories, and Kendall's W evaluation performed together with the Friedman test yielded very encouraging results for the feasibility of building large-scale context models.
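Kendall's W, the agreement statistic named above, can be computed directly from rank data. This minimal sketch (which ignores tie corrections) illustrates the statistic, not the paper's survey analysis:

```python
# Kendall's coefficient of concordance W for m raters ranking n items.
# W = 1 means perfect agreement, W = 0 means no agreement.
# Minimal illustrative sketch; ties are not corrected for.

def kendalls_w(rankings):
    """rankings: list of m lists, each assigning ranks 1..n to n items."""
    m, n = len(rankings), len(rankings[0])
    totals = [sum(r[i] for r in rankings) for i in range(n)]  # rank sum per item
    mean = m * (n + 1) / 2.0                                  # expected rank sum
    s = sum((t - mean) ** 2 for t in totals)                  # squared deviations
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Three raters in perfect agreement -> W = 1.0
print(kendalls_w([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]))  # 1.0
```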

A Knowledge-Based System for Life Insurance Underwriting

By Mutai K. Joram Bii K. Harrison Kiplang'at N. Joseph

DOI: https://doi.org/10.5815/ijitcs.2017.03.05, Pub. Date: 8 Mar. 2017

The purpose of this work is to enhance the life insurance underwriting process by building a knowledge-based system for life insurance underwriting. Such a system would be useful for organizations that want to serve their clients better and to promote the capture, retention, and reuse of expertise within the organization. The paper identifies the main input factors and output decisions that life insurance practitioners consider and make on a daily basis. Life underwriting knowledge was extracted through interviews at a leading insurance company in Kenya. The knowledge was incorporated into a knowledge-based system prototype, designed and implemented to demonstrate the potential of this technology in the life insurance industry. The Unified Modeling Language and Visual Prolog were used for the design and the development of the prototype, respectively. The system's knowledge base was populated with sample knowledge obtained from the life insurance company, and results were generated to illustrate how the system is expected to function.
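The elicited underwriting rules are not given in the abstract, and the actual prototype was built in Visual Prolog. As a language-neutral illustration of the rule-based decision style such a system embodies, here is a small Python sketch whose factors and thresholds are entirely hypothetical:

```python
# Minimal rule-based decision sketch for underwriting.
# The factors and thresholds below are hypothetical illustrations,
# NOT the rules elicited from the Kenyan insurer in the paper.

RULES = [
    (lambda a: a["age"] > 65,                 "decline"),
    (lambda a: a["smoker"] and a["age"] > 50, "refer to senior underwriter"),
    (lambda a: a["bmi"] > 35,                 "accept with premium loading"),
]

def underwrite(applicant):
    """Return the decision of the first matching rule, else standard acceptance."""
    for condition, decision in RULES:
        if condition(applicant):
            return decision
    return "accept at standard rates"

print(underwrite({"age": 40, "smoker": False, "bmi": 24}))  # accept at standard rates
print(underwrite({"age": 70, "smoker": False, "bmi": 24}))  # decline
```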

Data Cleaning In Data Warehouse: A Survey of Data Pre-processing Techniques and Tools

By Anosh Fatima Nosheen Nazir Muhammad Gufran Khan

DOI: https://doi.org/10.5815/ijitcs.2017.03.06, Pub. Date: 8 Mar. 2017

A data warehouse is a computer system designed for storing and analyzing an organization's historical data from day-to-day operations in an Online Transaction Processing (OLTP) system. Usually, an organization summarizes and copies information from its operational systems to the data warehouse on a regular schedule, and management performs complex queries and analysis on the information without slowing down the operational systems. Data need to be pre-processed to improve their quality before being stored in the data warehouse. This survey paper presents data cleaning problems and the approaches currently in use for pre-processing. The main goal of this paper is to determine which pre-processing technique is best in which scenario for improving the performance of a data warehouse. Many data cleansing techniques have been analyzed using certain evaluation attributes and tested on different kinds of data sets. Data quality tools such as YALE, ALTERYX, and WEKA have been used to obtain conclusive results, to ready the data, and to ensure that only cleaned data populates the warehouse, thus enhancing its usability. The results of this paper can be useful in many future activities such as cleansing, standardizing, correction, matching, and transformation, and can help in data auditing and pattern detection.
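As a concrete illustration of two cleaning steps this kind of survey covers, duplicate elimination and missing-value imputation, the following pure-Python sketch shows the idea; production pipelines would rely on tools such as those named above rather than hand-rolled code:

```python
# Two common pre-processing steps before warehouse loading:
# exact-duplicate elimination and mean-substitution imputation.
# Illustrative sketch only; tools like WEKA provide these as filters.

def drop_duplicates(rows):
    """Remove exact duplicate records, preserving first occurrences."""
    seen, out = set(), []
    for row in rows:
        key = tuple(row)
        if key not in seen:
            seen.add(key)
            out.append(row)
    return out

def impute_mean(rows, col):
    """Replace None in a numeric column with that column's mean."""
    vals = [r[col] for r in rows if r[col] is not None]
    mean = sum(vals) / len(vals)
    return [[mean if (i == col and v is None) else v
             for i, v in enumerate(r)] for r in rows]

data = [[1, 10.0], [2, None], [1, 10.0]]  # one duplicate, one missing value
print(impute_mean(drop_duplicates(data), 1))  # [[1, 10.0], [2, 10.0]]
```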

A Secure and Semi-Blind Technique of Embedding Color Watermark in RGB Image Using Curvelet Domain

By Ranjeeta Sanjay Sharma L. R. Raheja

DOI: https://doi.org/10.5815/ijitcs.2017.03.07, Pub. Date: 8 Mar. 2017

A semi-blind and secure watermarking technique for color images using the curvelet domain is proposed. To make the algorithm secure, a bijective mapping function is used. The watermark is separated into color planes, and each color plane into bit planes. The most significant bit (MSB) plane of each color is used as the embedding information, while the remaining bit planes are used as a key at extraction time. The MSB plane of each color of the watermark image is embedded into the curvelet coefficients of the blue color plane of the processed cover image. For embedding, each curvelet coefficient of the blue plane of the processed cover image is compared with the values of its 8-connected neighboring coefficients. The results of the watermarking scheme are analyzed with different quality assessment metrics, such as PSNR, the correlation coefficient (CC), and the mean structural similarity index measure (MSSIM). The experimental results show that the proposed technique gives good watermark invisibility, good quality of the extracted watermark, and robustness against different attacks.
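The MSB/key split described above can be illustrated on raw 8-bit pixel values. This sketch covers only the bit-plane separation step, not the curvelet-domain embedding or the bijective mapping:

```python
# Bit-plane separation as described in the scheme: the MSB plane of
# each color channel carries the embedded information; the remaining
# planes act as the extraction key. Illustrative sketch on a flat
# list of 8-bit pixel values, not the full curvelet pipeline.

def split_planes(channel):
    """Split 8-bit pixel values into an MSB plane and a 7-bit remainder key."""
    msb_plane = [(p >> 7) & 1 for p in channel]   # bit 7: embedding information
    key_planes = [p & 0x7F for p in channel]      # bits 0-6: kept as the key
    return msb_plane, key_planes

def recombine(msb_plane, key_planes):
    """Reconstruct the original channel from the two parts."""
    return [(b << 7) | k for b, k in zip(msb_plane, key_planes)]

pixels = [200, 17, 129, 64]
msb, key = split_planes(pixels)
print(msb)                            # [1, 0, 1, 0]
print(recombine(msb, key) == pixels)  # True: the split is lossless
```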

Priority Based New Approach for Correlation Clustering

By Aaditya Jain Suchita Tyagi

DOI: https://doi.org/10.5815/ijitcs.2017.03.08, Pub. Date: 8 Mar. 2017

Emerging sources of information such as social networks, bibliographic data, and protein interaction networks have complex relations among data objects and need to be processed differently from traditional data analysis. Correlation clustering is one such new way of viewing and analyzing data to detect patterns and clusters. Being a new field, it offers plenty of scope for research. This paper discusses a method for the chromatic correlation clustering problem, in which data objects, as nodes of a graph, are connected through color-labeled edges representing relations among the objects. The proposed heuristic performs better than previous works.
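The abstract does not detail the heuristic itself. As an illustration of the pivot style commonly used for chromatic correlation clustering (cf. the "Chromatic Balls" algorithm), here is a deterministic sketch, not the authors' method: pick an unclustered edge (u, v) of color c, then pull in every unclustered node joined to both u and v by edges of that same color.

```python
# Deterministic pivot-style sketch for chromatic correlation clustering,
# in the spirit of "Chromatic Balls". Illustrative only, not the paper's
# proposed heuristic.

def chromatic_pivot(nodes, edges):
    """edges: dict mapping frozenset({u, v}) -> color label."""
    unclustered, clusters = set(nodes), []
    for pair, color in sorted(edges.items(), key=lambda e: sorted(e[0])):
        u, v = sorted(pair)
        if u not in unclustered or v not in unclustered:
            continue                       # pivot edge already consumed
        cluster = {u, v}
        for w in sorted(unclustered - cluster):
            # w joins only if both its edges to u and v share the pivot color
            if (edges.get(frozenset({u, w})) == color and
                    edges.get(frozenset({v, w})) == color):
                cluster.add(w)
        clusters.append((color, cluster))
        unclustered -= cluster
    clusters.extend(("none", {w}) for w in sorted(unclustered))  # singletons
    return clusters

edges = {frozenset({"a", "b"}): "red", frozenset({"b", "c"}): "red",
         frozenset({"a", "c"}): "red", frozenset({"c", "d"}): "blue"}
print(chromatic_pivot(["a", "b", "c", "d"], edges))
# the red triangle {a, b, c} forms one cluster; d is left a singleton
```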
