IJITCS Vol. 6, No. 6, May 2014
REGULAR PAPERS
Long-Term Evolution (LTE) is the next generation of current mobile telecommunication networks. LTE has a new flat radio-network architecture and a significant increase in spectrum efficiency. This paper focuses on the throughput performance analysis of robust MIMO channel estimators for the Downlink Long Term Evolution-Advanced (DL LTE-A) 4G system using three Artificial Neural Networks: a Feed-forward neural network (FFNN), a Cascade-forward neural network (CFNN) and a Time-Delay neural network (TDNN), each trained separately with the Back-Propagation algorithm. The methods use the information carried by the received reference symbols to estimate the total frequency response of the channel in two phases. In the first phase, the proposed ANN-based method learns to adapt to the channel variations, and in the second phase it estimates the MIMO channel matrix and tries to improve the throughput of LTE. The performance of the estimation methods is evaluated by simulations in the Vienna LTE-A DL Link Level Simulator. The performance of the proposed channel estimator, the Time-Delay neural network (TDNN), is compared with the traditional Least Square (LS) algorithm and the other ANN-based estimators for Closed Loop Spatial Multiplexing (CLSM) Single-User Multi-Input Multi-Output (MIMO 2×2 and 4×4) in terms of throughput. Simulation results show that the TDNN gives better performance than the other ANN-based estimation methods and LS.
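The Least Square (LS) baseline against which the ANN estimators are compared can be sketched generically. This is a minimal illustration, not the simulator's API; the pilot values and channel taps below are invented for the example.

```python
def ls_channel_estimate(y_pilots, x_pilots):
    """Least Square (LS) channel estimate at pilot positions.

    For a per-subcarrier model y = h * x + n with known reference
    (pilot) symbols x, the LS estimate is h_hat = y / x, written
    here as y * conj(x) / |x|^2.
    """
    return [y * x.conjugate() / abs(x) ** 2
            for y, x in zip(y_pilots, x_pilots)]

# Illustrative, noise-free example with a known two-tap channel.
h_true = [0.8 + 0.3j, 1.1 - 0.2j]
x_ref = [(1 + 1j) / 2 ** 0.5, (1 - 1j) / 2 ** 0.5]  # QPSK-like pilots
y_rx = [h * x for h, x in zip(h_true, x_ref)]
h_hat = ls_channel_estimate(y_rx, x_ref)
```

With noise present, the LS estimate degrades per subcarrier, which is the gap the trained networks aim to close.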
Congestion in a Mobile Ad Hoc Network causes packet loss and longer end-to-end data delivery delay, which affect the overall performance of the network significantly. To ensure high throughput, the routing protocol should be congestion-adaptive and capable of handling congestion. In this research work, we propose a Multilevel Congestion avoidance and Control Mechanism (MCCM) that exploits both congestion avoidance and congestion control to handle the congestion problem in an effective and efficient way. MCCM is capable of finding an energy-efficient path during the route discovery process, providing a longer lifetime for any developed route. The efficient admission control and selective data packet delivery mechanisms of MCCM jointly overcome the congestion problem at any node; thus, MCCM improves the network performance in terms of packet delivery ratio, lower data delivery delay and high throughput. Performance evaluation carried out in Network Simulator-2 (NS-2) shows that MCCM outperforms the existing routing protocols.
In this paper, an Improved Firefly Algorithm with Chaos (IFCH) is presented for solving definite integrals. The IFCH addresses the problem of computing numerical integrals in parallel in engineering, and its segmentation points are adaptive. Several numerical simulation results show that the algorithm offers an efficient way to calculate the numerical value of definite integrals, and has a high convergence rate, high accuracy and robustness.
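The core idea of adaptive segmentation points can be illustrated without the full firefly machinery. The sketch below is a deliberately simplified stand-in: candidate partitions are drawn at random (rather than by firefly moves) and scored by the midpoint/trapezoid discrepancy as an error proxy; all names and parameters are illustrative.

```python
import random

def composite(f, points, rule):
    """Composite quadrature over an ordered partition `points` of [a, b]."""
    total = 0.0
    for l, r in zip(points, points[1:]):
        if rule == "mid":
            total += f((l + r) / 2) * (r - l)
        else:  # trapezoid rule
            total += (f(l) + f(r)) / 2 * (r - l)
    return total

def search_partition(f, a, b, n=8, iters=200, seed=1):
    """Random search over interior segmentation points.

    Partitions are scored by |midpoint - trapezoid|, a standard error
    proxy; the best candidate's midpoint estimate is returned.
    """
    rng = random.Random(seed)
    best = [a + (b - a) * i / n for i in range(n + 1)]  # uniform start
    best_score = abs(composite(f, best, "mid") - composite(f, best, "trap"))
    for _ in range(iters):
        interior = sorted(rng.uniform(a, b) for _ in range(n - 1))
        cand = [a] + interior + [b]
        score = abs(composite(f, cand, "mid") - composite(f, cand, "trap"))
        if score < best_score:
            best, best_score = cand, score
    return composite(f, best, "mid")

estimate = search_partition(lambda x: x * x, 0.0, 1.0)  # true value: 1/3
```

Replacing the random proposals with firefly attraction moves (and a chaotic map for the step sizes) recovers the structure the paper describes.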
The recent popularity of applications based on wireless sensor networks (WSN) provides a strong motivation for pursuing research in different dimensions of WSN. Node placement is an essential task in a wireless sensor network and is a multi-objective combinatorial problem in nature. The positions of the sensor nodes are very important and must provide maximum coverage with longer lifetimes. So, for efficient node placement, a novel multi-objective Artificial Bee Colony (ABC) algorithm based framework is proposed in this paper. The framework optimizes the operational modes of the sensor nodes along with the clustering schemes and transmission signal strengths. The results show that the proposed algorithm outperforms contemporary methodologies based on TPSMA, PSO and ACO.
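One of the objectives such a placement optimizer must score is coverage. A minimal sketch of a coverage objective, with hypothetical sensor positions and radius (not taken from the paper):

```python
import math

def coverage_ratio(sensors, targets, sense_radius):
    """Fraction of target points covered by at least one sensor node.

    A candidate layout an optimizer (e.g. a multi-objective ABC) would
    evaluate alongside lifetime and signal-strength objectives.
    """
    covered = sum(
        1 for tx, ty in targets
        if any(math.hypot(tx - sx, ty - sy) <= sense_radius
               for sx, sy in sensors)
    )
    return covered / len(targets)

# Four target points on a unit square, one sensor in a corner:
# (1, 1) lies sqrt(2) away, so it is the only uncovered point.
targets = [(0, 0), (0, 1), (1, 0), (1, 1)]
ratio = coverage_ratio([(0.0, 0.0)], targets, sense_radius=1.0)
```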
Nowadays, recommendation has become an everyday activity on the World Wide Web, and an increasing amount of work has been published in various areas related to recommender systems. Cross-domain recommendation is an emerging research topic. This type of recommendation has barely been investigated because it is difficult to obtain public datasets with user preferences crossing different domains. One solution to the dataset problem is to create different domains. Ontology plays an increasingly important role in many research areas, such as semantic interoperability, knowledge bases and domain creation. An ontology defines a common vocabulary and a shared understanding, and is applied in real-world applications; it is a formal representation of a set of concepts within a domain and the relationships between those concepts. This paper presents an approach for building ontologies using Taxonomic conversational case-based reasoning (Taxonomic CCBR) to apply cross-domain recommendation based on facial skin problems and related cosmetics. For linking the cross-domain recommendation, the Ford-Fulkerson algorithm is used to build a bridge between the concepts of the two domain ontologies (the Problems domain as the source domain and the Cosmetics domain as the target domain).
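The Ford-Fulkerson algorithm itself is standard and can be sketched as follows (here with BFS augmenting paths, i.e. the Edmonds-Karp variant); the toy graph is illustrative, not the paper's ontology data.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Ford-Fulkerson with BFS augmenting paths (Edmonds-Karp).

    `capacity` is a dict-of-dicts: capacity[u][v] = edge capacity.
    In the paper's setting, edges would connect concepts of the source
    (Problems) ontology to concepts of the target (Cosmetics) ontology.
    """
    # Build a residual graph, initialising missing reverse edges to 0.
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path from source to sink.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow  # no augmenting path left
        # Recover the path, push the bottleneck, update residuals.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck

cap = {'s': {'a': 3, 'b': 2}, 'a': {'b': 1, 't': 2}, 'b': {'t': 3}}
flow = max_flow(cap, 's', 't')  # maximum s-t flow in the toy graph
```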
The merging of biology and computer science has created a new field called computational biology, which explores the capacity of computers to gain knowledge from biological data (bioinformatics). Computational biology is rooted in the life sciences as well as in computers, information sciences and technologies. A central problem in computational biology is sequence alignment, a way of arranging sequences of DNA, RNA or protein to identify regions of similarity and the relationships between sequences. This paper introduces an enhancement of a dynamic-programming algorithm for genome sequence alignment, called EDAGSA, which fills only the three main diagonals rather than the entire matrix, leaving out the unused cells. It obtains the optimal solution while decreasing the execution time, and therefore the performance is increased. To illustrate the effectiveness of the proposed algorithm's performance optimization, it is compared with traditional methods such as the Needleman-Wunsch, Smith-Waterman and longest-common-subsequence algorithms. Also, a database is implemented for using the algorithm in multi-sequence alignment, searching for the optimal sequence that matches a given sequence.
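The three-diagonal idea can be illustrated with a generic banded dynamic program that only fills cells within one diagonal of the main one. This is a sketch of the general banded-DP technique (here for edit distance), not the paper's exact EDAGSA scoring.

```python
def banded_distance(a, b, band=1):
    """Edit distance computed only inside a diagonal band.

    Only cells with |i - j| <= band are filled, so for band=1 just the
    main diagonal and its two neighbours are computed, as in a
    three-diagonal alignment scheme. Rows are kept as sparse dicts.
    """
    n, m = len(a), len(b)
    if abs(n - m) > band:
        return None  # any alignment would have to leave the band
    INF = float("inf")
    prev = {j: j for j in range(min(m, band) + 1)}  # row i = 0
    for i in range(1, n + 1):
        cur = {}
        for j in range(max(0, i - band), min(m, i + band) + 1):
            if j == 0:
                cur[j] = i  # first column: deletions only
                continue
            sub = prev.get(j - 1, INF) + (a[i - 1] != b[j - 1])
            dele = prev.get(j, INF) + 1
            ins = cur.get(j - 1, INF) + 1
            cur[j] = min(sub, dele, ins)
        prev = cur
    return prev[m]
```

Because only O(band) cells per row are touched, the cost drops from O(n*m) to O(n*band), which is the source of the speed-up the abstract claims.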
Computed tomography angiography (CTA) is an established tool for vessel imaging in the medical image processing field. High-intensity structures in the contrast image can seriously hamper luminal visualization, and metal artifacts are a widespread problem in computed tomography (CT) images. We propose a directional restoration filtering process with fuzzy logic in order to reduce metal artifacts in CT images. We create two sets by an iteration process and sort them in ascending order. After sorting, we take two elements from the two data sets, both tracked elements being selected from the second position of the sorted arrays. A fuzzy-logic intersection is performed between the two selected elements, and a Gaussian convolution operation is applied to the entire image to enhance the artifact-affected CT images. In this paper, we investigate a fully automated intensity-based filter that depends on the gray-level variation rating. This results in a better visualization of the vessel lumen, including the smaller vessels, allowing a faster and more accurate inspection of the whole vascular structure.
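The two generic building blocks named here, fuzzy intersection and Gaussian convolution, can be sketched as follows. This is only a minimal sketch of those standard operations, not the paper's full directional restoration pipeline; the membership values and kernel parameters are illustrative.

```python
import math

def fuzzy_intersection(mu_a, mu_b):
    """Standard fuzzy-set intersection: elementwise minimum of memberships."""
    return [min(a, b) for a, b in zip(mu_a, mu_b)]

def gaussian_kernel(radius, sigma):
    """Normalised 1D Gaussian kernel of width 2*radius + 1."""
    k = [math.exp(-(x * x) / (2 * sigma * sigma))
         for x in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def smooth(signal, radius=2, sigma=1.0):
    """1D Gaussian convolution with edge clamping.

    Applied row- and column-wise this gives the 2D smoothing used to
    soften artifact-affected regions.
    """
    kernel = gaussian_kernel(radius, sigma)
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for d, w in zip(range(-radius, radius + 1), kernel):
            j = min(max(i + d, 0), len(signal) - 1)  # clamp at borders
            acc += w * signal[j]
        out.append(acc)
    return out

inter = fuzzy_intersection([0.2, 0.9], [0.5, 0.4])
flat = smooth([1.0] * 5)  # a constant signal is unchanged by smoothing
```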
In this paper, an objective function based on the minimal spanning tree (MST) of the data points is proposed for clustering, and a density-based clustering technique is used in an attempt to optimize the specified objective function in order to detect the "natural grouping" present in a given data set. A threshold based on the MST of the data points of each cluster thus found is used to remove noise (if any is present in the data) from the final clustering.
A comparison of the experimental results obtained by the DBSCAN (Density Based Spatial Clustering of Applications with Noise) algorithm and the proposed algorithm has also been incorporated, and it is observed that our proposed algorithm performs better than DBSCAN. Several experiments on synthetic data sets in R^2 and R^3 show the utility of the proposed method. The proposed method is also found to provide good results for two real-life data sets considered for experimentation. Note that K-means is one of the most popular methods adopted to solve the clustering problem. This algorithm uses an objective function based on the minimization of a squared-error criterion; although it is useful in many applications, it may not always provide the "natural grouping".
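The MST-with-threshold idea can be sketched generically: build a Euclidean MST, cut every edge longer than the threshold, and read the connected components off as clusters. This is a minimal illustration of that classic scheme, not the paper's exact objective function.

```python
import math

def mst_edges(points):
    """Prim's algorithm: edges (dist, u, v) of a Euclidean MST."""
    n = len(points)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        best = None
        for u in in_tree:
            for v in range(n):
                if v in in_tree:
                    continue
                d = math.dist(points[u], points[v])
                if best is None or d < best[0]:
                    best = (d, u, v)
        edges.append(best)
        in_tree.add(best[2])
    return edges

def mst_clusters(points, threshold):
    """Cut MST edges longer than `threshold`; components are clusters."""
    n = len(points)
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x
    for d, u, v in mst_edges(points):
        if d <= threshold:
            parent[find(u)] = find(v)  # union the two components
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Two well-separated groups: the single long MST edge between them is cut.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11)]
clusters = mst_clusters(pts, threshold=2.0)
```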
To produce a software release, Application Lifecycle Management (ALM) is key to streamlining a team's work. ALM consists of the core disciplines of requirements definition and management, asset management, development, build creation, testing, and release, all planned by project management and orchestrated by using some form of process [1]. The assets and their relationships are stored in the development team's repository, and detailed reports and charts provide visibility into the team's progress. In this paper we describe how ALM coordinates software development activities and assets for the production and management of software applications throughout their entire life cycle.
In this research, new ideas are proposed to enhance content-based image retrieval (CBIR) applications by representing colored images in terms of their colors and angles, as a histogram describing the number of pixels of a particular color located at a specific angle; similarity is then measured between the two represented histograms. Since color quantization is a crucial stage in the CBIR process, we compared the uniform and non-uniform color quantization techniques and, according to our results, adopted the non-uniform technique, which showed higher efficiency.
In our tests we used the Corel-1000 image database together with Matlab code. We compared our results with other approaches such as Fuzzy Club, IRM, Geometric Histogram, Signature Based CBIR and Modified ERBIR, and our proposed technique showed high retrieval precision ratios compared to the other techniques.
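The color-angle histogram representation can be sketched as follows. This is a simplified illustration of the general idea, with uniform gray-level quantization standing in for the paper's non-uniform color quantization, and all bin counts chosen arbitrarily.

```python
import math

def color_angle_histogram(pixels, width, n_colors=4, n_angles=4):
    """Joint normalised histogram over (quantized level, pixel angle).

    Each pixel is binned by a coarsely quantized intensity (0-255) and
    by the angle of its position relative to the image centre; the
    histogram is normalised so images of different sizes compare fairly.
    """
    height = len(pixels) // width
    cx, cy = (width - 1) / 2, (height - 1) / 2
    hist = [[0.0] * n_angles for _ in range(n_colors)]
    for idx, value in enumerate(pixels):
        x, y = idx % width, idx // width
        c = min(int(value * n_colors / 256), n_colors - 1)
        angle = (math.atan2(y - cy, x - cx) + math.pi) / (2 * math.pi)
        a = min(int(angle * n_angles), n_angles - 1)
        hist[c][a] += 1
    total = len(pixels)
    return [[v / total for v in row] for row in hist]

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]: sum of bin-wise minima."""
    return sum(min(a, b)
               for r1, r2 in zip(h1, h2)
               for a, b in zip(r1, r2))

# Two flat 4x4 "images" with very different intensities fall into
# different color bins, so their intersection similarity is 0.
h_dark = color_angle_histogram([10] * 16, width=4)
h_bright = color_angle_histogram([200] * 16, width=4)
```

Histogram intersection is one common choice here; any bin-wise distance (L1, chi-square) could be substituted.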