ISSN: 2310-9025 (Print)
ISSN: 2310-9033 (Online)
DOI: https://doi.org/10.5815/ijmsc
Website: https://www.mecs-press.org/ijmsc
Published By: MECS Press
Frequency: 4 issues per year
Number(s) Available: 40
IJMSC is committed to bridging the theory and practice of mathematical sciences and computing. IJMSC publishes original, peer-reviewed, and high-quality articles in the areas of mathematical sciences and computing. IJMSC is a well-indexed scholarly journal and provides indispensable reading and reference material for those working at the cutting edge of mathematical sciences and computing applications.
IJMSC has been abstracted or indexed by several world-class databases: Google Scholar, Microsoft Academic Search, CrossRef, CNKI, Baidu Wenku, JournalTOCs, etc.
IJMSC Vol. 10, No. 3, Sep. 2024
REGULAR PAPERS
In this paper, a new generalized algorithm is developed using a computer algebra system to study and generalize Kaprekar's operation. The algorithm can be run for any desired number of iterations and is applicable to any n-digit number with n greater than or equal to two. Existing relevant results are verified against those available in the literature and further extended to examine, after each step, the difference (kernel) between the number obtained during the process and the number obtained in the preceding iteration. The sum of the digits of the number acquired after each step is also examined and found to be divisible by 9. A detailed investigation is conducted for all two-digit numbers, and the output is exhibited in tabular form, which has not been studied earlier. An 8-digit number is also considered; it is found not to converge to a unique kernel as 3-digit and 4-digit numbers do, but to follow a regular pattern after the initial iterations. Analytical illustrations are provided along with pictorial representations for 2-digit, 3-digit, 4-digit, and 8-digit numbers. The algorithm can further be employed for numbers with any number of digits.
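Kaprekar's operation sorts a number's digits in descending and ascending order and subtracts the two results. A minimal Python sketch of the iteration described above (function names are illustrative, not the paper's computer-algebra implementation):

```python
def kaprekar_step(n, digits):
    # Sort digits descending and ascending (zero-padded) and subtract.
    s = str(n).zfill(digits)
    big = int("".join(sorted(s, reverse=True)))
    small = int("".join(sorted(s)))
    return big - small

def kaprekar_iterate(n, digits, max_iter=20):
    # Iterate until a previously seen value recurs (a kernel or a cycle).
    seen = []
    while n not in seen and len(seen) < max_iter:
        seen.append(n)
        n = kaprekar_step(n, digits)
    return seen + [n]

# For 4-digit numbers the iteration reaches the kernel 6174:
# 3524 -> 3087 -> 8352 -> 6174 -> 6174 -> ...
```

Each difference produced by a step has a digit sum divisible by 9, consistent with the observation in the abstract.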
Detecting property insurance fraud is critical for reducing financial losses and ensuring fair claim processing. Traditional methods of detecting insurance fraud had several drawbacks, including no feature selection process, no hyperparameter tuning, lower accuracy, and class imbalance problems. To address the aforementioned shortcomings, this paper examines advanced ML (machine learning) techniques for accurately detecting property insurance fraud. To determine the best model for predicting fraudulent activities, this paper tested several machine learning models, including Gradient Boosting, classical ML classifiers, and Stacking Ensemble methods. To address class imbalance and improve model performance, the selected model incorporates proper feature selection, hyperparameter tuning, and SMOTE (the synthetic minority over-sampling technique). The Stacking Ensemble method outperformed the other ML models, achieving an accuracy of 96% and a recall of 94%. The experimental results show that the proposed stacking ensemble-based prediction scheme improves accuracy by 3.4% and recall by 2.7% over previous works. This article also includes a web application for assisting with property insurance fraud detection, which offers ML-based fraud prediction, question submission, answer checking, and blog post access. According to the findings, more than 54% of users expressed satisfaction with the web application's usefulness for detecting property fraud.
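The class-imbalance step can be illustrated with a minimal NumPy-only sketch of SMOTE's core idea: interpolating between a minority-class point and one of its nearest minority neighbours. Function and parameter names are illustrative, not the paper's implementation or the imbalanced-learn API:

```python
import numpy as np

def smote_oversample(X_min, n_new, k=5, seed=None):
    """Generate n_new synthetic minority samples (minimal SMOTE sketch)."""
    rng = np.random.default_rng(seed)
    X_min = np.asarray(X_min, dtype=float)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        # Distances from the chosen minority point to all minority points.
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]   # k nearest, excluding the point itself
        j = rng.choice(nbrs)
        gap = rng.random()
        # New point lies on the segment between the point and its neighbour.
        out.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(out)
```

Because each synthetic sample is a convex combination of two minority points, it always lies within the minority class's convex hull.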
The FinTech sector, an innovative blend of finance and technology, has significantly reshaped financial services by making transactions more efficient and accessible. However, this rapid digitalization has also introduced substantial cybersecurity risks, making the sector an attractive target for cybercriminals. This paper explores the current digital security landscape within the FinTech industry, highlighting prevalent threats such as phishing, malware, and data breaches. It underscores the importance of raising digital security awareness among employees, customers, and other stakeholders to mitigate these risks. The paper analyzes significant case studies and regulatory frameworks and examines the challenges and barriers to implementing effective security measures. It also proposes comprehensive strategies for enhancing digital security awareness, including employee training, customer education, and industry collaboration. The paper concludes with recommendations for future trends and best practices, emphasizing the need for a proactive and collaborative approach to building a secure and resilient FinTech ecosystem.
Simpson's Rule is a widely used numerical integration technique, but it cannot be applied to unequally spaced data. This paper presents a new generalization of Simpson's Rule using both Lagrange and Hermite interpolating polynomials to address this limitation. I provide a geometric interpretation of the method, showing its relationship to the area calculation of a trapezoid and a triangle, where the accuracy is significantly influenced by the chosen interpolating polynomial for midpoint determination. A comprehensive comparative analysis across various functions reveals that the Hermite-based approach consistently exhibits higher accuracy and stability than the Lagrange method, particularly with an increasing number of subintervals. This improved performance stems from the Hermite polynomial's ability to better approximate the function's behavior between data points. The findings highlight the effectiveness of the proposed Hermite-based generalization of Simpson's Rule in improving the accuracy of numerical integration for unequally spaced data, which is commonly encountered in practical applications.
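Assuming the Lagrange variant fits a quadratic through each consecutive triple of points, the rule can be sketched as follows (a toy illustration; the paper's Hermite variant, which also uses derivative information, is not reproduced here):

```python
import numpy as np

def simpson_unequal(x, y):
    """Simpson-type rule for unequally spaced data (Lagrange-based sketch).

    Fits a quadratic through each consecutive triple of points and integrates
    it exactly over the corresponding pair of subintervals. Requires an odd
    number of points, i.e. an even number of subintervals.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    total = 0.0
    for i in range(0, len(x) - 2, 2):
        coeffs = np.polyfit(x[i:i + 3], y[i:i + 3], 2)  # interpolating quadratic
        anti = np.polyint(coeffs)                       # its antiderivative
        total += np.polyval(anti, x[i + 2]) - np.polyval(anti, x[i])
    return total
```

On equally spaced points this reduces to classical Simpson's rule; on unequal spacing such as x = [0, 0.3, 1, 1.6, 2] it still integrates any quadratic exactly.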
In the software development industry, ensuring software quality holds immense significance due to its direct influence on user satisfaction, system reliability, and the overall end-user experience. Traditionally, the development process involved identifying and rectifying defects after the implementation phase, which could be time-consuming and costly. This study examines software development methodologies, with a specific emphasis on Test-Driven Development, aiming to evaluate its effectiveness in improving software quality. The study employs a mixed-methods approach, combining quantitative surveys and qualitative interviews to comprehensively investigate the impact of Test-Driven Development on various facets of software quality. The survey findings reveal that Test-Driven Development offers substantial benefits in terms of early defect detection, leading to reduced costs and effort in rectifying issues during the development process. Moreover, Test-Driven Development encourages improved code design and maintainability, fostering the creation of modular and loosely coupled code structures. These results underscore the pivotal role of Test-Driven Development in elevating code quality and maintainability. Comparative analysis with traditional development methodologies highlights Test-Driven Development's effectiveness in enhancing software quality, as rated highly by respondents. Furthermore, it clarifies Test-Driven Development's positive impact on user satisfaction, overall product quality, and code maintainability. Challenges related to Test-Driven Development adoption are identified, such as the initial time investment in writing tests and difficulties adapting to changing requirements. Strategies to mitigate these challenges are proposed, contributing to the practical application of Test-Driven Development. The study offers valuable insights into the efficacy of Test-Driven Development in enhancing software quality.
It not only highlights the benefits of Test-Driven Development but also provides a framework for addressing challenges and optimizing its utilization. This knowledge is invaluable for software development teams, project managers, and quality assurance professionals, facilitating informed decisions regarding adopting and implementing Test-Driven Development as a quality assurance technique in software development.
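The test-first workflow described above can be illustrated with a minimal example: the test is written before the function it exercises, and the implementation is then the minimal code that makes the test pass (the function and its behavior are invented for illustration):

```python
import re

# Step 1 (red): write the test first, before any implementation exists.
def test_slugify():
    assert slugify("Hello, World!") == "hello-world"

# Step 2 (green): write the minimal implementation that makes the test pass.
def slugify(text):
    # Keep lowercase alphanumeric runs and join them with hyphens.
    words = re.findall(r"[a-z0-9]+", text.lower())
    return "-".join(words)

test_slugify()  # passes; step 3 (refactor) would follow under test protection
```

The red-green-refactor loop shown here is what produces the early defect detection and loosely coupled design the survey respondents reported.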
With the reform of the Chinese economic system, the development of enterprises is facing many risks and challenges. In order to understand the state of operation of enterprises, it is necessary to apply relevant methods to evaluate enterprise performance. Taking Industrial and Commercial Bank of China as an example, this paper selects its financial data from 2018 to 2021. Firstly, DuPont analysis is applied to decompose the return on equity into the product of profit margin on sales, total assets turnover ratio, and equity multiplier. Then the effect of changes in each of these three factors on the return on equity is analyzed using the chain substitution method. The results show that the effect of profit margin on sales on return on equity decreases year by year and turns from negative to positive. The effect of total assets turnover ratio on return on equity changes from positive to negative and then to positive, while the effect of equity multiplier is the opposite. These results provide a direction for the adjustment of the return on equity of Industrial and Commercial Bank of China. Finally, according to the results, some suggestions are put forward for the development of Industrial and Commercial Bank of China.
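The chain substitution method applied above can be sketched generically: each factor is substituted in turn, and the resulting change in the product is attributed to that factor. The figures below are invented for illustration, not ICBC's actual data:

```python
from math import prod

def chain_substitution(base, actual):
    """Attribute the change in a product of factors to each factor in turn."""
    current = list(base)
    contributions = []
    for i in range(len(base)):
        before = prod(current)
        current[i] = actual[i]            # substitute the i-th factor
        contributions.append(prod(current) - before)
    return contributions

# Hypothetical DuPont figures:
# ROE = profit margin * total assets turnover * equity multiplier.
base = (0.20, 0.50, 10.0)     # base-year ROE = 1.00
actual = (0.25, 0.40, 12.0)   # current-year ROE = 1.20
effects = chain_substitution(base, actual)
```

A useful property of the method is that the individual factor effects sum exactly to the total change in ROE, which is what makes the year-by-year attribution in the paper possible.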
The process of making decisions on software architecture is of the greatest significance for a software system's success. Software architecture establishes the framework of the system, specifies its characteristics, and has significant and major effects across the whole life cycle of the system. The complicated characteristics of the software development context and the significance of the problem have led the research community to build various methodologies focused on supporting software architects in improving their decision-making abilities. Despite these efforts, the adoption of such systematic methodologies appears to be somewhat constrained in practical application. Moreover, decision-makers must overcome unexpected difficulties due to the varying software development processes that propose distinct approaches for architecture design. Understanding these design approaches helps to develop the architectural design framework. In the area of software architecture, a significant change has occurred: the focus has shifted from primarily identifying the result of the architecting process, which was expressed mainly through the representation of components and connectors, to the documentation of architectural design decisions and the underlying reasoning behind them. This shift culminates in the creation of an architectural design framework, so a sound decision-making approach is needed to design the software architecture. The present study analyzes design decisions and proposes a new design decision model for software architecture, introducing a new approach to the decision-making model in which software architecture design is viewed in terms of specific decisions.
Cloud computing is a widely accepted computing environment, and its services are widely available. However, energy consumption is one of the major issues for cloud computing as a green computing environment. Many electronic resources, such as processing and storage devices at both client and server sites and network devices such as switches and routers, are the main sources of energy consumption in the cloud, and during computation additional power is required to cool the IT load in cloud data centers. Due to this high consumption, cloud resources incur high energy costs during cloud service activities and contribute more carbon emissions to the atmosphere. These two issues have inspired cloud companies to develop renewable cloud sustainability regulations to control energy costs and the rate of CO2 emission. The main purpose of this paper is to develop a green computing environment by saving the energy of cloud resources, using an approach that identifies the computing resources actually required during the computation of cloud services. Only the required computing resources remain ON (working state), and the rest are turned OFF (sleep/hibernate state) to reduce energy use in cloud data centers. This approach is more efficient than other available approaches based on cloud service scheduling or on migration and virtualization of services in the cloud network. It reduces the data center's energy usage by applying a power management scheme (ON/OFF) to computing resources. The proposed approach helps convert cloud computing into green computing by identifying the appropriate number of cloud computing resources, such as processing nodes, servers, disks, and switches/routers, during any service computation on the cloud, to handle energy saving and environmental impact.
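The ON/OFF scheme can be sketched as a simple sizing rule: keep only as many nodes awake as the current demand requires and put the rest to sleep. The power figures below are assumptions for illustration, not measurements from the paper:

```python
import math

def power_schedule(demand, capacity_per_node, total_nodes,
                   p_on=200.0, p_sleep=10.0):
    """Keep only the nodes the current demand requires awake; sleep the rest.

    p_on / p_sleep are assumed per-node power draws in watts.
    Returns (nodes_on, total_power_draw).
    """
    nodes_on = min(total_nodes, math.ceil(demand / capacity_per_node))
    draw = nodes_on * p_on + (total_nodes - nodes_on) * p_sleep
    return nodes_on, draw
```

With a demand of 250 requests/s, 100 requests/s per node, and 10 nodes, three nodes stay on and the cluster draws 670 W instead of the 2000 W of an all-on cluster, which is the energy saving the approach targets.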
Fog computing extends cloud computing by transferring computation to the edge of the network, onto mobile collaborative devices or fixed nodes with built-in data storage, computing, and communication capabilities. Fog offers the advantages of enhanced efficiency, better security, network bandwidth savings, and mobility. To provide the essential details of fog computing, we describe the characteristics of this area and distinguish it from cloud computing research. Cloud computing is a developing technology that provides computing resources for a particular task on a pay-per-use basis; it delivers services through three distinct models and offers cheap, centrally managed resources for reliable computing. This paper compares and characterizes fog and cloud computing, which differ in design, deployment, services, and tools for organizations and users. The comparison shows that fog provides a more flexible infrastructure and better data-processing service by consuming low network bandwidth instead of shifting whole data to the cloud.
Currently, every company is concerned about retaining its staff, yet companies are often unable to recognize the genuine reasons for employees' resignations. Each business has its own approach to treating employees and ensuring their satisfaction; as a result, many employees abruptly terminate their employment for no apparent reason. Machine learning (ML) approaches have grown in popularity among researchers in recent decades and can propose answers to a wide range of issues, including predicting staff attrition. In this research, distinct methods are compared to identify which workers are most likely to leave their organization. Two approaches are used to divide the dataset into train and test data: a 70/30 train/test split and K-Fold cross-validation. CatBoost, LightGBM, and XGBoost, all gradient boosting algorithms, are the three methods employed for the accuracy comparison.
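The K-Fold split mentioned above can be sketched without any ML library; each index appears in exactly one test fold (a minimal illustration, not the paper's experimental code):

```python
import random

def kfold_indices(n, k, seed=0):
    """Yield (train, test) index lists for K-Fold cross-validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)       # shuffle once, up front
    folds = [idx[i::k] for i in range(k)]  # k roughly equal folds
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test
```

Compared to a single 70/30 split, every sample is used for testing exactly once, so the accuracy estimate is less sensitive to one lucky or unlucky split.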
Fast development in ubiquitous computing and the growth of radio/wireless and mobile technologies have led to an expanded application space for Radio Frequency Identification (RFID), wireless sensors, and the Internet of Things (IoT). Numerous applications are safety- and privacy-sensitive. The rise of new equipment has enabled intelligent ways of linking physical devices and the computing world through numerous network interfaces. Consequently, it is essential to take note of the risks arising from these communications. In wireless systems, RFID and sensor networks are widely deployed in military, commercial, and automotive applications. With the extensive use of wireless and mobile devices, security has therefore become a major concern, and the need for highly secure encryption and decryption primitives in such devices is more important than ever.
Predicting human emotion from speech is now an important research topic, since one's mental state can be understood through emotion. The proposed research work is emotion recognition from human speech. The proposed system plays a significant role in recognizing emotion while someone is talking and has great use in smart home environments: one can understand the emotion of another person who is at home or elsewhere. Universities, service centers, and hospitals can obtain a valuable decision support system from this emotion prediction system. Features such as MFCC (Mel-frequency cepstral coefficients) and LPC are extracted from audio signal samples. Audio is collected by recording speech. A test is also applied by combining a self-collected dataset, named ABEG, with the popular RAVDESS dataset. The MFCC and LPC features are used in this study to train and test emotion prediction on the angry, happy, and neutral emotion classes. Different machine learning algorithms are applied and their results compared; logistic regression performs well compared to the other ML algorithms.
Many different methods are applied in attempts to solve higher-order nonlinear boundary value problems (BVPs). The Galerkin weighted residual method (GWRM) is widely used to solve BVPs. The main aim of this paper is to find approximate solutions of fifth-, seventh-, and ninth-order nonlinear boundary value problems using GWRM. A trial function, namely Bézier polynomials, is assumed and made to satisfy the given essential boundary conditions. To investigate the effectiveness of the current method, some numerical examples are considered. The results are depicted both graphically and numerically. The numerical solutions are in good agreement with the exact results and achieve higher accuracy. The present method is quite efficient and yields better results when compared with existing methods. All problems are solved using MATLAB R2017a.
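As a minimal illustration of the Galerkin weighted residual idea (here in Python rather than the paper's MATLAB, on the linear toy problem u'' = -1 with u(0) = u(1) = 0 instead of the paper's higher-order nonlinear BVPs, and with polynomial trial functions x^i(1-x) rather than Bézier polynomials):

```python
import numpy as np
from numpy.polynomial import polynomial as P

def galerkin_bvp(N=3):
    """Galerkin weighted residual solution of u'' = -1, u(0) = u(1) = 0.

    The exact solution is u = x(1 - x)/2. Trial functions phi_i = x^i (1 - x)
    satisfy the essential boundary conditions, mirroring GWRM's setup.
    Coefficient arrays use NumPy's low-order-first convention.
    """
    basis = [P.polymul([0] * i + [1], [1, -1]) for i in range(1, N + 1)]

    def integral01(coeffs):          # definite integral over [0, 1]
        anti = P.polyint(coeffs)
        return P.polyval(1.0, anti) - P.polyval(0.0, anti)

    A = np.zeros((N, N))
    b = np.zeros(N)
    for j, pj in enumerate(basis):
        for i, pi in enumerate(basis):
            # Weighted residual: integral of phi_i'' * phi_j.
            A[j, i] = integral01(P.polymul(P.polyder(pi, 2), pj))
        b[j] = -integral01(pj)       # integral of (-1) * phi_j
    c = np.linalg.solve(A, b)
    return c, basis

def u_eval(x, c, basis):
    return sum(ci * P.polyval(x, pi) for ci, pi in zip(c, basis))
```

Since the exact solution lies in the trial space here, the Galerkin solution reproduces x(1-x)/2 to machine precision; for the paper's nonlinear problems the residual equations become nonlinear in the coefficients and are solved iteratively.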
Cervical cancer is one of the main causes of death in countries with low per-capita income. It is quite complicated for any automated system to determine whether a patient is positive for the cancer on the basis of results obtained from the various tests doctors prefer. There were 898 new cases of cervical cancer diagnosed in Australia in 2014, and the risk of a woman being diagnosed by age 85 is 1 in 167. We use machine learning algorithms to determine whether a patient has cancer based on the numerous factors available in the dataset. Predicting the presence of cervical cancer can help the diagnostic process start at an earlier stage.
Since the inception of Blockchain, computer databases have been evolving into innovative technologies, and as new technologies emerge, the use of Blockchain is also flourishing. All of these technologies rely on a common mechanism to operate: the consensus algorithm, the process that assures mutual agreement and stores information in the network's decentralized database. Blockchain's biggest drawback is its exposure to scalability problems. However, using the correct consensus mechanism for the relevant work can ensure efficiency in data storage, transaction finality, and data integrity. In this paper, a comparison study has been made among the following consensus algorithms: Proof of Work (PoW), Proof of Stake (PoS), Proof of Authority (PoA), and Proof of Vote (PoV). This study aims to provide readers with elementary knowledge about blockchain, more specifically its consensus protocols: their origins, how they operate, and their strengths and weaknesses. We have studied these consensus protocols and uncovered some of their advantages and disadvantages in relation to characteristics such as security, energy efficiency, scalability, and IoT (Internet of Things) compatibility. This information will assist future researchers in understanding the characteristics of our selected consensus algorithms.
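Of the four protocols compared, Proof of Work is the easiest to sketch: find a nonce whose hash meets a difficulty target (the difficulty and string encoding below are illustrative, not any particular chain's parameters):

```python
import hashlib

def proof_of_work(data, difficulty=4):
    """Find a nonce so SHA-256(data + nonce) starts with `difficulty` hex zeros."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1
```

Raising the difficulty by one multiplies the expected work by 16, while verification stays a single hash, which is why PoW is secure but scores poorly on energy efficiency relative to PoS, PoA, and PoV.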
Among the factors affecting face recognition and verification, the aging of individuals is a particularly challenging one. Unlike factors such as pose, expression, and illumination, aging is uncontrollable, personalized, and takes place throughout human life. Thus, while the effects of head pose, illumination, and facial expression on face recognition can be minimized by using images from controlled environments, the effect of aging cannot be so controlled. This work exploits the personalized nature of aging to reduce its effect on face recognition, so that an individual can be correctly recognized across his or her different age-separated face images. To achieve this, an individualized face pairing method was developed to pair faces against entire sets of faces grouped by individual; similarity score vectors are then obtained for both matching and non-matching image-individual pairs, and these vectors are used for age-invariant face recognition. This model has the advantage of capturing all possible face matchings (intra-class and inter-class) within a face dataset without having to compute all possible image-to-image pairs, which reduces the computational demand of the model without compromising the impact of the aging factor on the identity of the human face. The developed model was evaluated on the publicly available FG-NET dataset, two subsets of the CACD dataset, and a locally obtained FAGE dataset using leave-one-person-out (LOPO) cross-validation, achieving recognition accuracies of 97.01%, 99.89%, 99.92%, and 99.53% respectively. The developed model can be used to improve face recognition models by making them robust to age variations among individuals in the dataset.
An earlier research project that dealt with converting ASCII codes into 2D Cartesian coordinates and then applying translation and rotation transformations to construct an encryption system is improved by this study. Here, we present a variation of the Cantor pairing function to convert ASCII values into distinctive 2D coordinates. Then, we apply some novel methods to jumble the ciphertext generated as a result of the transformations. We suggest numerous improvements to the earlier research via simple tweaks to the existing code and by introducing a novel key generation protocol that generates an infinite integral key space with no decryption failures. The only way to break this protocol with no prior information would be a brute-force attack. With the help of elementary combinatorics and probability, we prove that this encryption protocol is seemingly infeasible for an unwelcome adversary to overcome.
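The classic Cantor pairing function, of which the paper uses a variation (the exact variation is not specified here), maps each pair of non-negative integers to a unique integer and back, which is what makes ASCII-to-coordinate conversion invertible:

```python
def cantor_pair(x, y):
    """Classic Cantor pairing: a bijection from pairs of naturals to naturals."""
    return (x + y) * (x + y + 1) // 2 + y

def cantor_unpair(z):
    """Invert the pairing by recovering the diagonal index w = x + y."""
    w = int(((8 * z + 1) ** 0.5 - 1) // 2)
    y = z - w * (w + 1) // 2
    return w - y, y
```

For instance, the ASCII pair (65, 90), i.e. 'A' and 'Z', maps to 12180 and decodes back exactly, so no two character pairs collide.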
Quantum computing is a computational framework based on quantum mechanics that has received a great deal of attention in the past few decades. In comparison to traditional computers, it has achieved amazing performance on several specialized tasks. Quantum computing is the study of quantum computers that use quantum-mechanical phenomena such as entanglement, superposition, annealing, and tunneling to solve problems that humans cannot solve within their lifetime. This article offers a brief outline of what is happening in the field of quantum computing, as well as the current state of the art. It also summarizes the features of quantum computing in terms of major elements such as qubit computation, quantum parallelism, and reverse computing. The study investigates the cause of a quantum computer's great computing capabilities by examining quantum entangled states. It also emphasizes that quantum computer research requires a combination of the most sophisticated sciences, such as computer technology, micro-physics, and advanced mathematics.
Data mining and machine learning are important areas in which studies have increased in recent years. Data is critical because these areas focus on inferring meaningful conclusions from the data collected, and the preparation of the data is very important for the studies to be carried out and the algorithms to be applied. One of the most critical steps in data preparation is outlier detection, because observations whose characteristics differ from the rest of the data affect the results of the algorithms to be applied and may cause erroneous results. New methods have been developed for outlier detection, and machine learning and data mining algorithms have achieved successful results with these methods. Algorithms such as Fuzzy C-Means (FCM) and Self-Organizing Maps (SOM) have given successful results for outlier detection in this area; however, there is no outlier detection method in which these two powerful clustering methods are used together. This study proposes a new outlier detection algorithm (FUSOMOUT) that uses the SOM and FCM clustering methods together, with the aim of increasing the success of both clustering and classification algorithms. The proposed algorithm was applied to four datasets with different characteristics (the Wisconsin breast cancer dataset (WDBC), Wine, Diabetes, and Kddcup99) and was shown to significantly increase classification accuracy, with the Silhouette, Calinski-Harabasz, and Davies-Bouldin indexes as clustering success measures.
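A minimal NumPy sketch of the FCM half of the idea: points whose maximum membership is low lie far from every cluster centre and are outlier candidates. This is only the textbook FCM update, not the FUSOMOUT combination with SOM:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Textbook Fuzzy C-Means; returns cluster centres and membership matrix U."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                # rows sum to 1
    for _ in range(iters):
        W = U ** m                                   # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1))
        U = inv / inv.sum(axis=1, keepdims=True)     # standard FCM update
    return centers, U
```

A point sitting between two well-separated clusters ends up with memberships near 0.5 each, so its maximum membership is markedly lower than that of an inlier; thresholding this quantity is one simple FCM-based outlier score.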
We aim to extract emotional components from statements to identify the emotional state of the writer and assign an emoji related to the emotion. Emojis have become a staple of everyday text-based communication; it is normal and common to construct an entire response with emoji alone. It comes as no surprise, therefore, that effort is being put into the automatic prediction and selection of emoji appropriate for a text message. Major companies like Apple and Google have made immense strides here and have already deployed such systems into production (for example, the Google Gboard). The proposed work focuses on the problem of automatic emoji selection for a given text message, using machine learning classification algorithms to categorize the tone of a message, which is further segregated through n-grams into one of seven distinct categories. Based on the output of the classifier, one of the most appropriate emoji is selected from a predefined list using natural language processing (NLP) and sentiment analysis techniques. The corpus is extracted from Twitter. The result is a plain text message made lively after being annotated with appropriate emoji.
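The n-gram segregation step can be illustrated with a few lines of Python (a toy feature extractor, not the paper's pipeline; the tokenization is deliberately simplistic):

```python
from collections import Counter

def ngram_features(text, n=2):
    """Bag of word n-grams: the kind of feature a tone classifier consumes."""
    words = text.lower().split()
    grams = [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]
    return Counter(grams)
```

For example, ngram_features("I am so happy today") yields four bigrams, including "so happy", which a trained classifier could associate with one of the seven tone categories before an emoji is chosen.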
A fundamental principle and assumption of cosmology says that the universe is homogeneous and isotropic when viewed on a large scale. According to the cosmological principle, space might be flat or have negative or positive curvature in a cosmological model: a positively curved universe is closed, and a negatively curved universe is open. Our universe is of the flat type because it expands in every direction, curving neither positively nor negatively. We have observed that the progression of the universe is based on radiation and matter domination. In this paper we have also observed that a possible upper limit on the future age of the universe is 9.4203 × 10^10 years, which varies with density.
Forecasting estimates the magnitude of uncertain future events and provides different results under different suppositions. In order to identify the core data pattern of jute bale requirements for yarn production, we examined 10 years' worth of Jute Yarn/Twine data shipped by their member mills. Exponential smoothing and Holt's method are commonly used for this kind of forecast because they provide adequate results, and selecting the right smoothing constant value is essential for reducing forecasting errors. In this work, we created a method for choosing the smoothing constant's ideal value to reduce errors as measured by the mean square error (MSE), mean absolute deviation (MAD), and mean absolute percentage error (MAPE). Finally, we discuss the research findings and future possibilities, so that Jute Mills Limited and similar companies may execute forecasting smoothly and develop the expertise level of their procurement systems to stay competitive in the worldwide market.
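The idea of choosing the smoothing constant to minimise error can be sketched as a grid search over simple exponential smoothing (illustrative only; the paper also considers Holt's method and the MAD and MAPE criteria):

```python
def best_alpha(series):
    """Grid-search the SES smoothing constant that minimises one-step MSE."""
    best_mse, best_a = float("inf"), None
    for i in range(1, 100):                 # alpha in {0.01, ..., 0.99}
        a = i / 100
        forecast = series[0]                # initialise with the first value
        sse = 0.0
        for y in series[1:]:
            sse += (y - forecast) ** 2      # one-step-ahead squared error
            forecast = a * y + (1 - a) * forecast
        mse = sse / (len(series) - 1)
        if mse < best_mse:
            best_mse, best_a = mse, a
    return best_a, best_mse
```

On strongly trending demand data the search pushes alpha toward 1 (recent observations dominate); on noisy but level data it settles much lower, smoothing the noise away.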