ISSN: 2310-9025 (Print)
ISSN: 2310-9033 (Online)
DOI: https://doi.org/10.5815/ijmsc
Website: https://www.mecs-press.org/ijmsc
Published By: MECS Press
Frequency: 4 issues per year
Number(s) Available: 41
IJMSC is committed to bridging the theory and practice of mathematical sciences and computing. IJMSC publishes original, peer-reviewed, and high-quality articles in the areas of mathematical sciences and computing. IJMSC is a well-indexed scholarly journal and indispensable reading and reference for people working at the cutting edge of mathematical sciences and computing applications.
IJMSC has been abstracted or indexed by several world-class databases: Google Scholar, Microsoft Academic Search, CrossRef, CNKI, Baidu Wenku, JournalTOCs, etc.
IJMSC Vol. 10, No. 4, Dec. 2024
REGULAR PAPERS
This study analyzes wastewater treatment processes at a mining company in the Almaty region, Kazakhstan. Four treatment schemes were developed and assessed, with a focus on optimizing efficiency. The discharged water quality from different technological lines was evaluated using integral functions for a quantitative comparison of each scheme's performance. Additionally, an expert system was developed to validate the results and support future research in wastewater treatment.
Consider a graph G = (V, E) and a function f : V → {0, 1, 2}. A node u with f(u) = 0 that is not adjacent to a node labeled 1 or 2 is said to be unguarded. A function f = (V0, V1, V2) fulfilling the condition that each node u with f(u) = 0 is adjacent to at least one node v with f(v) = 2 is referred to as a Roman dominating function (RDF) of the graph G. The Roman domination number (RDN) of a graph, denoted γR(G), is the minimum number of guards that must be employed over all RDFs. In this paper, we introduce a new construction, the Semi-Middle Graph of a given graph, and we find the RDN for the Semi-Middle Graph of some specific graphs. In the field of networking, the concept of the Semi-Middle Graph can be applied to network topology optimization: it can be used in the design of communication networks where each direct connection (edge) between two nodes (devices or routers) has an intermediary node that facilitates more efficient data routing or enhances fault tolerance. The usefulness of the Roman domination number of a Semi-Middle Graph in networking includes network coverage and monitoring, fault tolerance and redundancy, optimal placement of relays and routers, and load balancing and resource allocation.
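To make the definition concrete, here is a minimal brute-force sketch (not from the paper) that checks the RDF condition and finds the minimum RDF weight γR(G) for a small graph given as an adjacency dictionary; it is exponential in |V| and meant only for illustration:

```python
from itertools import product

def is_rdf(adj, f):
    """Roman domination condition: every node labeled 0 must be
    adjacent to at least one node labeled 2."""
    return all(any(f[v] == 2 for v in adj[u])
               for u in adj if f[u] == 0)

def roman_domination_number(adj):
    """Brute-force gamma_R(G): minimum weight sum(f) over all RDFs."""
    nodes = list(adj)
    best = None
    for labels in product((0, 1, 2), repeat=len(nodes)):
        f = dict(zip(nodes, labels))
        if is_rdf(adj, f):
            w = sum(labels)
            best = w if best is None else min(best, w)
    return best

# Example: the path P4; label an inner node 2 and the far endpoint 1
P4 = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(roman_domination_number(P4))  # 3
```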
This research paper introduces SecretCentric, an innovative automated hardware-based password management system addressing the challenges of widely used password authentication methods, which have long been criticized for their poor performance. Password management plays a crucial role in protecting users' digital security and privacy, with key factors including password generation, storage, renewal, and reuse mitigation. Although numerous password managers and solutions have been introduced to tackle these challenges, password management automation has never been thoroughly explored. This study aims to revolutionize the field by eliminating the burden of manual password management from users by automating the entire process. A comprehensive survey yielded insights into user perceptions of password management and prevalent malpractices. SecretCentric was designed to optimize the security and usability trade-off in line with the identified user expectations. Preliminary evaluations indicate that SecretCentric offers significant improvements over existing options, highlighting the necessity for an automated solution that balances security and usability in the era of increasing online services. The system's success demonstrates the importance of proper password management rather than replacement, contributing to research advancement in user authentication and credential management.
Big Data is a new class of technology that gives businesses more insight into their massive data sets, allowing them to make better business decisions and satisfy customers. Big data systems are also a desirable target for hackers due to the aggregation of their data. Hadoop is used to handle large data sets through reading and writing application programs on a distributed system, and the Hadoop Distributed File System (HDFS) is used to store massive data. Since HDFS does not safeguard data privacy, encrypting the file is the right way to protect the stored data in HDFS, but it takes a long time. In this paper, regarding privacy concerns, we use different compression-type data storage file formats with the proposed user-defined function (XOR one-time pad with AES) to secure data in HDFS. In this way, we provide a dual level of security by masking the selective data and the whole data in the file. Our experiment demonstrates that the whole process time is significantly smaller than that of a conventional method. The proposed UDF with the ORC, Zlib file format gives 9-10% better performance results than 2DES and other methods. Finally, we decreased the load time of secure data and significantly improved query processing time with the Hive engine.
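As a rough illustration of the selective-masking level, here is a pure-Python XOR one-time pad sketch; the paper's actual UDF runs inside Hive over compressed ORC files and adds AES encryption over the whole file, neither of which is reproduced here:

```python
import secrets

def otp_mask(data: bytes) -> tuple[bytes, bytes]:
    """XOR one-time pad: mask a sensitive field with a fresh random
    pad of equal length. XOR is its own inverse."""
    pad = secrets.token_bytes(len(data))
    masked = bytes(b ^ p for b, p in zip(data, pad))
    return masked, pad

def otp_unmask(masked: bytes, pad: bytes) -> bytes:
    return bytes(b ^ p for b, p in zip(masked, pad))

record = b"ssn=123-45-6789"              # hypothetical sensitive field
masked, pad = otp_mask(record)            # store only the masked bytes
assert otp_unmask(masked, pad) == record  # the pad acts as the key
```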
In a Vehicular Ad Hoc Network (VANET), numerous vehicles are interconnected through a wireless network to facilitate communication. The primary objective of a VANET is to enhance driver safety and comfort by enabling the exchange of traffic-related messages within the vehicular environment. These messages can include vital information such as traffic conditions, accident alerts, and road hazards. However, addressing the security challenges in VANETs is paramount to avoid serious vulnerabilities that can compromise the entire network. One of the critical security challenges is conditional privacy-preserving authentication. This requirement mandates that each vehicle must be authenticated by other vehicles or Roadside Units (RSUs) while ensuring the privacy of the vehicle's identity. Moreover, it is essential to have the capability to trace a malicious user under specific conditions, such as in the event of a security breach or misuse of the network. In this research, we conduct an in-depth cryptanalysis of a recently proposed aggregate signature scheme designed for authentication in VANETs with the conditional privacy-preserving property. Our analysis shows that the existing scheme is vulnerable to a malicious Key Generation Center (KGC) attacker, contrary to the authors' claims. To address these issues, we propose a novel, secure, and efficient authentication scheme that maintains the conditional privacy-preserving property. We evaluate our scheme and provide a formal security proof within the Random Oracle Model (ROM). In addition to enhancing security, our scheme improves efficiency by reducing the computational and communication overhead typically associated with authentication processes in VANETs. This makes our solution not only secure but also practical for real-world deployment.
With the reform of the Chinese economic system, the development of enterprises is facing many risks and challenges. In order to understand the operating state of enterprises, it is necessary to apply relevant methods to evaluate enterprise performance. Taking the Industrial and Commercial Bank of China (ICBC) as an example, this paper selects its financial data from 2018 to 2021. Firstly, DuPont analysis is applied to decompose the return on equity into the product of the profit margin on sales, the total assets turnover ratio, and the equity multiplier. The paper then analyzes the effect of changes in each of these three factors on the return on equity using the chain substitution method. The results show that the effect of the profit margin on sales on the return on equity decreases year by year and turns from negative to positive. The effect of the total assets turnover ratio on the return on equity changes from positive to negative and then back to positive, while the effect of the equity multiplier is the opposite. These results provide a direction for adjusting ICBC's return on equity. Finally, according to the results, some suggestions are put forward for the development of the bank.
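A small worked sketch of the DuPont decomposition and the chain substitution attribution (the figures below are made up for illustration, not ICBC's actual data):

```python
def chain_substitution(pm0, tat0, em0, pm1, tat1, em1):
    """DuPont: ROE = profit margin * total assets turnover * equity
    multiplier. Chain substitution replaces one factor at a time to
    attribute the change in ROE; the effects sum to the total change."""
    effect_pm  = (pm1 - pm0) * tat0 * em0
    effect_tat = pm1 * (tat1 - tat0) * em0
    effect_em  = pm1 * tat1 * (em1 - em0)
    total = pm1 * tat1 * em1 - pm0 * tat0 * em0
    assert abs((effect_pm + effect_tat + effect_em) - total) < 1e-12
    return effect_pm, effect_tat, effect_em

# Illustrative base-year and current-year factor values:
print(chain_substitution(0.30, 0.040, 11.0, 0.32, 0.042, 10.5))
```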
The process of making decisions on software architecture is of the greatest significance for the success of a software system. Software architecture establishes the framework of the system, specifies its characteristics, and has significant effects across the whole life cycle of the system. The complicated characteristics of the software development context and the significance of the problem have led the research community to build various methodologies focused on supporting software architects in improving their decision-making abilities. Despite these efforts, the adoption of such systematic methodologies appears to be somewhat constrained in practice. Moreover, decision-makers must overcome unexpected difficulties due to the varying software development processes that propose distinct approaches for architecture design. Understanding these design approaches helps to develop the architectural design framework. In the area of software architecture, a significant change has occurred wherein the focus has shifted from primarily identifying the result of the architecting process, which was mainly expressed through the representation of components and connectors, to documenting architectural design decisions and the underlying reasoning behind them. This shift ultimately culminates in the creation of an architectural design framework. A correct decision-making approach is therefore needed to design the software architecture. The present study analyzes design decisions and proposes a new design decision model for software architecture, introducing a new approach to the decision-making model wherein software architecture design is viewed in terms of specific decisions.
Cloud computing is a widely accepted computing environment, and its services are widely available, but energy consumption is one of the major issues of cloud computing as a green computing platform. Many electronic resources, such as processing and storage devices at both the client and server sites and network devices like switches and routers, are the main elements of energy consumption in the cloud, and during computation additional power is required to cool the IT load. Due to this high consumption, cloud resources incur high energy costs during service activities and contribute more carbon emissions to the atmosphere. These two issues have inspired cloud companies to develop renewable cloud sustainability regulations to control energy costs and the rate of CO2 emission. The main purpose of this paper is to develop a green computing environment by saving the energy of cloud resources, using the specific approach of identifying the computing resources required during the computation of cloud services. Only the required computing resources remain ON (working state), and the rest are turned OFF (sleep/hibernate state) to reduce energy use in cloud data centers. This approach is more efficient than other available approaches based on cloud service scheduling or migration and virtualization of services in the cloud network. It reduces a cloud data center's energy usage by applying a power management scheme (ON/OFF) to computing resources. The proposed approach helps to convert cloud computing into green computing by identifying the appropriate number of cloud computing resources, such as processing nodes, servers, disks, and switches/routers, during any service computation on the cloud, thereby achieving energy savings and reducing environmental impact.
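A minimal sketch of the ON/OFF idea: keep only the nodes needed for the current service demand (plus a reserve) in the working state. The capacity figures and reserve margin below are illustrative assumptions, not values from the paper:

```python
import math

def nodes_to_keep_on(demand, capacity_per_node, total_nodes, reserve=1):
    """Return how many nodes stay ON; the remainder are put to
    sleep/hibernate to save energy."""
    needed = math.ceil(demand / capacity_per_node) + reserve
    return min(needed, total_nodes)

active = nodes_to_keep_on(demand=730, capacity_per_node=100, total_nodes=12)
print(f"ON: {active}, OFF: {12 - active}")  # ON: 9, OFF: 3
```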
In the software development industry, ensuring software quality holds immense significance due to its direct influence on user satisfaction, system reliability, and the overall end-user experience. Traditionally, the development process involved identifying and rectifying defects after the implementation phase, which could be time-consuming and costly. This study examines software development methodologies, with a specific emphasis on Test-Driven Development (TDD), to evaluate its effectiveness in improving software quality. The study employs a mixed-methods approach, combining quantitative surveys and qualitative interviews, to comprehensively investigate the impact of TDD on various facets of software quality. The survey findings reveal that TDD offers substantial benefits in terms of early defect detection, leading to reduced cost and effort in rectifying issues during the development process. Moreover, TDD encourages improved code design and maintainability, fostering the creation of modular and loosely coupled code structures. These results underscore the pivotal role of TDD in elevating code quality and maintainability. Comparative analysis with traditional development methodologies highlights TDD's effectiveness in enhancing software quality, as rated highly by respondents, and clarifies its positive impact on user satisfaction, overall product quality, and code maintainability. Challenges related to TDD adoption are identified, such as the initial time investment in writing tests and difficulties adapting to changing requirements, and strategies to mitigate these challenges are proposed, contributing to the practical application of TDD. The study offers valuable insights into the efficacy of TDD in enhancing software quality: it not only highlights the benefits of TDD but also provides a framework for addressing challenges and optimizing its utilization. This knowledge is invaluable for software development teams, project managers, and quality assurance professionals, facilitating informed decisions regarding the adoption and implementation of TDD as a quality assurance technique in software development.
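For readers unfamiliar with the technique itself, a minimal red-green illustration of the TDD cycle (the `slugify` function is a hypothetical example, not one from the study):

```python
import unittest

# Step 1 (red): write the test first; it fails while slugify() is absent.
class TestSlugify(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

# Step 2 (green): write the minimal code that makes the test pass.
def slugify(text: str) -> str:
    return "-".join(text.lower().split())

# Step 3 (refactor): improve the code while the test stays green.
if __name__ == "__main__":
    unittest.main()
```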
Since the inception of Blockchain, computer databases have been evolving into innovative technologies, and as new technologies emerge, the use of Blockchain is flourishing as well. All Blockchain-based technologies rely on a common kind of algorithm to operate: the consensus algorithm, the process that assures mutual agreement and stores information in the decentralized database of the network. Blockchain's biggest drawback is its exposure to scalability limits. However, using the correct consensus mechanism for the relevant work can ensure efficiency in data storage, transaction finality, and data integrity. In this paper, a comparative study has been made of the following consensus algorithms: Proof of Work (PoW), Proof of Stake (PoS), Proof of Authority (PoA), and Proof of Vote (PoV). This study aims to provide readers with elementary knowledge about blockchain, more specifically its consensus protocols: their origins, how they operate, and their strengths and weaknesses. We have studied these consensus protocols in detail and uncovered some of their advantages and disadvantages in relation to characteristics such as security, energy efficiency, scalability, and IoT (Internet of Things) compatibility. This information will assist future researchers in understanding the characteristics of the selected consensus algorithms.
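As a concrete taste of the oldest of these protocols, a toy Proof of Work miner: finding a valid nonce is costly, while verifying it takes a single hash. (Real PoW compares the digest against a numeric target rather than a hex prefix; this simplification is ours, not the paper's.)

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce whose SHA-256 digest of (data + nonce) starts
    with `difficulty` hex zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = mine("block#1|txs", difficulty=4)
print(nonce)  # took ~16^4/2 hashes to find; one hash to verify
```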
Fog computing extends cloud computing by moving computation to the edge of the network, onto mobile collaborative devices or fixed nodes with built-in data storage, computing, and communication capabilities. Fog offers the advantages of improved efficiency, better security, network bandwidth savings, and mobility. To give the essential details of fog computing, we describe the characteristics of this area and distinguish it from cloud computing research. Cloud computing is a developing technology that provides computing resources for a particular task on a pay-per-use basis; it delivers services through three distinct models and provides inexpensive, centrally managed resources for reliable computing. This paper compares and characterizes fog and cloud computing, which differ in design, deployment, services, and tools for organizations and users. The comparison shows that fog provides a more flexible infrastructure and better data-processing service by consuming low network bandwidth instead of shifting whole data sets to the cloud.
Predicting human emotion from speech is now an important research topic, since one's mental state can be understood through emotion. The proposed research work is emotion recognition from human speech: the system recognizes emotion while someone is talking, which is of great use in a smart home environment, where one can understand the emotion of a person at home or elsewhere. Universities, service centers, or hospitals can obtain a valuable decision support system from this emotion prediction system. Features such as MFCC (Mel-Frequency Cepstral Coefficients) and LPC coefficients are extracted from the audio signal. Audio samples were collected by recording speeches; a test was also run on a combination of the self-collected dataset, named ABEG, and the popular RAVDESS dataset. The MFCC and LPC features are used in this study to train and test models for predicting emotion across three classes: angry, happy, and neutral. Different machine learning algorithms are applied and their results compared; logistic regression performs well compared with the other ML algorithms.
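A sketch of the feature-extraction step using the librosa library (assumed available); the mean-pooling scheme, LPC order, and file name are illustrative assumptions, not the paper's exact pipeline:

```python
import numpy as np
import librosa  # assumed dependency: pip install librosa

def extract_features(path, n_mfcc=13, lpc_order=12):
    """One feature vector per utterance: mean-pooled MFCC frames
    concatenated with LPC coefficients."""
    y, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)  # (n_mfcc, frames)
    lpc = librosa.lpc(y, order=lpc_order)                   # (lpc_order + 1,)
    return np.concatenate([mfcc.mean(axis=1), lpc])

# x = extract_features("angry_001.wav")  # hypothetical recording
# Stack such vectors and feed them to, e.g., logistic regression
# for the angry/happy/neutral classes.
```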
Currently, every company is concerned about the retention of its staff, yet companies are often unable to recognize the genuine reasons for job resignations due to various circumstances. Each business has its own approach to treating employees and ensuring their satisfaction; even so, many employees abruptly terminate their employment for no apparent reason. Machine learning (ML) approaches have grown in popularity among researchers in recent decades and are capable of proposing answers to a wide range of issues, so ML can be used to predict staff attrition. In this research, distinct methods are compared to identify which workers are most likely to leave their organization. Two approaches are used to divide the dataset into train and test data: a 70 percent train, 30 percent test split, and K-Fold cross-validation. CatBoost, LightGBM, and XGBoost, all gradient boosting algorithms, are the three methods employed for the accuracy comparison.
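A sketch of the two evaluation protocols using scikit-learn; note that it swaps in sklearn's GradientBoostingClassifier as a stand-in for CatBoost/LightGBM/XGBoost and synthetic data for the paper's HR dataset:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split, KFold, cross_val_score

# Stand-in data; the attrition features themselves are not reproduced.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Approach 1: 70/30 hold-out split
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
print("70/30 accuracy:", clf.score(X_te, y_te))

# Approach 2: K-Fold cross-validation
scores = cross_val_score(GradientBoostingClassifier(random_state=0), X, y,
                         cv=KFold(n_splits=5, shuffle=True, random_state=0))
print("5-fold mean accuracy:", scores.mean())
```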
Among the factors affecting face recognition and verification, the aging of individuals is a particularly challenging one. Unlike other factors such as pose, expression, and illumination, aging is uncontrollable, personalized, and takes place throughout human life. Thus, while the effects of factors such as head pose, illumination, and facial expression on face recognition can be minimized by using images from controlled environments, the effect of aging cannot be so controlled. This work exploits the personalized nature of aging to reduce its effect on face recognition so that an individual can be correctly recognized across his/her different age-separated face images. To achieve this, an individualized face pairing method was developed to pair faces against entire sets of faces grouped by individual; similarity score vectors are then obtained for both matching and non-matching image-individual pairs, and these vectors are used for age-invariant face recognition. This model has the advantage of capturing all possible face matchings (intra-class and inter-class) within a face dataset without having to compute all possible image-to-image pairs, which reduces the computational demand of the model without compromising the impact of the aging factor on the identity of the human face. The developed model was evaluated on the publicly available FG-NET dataset, two subsets of the CACD dataset, and a locally obtained FAGE dataset using leave-one-person-out (LOPO) cross-validation, achieving recognition accuracies of 97.01%, 99.89%, 99.92%, and 99.53%, respectively. The developed model can be used to improve face recognition models by making them robust to age variations in individuals in the dataset.
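The abstract does not give the exact pairing formula, but an assumed form of the image-to-individual score vectors might look like this sketch, which aggregates cosine similarities against each individual's whole face set instead of enumerating all image pairs:

```python
import numpy as np

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def image_individual_scores(probe, gallery):
    """Pair one probe embedding against each individual's set of face
    embeddings; return one aggregated score vector ([max, mean] per
    individual) rather than all image-to-image pairs."""
    return np.array([
        [max(cosine(probe, g) for g in faces),
         np.mean([cosine(probe, g) for g in faces])]
        for faces in gallery.values()
    ])

# Hypothetical 128-d embeddings grouped by individual:
rng = np.random.default_rng(0)
gallery = {pid: [rng.normal(size=128) for _ in range(5)] for pid in ("A", "B")}
print(image_individual_scores(rng.normal(size=128), gallery))
```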
Quantum computing is a computational framework based on quantum mechanics that has received considerable attention in the past few decades. In comparison with traditional computers, it has achieved remarkable performance on several specialized tasks. Quantum computing is the study of quantum computers, which use quantum mechanical phenomena such as entanglement, superposition, annealing, and tunneling to solve problems that humans cannot solve in their lifetime. This article offers a brief outline of what is happening in the field of quantum computing and the current state of the art. It also summarizes the features of quantum computing in terms of major elements such as qubit computation, quantum parallelism, and reverse computing. The study traces a quantum computer's great computing capability to its use of quantum entangled states. It also emphasizes that quantum computer research requires a combination of the most sophisticated sciences, such as computer technology, micro-physics, and advanced mathematics.
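Superposition and entanglement can be illustrated with a tiny state-vector simulation in NumPy, preparing a Bell state from |00⟩ (a generic illustration, not an example from the article):

```python
import numpy as np

# Single-qubit basis state and gates
zero = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Hadamard puts qubit 0 in superposition; CNOT entangles the pair.
state = np.kron(H @ zero, zero)   # (|00> + |10>) / sqrt(2)
state = CNOT @ state              # (|00> + |11>) / sqrt(2), a Bell state
print(np.round(state, 3))         # [0.707 0. 0. 0.707]
```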
Many different methods are applied in attempts to solve higher order nonlinear boundary value problems (BVPs), and the Galerkin weighted residual method (GWRM) is widely used among them. The main aim of this paper is to find approximate solutions of fifth, seventh, and ninth order nonlinear boundary value problems using the GWRM. A trial function, namely a Bezier polynomial, is assumed and made to satisfy the given essential boundary conditions. To investigate the effectiveness of the current method, some numerical examples are considered. The results are depicted both graphically and numerically; the numerical solutions are in good agreement with the exact results and achieve high accuracy. The present method is quite efficient and yields better results when compared with existing methods. All problems are solved using the software MATLAB R2017a.
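A minimal GWRM sketch, in Python rather than the paper's MATLAB, for the toy second-order problem u'' = -1, u(0) = u(1) = 0 (the paper treats fifth, seventh, and ninth order problems; the degree-6 Bernstein/Bezier basis here is our assumption). Interior Bernstein polynomials vanish at both endpoints, so the essential boundary conditions hold automatically:

```python
import numpy as np
from math import comb
from scipy.integrate import quad

n = 6  # degree of the Bernstein (Bezier) basis; an assumed choice

def bern(i, n, x):
    """Bernstein basis polynomial B_{i,n}(x) on [0, 1]."""
    return comb(n, i) * x**i * (1 - x)**(n - i)

def dbern(i, n, x):
    """d/dx B_{i,n} = n * (B_{i-1,n-1} - B_{i,n-1})."""
    left = bern(i - 1, n - 1, x) if i > 0 else 0.0
    right = bern(i, n - 1, x) if i < n else 0.0
    return n * (left - right)

idx = list(range(1, n))   # interior basis functions only
f = lambda x: -1.0        # model problem; exact solution u = x(1-x)/2

# Weak form: integral(u' v') = -integral(f v) for each test function v
K = np.array([[quad(lambda x: dbern(i, n, x) * dbern(j, n, x), 0, 1)[0]
               for j in idx] for i in idx])
b = np.array([quad(lambda x: -f(x) * bern(i, n, x), 0, 1)[0] for i in idx])
c = np.linalg.solve(K, b)

u = lambda x: sum(ci * bern(i, n, x) for ci, i in zip(c, idx))
print(u(0.5))  # 0.125, matching the exact solution at x = 0.5
```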
An earlier research project, which dealt with converting ASCII codes into 2D Cartesian coordinates and then applying translation and rotation transformations to construct an encryption system, is improved upon by this study. Here, we present a variation of the Cantor Pairing Function to convert ASCII values into distinctive 2D coordinates. Then, we apply some novel methods to jumble the ciphertext generated as a result of the transformations. We suggest numerous improvements to the earlier research via simple tweaks in the existing code and by introducing a novel key generation protocol that generates an infinite integral key space with no decryption failures. The only way to break this protocol with no prior information would be a brute-force attack. With the help of elementary combinatorics and probability, we argue that this encryption protocol is practically infeasible for an unwelcome adversary to overcome.
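For reference, the classic Cantor pairing function and its inverse (the paper uses its own variation, which is not reproduced here); the fixed second coordinate below is an arbitrary illustration:

```python
from math import isqrt

def cantor_pair(x: int, y: int) -> int:
    """Classic Cantor pairing: a bijection from N x N to N."""
    return (x + y) * (x + y + 1) // 2 + y

def cantor_unpair(z: int) -> tuple[int, int]:
    """Invert the pairing: recover (x, y) from z."""
    w = (isqrt(8 * z + 1) - 1) // 2   # largest w with w(w+1)/2 <= z
    y = z - w * (w + 1) // 2
    return w - y, y

for ch in "Hi!":
    z = cantor_pair(ord(ch), 7)       # 7: arbitrary second coordinate
    assert cantor_unpair(z) == (ord(ch), 7)
```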
There exist numerous numerical methods for solving initial value problems of ordinary differential equations, and the accuracy level and computational time are not the same across these methods. In this article, the Modified Euler method is discussed for finding accurate solutions of ordinary differential equations using different step sizes. Approximate results obtained with different step sizes are shown in a result analysis table. Some problems are solved by the proposed method, and the approximate results are compared graphically with the exact solution for a better understanding of the accuracy of this method. Errors are estimated at each step and represented graphically using the MATLAB programming language and MS Excel, which reveals that a very small step size gives better accuracy with less computational error. It is observed that this method is suitable for obtaining accurate solutions of ODEs when the step sizes taken are sufficiently small.
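A compact sketch of the Modified Euler (Heun) scheme, in Python rather than the article's MATLAB, applied to y' = y, y(0) = 1, whose exact value at x = 1 is e; shrinking the step size shows the error dropping roughly quadratically:

```python
import math

def modified_euler(f, x0, y0, h, n):
    """Modified Euler (Heun): predict with Euler, then correct with
    the average of the slopes at both ends of the step."""
    xs, ys = [x0], [y0]
    x, y = x0, y0
    for _ in range(n):
        k1 = f(x, y)
        y_pred = y + h * k1               # predictor (plain Euler)
        k2 = f(x + h, y_pred)
        y = y + h * (k1 + k2) / 2         # corrector
        x = x + h
        xs.append(x); ys.append(y)
    return xs, ys

for h in (0.1, 0.01):
    _, ys = modified_euler(lambda x, y: y, 0.0, 1.0, h, int(1 / h))
    print(h, ys[-1], abs(ys[-1] - math.e))  # error shrinks ~100x per 10x in h
```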
We aim to extract emotional components within statements to identify the emotional state of the writer and assign an emoji related to that emotion. Emojis have become a staple part of everyday text-based communication, and it is normal and common to construct an entire response with the sole use of emoji. It comes as no surprise, therefore, that effort is being put into the automatic prediction and selection of emoji appropriate for a text message. Major companies like Apple and Google have made immense strides in this and have already deployed such systems into production (for example, the Google Gboard). The proposed work is focused on the problem of automatic emoji selection for a given text message, using machine learning classification algorithms to categorize the tone of a message, which is further segregated through n-grams into one of seven distinct categories. Based on the output of the classifier, one of the more appropriate emoji is selected from a predefined list using natural language processing (NLP) and sentiment analysis techniques. The corpus is extracted from Twitter. The result is a plain text message made lively by being annotated with appropriate emoji.
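A toy version of the pipeline with scikit-learn: an n-gram bag-of-words classifier over a hand-made corpus (the paper's Twitter corpus, its exact seven categories, and the emoji list are all assumptions here):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative corpus: one sample per assumed category.
texts = ["i love this so much", "i am so angry right now",
         "feeling very sad today", "wow i did not expect that",
         "that is gross and disgusting", "this is scary i am afraid",
         "haha that joke is hilarious"]
labels = ["joy", "anger", "sadness", "surprise", "disgust", "fear", "funny"]
EMOJI = {"joy": "😊", "anger": "😠", "sadness": "😢", "surprise": "😮",
         "disgust": "🤢", "fear": "😨", "funny": "😂"}

# Uni- and bi-gram features feeding a linear classifier.
clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(texts, labels)

msg = "haha what a hilarious joke"
print(msg, EMOJI[clf.predict([msg])[0]])
```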
Data mining and machine learning are important areas in which studies have increased in recent years. Data is critical for these areas, which focus on inferring meaningful conclusions from the data collected, and preparing the data is very important for the studies to be carried out and the algorithms to be applied. One of the most critical steps in data preparation is outlier detection, because observations that have different characteristics from the other observations in the data affect the results of the algorithms to be applied and may cause erroneous results. New methods have been developed for outlier detection, and machine learning and data mining algorithms have achieved successful results with them. Algorithms such as Fuzzy C-Means (FCM) and Self-Organizing Maps (SOM) have given successful results for outlier detection in this area; however, there is no outlier detection method in which these two powerful clustering methods are used together. This study proposes a new outlier detection algorithm (FUSOMOUT) that uses the SOM and FCM clustering methods together, with the aim of increasing the success of both clustering and classification algorithms. The proposed algorithm was applied to four datasets with different characteristics (the Wisconsin breast cancer dataset (WDBC), Wine, Diabetes, and Kddcup99) and was shown to significantly increase classification accuracy, using the Silhouette, Calinski-Harabasz, and Davies-Bouldin indexes as clustering success measures.
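The abstract does not spell out FUSOMOUT's scoring rule; the sketch below shows only the FCM half with a simple 1 - max-membership outlier score (points that belong strongly to no cluster score high), leaving out the SOM stage entirely:

```python
import numpy as np

def fcm(X, c=2, m=2.0, iters=100, seed=0):
    """Plain Fuzzy C-Means: alternate center and membership updates."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))  # memberships, rows sum to 1
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))           # standard FCM update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)),       # cluster 1
               rng.normal(10, 1, (50, 2)),      # cluster 2
               [[50.0, 50.0]]])                 # one planted outlier
_, U = fcm(X)
scores = 1.0 - U.max(axis=1)  # high when no cluster claims the point
print(scores.argmax())         # 100: the planted outlier
```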