
List of accepted submissions

 
 
  • Open access
  • 0 Reads
An Investigation on the Most Used Programming Language for Developing Applications in Software Development Industries (A Case Study of Kano State, Nigeria)

The software development industry in Kano State, Nigeria, has seen rapid growth in recent years, driven by increasing demand for mobile and web applications. With the availability of various programming languages, understanding the most commonly used language is essential for developers to enhance their skill sets, allocate resources more efficiently, and drive industry growth. This study aimed to identify the most widely used programming language in Kano's application development sector and to explore the factors influencing its adoption. The research employed a mixed-methods approach, including surveys, interviews, and observations, to gather data from software development companies operating in Kano. Stratified sampling was used to select ten companies, representing a range of sizes and specializations within the industry. Questionnaires were designed to collect information on programming language usage, the factors influencing language adoption, and the demographic details of the developers. In addition to this, interviews with industry experts and experienced developers provided further qualitative insights. The results indicate that Java is the most commonly used programming language in Kano's application development industry, accounting for 40% of usage. Python follows with 30%, while JavaScript is used in 20% of projects. Other languages, including C++, C#, and PHP, make up the remaining portion. The primary factors driving programming language adoption in Kano include the specific requirements of projects, the skill levels of developers, and industry trends. These findings have important implications for developers and companies looking to optimize their development processes and stay competitive in a fast-evolving market.

Ausculmate: Neural Networks for Better Patient Care

A crucial part of examining patients suspected of having respiratory diseases is auscultation. Doctors use a stethoscope to listen to breath sounds to diagnose lung conditions. These sounds, such as wheezes and crackles, indicate different respiratory conditions. Nevertheless, distinguishing these sounds frequently relies on the doctor's expertise and is subject to interpretation, underscoring the necessity for a more impartial method. This research presents a method that utilizes deep learning to examine lung sounds. The setup includes an electronic stethoscope for collecting chest sounds, a smartphone application for recording and tracking patient information, and a sophisticated artificial intelligence model for categorizing chest sounds. Audio data preprocessing involves normalization, temporal segmentation, applying the Short-Time Fourier Transform (STFT), and converting the data into spectrograms suitable for CNN input. The CNN is trained using multilabel classification methods, utilizing categorical cross-entropy as the loss function and assessing metrics such as accuracy, precision, recall, F1-score, and ROC curve analysis. The mobile app, which is easy to use, utilizes Flutter for the frontend and Django and MongoDB for the backend and database, respectively, guaranteeing cross-platform compatibility, speed, and scalability.

This study suggests a practical categorization of lung noises as wheezes, crackles, and normal states. The performance metrics of the model indicate its potential value in clinical environments, with a validation precision of 96.42%, a recall of 94.75%, and a validation loss of 0.15. The combination of an e-stethoscope, mobile app, and DL model in the automated analysis system for lung sounds shows great potential in enhancing the accuracy of diagnosing respiratory diseases. This approach has the potential to improve patient outcomes by reducing the subjectivity of traditional auscultation, leading to more accurate and timely intervention.
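The preprocessing chain described above (normalization, temporal segmentation, STFT, spectrogram conversion) can be sketched in plain NumPy. This is a minimal illustration, not the study's pipeline: the 4 kHz sample rate, 256-sample frames, 128-sample hop, and 1-second segments are assumptions chosen for the example.

```python
import numpy as np

def preprocess_lung_audio(signal, sr=4000, frame_len=256, hop=128, segment_s=1.0):
    """Normalize, segment, and convert a 1-D audio signal into log-magnitude
    spectrograms, one per fixed-length segment."""
    signal = np.asarray(signal, dtype=float)
    # Peak-normalize to [-1, 1]
    signal = signal / (np.max(np.abs(signal)) + 1e-8)

    # Temporal segmentation: fixed-length windows, trailing remainder dropped
    seg_len = int(sr * segment_s)
    n_segs = len(signal) // seg_len
    segments = signal[: n_segs * seg_len].reshape(n_segs, seg_len)

    window = np.hanning(frame_len)
    specs = []
    for seg in segments:
        cols = []
        for start in range(0, seg_len - frame_len + 1, hop):
            frame = seg[start:start + frame_len] * window
            cols.append(np.abs(np.fft.rfft(frame)))  # one STFT column
        # Log-magnitude spectrogram of shape (freq_bins, time_frames)
        specs.append(np.log1p(np.array(cols).T))
    return np.array(specs)
```

The resulting array, shaped (segments, frequency bins, time frames), is the kind of 2-D input a CNN classifier expects.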

Enhancing High-Resolution MRI for Precise Diagnosis and Treatment of Atypical Teratoid Rhabdoid Tumor

Atypical Teratoid Rhabdoid Tumor (ATRT) is an aggressive brain tumor in children, requiring precise imaging for accurate diagnosis and effective treatment planning. High-resolution Magnetic Resonance Imaging (MRI) is essential for visualizing ATRT, but traditional imaging methods face challenges in detecting and analyzing such rare tumors. This study examines the use of advanced machine learning models to improve MRI analysis for ATRT, aiming to enhance diagnostic accuracy and treatment outcomes. The study utilizes cutting-edge deep learning models, including Vision Transformers (ViTs), ResNet-based Convolutional Neural Networks (CNNs), and Generative Adversarial Networks (GANs). These models were customized for specific tasks such as tumor detection, segmentation, and classification. To overcome the issue of limited data for rare tumors, techniques like transfer learning, multi-scale image processing, and synthetic data augmentation were applied. These deep learning approaches led to enhanced tumor segmentation, providing more detailed visual analysis, which is crucial for developing precise treatment strategies. By optimizing high-resolution MRI scans with these technologies, the study seeks to assist clinicians in making better-informed decisions for ATRT treatment. The integration of these advanced techniques shows promise for improving diagnostic precision and tailoring treatment plans, representing a notable advancement in pediatric neuro-oncology. Ongoing refinement of these methods is key to furthering progress in ATRT diagnosis and management.
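The abstract mentions synthetic data augmentation as one remedy for the scarcity of rare-tumor images. The study's actual pipeline (including GAN-generated samples) is not detailed there; the following is only a sketch of the simplest geometric-and-noise variant of the idea, assuming square, intensity-normalized 2-D slices.

```python
import numpy as np

def augment_slices(images, rng=None):
    """Expand a stack of square 2-D image slices five-fold with simple
    transforms: original, horizontal flip, vertical flip, 90-degree
    rotation, and additive Gaussian noise clipped to [0, 1]."""
    if rng is None:
        rng = np.random.default_rng(0)
    out = []
    for img in images:
        out.append(img)
        out.append(np.fliplr(img))   # left-right flip
        out.append(np.flipud(img))   # up-down flip
        out.append(np.rot90(img))    # 90-degree rotation (square slices only)
        noisy = img + 0.01 * rng.standard_normal(img.shape)
        out.append(np.clip(noisy, 0.0, 1.0))
    return np.stack(out)
```

Even this basic scheme multiplies a small training set, which is often the first step before heavier techniques such as transfer learning or GAN-based synthesis.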

Soluble fibers from persimmon by-product promote the growth of anti-inflammatory beneficial bacteria and display anti-inflammatory properties

Persimmon by-product serves as a good source of bioactive compounds such as polyphenols, carotenoids, and polysaccharides. The revalorisation of these by-products could offer different avenues for developing new functional ingredients that promote health and well-being. Here, we determined the potential of soluble polysaccharides from persimmon by-product to promote the growth of specific beneficial bacteria from the human gut that are implicated in anti-inflammatory activity. Our results suggest that persimmon polysaccharides promote the growth of Firmicutes species that display anti-inflammatory properties, particularly by stimulating the production of short-chain fatty acids such as butyrate, yielding a more beneficial fermentation profile than the control (glucose). These fractions also exerted anti-inflammatory effects in RAW 264.7 cells (macrophages) by reducing the levels of IL-6 and TNF-α. The samples were also shown to promote the production of antioxidant enzymes such as catalase and superoxide dismutase. In this sense, the samples displayed anti-inflammatory properties both by regulating the inflammatory response of stressed cells and by promoting the growth of specific beneficial bacteria. These results suggest a notable potential for persimmon fractions to be used as functional ingredients, especially for diseases related to uncontrolled pro-inflammatory responses that require dietary supplementation.

How biomarkers and artificial intelligence (AI) are innovating personalized nutrition: the importance of a robust computational infrastructure

Personalized nutrition has the potential to revolutionize health by integrating biomarkers and artificial intelligence (AI) to provide tailored dietary recommendations. This systematic review synthesizes existing research on the role of biomarkers and AI in personalized nutrition, focusing on the critical need for a robust computational infrastructure. A comprehensive search of major databases, including PubMed, Scopus, and Web of Science, was conducted to identify studies published between 2010 and 2024. A total of 50 studies were selected based on their relevance and contribution to the understanding of how biomarkers and AI enhance the accuracy of nutritional assessments and recommendations. The review found that biomarkers play a crucial role in detecting molecular and biochemical changes linked to nutrient intake and metabolism, providing a precise assessment of nutritional status. Concurrently, AI technologies analyze large datasets—encompassing genetic, dietary, and health data—to generate personalized recommendations that account for individual variability. This combination offers significant advantages over traditional population-based studies. However, the review highlights the persistent challenges in building a robust computational infrastructure necessary to support these innovations. Effective infrastructure must address the complexity of dietary databases, enabling the accurate translation of food components into nutrients and energy. Additionally, the development of bioinformatics frameworks could standardize and annotate nutritional data, facilitating better dietary monitoring and management. This review concludes that while biomarkers and AI are transforming personalized nutrition, advancing computational infrastructure is essential to fully realize their potential and improve the quality and accessibility of personalized dietary guidance. Further research is recommended to overcome the current limitations and ensure the ethical and effective implementation of these technologies.

Stock Market Prediction Using Machine Learning: A Comprehensive Approach to Data Collection, Model Selection, and Performance Evaluation

Stock market prediction using machine learning (ML) is a complex yet important area of financial research aimed at forecasting future stock prices based on historical and real-time data. Accurate predictions are crucial for investors, financial analysts, and policymakers to develop better investment strategies and risk management practices. However, stock prices are highly volatile and influenced by various factors, such as geopolitical events, economic indicators, and investor sentiment, making accurate prediction challenging.

This study examines different ML techniques for stock market prediction, focusing on data collection, feature engineering, model selection, and performance evaluation. Effective data collection involves gathering diverse data types, including historical prices, trading volumes, economic indicators, sentiment data, and company-specific information. Feature engineering enhances model inputs with relevant variables like moving averages, RSI, sentiment scores, and volatility measures.
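Two of the engineered features named above, moving averages and the RSI, are straightforward to compute. A minimal NumPy sketch follows; note it uses plain trailing averages for the RSI (Wilder's original formulation smooths the gain/loss averages instead), and the 14-step period is the conventional default, not a parameter from this study.

```python
import numpy as np

def sma(prices, window):
    """Simple moving average; the first window-1 entries are NaN."""
    prices = np.asarray(prices, dtype=float)
    c = np.cumsum(np.insert(prices, 0, 0.0))
    out = np.full(len(prices), np.nan)
    out[window - 1:] = (c[window:] - c[:-window]) / window
    return out

def rsi(prices, period=14):
    """Relative Strength Index from plain averages of trailing gains and
    losses (a simplification of Wilder's smoothed version)."""
    prices = np.asarray(prices, dtype=float)
    deltas = np.diff(prices)
    gains = np.where(deltas > 0, deltas, 0.0)
    losses = np.where(deltas < 0, -deltas, 0.0)
    out = np.full(len(prices), np.nan)
    for i in range(period, len(prices)):
        avg_gain = gains[i - period:i].mean()
        avg_loss = losses[i - period:i].mean()
        # RSI saturates at 100 when the window contains no losses
        out[i] = 100.0 if avg_loss == 0 else 100.0 - 100.0 / (1.0 + avg_gain / avg_loss)
    return out
```

Feature vectors for an ML model are then built by stacking such indicator series alongside raw prices, volumes, and sentiment scores.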

The research evaluates both traditional models (linear regression, decision trees) and advanced techniques (neural networks, ensemble methods). Traditional models are effective for linear trends but less so for complex market behaviors, whereas advanced methods like Long Short-Term Memory (LSTM) networks excel in modeling sequential data and time-series forecasting. Ensemble methods, such as stacking and boosting, improve predictive performance by combining multiple models to reduce bias and variance.
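Stacking, mentioned above as an ensemble method, can be reduced to its essentials: treat the base models' predictions as features and fit a small meta-learner on them. The sketch below uses ordinary least squares as the meta-learner; a production version would fit on out-of-fold base predictions to avoid leakage, a step omitted here for brevity.

```python
import numpy as np

def stack_predict(base_preds_train, y_train, base_preds_test):
    """Minimal stacking: fit a linear meta-learner (least squares with an
    intercept) on the base models' training predictions, then blend the
    base models' test predictions with the learned weights."""
    y_train = np.asarray(y_train, dtype=float)
    X = np.column_stack([np.ones(len(y_train))]
                        + [np.asarray(p, dtype=float) for p in base_preds_train])
    w, *_ = np.linalg.lstsq(X, y_train, rcond=None)  # meta-learner weights
    n_test = len(base_preds_test[0])
    Xt = np.column_stack([np.ones(n_test)]
                         + [np.asarray(p, dtype=float) for p in base_preds_test])
    return Xt @ w
```

Boosting differs in that base learners are trained sequentially on the residuals of their predecessors rather than combined by a separate meta-model.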

The study also emphasizes backtesting models through simulated trading strategies to assess their real-world applicability and robustness. Challenges such as market efficiency, data quality, and overfitting are highlighted, with solutions including reinforcement learning and anomaly detection to enhance model adaptability and robustness.
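Backtesting, as described above, replays a strategy over historical prices to measure how it would have performed. A toy example, assuming a simple SMA-crossover rule with illustrative window lengths (not a strategy from this study), and ignoring transaction costs and slippage:

```python
import numpy as np

def backtest_crossover(prices, fast=3, slow=5):
    """Toy backtest: hold the asset while the fast SMA is above the slow
    SMA, otherwise stay in cash. Returns the cumulative strategy return."""
    prices = np.asarray(prices, dtype=float)

    def sma(x, w):
        c = np.cumsum(np.insert(x, 0, 0.0))
        out = np.full(len(x), np.nan)
        out[w - 1:] = (c[w:] - c[:-w]) / w
        return out

    fast_ma, slow_ma = sma(prices, fast), sma(prices, slow)
    # The signal observed at t is applied to the return from t to t+1,
    # so the strategy never looks ahead of the data it could have seen.
    long_signal = (fast_ma > slow_ma)[:-1]
    rets = np.diff(prices) / prices[:-1]
    strat_rets = np.where(long_signal, rets, 0.0)
    return float(np.prod(1.0 + strat_rets) - 1.0)
```

Avoiding lookahead, as in the signal shift above, is exactly the kind of pitfall backtesting is meant to expose before a model trades real capital.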

Overall, the study provides a framework for developing robust stock prediction models, integrating various ML techniques while addressing ethical and regulatory considerations. Continuous evaluation and adaptation of models are essential to ensure their reliability and effectiveness in the ever-changing financial markets.

A Hybrid Feature Extraction Approach Using DenseNet and Local Binary Patterns for Alzheimer's Disease Classification

Alzheimer’s Disease (AD) is a prevalent neurodegenerative disorder that significantly impacts cognitive and functional abilities. Early and accurate diagnosis of AD and its associated cognitive impairments is critical for effective management and intervention. In this study, we propose a hybrid feature extraction method combining Local Binary Patterns (LBPs) and the DenseNet deep learning model to enhance the classification accuracy of AD and related cognitive conditions. The ADNI3 dataset, consisting of five distinct classes, Alzheimer's Disease (AD), Control, Early Mild Cognitive Impairment (EMCI), Late Mild Cognitive Impairment (LMCI), and Mild Cognitive Impairment (MCI), was employed in this analysis.

Images from the dataset were preprocessed by converting them to grayscale for LBP extraction and resized to 224x224 pixels for DenseNet processing. The extracted LBP and DenseNet features were concatenated to form a comprehensive feature set, which was then used to train a multi-class Support Vector Machine (SVM) classifier with Error-Correcting Output Codes (ECOCs).

The proposed method demonstrated a robust performance with an overall accuracy of 95.36%. The confusion matrix analysis revealed precision, recall, and F1 scores of 96.93%, 91.54%, and 93.96%, respectively, indicating high reliability in classifying the different stages of cognitive impairment. These findings suggest that the integration of LBP and DenseNet features provides a powerful approach for the early diagnosis and classification of Alzheimer's Disease, with potential applications in clinical settings for facilitating timely interventions and improving patient outcomes.
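The LBP half of the hybrid feature set is simple enough to sketch directly. The basic 8-neighbour code below is an illustration, not the study's implementation (library versions such as scikit-image's `local_binary_pattern` add rotation-invariant and uniform variants); the resulting histogram is the texture vector that would be concatenated with DenseNet features.

```python
import numpy as np

def lbp_codes(img):
    """Basic 8-neighbour Local Binary Pattern codes for the interior
    pixels of a 2-D grayscale image."""
    img = np.asarray(img, dtype=float)
    center = img[1:-1, 1:-1]
    # Clockwise neighbour offsets starting at the top-left pixel
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros(center.shape, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        # Set the bit when the neighbour is at least as bright as the center
        codes |= (neigh >= center).astype(np.uint8) << bit
    return codes

def lbp_histogram(img, bins=256):
    """Normalized LBP histogram, usable as a texture feature vector that
    can be concatenated with deep (e.g. DenseNet) features."""
    hist, _ = np.histogram(lbp_codes(img), bins=bins, range=(0, bins))
    return hist / hist.sum()
```

Concatenation is then a single `np.concatenate([lbp_histogram(img), deep_features])` before the SVM stage.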

Transformer-based Purchase Intention Mining: A Comparative Study Using a Novel Dataset

The scarcity of high-quality, contextually relevant datasets is a significant impediment to progress in the field of purchase intention mining. The absence of comprehensive datasets that capture the subtleties of consumer intentions across various contexts often hinders the development of robust and accurate predictive models. This research addresses this critical gap by developing a novel dataset specifically designed to encapsulate the intricacies of consumer purchase intentions. The primary aim of this research is to develop and evaluate a new dataset tailored for purchase intention mining and to assess the effectiveness of transformer-based models in this domain. To achieve this, we collected, preprocessed, and analyzed a new dataset and fine-tuned advanced transformer models, including RoBERTa, ALBERT, and DistilBERT, on the newly created dataset. These models were then rigorously compared against traditional machine learning algorithms and deep learning architectures, such as Logistic Regression, Support Vector Machines, LightGBM, and Convolutional Neural Networks (CNNs) with and without LSTM layers. The results of our comprehensive evaluation demonstrate that transformer models significantly outperform traditional approaches, achieving near-perfect accuracy, precision, recall, and F1-scores across different purchase intention categories (Positive, Negative, and Neutral). This superior performance was consistent across both undersampled and oversampled versions of the dataset, underscoring the robustness of our proposed dataset in facilitating high-precision sentiment analysis tasks. In conclusion, this research not only highlights the effectiveness of transformer models in understanding and predicting consumer purchase intentions but also emphasizes the importance of developing specialized datasets to overcome the challenges posed by dataset scarcity. Our findings align with existing literature, reinforcing the dominance of transformer-based approaches in natural language processing applications and setting a new standard in the field of purchase intention mining.

Optimizing Network Traffic Classification Through a Novel BAT-ANN Model: An Empirical Investigation into Improved Accuracy and Scalability in Network Security

The modern network has expanded significantly due to the rapid rise in network usage, making it a large, dynamic, and complex system. Managing modern network traffic has become a major challenge as a result of large-scale network-based applications. Consequently, smart, traffic-analysis-based network monitoring has become an urgent need. Network traffic classification is an important approach used in network management as well as in network security. Traditional methods often require ongoing maintenance, struggle with dynamic ports, and lack the granularity needed for precise classification. They can also be resource-intensive and face scalability issues in large networks. To overcome these limitations, many organizations are turning to more advanced techniques, such as machine learning and behavior analysis, for better network traffic classification and security. Accurate classification is crucial for network traffic due to its multifaceted importance: it serves as the foundation of network security by enabling the rapid detection of security threats and illegal activity, which is critical for protecting against cyberattacks. This study proposes an intelligent system based on a BAT-optimized artificial neural network (BAT-ANN) for network traffic classification. The proposed system makes use of the publicly available NIMS dataset. Furthermore, we applied preprocessing and feature selection techniques before feeding the data into the classifier. The experimental outcomes reveal that the proposed approach achieved high accuracy and low computation time, and performed better than previous approaches used for network traffic classification.
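The abstract does not detail how the bat algorithm and the ANN are coupled; in such hybrids the bat "position" vector typically encodes the network's weights or a feature subset. As a hedged illustration of the optimizer itself, here is a deliberately simplified bat algorithm (fixed pulse rate, no loudness schedule, greedy acceptance) minimizing a toy objective in place of a network's training loss:

```python
import numpy as np

def bat_optimize(fitness, dim, n_bats=20, iters=200, lo=-5.0, hi=5.0, seed=0):
    """Minimal bat algorithm (after Yang, 2010), simplified.
    Minimizes `fitness` over a box-constrained search space."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(lo, hi, (n_bats, dim))
    vel = np.zeros((n_bats, dim))
    fit = np.array([fitness(p) for p in pos])
    best = pos[fit.argmin()].copy()
    for _ in range(iters):
        freq = rng.uniform(0.0, 2.0, (n_bats, 1))   # echolocation pulse frequencies
        vel += (pos - best) * freq                   # frequency-driven global step
        cand = np.clip(pos + vel, lo, hi)
        # Local random walk around the current best solution for some bats
        walk = np.clip(best + 0.1 * rng.standard_normal((n_bats, dim)), lo, hi)
        use_walk = rng.random(n_bats) > 0.5          # fixed pulse rate of 0.5
        cand = np.where(use_walk[:, None], walk, cand)
        cand_fit = np.array([fitness(c) for c in cand])
        improved = cand_fit < fit                    # greedy acceptance
        pos[improved], fit[improved] = cand[improved], cand_fit[improved]
        best = pos[fit.argmin()].copy()
    return best, float(fit.min())
```

In a BAT-ANN, `fitness` would evaluate the classification error of an ANN whose weights are taken from the candidate position vector.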

Comparative Analysis of LSTM and GRU Models for Chicken Egg Fertility Classification using Deep Learning

This study explores the application of advanced Recurrent Neural Network (RNN) architectures—specifically Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU)—for classifying chicken egg fertility based on embryonic development detected in egg images. Traditional methods, such as candling, are labor-intensive and often inaccurate, making them unsuitable for large-scale poultry operations. By leveraging the capabilities of LSTM and GRU models, this research aims to automate and enhance the accuracy of egg fertility classification, thereby contributing to agricultural automation. A dataset comprising 240 high-resolution egg images was employed, resized to 255x255 pixels for optimal processing efficiency. LSTM and GRU models were trained to discern fertile from infertile eggs by analyzing the sequential data represented by the pixel rows in these images. The LSTM model demonstrated superior performance, achieving a validation accuracy of 89.58% with a loss of 1.1691, outperforming the GRU model, which recorded a lower accuracy of 66.67% and a significantly higher loss of 12.6634. The LSTM’s complex gating mechanisms were more effective in capturing long-range dependencies within the data, leading to more reliable predictions. The findings suggest that LSTM models are better suited for precision-critical applications in poultry farming, where accurate fertility classification is paramount. In contrast, GRU models, while more computationally efficient, may struggle with generalization under constrained data conditions. This study underscores the potential of advanced RNNs in enhancing the efficiency and accuracy of automated farming systems, paving the way for future research to further optimize these models for real-world agricultural applications.
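The gating mechanisms contrasted above are easiest to see in a single recurrent step. Below is a standard GRU cell in NumPy (the general definition, not this study's trained model); an LSTM adds a separate cell state and a third, forget gate, which is the extra machinery credited with capturing longer-range dependencies.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, params):
    """One GRU step: the update gate z decides how much of the previous
    hidden state h is kept, the reset gate r decides how much of h feeds
    the candidate state."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(x @ Wz + h @ Uz)               # update gate
    r = sigmoid(x @ Wr + h @ Ur)               # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)   # candidate state
    return (1.0 - z) * h + z * h_tilde         # interpolate old and new
```

Classifying an image with such a cell means feeding its pixel rows in sequence as the inputs x, exactly the sequential framing the study applies to egg images.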
