List of accepted submissions

IoT-based Smart Helmet with Accident Identification and Logistics Monitoring for Delivery Riders

The study developed a smart helmet prototype that prioritizes delivery rider safety and facilitates logistical communication for small businesses. This was achieved with an IoT-enabled smart helmet equipped with crash detection and logistics monitoring functions. Various sensors, such as the accelerometer and alcohol sensor, were calibrated to improve accuracy and minimize errors. A mobile application was introduced to coordinate delivery logistics and track driver locations. The system achieved 90 percent accuracy in distinguishing real accidents and detected drunk driving with an accuracy of 88 percent. An ATTM336H GPS module was used for geolocation tracking, and a mobile application built with Bubble.io and Firebase was integrated with the helmet to send alerts to the shop owners of Roger’s Top Silog House, whose delivery drivers participated in the study. Their feedback was positive, indicating that the smart helmet performed very well and exceeded expectations.
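A minimal sketch of the kind of crash-detection logic such a helmet might run: the acceleration magnitude from the accelerometer is compared against an impact threshold and an alert record is pushed to the logistics backend. The threshold value, field names, and the `send_alert` callback are illustrative assumptions, not details from the study.

```python
import math
import time

IMPACT_THRESHOLD_G = 3.5  # assumed crash threshold in g; the study's calibrated value is not given

def accel_magnitude(ax, ay, az):
    """Resultant acceleration in g from the three accelerometer axes."""
    return math.sqrt(ax**2 + ay**2 + az**2)

def check_for_crash(sample, send_alert):
    """Flag a crash when the impact magnitude exceeds the threshold and notify the shop owner."""
    g = accel_magnitude(sample["ax"], sample["ay"], sample["az"])
    if g >= IMPACT_THRESHOLD_G:
        send_alert({
            "event": "possible_crash",
            "magnitude_g": round(g, 2),
            "lat": sample.get("lat"),   # from the GPS module
            "lon": sample.get("lon"),
            "timestamp": time.time(),
        })
        return True
    return False

# Example: a hard impact reading triggers the alert callback.
if __name__ == "__main__":
    check_for_crash({"ax": 2.1, "ay": 3.0, "az": 1.2, "lat": 14.6, "lon": 121.0}, print)
```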

Predicting Heart Disease using Sensor Networks, IoT, and Machine Learning: A Study on Physiological Sensor Data and Predictive Models

Context: The Internet of Things (IoT) and sensor networks are used for structural health monitoring (SHM). The same technologies, combined with machine learning, can be applied in healthcare to predict cardiac disease.

Objective: The goal of this research is to create a model for predicting cardiac disease by using sensor networks, IoT, and machine learning. To construct a prediction model, physiological data from patients will be collected and analysed using machine learning techniques.

Methodology: The methodology for this study employs wearable sensors to collect physiological data from patients, such as heart rate, blood pressure, and oxygen saturation levels. The data are subsequently processed and translated into an analysis-ready format. The most important predictors of heart disease are identified using feature selection and engineering techniques.
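A minimal sketch of the feature-selection step described above, ranking candidate physiological features with a univariate test from scikit-learn; the feature names and the synthetic values are illustrative assumptions, not the study's dataset.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

# Illustrative wearable-sensor features per patient.
feature_names = ["heart_rate", "systolic_bp", "diastolic_bp", "spo2", "hr_variability"]
rng = np.random.default_rng(0)
X = rng.normal(size=(300, len(feature_names)))
y = rng.integers(0, 2, size=300)  # placeholder heart-disease labels

selector = SelectKBest(score_func=f_classif, k=3).fit(X, y)
ranked = sorted(zip(feature_names, selector.scores_), key=lambda p: p[1], reverse=True)
for name, score in ranked:
    print(f"{name:16s} F-score = {score:.2f}")
```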

Statistical Measurement: A predictive model's performance can be assessed using statistical metrics, which can also help pinpoint areas that need improvement. It is crucial to pick the right statistical measures based on the demands and objectives of the predictive model. Accuracy, precision, recall, and F1-score are used to evaluate the performance of the proposed cardiac disease model.
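A minimal sketch of how such a model could be trained and scored with the metrics named above, using scikit-learn; the feature set, synthetic data, and choice of random forest are illustrative assumptions rather than the study's configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
from sklearn.model_selection import train_test_split

# Illustrative synthetic records: heart rate, systolic BP, diastolic BP, SpO2.
rng = np.random.default_rng(0)
X = rng.normal(loc=[75, 120, 80, 97], scale=[12, 15, 10, 2], size=(500, 4))
y = rng.integers(0, 2, size=500)  # 1 = at risk of heart disease (placeholder labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
y_pred = model.predict(X_test)

print("Accuracy :", accuracy_score(y_test, y_pred))
print("Precision:", precision_score(y_test, y_pred))
print("Recall   :", recall_score(y_test, y_pred))
print("F1-score :", f1_score(y_test, y_pred))
```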

Conclusion: The heart disease prediction model developed in this work has the potential to improve patient outcomes while lowering healthcare costs by identifying patients at risk of developing heart disease and offering appropriate interventions and treatments. Future research can expand on the possibilities of sensor networks, IoT, and machine learning approaches in healthcare, allowing for the development of more accurate and effective predictive models for heart disease and other medical conditions.

Design and Development of Internet of Things based Condition Monitoring System for Industrial Rotating Machines

In general, industries utilize a large number of rotating machines, and the efficient functioning of these machines is vital for the smooth operation of industrial processes. Further, timely detection and identification of motor issues is crucial to prevent unexpected downtime and expensive repairs. In this work, a novel approach is proposed to monitor and assess the condition of motors in real time by analysing environmental parameters using a sensor capable of measuring temperature and humidity, gathering data about the operating environment of motors in industrial settings. By continuously monitoring these environmental factors, deviations from optimal conditions can be detected, allowing proactive maintenance actions to be taken. The proposed system consists of a network of temperature and humidity sensors strategically placed in proximity to the motors being monitored. These sensors collect temperature and humidity data at regular intervals and transmit it to an IoT cloud platform. Finally, the data is analysed using a fuzzy logic decision-making algorithm and compared against pre-defined threshold values to determine whether the motor is operating within acceptable conditions. This work is of high industry relevance, since automated notifications or alerts are sent to maintenance personnel when abnormal conditions are detected.
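A minimal sketch of the cloud-side check described above: incoming temperature and humidity readings are compared against pre-defined thresholds and an alert is raised on any deviation. The threshold values and the `notify_maintenance` callback are illustrative assumptions, and the crisp comparison below is a simplified stand-in for the fuzzy decision step named in the abstract.

```python
# Assumed acceptable operating ranges; actual thresholds would be machine-specific.
TEMP_RANGE_C = (10.0, 60.0)
HUMIDITY_RANGE_PCT = (20.0, 70.0)

def assess_reading(temp_c, humidity_pct, notify_maintenance):
    """Compare one sensor reading against the thresholds and alert on any deviation."""
    issues = []
    if not (TEMP_RANGE_C[0] <= temp_c <= TEMP_RANGE_C[1]):
        issues.append(f"temperature {temp_c:.1f} C outside {TEMP_RANGE_C}")
    if not (HUMIDITY_RANGE_PCT[0] <= humidity_pct <= HUMIDITY_RANGE_PCT[1]):
        issues.append(f"humidity {humidity_pct:.1f} % outside {HUMIDITY_RANGE_PCT}")
    if issues:
        notify_maintenance("; ".join(issues))
        return "abnormal"
    return "normal"

# Example: an over-temperature reading triggers the alert.
print(assess_reading(72.5, 45.0, notify_maintenance=print))
```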

Sons al Balcó: A subjective approach to the WASN-based LAeq measured values during the COVID-19 lockdown

The COVID-19 lockdown in Spain caused a strong decrease in the urban noise levels observed in most cities, clearly traceable where those cities had acoustic sensor networks deployed. This had an impact on people's lives, as most were confined at home for health reasons. In this paper, we present a qualitative analysis of the subjective views of citizens participating in 'Sons al Balcó', a data collection campaign conducted during the COVID-19 lockdown in the Catalan city of Girona. The subjective data gathered, although too scarce to support firm conclusions, align with the objective LAeq measurements, which showed a clear decrease of street noise during the lockdown; the new sounds heard during the lockdown were reported as not very annoying. Pre-existing noise sources, such as road traffic and leisure noise, are described as annoying, but their decrease during the lockdown improved the soundscape of many homes. The goal of this paper is to show that gathering both objective, calibrated data and a perceptual approximation supports our conclusions, in a survey that was limited in number of participants but exceptional in timing, conducted during the 2020 COVID-19 lockdown in Catalonia.
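For reference, the LAeq values reported by such sensor networks are the energy average of short-term A-weighted levels, LAeq = 10 log10((1/N) Σ 10^(Li/10)). A minimal sketch of that computation over one measurement interval; the sample values are illustrative.

```python
import math

def laeq(levels_db):
    """Equivalent continuous A-weighted level: energy average of short-term LA values (dB)."""
    mean_energy = sum(10 ** (l / 10.0) for l in levels_db) / len(levels_db)
    return 10.0 * math.log10(mean_energy)

# Example: one-second LA readings over a short interval.
print(round(laeq([58.2, 61.0, 55.4, 63.7, 59.9]), 1))
```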

Design of Artificial Intelligence based novel device for fault diagnosis of Integrated Circuits

The rapid advancement of integrated circuit (IC) technology has revolutionised various industries, but it has also introduced challenges in detecting faulty ICs. Traditional testing methods often rely on manual inspection or complex equipment, resulting in time-consuming and costly processes. In this work, a novel approach is proposed which uses a thermal camera and an Internet of Things (IoT) physical device, namely a Raspberry Pi single-board computer, for the detection of faulty and non-faulty ICs. A deep learning algorithm, You Only Look Once (YOLO), is implemented on the Raspberry Pi in Python to detect faulty ICs efficiently and accurately. Various images of faulty and non-faulty ICs are used to train the algorithm. Once the algorithm is trained, the thermal camera and the Raspberry Pi are used for real-time detection of faulty ICs: the YOLO algorithm analyses the thermal images to identify regions with abnormal temperature patterns, indicating potential faults. The proposed approach offers several advantages over traditional methods, including increased efficiency and improved accuracy.
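A minimal sketch of what YOLO inference on thermal frames might look like on the Raspberry Pi, using the ultralytics package as a stand-in; the weights file `ic_thermal.pt` and the class name `faulty_ic` are illustrative assumptions, not artefacts released with this work.

```python
import cv2
from ultralytics import YOLO  # assumed YOLO implementation; the paper does not name a specific package

model = YOLO("ic_thermal.pt")  # hypothetical weights trained on faulty/non-faulty IC thermal images

cap = cv2.VideoCapture(0)  # thermal camera exposed as a video device
while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame, verbose=False)[0]
    for box in results.boxes:
        label = results.names[int(box.cls)]
        if label == "faulty_ic":  # hypothetical class name
            x1, y1, x2, y2 = map(int, box.xyxy[0])
            print(f"Possible faulty IC at ({x1}, {y1})-({x2}, {y2}), conf={float(box.conf):.2f}")
```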

Fuzzy Inference System and IoT Based Smart Irrigation for Smallholder Agriculture in Rural Bangladesh

Agriculture constitutes one of the cornerstones of Bangladesh's social, economic, and political well-being. It is also a leading sector in terms of water consumption and waste, owing to inefficient irrigation techniques. In fact, poor mitigation strategies in cases of over- or under-irrigation have resulted in a drop in production rates. The purpose of this paper is to propose a Fuzzy Inference System (FIS)-based smart irrigation mechanism for smallholder agriculture in rural Bangladesh that works to reduce irrigation frequency while increasing production rate. The system consists of a Mamdani fuzzy inference controller that gathers data from various sensors, including those measuring soil moisture, humidity, light intensity, and temperature. It uses a set of rules in a fuzzy inference system to control the flow of water from the water pump and activate irrigation at appropriate intervals. The system is simple to use and financially viable. It remains advantageous even for vast agricultural fields, as it effectively reduces water and energy consumption.
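A minimal sketch of a Mamdani-style decision of this kind, reduced to two inputs (soil moisture and temperature) with triangular memberships and a weighted average of singleton consequents standing in for full centroid defuzzification; the membership breakpoints and rules are illustrative assumptions, not the paper's calibrated rule base.

```python
def tri(x, a, b, c):
    """Triangular membership value of x for the triangle (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def irrigation_duty(soil_moisture_pct, temperature_c):
    """Fuzzify the inputs, apply min/max rules, and defuzzify to a pump duty in %."""
    dry = tri(soil_moisture_pct, -1, 0, 40)
    wet = tri(soil_moisture_pct, 30, 70, 101)
    hot = tri(temperature_c, 25, 40, 55)
    mild = tri(temperature_c, 5, 20, 35)

    # Rule strengths (min for AND), each mapped to an output singleton.
    rules = [
        (min(dry, hot), 90),   # dry AND hot  -> irrigate heavily
        (min(dry, mild), 60),  # dry AND mild -> irrigate moderately
        (wet, 10),             # wet          -> barely irrigate
    ]
    num = sum(w * centre for w, centre in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(f"Pump duty: {irrigation_duty(25.0, 38.0):.0f}%")
```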

A novel ensemble of FTIR Spectroscopic Biosensing and Deep Learning post-processing for early diagnosis of Endometrial Cancer

Cancers are prevalent worldwide and affect a substantial proportion of the global population, and the early, proactive diagnosis of the disease continues to be a global medical challenge. Endometrial cancer is a gynecological cancer that is not only difficult to diagnose but also produces symptoms that are neither distinct nor exclusive to the cancer itself. Blood spectroscopy has recently emerged as a high-throughput and largely inexpensive method for the diagnosis of endometrial cancer: by post-processing the acquired spectra with multivariate statistics, an inference can be formed that indicates the presence and extent of the cancer.

Subsequent work in this area showed that the prediction results for cancer could be improved by using signal decomposition models alongside machine learning prediction machines, demonstrating the potential appeal of decomposition models in the processing pipeline for the spectroscopy data. As part of this exploratory study, we employ, for the first time, Deep Learning for the processing of the acquired FTIR spectra, which allows for fully unsupervised decomposition and feature extraction of the resulting spectra, coupled with prediction machines capable of predicting the presence of the cancer.
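A minimal sketch of one way this unsupervised decomposition step could look: a small autoencoder compresses each FTIR spectrum to a low-dimensional code that is then fed to a downstream prediction machine. The network sizes, the synthetic spectra, and the logistic-regression classifier are illustrative assumptions, not the architecture used in the study.

```python
import torch
import torch.nn as nn
from sklearn.linear_model import LogisticRegression

n_wavenumbers, latent_dim = 800, 16  # assumed spectrum length and code size

class SpectraAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_wavenumbers, 128), nn.ReLU(), nn.Linear(128, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, n_wavenumbers))

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

# Illustrative synthetic spectra and labels (1 = cancer present).
X = torch.rand(256, n_wavenumbers)
y = torch.randint(0, 2, (256,))

model = SpectraAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for _ in range(50):  # unsupervised training on reconstruction only
    opt.zero_grad()
    recon, _ = model(X)
    loss = loss_fn(recon, X)
    loss.backward()
    opt.step()

# The extracted codes become features for the prediction machine.
with torch.no_grad():
    _, codes = model(X)
clf = LogisticRegression(max_iter=1000).fit(codes.numpy(), y.numpy())
print("Training accuracy (illustrative):", clf.score(codes.numpy(), y.numpy()))
```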

The obtained results show that the use of Deep Learning allows for enhanced predictions of endometrial cancer, while also enabling a clinical decision support platform with a greater degree of autonomy and, in turn, diagnostic throughput.

Bio-Magneto Sensing and Unsupervised Deep Multiresolution Analysis for Labour Predictions in Term and Preterm Pregnancies

The effective prediction of preterm labour continues to be an active research topic in pregnancy medicine, where uterine contraction signals have been shown to be insightful for inferring a potential preterm delivery. Bio-Magnetomyography (MMG) is a physiological measurement tool that captures the orthogonal magnetic counterpart of the bio-electrical manifestations of uterine contractions and can provide insight into a potential premature delivery. Decoding the associated physiological signal is an area of substantial research in which classical signal processing approaches and metaheuristic optimisation routines have been utilised for the post-processing and decomposition of MMG signals, all of which require a degree of expert knowledge and a certain level of tuning and parameter initialisation.

As part of strides towards creating a more automated clinical decision support platform for the prediction of preterm labour, we employ the Deep Wavelet Scattering (DWS) model, which allows for a deep multiresolution analysis alongside unsupervised feature learning, for the post-processing of candidate MMG signals. The DWS is combined with select pattern recognition-based prediction machines in order to assemble a clinical decision pipeline for predicting the state of various pregnancies with a greater degree of machine intelligence. The patient cohort involved a mixture of patients from a multitude of ethnicities, who delivered a mixture of term and preterm births, split between under and over 48 hours of labour imminency. The results show around a 5% increase in prediction accuracy when compared to the classical methods, in addition to providing a more automated signal processing pipeline for the predictions.
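A minimal sketch of the shape of such a pipeline: a multiresolution decomposition of each MMG trace produces per-band features that feed a pattern-recognition classifier. Here a plain discrete wavelet transform (PyWavelets) and an SVM stand in for the Deep Wavelet Scattering model and the prediction machines used in the study; the synthetic signals and labels are illustrative.

```python
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def band_features(signal, wavelet="db4", level=5):
    """Energy and log-variance per decomposition band as a simple multiresolution feature vector."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    feats = []
    for c in coeffs:
        feats.append(np.sum(c ** 2))
        feats.append(np.log(np.var(c) + 1e-12))
    return np.array(feats)

# Illustrative synthetic MMG traces; 1 = delivery within 48 hours.
rng = np.random.default_rng(1)
signals = rng.normal(size=(120, 2048))
labels = rng.integers(0, 2, size=120)

X = np.vstack([band_features(s) for s in signals])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, labels)
print("Training accuracy (illustrative):", clf.score(X, labels))
```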

Strides Towards the clinical use of Artificial Intelligence and Haematological Measurements for a Rapid Throughput Diagnosis and care for Malaria Patients in West Africa

Malaria continues to be a major cause of death worldwide, with people across more than 90 countries at risk of suffering from the disease. Because of this, there continues to be substantial investment not only in treating the disease but also in more rapid and accurate means of diagnosing it. In this work, we explore how measurements obtained from the Complete Blood Count (CBC) of patients' blood, alongside Artificial Intelligence (AI) methods, could form an affordable analytical pipeline that could be adopted in hospital settings in both developed and developing countries.

As part of this work, we utilise patient blood measurements acquired from Ghana, West Africa, alongside various configurations of AI models to distinguish between severe malaria (SM), uncomplicated malaria (UM) and non-malarial infections (nMI) in a sample set comprising over 2000 patients. The results showcase how a combination of such measurements and AI modelling can contribute towards tackling the malaria epidemic from a diagnostics perspective and ultimately enhance patient care strategies.
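A minimal sketch of the kind of three-class model this pipeline implies, using a random forest over a handful of CBC-style features; the feature set, synthetic values, and model choice are illustrative assumptions, not the study's configuration or data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Illustrative CBC-style features: haemoglobin, platelet count, WBC count, RBC count.
rng = np.random.default_rng(2)
X = rng.normal(loc=[11.0, 200.0, 7.5, 4.5], scale=[2.0, 80.0, 3.0, 0.8], size=(600, 4))
classes = np.array(["SM", "UM", "nMI"])
y = classes[rng.integers(0, 3, size=600)]  # placeholder labels for the three outcomes

clf = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("Cross-validated accuracy (illustrative):", scores.mean().round(3))
```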

Optimal Resource Allocation Scheme-based Time Slot Switching for Point-to-Point SISO SWIPT Systems

In this paper, an optimal resource allocation scheme for simultaneous wireless information and power transfer (SWIPT) systems is presented. The scheme considers a nonlinear energy harvesting (EH) point-to-point single-input single-output (SISO) SWIPT system and is based on time-slot switching (TS), with the aim of maximizing the average achievable rate. The problem is formulated as a nonconvex optimization problem due to the binary nature of the TS ratio. We solve the optimization problem using the time-sharing strong duality theorem and the Lagrange dual method. The derivation of the optimal solution shows that EH should be performed in the intermediate signal-to-noise ratio (SNR) region, while information decoding (ID) should be performed in either the low-SNR or the high-SNR region. Simulations are carried out in comparison with traditional state-of-the-art TS resource allocation schemes to evaluate the energy efficiency of the system in terms of transmission power, distance between source and destination, path-loss exponent, and minimum required harvested energy. The simulations show that the proposed scheme improves energy efficiency with respect to transmission power by 20%, 10%, and 3% for the high-, medium-, and low-SNR regions, respectively. Improvements with respect to other system performance metrics were also noted for the proposed scheme.
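A minimal sketch of the per-slot decision rule that a TS scheme of this kind implies: for a given dual variable, each time slot is assigned to information decoding or energy harvesting depending on which contributes more to the Lagrangian (rate versus dual-weighted harvested energy). The sigmoidal EH model, its parameters, and the dual value are illustrative assumptions, not the paper's derivation.

```python
import math

P_SAT, A, B = 1.0, 1.5, 3.0   # assumed sigmoidal nonlinear-EH parameters
LAMBDA = 4.0                  # assumed dual variable weighting harvested energy

def achievable_rate(snr):
    """Information-decoding contribution of a slot: log2(1 + SNR)."""
    return math.log2(1.0 + snr)

def harvested_energy(snr):
    """Normalised sigmoidal (nonlinear) EH model: negligible at low input, saturating at P_SAT."""
    sigmoid = 1.0 / (1.0 + math.exp(-A * (snr - B)))
    omega = 1.0 / (1.0 + math.exp(A * B))
    return P_SAT * (sigmoid - omega) / (1.0 - omega)

def assign_slot(snr):
    """Per-slot rule: decode information when the rate beats the dual-weighted EH term, else harvest."""
    return "ID" if achievable_rate(snr) >= LAMBDA * harvested_energy(snr) else "EH"

for snr in [0.5, 1, 2, 4, 6, 10, 16, 25, 40]:
    print(f"SNR = {snr:5.1f} -> {assign_slot(snr)}")
```

With these assumed parameters the rule reproduces the structure reported in the abstract: slots at low and high SNR are assigned to ID, while slots at intermediate SNR are assigned to EH.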
