20 results for Processing methods


Relevance:

30.00%

Publisher:

Abstract:

Recent years have seen massive growth in wearable technology: almost everything can be smart, from phones and watches to glasses and shirts. These technologies are prevalent in various fields, from wellness, sports and fitness to the healthcare domain. The spread of this phenomenon led the World Health Organization to define the term 'mHealth' as "medical and public health practice supported by mobile devices, such as mobile phones, patient monitoring devices, personal digital assistants, and other wireless devices". Furthermore, mHealth solutions are well suited to implementing real-time wearable Biofeedback (BF) systems: sensors in a body area network connected to a processing unit (smartphone) and a feedback device (loudspeaker) measure human functions and return them to the user as a (bio)feedback signal. During the COVID-19 pandemic, this transformation of the healthcare system was dramatically accelerated by new clinical demands, including the need to prevent hospital surges and to assure continuity of clinical care services, allowing pervasive healthcare. Now more than ever, we can say that the integration of mHealth technologies will be the basis of this new era of clinical practice. In this scenario, the primary goal of this PhD thesis is to investigate new and innovative mHealth solutions for the Assessment and Rehabilitation of different neuromotor functions and diseases. For clinical assessment, there is a need to overcome the limitations of subjective clinical scales. By creating new pervasive and self-administrable mHealth solutions, this thesis investigates the possibility of employing innovative systems for objective clinical evaluation. For rehabilitation, we explored the clinical feasibility and effectiveness of mHealth systems. In particular, we developed innovative mHealth solutions with BF capability to allow tailored rehabilitation. The main goal of an mHealth system should be to improve the person's quality of life, increasing or maintaining their autonomy and independence. To this end, inclusive design principles might be crucial, alongside technical and technological ones, to improve the usability of mHealth systems.
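To make the sensor-processing-feedback loop described in this abstract concrete, the following is a minimal, hypothetical sketch in Python. The sensor read, target range and feedback cue are placeholder assumptions, not the thesis's actual implementation.

```python
# Hypothetical sketch of a real-time wearable biofeedback (BF) loop:
# sensor -> processing unit -> feedback device. All names and thresholds
# are placeholders for whatever body-area-network sensor and audio API is used.

import random
import time

TARGET_RANGE = (-5.0, 5.0)  # assumed acceptable range for the monitored function (e.g. trunk tilt, degrees)

def read_sensor_sample():
    """Placeholder for the real sensor stream: here, a simulated noisy reading."""
    return random.gauss(0.0, 4.0)

def play_feedback(deviation):
    """Placeholder for an audio cue on the feedback device (e.g. loudspeaker)."""
    print(f"feedback cue, deviation = {deviation:+.1f}")

def biofeedback_loop(period_s=0.02, duration_s=1.0):
    low, high = TARGET_RANGE
    t_end = time.time() + duration_s
    while time.time() < t_end:
        value = read_sensor_sample()                 # 1. measure the human function
        if value < low or value > high:              # 2. process: compare against the target
            deviation = value - (low if value < low else high)
            play_feedback(deviation)                 # 3. return it to the user as a feedback signal
        time.sleep(period_s)

if __name__ == "__main__":
    biofeedback_loop()
```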

Relevance:

30.00%

Publisher:

Abstract:

Non-Destructive Testing (NDT) and Structural Health Monitoring (SHM) are becoming essential in many application contexts, e.g. civil, industrial and aerospace, to reduce the maintenance costs of structures and improve safety. Conventional inspection methods typically exploit bulky and expensive instruments and rely on highly demanding signal processing techniques. The pressing need to overcome these limitations is the common thread that guided the work presented in this Thesis. In the first part, a scalable, low-cost and multi-sensor smart sensor network is introduced. The capability of this technology to carry out accurate modal analysis on structures undergoing flexural vibrations has been validated by means of two experimental campaigns. Then, the suitability of low-cost piezoelectric disks for modal analysis has been demonstrated. To enable the use of this kind of sensing technology in such non-conventional applications, ad hoc data merging algorithms have been developed. In the second part, imaging algorithms for Lamb wave inspection, namely Delay Multiply and Sum (DMAS) and DS-DMAS, have been implemented and validated. Results show that DMAS outperforms the canonical Delay and Sum (DAS) approach in terms of image resolution and contrast. Similarly, DS-DMAS achieves better results than both DMAS and DAS by suppressing artefacts and noise. To exploit the full potential of these procedures, accurate group velocity estimations are required. Thus, novel wavefield analysis tools that can address the estimation of dispersion curves from Scanning Laser Doppler Vibrometer (SLDV) acquisitions have been investigated. An image segmentation technique (Distance Regularized Level Set Evolution, DRLSE) was exploited in the k-space to extract the wavenumber profile. The DRLSE method was compared with compressive sensing methods to extract the group and phase velocity information. The validation, performed on three different carbon fibre plates, showed that the proposed solutions can accurately determine the wavenumbers and velocities in polar coordinates at multiple excitation frequencies.
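As an illustration of the imaging algorithms mentioned above, the Python sketch below contrasts a Delay and Sum (DAS) pixel value with a Delay Multiply and Sum (DMAS) one, assuming the sensor signals have already been back-propagated (delayed) to the pixel under test. It is a simplified, textbook-style reconstruction, not the thesis code, and omits signal alignment, windowing, envelope extraction and DS-DMAS.

```python
# Illustrative DAS vs DMAS pixel values for guided-wave imaging.
# `delayed` is an (n_sensors, n_samples) array of signals already
# time-aligned (delayed) to the pixel being imaged.

import numpy as np

def das_pixel(delayed):
    """DAS: energy of the coherent sum of the delayed signals."""
    return np.sum(np.sum(delayed, axis=0) ** 2)

def dmas_pixel(delayed):
    """DMAS: pairwise sign-preserving square-root products before summation,
    which suppresses incoherent contributions and improves contrast."""
    n_sensors, n_samples = delayed.shape
    combined = np.zeros(n_samples)
    for i in range(n_sensors):
        for j in range(i + 1, n_sensors):
            prod = delayed[i] * delayed[j]
            combined += np.sign(prod) * np.sqrt(np.abs(prod))
    return np.sum(combined ** 2)
```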

Relevance:

30.00%

Publisher:

Abstract:

Brain functioning relies on the interaction of several neural populations connected through complex connectivity networks, enabling the transmission and integration of information. Recent advances in neuroimaging techniques, such as electroencephalography (EEG), have deepened our understanding of the reciprocal roles played by brain regions during cognitive processes. The underlying idea of this PhD research is that EEG-related functional connectivity (FC) changes in the brain may encode important neuromarkers of behavior and cognition, as well as brain disorders, even at subclinical levels. However, a complete understanding of the reliability of the wide range of existing connectivity estimation techniques is still lacking. The first part of this work addresses this limitation by employing Neural Mass Models (NMMs), which simulate EEG activity and offer a unique tool to study interconnected networks of brain regions in controlled conditions. NMMs were employed to test FC estimators such as Transfer Entropy and Granger Causality in linear and nonlinear conditions. Results revealed that connectivity estimates reflect information transmission between brain regions, a quantity that can be significantly different from the connectivity strength, and that Granger Causality outperforms the other estimators. A second objective of this thesis was to assess brain connectivity and network changes on EEG data reconstructed at the cortical level. Functional brain connectivity has been estimated through Granger Causality, in both the temporal and spectral domains, with the following goals: a) detect task-dependent functional connectivity network changes, focusing on internal-external attention competition and fear conditioning and reversal; b) identify resting-state network alterations in a subclinical population with high autistic traits. Connectivity-based neuromarkers, compared to canonical EEG analysis, can provide deeper insights into brain mechanisms and may drive future diagnostic methods and therapeutic interventions. However, further methodological studies are required to fully understand the accuracy and information captured by FC estimates, especially concerning nonlinear phenomena.
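For readers unfamiliar with the estimator, a minimal sketch of bivariate time-domain Granger causality, the quantity used in the analyses above, is shown below. The autoregressive model order and the plain least-squares fit are simplifying assumptions and do not reflect the thesis's actual pipeline.

```python
# Minimal illustrative bivariate Granger causality (y -> x):
# the log ratio of residual variances between an AR model of x using only
# its own past (restricted) and one that also uses the past of y (full).

import numpy as np

def granger_causality(x, y, p=5):
    """Return GC from y to x, given 1-D signals x and y and AR order p."""
    n = len(x)
    # Lagged design matrices (each column is one lag, from 1 to p)
    X_r = np.column_stack([x[p - k - 1 : n - k - 1] for k in range(p)])
    X_f = np.column_stack(
        [x[p - k - 1 : n - k - 1] for k in range(p)]
        + [y[p - k - 1 : n - k - 1] for k in range(p)]
    )
    target = x[p:]
    res_r = target - X_r @ np.linalg.lstsq(X_r, target, rcond=None)[0]
    res_f = target - X_f @ np.linalg.lstsq(X_f, target, rcond=None)[0]
    return np.log(np.var(res_r) / np.var(res_f))
```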

Relevance:

30.00%

Publisher:

Abstract:

Deep Neural Networks (DNNs) have revolutionized a wide range of applications beyond traditional machine learning and artificial intelligence fields, e.g., computer vision, healthcare, natural language processing and others. At the same time, edge devices have become central in our society, generating an unprecedented amount of data which could be used to train data-hungry models such as DNNs. However, the potentially sensitive or confidential nature of the gathered data poses privacy concerns when storing and processing it in centralized locations. To this purpose, decentralized learning decouples model training from the need to directly access raw data, by alternating on-device training and periodic communication. The ability to distil knowledge from decentralized data, however, comes at the cost of facing more challenging learning settings, such as coping with heterogeneous hardware and network connectivity, statistical diversity of data, and ensuring verifiable privacy guarantees. This Thesis provides an extensive overview of the decentralized learning literature, including a novel taxonomy and a detailed description of the most relevant system-level contributions in the related literature for privacy, communication efficiency, data and system heterogeneity, and poisoning defense. Next, this Thesis presents the design of an original solution to tackle communication efficiency and system heterogeneity, and empirically evaluates it in federated settings. For communication efficiency, an original method, specifically designed for Convolutional Neural Networks, is also described and evaluated against the state of the art. Furthermore, this Thesis provides an in-depth review of recently proposed methods to tackle the performance degradation introduced by data heterogeneity, followed by empirical evaluations on challenging data distributions, highlighting strengths and possible weaknesses of the considered solutions. Finally, this Thesis presents a novel perspective on the usage of Knowledge Distillation as a means of optimizing decentralized learning systems in settings characterized by data heterogeneity or system heterogeneity. Our vision of relevant future research directions closes the manuscript.
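As a concrete, generic example of the "alternate on-device training and periodic communication" pattern, the sketch below implements a FedAvg-style round in PyTorch. It is illustrative only and is not one of the original methods proposed in the thesis.

```python
# Generic federated averaging (FedAvg-style) round:
# each client trains a copy of the global model on its private data,
# then the server averages the resulting weights.

import copy
import torch

def local_update(global_model, loader, epochs=1, lr=0.01):
    """On-device training step on one client's private data."""
    local = copy.deepcopy(global_model)
    opt = torch.optim.SGD(local.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    for _ in range(epochs):
        for xb, yb in loader:
            opt.zero_grad()
            loss_fn(local(xb), yb).backward()
            opt.step()
    return local.state_dict()

def federated_round(global_model, client_loaders):
    """One communication round: collect client weights and average them."""
    client_states = [local_update(global_model, dl) for dl in client_loaders]
    avg_state = {
        k: torch.stack([s[k].float() for s in client_states]).mean(dim=0)
        for k in client_states[0]
    }
    global_model.load_state_dict(avg_state)
    return global_model
```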

Relevance:

30.00%

Publisher:

Abstract:

The study of ancient, undeciphered scripts presents unique challenges that depend both on the nature of the problem and on the peculiarities of each writing system. In this thesis, I present two computational approaches that are tailored to two different tasks and writing systems. The first of these methods is aimed at the decipherment of the Linear A fraction signs, in order to discover their numerical values. This is achieved with a combination of constraint programming, ad hoc metrics and paleographic considerations. The second main contribution of this thesis regards the creation of an unsupervised deep learning model which uses drawings of signs from ancient writing systems to learn to distinguish different graphemes in the vector space. This system, which is based on techniques used in the field of computer vision, is adapted to the study of ancient writing systems by incorporating information about sequences in the model, mirroring what is often done in natural language processing. In order to develop this model, the Cypriot Greek Syllabary is used as a target, since this is a deciphered writing system. Finally, this unsupervised model is adapted to the undeciphered Cypro-Minoan and is used to answer open questions about this script. In particular, by reconstructing multiple allographs that are not agreed upon by paleographers, it supports the idea that Cypro-Minoan is a single script and not a collection of three scripts, as has been proposed in the literature. These results on two different tasks show that computational methods can be applied to undeciphered scripts, despite the relatively low amount of available data, paving the way for further advancement in paleography using these methods.
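To give a flavour of the constraint-programming idea applied to the fraction signs, the sketch below enumerates assignments of unit-fraction values to a handful of placeholder signs under an invented constraint (co-occurring signs must sum to less than one whole unit). The sign names, candidate values and constraints are all hypothetical and are not the actual data or constraints used in the thesis.

```python
# Hypothetical constraint-search sketch for assigning numerical values to
# fraction signs. Signs, candidate values and attested combinations below
# are illustrative placeholders only.

from fractions import Fraction
from itertools import permutations

SIGNS = ["J", "E", "F", "K"]                                  # placeholder sign names
CANDIDATES = [Fraction(1, d) for d in (2, 3, 4, 5, 6, 8)]     # assumed candidate unit fractions
ATTESTED_COMBINATIONS = [("J", "E"), ("E", "F", "K")]         # placeholder co-occurring sign groups

def consistent(assignment):
    """Assumed constraint: a co-occurring group of fraction signs sums to less than 1."""
    return all(sum(assignment[s] for s in combo) < 1 for combo in ATTESTED_COMBINATIONS)

solutions = [
    dict(zip(SIGNS, values))
    for values in permutations(CANDIDATES, len(SIGNS))
    if consistent(dict(zip(SIGNS, values)))
]
print(f"{len(solutions)} candidate assignments satisfy the constraints")
```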