13 results for joint correspondence analysis
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Aims: the broad objective of this study is to investigate the ecological, biodiversity and conservation status of the coastal forest fragments of Kenya. The specific aims of the study are: (1) to investigate current quantitative trends in plant diversity; (2) to develop a spatial and standardised vegetation database for the coastal forests of Kenya; (3) to investigate forest structure, species diversity and composition across the forests; (4) to investigate the effect of forest fragment area on plant species diversity; (5) to investigate phylogenetic diversity across these coastal remnants; (6) to assess vulnerability and provide conservation perspectives on concrete policy issues; (7) to investigate the correlation between plant and butterfly diversity. Methods: I applied various analytical methods, including species diversity metrics; multiple regression models for the species-area relationship and the small island effect; non-metric multidimensional scaling; ANOSIM; PERMANOVA; multiplicative beta diversity partitioning; species accumulation curves and species indicator analysis; statistical tests; rarefaction of species richness; the phylogenetic diversity metrics of phylogenetic diversity index, mean pairwise distance and mean nearest taxon distance, with their null models; and co-correspondence analysis.
Results: I developed the first large, standardised, spatial and geo-referenced vegetation database for the coastal forests of Kenya, consisting of 600 plant species recorded across 25 forest fragments (18 sacred forests and seven forest reserves) using 158 plots subdivided into 3160 subplots. Species diversity, composition and forest structure differed significantly across forest sites and between forest reserves and sacred forests; beta diversity was high; the species-area relationship explained a significant share of the variability in plant diversity, while a small island effect was not evident; sacred forests exhibited higher phylogenetic diversity than forest reserves; threatened Red List species contributed a disproportionate share of evolutionary history; and plant and butterfly diversity were strongly correlated. Conclusions: This study provides, for the first time, a large standardized vegetation dataset. The results emphasize the need to improve the protection status of sacred forests and to enhance forest connectivity across forest reserves and sacred forests.
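The species-area relationship tested above is conventionally modeled as the power law S = cA^z. The abstract does not specify the fitting procedure, so the following is only a minimal log-log least-squares sketch, with invented fragment areas and species counts:

```python
import numpy as np

# Hypothetical fragment areas (ha) and species counts, generated here
# from an exact power law S = c * A^z with c = 20 and z = 0.25.
areas = np.array([1.0, 5.0, 12.0, 40.0, 110.0, 300.0])
species = 20.0 * areas ** 0.25

# Fit log S = log c + z * log A by ordinary least squares;
# np.polyfit returns [slope, intercept].
z, log_c = np.polyfit(np.log(areas), np.log(species), 1)

print(z)            # recovered slope z (~0.25)
print(np.exp(log_c))  # recovered constant c (~20)
```

On real data the regression is run on observed richness per fragment; a significant positive slope z is what "species-area relationship explained significant variability" refers to.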
Abstract:
Human movement analysis (HMA) aims to measure the abilities of a subject to stand or to walk. In the field of HMA, tests are performed daily in research laboratories, hospitals and clinics, aiming to diagnose a disease, distinguish between disease entities, monitor the progress of a treatment and predict the outcome of an intervention [Brand and Crowninshield, 1981; Brand, 1987; Baker, 2006]. To achieve these purposes, clinicians and researchers use measurement devices, such as force platforms, stereophotogrammetric systems, accelerometers, baropodometric insoles, etc. This thesis focuses on the force platform (FP) and in particular on the quality assessment of FP data. The principal objective of our work was the design and experimental validation of a portable system for the in situ calibration of FPs. The thesis is structured as follows. Chapter 1: description of the physical principles underlying the functioning of a FP and how these principles are used to create force transducers, such as strain gauges and piezoelectric transducers; then a description of the two categories of FPs, three- and six-component, of the signal acquisition (hardware structure) and of the signal calibration; finally, a brief description of the use of FPs in HMA, for balance or gait analysis. Chapter 2: description of inverse dynamics, the most common method used in the field of HMA. This method uses the signals measured by a FP to estimate kinetic quantities, such as joint forces and moments. These variables cannot be measured directly without very invasive techniques; consequently, they can only be estimated using indirect techniques, such as inverse dynamics. Finally, a brief description of the sources of error present in gait analysis. Chapter 3: state of the art in FP calibration.
The selected literature is divided into sections, each describing: systems for the periodic control of FP accuracy; systems for the reduction of error in FP signals; and systems and procedures for the construction of a FP. In particular, a calibration system designed by our group, based on the theoretical method proposed by ?, is described in detail. This system was the "starting point" for the new system presented in this thesis. Chapter 4: description of the new system, divided into its parts: 1) the algorithm; 2) the device; and 3) the calibration procedure, required for correctly performing the calibration process. The characteristics of the algorithm were optimized through a simulation approach, whose results are presented here. In addition, the different versions of the device are described. Chapter 5: experimental validation of the new system, achieved by testing it on 4 commercial FPs. The effectiveness of the calibration was verified by measuring, before and after calibration, the accuracy of the FPs in measuring the center of pressure of an applied force. The new system can estimate local and global calibration matrices; from these, the non-linearity of the FPs was quantified and locally compensated. Furthermore, a non-linear calibration is proposed. This calibration compensates the non-linear effect in the FP functioning due to the bending of its upper plate. The experimental results are presented. Chapter 6: influence of the FP calibration on the estimation of kinetic quantities with the inverse dynamics approach. Chapter 7: the conclusions of this thesis: the need for calibration of FPs and the consequent enhancement in kinetic data quality. Appendix: calibration of the LC used in the presented system. Different calibration set-ups of a 3D force transducer are presented, and the optimal set-up is proposed, with particular attention to the compensation of non-linearities. The optimal set-up is verified by experimental results.
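For context on the center-of-pressure (COP) accuracy check described in Chapter 5: for a six-component FP whose moment reference is assumed to lie on the plate surface, the COP follows from the standard equations COPx = -My/Fz and COPy = Mx/Fz. A sketch with invented readings (the vertical offset between surface and moment origin is assumed zero here; real plates require the offset correction):

```python
# Hypothetical six-component FP output: forces (N) and moments (N*m)
# about a reference frame assumed to lie on the top surface.
Fx, Fy, Fz = 10.0, 5.0, 700.0
Mx, My, Mz = 35.0, -70.0, 2.0

# Center of pressure on the plate surface (standard FP equations).
cop_x = -My / Fz   # -(-70)/700 = 0.1 m
cop_y =  Mx / Fz   #  35/700   = 0.05 m

print(cop_x, cop_y)
```

Comparing the COP computed from FP signals against the known application point of a reference load is exactly the kind of before/after accuracy measurement the calibration validation relies on.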
Abstract:
3D video-fluoroscopy is an accurate but cumbersome technique for estimating natural or prosthetic human joint kinematics. This dissertation proposes innovative methodologies to improve the reliability and usability of 3D fluoroscopic analysis. Being based on direct radiographic imaging of the joint, and avoiding the soft tissue artefact that limits the accuracy of skin-marker-based techniques, fluoroscopic analysis has a potential accuracy of the order of mm/deg or better. It can provide fundamental information for clinical and methodological applications but, notwithstanding the number of methodological protocols proposed in the literature, time-consuming user interaction is required to obtain consistent results. This user-dependency has prevented a reliable quantification of the actual accuracy and precision of the methods and, consequently, slowed down the translation to clinical practice. The objective of the present work was to speed up this process by introducing methodological improvements in the analysis. In the thesis, fluoroscopic analysis was characterized in depth, in order to evaluate its pros and cons and to provide reliable solutions to overcome its limitations. To this aim, an analytical approach was followed. The major sources of error were isolated with in-silico preliminary studies as: (a) geometric distortion and calibration errors, (b) 2D image and 3D model resolutions, (c) incorrect contour extraction, (d) bone model symmetries, (e) optimization algorithm limitations, (f) user errors. The effect of each criticality was quantified and verified with an in-vivo preliminary study on the elbow joint. The dominant source of error was identified in the limited extent of the convergence domain of the local optimization algorithms, which forced the user to manually specify the starting pose for the estimation process.
To solve this problem, two different approaches were followed. To increase the optimal-pose convergence basin, the local approach used sequential alignments of the 6 degrees of freedom in order of sensitivity, or a geometrical feature-based estimation of the initial conditions for the optimization; the global approach used an unsupervised memetic algorithm to optimally explore the search domain. The performance of the technique was evaluated with a series of in-silico studies and validated in-vitro with a phantom-based comparison against a radiostereometric gold standard. The accuracy of the method is joint-dependent; for the intact knee joint, the new unsupervised algorithm guaranteed a maximum error lower than 0.5 mm for in-plane translations, 10 mm for the out-of-plane translation, and 3 deg for rotations in a mono-planar setup; and lower than 0.5 mm for translations and 1 deg for rotations in a bi-planar setup. The bi-planar setup is best suited when accurate results are needed, such as for methodological research studies; the mono-planar analysis may be sufficient for clinical applications where analysis time and cost are an issue. A further reduction of user interaction was obtained for prosthetic joint kinematics. A mixed region-growing and level-set segmentation method was proposed, which halved the analysis time by delegating the computational burden to the machine. In-silico and in-vivo studies demonstrated that the reliability of the new semiautomatic method was comparable to a user-defined manual gold standard. The improved fluoroscopic analysis was finally applied to a first in-vivo methodological study on foot kinematics. Preliminary evaluations showed that the presented methodology represents a feasible gold standard for the validation of skin-marker-based foot kinematics protocols.
Abstract:
A main objective of human movement analysis is the quantitative description of joint kinematics and kinetics. This information has great potential to address clinical problems in both orthopaedics and motor rehabilitation. Previous studies have shown that the assessment of kinematics and kinetics from stereophotogrammetric data requires a setup phase, special equipment and expertise to operate. Moreover, the procedure may cause uneasiness in the subjects and may hinder their walking. The general aim of this thesis is the implementation and evaluation of new 2D markerless techniques, in order to contribute to the development of an alternative to traditional stereophotogrammetric techniques. At first, the focus of the study was the estimation of the ankle-foot complex kinematics during the stance phase of gait. Two particular cases were considered: subjects barefoot and subjects wearing ankle socks. The use of socks was investigated in view of the development of the hybrid method proposed in this work. Different algorithms were analyzed, evaluated and implemented in order to obtain a 2D markerless solution for estimating the kinematics in both cases. The proposed technique was validated against a traditional stereophotogrammetric system. Its implementation leads towards an easy-to-configure (and more comfortable for the subject) alternative to the traditional stereophotogrammetric system. The abovementioned technique was then improved so that knee flexion/extension could also be measured with a 2D markerless technique. The main changes in the implementation concerned occlusion handling and background segmentation. With these additional constraints, the proposed technique was applied to the estimation of knee flexion/extension and compared with a traditional stereophotogrammetric system.
Results showed that the knee flexion/extension estimates from the traditional stereophotogrammetric system and from the proposed markerless system were highly comparable, making the latter a potential alternative for clinical use. A contribution was also made to the estimation of lower limb kinematics in children with cerebral palsy (CP). For this purpose, a hybrid technique was proposed, which uses high-cut underwear and ankle socks as "segmental markers" in combination with a markerless methodology. The proposed hybrid technique differs from the abovementioned markerless technique in the algorithm chosen. Results showed that the proposed hybrid technique can become a simple and low-cost alternative to traditional stereophotogrammetric systems.
Abstract:
The surface electrocardiogram (ECG) is an established diagnostic tool for the detection of abnormalities in the electrical activity of the heart. The interest of the ECG, however, extends beyond diagnosis. In recent years, studies in cognitive psychophysiology have related heart rate variability (HRV) to memory performance and mental workload. The aim of this thesis was to analyze the variability of surface-ECG-derived rhythms at two different time scales: the discrete-event time scale, typical of beat-related features (Objective I), and the "continuous" time scale of separated sources in the ECG (Objective II), in selected scenarios relevant to psychophysiological and clinical research, respectively. Objective I) Joint time-frequency and non-linear analysis of HRV was carried out, with the goal of assessing psychophysiological workload (PPW) in response to tasks engaging working memory. Results from fourteen healthy young subjects suggest the potential use of the proposed indices in discriminating PPW levels in response to varying memory-search task difficulty. Objective II) A novel source-cancellation method based on morphology clustering was proposed for the estimation of the atrial wavefront in atrial fibrillation (AF) from body surface potential maps. A strong direct correlation between the spectral concentration (SC) of the atrial wavefront and the temporal variability of the spectral distribution was shown in persistent AF patients, suggesting that with higher SC a shorter observation time is required to collect the spectral distribution from which the fibrillatory rate is estimated. This could be time- and cost-effective in clinical decision-making. The results held for reduced lead sets, suggesting that a simplified setup could also be considered, further reducing the costs. In designing the methods of this thesis, an online signal processing approach was adopted, with the goal of contributing to real-world applicability.
An algorithm for the automatic assessment of ambulatory ECG quality and an automatic ECG delineation algorithm were designed and validated.
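For illustration only: the thesis works with joint time-frequency and non-linear HRV indices, which are considerably more involved, but the beat-related, discrete-event time scale of Objective I can be shown with two classic time-domain HRV indices computed on an invented RR-interval series:

```python
import numpy as np

# Hypothetical RR-interval series in ms (one value per heartbeat:
# the discrete-event time scale of beat-related features).
rr = np.array([800, 810, 790, 805, 815, 795, 800, 820], dtype=float)

# SDNN: standard deviation of all RR intervals (overall variability).
sdnn = np.std(rr, ddof=1)

# RMSSD: root mean square of successive differences
# (short-term, beat-to-beat variability).
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))

print(sdnn, rmssd)
```

Frequency-domain HRV indices would additionally require resampling this unevenly spaced series before spectral estimation, which is where the time-frequency analysis of the thesis starts.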
Abstract:
Articular cartilage lesions, with their inherently limited healing potential, are hard to treat and remain a challenging problem for orthopedic surgeons. Despite the development of several treatment strategies, the real potential of each procedure, in terms of clinical benefit and effects on the joint degeneration process, is not clear. The aim of this PhD project was to evaluate the results, in terms of both clinical and imaging improvement, of new promising procedures developed to address the challenging cartilage pathology. Several studies were followed in parallel and completed over the 3-year PhD, and are reported in detail in the following pages. In particular, the studies focused on the evaluation of the treatment indications of a scaffold-based autologous chondrocyte implantation procedure, documenting its results for the classic indication of focal traumatic lesions, as well as its use in more challenging patients (older, with degenerative lesions) or even as a salvage procedure for more advanced stages of articular degeneration. The second field of study involved the analysis of the results obtained by treating lesions of the articular surface with a new biomimetic osteochondral scaffold, which showed promise for the treatment of defects involving the entire osteochondral unit. Finally, a new minimally invasive procedure based on the use of growth factors derived from autologous platelets was explored, showing its results and underlining its indications for the treatment of cartilage lesions and different stages of joint degeneration. These studies shed some light on the potential of the evaluated procedures, underlining good results as well as limits; they give some indications on the most appropriate candidates for their application, and document the current knowledge on cartilage treatment procedures, suggesting the limitations that need to be addressed by future studies to improve the management of cartilage lesions.
Abstract:
The gleno-humeral joint (GHJ) is the most mobile joint of the human body. This is related to the incongruence between the large humeral head and the much smaller glenoid with which it articulates (ratio 3:1). GHJ laxity is the ability of the humeral head to be passively translated on the glenoid fossa and, when physiological, it guarantees the normal range of motion of the joint. Three-dimensional GHJ linear displacements have been measured, both in vivo and in vitro, by means of different instrumental techniques. In vivo gleno-humeral displacements have been assessed by means of stereophotogrammetry, electromagnetic tracking sensors, and bio-imaging techniques. Both stereophotogrammetric systems and electromagnetic tracking devices, because of the deformation of the soft tissues surrounding the bones, are not capable of accurately assessing small displacements such as gleno-humeral joint translations. Bio-imaging techniques can provide an accurate description of joint kinematics (linear and angular displacements) but, because of the radiation exposure, most of them, such as computed tomography or fluoroscopy, are invasive for patients. Among the bio-imaging techniques, an alternative that could provide an acceptable level of accuracy while being innocuous for patients is magnetic resonance imaging (MRI). Unfortunately, only a few studies have addressed three-dimensional analysis, and very limited data are available for situations where preset loads are applied. The general aim of this doctoral thesis is to develop a non-invasive methodology based on open-MRI for the in-vivo evaluation of the gleno-humeral translation components in healthy subjects under the application of external loads.
Abstract:
Wearable inertial and magnetic measurement units (IMMU) are an important tool for underwater motion analysis because they are swimmer-centric, require only a simple measurement set-up and provide performance results very quickly. In order to estimate 3D joint kinematics during motion, protocols have been developed to transpose the IMMU orientation estimates to a biomechanical model. The aim of the thesis was to validate a protocol originally proposed to estimate the joint angles of the upper limbs during one-degree-of-freedom movements in dry settings, and herein modified to perform 3D kinematic analysis of the shoulders, elbows and wrists during swimming. Eight high-level swimmers were assessed in the laboratory by means of an IMMU while simulating the front crawl and breaststroke movements. A stereo-photogrammetric system (SPS) was used as reference. The joint angles (in degrees) of the shoulders (flexion-extension, abduction-adduction and internal-external rotation), the elbows (flexion-extension and pronation-supination) and the wrists (flexion-extension and radial-ulnar deviation) were estimated with the two systems and compared by means of root mean square error (RMSE), relative RMSE, Pearson's product-moment correlation coefficient (R) and the coefficient of multiple correlation (CMC). Subsequently, the athletes were assessed during pool swimming trials through the IMMU. Considering both swim styles and all modeled joint degrees of freedom, the comparison between the IMMU and the SPS showed median RMSE values lower than 8°, representing 10% of the overall joint range of motion, and high median values of CMC (0.97) and R (0.96). These findings suggest that the protocol estimated the 3D orientation of the shoulder, elbow and wrist joints during swimming with accuracy adequate for research purposes.
In conclusion, the proposed method to evaluate 3D joint kinematics through IMMU proved to be a useful tool in both sport and clinical contexts.
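The IMMU-SPS agreement metrics can be sketched on synthetic joint-angle traces. The data below are invented (a sinusoidal "shoulder flexion-extension" trace plus noise), and CMC, which requires multiple repetitions of the movement, is omitted for brevity:

```python
import numpy as np

# Hypothetical shoulder flexion-extension traces (deg) from the two systems:
# the SPS trace is taken as reference; the IMMU trace adds measurement noise.
t = np.linspace(0, 2 * np.pi, 200)
sps = 60 * np.sin(t)
immu = sps + np.random.default_rng(0).normal(0, 3, t.size)

# Root mean square error between the two systems.
rmse = np.sqrt(np.mean((immu - sps) ** 2))

# Relative RMSE, expressed as a percentage of the joint range of motion.
rel_rmse = 100 * rmse / (sps.max() - sps.min())

# Pearson's product-moment correlation coefficient.
r = np.corrcoef(immu, sps)[0, 1]

print(rmse, rel_rmse, r)
```

With the noise level chosen here, the RMSE sits near 3° (a few percent of the range of motion) and R stays above 0.99, i.e. the same kind of summary the thesis reports as medians across joints and swimmers.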
Abstract:
The recent advent of next-generation sequencing technologies has revolutionized the way the genome is analyzed. This innovation makes it possible to obtain deeper information at a lower cost and in less time, and provides data that are discrete measurements. One of the most important applications of these data is differential analysis, that is, investigating whether a gene exhibits a different expression level under two (or more) biological conditions (such as disease states, treatments received and so on). As for the statistical analysis, the final aim is statistical testing, and for modeling these data the Negative Binomial distribution is considered the most adequate, especially because it allows for overdispersion. However, the estimation of the dispersion parameter is a very delicate issue because little information is usually available for estimating it. Many strategies have been proposed, but they often result in procedures based on plug-in estimates, and in this thesis we show that this discrepancy between the estimation and the testing framework can lead to uncontrolled type I errors. We propose a mixture model that allows each gene to share information with other genes that exhibit similar variability. Three consistent statistical tests are then developed for differential expression analysis. We show that the proposed method improves the sensitivity of detecting differentially expressed genes with respect to common procedures, since it is the best at reaching the nominal type I error rate while keeping power high. The method is finally illustrated on prostate cancer RNA-seq data.
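A quick numerical illustration of the overdispersion the Negative Binomial permits (mean and dispersion values are invented; NumPy's (n, p) parameterization is mapped to the mean-dispersion one used in RNA-seq modeling, Var = mu + alpha * mu^2):

```python
import numpy as np

rng = np.random.default_rng(42)

# Negative Binomial with mean mu and dispersion alpha:
# Var = mu + alpha * mu^2. A Poisson model would force Var = mu.
mu, alpha = 100.0, 0.2
n = 1.0 / alpha        # NumPy's "number of successes" parameter
p = n / (n + mu)       # NumPy's success probability

counts = rng.negative_binomial(n, p, size=200_000)

print(counts.mean())   # close to mu = 100
print(counts.var())    # close to 100 + 0.2 * 100**2 = 2100, far above Poisson's 100
```

The gap between the sample variance (~2100) and the Poisson prediction (100) is exactly why the dispersion parameter matters: plugging in a poor estimate of alpha distorts the null distribution of the test statistic and inflates the type I error.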
Abstract:
The evaluation of knee joint behavior is fundamental in many applications, such as joint modeling and prosthesis and orthosis design. In-vitro tests are important for analysing knee behavior while simulating various loading conditions and studying the physiology of the joint. A new test rig for the in-vitro evaluation of knee joint behavior is presented here. It represents the evolution of a previously proposed rig, designed to overcome its principal limitations and to improve its performance. The design procedure and the solutions adopted to satisfy the specifications are presented. Thanks to its 6-6 Gough-Stewart parallel manipulator loading system, the rig replicates general loading conditions, such as daily actions or clinical tests, on the specimen over a wide range of flexion angles. The restraining actions of the knee muscles can also be simulated when active actions are reproduced. The joint motion in response to the applied loads, guided by the passive articular structures and muscles, is permitted by the force-controlled loading system. The new test rig guarantees visibility, so that motion can be measured by an optoelectronic system. Furthermore, the control system of the new rig, together with the system for muscle simulation, allows the estimation of the contribution of the principal leg muscles to the equilibrium of the joint. Accuracy in positioning is guaranteed by the designed tibia and femur fixation systems, which allow unmounting and remounting the specimen in the same pose. The presented test rig permits the analysis of knee joint behavior and comparative analyses on the same specimen before and after surgery, so as to assess the effectiveness of prostheses or surgical treatments.
Abstract:
Three-dimensional (3D) printers for continuous fiber reinforced composites, such as the MarkTwo (MT) by Markforged, can be used to manufacture flexible elements and CMs. To date, only a few, very recent research works have been devoted to the study and application of flexible elements and CMs realized with the MT printer. A good numerical and/or analytical tool for analysing the mechanical behavior of the new composites is still missing. In addition, there is still a gap in obtaining the material properties (e.g. elastic modulus), which are usually unknown and sensitive to the printing parameters used (e.g. infill density), making numerical simulation inaccurate. Consequently, the aim of this thesis is to present several developments. The first is a preliminary investigation of the tensile and flexural response of Straight Beam Flexures (SBF) realized with the MT printer and featuring different interlayer fiber volume fractions and orientations, as well as different laminate positions within the sample. The second is a numerical analysis within the Carrera's Unified Formulation (CUF) framework, based on a component-wise (CW) approach, including a novel preprocessing tool developed to account for all printed regions in an easy and time-efficient way. Among its benefits, the CUF-CW approach enables building an accurate database collecting the first natural frequencies and mode shapes, and then predicting Young's modulus through an inverse problem formulation. To validate the tool, the numerical results are compared to experimental natural frequencies evaluated using a digital image correlation method. Further, we take the CUF-CW model and use static condensation to analyze smart structures that can be decomposed into a large number of similar components. Third, the potential of the MT printer in combination with topology optimization and compliant joints design (CJD) is investigated for the realization of automated machinery mechanisms subjected to inertial loads.
Abstract:
This thesis represents the conclusive outcome of the European Joint Doctorate programme in Law, Science & Technology, funded by the European Commission through the Marie Skłodowska-Curie Innovative Training Networks action within H2020, grant agreement n. 814177. The tension between data protection and privacy on one side, and the need to grant further uses of processed personal data on the other, is investigated, tracing the technological development of the de-anonymization/re-identification risk through an explorative survey. After acknowledging its extent, it is asked whether a certain degree of anonymity can still be granted, focusing on a double perspective: an objective and a subjective one. The objective perspective focuses on the data processing models per se, while the subjective perspective investigates whether the distribution of roles and responsibilities among stakeholders can ensure data anonymity.
Abstract:
In this PhD thesis a new firm-level conditional risk measure is developed. It is named Joint Value at Risk (JVaR) and is defined as a quantile of a conditional distribution of interest, where the conditioning event is a latent upper-tail event. It addresses the problem of how risk changes under extreme volatility scenarios. The properties of JVaR are studied based on a stochastic volatility representation of the underlying process. We prove that JVaR is leverage consistent, i.e. it is an increasing function of the dependence parameter in the stochastic representation. A feasible class of nonparametric M-estimators is introduced by exploiting the elicitability of quantiles and stochastic ordering theory. Consistency and asymptotic normality of the two-stage M-estimator are derived, and a simulation study is reported to illustrate its finite-sample properties. Parametric estimation methods are also discussed. The relation with VaR is exploited to introduce a volatility contribution measure, and a tail risk measure is also proposed. The analysis of the dynamic JVaR is presented based on asymmetric stochastic volatility models. Empirical results with S&P500 data show that accounting for extreme volatility levels is relevant to better characterize the evolution of risk. The work is complemented by a review of the literature, where we provide an overview of quantile risk measures, elicitable functionals and several stochastic orderings.
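A toy empirical sketch of the idea behind JVaR, not the thesis's estimator: in the simulation below the volatility is observable, so conditioning on its upper tail is only a crude stand-in for the latent upper-tail event in the actual definition, and the volatility model is a deliberately simple invented one:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate returns from a simple stochastic-volatility-like model:
# a lognormal volatility scales otherwise i.i.d. Gaussian shocks.
T = 100_000
vol = np.exp(rng.normal(0, 0.4, T))   # volatility (latent in practice)
ret = vol * rng.normal(0, 0.01, T)    # daily returns

# Unconditional 5% VaR: the left-tail quantile of the return distribution.
var_5 = np.quantile(ret, 0.05)

# A JVaR-style quantity: the same quantile, conditional on volatility
# lying in its upper tail (here, above its own 90th percentile).
upper_tail = vol > np.quantile(vol, 0.90)
jvar_5 = np.quantile(ret[upper_tail], 0.05)

print(var_5, jvar_5)   # jvar_5 is more extreme (more negative) than var_5
```

The gap between the two quantiles is precisely what the measure is after: under extreme volatility scenarios the tail of the return distribution moves further out than the unconditional VaR suggests.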