1000 results for Posture Data
Abstract:
One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies and ecological abundance studies. Devices such as nonzero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models, an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential zero-compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
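As a minimal sketch (in notation of my own choosing, not the paper's), the two-stage structure described above can be written as follows, with the first stage governing the incidence of zeros and the second the distribution of the unit over the non-zero parts:

```latex
% Sketch of the independent-binomial conditional logistic-normal variant for a
% D-part composition x; the symbols z_i, p_i, S, mu_S, Sigma_S are illustrative.
\[
  z_i \sim \mathrm{Bernoulli}(p_i), \qquad i = 1,\dots,D
  \qquad \text{(stage 1: which parts are non-zero)}
\]
\[
  y_j = \log\frac{x_{s_j}}{x_{s_C}}, \qquad j = 1,\dots,C-1, \qquad
  \mathbf{y} \mid S \sim \mathcal{N}(\boldsymbol{\mu}_S, \boldsymbol{\Sigma}_S)
  \qquad \text{(stage 2: logistic-normal non-zero subcomposition)}
\]
% Here S = \{s_1,\dots,s_C\} is the incidence pattern (the set of non-zero parts);
% the hierarchical dependent variant replaces the independent Bernoulli stage with a
% dependence structure across the z_i.
```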
Abstract:
Time-lapse geophysical data acquired during transient hydrological experiments are being increasingly employed to estimate subsurface hydraulic properties at the field scale. In particular, crosshole ground-penetrating radar (GPR) data, collected while water infiltrates into the subsurface either by natural or artificial means, have been demonstrated in a number of studies to contain valuable information concerning the hydraulic properties of the unsaturated zone. Previous work in this domain has considered a variety of infiltration conditions and different amounts of time-lapse GPR data in the estimation procedure. However, the particular benefits and drawbacks of these different strategies as well as the impact of a variety of key and common assumptions remain unclear. Using a Bayesian Markov-chain-Monte-Carlo stochastic inversion methodology, we examine in this paper the information content of time-lapse zero-offset-profile (ZOP) GPR traveltime data, collected under three different infiltration conditions, for the estimation of van Genuchten-Mualem (VGM) parameters in a layered subsurface medium. Specifically, we systematically analyze synthetic and field GPR data acquired under natural loading and two rates of forced infiltration, and we consider the value of incorporating different amounts of time-lapse measurements into the estimation procedure. Our results confirm that, for all infiltration scenarios considered, the ZOP GPR traveltime data contain important information about subsurface hydraulic properties as a function of depth, with forced infiltration offering the greatest potential for VGM parameter refinement because of the higher stressing of the hydrological system. Considering greater amounts of time-lapse data in the inversion procedure is also found to help refine VGM parameter estimates. Quite importantly, however, inconsistencies observed in the field results point to the strong possibility that posterior uncertainties are being influenced by model structural errors, which in turn underlines the fundamental importance of a systematic analysis of such errors in future related studies.
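For readers unfamiliar with the estimation machinery, the following is a minimal, hedged sketch of a random-walk Metropolis (MCMC) inversion of van Genuchten parameters. It is not the study's inversion: the real work drives a coupled hydrological/petrophysical forward model with ZOP GPR traveltimes, whereas here the forward model is just the van Genuchten retention curve and the data are synthetic; parameter ranges, priors and step sizes are illustrative assumptions.

```python
# Hedged sketch: random-walk Metropolis inversion of van Genuchten parameters
# against synthetic retention data. All numbers below are placeholders.
import numpy as np

rng = np.random.default_rng(0)

def van_genuchten(h, alpha, n, theta_r=0.05, theta_s=0.40):
    """Volumetric water content as a function of suction head h (van Genuchten)."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m

# Synthetic "observed" data generated with known parameters plus noise.
h_obs = np.linspace(0.1, 10.0, 25)                 # suction heads [m]
true_alpha, true_n, sigma = 1.5, 2.0, 0.01
d_obs = van_genuchten(h_obs, true_alpha, true_n) + rng.normal(0, sigma, h_obs.size)

def log_posterior(alpha, n):
    # Uniform priors on physically plausible ranges (illustrative bounds).
    if not (0.1 < alpha < 10.0 and 1.1 < n < 5.0):
        return -np.inf
    resid = d_obs - van_genuchten(h_obs, alpha, n)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2  # Gaussian likelihood

# Random-walk Metropolis sampler.
n_iter, step = 20000, np.array([0.05, 0.05])
chain = np.empty((n_iter, 2))
current = np.array([1.0, 1.5])
current_lp = log_posterior(*current)
for i in range(n_iter):
    proposal = current + rng.normal(0, step)
    prop_lp = log_posterior(*proposal)
    if np.log(rng.uniform()) < prop_lp - current_lp:   # accept/reject step
        current, current_lp = proposal, prop_lp
    chain[i] = current

burn = chain[n_iter // 2:]                             # discard burn-in
print("posterior mean alpha, n:", burn.mean(axis=0))
print("posterior std  alpha, n:", burn.std(axis=0))
```

The posterior spread of the retained samples plays the role of the parameter uncertainties discussed in the abstract; incorporating more time-lapse data would simply enlarge the misfit term.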
Abstract:
BACKGROUND Multiple sclerosis (MS) is a neurodegenerative, autoimmune disease of the central nervous system. Genome-wide association studies (GWAS) have identified over a hundred polymorphisms with modest individual effects on MS susceptibility, and they have confirmed the main individual effect of the Major Histocompatibility Complex. Additional risk loci containing immunologically relevant genes were found to be significantly overrepresented. Nonetheless, it is accepted that most of the genetic architecture underlying susceptibility to the disease remains to be defined. Candidate association studies of the leukocyte immunoglobulin-like receptor LILRA3 gene in MS have been repeatedly reported with inconsistent results. OBJECTIVES In an attempt to shed some light on these controversial findings, a combined analysis was performed including the previously published datasets and three newly genotyped cohorts. Both wild-type and deleted LILRA3 alleles were discriminated in a single-tube PCR amplification and the resulting products were visualized by their different electrophoretic mobilities. RESULTS AND CONCLUSION Overall, this meta-analysis involved 3200 MS patients and 3069 matched healthy controls, and it did not show evidence of a significant association of the LILRA3 deletion [carriers of the LILRA3 deletion: p = 0.25, OR (95% CI) = 1.07 (0.95-1.19)], even after stratification by gender and the HLA-DRB1*15:01 risk allele.
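As a hedged illustration of how the reported summary statistic is formed, the snippet below computes an odds ratio with a 95% Wald (logit-based) confidence interval from a 2x2 carrier/non-carrier table; the counts are placeholders, not the meta-analysis data.

```python
# Hedged sketch: odds ratio with a 95% Wald confidence interval from a 2x2
# case/control table of deletion carriers vs. non-carriers (made-up counts).
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a = exposed cases, b = unexposed cases, c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Example with illustrative counts only:
print(odds_ratio_ci(a=1200, b=2000, c=1100, d=1969))
```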
Abstract:
Objectives: We are interested in the numerical simulation of the anastomotic region comprised between the outflow cannula of an LVAD and the aorta. Segmentation, geometry reconstruction and grid generation from patient-specific data remain an issue because of the variable quality of DICOM images, in particular CT scans (e.g. metallic noise of the device, non-aortic contrast phase). We propose a general framework to overcome this problem and create suitable grids for numerical simulations. Methods: Preliminary treatment of images is performed by reducing the level window and enhancing the contrast of the greyscale image using contrast-limited adaptive histogram equalization. A gradient anisotropic diffusion filter is applied to reduce the noise. Then, watershed segmentation algorithms and mathematical morphology filters allow reconstruction of the patient geometry. This is done using the InsightToolKit library (www.itk.org). Finally, the Vascular Modeling ToolKit (www.vmtk.org) and gmsh (www.geuz.org/gmsh) are used to create the meshes for the fluid (blood) and structure (arterial wall, outflow cannula) and to identify the boundary layers a priori. The method is tested on five different patients with left ventricular assist devices who underwent a CT-scan exam. Results: This method produced good results in four patients. The anastomosis area is recovered and the generated grids are suitable for numerical simulations. In one patient the method failed to produce a good segmentation because of the small dimension of the aortic arch with respect to the image resolution. Conclusions: The described framework allows the use of data that could not otherwise be segmented by standard automatic segmentation tools. In particular, the computational grids that have been generated are suitable for simulations that take into account fluid-structure interactions. Finally, the presented method features good reproducibility and fast application.
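A hedged sketch of this kind of pipeline using SimpleITK (the simplified Python wrapping of the ITK library named above) is shown below. The DICOM path, window bounds, filter parameters and seed voxel are illustrative assumptions, and the downstream VMTK/gmsh meshing step is not reproduced.

```python
# Hedged sketch: window/contrast enhancement, edge-preserving denoising and
# watershed segmentation of a CT volume with SimpleITK. Parameters are placeholders.
import SimpleITK as sitk

# Load a CT volume from a DICOM series directory (placeholder path).
reader = sitk.ImageSeriesReader()
reader.SetFileNames(reader.GetGDCMSeriesFileNames("/path/to/dicom_series"))
image = sitk.Cast(reader.Execute(), sitk.sitkFloat32)

# 1. Narrow the intensity window and enhance contrast (CLAHE-like step).
image = sitk.IntensityWindowing(image, windowMinimum=-100.0, windowMaximum=600.0,
                                outputMinimum=0.0, outputMaximum=255.0)
image = sitk.AdaptiveHistogramEqualization(image)

# 2. Edge-preserving denoising with gradient anisotropic diffusion.
diff = sitk.GradientAnisotropicDiffusionImageFilter()
diff.SetTimeStep(0.0625)
diff.SetNumberOfIterations(10)
diff.SetConductanceParameter(2.0)
smoothed = diff.Execute(image)

# 3. Watershed segmentation on the gradient magnitude.
grad = sitk.GradientMagnitude(smoothed)
ws_filter = sitk.MorphologicalWatershedImageFilter()
ws_filter.SetLevel(4.0)
ws_filter.SetMarkWatershedLine(False)
labels = ws_filter.Execute(grad)

# 4. Keep the watershed basin containing a manually chosen seed inside the aorta
#    (index order is x, y, z); mathematical morphology filters could then clean it up.
seed = (256, 256, 100)                      # placeholder voxel index
aorta_mask = labels == labels[seed]
sitk.WriteImage(aorta_mask, "aorta_mask.nii.gz")
```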
Abstract:
Evaluation of segmentation methods is a crucial aspect of image processing, especially in the medical imaging field, where small differences between segmented regions of the anatomy can be of paramount importance. Usually, segmentation evaluation is based on a measure that depends on the number of segmented voxels inside and outside of some reference regions, called gold standards. Although some other measures have also been used, in this work we propose a set of new similarity measures based on different features, such as the location and intensity values of the misclassified voxels, and the connectivity and boundaries of the segmented data. Using the multidimensional information provided by these measures, we propose a new evaluation method whose results are visualized by applying a Principal Component Analysis of the data, obtaining a simplified graphical method to compare different segmentation results. We have carried out an intensive study using several classic segmentation methods applied to a set of simulated MRI data of the brain with several noise and RF inhomogeneity levels, and also to real data, showing that the new measures proposed here, together with the results obtained from the multidimensional evaluation, improve the robustness of the evaluation and provide a better understanding of the differences between segmentation methods.
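A minimal sketch of the underlying idea, assuming synthetic data: compute several per-method similarity measures against a gold standard and project the resulting measure matrix onto its first two principal components for graphical comparison. The paper's richer measures (location and intensity of misclassified voxels, connectivity, boundaries) are not reproduced here.

```python
# Hedged sketch: overlap-based measures per method, then PCA of the measure matrix.
import numpy as np

def overlap_measures(seg, gold):
    """seg, gold: boolean arrays of the same shape."""
    tp = np.logical_and(seg, gold).sum()
    fp = np.logical_and(seg, ~gold).sum()
    fn = np.logical_and(~seg, gold).sum()
    dice = 2 * tp / (2 * tp + fp + fn)
    jaccard = tp / (tp + fp + fn)
    fpr = fp / max(seg.sum(), 1)          # fraction of segmented voxels outside gold
    fnr = fn / max(gold.sum(), 1)         # fraction of gold voxels missed
    return np.array([dice, jaccard, fpr, fnr])

# Toy example: three "methods" segmenting a random reference volume.
rng = np.random.default_rng(1)
gold = rng.random((32, 32, 32)) > 0.7
methods = {name: np.logical_xor(gold, rng.random(gold.shape) > p)
           for name, p in [("A", 0.95), ("B", 0.90), ("C", 0.85)]}

X = np.vstack([overlap_measures(seg, gold) for seg in methods.values()])

# PCA via SVD on the centred measure matrix; the first two scores give the
# simplified graphical comparison of methods.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T
for name, (pc1, pc2) in zip(methods, scores):
    print(f"method {name}: PC1 = {pc1:+.3f}, PC2 = {pc2:+.3f}")
```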
Abstract:
A new ambulatory technique for qualitative and quantitative movement analysis of the humerus is presented. 3D gyroscopes attached to the humerus were used to recognize the movement of the arm and to classify it as flexion, abduction or internal/external rotation. The method was first validated in a laboratory setting and then tested on 31 healthy volunteer subjects who carried the ambulatory system during 8 h of their daily life. For each recording, the periods of sitting, standing and walking during daily activity were detected using an inertial sensor attached to the chest. During each period of daily activity, the type of arm movement (flexion, abduction, internal/external rotation), its velocity and its frequency (number of movements/hour) were estimated. The results showed that during the whole daily activity and for each activity (i.e. sitting, standing and walking) the frequency of internal/external rotation was significantly higher, while the frequency of abduction was the lowest (P < 0.009). In spite of a higher number of flexions, abductions and internal/external rotations in the dominant arm, we did not observe a significant difference from the non-dominant arm in our population, implying that in healthy subjects arm dominance is not strongly reflected in the number of movements. As expected, the frequency of movement increased from sitting to standing and from standing to walking, and we provide a quantitative value of this change during daily activity. This study provides preliminary evidence that this system is a useful tool for objectively assessing upper-limb activity during daily life. The results obtained with the healthy population could be used as control data to evaluate the arm movement of patients with shoulder diseases during daily activity.
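The detection/classification algorithm itself is not given in the abstract; the sketch below is only one plausible, simplified way to count arm-movement events from a 3-axis humerus gyroscope and label each event by its dominant rotation axis. The sampling rate, thresholds and axis-to-movement mapping are assumptions.

```python
# Hedged sketch, not the authors' algorithm: threshold the angular rate to detect
# movement events, integrate each event, and label it by its dominant rotation axis.
import numpy as np

FS = 100.0                                   # sampling frequency [Hz], assumed
AXES = ["flexion/extension", "abduction/adduction", "internal/external rotation"]

def count_movements(gyro, rate_thresh=30.0, min_samples=20):
    """gyro: (N, 3) angular velocity in deg/s. Returns event counts per movement type."""
    moving = np.linalg.norm(gyro, axis=1) > rate_thresh
    counts = dict.fromkeys(AXES, 0)
    start = None
    for i, m in enumerate(np.append(moving, False)):
        if m and start is None:
            start = i
        elif not m and start is not None:
            if i - start >= min_samples:
                # Approximate angular displacement per axis; the largest labels the event.
                angle = np.abs(gyro[start:i].sum(axis=0)) / FS
                counts[AXES[int(np.argmax(angle))]] += 1
            start = None
    return counts

# Toy usage with synthetic data (the study analysed 8 h of daily recording).
rng = np.random.default_rng(2)
demo = rng.normal(0, 5, size=(int(60 * FS), 3))      # one minute of "rest"
demo[1000:1200, 2] += 80.0                            # a burst of axial rotation
print(count_movements(demo))
```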
Abstract:
Several eco-toxicological studies have shown that insectivorous mammals, due to their feeding habits, easily accumulate high amounts of pollutants in relation to other mammal species. To assess the bio-accumulation levels of toxic metals and their influence on essential metals, we quantified the concentration of 19 elements (Ca, K, Fe, B, P, S, Na, Al, Zn, Ba, Rb, Sr, Cu, Mn, Hg, Cd, Mo, Cr and Pb) in bones of 105 greater white-toothed shrews (Crocidura russula) from a polluted (Ebro Delta) and a control (Medas Islands) area. Since chemical contents of a bio-indicator are mainly compositional data, conventional statistical analyses currently used in eco-toxicology can give misleading results. Therefore, to improve the interpretation of the data obtained, we used statistical techniques for compositional data analysis to define groups of metals and to evaluate the relationships between them, from an inter-population viewpoint. Hypothesis testing on the adequate balance-coordinates allows us to confirm intuition-based hypotheses and some previous results. The main statistical goal was to test equal means of balance-coordinates for the two defined populations. After checking normality, one-way ANOVA or Mann-Whitney tests were carried out for the inter-group balances.
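A hedged sketch of the balance-based comparison, with an illustrative element grouping and synthetic compositions rather than the study's measurements: one ilr balance contrasting a "toxic" against an "essential" group is computed per specimen, normality is checked, and either one-way ANOVA or a Mann-Whitney test is applied.

```python
# Hedged sketch: one isometric log-ratio balance compared across two populations.
import numpy as np
from scipy import stats

def balance(comp, idx_num, idx_den):
    """ilr balance: sqrt(rs/(r+s)) * ln( gmean(parts idx_num) / gmean(parts idx_den) )."""
    r, s = len(idx_num), len(idx_den)
    g_num = np.exp(np.log(comp[:, idx_num]).mean(axis=1))
    g_den = np.exp(np.log(comp[:, idx_den]).mean(axis=1))
    return np.sqrt(r * s / (r + s)) * np.log(g_num / g_den)

# Toy compositions (rows sum to 1); columns could stand for e.g. [Pb, Cd, Hg, Ca, P, Zn].
rng = np.random.default_rng(3)
polluted = rng.dirichlet([2, 2, 2, 30, 20, 10], size=40)
control = rng.dirichlet([1, 1, 1, 30, 20, 10], size=40)

toxic, essential = [0, 1, 2], [3, 4, 5]
b_pol = balance(polluted, toxic, essential)
b_con = balance(control, toxic, essential)

# Check normality of each group, then pick the test, mirroring the study design.
if stats.shapiro(b_pol).pvalue > 0.05 and stats.shapiro(b_con).pvalue > 0.05:
    print(stats.f_oneway(b_pol, b_con))          # one-way ANOVA (two groups)
else:
    print(stats.mannwhitneyu(b_pol, b_con))
```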
Abstract:
The first discussion of compositional data analysis is attributable to Karl Pearson, in 1897. However, notwithstanding the recent developments on the algebraic structure of the simplex, more than twenty years after Aitchison’s idea of log-transformations of closed data, the scientific literature is again full of statistical treatments of this type of data using traditional methodologies. This is particularly true in environmental geochemistry where, besides the problem of closure, the spatial structure (dependence) of the data has to be considered. In this work we propose the use of log-contrast values, obtained by a simplicial principal component analysis, as indicators of given environmental conditions. The investigation of the log-contrast frequency distributions allows us to point out the statistical laws able to generate the values and to govern their variability. The changes, if compared, for example, with the mean values of the random variables assumed as models, or other reference parameters, allow us to define monitors to be used to assess the extent of possible environmental contamination. A case study on running and ground waters from Chiavenna Valley (Northern Italy), using Na+, K+, Ca2+, Mg2+, HCO3-, SO42- and Cl- concentrations, will be illustrated.
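The following sketch shows, under stated assumptions, what a simplicial (clr-based) principal component analysis looks like: the first loading vector defines a log-contrast whose scores could serve as the indicators mentioned above. The ionic compositions here are synthetic placeholders for the concentrations named in the abstract.

```python
# Hedged sketch: centred log-ratio transform followed by PCA; the first loading
# (which sums to ~0) defines a log-contrast and its scores are indicator values.
import numpy as np

IONS = ["Na", "K", "Ca", "Mg", "HCO3", "SO4", "Cl"]

rng = np.random.default_rng(4)
X = rng.dirichlet([8, 2, 10, 4, 20, 6, 9], size=200)   # closed compositions (rows sum to 1)

# Centred log-ratio transform, then ordinary PCA on the clr coordinates.
clr = np.log(X) - np.log(X).mean(axis=1, keepdims=True)
clr_c = clr - clr.mean(axis=0)
U, S, Vt = np.linalg.svd(clr_c, full_matrices=False)

loading1 = Vt[0]                 # coefficients of the first log-contrast
scores1 = clr_c @ loading1       # indicator values whose distribution would be examined
print(dict(zip(IONS, np.round(loading1, 3))))
print("first log-contrast: mean %.3f, sd %.3f" % (scores1.mean(), scores1.std()))
```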
Abstract:
Objective: The Agency for Healthcare Research and Quality (AHRQ) developed Patient Safety Indicators (PSIs) for use with ICD-9-CM data. Many countries have adopted ICD-10 for coding hospital diagnoses. We conducted this study to develop an internationally harmonized ICD-10 coding algorithm for the AHRQ PSIs. Methods: The AHRQ PSI Version 2.1 has been translated into ICD-10-AM (Australian Modification), and PSI Version 3.0a has been independently translated into ICD-10-GM (German Modification). We converted these two country-specific coding algorithms into ICD-10-WHO (World Health Organization version) and combined them to form one master list. Members of an international expert panel, including physicians, professional medical coders, disease classification specialists, health services researchers, epidemiologists, and users of the PSI, independently evaluated this master list and rated each code as either "include," "exclude," or "uncertain," following the AHRQ PSI definitions. After summarizing the independent rating results, we held a face-to-face meeting to discuss codes for which there was no unanimous consensus and newly proposed codes. A modified Delphi method was employed to generate a final ICD-10-WHO coding list. Results: Of 20 PSIs, 15 that were based mainly on diagnosis codes were selected for translation. At the meeting, panelists discussed 794 codes for which consensus had not been achieved and 2,541 additional codes that were proposed by individual panelists for consideration prior to the meeting. Three documents were generated: a PSI ICD-10-WHO version coding list, a list of issues for consideration on certain AHRQ PSIs and ICD-9-CM codes, and a recommendation to WHO to improve specification of some disease classifications. Conclusion: An ICD-10-WHO PSI coding list has been developed and structured in a manner similar to the AHRQ manual. Although face validity of the list has been ensured through a rigorous expert panel assessment, its true validity and applicability should be assessed internationally.
Abstract:
PURPOSE: We describe a retinal endovascular fibrinolysis technique to directly reperfuse experimentally occluded retinal veins using a simple micropipette. METHODS: Retinal vein occlusion was photochemically induced in 12 eyes of 12 minipigs: after intravenous injection of 10% fluorescein (1-mL bolus), the targeted retinal vein segment was exposed to thrombin (50 units) and to Argon laser (100-200 mW) through a pars plana approach. A beveled micropipette with a 30-μm-diameter sharp edge was used for micropuncture of the occluded vein and endovascular microinjection of tissue plasminogen activator (50 μg/mL) in 11 eyes. In one control eye, balanced salt solution was injected. The lesion site was examined histologically. RESULTS: Retinal vein occlusion was achieved in all cases. Endovascular microinjection of tissue plasminogen activator or balanced salt solution led to reperfusion of the occluded retinal vein in all cases. Indicative of successful reperfusion were the following: continuous endovascular flow, unaffected collateral circulation, no optic disk ischemia, and no venous wall bleeding. However, balanced salt solution injection was accompanied by thrombus formation at the punctured site, whereas no thrombus was observed with tissue plasminogen activator injection. CONCLUSION: Retinal endovascular fibrinolysis constitutes an efficient method of micropuncture and reperfusion of an experimentally occluded retinal vein. Thrombus formation at the punctured site can be prevented by injection of tissue plasminogen activator.
Abstract:
Structural analysis of low-grade rocks highlights the allochthonous character of Mesozoic schists in southeastern Rhodope, Bulgaria. The deformation can be related to the Late Jurassic-Early Cretaceous thrusting and Tertiary detachment faulting. Petrologic and geochemical data show a volcanic arc origin of the greenschists and basaltic rocks. These results are interpreted as representing an island arc-accretionary complex related to the southward subduction of the Meliata-Maliac Ocean under the supra-subduction back-arc Vardar ocean/island arc system. This arc-trench system collided with the Rhodope in Late Jurassic times.
Abstract:
With advances in the effectiveness of treatment and disease management, the contribution of chronic comorbid diseases (comorbidities) found within the Charlson comorbidity index to mortality is likely to have changed since development of the index in 1984. The authors reevaluated the Charlson index and reassigned weights to each condition by identifying and following patients to observe mortality within 1 year after hospital discharge. They applied the updated index and weights to hospital discharge data from 6 countries and tested for their ability to predict in-hospital mortality. Compared with the original Charlson weights, weights generated from the Calgary, Alberta, Canada, data (2004) were 0 for 5 comorbidities, decreased for 3 comorbidities, increased for 4 comorbidities, and did not change for 5 comorbidities. The C statistics for discriminating in-hospital mortality between the new score generated from the 12 comorbidities and the Charlson score were 0.825 (new) and 0.808 (old), respectively, in Australian data (2008), 0.828 and 0.825 in Canadian data (2008), 0.878 and 0.882 in French data (2004), 0.727 and 0.723 in Japanese data (2008), 0.831 and 0.836 in New Zealand data (2008), and 0.869 and 0.876 in Swiss data (2008). The updated index of 12 comorbidities showed good-to-excellent discrimination in predicting in-hospital mortality in data from 6 countries and may be more appropriate for use with more recent administrative data.
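As a hedged sketch of the mechanics involved, the snippet below scores toy discharge records with an illustrative (not the study's) comorbidity weight map and computes the C statistic for in-hospital mortality by its rank-based definition.

```python
# Hedged sketch: weighted comorbidity score and C statistic (AUC) for mortality.
# Weights and records are made-up placeholders, not the reweighted Charlson index.
import numpy as np

WEIGHTS = {                                   # illustrative weights only
    "chf": 2, "dementia": 2, "copd": 1, "mild_liver": 2,
    "renal": 1, "metastatic_cancer": 6,
}

def comorbidity_score(diagnoses):
    return sum(WEIGHTS.get(d, 0) for d in diagnoses)

def c_statistic(scores, died):
    """Probability that a random death has a higher score than a random survivor
    (ties count one half), computed by brute force for clarity."""
    scores, died = np.asarray(scores), np.asarray(died, dtype=bool)
    pos, neg = scores[died], scores[~died]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# Toy discharge records: (list of comorbidities, died in hospital?).
records = [
    (["chf", "renal"], True),
    (["copd"], False),
    (["metastatic_cancer", "dementia"], True),
    ([], False),
    (["mild_liver"], False),
    (["chf", "copd", "renal"], True),
]
scores = [comorbidity_score(dx) for dx, _ in records]
died = [d for _, d in records]
print("scores:", scores)
print("C statistic: %.3f" % c_statistic(scores, died))
```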
Abstract:
The 2009-2010 Data Fusion Contest organized by the Data Fusion Technical Committee of the IEEE Geoscience and Remote Sensing Society was focused on the detection of flooded areas using multi-temporal and multi-modal images. Both high spatial resolution optical and synthetic aperture radar data were provided. The goal was not only to identify the best algorithms (in terms of accuracy), but also to investigate the further improvement derived from decision fusion. This paper presents the four awarded algorithms and the conclusions of the contest, investigating both supervised and unsupervised methods and the use of multi-modal data for flood detection. Interestingly, a simple unsupervised change detection method provided accuracy similar to that of the supervised approaches, and a digital elevation model-based predictive method yielded a comparable projected change detection map without using post-event data.
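A minimal sketch of the kind of simple unsupervised change detection the contest found competitive, under the assumption of two co-registered single-band images: difference the acquisitions and threshold the change magnitude with an inline Otsu threshold. Real entries worked on multi-modal SAR/optical data with more careful preprocessing.

```python
# Hedged sketch: unsupervised change detection by image differencing + Otsu threshold.
import numpy as np

def otsu_threshold(values, bins=256):
    """Return the threshold maximizing between-class variance of a 1-D sample."""
    hist, edges = np.histogram(values, bins=bins)
    hist = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    best_t, best_var = centers[0], -1.0
    for i in range(1, bins):
        w0, w1 = hist[:i].sum(), hist[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (hist[:i] * centers[:i]).sum() / w0
        m1 = (hist[i:] * centers[i:]).sum() / w1
        between = w0 * w1 * (m0 - m1) ** 2     # between-class variance
        if between > best_var:
            best_var, best_t = between, centers[i]
    return best_t

# Synthetic pre-/post-event images with a "flooded" patch of changed intensity.
rng = np.random.default_rng(5)
pre = rng.normal(0.3, 0.05, size=(128, 128))
post = pre + rng.normal(0.0, 0.02, size=pre.shape)
post[40:80, 50:110] -= 0.2                      # flooded area appears darker

change = np.abs(post - pre)
flood_mask = change > otsu_threshold(change.ravel())
print("flagged pixels:", int(flood_mask.sum()), "of", flood_mask.size)
```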
Abstract:
The ‘Granny Smith’ apple variety is currently harvested according to the time elapsed since the tree’s flowering, rather than according to fruit quality parameters. The objectives of this work are to determine the optimal harvest date according to the degree of consumer satisfaction (by means of a consumer tasting panel), to relate susceptibility to superficial scald to maturity parameters at harvest (assessing fruit by fruit after a 5-month period of cold storage), and to fine-tune the DA-meter as a tool for managing the ‘Granny Smith’ harvest. The conclusions of the work were as follows: consumers rated the fruit harvested greener more positively, but this fruit has the drawback of a much higher incidence of superficial scald than fruit harvested at a more advanced stage of maturity. As for the DA-meter, as currently conceived it is not yet ready to be used as a tool for managing the ‘Granny Smith’ harvest.