908 results for METHOD OF MULTIPLE SCALES
Abstract:
Researchers frequently have to analyze scales in which some participants have failed to respond to some items. In this paper we focus on the exploratory factor analysis of multidimensional scales (i.e., scales that consist of a number of subscales) where each subscale is made up of a number of Likert-type items, and the aim of the analysis is to estimate participants' scores on the corresponding latent traits. We propose a new approach for dealing with missing responses in this situation, based on (1) multiple imputation of non-responses and (2) simultaneous rotation of the imputed datasets. We applied the approach to a real dataset in which missing responses were artificially introduced following a real pattern of non-responses, and to a simulation study based on artificial datasets. The results show that our approach (specifically, Hot-Deck multiple imputation followed by Consensus Promin rotation) was able to successfully compute factor score estimates even for participants with missing data.
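A minimal sketch of this kind of pipeline, assuming numeric Likert responses with NaN marking non-responses: a random-donor hot-deck imputation followed by exploratory factor analysis on each completed dataset, with a simple orthogonal Procrustes alignment standing in for the Consensus Promin rotation described in the abstract (function names and parameters are illustrative only).

import numpy as np
from scipy.linalg import orthogonal_procrustes
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

def hot_deck_impute(X, rng):
    # Fill each missing cell with the response of a randomly chosen donor on that item.
    X = X.copy()
    for j in range(X.shape[1]):
        missing = np.isnan(X[:, j])
        X[missing, j] = rng.choice(X[~missing, j], size=missing.sum())
    return X

def pooled_factor_scores(X, n_factors=2, n_imputations=5):
    # EFA on each imputed dataset; align the solutions before averaging the scores.
    scores, reference = [], None
    for _ in range(n_imputations):
        Xc = hot_deck_impute(X, rng)
        fa = FactorAnalysis(n_components=n_factors).fit(Xc)
        s = fa.transform(Xc)
        if reference is None:
            reference = s
        else:
            R, _ = orthogonal_procrustes(s, reference)  # resolve rotation/sign indeterminacy
            s = s @ R
        scores.append(s)
    return np.mean(scores, axis=0)  # pooled factor score estimates per participant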
Abstract:
Immunolabeling is commonly used to localize antigens within frozen or paraffin tissue sections. We modified existing immunolabeling techniques to allow the detection of three antigens simultaneously within a single tissue section. The approach relies on the use of three monoclonal antibodies in sequential immunoperoxidase staining steps, each with a differently colored substrate, resulting in the deposition of black, brown, and rose stains. The method is rapid and does not require novel techniques or materials. In this report, we demonstrate the colocalization of mast cell tryptase, neurofilament protein, and CD31 (platelet-endothelial cell adhesion molecule) or laminin in normal human skin and normal buccal mucosa, as an illustration of the power and simplicity of the multiple antigen localization technique.
Abstract:
Bioelectrical impedance analysis has found extensive application as a simple noninvasive method for the assessment of body fluid volumes. The measured impedance is, however, related not only to the volume of fluid but also to its inherent resistivity. The primary determinant of the resistivities of body fluids is the concentration of ions. The aim of this study was to investigate the sensitivity of bioelectrical impedance analysis to bodily ion status. Whole body impedance of rats was measured over a range of frequencies (4-1012 kHz) during infusion of various concentrations of saline, concomitant with measurement of total body and intracellular water by tracer dilution techniques. Extracellular resistance (Ro), intracellular resistance (Ri) and impedance at the characteristic frequency (Zc) were calculated. Ro and Zc were used to predict extracellular and total body water, respectively, using previously published formulae. The results showed that whilst Ro and Zc decreased in proportion to the amount of NaCl infused, Ri increased only slightly. Impedances at the end of infusion predicted increases in TBW and ECW of approximately 4-6%, despite a volume increase of less than 0.5% in TBW due to the volume of fluid infused. These data are discussed in relation to the assumption of constant resistivity in the prediction of fluid volumes from impedance data.
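As a rough illustration of how these quantities relate, here is a minimal Cole-model sketch (all parameter values are assumed, not taken from the study): it shows how Ro, Ri and the impedance at the characteristic frequency Zc are linked, and why a drop in extracellular resistivity after saline infusion lowers Ro and Zc even when the infused volume is small.

import numpy as np

def cole_impedance(f_khz, Ro, Rinf, fc_khz, alpha=1.0):
    # Complex impedance of the Cole model at frequency f (kHz).
    return Rinf + (Ro - Rinf) / (1 + (1j * f_khz / fc_khz) ** alpha)

Ro = 500.0                      # extracellular resistance (ohm), assumed
Ri = 900.0                      # intracellular resistance (ohm), assumed
Rinf = Ro * Ri / (Ro + Ri)      # resistance at infinite frequency
fc = 50.0                       # characteristic frequency (kHz), assumed

freqs = np.linspace(4, 1012, 200)            # measurement range quoted in the abstract
spectrum = cole_impedance(freqs, Ro, Rinf, fc)
Zc = abs(cole_impedance(fc, Ro, Rinf, fc))   # impedance modulus at the characteristic frequency

# Saline infusion lowers extracellular resistivity, so Ro (and with it Zc)
# falls roughly in proportion, even though the infused volume is small.
Ro_infused = 0.8 * Ro
Zc_infused = abs(cole_impedance(fc, Ro_infused, Ro_infused * Ri / (Ro_infused + Ri), fc))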
Abstract:
Quantifying mass and energy exchanges within tropical forests is essential for understanding their role in the global carbon budget and how they will respond to perturbations in climate. This study reviews ecosystem process models designed to predict the growth and productivity of temperate and tropical forest ecosystems. Temperate forest models were included because of the small number of tropical forest models. The review provides a multiscale assessment enabling potential users to select a model suited to the scale and type of information they require in tropical forests. Process models are reviewed in relation to their input and output parameters, minimum spatial and temporal units of operation, maximum spatial extent and time period of application for each organizational level of modelling. Organizational levels included leaf-tree, plot-stand, regional and ecosystem levels, with model complexity decreasing as the time-step and spatial extent of model operation increase. All ecosystem models are simplified versions of reality and are typically aspatial. Remotely sensed data sets and derived products may be used to initialize, drive and validate ecosystem process models. At the simplest level, remotely sensed data are used to delimit the location, extent and changes over time of vegetation communities. At a more advanced level, remotely sensed data products have been used to estimate key structural and biophysical properties associated with ecosystem processes in tropical and temperate forests. Combining ecological models and image data enables the development of carbon accounting systems that will contribute to understanding greenhouse gas budgets at biome and global scales.
Abstract:
Trials conducted in Queensland, Australia between 1997 and 2002 demonstrated that fungicides belonging to the triazole group were the most effective in minimising the severity of infection of sorghum by Claviceps africana, the causal agent of sorghum ergot. Triadimenol (as Bayfidan 250EC) at 0.125 kg a.i./ha was the most effective fungicide. A combination of the systemic activated resistance compound acibenzolar-S-methyl (as Bion 50WG) at 0.05 kg a.i./ha and mancozeb (as Penncozeb 750DF) at 1.5 kg a.i./ha has the potential to provide protection against the pathogen, should triazole-resistant isolates be detected. Timing and method of fungicide application are important. Our results suggest that the triazole fungicides have no systemic activity in sorghum panicles, necessitating multiple applications from first anthesis to the end of flowering, whereas acibenzolar-S-methyl is most effective when applied 4 days before flowering. The flat fan nozzles tested in the trials provided higher levels of protection against C. africana and greater droplet deposition on panicles than the tested hollow cone nozzles. Application of triadimenol by a fixed-wing aircraft was as efficacious as application through a tractor-mounted boom spray.
Abstract:
In the last decade, local image features have been widely used in robot visual localization. To assess image similarity, a strategy exploiting these features compares raw descriptors extracted from the current image with those in the models of places. This paper addresses the ensuing step in this process, where a combining function must be used to aggregate results and assign each place a score. Casting the problem in the multiple classifier systems framework, we compare several candidate combiners with respect to their performance in the visual localization task. For this evaluation, we selected the most popular methods in the class of non-trained combiners, namely the sum rule and the product rule. A deeper insight into the potential of these combiners is provided through a discriminativity analysis involving the algebraic rules and two extensions of these methods: the thresholded and the weighted modifications. In addition, a voting method previously used in robot visual localization is assessed. Furthermore, we address the process of constructing a model of the environment by describing how model granularity impacts performance. All combiners are tested on a visual localization task carried out on a public dataset. It is experimentally demonstrated that the sum rule extensions globally achieve the best performance, confirming the general agreement on the robustness of this rule in other classification problems. The voting method, whilst competitive with the product rule in its standard form, is shown to be outperformed by its modified versions.
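For readers unfamiliar with these non-trained combiners, a small sketch of the rules discussed above (the scores are hypothetical per-matcher similarities, one row per matcher and one column per place; the weighted and voting variants are illustrative simplifications):

import numpy as np

def sum_rule(scores):                    # scores: (n_matchers, n_places)
    return scores.sum(axis=0)

def product_rule(scores):
    return scores.prod(axis=0)

def weighted_sum_rule(scores, weights):  # weighted modification of the sum rule
    return (np.asarray(weights)[:, None] * scores).sum(axis=0)

def majority_vote(scores):               # each matcher votes for its best-scoring place
    votes = np.argmax(scores, axis=1)
    return np.bincount(votes, minlength=scores.shape[1])

scores = np.array([[0.9, 0.4, 0.2],      # matcher 1 similarities to places A, B, C
                   [0.7, 0.5, 0.1]])     # matcher 2
best_place = int(np.argmax(sum_rule(scores)))   # index of the winning place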
Abstract:
This technical report is a document prepared as a deliverable [D4.3 Report of the Interlinkages and forecasting prototype tool] of an EU project – DECOIN Project No. 044428 - FP6-2005-SSP-5A. The text is divided into 4 sections: (1) this short introductory section explains the purpose of the report; (2) the second section provides a general discussion of a systemic problem found in existing quantitative analyses of sustainability. It addresses the epistemological implications of complexity, which entail the need to deal with the existence of multiple scales and non-equivalent narratives (multiple dimensions/attributes) used to define sustainability issues. There is an unavoidable tension between a “steady-state view” (= the perception of what is going on now – reflecting a PAST --> PRESENT view of reality) and an “evolutionary view” (= the unknown transformation that we have to expect in the process of becoming of the observed reality and in the observer – reflecting a PRESENT --> FUTURE view of reality). The section ends by listing the implications of these points for the choice of integrated packages of sustainability indicators; (3) the third section illustrates the potential of the DECOIN toolkit for the study of sustainability trade-offs and linkages across indicators, using quantitative examples taken from case studies of another EU project (SMILE). In particular, this section starts by addressing the existence of internal constraints to sustainability (economic versus social aspects). The narrative chosen for this discussion focuses on the dark side of ageing and immigration for the economic viability of social systems. The section then continues by exploring external constraints to sustainability (economic development versus the environment). The narrative chosen for this discussion focuses on the dark side of the current strategy of economic development based on externalization and the “bubbles-disease”; (4) the last section presents a critical appraisal of the quality of energy data found in energy statistics. It starts with a discussion of the general goal of statistical accounting. It then introduces the concept of multipurpose grammars. The second part uses the experience gained in the activities of the DECOIN project to answer the question: how useful are EUROSTAT energy statistics? The answer starts with an analysis of basic epistemological problems associated with the accounting of energy. This discussion leads to the acknowledgment of an important epistemological problem: the unavoidable bifurcations in the mechanism of accounting needed to generate energy statistics. Using numerical examples, the text deals with the following issues: (i) the pitfalls of the current system of accounting in energy statistics; (ii) a critical appraisal of the current system of accounting in BP statistics; (iii) a critical appraisal of the current system of accounting in Eurostat statistics. The section ends by proposing an innovative method for representing energy statistics which can be more useful for those wishing to develop sustainability indicators.
Abstract:
Multiple endocrine neoplasia type 2A (MEN2A) is a monogenic disorder with an autosomal dominant pattern of inheritance, characterized by a high risk of medullary thyroid carcinoma in all mutation carriers. Although this disorder is classified as a rare disease, affected patients have a low quality of life and require expensive, continuous treatment. At present, MEN2A is diagnosed by gene sequencing after birth, with the aim of starting treatment early and reducing morbidity and mortality. We first evaluated the presence of the MEN2A mutation (C634Y) in the serum of 25 patients, previously diagnosed by sequencing of peripheral blood leucocytes, using HRM genotyping analysis. In a second step, we used a COLD-PCR approach followed by HRM genotyping analysis for the non-invasive prenatal diagnosis of a pregnant woman carrying a fetus with a C634Y mutation. HRM analysis revealed differences in melting curve shapes that correlated with the patients diagnosed with MEN2A by gene sequencing analysis with 100% accuracy. Moreover, the pregnant woman carrying the fetus with the C634Y mutation showed a melting curve shape in agreement with the positive controls in the COLD-PCR study. The mutation was confirmed by sequencing of the COLD-PCR amplification product. In conclusion, we have established HRM analysis of serum samples as a new primary diagnostic method suitable for the detection of C634Y mutations in MEN2A patients. Simultaneously, we have exploited the increased sensitivity of the COLD-PCR approach combined with HRM analysis for the non-invasive prenatal diagnosis of C634Y fetal mutations in the serum of pregnant women.
Abstract:
The Committee of the European Concerted Action for Multiple Sclerosis (Charcot Foundation) organised five workshops to discuss CSF analytical standards in the diagnosis of multiple sclerosis. This consensus report from 12 European countries summarises the results of those workshops. It is hoped that neurologists will confer with their colleagues in clinical chemistry to arrange the best possible local practice. The most sensitive method for the detection of oligoclonal immunoglobulin bands is isoelectric focusing. The same amounts of IgG in parallel CSF and serum samples are used and oligoclonal bands are revealed with IgG-specific antibody staining. All laboratories performing isoelectric focusing should check their technique at least annually using "blind" standards for the five different CSF and serum patterns. Quantitative measurements of IgG production in the CNS are less sensitive than isoelectric focusing. The preferred method for detection of blood-CSF barrier dysfunction is the albumin quotient. The CSF albumin or total protein concentrations are less satisfactory. These results must be interpreted with reference to the age of the patient and the local method of determination. Cells should be counted. The normal value is no more than 4 cells/microliter. Among evolving optional tests, measurement of the combined local synthesis of antibodies against measles, rubella, and/or varicella zoster could represent a significant advance if it offers higher specificity (not sensitivity) for identifying chronic rather than acute inflammation. Other tests that may have useful correlations with clinical indices include those for oligoclonal free light chains, IgM, IgA, or myelin basic protein concentrations.
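A small sketch of the two quantitative indices referred to above: the albumin quotient as a marker of blood-CSF barrier dysfunction and the IgG index as a crude measure of intrathecal IgG synthesis. The age-dependent reference limit shown is a commonly cited approximation and is not taken from the consensus report itself.

def albumin_quotient(albumin_csf_mg_l, albumin_serum_g_l):
    # Q_alb = CSF albumin / serum albumin, computed with both values in mg/L.
    return albumin_csf_mg_l / (albumin_serum_g_l * 1000.0)

def igg_index(igg_csf_mg_l, igg_serum_g_l, q_alb):
    # Link IgG index: (IgG_CSF / IgG_serum) / (Alb_CSF / Alb_serum).
    q_igg = igg_csf_mg_l / (igg_serum_g_l * 1000.0)
    return q_igg / q_alb

def q_alb_upper_limit(age_years):
    # Approximate age-dependent upper reference limit for Q_alb.
    return (4.0 + age_years / 15.0) * 1e-3

q_alb = albumin_quotient(albumin_csf_mg_l=250.0, albumin_serum_g_l=40.0)   # 0.00625
barrier_dysfunction = q_alb > q_alb_upper_limit(age_years=35)              # False for this example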
Abstract:
Anabolic androgenic steroids (AAS) are doping agents that are mostly used to improve strength and muscle hypertrophy. In some sports, athletes have reported that the intake of AAS is associated with better recovery and a higher training load capacity, and therefore an increase in physical and mental performance. The purpose of this study was to evaluate the effect of multiple doses of AAS on different physiological parameters that could indirectly reflect the physical state of athletes during a hard endurance training program. In a double-blind setting, three groups (n = 9, 8 and 8) were orally administered placebo, testosterone undecanoate or 19-norandrostenedione, 12 times during 1 month. Serum biomarkers (creatine kinase, ASAT and urea), serum hormone profiles (testosterone, cortisol and LH) and urinary catecholamines (noradrenalin, adrenalin and dopamine) were evaluated during the treatment. Running performance was assessed before and after the intervention phase by means of a standardized treadmill test. None of the measured biochemical variables showed a significant impact of AAS on physical stress level. Data from exercise testing at submaximal and maximal levels did not reveal any performance differences between the three groups or in their response to the treatment. In the present study, no effect of multiple oral doses of AAS on endurance performance or serum recovery markers was found.
Abstract:
Molecular species identification in mixed or contaminated biological material has always been problematic. We developed a simple and accurate method for mammal DNA identification in mixtures, based on interspecific mitochondrial DNA control region length polymorphism. Contrary to other published methods dealing with species mixtures, our protocol requires a single universal primer pair and a single amplification step, and is not based on a pre-defined panel of species. This protocol has been routinely employed by our laboratory for species identification in dozens of human and animal forensic cases. Six representative forensic cases involving the specific identification of mixed animal samples are reported in this paper, in order to demonstrate the applicability and usefulness of the method.
Abstract:
The use of multiple legal and illegal substances by adolescents is a growing concern in all countries, but since no consensus on a taxonomy has yet emerged, it is difficult to understand the different patterns of consumption and to implement tailored prevention and treatment programs directed towards specific subgroups of the adolescent population. Using data from a Swiss survey on adolescent health, we analyzed the age at which ten legal and illegal substances were consumed for the first time, applying a method that combines the strengths of automatic clustering and the judgment of substance-use experts. Results were then compared to 30 socio-economic factors to establish the usefulness of, and to validate, our taxonomy. We also analyzed the order of substance first use for each group. The final taxonomy consists of eight groups ranging from non-consumers to heavy drug addicts. All but four socio-economic factors were significantly associated with the taxonomy, the strongest associations being observed with health, behavior, and sexuality factors. Numerous factors influence adolescents in their decision to first try substances or to use them on a regular basis, and no factor alone can be considered an absolute marker of problematic substance use. Different processes of experimentation with substances are associated with different behaviors, so focusing on only one substance or only one factor is not effective. Prevention and treatment programs can then be tailored to address specific issues related to different youth subgroups.
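As a simplified stand-in for the clustering step described above (the expert-review component is not reproduced), age-at-first-use profiles for ten substances can be grouped automatically, for example with k-means; the ages below are invented and non-use is coded with a sentinel value.

import numpy as np
from sklearn.cluster import KMeans

NEVER_USED = 25.0   # sentinel for "not yet used", beyond the observed age range

# Rows: adolescents; columns: age at first use of each of ten substances (hypothetical).
ages = np.array([
    [13, 14] + [NEVER_USED] * 8,           # tried two substances only
    [12, 12, 13, 14] + [NEVER_USED] * 6,   # early, broader experimentation
    [NEVER_USED] * 10,                     # non-consumer
])

groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(ages)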
Abstract:
The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important first to define the influence of each factor. In particular, it was important to define the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data but was not proved to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both univariate and multivariate levels, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. Existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase is accompanied by the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was carried out with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partitioning was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset. A bottom-to-top approach to method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) proved better adapted for modeling high-threshold categorization and for automation. Support vector machines (SVM), in contrast, performed well under balanced category conditions.
In general, it was concluded that no single prediction or estimation method is best under all conditions of scale and neighborhood definition. Simulations should form the basis, while the other methods can provide complementary information to support efficient decision making on indoor radon.
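As a minimal illustration of the exploratory KNN step mentioned above, a distance-weighted k-nearest-neighbour estimate of indoor radon at an unsampled location (coordinates and concentrations are hypothetical):

import numpy as np
from sklearn.neighbors import KNeighborsRegressor

coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # measurement sites (km)
radon = np.array([80.0, 120.0, 300.0, 150.0])                        # indoor radon (Bq/m3)

knn = KNeighborsRegressor(n_neighbors=3, weights="distance").fit(coords, radon)
estimate = knn.predict([[0.5, 0.5]])   # inverse-distance-weighted estimate at a new site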
Abstract:
The integration of geophysical data into the subsurface characterization problem has been shown in many cases to significantly improve hydrological knowledge by providing information at spatial scales and locations that is unattainable using conventional hydrological measurement techniques. In particular, crosshole ground-penetrating radar (GPR) tomography has shown much promise in hydrology because of its ability to provide highly detailed images of subsurface radar wave velocity, which is strongly linked to soil water content. Here, we develop and demonstrate a procedure for inverting together multiple crosshole GPR data sets in order to characterize the spatial distribution of radar wave velocity below the water table at the Boise Hydrogeophysical Research Site (BHRS) near Boise, Idaho, USA. Specifically, we jointly invert 31 intersecting crosshole GPR profiles to obtain a highly resolved and consistent radar velocity model along the various profile directions. The model is found to be strongly correlated with complementary neutron porosity-log data and is further corroborated by larger-scale structural information at the BHRS. This work is an important prerequisite to using crosshole GPR data together with existing hydrological measurements for improved groundwater flow and contaminant transport modeling.
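A hedged sketch of the standard petrophysical link between radar velocity and porosity below the water table, using the CRIM mixing model (the specific relation used at the BHRS may differ, and the permittivity values are assumptions):

import numpy as np

C = 0.3             # speed of light in vacuum, m/ns
KAPPA_WATER = 80.0  # relative permittivity of water (assumed)
KAPPA_GRAIN = 5.0   # relative permittivity of the mineral grains (assumed)

def porosity_from_velocity(v_m_per_ns):
    # Invert the CRIM equation for a fully water-saturated medium:
    # sqrt(kappa_bulk) = (1 - phi) * sqrt(kappa_grain) + phi * sqrt(kappa_water)
    sqrt_kappa = C / v_m_per_ns
    return (sqrt_kappa - np.sqrt(KAPPA_GRAIN)) / (np.sqrt(KAPPA_WATER) - np.sqrt(KAPPA_GRAIN))

phi = porosity_from_velocity(0.08)   # e.g. 0.08 m/ns gives a porosity of about 0.23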