11 results for correlation methods
in CentAUR: Central Archive, University of Reading - UK
Abstract:
An important element of the developing field of proteomics is to understand protein-protein interactions and other functional links amongst genes. Across-species correlation methods for detecting functional links work on the premise that functionally linked proteins will tend to show a common pattern of presence and absence across a range of genomes. We describe a maximum likelihood statistical model for predicting functional gene linkages. The method detects independent instances of the correlated gain or loss of pairs of proteins on phylogenetic trees, reducing the high rates of false positives observed in conventional across-species methods that do not explicitly incorporate a phylogeny. We show, in a dataset of 10,551 protein pairs, that the phylogenetic method improves by up to 35% on across-species analyses at identifying known functionally linked proteins. The method shows that protein pairs with at least two to three correlated events of gain or loss are almost certainly functionally linked. Contingent evolution, in which one gene's presence or absence depends upon the presence of another, can also be detected phylogenetically, and may identify genes whose functional significance depends upon their interaction with other genes. Incorporating phylogenetic information improves the prediction of functional linkages. The improvement derives from having a lower rate of false positives and from detecting trends that across-species analyses miss. Phylogenetic methods can easily be incorporated into the screening of large-scale bioinformatics datasets to identify sets of protein links and to characterise gene networks.
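The across-species baseline that this phylogenetic method improves on can be illustrated with a minimal phylogenetic-profile sketch: each protein is reduced to a presence/absence vector over a set of genomes, and candidate links are ranked by profile correlation. The profiles below are made up for illustration.

```python
import numpy as np

def profile_correlation(profile_a, profile_b):
    """Pearson correlation of two binary presence/absence profiles,
    the basic across-species score (no phylogeny involved)."""
    a = np.asarray(profile_a, dtype=float)
    b = np.asarray(profile_b, dtype=float)
    return float(np.corrcoef(a, b)[0, 1])

# Presence (1) / absence (0) of two hypothetical proteins across eight genomes
prot_x = [1, 1, 0, 0, 1, 1, 0, 1]
prot_y = [1, 1, 0, 0, 1, 0, 0, 1]
print(profile_correlation(prot_x, prot_y))
```

The abstract's point is that such a score can be inflated by shared ancestry: sister genomes contribute non-independent observations, which the tree-aware likelihood method corrects for.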
Abstract:
Background. In separate studies and research from different perspectives, five factors are found to be among those related to higher quality outcomes of student learning (academic achievement). Those factors are higher self-efficacy, deeper approaches to learning, higher quality teaching, students’ perceptions that their workload is appropriate, and greater learning motivation. University learning improvement strategies have been built on these research results. Aim. To investigate how students’ evoked prior experience, perceptions of their learning environment, and their approaches to learning collectively contribute to academic achievement. This is the first study to investigate motivation and self-efficacy in the same educational context as conceptions of learning, approaches to learning and perceptions of the learning environment. Sample. Undergraduate students (773) from the full range of disciplines were part of a group of over 2,300 students who volunteered to complete a survey of their learning experience. On completing their degrees 6 and 18 months later, their academic achievement was matched with their learning experience survey data. Method. A 77-item questionnaire was used to gather students’ self-report of their evoked prior experience (self-efficacy, learning motivation, and conceptions of learning), perceptions of learning context (teaching quality and appropriate workload), and approaches to learning (deep and surface). Academic achievement was measured using the English honours degree classification system. Analyses were conducted using correlational and multi-variable (structural equation modelling) methods. Results. The results from the correlation methods confirmed those found in numerous earlier studies. The results from the multi-variable analyses indicated that surface approach to learning was the strongest predictor of academic achievement, with self-efficacy and motivation also found to be directly related. 
In contrast to the correlation results, a deep approach to learning was not related to academic achievement, and teaching quality and conceptions of learning were only indirectly related to achievement. Conclusions. Research aimed at understanding how students experience their learning environment and how that experience relates to the quality of their learning needs to be conducted using a wider range of variables and more sophisticated analytical methods. In this study of one context, some of the relations found in earlier bivariate studies, and on which learning intervention strategies have been built, are not confirmed when more holistic teaching–learning contexts are analysed using multi-variable methods.
Abstract:
Procedures for routine analysis of soil phosphorus (P) have been used for assessment of P status, distribution and P losses from cultivated mineral soils. No similar studies have been carried out on wetland peat soils. The objective was to compare the extraction efficiency of ammonium lactate (P-AL), sodium bicarbonate (P-Olsen), and double calcium lactate (P-DCaL) and P distribution in the soil profile of wetland peat soils. For this purpose, 34 samples of the 0-30, 30-60 and 60-90 cm layers were collected from peat soils in Germany, Israel, Poland, Slovenia, Sweden and the United Kingdom and analysed for P. Mean soil pH (CaCl2, 0.01 M) was 5.84, 5.51 and 5.47 in the 0-30, 30-60 and 60-90 cm layers, respectively. The P-DCaL was consistently about half the magnitude of either P-AL or P-Olsen. The efficiency of P extraction increased in the order P-DCaL < P-AL ≤ P-Olsen, with corresponding means (mg kg⁻¹) for all soils (34 samples) of 15.32, 33.49 and 34.27 in 0-30 cm; 8.87, 17.30 and 21.46 in 30-60 cm; and 5.69, 14.00 and 21.40 in 60-90 cm. The means decreased with depth. When examining soils for each country separately, P-Olsen was relatively evenly distributed in the German, UK and Slovenian soils. P-Olsen was linearly correlated (r = 0.594, P = 0.0002) with pH, whereas the three P tests (except P-Olsen vs P-DCaL) significantly correlated with each other (P = 0.0178 to 0.0001). The strongest correlation (r = 0.617, P = 0.0001) was recorded for P-AL vs P-DCaL, and the two methods were inter-convertible using a regression equation: P-AL = -22.593 + 5.353 pH + 1.423 P-DCaL, R² = 0.550.
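The reported regression makes the two tests inter-convertible; a small helper applying it directly (function name and the worked example are illustrative, using the 0-30 cm layer means quoted in the abstract):

```python
def p_al_from_p_dcal(p_dcal_mg_kg, soil_ph):
    """Estimate P-AL (mg/kg) from P-DCaL (mg/kg) and soil pH (CaCl2, 0.01 M),
    using the abstract's regression: P-AL = -22.593 + 5.353*pH + 1.423*P-DCaL."""
    return -22.593 + 5.353 * soil_ph + 1.423 * p_dcal_mg_kg

# 0-30 cm layer means from the abstract: pH 5.84, P-DCaL 15.32 mg/kg
print(round(p_al_from_p_dcal(15.32, 5.84), 2))
```

With R² = 0.550, the conversion is approximate; the estimate above need not match the observed mean P-AL exactly.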
Abstract:
Time correlation functions yield profound information about the dynamics of a physical system and hence are frequently calculated in computer simulations. For systems whose dynamics span a wide range of timescales, currently used methods require significant computer time and memory. In this paper, we discuss the multiple-tau correlator method for the efficient calculation of accurate time correlation functions on the fly during computer simulations. The multiple-tau correlator is efficient in its computational requirements and can be tuned to the desired level of accuracy. Further, we derive estimates for the error arising from the use of the multiple-tau correlator and extend it for use in the calculation of mean-square particle displacements and dynamic structure factors. The method described here, in hardware implementation, is routinely used in light scattering experiments but has not yet found widespread use in computer simulations.
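The multiple-tau idea can be sketched as follows: each level keeps a short buffer of p lag channels, and samples are coarse-grained (pair-averaged) before being passed to the next level, so lag coverage grows geometrically while memory stays O(p × levels). This is a simplified illustrative implementation, not the authors' exact scheme or error analysis.

```python
import numpy as np

class MultiTauCorrelator:
    """Simplified on-the-fly multiple-tau autocorrelator (illustrative sketch)."""

    def __init__(self, p=8, levels=5):
        self.p, self.levels = p, levels
        self.buf = [[] for _ in range(levels)]   # recent samples, newest first
        self.acc = np.zeros((levels, p))         # sum of products per lag channel
        self.cnt = np.zeros((levels, p))         # number of products per channel
        self.carry = [None] * levels             # pending sample for pair-averaging

    def add(self, x, level=0):
        if level >= self.levels:
            return
        buf = self.buf[level]
        buf.insert(0, x)
        if len(buf) > self.p:
            buf.pop()
        for lag, y in enumerate(buf):            # correlate x against lagged samples
            self.acc[level, lag] += x * y
            self.cnt[level, lag] += 1
        if self.carry[level] is None:            # coarse-grain pairs to the next level
            self.carry[level] = x
        else:
            self.add(0.5 * (self.carry[level] + x), level + 1)
            self.carry[level] = None

    def result(self):
        """List of (lag_time, C(lag)); lag spacing doubles at each level."""
        out = []
        for lev in range(self.levels):
            dt = 2 ** lev
            for lag in range(self.p):
                if self.cnt[lev, lag] > 0:
                    out.append((lag * dt, self.acc[lev, lag] / self.cnt[lev, lag]))
        return out
```

Feeding a constant signal yields C(lag) = 1 at every channel, a quick sanity check; the trade-off tuned by p is time resolution at long lags versus memory.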
Abstract:
Modelling the interaction of terahertz (THz) radiation with biological tissue poses many interesting problems. THz radiation is neither obviously described by an electric field distribution nor by an ensemble of photons, and biological tissue is an inhomogeneous medium with an electronic permittivity that is both spatially and frequency dependent, making it a complex system to model. A three-layer system of parallel-sided slabs has been used as the system through which the passage of THz radiation has been simulated. Two modelling approaches have been developed: a thin film matrix model and a Monte Carlo model. The source data for each of these methods, taken at the same time as the data recorded to experimentally verify them, was a THz spectrum that had passed through air only. Experimental verification of these two models was carried out using a three-layered in vitro phantom. Simulated transmission spectrum data were compared to experimental transmission spectrum data, first to determine and then to compare the accuracy of the two methods. Good agreement was found, with typical results having a correlation coefficient of 0.90 for the thin film matrix model and 0.78 for the Monte Carlo model over the full THz spectrum. Further work is underway to improve the models above 1 THz.
Abstract:
The antioxidant capacity of oak wood used in the ageing of wine was studied by four different methods: measurement of scavenging capacity against a given radical (ABTS, DPPH), oxygen radical absorbance capacity (ORAC) and ferric reducing antioxidant power (FRAP). Although the four methods tested gave comparable results for the antioxidant capacity measured in oak wood extracts, the ORAC method gave results that differed somewhat from the other methods. Non-toasted oak wood samples displayed more antioxidant power than toasted ones due to differences in polyphenol composition. A correlation analysis revealed that ellagitannins were the compounds mainly responsible for the antioxidant capacity of oak wood. Some phenolic acids, mainly gallic acid, also showed a significant correlation with antioxidant capacity.
Abstract:
Novel imaging techniques are playing an increasingly important role in drug development, providing insight into the mechanism of action of new chemical entities. The data sets obtained by these methods can be large with complex inter-relationships, but the most appropriate statistical analysis for handling this data is often uncertain - precisely because of the exploratory nature of the way the data are collected. We present an example from a clinical trial using magnetic resonance imaging to assess changes in atherosclerotic plaques following treatment with a tool compound with established clinical benefit. We compared two specific approaches to handle the correlations due to physical location and repeated measurements: two-level and four-level multilevel models. The two methods identified similar structural variables, but higher level multilevel models had the advantage of explaining a greater proportion of variation, and the modeling assumptions appeared to be better satisfied.
Abstract:
Two genetic fingerprinting techniques, pulsed-field gel electrophoresis (PFGE) and ribotyping, were used to characterize 207 Escherichia coli O157 isolates from food animals, foods of animal origin, and cases of human disease (206 of the isolates were from the United Kingdom). In addition, 164 of these isolates were also phage typed. The isolates were divided into two general groups: (i) unrelated isolates not known to be epidemiologically linked (n = 154) and originating from food animals, foods and the environment, or humans and (ii) epidemiologically related isolates (n = 53) comprised of four related groups (RGs) originating either from one farm plus the abattoir where cattle from that farm were slaughtered or from one of three different English abattoirs. PFGE was conducted with the restriction endonuclease XbaI, while for ribotyping, two restriction endonucleases (PstI and SphI) were combined to digest genomic DNAs simultaneously. The 207 E. coli O157 isolates produced 97 PFGE profiles and 51 ribotypes. The two genetic fingerprinting methods had similar powers to discriminate the 154 epidemiologically unrelated E. coli O157 isolates in the study (Simpson's index of diversity [D] = 0.98 and 0.94 for PFGE typing and ribotyping, respectively). There was no correlation between the source of an isolate (healthy meat or milk animals, retail meats, or cases of human infection) and either particular PFGE or ribotype profiles or clusters. Combination of the results of both genetic fingerprinting methods produced 146 types, significantly more than when either of the two methods was used individually. Consequently, the superior discriminatory performance of the PFGE-ribotyping combination was proven in two ways: (i) by demonstrating that the majority of the E. coli O157 isolates with unrelated histories were indeed distinguishable types and (ii) by identifying some clonal groups among two of the four RGs of E. coli O157 isolates (comprising PFGE types differing by just one or two bands), the relatedness of which would have remained unconfirmed otherwise.
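Simpson's index of diversity, the discriminatory-power measure quoted above, is the probability that two isolates drawn at random belong to different types. A minimal implementation (the example type counts are hypothetical, not the study's data):

```python
def simpsons_diversity(type_counts):
    """Simpson's index of diversity D = 1 - sum(n_j*(n_j - 1)) / (N*(N - 1)),
    where n_j is the number of isolates of type j and N the total.
    D near 1 means two random isolates are almost always distinguishable."""
    n_total = sum(type_counts)
    pairs_same_type = sum(n * (n - 1) for n in type_counts)
    return 1 - pairs_same_type / (n_total * (n_total - 1))

# Hypothetical example: 10 isolates split into types of sizes 4, 3, 2 and 1
print(simpsons_diversity([4, 3, 2, 1]))
```

D = 0.98 for PFGE versus 0.94 for ribotyping, as reported, thus means PFGE splits the unrelated isolates into finer (more discriminating) types.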
Abstract:
The Fourier series can be used to describe periodic phenomena such as the one-dimensional crystal wave function. Through the trigonometric treatments in Hückel theory it is shown that Hückel theory is a special case of Fourier series theory. Thus, the conjugated π system is in fact a periodic system. Therefore, it can be explained why such a simple theory as Hückel theory can be so powerful in organic chemistry. Although it only considers the immediate neighboring interactions, it implicitly takes account of the periodicity in the complete picture where all the interactions are considered. Furthermore, the success of the trigonometric methods in Hückel theory is not accidental, as it is based on the fact that Hückel theory is a specific example of the more general method of Fourier series expansion. It is also important for educational purposes to expand a specific approach such as Hückel theory into a more general method such as Fourier series expansion.
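The trigonometric solution referred to can be checked numerically: for a linear conjugated chain of n carbons, the closed-form Hückel energies E_k = α + 2β cos(kπ/(n+1)) agree with direct diagonalization of the tridiagonal Hückel matrix. The sketch below uses the common convention α = 0, β = -1 in units of |β|.

```python
import numpy as np

def huckel_energies(n, alpha=0.0, beta=-1.0):
    """Hückel orbital energies of a linear chain of n carbons from the
    closed-form trigonometric solution E_k = alpha + 2*beta*cos(k*pi/(n+1))."""
    k = np.arange(1, n + 1)
    return alpha + 2 * beta * np.cos(k * np.pi / (n + 1))

def huckel_energies_matrix(n, alpha=0.0, beta=-1.0):
    """Same energies obtained by diagonalizing the tridiagonal Hückel matrix."""
    h = (np.diag(np.full(n, alpha))
         + np.diag(np.full(n - 1, beta), 1)
         + np.diag(np.full(n - 1, beta), -1))
    return np.sort(np.linalg.eigvalsh(h))

# Butadiene (n = 4): the two routes give identical spectra
print(np.allclose(np.sort(huckel_energies(4)), huckel_energies_matrix(4)))
```

The cosine form is exactly a discrete Fourier-type expansion of the chain eigenfunctions, which is the connection the abstract draws.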
Abstract:
Crystallization must occur in honey in order to produce set or creamed honey; however, the process must occur in a controlled manner in order to obtain an acceptable product. As a consequence, reliable methods are needed to measure the crystal content of honey (φ, expressed as kg crystal per kg honey), which can also be implemented with relative ease in industrial production facilities. Unfortunately, suitable methods do not currently exist. This article reports on the development of 2 independent offline methods to measure the crystal content in honey based on differential scanning calorimetry and high-performance liquid chromatography. The 2 methods gave highly consistent results on the basis of a paired t-test involving 143 experimental points (P > 0.05, r² = 0.99). The crystal content also correlated with the relative viscosity, defined as the ratio of the viscosity of crystal-containing honey to that of the same honey when all crystals are dissolved, giving the following correlation: μr = 1 + 1398.8 φ^2.318. This correlation can be used to estimate the crystal content of honey in industrial production facilities. The crystal growth rate at a temperature of 14 °C, the normal crystallization temperature used in practice, was linear, and the growth rate also increased with the total glucose content in the honey.
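In practice the correlation would be used in reverse: viscosity is measured and crystal content inferred. Inverting μr = 1 + 1398.8 φ^2.318 gives φ = ((μr - 1)/1398.8)^(1/2.318); a small helper (function name illustrative):

```python
def crystal_content_from_viscosity(mu_r):
    """Estimate crystal content phi (kg crystal per kg honey) from relative
    viscosity mu_r by inverting the abstract's correlation
    mu_r = 1 + 1398.8 * phi**2.318."""
    if mu_r <= 1:
        return 0.0  # no measurable crystal phase
    return ((mu_r - 1) / 1398.8) ** (1 / 2.318)

# A honey whose viscosity has doubled relative to its fully dissolved state
print(crystal_content_from_viscosity(2.0))
```

Note how steeply the exponent acts: even a doubling of viscosity corresponds to only a few percent crystal content by mass.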
Abstract:
In an adaptive seamless phase II/III clinical trial interim analysis, data are used for treatment selection, enabling resources to be focused on comparison of more effective treatment(s) with a control. In this paper, we compare two methods recently proposed to enable use of short-term endpoint data for decision-making at the interim analysis. The comparison focuses on the power and the probability of correctly identifying the most promising treatment. We show that the choice of method depends on how well short-term data predict the best treatment, which may be measured by the correlation between treatment effects on short- and long-term endpoints.