125 results for Approximate relative percentages (wt. %)
in CentAUR: Central Archive University of Reading - UK
Abstract:
We focus on the comparison of three statistical models used to estimate the treatment effect in meta-analysis when individually pooled data are available. Two are conventional models, namely a multi-level model and a model based upon an approximate likelihood; the third is a newly developed profile likelihood model, which may be viewed as an extension of the Mantel-Haenszel approach. To exemplify these methods, we use results from a meta-analysis of 22 trials for the prevention of respiratory tract infections. We show that with the multi-level approach, in the case of baseline heterogeneity, the number of clusters or components is considerably over-estimated. The approximate and profile likelihood methods showed nearly the same pattern for the treatment effect distribution. To provide further evidence, two simulation studies were carried out. The profile likelihood can be considered a clear alternative to the approximate likelihood model. In the case of strong baseline heterogeneity, the profile likelihood method shows superior behaviour compared with the multi-level model. Copyright (C) 2006 John Wiley & Sons, Ltd.
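The profile-likelihood idea mentioned in the abstract can be illustrated with a deliberately simplified single-trial sketch: for each candidate value of the treatment effect, the nuisance baseline risk is maximized out, leaving a one-dimensional likelihood to inspect. This is not the paper's model (which extends the Mantel-Haenszel approach across 22 trials); the trial counts, the risk-ratio parameterization, and the grid search below are illustrative assumptions.

```python
import numpy as np

# One hypothetical trial: events/total in control and treatment arms
ec, nc, et, nt = 15, 100, 8, 100

def loglik(theta, pc):
    """Binomial log-likelihood with control risk pc and treatment
    risk pc*exp(theta), i.e. theta is the log risk ratio."""
    pt = min(pc * np.exp(theta), 1 - 1e-9)
    return (ec * np.log(pc) + (nc - ec) * np.log(1 - pc)
            + et * np.log(pt) + (nt - et) * np.log(1 - pt))

# Profile likelihood for theta: maximize out the nuisance baseline
# parameter pc on a grid for each candidate treatment effect.
thetas = np.linspace(-2.0, 1.0, 121)
pcs = np.linspace(1e-3, 0.5, 500)
profile = np.array([max(loglik(th, pc) for pc in pcs) for th in thetas])
theta_hat = float(thetas[np.argmax(profile)])
```

With these counts the profiled maximum sits near the empirical log risk ratio log((8/100)/(15/100)).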
Abstract:
We describe and evaluate a new estimator of the effective population size (Ne), a critical parameter in evolutionary and conservation biology. This new "SummStat" Ne estimator is based upon the use of summary statistics in an approximate Bayesian computation framework to infer Ne. Simulations of a Wright-Fisher population with known Ne show that the SummStat estimator is useful across a realistic range of individuals and loci sampled, generations between samples, and Ne values. We also address the paucity of information about the relative performance of Ne estimators by comparing the SummStat estimator to two recently developed likelihood-based estimators and a traditional moment-based estimator. The SummStat estimator is the least biased of the four estimators compared: in 32 of the 36 parameter combinations investigated using initial allele frequencies drawn from a Dirichlet distribution, it has the lowest bias. The relative mean square error (RMSE) of the SummStat estimator was generally intermediate to the others. All of the estimators had RMSE > 1 when small samples (n = 20, five loci) were collected a generation apart. In contrast, when samples were separated by three or more generations and Ne ≤ 50, the SummStat and likelihood-based estimators all had greatly reduced RMSE. Under the conditions simulated, SummStat confidence intervals were more conservative than those of the likelihood-based estimators and more likely to include the true Ne. The greatest strength of the SummStat estimator is its flexible structure: this flexibility allows it to incorporate any potentially informative summary statistic from population genetic data.
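The approximate Bayesian computation framework behind the SummStat estimator can be sketched with rejection sampling: simulate Wright-Fisher drift under candidate Ne values and keep those candidates whose summary statistic lands near the observed one. This is a minimal illustration, not the published estimator; the single temporal F-type summary statistic, the uniform prior on Ne, and the acceptance tolerance are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def wright_fisher(p, ne, generations):
    """Drift a vector of allele frequencies forward under Wright-Fisher
    binomial sampling in a diploid population of size ne."""
    for _ in range(generations):
        p = rng.binomial(2 * ne, p) / (2 * ne)
    return p

def temporal_f(p0, p1):
    """Standardized temporal variance in allele frequency across loci
    (an F-statistic-like summary; larger values suggest smaller Ne)."""
    pbar = (p0 + p1) / 2
    return np.mean((p0 - p1) ** 2 / np.maximum(pbar * (1 - pbar), 1e-12))

# "Observed" data: 20 loci sampled 5 generations apart, true Ne = 50
true_ne, gens = 50, 5
p_start = rng.uniform(0.2, 0.8, 20)
s_obs = temporal_f(p_start, wright_fisher(p_start, true_ne, gens))

# ABC rejection: uniform prior on Ne; accept candidates whose simulated
# summary statistic falls within a tolerance of the observed one.
accepted = [int(ne) for ne in rng.integers(10, 300, 3000)
            if abs(temporal_f(p_start, wright_fisher(p_start, int(ne), gens)) - s_obs) < 0.01]
ne_hat = float(np.median(accepted))
```

The accepted values form an approximate posterior sample for Ne; a tighter tolerance or more informative summary statistics would sharpen it.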
Abstract:
Background: The large-scale production of G-protein coupled receptors (GPCRs) for functional and structural studies remains a challenge. Recent successes have been made in the expression of a range of GPCRs using Pichia pastoris as an expression host. P. pastoris has a number of advantages over other expression systems, including the ability to post-translationally modify expressed proteins, relatively low production costs, and the ability to grow to very high cell densities. Several previous studies have described the expression of GPCRs in P. pastoris using shaker flasks, which allow culturing of small volumes (500 ml) at moderate cell densities (OD600 ~15). The use of bioreactors, which allow straightforward culturing of large volumes together with optimal control of growth parameters, including pH and dissolved oxygen, to maximise cell densities and expression of the target receptors, is an attractive alternative. The aim of this study was to compare the levels of expression of the human adenosine 2A receptor (A(2A)R) in P. pastoris under the control of a methanol-inducible promoter in both flask and bioreactor cultures. Results: Bioreactor cultures yielded an approximately five-fold increase in cell density (OD600 ~75) compared to flask cultures prior to induction, and a doubling in functional expression level per mg of membrane protein, representing a significant optimisation. Furthermore, a C-terminally truncated A(2A)R, terminating at residue V334, yielded the highest expression levels (200 pmol/mg) so far reported for this receptor in P. pastoris. This truncated form of the receptor was also found to be resistant to C-terminal degradation, in contrast to the WT A(2A)R, and is therefore more suitable for further functional and structural studies. Conclusion: Large-scale expression of the A(2A)R in P. pastoris bioreactor cultures results in significant increases in functional expression compared to traditional flask cultures.
Abstract:
The Gauss–Newton algorithm is an iterative method regularly used for solving nonlinear least squares problems. It is particularly well suited to the treatment of very large scale variational data assimilation problems that arise in atmosphere and ocean forecasting. The procedure consists of a sequence of linear least squares approximations to the nonlinear problem, each of which is solved by an “inner” direct or iterative process. In comparison with Newton’s method and its variants, the algorithm is attractive because it does not require the evaluation of second-order derivatives in the Hessian of the objective function. In practice the exact Gauss–Newton method is too expensive to apply operationally in meteorological forecasting, and various approximations are made in order to reduce computational costs and to solve the problems in real time. Here we investigate the effects on the convergence of the Gauss–Newton method of two types of approximation used commonly in data assimilation. First, we examine “truncated” Gauss–Newton methods where the inner linear least squares problem is not solved exactly, and second, we examine “perturbed” Gauss–Newton methods where the true linearized inner problem is approximated by a simplified, or perturbed, linear least squares problem. We give conditions ensuring that the truncated and perturbed Gauss–Newton methods converge and also derive rates of convergence for the iterations. The results are illustrated by a simple numerical example. A practical application to the problem of data assimilation in a typical meteorological system is presented.
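A truncated Gauss-Newton iteration of the kind analysed here can be sketched in a few lines: the outer loop linearizes the residual, and the inner linear least-squares problem is solved only approximately by a fixed, small number of conjugate-gradient steps on the normal equations. The exponential-fit example and the iteration counts below are illustrative choices, not taken from the paper.

```python
import numpy as np

def truncated_gauss_newton(r, J, x0, inner=5, outer=20, tol=1e-10):
    """Minimize 0.5*||r(x)||^2. Each outer step solves the linearized problem
    min ||J(x) dx + r(x)||^2 only approximately ("truncated"), using a fixed
    number of conjugate-gradient iterations on the normal equations."""
    x = np.asarray(x0, dtype=float)
    for _ in range(outer):
        Jx, rx = J(x), r(x)
        b = -Jx.T @ rx                       # negative gradient of the objective
        if np.linalg.norm(b) < tol:
            break
        A = Jx.T @ Jx
        dx, res, p = np.zeros_like(x), b.copy(), b.copy()
        for _ in range(inner):               # truncated inner solve of A dx = b
            if res @ res < 1e-30:
                break
            Ap = A @ p
            alpha = (res @ res) / (p @ Ap)
            dx = dx + alpha * p
            res_new = res - alpha * Ap
            p = res_new + ((res_new @ res_new) / (res @ res)) * p
            res = res_new
        x = x + dx
    return x

# Illustrative example: fit y = a*exp(b*t) to noise-free synthetic data
t = np.linspace(0.0, 1.0, 30)
y = 2.0 * np.exp(-1.5 * t)
r = lambda x: x[0] * np.exp(x[1] * t) - y                     # residual vector
J = lambda x: np.column_stack([np.exp(x[1] * t),              # d r / d a
                               x[0] * t * np.exp(x[1] * t)])  # d r / d b
x_hat = truncated_gauss_newton(r, J, [1.0, 0.0])
```

Because this test problem has a zero residual at the solution, the truncated iteration still recovers (a, b) = (2, -1.5) accurately; with inexact inner solves on noisier problems, the convergence rates derived in the paper govern how much truncation can be tolerated.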
Abstract:
Chemical methods to predict the bioavailable fraction of organic contaminants are usually validated in the literature by comparison with established bioassays. A soil spiked with polycyclic aromatic hydrocarbons (PAHs) was aged over six months and subjected to butanol, cyclodextrin and tenax extractions, as well as an exhaustive extraction to determine total PAH concentrations, at several time points. Earthworm (Eisenia fetida) and rye grass root (Lolium multiflorum) accumulation bioassays were conducted in parallel. Butanol extractions gave the best relationship with earthworm accumulation (r2 ≤ 0.54, p ≤ 0.01); cyclodextrin, butanol and acetone–hexane extractions all gave good predictions of accumulation in rye grass roots (r2 ≤ 0.86, p ≤ 0.01). However, the profile of the PAHs extracted by the different chemical methods was significantly different (p < 0.01) from that accumulated in the organisms. Biota accumulated a higher proportion of the heavier 4-ringed PAHs. It is concluded that bioaccumulation is a complex process that cannot be predicted by measuring the bioavailable fraction alone. The ability of chemical methods to predict PAH accumulation in Eisenia fetida and Lolium multiflorum was hindered by the varied metabolic fate of the different PAHs within the organisms.
Abstract:
Key climate feedbacks due to water vapor and clouds rest largely on how relative humidity R changes in a warmer climate, yet this has not been extensively analyzed in models. General circulation models (GCMs) from the CMIP3 archive and several higher resolution atmospheric GCMs examined here generally predict a characteristic pattern of R trend with global temperature that has been reported previously in individual models, including increase around the tropopause, decrease in the tropical upper troposphere, and decrease in midlatitudes. This pattern is very similar to that previously reported for cloud cover in the same GCMs, confirming the role of R in controlling changes in simulated cloud. Comparing different models, the trend in each part of the troposphere is approximately proportional to the upward and/or poleward gradient of R in the present climate. While this suggests that the changes simply reflect a shift of the R pattern upward with the tropopause and poleward with the zonal jets, the drying trend in the subtropics is roughly three times too large to be attributable to shifts of subtropical features, and the subtropical R minima deepen in most models. R trends are correlated with horizontal model resolution, especially outside the tropics, where they show signs of convergence and latitudinal gradients become close to available observations for GCM resolutions near T85 and higher. We argue that much of the systematic change in R can be explained by the local specific humidity having been set (by condensation) in remote regions with different temperature changes, hence the gradients and trends each depend on a model’s ability to resolve moisture transport. Finally, subtropical drying trends predicted from the warming alone fall well short of those observed in recent decades. While this discrepancy supports previous reports of GCMs underestimating Hadley Cell expansion, our results imply that shifts alone are not a sufficient interpretation of changes.
Abstract:
Trace elements may present an environmental hazard in the vicinity of mining and smelting activities. However, the factors controlling their distribution and transfer within the soil and vegetation systems are not always well defined. Total concentrations of up to 15,195 mg·kg⁻¹ As, 6,690 mg·kg⁻¹ Cu, 24,820 mg·kg⁻¹ Pb and 9,810 mg·kg⁻¹ Zn in soils, and 62 mg·kg⁻¹ As, 1,765 mg·kg⁻¹ Cu, 280 mg·kg⁻¹ Pb and 3,460 mg·kg⁻¹ Zn in vegetation, were measured. However, unusually for smelters and mines of a similar size, the elevated trace element concentrations in soils were found to be restricted to the immediate vicinity of the mines and smelters (maximum 2-3 km). Parent material, prevailing wind direction, and soil physical and chemical characteristics were found to correlate poorly with the restricted trace element distributions in soils. Two hypotheses are given for this unusual distribution: (1) the contaminated soils were removed by erosion, or (2) the mines and smelters released large, heavy particles that could not have been transported over long distances. Analyses of the accumulation of trace elements in vegetation (median ratios: As 0.06, Cu 0.19, Pb 0.54 and Zn 1.07) and of the percentage of total trace elements that was DTPA-extractable in soils (median percentages: As 0.06%, Cu 15%, Pb 7% and Zn 4%) indicated higher relative trace element mobility in soils with low total concentrations than in soils with elevated concentrations.
Abstract:
This study evaluates computer-generated written explanations about drug prescriptions that are based on an analysis of both patient and doctor informational needs. Three experiments examine the effects of varying the type of information given about the possible side effects of the medication, and the order of information within the explanation. Experiment 1 investigated the effects of these two factors on people's ratings of how good they consider the explanations to be and of their perceived likelihood of taking the medication, as well as on their memory for the information in the explanation. Experiment 2 further examined the effects of varying information about side effects by separating out the contributions of the number and severity of side effects. It was found that participants did not "like" explanations that described severe side effects, and judged that they would be less likely to take the medication if given such explanations. Experiment 3 therefore investigated whether information about severe side effects could be presented in such a way as to increase judgements of how good the explanations are thought to be, as well as the perceived likelihood of adherence. The results showed some benefits of providing additional explanatory information.
Abstract:
We consider the application of the conjugate gradient method to the solution of large, symmetric indefinite linear systems. Special emphasis is put on the use of constraint preconditioners and a new factorization that can reduce the number of flops required by the preconditioning step. Results concerning the eigenvalues of the preconditioned matrix and its minimum polynomial are given. Numerical experiments validate these conclusions.
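The headline spectral result for constraint preconditioners can be checked numerically in a few lines: for a saddle-point matrix with constraint block B (m rows), a preconditioner that reproduces B exactly but replaces the (1,1) block with an approximation G yields a preconditioned matrix having the eigenvalue 1 with algebraic multiplicity at least 2m. The dimensions and the choice G = I below are illustrative assumptions, not the factorization proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 8, 3

# Symmetric indefinite saddle-point system K = [[A, B^T], [B, 0]]
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                      # symmetric, generally indefinite
B = rng.standard_normal((m, n))        # full-rank constraint block
Z = np.zeros((m, m))
K = np.block([[A, B.T], [B, Z]])

# Constraint preconditioner: keep B exactly, approximate A by G (here G = I)
G = np.eye(n)
P = np.block([[G, B.T], [B, Z]])

# The preconditioned matrix P^{-1} K has eigenvalue 1 with multiplicity >= 2m;
# the remaining n - m eigenvalues are generalized Rayleigh quotients of A and G
# restricted to the nullspace of B.
eigs = np.linalg.eigvals(np.linalg.solve(P, K))
n_unit = int(np.sum(np.abs(eigs - 1.0) < 1e-5))
```

This eigenvalue clustering is what lets the preconditioned conjugate-gradient iteration terminate in few steps: the minimum polynomial of the preconditioned matrix has low degree.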
Abstract:
An efficient method is described for the approximate calculation of the intensity of multiply scattered lidar returns. It divides the outgoing photons into three populations, representing those that have experienced zero, one, and more than one forward-scattering event. Each population is parameterized at each range gate by its total energy, its spatial variance, the variance of photon direction, and the covariance of photon direction and position. The result is that for an N-point profile the calculation is O(N^2) efficient and implicitly includes up to Nth-order scattering, making it ideal for use in iterative retrieval algorithms for which speed is crucial. In contrast, models that explicitly consider each scattering order separately are at best O(N^m/m!) efficient for m-order scattering and often cannot be performed beyond the third or fourth order in retrieval algorithms. For typical cloud profiles and a wide range of lidar fields of view, the new algorithm is as accurate as an explicit calculation truncated at the fifth or sixth order, but faster by several orders of magnitude. (C) 2006 Optical Society of America.
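The moment bookkeeping behind such a photon population can be sketched with small-angle (Fermi-Eyges-style) propagation: over each range gate the positional variance grows from the angular variance, while forward scattering injects new angular variance in proportion to the scattering optical depth of the gate. This is only a sketch of the general idea under illustrative parameters; it is not the paper's algorithm and omits its O(N^2) receiver integration entirely.

```python
import numpy as np

# Illustrative profile: N range gates of width dr (metres)
N, dr = 50, 30.0
alpha = np.full(N, 5e-4)          # scattering coefficient per metre (illustrative)
theta2_scatter = 0.01 ** 2        # variance of the forward-scattering angle (rad^2)

# Population moments: total energy, spatial variance x_var,
# angular variance th_var, and position-direction covariance cov
energy, x_var, th_var, cov = 1.0, 0.0, 0.0, 0.0
profile = []
for i in range(N):
    # Free propagation over one gate: positional spread grows from angular spread
    x_var += 2.0 * cov * dr + th_var * dr ** 2
    cov += th_var * dr
    # Forward scattering within the gate adds angular variance in proportion
    # to the gate's scattering optical depth, and attenuates the energy
    dtau = alpha[i] * dr
    th_var += dtau * theta2_scatter
    energy *= np.exp(-dtau)
    profile.append((energy, x_var, th_var, cov))
```

Carrying only these four moments per population per gate is what keeps the cost linear in N for each population, in contrast to tracking every scattering order explicitly.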