982 results for PREDICTIONS
Abstract:
Lettuce greenhouse experiments were carried out from March to June 2011 to analyze how pesticides behave from the time of application until their intake via human consumption, taking into account the primary distribution of pesticides, field dissipation, and post-harvest processing. In addition, the experimental conditions were used to evaluate a new dynamic plant uptake model by comparing its results with the experimentally derived residues. One application of imidacloprid and two of azoxystrobin were conducted. To evaluate primary pesticide distribution, two approaches based on leaf area index and vegetation cover were used, and the results were compared with those obtained from a tracer test. Lettuce density, growth stage, and type of sprayer strongly influenced the primary distribution, with low densities or early growth stages implying high pesticide losses to the soil. Washed and unwashed lettuce samples were taken and analyzed from application to harvest to evaluate the removal of pesticides by food processing. Residues found at the Spanish pre-harvest interval were in all cases below the officially set maximum residue limits, although the time between application and harvest proved as important for residues as the application amounts. Washing the lettuce reduced pesticide residues by an overall 40–60%. Experimentally derived residues deviated from the modeled residues by factors of 1.2 and 1.4 for imidacloprid and azoxystrobin, respectively, indicating good model predictions. Resulting human intake fractions range from ... for imidacloprid to ... for azoxystrobin.
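The abstract does not give the model equations; the following is a minimal sketch of how a primary-distribution and dissipation calculation of this kind is often set up, assuming a vegetation-cover interception fraction, first-order dissipation on the leaf, and a fixed washing removal factor. All parameter values and the function itself are illustrative assumptions, not the study's model.

```python
import numpy as np

def residue_at_harvest(applied_g_per_ha, veg_cover, k_dissipation_per_day,
                       days_to_harvest, washing_removal=0.5):
    """Illustrative residue estimate (not the study's model).

    veg_cover             -- fraction of ground covered by the crop (0-1);
                             used here as the fraction of spray intercepted.
    k_dissipation_per_day -- assumed first-order dissipation rate on the leaf.
    washing_removal       -- fraction removed by washing (abstract reports 40-60%).
    """
    intercepted = applied_g_per_ha * veg_cover          # remainder is lost to soil
    on_crop = intercepted * np.exp(-k_dissipation_per_day * days_to_harvest)
    return on_crop * (1.0 - washing_removal)

# Example: low vegetation cover (early growth stage) means most of the dose
# reaches the soil, which is the qualitative result reported in the abstract.
print(residue_at_harvest(100.0, veg_cover=0.3, k_dissipation_per_day=0.1,
                         days_to_harvest=14))
```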
Abstract:
Aims: Plasma concentrations of imatinib differ largely between patients despite the same dosage, owing to large inter-individual variability in pharmacokinetic (PK) parameters. As the drug concentration at the end of the dosage interval (Cmin) correlates with treatment response and tolerability, monitoring of Cmin is suggested for therapeutic drug monitoring (TDM) of imatinib. Due to logistic difficulties, however, random sampling during the dosage interval is often performed in clinical practice, rendering the respective results uninformative regarding Cmin values. Objectives: (I) To extrapolate randomly measured imatinib concentrations to the more informative Cmin using classical Bayesian forecasting. (II) To extend the classical Bayesian method to account for correlation between PK parameters. (III) To evaluate the predictive performance of both methods. Methods: 31 paired blood samples (random and trough levels) were obtained from 19 cancer patients receiving imatinib. Two Bayesian maximum a posteriori (MAP) methods were implemented: (A) a classical method ignoring correlation between PK parameters, and (B) an extended one accounting for correlation. Both methods were applied to estimate individual PK parameters, conditional on the random observations and covariate-adjusted priors from a population PK model. The PK parameter estimates were then used to calculate trough levels. Relative prediction errors (PE) were analyzed to evaluate accuracy (one-sample t-test) and to compare precision between the methods (F-test to compare variances). Results: Both Bayesian MAP methods allowed non-biased predictions of individual Cmin compared to observations: (A) -7% mean PE (CI95% -18 to 4%, p = 0.15) and (B) -4% mean PE (CI95% -18 to 10%, p = 0.69). Relative standard deviations of actual observations from predictions were 22% (A) and 30% (B), i.e. comparable to the reported intra-individual variability. Precision was not improved by taking correlation between PK parameters into account (p = 0.22). Conclusion: Clinical interpretation of randomly measured imatinib concentrations can be assisted by Bayesian extrapolation to maximum likelihood Cmin. Classical Bayesian estimation can be applied for TDM without the need to include correlation between PK parameters. Both methods could be adapted in the future to evaluate other individual pharmacokinetic measures correlated with clinical outcomes, such as the area under the curve (AUC).
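The abstract does not spell out the estimation step. The sketch below only illustrates the general idea of Bayesian MAP estimation for a one-compartment oral-absorption model with an uncorrelated (diagonal) prior, corresponding to the "classical" variant (A): individual clearance and volume are conditioned on a single randomly timed sample, then the trough at the end of the dosing interval is predicted. The population values, variances, and model structure are illustrative assumptions, not the imatinib population model used in the study.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative population priors (log-normal); NOT the published imatinib model.
POP = {"CL": 14.0, "V": 350.0, "ka": 0.6}   # clearance (L/h), volume (L), absorption rate (1/h)
OMEGA = {"CL": 0.3, "V": 0.25}              # assumed inter-individual SDs (log scale)
SIGMA = 0.2                                 # assumed proportional residual error

def conc(t, dose, cl, v, ka):
    """One-compartment model with first-order absorption, single oral dose."""
    ke = cl / v
    return dose * ka / (v * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

def map_estimate(t_obs, c_obs, dose):
    """MAP estimate of individual CL and V from one randomly timed sample."""
    def neg_log_post(x):
        cl, v = np.exp(x)
        pred = max(conc(t_obs, dose, cl, v, POP["ka"]), 1e-9)
        lik = ((np.log(c_obs) - np.log(pred)) / SIGMA) ** 2
        pri = ((x[0] - np.log(POP["CL"])) / OMEGA["CL"]) ** 2 \
            + ((x[1] - np.log(POP["V"])) / OMEGA["V"]) ** 2
        return 0.5 * (lik + pri)
    x0 = np.log([POP["CL"], POP["V"]])
    return np.exp(minimize(neg_log_post, x0, method="Nelder-Mead").x)

# A random sample drawn 6 h after a 400 mg dose, extrapolated to the 24 h trough.
cl_i, v_i = map_estimate(t_obs=6.0, c_obs=2.1, dose=400.0)
print("predicted Cmin (mg/L):", round(conc(24.0, 400.0, cl_i, v_i, POP["ka"]), 3))
```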
Abstract:
We study the determinants of political myopia in a rational model of electoral accountability in which the key elements are informational frictions and uncertainty. We build a framework where political ability is ex ante unknown and policy choices are not perfectly observable. On the one hand, elections improve accountability and make it possible to retain well-performing incumbents. On the other, politicians invest too little in costly policies with future returns in an attempt to signal high ability and increase their reelection probability. Contrary to the conventional wisdom, uncertainty reduces political myopia and may, under some conditions, increase social welfare. We use the model to study how political rewards can be set so as to maximise social welfare, and the desirability of imposing a one-term limit on governments. The predictions of our theory are consistent with a number of stylised facts and with a new empirical observation documented in this paper: aggregate uncertainty, measured by economic volatility, is associated with better fiscal discipline in a panel of 20 OECD countries.
Abstract:
Information about the genomic coordinates and the sequences of experimentally identified transcription factor binding sites is scattered across a variety of formats. The availability of standard collections of such high-quality data is important to design, evaluate and improve novel computational approaches to identify binding motifs in promoter sequences from related genes. ABS (http://genome.imim.es/datasets/abs2005/index.html) is a public database of known binding sites identified in promoters of orthologous vertebrate genes that have been manually curated from the bibliography. We have annotated 650 experimental binding sites from 68 transcription factors and 100 orthologous target genes in the human, mouse, rat or chicken genome sequences. Computational predictions and promoter alignment information are also provided for each entry. A simple and easy-to-use web interface facilitates data retrieval, allowing different views of the information. In addition, release 1.0 of ABS includes a customizable generator of artificial datasets based on the known sites contained in the collection, and an evaluation tool to aid in the training and assessment of motif-finding programs.
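As a rough illustration of the kind of assessment such an evaluation tool supports, the sketch below computes site-level precision and recall of predicted binding sites against curated annotations using coordinate overlap. The overlap criterion, threshold, and data layout are assumptions for illustration, not ABS's actual evaluation protocol.

```python
def overlaps(a, b, min_frac=0.5):
    """True if intervals a=(start, end) and b overlap by >= min_frac of the shorter one."""
    ov = min(a[1], b[1]) - max(a[0], b[0])
    return ov >= min_frac * min(a[1] - a[0], b[1] - b[0])

def site_level_scores(predicted, annotated, min_frac=0.5):
    """Precision/recall of predicted sites against curated sites (coordinate pairs)."""
    tp_pred = sum(any(overlaps(p, a, min_frac) for a in annotated) for p in predicted)
    tp_anno = sum(any(overlaps(a, p, min_frac) for p in predicted) for a in annotated)
    precision = tp_pred / len(predicted) if predicted else 0.0
    recall = tp_anno / len(annotated) if annotated else 0.0
    return precision, recall

# Toy example: two predictions against three curated sites on the same promoter.
print(site_level_scores([(120, 132), (300, 310)], [(118, 130), (200, 212), (305, 318)]))
```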
Abstract:
Public goods cooperation is common in microbes, and there is much interest in understanding how such traits evolve. Research in recent years has identified several important factors that shape the evolutionary dynamics of such systems, yet few studies have investigated scenarios involving interactions between multiple public goods. Here, we offer general predictions about the evolutionary trajectories of two public goods traits having positive, negative or neutral regulatory influence on one another's expression, and we report on a test of some of our predictions in the context of Pseudomonas aeruginosa's production of two interlinked iron-scavenging siderophores. First, we confirmed that both pyoverdine and pyochelin siderophores do operate as public goods under appropriate environmental conditions. We then tracked their production in lines experimentally evolved under different iron-limitation regimes known to favour different siderophore expression profiles. Under strong iron limitation, where pyoverdine represses pyochelin, we saw a decline in pyoverdine and a concomitant increase in pyochelin - consistent with expansion of pyoverdine-defective cheats derepressed for pyochelin. Under moderate iron limitation, pyochelin declined - again consistent with an expected cheat invasion scenario - but there was no concomitant shift in pyoverdine because cross-suppression between the traits is unidirectional only. Alternating exposure to strong and moderate iron limitation caused qualitatively similar though lesser shifts compared to the constant-environment regimes. Our results confirm that the regulatory interconnections between public goods traits can significantly modulate the course of evolution, yet also suggest how we can start to predict the impacts such complexities will have on phenotypic divergence and community stability.
Abstract:
A network of 25 sonic stage sensors was deployed in the Squaw Creek basin upstream of Ames, Iowa, to determine whether the state-of-the-art distributed hydrological model CUENCAS can produce reliable information for all road crossings, including those that cross small creeks draining basins as small as 1 sq. mile. A hydraulic model was implemented for the major tributaries of the Squaw Creek where IFC sonic instruments were deployed, and it was coupled to CUENCAS to validate the predictions made at small tributaries in the basin. This study demonstrates that the predictions made by the hydrological model at internal locations in the basin are as accurate as the predictions made at the outlet of the basin. Final rating curves based on surveyed cross sections were developed for the 22 IFC-bridge sites that are currently operating, and routine forecasts are provided at those locations (see IFIS). Rating curves were also developed for 60 additional bridge locations in the basin; however, we do not use those rating curves for routine forecasting because the accuracy of the LiDAR-derived cross sections is not sufficient. The results of our work form the basis for two papers that have been submitted for publication to the Journal of Hydrological Engineering. Peer review of our work will give us a strong footing to expand our results from the pilot Squaw Creek basin to all basins in Iowa.
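The report does not detail how the rating curves were constructed; the snippet below is a minimal sketch of fitting the usual power-law rating curve form, Q = a(h - h0)^b, to surveyed stage-discharge pairs and then converting a sonic stage reading into a discharge estimate. The data values are made up.

```python
import numpy as np
from scipy.optimize import curve_fit

def rating_curve(h, a, b, h0):
    """Power-law rating curve: discharge as a function of stage."""
    return a * np.clip(h - h0, 1e-6, None) ** b

# Made-up surveyed stage (m) / discharge (m^3/s) pairs for one bridge site.
stage = np.array([0.8, 1.1, 1.5, 2.0, 2.6, 3.3])
discharge = np.array([1.7, 4.0, 8.4, 15.5, 27.0, 43.5])

params, _ = curve_fit(rating_curve, stage, discharge, p0=[5.0, 1.7, 0.5])
a, b, h0 = params
print(f"Q = {a:.2f} * (h - {h0:.2f})^{b:.2f}")

# Convert a sonic stage reading (m) into a discharge estimate for forecasting.
print("estimated discharge at h = 1.8 m:", rating_curve(1.8, *params))
```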
Abstract:
The problem of how cooperation can evolve between individuals or entities with conflicting interests is central to biology, as many of the major evolutionary transitions, from the first replicating molecules to human societies, have required solving this problem. There are many routes to cooperation, but humans seem to be distinct from other species in having more complex and diverse mechanisms, often due to their higher cognitive skills, that allow them to reap the benefits of living in groups. Among those mechanisms, the use of reputation or past experience with others, as well as sanctioning mechanisms, both seem to be of major importance. They have often been considered separately, but the interaction between the two might provide new insights into how punishment could have appeared as a means to enforce cooperation in early humans. In this thesis, I first use theoretical approaches from evolutionary game theory to investigate the evolution of punishment and cooperation through a reputation system based on punitive actions, and compare the efficacy of this system, in terms of cooperation achieved, with one based on cooperative actions. I then use empirical approaches from economics to test, in real life, predictions from theoretical models, and also to explore further conditions such as environmental variation, constrained memory, and the scale of competition between individuals. Both approaches contribute to the understanding of how these factors affect reputation and punishment use, and ultimately how cooperation is achieved.
Abstract:
Automatic environmental monitoring networks supported by wireless communication technologies provide large and ever-increasing volumes of data. The use of this information in natural hazard research is an important issue. Particularly useful for risk assessment and decision making are spatial maps of hazard-related parameters produced from point observations and available auxiliary information. The purpose of this article is to present and explore appropriate tools to process large amounts of available data and produce predictions at fine spatial scales. These are the algorithms of machine learning, which are aimed at non-parametric, robust modelling of non-linear dependencies from empirical data. The computational efficiency of the data-driven methods allows the prediction maps to be produced in real time, which makes them superior to physical models for operational use in risk assessment and mitigation. This situation is encountered in particular in the spatial prediction of climatic variables (topo-climatic mapping). In the complex topographies of mountainous regions, meteorological processes are highly influenced by the relief. The article shows how these relations, possibly regionalized and non-linear, can be modelled from data using information from digital elevation models. The particular illustration of the developed methodology concerns the mapping of temperatures (including situations of Föhn and temperature inversion) given measurements taken from the Swiss meteorological monitoring network. The range of methods used in the study includes data-driven feature selection, support vector algorithms and artificial neural networks.
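As a schematic of the data-driven approach described here, the sketch below trains a support vector regression on station coordinates, elevation, and a couple of DEM-derived terrain features to predict temperature at an unsampled grid node. The features and synthetic data are placeholders, not the Swiss network's actual inputs or the article's model configuration.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic "stations": x, y (km), elevation (m), slope (deg), curvature.
n = 200
X = np.column_stack([
    rng.uniform(0, 100, n),        # x coordinate
    rng.uniform(0, 100, n),        # y coordinate
    rng.uniform(300, 3000, n),     # elevation from a DEM
    rng.uniform(0, 40, n),         # slope
    rng.normal(0, 1, n),           # curvature
])
# Toy temperature: lapse rate with elevation plus noise (placeholder physics).
y = 15.0 - 0.0065 * X[:, 2] + rng.normal(0, 0.8, n)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.2))
model.fit(X, y)

# Predict at an unsampled grid node described by the same feature vector.
node = np.array([[42.0, 57.0, 1800.0, 12.0, 0.3]])
print("predicted temperature (degC):", model.predict(node)[0])
```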
Abstract:
Due to advances in sensor networks and remote sensing technologies, the acquisition and storage rates of meteorological and climatological data increase every day and call for novel and efficient processing algorithms. A fundamental problem of data analysis and modeling is the spatial prediction of meteorological variables in complex orography, which serves, among other purposes, extended climatological analyses, the assimilation of data into numerical weather prediction models, the preparation of inputs to hydrological models, and real-time monitoring and short-term forecasting of weather. In this thesis, a new framework for spatial estimation is proposed by taking advantage of a class of algorithms emerging from statistical learning theory. Nonparametric kernel-based methods for nonlinear data classification, regression and target detection, known as support vector machines (SVM), are adapted for the mapping of meteorological variables in complex orography. With the advent of high-resolution digital elevation models, the field of spatial prediction has met new horizons. In fact, by exploiting image processing tools along with physical heuristics, a large number of terrain features which account for the topographic conditions at multiple spatial scales can be extracted. Such features are highly relevant for the mapping of meteorological variables because they control a considerable part of the spatial variability of meteorological fields in the complex Alpine orography. For instance, patterns of orographic rainfall, wind speed and cold-air pools are known to be correlated with particular terrain forms, e.g. convex/concave surfaces and upwind sides of mountain slopes. Kernel-based methods are employed to learn the nonlinear statistical dependence which links the multidimensional space of geographical and topographic explanatory variables to the variable of interest, that is, the wind speed as measured at the weather stations or the occurrence of orographic rainfall patterns as extracted from sequences of radar images. Compared to low-dimensional models integrating only the geographical coordinates, the proposed framework opens a way to regionalize meteorological variables which are multidimensional in nature and rarely show spatial auto-correlation in the original space, making the use of classical geostatistics cumbersome. The challenges explored during the thesis are manifold. First, the complexity of the models is optimized to impose appropriate smoothness properties and reduce the impact of noisy measurements. Secondly, a multiple kernel extension of SVM is considered to select the multiscale features which explain most of the spatial variability of wind speed. Then, SVM target detection methods are implemented to describe the orographic conditions which cause persistent and stationary rainfall patterns. Finally, the optimal splitting of the data is studied to estimate realistic performances and confidence intervals characterizing the uncertainty of predictions. The resulting maps of average wind speeds find applications within renewable resource assessment and open a route to decreasing the temporal scale of analysis to meet hydrological requirements. Furthermore, the maps depicting the susceptibility to orographic rainfall enhancement can be used to improve current radar-based quantitative precipitation estimation and forecasting systems, and to generate stochastic ensembles of precipitation fields conditioned upon the orography.
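The thesis's multiple kernel extension is not specified in the abstract; the following sketch only illustrates the generic idea of combining RBF kernels computed at different length scales (standing in for features at multiple spatial scales) into a single precomputed kernel for support vector regression. The kernel weights, bandwidths, and data are illustrative assumptions.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.normal(size=(150, 6))                                   # terrain features at several scales
y = np.sin(X[:, 0]) + 0.3 * X[:, 3] + rng.normal(0, 0.1, 150)   # toy wind-speed target

def combined_kernel(A, B, gammas=(0.1, 1.0, 10.0), weights=(0.5, 0.3, 0.2)):
    """Convex combination of RBF kernels with different bandwidths."""
    return sum(w * rbf_kernel(A, B, gamma=g) for g, w in zip(gammas, weights))

svr = SVR(kernel="precomputed", C=10.0)
svr.fit(combined_kernel(X, X), y)                               # Gram matrix on training data

X_new = rng.normal(size=(5, 6))
print(svr.predict(combined_kernel(X_new, X)))                   # kernel between new and training points
```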
Abstract:
We study the effects of the adoption of new agricultural technologies on structural transformation. To guide the empirical work, we present a simple model where the effect of agricultural productivity on industrial development depends on the factor bias of technical change. We test the predictions of the model by studying the introduction of genetically engineered soybean seeds in Brazil, which had heterogeneous effects on agricultural productivity across areas with different soil and weather characteristics. We find that technical change in soy production was strongly labor saving and led to industrial growth, as predicted by the model.
Abstract:
We have studied the nucleation and the physical properties of a -1/2 wedge disclination line near the free surface of a confined nematic liquid crystal. The position of the disclination line has been related to the material parameters (elastic constants, anchoring energy, and favored anchoring angle of the molecules at the free surface). The use of a planar model for the structure of the director field (whose predictions have been contrasted to those of a fully three-dimensional model) has allowed us to relate the experimentally observed position of the disclination line to the relevant properties of the liquid crystals. In particular, we have been able to observe the collapse of the disclination line due to a temperature-induced anchoring-angle transition, which has allowed us to rule out the presence of a real disclination line near the nematic/isotropic front in directional growth experiments.
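For context, a planar director-field description of the kind mentioned here is commonly written, in the one-constant approximation, as a bulk Frank elastic energy plus a Rapini-Papoular anchoring term at the free surface. The expression below is the standard textbook form, given only as a sketch of the ingredients named in the abstract (elastic constant, anchoring energy, favored anchoring angle), not necessarily the exact functional used by the authors.

```latex
F[\theta] = \frac{K}{2}\int_{\Omega} \lvert \nabla\theta \rvert^{2}\,\mathrm{d}A
          + \frac{W}{2}\int_{\partial\Omega_{\mathrm{free}}} \sin^{2}\!\left(\theta-\theta_{0}\right)\mathrm{d}s
```

Here θ is the in-plane director angle, K the single elastic constant, W the anchoring strength and θ0 the favored anchoring angle at the free surface; the equilibrium texture, including the position of the -1/2 disclination line, follows from minimizing F.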
Abstract:
Aim: To evaluate the effects of using distinct alternative sets of climatic predictor variables on the performance, spatial predictions and future projections of species distribution models (SDMs) for rare plants in an arid environment. Location: Atacama and Peruvian Deserts, South America (18°30'S–31°30'S, 0–3000 m). Methods: We modelled the present and future potential distributions of 13 species of Heliotropium sect. Cochranea, a plant group with a centre of diversity in the Atacama Desert. We developed and applied a sequential procedure, starting from monthly climate variables, to derive six alternative sets of climatic predictor variables. We used them to fit models with eight modelling techniques within an ensemble forecasting framework, and derived climate change projections for each of them. We evaluated the effects of using these alternative sets of predictor variables on the performance, spatial predictions and projections of the SDMs using generalised linear mixed models (GLMM). Results: The use of distinct sets of climatic predictor variables did not have a significant effect on overall metrics of model performance, but had significant effects on present and future spatial predictions. Main conclusion: Using different sets of climatic predictors can yield the same model fits but different spatial predictions of current and future species distributions. This represents a new form of uncertainty in model-based estimates of extinction risk that may need to be better acknowledged and quantified in future SDM studies.
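The abstract does not give the derivation rules or the ensemble procedure; the sketch below only illustrates the two generic steps involved: deriving summary climate predictors from monthly values, and averaging habitat-suitability predictions from several fitted models into an ensemble. The derived variables, the two models, and the synthetic data are placeholders, not the study's six predictor sets or eight techniques.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Monthly climate at 300 sites: temperature (degC) and precipitation (mm), 12 months each.
t_month = rng.normal(18, 6, size=(300, 12))
p_month = rng.gamma(2.0, 10.0, size=(300, 12))

# Derive a small set of summary predictors (placeholders for the alternative sets).
X = np.column_stack([
    t_month.mean(axis=1),                        # annual mean temperature
    t_month.max(axis=1) - t_month.min(axis=1),   # temperature annual range
    p_month.sum(axis=1),                         # annual precipitation
    p_month.min(axis=1),                         # precipitation of the driest month
])
y = (X[:, 2] + rng.normal(0, 40, 300) > 250).astype(int)   # toy presence/absence

# Fit two modelling techniques and average their suitability predictions.
models = [LogisticRegression(max_iter=1000).fit(X, y),
          RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)]
suitability = np.mean([m.predict_proba(X)[:, 1] for m in models], axis=0)
print("ensemble suitability, first 5 sites:", np.round(suitability[:5], 2))
```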
Abstract:
We estimate the response of stock prices to exogenous monetary policy shocks using a vector-autoregressive model with time-varying parameters. Our evidence points to protracted episodes in which, after a short-run decline, stock prices increase persistently in response to an exogenous tightening of monetary policy. That response is clearly at odds with the "conventional" view on the effects of monetary policy on bubbles, as well as with the predictions of bubbleless models. We also argue that it is unlikely that such evidence can be accounted for by an endogenous response of the equity premium to the monetary policy shocks.
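As a much simplified illustration of the empirical exercise (a constant-parameter VAR rather than the time-varying-parameter model actually estimated), the sketch below fits a small VAR on synthetic data and reads off the impulse response of stock returns to an orthogonalized policy-rate shock. The series names, ordering, and data are made up.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(3)
n = 300

# Made-up monthly series: policy rate changes, output growth, stock returns.
rate = np.cumsum(rng.normal(0, 0.1, n))
stocks = -0.4 * rate + np.cumsum(rng.normal(0, 0.5, n))
data = pd.DataFrame({
    "d_rate": np.diff(rate, prepend=0.0),
    "output_growth": rng.normal(0.2, 0.3, n),
    "stock_return": np.diff(stocks, prepend=0.0),
})

res = VAR(data).fit(maxlags=2)
irf = res.irf(24)   # impulse responses over 24 periods

# Response of stock returns to a one-s.d. orthogonalized policy-rate shock
# (array indexed as [horizon, response variable, impulse variable]).
resp = data.columns.get_loc("stock_return")
imp = data.columns.get_loc("d_rate")
print(np.round(irf.orth_irfs[:6, resp, imp], 4))
```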
Abstract:
Because data on rare species are usually sparse, it is important to have efficient ways to sample additional data. Traditional sampling approaches are of limited value for rare species because a very large proportion of randomly chosen sampling sites are unlikely to shelter the species. For these species, spatial predictions from niche-based distribution models can be used to stratify the sampling and increase sampling efficiency. The newly sampled data are then used to improve the initial model. Applying this approach repeatedly is an adaptive process that may increase the number of new occurrences found. We illustrate the approach with a case study of a rare and endangered plant species in Switzerland and a simulation experiment. Our field survey confirmed that the method helps in the discovery of new populations of the target species in remote areas where the predicted habitat suitability is high. In our simulations the model-based approach provided a significant improvement (by a factor of 1.8 to 4 times, depending on the measure) over simple random sampling. In terms of cost, this approach may save up to 70% of the time spent in the field.
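To make the sampling idea concrete, the following sketch compares simple random sampling with suitability-weighted (model-based) sampling on a synthetic landscape where true occurrence probability correlates with predicted suitability. The landscape, the model skill, and the resulting gain are purely illustrative of the mechanism, not the study's simulation design or its 1.8x to 4x figures.

```python
import numpy as np

rng = np.random.default_rng(4)
n_cells = 10_000

# Synthetic landscape: predicted habitat suitability and (correlated) true occupancy.
suitability = rng.beta(1.0, 8.0, n_cells)            # rare species: mostly low suitability
occupied = rng.random(n_cells) < 0.5 * suitability    # imperfect but informative model

def discoveries(weights, n_visits=200, trials=200):
    """Average number of occupied cells found per survey of n_visits cells."""
    p = weights / weights.sum()
    found = [occupied[rng.choice(n_cells, n_visits, replace=False, p=p)].sum()
             for _ in range(trials)]
    return np.mean(found)

random_sampling = discoveries(np.ones(n_cells))
model_based = discoveries(suitability)                 # stratify effort by suitability
print(f"random: {random_sampling:.1f}  model-based: {model_based:.1f} "
      f"(gain x{model_based / random_sampling:.1f})")
```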
Abstract:
BACKGROUND: Prognostic models and nomograms were recently developed to predict survival of patients with newly diagnosed glioblastoma multiforme (GBM).1 To improve predictions, models should be updated with the most recent patient and disease information. Nomograms predicting patient outcome at the time of disease progression are required. METHODS: Baseline information from 299 patients with recurrent GBM recruited in 8 phase I or II trials of the EORTC Brain Tumor Group was used to evaluate clinical parameters as prognosticators of patient outcome. Univariate (log-rank) and multivariate (Cox models) analyses were performed to assess the ability of patient characteristics (age, sex, performance status [WHO PS], and MRC neurological deficit scale), disease history (prior treatments, time since last treatment or initial diagnosis, and administration of steroids or antiepileptics), and disease characteristics (tumor size and number of lesions) to predict progression-free survival (PFS) and overall survival (OS). A bootstrap technique was used for internal validation of the models. Nomograms were computed to provide individual patient predictions. RESULTS: Poor PS and more than 1 lesion had a significant prognostic impact on both PFS and OS. Antiepileptic drug use was significantly associated with worse PFS. Larger tumors (split at the median of the largest tumor diameter, >42.5 mm) and steroid use were associated with shorter OS. Age, sex, neurologic deficit, prior therapies, and time since last therapy or initial diagnosis did not show independent prognostic value for PFS or OS. CONCLUSIONS: This analysis confirms that PS, but not age, is a major prognostic factor for PFS and OS. Multiple or large tumors and the need to administer steroids significantly increase the risk of progression and death. Nomograms at recurrence could be used to obtain accurate predictions for the design of new targeted therapy trials or for retrospective analyses. (1. T. Gorlia et al., Nomograms for predicting survival of patients with newly diagnosed glioblastoma. Lancet Oncol 9(1): 29-38, 2008.)
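The sketch below shows, on synthetic data and using the lifelines package, the generic workflow the abstract describes: fitting a multivariate Cox model to baseline characteristics and then producing the kind of individual prediction a nomogram encodes. The variable coding, effect sizes, and data are invented; this is not the EORTC model.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 299

# Synthetic baseline data loosely mirroring the covariates discussed in the abstract.
df = pd.DataFrame({
    "poor_ps": rng.integers(0, 2, n),        # poor WHO performance status
    "multifocal": rng.integers(0, 2, n),     # more than one lesion
    "steroids": rng.integers(0, 2, n),       # steroid use at baseline
    "tumor_gt_42mm": rng.integers(0, 2, n),  # largest diameter above the median
})
# Toy survival times whose hazard increases with the assumed risk factors.
risk = 0.6 * df.poor_ps + 0.5 * df.multifocal + 0.4 * df.steroids + 0.3 * df.tumor_gt_42mm
df["os_months"] = rng.exponential(8.0 * np.exp(-risk))
df["event"] = 1    # no censoring in this toy example

cph = CoxPHFitter()
cph.fit(df, duration_col="os_months", event_col="event")
cph.print_summary()

# Individual prediction, as a nomogram would provide: median OS for a new patient.
new_patient = pd.DataFrame([{"poor_ps": 1, "multifocal": 1, "steroids": 0, "tumor_gt_42mm": 0}])
print(cph.predict_median(new_patient))
```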