898 results for Discrete Regression and Qualitative Choice Models


Abstract:

In recent years, both homing endonucleases (HEases) and zinc-finger nucleases (ZFNs) have been engineered and selected to target desired human loci for gene therapy. However, enzyme engineering is lengthy and expensive, and the off-target effects of the engineered endonucleases are difficult to predict. Moreover, enzymes selected to cleave a human DNA locus may not cleave the homologous locus in the genome of animal models because of sequence divergence, hampering attempts to assess the in vivo efficacy and safety of an engineered enzyme before its application in human trials. Here, we show that naturally occurring HEases can be found that cleave desirable human targets. Some of these enzymes are also shown to cleave the homologous sequence in the genomes of animal models. In addition, the distribution of off-target effects may be more predictable for native HEases. Based on our experimental observations, we present the HomeBase algorithm, database and web server, which allow a high-throughput computational search and assignment of HEases for targeting specific loci in the human and other genomes. We experimentally validate the predicted target specificity of candidate fungal, bacterial and archaeal HEases using cell-free, yeast and archaeal assays.
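A HomeBase-style locus lookup can be sketched as a mismatch-tolerant scan of known HEase recognition sites against a target sequence. This is a minimal illustration, not the published algorithm; the site dictionary stands in for the HomeBase database (the I-SceI entry shown is its canonical 18-bp recognition site, and the mismatch threshold is an assumption).

```python
# Minimal sketch of a HomeBase-style search: match known HEase
# recognition sites against a target sequence, tolerating a few
# mismatches. The site table is a stand-in for a curated database.
HEASE_SITES = {
    "I-SceI": "TAGGGATAACAGGGTAAT",  # canonical 18-bp I-SceI site
}

def mismatches(a, b):
    """Count positional mismatches between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def find_targets(genome, sites=HEASE_SITES, max_mm=2):
    """Return (enzyme, position, mismatch count) for every near-match."""
    hits = []
    for name, site in sites.items():
        k = len(site)
        for i in range(len(genome) - k + 1):
            mm = mismatches(genome[i:i + k], site)
            if mm <= max_mm:
                hits.append((name, i, mm))
    return hits
```

A real search would also scan the reverse complement and weight mismatches by their measured effect on cleavage, which this sketch omits.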

Abstract:

Despite the development of many effective antihypertensive drugs, target blood pressures are reached in only a minority of patients in clinical practice. Poor adherence to drug therapy and the occurrence of side effects are among the main reasons reported by patients and physicians for the poor results of current antihypertensive therapies. The development of new effective antihypertensive agents with an improved tolerability profile might help to partly overcome these problems. Lercanidipine is an effective third-generation dihydropyridine calcium channel blocker characterized by a long half-life and high lipophilicity. In contrast to first-generation dihydropyridines, lercanidipine does not induce reflex tachycardia and causes peripheral edema at a lower incidence. Recent data suggest that, in addition to lowering blood pressure, lercanidipine might have some renal protective properties. In this review we discuss the problems of drug adherence in the management of hypertension, with special emphasis on lercanidipine.

Abstract:

In groundwater applications, Monte Carlo methods are employed to model the uncertainty in geological parameters. However, their brute-force application becomes computationally prohibitive for highly detailed geological descriptions, complex physical processes, and large numbers of realizations. The Distance Kernel Method (DKM) overcomes this issue by clustering the realizations in a multidimensional space based on the flow responses obtained with an approximate (computationally cheaper) model; the uncertainty is then estimated from the exact responses, which are computed only for one representative realization per cluster (the medoid). Usually, DKM is employed to decrease the size of the sample of realizations used to estimate the uncertainty. We propose to use the information from the approximate responses for uncertainty quantification. The subset of exact solutions provided by DKM is then employed to construct an error model and correct the potential bias of the approximate model. Two error models are devised; both employ the difference between approximate and exact medoid solutions, but they differ in the way medoid errors are interpolated to correct the whole set of realizations. The Local Error Model rests upon the clustering defined by DKM and can be seen as a natural way to account for intra-cluster variability; the Global Error Model employs a linear interpolation of all medoid errors regardless of the cluster to which a given realization belongs. These error models are evaluated for an idealized pollution problem in which the uncertainty of the breakthrough curve needs to be estimated. For this numerical test case, we demonstrate that the error models improve the uncertainty quantification provided by the DKM algorithm and are effective in correcting the bias of the estimate computed solely from the multiscale finite-volume (MsFV) results.
The framework presented here is not specific to the methods considered and can be applied to other combinations of approximate models and techniques for selecting a subset of realizations.
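The two error models can be sketched as follows. The function names, array layout, and the use of a simple linear fit of error against approximate response for the global interpolation are our assumptions, not the paper's exact implementation.

```python
import numpy as np

def local_error_model(approx, labels, medoids, exact_medoid):
    """Local Error Model sketch: shift every realization in a cluster
    by that cluster's medoid error (exact minus approximate response)."""
    corrected = approx.copy()
    for c, m in enumerate(medoids):
        err = exact_medoid[c] - approx[m]
        corrected[labels == c] += err
    return corrected

def global_error_model(approx, medoids, exact_medoid):
    """Global Error Model sketch: fit error ~ a*approx + b over all
    medoids, then apply the fitted correction to every realization,
    regardless of cluster membership."""
    errs = exact_medoid - approx[medoids]
    a, b = np.polyfit(approx[medoids], errs, 1)
    return approx + (a * approx + b)
```

In practice the responses are curves (e.g. breakthrough curves) rather than scalars, so the same correction would be applied component-wise.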

Abstract:

Traditionally, the common reserving methods used by non-life actuaries are based on the assumption that future claims will behave in the same way as past claims. There are two main sources of variability in the claims development process: the variability of the speed with which claims are settled, and the variability in claim severity between accident years. Large changes in these processes generate distortions in the estimation of the claims reserves. The main objective of this thesis is to provide an indicator that identifies and quantifies these two influences, and to determine which model is adequate for a specific situation. Two stochastic models were analysed and the predictive distributions of the future claims were obtained. The main advantage of stochastic models is that they provide measures of the variability of the reserve estimates. The first model (PDM) combines the conjugate Dirichlet-Multinomial family with the Poisson distribution. The second model (NBDM) improves on the first by combining two conjugate families: Poisson-Gamma (for the distribution of the ultimate amounts) and Dirichlet-Multinomial (for the distribution of the incremental claims payments). The second model makes it possible to express the speed variability in the reporting process and the development of the claims severity as a function of two of these distributions' parameters: the shape parameter of the Gamma distribution and the Dirichlet parameter. Depending on the relation between them, we can decide on the adequacy of the claims reserve estimation method. The parameters were estimated by the method of moments and maximum likelihood. The results were tested using simulated data and then real data from three lines of business: Property/Casualty, General Liability, and Accident Insurance.
These data include different development patterns and specificities. The thesis shows that when the Dirichlet parameter is greater than the shape parameter of the Gamma, the model exhibits positive correlation between past and future claims payments, which suggests the Chain-Ladder method is appropriate for claims reserve estimation. In terms of claims reserves, if the cumulated payments are high, the positive correlation implies high expectations for the future payments, resulting in high reserve estimates. Negative correlation appears when the Dirichlet parameter is lower than the shape parameter of the Gamma, meaning low expected future payments for the same high observed cumulated payments. This corresponds to the situation in which claims are reported rapidly and few further claims are expected. The extreme case arises when all claims are reported at the same time, leading to expected future payments of zero or equal to the aggregated amount of the ultimate paid claims. For this latter case, the Chain-Ladder method is not recommended.
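A minimal simulation of the NBDM-style generative model described above can be sketched as follows. The function name and parameterization are illustrative, and the thesis's estimation procedures (method of moments, maximum likelihood) are not reproduced here.

```python
import numpy as np

def simulate_nbdm_triangle(n_years, n_dev, gamma_shape, gamma_rate,
                           dirichlet_alpha, rng=None):
    """Simulate incremental claim counts under an NBDM-style model:
    each accident year's ultimate count is Poisson with a Gamma-mixed
    mean (i.e. negative binomial), and the split of the ultimate across
    development periods is Dirichlet-Multinomial."""
    rng = np.random.default_rng(rng)
    triangle = np.zeros((n_years, n_dev), dtype=int)
    for i in range(n_years):
        mean = rng.gamma(gamma_shape, 1.0 / gamma_rate)  # Gamma prior on mean
        ultimate = rng.poisson(mean)                     # Poisson ultimate count
        p = rng.dirichlet(dirichlet_alpha)               # development pattern
        triangle[i] = rng.multinomial(ultimate, p)       # incremental split
    return triangle
```

Varying `gamma_shape` relative to the total Dirichlet concentration is the natural way to explore, on simulated data, the correlation regimes the thesis describes.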

Abstract:

Depth-averaged velocities and unit discharges within a 30 km reach of one of the world's largest rivers, the Rio Parana, Argentina, were simulated using three hydrodynamic models with different process representations: a reduced-complexity (RC) model that neglects most of the physics governing fluid flow, a two-dimensional model based on the shallow water equations, and a three-dimensional model based on the Reynolds-averaged Navier-Stokes equations. Flow characteristics simulated using all three models were compared with data obtained by acoustic Doppler current profiler surveys at four cross sections within the study reach. This analysis demonstrates that, surprisingly, the performance of the RC model is generally equal to, and in some instances better than, that of the physics-based models in terms of the statistical agreement between simulated and measured flow properties. In addition, in contrast to previous applications of RC models, the present study demonstrates that the RC model can successfully predict measured flow velocities. The strong performance of the RC model reflects, in part, the simplicity of the depth-averaged mean flow patterns within the study reach and the dominant role of channel-scale topographic features in controlling the flow dynamics. Moreover, the very low water surface slopes that typify large sand-bed rivers enable flow depths to be estimated reliably in the RC model using a simple fixed-lid planar water surface approximation. This approach overcomes a major problem encountered in the application of RC models in environments characterised by shallow flows and steep bed gradients. The RC model is four orders of magnitude faster than the physics-based models when performing steady-state hydrodynamic calculations. However, the iterative nature of the RC model calculations implies a reduction in computational efficiency relative to some other RC models.
A further implication is that, if used to simulate channel morphodynamics, the present RC model may offer only a marginal advantage in computational efficiency over approaches based on the shallow water equations. These observations illustrate the trade-off between model realism and efficiency that is a key consideration in RC modelling. Moreover, this outcome highlights a need to rethink the use of RC morphodynamic models in fluvial geomorphology and to move away from existing grid-based approaches, such as the popular cellular automata (CA) models, that remain essentially reductionist in nature. In the case of the world's largest sand-bed rivers, this might be achieved by implementing the RC model outlined here as one element within a hierarchical modelling framework that would enable computationally efficient simulation of the morphodynamics of large rivers over millennial time scales. (C) 2012 Elsevier B.V. All rights reserved.
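The fixed-lid planar water surface approximation amounts to intersecting a planar free surface with the bed topography and reading off the depths. The following is a minimal sketch under an assumed cell-based slope convention; the names and grid layout are illustrative, not the paper's implementation.

```python
import numpy as np

def planar_depths(bed_elevation, ws_origin, slope_x, slope_y):
    """Estimate flow depths from a fixed (planar) water surface.
    bed_elevation: 2-D array of bed elevations;
    ws_origin: water surface elevation at cell (0, 0);
    slope_x, slope_y: water surface drop per cell in each direction
    (an assumed convention for this sketch)."""
    ny, nx = bed_elevation.shape
    j, i = np.meshgrid(np.arange(nx), np.arange(ny))
    water_surface = ws_origin - slope_y * i - slope_x * j
    # Cells where the bed lies above the plane are dry (zero depth).
    return np.maximum(water_surface - bed_elevation, 0.0)
```

Because the surface is fixed rather than solved for, depths follow directly from the topography, which is what makes the RC approach so much cheaper than solving the shallow water equations.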

Abstract:

In ¹H magnetic resonance spectroscopy, macromolecule signals underlie metabolite signals, and knowing their contribution is necessary for reliable metabolite quantification. When macromolecule signals are measured using an inversion-recovery pulse sequence, special care must be taken to correctly remove residual metabolite signals and obtain a pure macromolecule spectrum. Furthermore, since a single spectrum is commonly used for quantification in multiple experiments, the impact on metabolite quantification of potential macromolecule signal variability, due to regional differences or pathologies, has to be assessed. In this study, we introduce a novel method to post-process measured macromolecule signals that offers a flexible and robust way of removing residual metabolite signals. The method was applied to investigate regional differences in mouse brain macromolecule signals that may affect metabolite quantification when not taken into account. Since no significant differences in metabolite quantification were detected, it was concluded that a single macromolecule spectrum can generally be used for the quantification of healthy mouse brain spectra. In contrast, the study of a mouse model of human glioma showed several alterations of the macromolecule spectrum, including, but not limited to, increased mobile lipid signals, which had to be taken into account to avoid significant metabolite quantification errors.

Abstract:

We describe the case of a man with a history of complex partial seizures and severe language, cognitive and behavioural regression during early childhood (3.5 years), who underwent epilepsy surgery at the age of 25 years. His early epilepsy had clinical and electroencephalogram features of the syndromes of epilepsy with continuous spike waves during sleep and acquired epileptic aphasia (Landau-Kleffner syndrome), which we initially considered to be of idiopathic origin. Seizures recurred at 19 years, and presurgical investigations at 25 years showed a lateral frontal epileptic focus with spread to Broca's area and the frontal orbital regions. Histopathology revealed a focal cortical dysplasia that was not visible on magnetic resonance imaging. The prolonged but reversible early regression and the residual neuropsychological disorders during adulthood were probably the result of an active left frontal epilepsy, which interfered with language and behaviour during development. Our findings raise the question of the role of focal cortical dysplasia as an aetiology in the syndromes of epilepsy with continuous spike waves during sleep and acquired epileptic aphasia.

Abstract:

The objective of this work was to estimate the stability and adaptability of pod and seed yield in runner peanut genotypes based on nonlinear regression and AMMI (additive main effects and multiplicative interaction) analysis. Yield data from 11 trials, distributed across six environments and three harvests, carried out in the Northeast region of Brazil during the rainy season, were used. Significant effects of genotypes (G), environments (E), and G×E interactions were detected in the analysis, indicating different behaviors among genotypes under favorable and unfavorable environmental conditions. The genotypes BRS Pérola Branca and LViPE-06 are more stable and adapted to the semiarid environment, whereas LGoPE-06 is a promising material for pod production, despite being highly dependent on favorable environments.
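An AMMI decomposition combines additive genotype and environment main effects with a low-rank (SVD) estimate of the G×E interaction. The following is a minimal sketch of that decomposition on a genotype-by-environment mean-yield matrix, not the authors' exact analysis (which also involved significance testing of the interaction axes).

```python
import numpy as np

def ammi_decompose(Y, n_components=2):
    """AMMI sketch: additive main effects (genotype and environment
    means) plus a rank-k SVD approximation of the doubly centred
    G-by-E interaction residuals."""
    grand = Y.mean()
    g_eff = Y.mean(axis=1) - grand          # genotype main effects
    e_eff = Y.mean(axis=0) - grand          # environment main effects
    resid = Y - grand - g_eff[:, None] - e_eff[None, :]  # GxE residuals
    U, s, Vt = np.linalg.svd(resid, full_matrices=False)
    k = n_components
    interaction = U[:, :k] * s[:k] @ Vt[:k]  # rank-k interaction estimate
    return grand, g_eff, e_eff, interaction
```

The left and right singular vectors (scaled by the singular values) are what AMMI biplots display: genotypes far from the origin are interaction-sensitive, those near it are stable across environments.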

Abstract:

Major advances in the understanding of the molecular biology of hepatitis C virus (HCV) have been made recently. While the chimpanzee is the only established animal model of HCV infection, several in vivo and in vitro models have been established that allow us to study various aspects of the viral life cycle. In particular, the replicon system and the production of recombinant infectious virions revolutionized the investigation of HCV-RNA replication and rendered all steps of the viral life cycle, including entry and release of viral particles, amenable to systematic analysis. In the following we will review the different in vivo and in vitro models of HCV infection.