936 results for PROPORTIONAL HAZARD AND ACCELERATED FAILURE MODELS


Relevance:

100.00%

Publisher:

Abstract:

Study carried out during a stay at the Stanford University School of Medicine, Division of Radiation Oncology, United States, between 2010 and 2012. During the two years of the postdoctoral fellowship I worked on two different projects. First, as a continuation of the group's previous studies, we wanted to investigate the cause of the differences in hypoxia levels that we had observed in lung cancer models. Our hypothesis was that these differences were due to the functionality of the vasculature. We used two preclinical models: one in which the tumours formed spontaneously in the lungs, and another in which we injected the cells subcutaneously. We used techniques such as dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) and the Hoechst 33342 perfusion assay, and both showed that the vascular functionality of the spontaneous tumours was much higher than that of the subcutaneous tumours. From this study we can conclude that the differences in hypoxia levels among the different lung cancer tumour models could be due to variation in the formation and functionality of the vasculature. The selection of preclinical models is therefore essential, both for studies of hypoxia and angiogenesis and for therapies targeting these phenomena. The other project I have been developing concerns radiotherapy and its possible role in promoting tumour self-seeding by circulating tumour cells (CTC). This effect has been described in some preclinical tumour models. To carry out our studies, we used a mouse breast cancer cell line, either permanently labelled with the Photinus pyralis gene or unlabelled, and performed in vitro and in vivo studies. Both studies showed that tumour irradiation promotes cell invasion and tumour self-seeding by CTC. This finding should be considered in the context of clinical radiotherapy in order to achieve the best treatment for patients with elevated CTC levels.

Relevance:

100.00%

Publisher:

Abstract:

The recent wave of upheavals and revolts in Northern Africa and the Middle East goes back to an old question often raised by theories of collective action: does repression act as a negative or positive incentive for further mobilization? Through a review of the vast literature devoted to this question, this article aims to go beyond theoretical and methodological dead-ends. The article moves on to non-Western settings in order to better understand, via a macro-sociological and dynamic approach, the causal effects between mobilizations and repression. It pleads for a meso- and micro-level approach to this issue: an approach that puts analytical emphasis both on protest organizations and on individual activists' careers.

Relevance:

100.00%

Publisher:

Abstract:

An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled 'Advances in GLMs/GAMs modeling: from species distribution to environmental management', held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, as well as provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression (an alternative to stepwise selection of predictors) and methods for the identification of interactions by a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance our understanding of the application of these methods to ecological modeling.
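
As a purely illustrative companion to the abstract above, the sketch below fits a Poisson GLM to simulated species counts with two hypothetical environmental predictors; the variable names and the statsmodels-based workflow are assumptions, not material from the workshop papers.

```python
# Minimal GLM sketch: Poisson regression of simulated species counts on
# two made-up environmental predictors. Illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
temperature = rng.normal(10, 3, n)   # hypothetical predictor 1
moisture = rng.uniform(0, 1, n)      # hypothetical predictor 2

# Simulate counts from an assumed log-linear relationship.
eta = 0.2 + 0.15 * temperature + 1.0 * moisture
counts = rng.poisson(np.exp(eta))

X = sm.add_constant(np.column_stack([temperature, moisture]))
glm = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(glm.summary())  # coefficients, deviance, AIC for model evaluation
```

A GAM would replace these linear terms with smooth functions of the predictors; the sketch only covers the simpler parametric case.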

Relevance:

100.00%

Publisher:

Abstract:

Two different approaches currently prevail for predicting spatial patterns of species assemblages. The first approach (macroecological modelling, MEM) focuses directly on realised properties of species assemblages, whereas the second approach (stacked species distribution modelling, S-SDM) starts with constituent species to approximate assemblage properties. Here, we propose to unify the two approaches in a single 'spatially-explicit species assemblage modelling' (SESAM) framework. This framework uses relevant species source pool designations, macroecological factors, and ecological assembly rules to constrain predictions of the richness and composition of species assemblages obtained by stacking predictions of individual species distributions. We believe that such a framework could prove useful in many theoretical and applied disciplines of ecology and evolution, both for improving our basic understanding of species assembly across spatio-temporal scales and for anticipating expected consequences of local, regional or global environmental changes. In this paper, we propose such a framework and call for further developments and testing across a broad range of community types in a variety of environments.
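
The short sketch below is one hedged reading of the SESAM idea, not the authors' code: per-species distribution-model probabilities are stacked and then constrained so that each site's predicted richness matches a macroecological prediction, using a simple probability-ranking rule on simulated data.

```python
# Schematic SESAM-style constraint on simulated data (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
n_sites, n_species = 5, 8

# Stacked SDM output: probability of presence per site and species.
p_sdm = rng.uniform(size=(n_sites, n_species))

# MEM output: predicted species richness per site (assumed given).
richness_mem = rng.integers(1, n_species, size=n_sites)

assemblage = np.zeros_like(p_sdm, dtype=int)
for i in range(n_sites):
    # Rank species by suitability and admit only the top R_i of them.
    top = np.argsort(p_sdm[i])[::-1][: richness_mem[i]]
    assemblage[i, top] = 1

print(assemblage)                        # constrained presence/absence matrix
print(assemblage.sum(1), richness_mem)   # per-site richness now matches the MEM
```

Stacking the raw probabilities alone would tend to over-predict richness; the ranking step is one simple way of imposing the macroecological constraint.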

Relevance:

100.00%

Publisher:

Abstract:

This paper studies the short-run correlation of inflation and money growth. We study whether a model of learning can do better than a model of rational expectations, focusing on countries with high inflation. We take the money process as an exogenous variable, estimated from the data through a regime-switching process. We find that the rational expectations model and the model of learning both offer very good explanations for the joint behavior of money and prices.
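
To make the contrast concrete, the toy sketch below implements a generic constant-gain learning rule of the kind typically compared with rational expectations in such studies; the AR(1) money-growth process and the gain value are assumptions for illustration and do not reproduce the paper's regime-switching estimation.

```python
# Toy constant-gain learning about money growth (illustrative assumptions).
import numpy as np

rng = np.random.default_rng(2)
T, rho, mu, sigma, gain = 300, 0.9, 0.05, 0.02, 0.05

money_growth = np.zeros(T)
belief = np.zeros(T)          # agents' perceived mean money growth
for t in range(1, T):
    # Assumed AR(1) money-growth process with unconditional mean mu.
    money_growth[t] = mu * (1 - rho) + rho * money_growth[t - 1] \
        + sigma * rng.normal()
    # Constant-gain update: move the belief by a fraction of the forecast error.
    belief[t] = belief[t - 1] + gain * (money_growth[t] - belief[t - 1])

print(round(belief[-1], 4), "vs unconditional mean", mu)
```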

Relevance:

100.00%

Publisher:

Abstract:

This paper studies equilibria for economies characterized by moral hazard (hidden action), in which the set of contracts marketed in equilibrium is determined by the interaction of financial intermediaries. The crucial aspect of the environment that we study is that intermediaries are restricted to trade non-exclusive contracts: the agents' contractual relationships with competing intermediaries cannot be monitored (or are not contractible upon). We fully characterize equilibrium allocations and contracts. In this set-up equilibrium allocations are clearly incentive-constrained inefficient. A robust property of equilibria with non-exclusivity is that the contracts issued in equilibrium do not implement the optimal action. Moreover, we prove that, whenever equilibrium contracts do implement the optimal action, intermediaries make positive profits and equilibrium allocations are third-best inefficient (where the definition of third-best efficiency accounts for constraints which capture the non-exclusivity of contracts).

Relevance:

100.00%

Publisher:

Abstract:

In this paper, I analyze the ownership dynamics of N strategic risk-averse corporate insiders facing a moral hazard problem. A solution for the equilibrium share price and the dynamics of the aggregate insider stake is obtained in two cases: when agents can credibly commit to an optimal ownership policy and when they cannot commit (time-consistent case). In the latter case, the aggregate stake gradually adjusts towards the competitive allocation. The speed of adjustment increases with N when outside investors are risk-averse, and does not depend on it when investors are risk-neutral. Predictions of the model are consistent with recent empirical findings.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND/PURPOSE: A new coordinated interdisciplinary unit was created in the acute section of the department of clinical neurosciences, the Acute NeuroRehabilitation (NRA) unit. The objective was to evaluate the impact of the unit and its neurosensory programme on the management of tracheostomy patients in terms of reduction in the average time taken for weaning, weaning success rate and therapeutic efficiency. METHODS: This 49-month retrospective study compares 2 groups of tracheostomy patients before (n = 34) and after (n = 46) NRA intervention. The outcome measures evaluate the benefits of the NRA unit intervention (time to decannulation, weaning and complication rates) and the benefits of the coordination (time to registration in a rehabilitation centre and rate of non-compliance with standards of care). RESULTS: Weaning failure rate was reduced from 27.3% to 9.1%, no complications or recannulations were observed in the post-intervention group after weaning and time to decannulation following admission to our unit decreased from 19.13 to 12.75 days. The rate of non-compliance with patient standards of care was significantly reduced from 45% to 30% (Mann-Whitney p = 0.003). DISCUSSION/CONCLUSIONS: This interdisciplinary weaning programme helped to reduce weaning time and weaning failure, without increased complications, in the sample studied. Coordination improved the efficiency of the interdisciplinary team in the multiplicity and complexity of the different treatments.

Relevance:

100.00%

Publisher:

Abstract:

In recent years, both homing endonucleases (HEases) and zinc-finger nucleases (ZFNs) have been engineered and selected for the targeting of desired human loci for gene therapy. However, enzyme engineering is lengthy and expensive, and the off-target effect of the manufactured endonucleases is difficult to predict. Moreover, enzymes selected to cleave a human DNA locus may not cleave the homologous locus in the genome of animal models because of sequence divergence, thus hampering attempts to assess the in vivo efficacy and safety of any engineered enzyme prior to its application in human trials. Here, we show that naturally occurring HEases can be found that cleave desirable human targets. Some of these enzymes are also shown to cleave the homologous sequence in the genome of animal models. In addition, the distribution of off-target effects may be more predictable for native HEases. Based on our experimental observations, we present the HomeBase algorithm, database and web server that allow a high-throughput computational search and assignment of HEases for the targeting of specific loci in the human and other genomes. We validate experimentally the predicted target specificity of candidate fungal, bacterial and archaeal HEases using cell-free, yeast and archaeal assays.
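
The toy scan below illustrates, in heavily simplified form, the kind of computational target search that HomeBase automates: sliding a known recognition sequence (here the canonical I-SceI site, used only as an example) along a made-up DNA string and reporting near-matches within a mismatch tolerance.

```python
# Toy recognition-site scan with a mismatch tolerance (illustrative only).
def scan_for_sites(genome: str, recognition: str, max_mismatches: int = 2):
    hits = []
    k = len(recognition)
    for i in range(len(genome) - k + 1):
        window = genome[i:i + k]
        mismatches = sum(a != b for a, b in zip(window, recognition))
        if mismatches <= max_mismatches:
            hits.append((i, window, mismatches))
    return hits

if __name__ == "__main__":
    genome = "ATGCGTACCGGTTAGCTAGGGATAACAGGGTAATCCGTA"  # made-up sequence
    site = "TAGGGATAACAGGGTAAT"                        # I-SceI recognition site
    print(scan_for_sites(genome, site))                # [(16, ..., 0)] exact hit
```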

Relevance:

100.00%

Publisher:

Abstract:

There is much evidence for a causal relationship between salt intake and blood pressure (BP). The current salt intake in many countries is between 9 and 12 g/day. A reduction in salt intake to the recommended level of 5-6 g/day lowers BP in both hypertensive and normotensive individuals. A further reduction to 3-4 g/day has a much greater effect. Prospective studies and outcome trials have demonstrated that a lower salt intake is associated with a decreased risk of cardiovascular disease. Increasing evidence also suggests that a high salt intake is directly related to left ventricular hypertrophy (LVH) independent of BP. Both raised BP and LVH are important risk factors for heart failure. It is therefore possible that a lower salt intake could prevent the development of heart failure. In patients who already have heart failure, a high salt intake aggravates the retention of salt and water, thereby exacerbating heart failure symptoms and progression of the disease. A lower salt intake plays an important role in the management of heart failure. Despite this, currently there is no clear evidence on how far salt intake should be reduced in heart failure. Our personal view is that these patients should reduce their salt intake to <5 g/day, i.e. the maximum intake recommended by the World Health Organisation for all adults. If salt intake is successfully reduced, there may well be a need for a reduction in diuretic dosage.

Relevance:

100.00%

Publisher:

Abstract:

In groundwater applications, Monte Carlo methods are employed to model the uncertainty on geological parameters. However, their brute-force application becomes computationally prohibitive for highly detailed geological descriptions, complex physical processes, and a large number of realizations. The Distance Kernel Method (DKM) overcomes this issue by clustering the realizations in a multidimensional space based on the flow responses obtained by means of an approximate (computationally cheaper) model; then, the uncertainty is estimated from the exact responses that are computed only for one representative realization per cluster (the medoid). Usually, DKM is employed to decrease the size of the sample of realizations that are considered to estimate the uncertainty. We propose to use the information from the approximate responses for uncertainty quantification. The subset of exact solutions provided by DKM is then employed to construct an error model and correct the potential bias of the approximate model. Two error models are devised that both employ the difference between approximate and exact medoid solutions, but differ in the way medoid errors are interpolated to correct the whole set of realizations. The Local Error Model rests upon the clustering defined by DKM and can be seen as a natural way to account for intra-cluster variability; the Global Error Model employs a linear interpolation of all medoid errors regardless of the cluster to which the single realization belongs. These error models are evaluated for an idealized pollution problem in which the uncertainty of the breakthrough curve needs to be estimated. For this numerical test case, we demonstrate that the error models improve the uncertainty quantification provided by the DKM algorithm and are effective in correcting the bias of the estimate computed solely from the approximate-model (MsFV) results. The framework presented here is not specific to the methods considered and can be applied to other combinations of approximate models and techniques to select a subset of realizations.
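
The following sketch is a schematic reading of the local error model described above, not the authors' implementation: realizations are grouped (here crudely, by quantile bins of a simulated approximate response rather than by DKM's kernel clustering), one "exact" run is taken per group at its central member, and that medoid error is used to correct every member of the group.

```python
# Schematic local-error-model correction on simulated responses.
import numpy as np

rng = np.random.default_rng(3)
n_real, k = 200, 5

# Approximate (cheap) scalar response per realization, purely simulated.
approx = rng.normal(1.0, 0.3, n_real)
# "Exact" responses: used only at the medoids (and to check the result).
exact = approx + 0.1 + 0.05 * rng.normal(size=n_real)

# Crude grouping for illustration: quantile bins of the approximate response.
order = np.argsort(approx)
clusters = np.array_split(order, k)

corrected = approx.copy()
for members in clusters:
    medoid = members[len(members) // 2]        # central member of the bin
    error = exact[medoid] - approx[medoid]     # one exact run per cluster
    corrected[members] += error                # local error model

print(abs(approx.mean() - exact.mean()),       # bias before correction
      abs(corrected.mean() - exact.mean()))    # bias after correction
```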

Relevance:

100.00%

Publisher:

Abstract:

Traditionally, the common reserving methods used by non-life actuaries are based on the assumption that future claims will behave in the same way as they did in the past. There are two main sources of variability in the process of development of the claims: the variability of the speed with which the claims are settled and the variability in the severity of the claims from different accident years. Large changes in these processes will generate distortions in the estimation of the claims reserves. The main objective of this thesis is to provide an indicator which firstly identifies and quantifies these two influences and secondly determines which model is adequate for a specific situation. Two stochastic models were analysed and the predictive distributions of the future claims were obtained. The main advantage of the stochastic models is that they provide measures of variability of the reserves estimates. The first model (PDM) combines a Dirichlet-Multinomial conjugate family with the Poisson distribution. The second model (NBDM) improves the first one by combining two conjugate families: Poisson-Gamma (for the distribution of the ultimate amounts) and Dirichlet-Multinomial (for the distribution of the incremental claims payments). It was found that the second model allows one to express the variability in the speed of the reporting process and in the development of the claims severity as a function of two parameters of these distributions: the shape parameter of the Gamma distribution and the Dirichlet parameter. Depending on the relation between them, we can decide on the adequacy of the claims reserve estimation method. The parameters have been estimated by the Method of Moments and Maximum Likelihood. The results were tested using simulated data and then real data originating from three lines of business: Property/Casualty, General Liability, and Accident Insurance. These data cover different development patterns and specificities. The outcome of the thesis shows that when the Dirichlet parameter is greater than the shape parameter of the Gamma, the model exhibits a positive correlation between past and future claims payments, which suggests the Chain-Ladder method as appropriate for the claims reserve estimation. In terms of claims reserves, if the cumulated payments are high, the positive correlation will imply high expectations for the future payments, resulting in high claims reserves estimates. The negative correlation appears when the Dirichlet parameter is lower than the shape parameter of the Gamma, meaning low expected future payments for the same high observed cumulated payments. This corresponds to the situation when claims are reported rapidly and few claims are expected subsequently. The extreme case appears when all claims are reported at the same time, leading to expected future payments of zero or equal to the aggregated amount of the ultimate paid claims. For this latter case, the Chain-Ladder method is not recommended.
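
For reference, the snippet below runs the standard Chain-Ladder mechanics that the thesis benchmarks (development factors and projected ultimates) on a small invented cumulative run-off triangle; the numbers are illustrative and are not taken from the thesis data.

```python
# Standard Chain-Ladder on a tiny invented cumulative run-off triangle.
import numpy as np

# Rows = accident years, columns = development years; np.nan marks
# the not-yet-observed cells.
triangle = np.array([
    [100.0, 170.0, 190.0],
    [110.0, 182.0, np.nan],
    [120.0, np.nan, np.nan],
])

n = triangle.shape[0]
factors = []
for j in range(n - 1):
    rows = ~np.isnan(triangle[:, j + 1])
    factors.append(triangle[rows, j + 1].sum() / triangle[rows, j].sum())

# Project the unobserved cells with the estimated development factors.
projected = triangle.copy()
for i in range(n):
    for j in range(n - 1):
        if np.isnan(projected[i, j + 1]):
            projected[i, j + 1] = projected[i, j] * factors[j]

ultimates = projected[:, -1]
latest = np.array([triangle[i, n - 1 - i] for i in range(n)])  # latest diagonal
print("development factors:", np.round(factors, 3))
print("claims reserves:", np.round(ultimates - latest, 1))
```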

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: Cultures have limited sensitivity in the diagnosis of prosthetic joint infection (PJI), especially in low-grade infections. We assessed the value of multiplex PCR in differentiating PJI from aseptic failure (AF). METHODS: Included were patients in whom the joint prosthesis was removed and submitted for sonication. The resulting sonication fluid was cultured and investigated by multiplex PCR, and compared with periprosthetic tissue culture. RESULTS: Among 86 explanted prostheses (56 knee, 25 hip, 3 elbow and 2 shoulder prostheses), AF was diagnosed in 62 cases (72%) and PJI in 24 cases (28%). PJI was more commonly detected by multiplex PCR (n=23, 96%) than by periprosthetic tissue culture (n=17, 71%, p=0.031) or sonication fluid culture (n=16, 67%, p=0.016). Among 12 patients with PJI who had previously received antibiotics, periprosthetic tissue cultures were positive in 8 cases (67%), sonication fluid cultures in 6 cases (50%) and multiplex PCR in 11 cases (92%). In AF cases, periprosthetic tissue grew organisms in 11% and sonication fluid in 10%, whereas multiplex PCR detected no organisms. CONCLUSIONS: Multiplex PCR of sonication fluid demonstrated high sensitivity (96%) and specificity (100%) for diagnosing PJI, providing good discriminative power towards AF, especially in patients previously receiving antibiotics.
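
The quoted sensitivity and specificity can be reproduced directly from the counts reported in the abstract (23 of 24 PJI cases detected by multiplex PCR; no organisms detected in the 62 AF cases):

```python
# Sensitivity/specificity check using the counts reported above.
true_pos, false_neg = 23, 1      # of 24 PJI cases
true_neg, false_pos = 62, 0      # of 62 aseptic-failure cases

sensitivity = true_pos / (true_pos + false_neg)
specificity = true_neg / (true_neg + false_pos)
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
# -> sensitivity = 96%, specificity = 100%
```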

Relevance:

100.00%

Publisher:

Abstract:

Radioactive soil-contamination mapping and risk assessment is a vital issue for decision makers. Traditional approaches for mapping the spatial concentration of radionuclides employ various regression-based models, which usually provide a single-value prediction realization accompanied (in some cases) by estimation error. Such approaches do not provide the capability for rigorous uncertainty quantification or probabilistic mapping. Machine learning is a recent and fast-developing approach based on learning patterns and information from data. Artificial neural networks for prediction mapping have been especially powerful in combination with spatial statistics. A data-driven approach provides the opportunity to integrate additional relevant information about spatial phenomena into a prediction model for more accurate spatial estimates and associated uncertainty. Machine-learning algorithms can also be used for a wider spectrum of problems than before: classification, probability density estimation, and so forth. Stochastic simulations are used to model spatial variability and uncertainty. Unlike regression models, they provide multiple realizations of a particular spatial pattern that allow uncertainty and risk quantification. This paper reviews the most recent methods of spatial data analysis, prediction, and risk mapping, based on machine learning and stochastic simulations in comparison with more traditional regression models. The radioactive fallout from the Chernobyl Nuclear Power Plant accident is used to illustrate the application of the models for prediction and classification problems. This fallout is a unique case study that provides the challenging task of analyzing huge amounts of data ('hard' direct measurements, as well as supplementary information and expert estimates) and solving particular decision-oriented problems.
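
As a minimal, hypothetical illustration of the data-driven mapping approach described above, the sketch below trains a small feed-forward neural network to regress a simulated deposition value on spatial coordinates; real applications would use the Chernobyl measurements and add stochastic simulation for uncertainty and risk quantification.

```python
# Toy neural-network spatial regression on a simulated deposition field.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 500
xy = rng.uniform(0, 10, size=(n, 2))                # sampling locations
# Synthetic "deposition" field: a smooth spatial trend plus noise.
z = np.exp(-((xy[:, 0] - 4) ** 2 + (xy[:, 1] - 6) ** 2) / 8) \
    + 0.05 * rng.normal(size=n)

X_train, X_test, z_train, z_test = train_test_split(
    xy, z, test_size=0.25, random_state=0)

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                   random_state=0).fit(X_train, z_train)
print("R^2 on held-out locations:", round(net.score(X_test, z_test), 3))
```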