164 results for CURE FRACTION MODELS
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Peering into the field of Alzheimer's disease (AD), the outsider realizes that many of the therapeutic strategies tested (in animal models) have been successful. One may also notice that there is a deficit in translational research, i.e., in taking a drug that succeeds in mice and translating it to the patient. Efforts are still focused on novel projects to expand the therapeutic arsenal to 'cure mice.' The scientific reasons behind so many successful strategies are not obvious. This article aims to review the current approaches to combat AD and to open a debate on common mechanisms of cognitive enhancement and neuroprotection. In short, either the rodent models are not good and should be discontinued, or we should extract the most useful information from those models. An example of a question that may be debated for the advancement of AD therapy is: in addition to reducing amyloid and tau pathologies, would it be necessary to boost synaptic strength and cognition? The debate could provide clues to turn around the current negative record in generating effective drugs for patients. Furthermore, the discovery of biomarkers in human body fluids, and a clear distinction between cognitive enhancers and disease-modifying strategies, should be instrumental in advancing anti-AD drug discovery.
Abstract:
This comment corrects the errors in the estimation process that appear in Martins (2001). The first error is in the parametric probit estimation, as the previously presented results do not maximize the log-likelihood function. In the global maximum more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take on both positive and negative values, which implies that the participation probability estimates may be outside the interval [0,1]. We have solved the problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
Abstract:
This paper provides empirical evidence that continuous time models with one factor of volatility, in some conditions, are able to fit the main characteristics of financial data. It also reports the importance of the feedback factor in capturing the strong volatility clustering of data, caused by a possible change in the pattern of volatility in the last part of the sample. We use the Efficient Method of Moments (EMM) by Gallant and Tauchen (1996) to estimate logarithmic models with one and two stochastic volatility factors (with and without feedback) and to select among them.
Abstract:
The choice of either the rate of monetary growth or the nominal interest rate as the instrument controlled by monetary authorities has both positive and normative implications for economic performance. We reexamine some of the issues related to the choice of the monetary policy instrument in a dynamic general equilibrium model exhibiting endogenous growth, in which a fraction of productive government spending is financed by issuing currency. When we evaluate the performance of the two monetary instruments with respect to the fluctuations of endogenous variables, we find that the inflation rate is less volatile under nominal interest rate targeting. Concerning the fluctuations of consumption and of the growth rate, both monetary policy instruments lead to statistically equivalent volatilities. Finally, we show that neither of these two targeting procedures displays unambiguously higher welfare levels.
Abstract:
We study the optimal public intervention in setting minimum training standards for specialized medical care. The abilities physicians obtain through their training allow them to improve their performance as providers of cure and to earn some monopoly rents. Our aim is to characterize the most efficient regulation in this field, taking into account different regulatory frameworks. We find that the existing situation in some countries, in which the amount of specialization is controlled and the costs of this process of specialization are publicly financed, can be supported as the best possible intervention.
Abstract:
Expectations are central to behaviour. Despite the existence of subjective expectations data, the standard approach is to ignore these, to posit a model of behaviour and to infer expectations from realisations. In the context of income models, we reveal the informational gain obtained from using both a canonical model and subjective expectations data. We propose a test for this informational gain, and illustrate our approach with an application to the problem of measuring income risk.
Abstract:
Study based on a research stay in Rome between 7 January and 28 February 2006. It examines the influence of Byzantine and Eastern productions on the Iberian Peninsula, in the Visigothic period and beyond, even supporting a chronology of the 8th-10th centuries AD for many of the capitals of the north-west of the peninsula traditionally labelled Mozarabic. In addition, it outlines a line of research into possible Lombard influences on the Iberian Peninsula, and discusses the relations between the capitals of the north-east of the peninsula and those of Gaul.
Abstract:
Study based on a stay at the Institut National de Recherche Scientifique in Montreal, between 1 September and 30 December 2005. It analyses the organizational model adopted for the metropolitan area of Montreal (Canada) after the reform carried out between 2000 and 2002, as well as the causes that led to its adoption.
Abstract:
Transcript of the talk given by Mr. Gabriel Colomé in the University Course on Olympism organized by the Centre d'Estudis Olímpics (CEO-UAB) in February 1992. With this text the author pursues two main objectives: on the one hand, to analyse the influence of the socio-political environment on the organizational structure of the Organizing Committee of the Games; on the other, to see how the type of financing affects the structure and infrastructure of the Games themselves, and what differences there are between the 1972 Games and the subsequent editions up to Barcelona.
Abstract:
We present experimental and theoretical analyses of data requirements for haplotype inference algorithms. Our experiments include a broad range of problem sizes under two standard models of tree distribution and were designed to yield statistically robust results despite the size of the sample space. Our results validate Gusfield's conjecture that a population size of n log n is required to give (with high probability) sufficient information to deduce the n haplotypes and their complete evolutionary history. We complement this experimental finding with theoretical bounds on the population size. We also analyze the population size required to deduce some fixed fraction of the evolutionary history of a set of n haplotypes, establishing linear bounds on the required sample size; these linear bounds are also shown theoretically.
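As a rough intuition for why on the order of n log n samples are needed, a coupon-collector analogue can be simulated: if haplotypes were drawn uniformly at random (a simplification of our own; the paper's experiments use perfect-phylogeny tree distributions), the expected number of draws before all n distinct haplotypes are observed is n times the n-th harmonic number, i.e. roughly n log n.

```python
import numpy as np

def draws_until_all_seen(n, rng):
    """Coupon-collector analogue: draw haplotype labels uniformly at
    random until all n distinct labels have been observed."""
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(int(rng.integers(n)))
        draws += 1
    return draws

rng = np.random.default_rng(1)
n = 50
avg = np.mean([draws_until_all_seen(n, rng) for _ in range(200)])
# theory: E[draws] = n * H_n ~= n * (ln n + 0.577) ~= 225 for n = 50
```

This only illustrates the growth rate; recovering the evolutionary history, as in the paper, demands strictly more information than merely observing each haplotype once.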
Abstract:
We give sufficient conditions for existence, uniqueness and ergodicity of invariant measures for Musiela's stochastic partial differential equation with deterministic volatility and a Hilbert space valued driving Lévy noise. Conditions for the absence of arbitrage and for the existence of mild solutions are also discussed.
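For orientation, the Musiela reformulation of forward-rate dynamics referred to above can be sketched, with deterministic volatility and a Hilbert-space-valued Lévy driver, as follows (the notation is ours and may differ from the paper's):

```latex
\mathrm{d}f_t = \Bigl( \tfrac{\partial}{\partial x} f_t + \alpha_t \Bigr)\,\mathrm{d}t
              + \sigma\,\mathrm{d}L_t ,
```

where f_t(x) is the forward rate at time t with time to maturity x, sigma is the deterministic volatility, L is the driving Lévy process, and the drift alpha_t is pinned down by the no-arbitrage (drift) condition.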
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
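The kernel-smoothing step can be illustrated in miniature with a Nadaraya-Watson estimator applied to a toy simulable model of our own choosing (this is only a sketch of the conditional-moment calculation, not the full proposed estimator):

```python
import numpy as np

def nw_conditional_moment(x_sim, y_sim, x0, bandwidth):
    """Nadaraya-Watson estimate of E[y | x = x0] from a long simulation,
    using a Gaussian kernel to weight simulated points near x0."""
    w = np.exp(-0.5 * ((x_sim - x0) / bandwidth) ** 2)
    return np.sum(w * y_sim) / np.sum(w)

# Toy model (illustrative only): y = x^2 + noise, simulated at a trial
# parameter value; the conditional moment E[y | x] = x^2 is known here.
rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = x ** 2 + 0.1 * rng.normal(size=100_000)

est = nw_conditional_moment(x, y, x0=1.0, bandwidth=0.05)
# est should be close to the true conditional moment E[y | x = 1] = 1
```

Because the moment is read off the smoothed simulation rather than computed by averaging simulations run under the conditioning information, the same device applies when the model cannot be simulated conditionally, which is the point made above for general dynamic latent variable models.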
Abstract:
There is recent interest in the generalization of classical factor models in which the idiosyncratic factors are assumed to be orthogonal and there are identification restrictions on cross-sectional and time dimensions. In this study, we describe and implement a Bayesian approach to generalized factor models. A flexible framework is developed to determine the variations attributed to common and idiosyncratic factors. We also propose a unique methodology to select the (generalized) factor model that best fits a given set of data. Applying the proposed methodology to the simulated data and the foreign exchange rate data, we provide a comparative analysis between the classical and generalized factor models. We find that when there is a shift from classical to generalized, there are significant changes in the estimates of the structures of the covariance and correlation matrices while there are less dramatic changes in the estimates of the factor loadings and the variation attributed to common factors.
Abstract:
Research project based on a stay at the Laboratory of Archaeometry of the National Centre of Scientific Research "Demokritos" in Athens, Greece, between June and September 2006. This study forms part of a broader investigation of the technological change documented in the production of Roman-type amphorae during the 1st century BC and the 1st century AD in the coastal territories of Catalonia. One part of the study concerns the calculation of the mechanical properties of these amphorae and their evaluation as a function of amphora typology, using Finite Element Analysis (FEA). FEA is a numerical approach that originated in the engineering sciences and has been used to estimate the mechanical behaviour of a model in terms of, for example, deformation and stress. An object, or rather its model, is divided into sub-domains called finite elements, to which the mechanical properties of the material under study are assigned. These finite elements are connected in a mesh whose constraints can be defined. When a given force is applied to the model, the behaviour of the object can be estimated through the set of linear equations that define the response of the finite elements, providing a good approximation for describing the structural deformation. This computer simulation is thus an important tool for understanding the functionality of archaeological ceramics. The procedure yields a quantitative model for predicting the failure of a ceramic object when it is subjected to different pressure conditions. The model has been applied to several amphora typologies. Preliminary results show significant differences between the pre-Roman typology and the Roman ones, as well as among the Roman amphora designs themselves, with important archaeological implications.
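The assemble-and-solve procedure described above (divide into elements, assign material properties, connect into a mesh, impose constraints, solve the linear system) can be sketched in its simplest possible form: a one-dimensional axial bar split into two-node linear elements. All numerical values are illustrative placeholders, not real amphora or ceramic data.

```python
import numpy as np

# Illustrative material and geometry (placeholder values, not real data)
E = 10e9        # assumed Young's modulus of the material, Pa
A = 1e-4        # assumed cross-sectional area, m^2
L = 0.5         # bar length, m
n_el = 10       # number of finite elements
h = L / n_el    # element length

# Assemble the global stiffness matrix from identical 2-node elements:
# each element contributes a small stiffness block linking its two nodes.
K = np.zeros((n_el + 1, n_el + 1))
k_e = (E * A / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
for e in range(n_el):
    K[e:e + 2, e:e + 2] += k_e

# Constraints and loading: node 0 is fixed, a 100 N load acts at the free end
f = np.zeros(n_el + 1)
f[-1] = 100.0

# Solve the reduced linear system for the free nodal displacements
u = np.zeros(n_el + 1)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])
# Analytical check: tip displacement of a uniform bar is F*L/(E*A) = 5e-5 m
```

A real amphora model replaces this line of nodes with a 2D or 3D mesh and adds a failure criterion on the resulting stress field, but the core computation, a linear system built from per-element stiffness contributions, is the same.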