902 results for Role models
Abstract:
Most physiological effects of thyroid hormones are mediated by the two thyroid hormone receptor subtypes, TR alpha and TR beta. Several pharmacological effects mediated by TR beta might be beneficial in important medical conditions such as obesity, hypercholesterolemia, and diabetes, and selective TR beta activation may elicit these effects while maintaining an acceptable safety profile. To understand the molecular determinants of affinity and subtype selectivity of TR ligands, we successfully employed a ligand- and structure-guided, pharmacophore-based approach to obtain the molecular alignment of a large series of thyromimetics. Statistically reliable three-dimensional quantitative structure-activity relationship (3D-QSAR) and three-dimensional quantitative structure-selectivity relationship (3D-QSSR) models were obtained using the comparative molecular field analysis (CoMFA) method, and visual analysis of the contour maps highlighted several opportunities for the development of analogs with improved affinity and selectivity. Furthermore, the 3D-QSSR analysis allowed the identification of a novel, previously unreported halogen bond, bringing new insights into the mechanism of activity and selectivity of thyromimetics.
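As an illustration of the regression machinery behind a CoMFA-style 3D-QSAR model, the minimal sketch below fits a partial least squares (PLS) model to synthetic field descriptors with scikit-learn; the data, descriptor grid, and component count are placeholders, not the thyromimetic series analyzed in the abstract.

    # Minimal sketch of the PLS step behind a CoMFA-style 3D-QSAR model.
    # The field descriptors and activities are synthetic placeholders.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_ligands, n_grid_points = 60, 500                 # aligned ligands x field probe values
    X = rng.normal(size=(n_ligands, n_grid_points))    # steric/electrostatic field columns
    true_w = rng.normal(size=n_grid_points) * (rng.random(n_grid_points) < 0.05)
    y = X @ true_w + rng.normal(scale=0.5, size=n_ligands)   # pIC50-like activities

    pls = PLSRegression(n_components=5)
    q2 = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()   # cross-validated q2
    pls.fit(X, y)
    print(f"cross-validated q2 ~ {q2:.2f}")
    # Large |coefficients| in pls.coef_ map back onto grid regions of the kind that
    # CoMFA contour maps would flag as favorable or unfavorable for affinity.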
Abstract:
Mixed models may be defined with or without reference to sampling, and can be used to predict realized random effects, as when estimating the latent values of study subjects measured with response error. When the model is specified without reference to sampling, a simple mixed model includes two random variables: one stemming from an exchangeable distribution of latent values of study subjects, and the other from the study subjects' response error distributions. Positive probabilities are assigned both to potentially realizable responses and to artificial responses that are not potentially realizable, resulting in artificial latent values. In contrast, finite population mixed models represent the two-stage process of sampling subjects and measuring their responses, where positive probabilities are assigned only to potentially realizable responses. A comparison of the estimators over the same potentially realizable responses indicates that the optimal linear mixed model estimator (the usual best linear unbiased predictor, BLUP) is often (but not always) more accurate than the comparable finite population mixed model estimator (the FPMM BLUP). We examine a simple example and provide the basis for a broader discussion of the role of conditioning, sampling, and model assumptions in developing inference.
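A minimal sketch of the standard (non-finite-population) BLUP discussed above, computed for a simulated one-way random effects model with variance components assumed known; it only illustrates the shrinkage of subject means toward the grand mean, not the paper's FPMM comparison.

    # Standard BLUP in a one-way random effects model y_ij = mu + b_i + e_ij:
    # each subject's latent value is predicted by shrinking its sample mean
    # toward the grand mean. Variance components are assumed known here.
    import numpy as np

    rng = np.random.default_rng(1)
    n_subjects, n_reps = 10, 4
    sigma_b2, sigma_e2 = 2.0, 1.0                      # assumed variance components
    b = rng.normal(scale=np.sqrt(sigma_b2), size=n_subjects)
    y = 5.0 + b[:, None] + rng.normal(scale=np.sqrt(sigma_e2), size=(n_subjects, n_reps))

    grand_mean = y.mean()
    subject_means = y.mean(axis=1)
    shrinkage = sigma_b2 / (sigma_b2 + sigma_e2 / n_reps)
    blup = grand_mean + shrinkage * (subject_means - grand_mean)
    print(np.round(blup, 2))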
Abstract:
Mitochondria contain their own genome, a small circular molecule of around 16.5 kilobases. The mitochondrial DNA (mtDNA) encodes only 13 polypeptides, but its integrity is essential for mitochondrial function, as all 13 proteins are essential subunits of the oxidative phosphorylation complexes. Moreover, the mtDNA is physically associated with the inner mitochondrial membrane, where the majority of cellular reactive oxygen species are generated. In fact, mitochondrial DNA accumulates high levels of oxidized lesions, which have been associated with several pathological and degenerative processes. The cellular responses to nuclear DNA damage have been extensively studied, but so far little is known about the functional outcome of, and cellular responses to, mtDNA damage. In this review we discuss the mechanisms that lead to damage accumulation, as well as the in vitro models we are establishing to dissect the cellular responses to oxidative damage in the mtDNA and to sort out the differential cellular consequences of damage accumulation in each cellular genome, nuclear and mitochondrial.
Abstract:
During the period 1990-2002, US households experienced a dramatic wealth cycle, induced by a 369% appreciation in the value of real per capita liquid stock market assets followed by a 55% decline. However, consumer spending in real terms continued to rise throughout this period. Using data from 1990-2005, traditional life-cycle approaches to estimating macroeconomic wealth effects confront two puzzles: (i) econometric evidence of a stable cointegrating relationship among consumption, income, and wealth is weak at best; and (ii) life-cycle models that rely on aggregate measures of wealth cannot explain why consumption did not collapse when the value of stock market assets declined so dramatically. We address both puzzles by decomposing wealth according to the liquidity of household assets. We find that the significant appreciation in the value of real estate assets that occurred after the peak of the wealth cycle helped sustain consumer spending from 2001 to 2005.
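The first puzzle concerns weak evidence for a cointegrating relationship among consumption, income, and wealth. The sketch below shows, on simulated series only, how such an Engle-Granger cointegration check can be run with statsmodels; it does not reproduce the paper's data or econometric specification.

    # Illustrative Engle-Granger cointegration check among log consumption,
    # income, and wealth. Simulated series only.
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.tsa.stattools import coint

    rng = np.random.default_rng(2)
    T = 200
    income = np.cumsum(rng.normal(size=T))             # random-walk log income
    wealth = np.cumsum(rng.normal(size=T))             # random-walk log wealth
    consumption = 0.6 * income + 0.3 * wealth + rng.normal(scale=0.2, size=T)

    # Engle-Granger test of consumption against (income, wealth)
    tstat, pvalue, _ = coint(consumption, np.column_stack([income, wealth]))
    print(f"Engle-Granger t-stat = {tstat:.2f}, p-value = {pvalue:.3f}")

    # Long-run coefficients from the cointegrating regression
    ols = sm.OLS(consumption, sm.add_constant(np.column_stack([income, wealth]))).fit()
    print(np.round(ols.params, 3))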
Abstract:
Climate change has resulted in substantial variations in annual extreme rainfall quantiles across durations and return periods. Predicting future changes in extreme rainfall quantiles is essential for water resources design, assessment, and decision-making purposes. Current predictions of future rainfall extremes, however, exhibit large uncertainties. According to extreme value theory, rainfall extremes are random variables whose distributions vary with return period, so quantile estimates are uncertain even under current climate conditions. Regarding future conditions, our large-scale knowledge is obtained from global climate models forced with certain emission scenarios. There are widely known deficiencies in climate models, particularly with respect to precipitation projections, and there is also recognition of the limitations of emission scenarios in representing future global change. Apart from these large-scale uncertainties, downscaling methods add further uncertainty to estimates of future extreme rainfall when they convert the larger-scale projections to the local scale. The aim of this research is to address these uncertainties in future projections of extreme rainfall of different durations and return periods. We combined three emission scenarios with two global climate models and used LARS-WG, a well-known weather generator, to stochastically downscale the daily climate model projections for the city of Saskatoon, Canada, up to 2100. The downscaled projections were further disaggregated to hourly resolution using our new stochastic, non-parametric rainfall disaggregator. Extreme rainfall quantiles were then estimated for different durations (1-hour, 2-hour, 4-hour, 6-hour, 12-hour, 18-hour, and 24-hour) and return periods (2-year, 10-year, 25-year, 50-year, 100-year) using the Generalized Extreme Value (GEV) distribution. By providing multiple realizations of future rainfall, we attempt to measure the extent of the total predictive uncertainty contributed by climate models, emission scenarios, and the downscaling/disaggregation procedures. The results show different proportions for these contributors across durations and return periods.
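A minimal sketch of the final estimation step described above, assuming synthetic annual-maximum rainfall data: a GEV distribution is fitted with scipy and return levels are read off for the listed return periods. Note that scipy's genextreme shape parameter c is the negative of the usual GEV shape parameter.

    # Fit a GEV to synthetic annual-maximum rainfall and compute return levels
    # for the return periods listed in the abstract. Synthetic data only.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(3)
    annual_maxima = genextreme.rvs(c=-0.1, loc=30.0, scale=8.0, size=80, random_state=rng)

    c, loc, scale = genextreme.fit(annual_maxima)       # maximum-likelihood fit
    return_periods = np.array([2, 10, 25, 50, 100])     # years
    quantiles = genextreme.ppf(1.0 - 1.0 / return_periods, c, loc=loc, scale=scale)

    for T, q in zip(return_periods, quantiles):
        print(f"{T:>3}-year rainfall quantile ~ {q:.1f} (synthetic units)")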
Abstract:
This paper develops background considerations to help frame the results of a CGE exercise. Three main criticisms are usually levelled at CGE efforts. First, they are too aggregate, and their conclusions fail to shed light on relevant sectors or issues. Second, they imply huge data requirements; timeliness is frequently jeopardised by outdated sources, with benchmarks referring to realities gone by. Finally, results are meaningless, as they answer wrong or ill-posed questions: modelling demands end up creating a rather artificial context in which the original questions lose content. In spite of a positive outlook on the first two, the crucial questions lie in the third point. After elaborating such questions, and trying to answer some, the text argues that CGE models can come closer to reality. Although their use is still too scarce to give way to a fruitful symbiosis between negotiations and simulation results, they remain the only available technique providing a global, inter-related way of capturing economy-wide effects of several different policies. International organisations can play a major role in supporting and encouraging improvements. They are also uniquely positioned to enhance information and data sharing, and to bring people from various origins together to share their experiences. Serious and complex homework is required, however, to correct at least the most serious present shortcomings of the technique.
Abstract:
Parametric term structure models have been successfully applied to numerous problems in fixed income markets, including pricing, hedging, managing risk, and studying monetary policy implications. In turn, dynamic term structure models, equipped with stronger economic structure, have been mainly adopted to price derivatives and explain empirical stylized facts. In this paper, we combine flavors of these two classes of models to test whether no-arbitrage restrictions affect forecasting. We construct cross-sectional (arbitrage-allowing) and arbitrage-free versions of a parametric polynomial model to analyze how well they predict out-of-sample interest rates. Based on U.S. Treasury yield data, we find that no-arbitrage restrictions significantly improve forecasts. Arbitrage-free versions achieve overall smaller biases and root mean square errors for most maturities and forecasting horizons. Furthermore, a decomposition of forecasts into forward rates and holding-return premia indicates that the superior performance of the no-arbitrage versions is due to better identification of the bond risk premium.
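As a sketch of the forecast-evaluation step only, the snippet below computes bias and root mean square error per maturity from arrays of realized and predicted yields; the parametric and arbitrage-free models compared in the abstract are not implemented, and the inputs are simulated.

    # Bias and RMSE per maturity, given realized and predicted yields.
    import numpy as np

    def bias_and_rmse(realized, predicted):
        """realized, predicted: arrays of shape (n_dates, n_maturities)."""
        errors = predicted - realized
        bias = errors.mean(axis=0)
        rmse = np.sqrt((errors ** 2).mean(axis=0))
        return bias, rmse

    # Toy example with 3 maturities and slightly biased forecasts.
    rng = np.random.default_rng(4)
    realized = rng.normal(5.0, 0.5, size=(120, 3))
    predicted = realized + rng.normal(0.05, 0.2, size=(120, 3))
    bias, rmse = bias_and_rmse(realized, predicted)
    print("bias per maturity :", np.round(bias, 3))
    print("RMSE per maturity :", np.round(rmse, 3))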
Abstract:
Is private money feasible and desirable? In its absence, is there a central bank policy that partially or fully substitutes for private money? In this paper, some recent modeling ideas about how to address these questions are reviewed and applied. The main ideas are that people cannot commit to future actions and that their histories are to some extent unknown, that is, not common knowledge. Under the additional assumption that the private monies issued by different people are distinct, a strong recognizability assumption, it is shown that there is a role for private money.
Abstract:
This paper aims to contribute to the research agenda on the sources of price stickiness by showing that the adoption of nominal price rigidity may be an optimal reaction by firms to consumers' behavior, even if firms face no adjustment costs. Under standard, broadly accepted assumptions on the behavior of economic agents, we show that competition among firms can lead to the adoption of sticky prices as a (subgame perfect) equilibrium strategy. We introduce the concept of a consumption-centers model economy in which there are several complete markets. Moreover, we weaken some traditional assumptions used in standard monetary policy models by assuming that households have imperfect information about the inefficient time-varying cost shocks faced by firms, e.g., those regarding inefficient equilibrium output levels under flexible prices. Furthermore, the timing of events is assumed to be such that, in every period, consumers have access to the actual prices prevailing in the market only after choosing a particular consumption center. Since such choices under uncertainty may decrease the expected utilities of risk-averse consumers, competitive firms adopt some degree of price stickiness in order to reduce price uncertainty and attract more customers.
Abstract:
We estimate the effect of firms' profitability on wage determination for the American economy. Two standard bargaining models are used to illustrate the problems caused by the endogeneity of profits-per-worker in a real wage equation. The profit-sharing parameter can be identified with instruments that shift demand. Using information from the input-output table, we create demand-shift variables for 63 four-digit sectors of US manufacturing. The IV estimates show that profit-sharing is a relevant and widespread phenomenon. The elasticity of wages with respect to profits-per-worker is seven times as large as the OLS estimates obtained here and in previous papers. Sensitivity analysis of the profit-sharing parameter, controlling for the extent of unionization and product market concentration, reinforces our results.
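A minimal sketch of the identification idea on simulated data: observed profits-per-worker is mismeasured, so an OLS regression of wages on observed profits is attenuated, while two-stage least squares with a demand-shift instrument recovers a larger coefficient. This is illustrative only and is not the paper's specification, data, or instrument set.

    # Two-stage least squares with a demand-shift instrument for profits-per-worker.
    # All data are simulated; the true coefficient on profits is 0.3.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 2000
    demand_shift = rng.normal(size=n)                          # instrument (exogenous)
    profits_true = 1.0 * demand_shift + rng.normal(size=n)     # latent profits-per-worker
    wages = 2.0 + 0.3 * profits_true + rng.normal(scale=0.5, size=n)
    profits_obs = profits_true + rng.normal(scale=2.0, size=n) # mismeasured observed profits

    def ols(X, y):
        return np.linalg.lstsq(X, y, rcond=None)[0]

    X = np.column_stack([np.ones(n), profits_obs])
    Z = np.column_stack([np.ones(n), demand_shift])

    beta_ols = ols(X, wages)                                   # attenuated toward zero
    profits_hat = Z @ ols(Z, profits_obs)                      # first stage
    beta_iv = ols(np.column_stack([np.ones(n), profits_hat]), wages)   # second stage
    print("OLS coefficient on profits:", round(beta_ols[1], 3))
    print("IV  coefficient on profits:", round(beta_iv[1], 3))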
Abstract:
SILVA, Flávio César Bezerra da; COSTA, Francisca Marta de Lima; ANDRADE, Hamilton Leandro Pinto de; FREIRE, Lúcia de Fátima; MACIEL, Patrícia Suerda de Oliveira; ENDERS, Bertha Cruz; MENEZES, Rejane Maria Paiva de. Paradigms that guide the models of attention to the health in Brazil: an analytic essay. Revista de Enfermagem UFPE On Line, Recife, v. 3, n. 4, p. 460-65, Oct./Dec. 2009. Available at: http://www.ufpe.br/revistaenfermagem/index.php/revista/search/results
Abstract:
The human buccal micronucleus cytome assay (BMCyt) is one of the most widely used techniques to measure genetic damage in human population studies. Reducing protocol variability, assessing the role of confounders, and estimating a range of reference values are research priorities that will be addressed by the HUMNxL collaborative study. The HUMNxL project evaluates the impact of host factors, occupation, lifestyle, disease status, and protocol features on the occurrence of MN in exfoliated buccal cells. In addition, the study will provide a range of reference values for all cytome endpoints. A database of 5424 subjects with buccal MN values obtained from 30 laboratories worldwide was compiled and analyzed to investigate the influence of several conditions affecting MN frequency. Random effects models were mostly used to investigate MN predictors. The estimated spontaneous MN frequency was 0.74 parts per thousand (95% CI 0.52-1.05). Among technical features, only staining influenced MN frequency, with an abnormal increase for non-DNA-specific stains. No effect of gender was evident, while the trend for age was highly significant (p < 0.001). Most occupational exposures and a diagnosis of cancer significantly increased the frequencies of MN and other endpoints. MN frequency increased with heavy smoking (>= 40 cig/day; FR = 1.37; 95% CI 1.03-1.82) and decreased with daily fruit consumption (FR = 0.68; 95% CI 0.50-0.91). The results of the HUMNxL project identified priorities for validation studies, increased the basic knowledge of the assay, and contributed to the creation of a laboratory network which may eventually allow the evaluation of disease risk associated with MN frequency.
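As an illustration of how a frequency ratio (FR) of the kind reported above can be estimated from MN counts and cells scored, the sketch below fits a Poisson regression with a log-exposure offset to simulated data; a fixed laboratory effect stands in for the random effects models actually used in the study.

    # Frequency ratio for micronucleated cells (e.g. heavy smokers vs. non-smokers)
    # from simulated counts of MN and cells scored per subject.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(6)
    n = 300
    df = pd.DataFrame({
        "lab": rng.integers(0, 5, size=n),
        "smoker": rng.integers(0, 2, size=n),
        "cells_scored": np.full(n, 2000),
    })
    base_rate = 0.00074 * np.exp(0.3 * df["smoker"] + 0.1 * df["lab"])   # per-cell MN rate
    df["mn_count"] = rng.poisson(base_rate * df["cells_scored"])

    model = smf.glm("mn_count ~ smoker + C(lab)", data=df,
                    family=sm.families.Poisson(),
                    offset=np.log(df["cells_scored"])).fit()
    fr = np.exp(model.params["smoker"])
    print(f"estimated frequency ratio for smokers ~ {fr:.2f}")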