924 results for two-step carcinogenesis


Relevance: 80.00%

Abstract:

The intermediacy of the geminate base-proton pair (A*···H⁺) in excited-state proton-transfer (ESPT) reactions (two-step mechanism) has been investigated employing the synthetic flavylium salt 7-hydroxy-4-methyl-flavylium chloride (HMF). In aqueous solution, the ESPT mechanism involves solely the excited acid AH⁺* and base A* forms of HMF, as indicated by the fluorescence spectra and the double-exponential fluorescence decays (two species, two decay times). However, upon addition of either 1,4-dioxane or 1,2-propylene glycol, the decays become triple-exponential, with a term consistent with the presence of the geminate base-proton pair A*···H⁺. The geminate pair becomes detectable because the recombination rate constant, k_rec, of (A*···H⁺) increases with increasing mole fraction of added organic cosolvent. Because the two-step ESPT mechanism separates the intrinsic prototropic reaction rates (deprotonation of AH⁺*, k_d, and recombination, k_rec, of A*···H⁺) from the diffusion-controlled rates (dissociation, k_diss, and formation, k_diff[H⁺], of A*···H⁺), the experimental detection of the geminate pair provides a wealth of information on the proton-transfer reaction (k_d and k_rec) as well as on proton diffusion/migration (k_diss and k_diff).
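For reference, the two-step scheme described above can be written compactly as

AH⁺*  ⇌  (A*···H⁺)  ⇌  A* + H⁺

where the first, intrinsic, step is governed by k_d (deprotonation) and k_rec (recombination), and the second, diffusional, step by k_diss (dissociation) and k_diff[H⁺] (formation of the geminate pair).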

Relevance: 80.00%

Abstract:

The influence of molecular oxygen on the interactions of the emeraldine base form of polyaniline (EB-PANI) with Fe(III) or Cu(II) ions in 1-methyl-2-pyrrolidinone (NMP) solutions has been investigated by UV-vis-NIR, resonance Raman and electron paramagnetic resonance (EPR) spectroscopies. From the set of spectroscopic results it was possible to rationalize the role of O2 and to construct a scheme of the preferential routes occurring in the interaction of EB-PANI with Fe(III) or Cu(II). Solutions of 4.0 mmol L⁻¹ EB-PANI with 0.8, 2.0 and 20 mmol L⁻¹ Fe(III) or Cu(II) ions in NMP were investigated, and the main reactions observed were the oxidation of EB-PANI to pernigraniline (PB-PANI) and the doping of EB-PANI either by pseudo-protonation or by a two-step redox process. In the presence of O2, PB-PANI is observed in all Fe(III)/EB solutions and EB-PANI doping occurs only in solutions with high Fe(III) concentrations, through pseudo-protonation. On the other hand, emeraldine salt (ES-PANI) is formed in all Fe(III)/EB solutions under an N2 atmosphere and, in this case, doping occurs both by the pseudo-protonation and by the two-step redox mechanisms. In all Cu(II)/EB solutions PB-PANI is formed both in the presence and in the absence of O2, and doping occurs only to a very small degree, and only in solutions with high Cu(II) concentrations. The most important contribution of the EPR spectra was to provide evidence for the redox steps. The Cu(II) signal areas determined under oxygen are higher than under N2 and, furthermore, the initial metal proportions (1:2:20) are maintained in these spectra, indicating that the Cu(I) formed is re-oxidized by O2 and, thus, the Cu(II) ions are recycled. Consistently, for the solutions prepared under nitrogen, the corresponding areas and proportions in the spectra are much lower, confirming that a partial reduction of the Cu(II) ions actually occurs.

Relevance: 80.00%

Abstract:

This report is part of an R&D project at Högskolan Dalarna, whose goal is to develop a construction without a vapour barrier that meets current requirements for airtightness and moisture performance. The purpose of the report is to investigate how moisture affects a building with a solid timber frame and different insulation materials, without a vapour barrier. Mineral wool and wood-fibre insulation are compared to see how they affect the moisture load in a wall construction. The test object is located in Dalarna; there has been no indoor moisture supply in the building. Three different methods were used to carry out this work: a simulation, real measured values, and physical sampling. The moisture simulation was performed with the WUFI software; measured values of relative humidity and temperature were collected continuously over two years from the wall construction via sensors. The sampling was carried out through a physical intervention at the same level in the construction where the sensors were placed. The results are presented as diagrams and tables showing the current status of the construction in terms of relative humidity, temperature, moisture content and microbial growth. The insulation materials show high relative humidity during the winter months in the outermost part of the construction, towards the outdoor climate, and the outdoor climate has been shown to play a major role in this. No clear microbial growth was found despite the high moisture levels. The results show that the wood-fibre insulation handles moisture better than the mineral wool. A follow-up study with an indoor moisture load and an indoor temperature of 21 °C should be carried out; for such a study to work, a two-stage-sealed facade solution is recommended to cope with the moisture load in the wall construction.

Relevance: 80.00%

Abstract:

This paper presents a two-step pseudo-likelihood estimation technique for generalized linear mixed models in which the random effects are correlated between groups. The core idea is to handle the intractable integrals in the likelihood function with a multivariate Taylor approximation. The accuracy of the estimation technique is assessed in a Monte Carlo study. An application with a binary response variable is presented using a real data set on credit defaults from two Swedish banks. Thanks to the two-step estimation technique, the proposed algorithm outperforms conventional pseudo-likelihood algorithms in terms of computational time.
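The abstract does not spell out the algorithm, so the following is only a rough sketch of the Taylor-linearization ("working response") idea behind pseudo-likelihood estimation, written for the simplest case of a logistic model with one random intercept per group. It is not the authors' estimator (which, in particular, allows the random effects to be correlated between groups), and all names are ours.

```python
import numpy as np

def pql_logistic(y, X, groups, n_iter=50, tol=1e-6):
    """Pseudo-likelihood (PQL-style) fit of a random-intercept logistic model.

    Each iteration: (i) Taylor-linearize the model around the current
    estimates, producing a weighted 'working response'; (ii) solve the
    resulting linear mixed-model equations and update the variance component.
    """
    n, p = X.shape
    levels, g_idx = np.unique(groups, return_inverse=True)
    q = len(levels)
    Z = np.zeros((n, q))
    Z[np.arange(n), g_idx] = 1.0            # random-intercept design matrix

    beta, u = np.zeros(p), np.zeros(q)
    sigma2_u = 1.0                          # random-effect variance (init)

    for _ in range(n_iter):
        eta = X @ beta + Z @ u
        mu = np.clip(1.0 / (1.0 + np.exp(-eta)), 1e-6, 1 - 1e-6)
        w = mu * (1.0 - mu)                 # GLM working weights
        z = eta + (y - mu) / w              # working response (Taylor step)

        # Henderson's mixed-model equations for the linearized model
        XtW, ZtW = X.T * w, Z.T * w
        A = np.block([[XtW @ X, XtW @ Z],
                      [ZtW @ X, ZtW @ Z + np.eye(q) / sigma2_u]])
        b = np.concatenate([XtW @ z, ZtW @ z])
        sol = np.linalg.solve(A, b)
        beta_new, u = sol[:p], sol[p:]

        sigma2_u = float(u @ u / q) + 1e-8  # crude variance-component update
        done = np.max(np.abs(beta_new - beta)) < tol
        beta = beta_new
        if done:
            break
    return beta, u, sigma2_u
```

In the paper's setting, the correlation of the random effects between groups would replace the diagonal I/σ²_u block above with a full prior covariance matrix.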

Relevance: 80.00%

Abstract:

Random effect models have been widely applied in many fields of research. However, models with uncertain design matrices for the random effects have received little attention. In some applications with this problem an expectation method has been used for simplicity, but that method does not incorporate the extra information about the uncertainty in the design matrix. A closed-form solution to this problem is generally difficult to attain. We therefore propose a two-step algorithm for estimating the parameters, especially the variance components of the model. The implementation is based on Monte Carlo approximation and a Newton-Raphson-based EM algorithm. As an example, a simulated genetics data set was analysed. The results showed that the proportion of the total variance explained by the random effects was accurately estimated, whereas it was greatly underestimated by the expectation method. By introducing heuristic search and optimization methods, the algorithm can possibly be developed to infer the 'model-based' best design matrix and the corresponding best estimates.

Relevance: 80.00%

Abstract:

In this paper, we propose a two-step estimator for panel data models in which a binary covariate is endogenous. In the first stage, a random-effects probit model is estimated with the endogenous variable as the dependent variable. Correction terms are then constructed and included in the main regression.
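As a hedged illustration of this kind of control-function approach (not the paper's exact estimator: the sketch uses a pooled probit and OLS where the paper uses a random-effects probit on panel data), the two steps might look as follows; all names are ours.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def control_function_2step(y, d, X, Z):
    """Two-step control-function estimator for a binary endogenous regressor d.

    Step 1: probit of d on exogenous covariates X and instruments Z
            (pooled here; the paper uses a random-effects probit on panels).
    Step 2: include the generalized residual as a correction term in the
            outcome regression of y on d and X.
    """
    W = sm.add_constant(np.column_stack([X, Z]))      # first-stage regressors
    probit_res = sm.Probit(d, W).fit(disp=False)
    xb = W @ probit_res.params                        # probit linear index

    # Generalized residual: phi/Phi if d = 1, -phi/(1 - Phi) if d = 0
    gr = np.where(d == 1,
                  norm.pdf(xb) / norm.cdf(xb),
                  -norm.pdf(xb) / (1.0 - norm.cdf(xb)))

    X2 = sm.add_constant(np.column_stack([d, X, gr])) # main regression
    return sm.OLS(y, X2).fit()
```

The coefficient on the correction term also gives an informal check of endogeneity, and second-step standard errors would in practice need an adjustment (e.g., bootstrapping) for the generated regressor.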

Relevance: 80.00%

Abstract:

We study the joint determination of the lag length, the dimension of the cointegrating space and the rank of the matrix of short-run parameters of a vector autoregressive (VAR) model using model selection criteria. We consider model selection criteria that have data-dependent penalties as well as the traditional ones. We suggest a new two-step model selection procedure which is a hybrid of traditional criteria and criteria with data-dependent penalties, and we prove its consistency. Our Monte Carlo simulations measure the improvements in forecasting accuracy that can arise from the joint determination of lag length and rank using our proposed procedure, relative to an unrestricted VAR or a cointegrated VAR estimated by the commonly used procedure of selecting the lag length only and then testing for cointegration. Two empirical applications, forecasting Brazilian inflation and U.S. macroeconomic aggregate growth rates respectively, show the usefulness of the model-selection strategy proposed here. The gains in different measures of forecasting accuracy are substantial, especially for short horizons.
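To make the joint selection concrete, the sketch below computes, for every candidate lag length p and cointegration rank r, the Johansen reduced-rank log-likelihood term and a BIC-type criterion, and returns the minimizing pair. It is a bare-bones illustration with a fixed penalty, not the authors' hybrid procedure with data-dependent penalties; no deterministic terms are included, a common estimation sample is used for all p, and the free-parameter count is approximate.

```python
import numpy as np

def select_var_order_and_rank(Y, p_max=4):
    """Jointly pick the VAR lag length p and cointegration rank r by minimizing
    a BIC-type criterion from Johansen-style reduced-rank regression."""
    Y = np.asarray(Y, float)
    T, K = Y.shape
    dY = np.diff(Y, axis=0)                       # dY[t] = Y[t+1] - Y[t]
    t0, t1 = p_max - 1, dY.shape[0]               # common sample of dY rows
    Teff = t1 - t0

    def residuals(dep, regs):
        """OLS residuals of dep on regs (regs may be empty)."""
        if regs.shape[1] == 0:
            return dep
        coef, *_ = np.linalg.lstsq(regs, dep, rcond=None)
        return dep - regs @ coef

    best, table = None, {}
    for p in range(1, p_max + 1):
        D0 = dY[t0:t1]                            # dependent: delta y_t
        L1 = Y[t0:t1]                             # lagged levels
        lags = [dY[t0 - j:t1 - j] for j in range(1, p)]
        W = np.hstack(lags) if lags else np.empty((Teff, 0))

        R0, R1 = residuals(D0, W), residuals(L1, W)
        S00, S01 = R0.T @ R0 / Teff, R0.T @ R1 / Teff
        S10, S11 = S01.T, R1.T @ R1 / Teff
        M = np.linalg.solve(S11, S10) @ np.linalg.solve(S00, S01)
        lam = np.sort(np.real(np.linalg.eigvals(M)))[::-1]
        lam = np.clip(lam, 0.0, 0.9999)           # squared canonical corrs

        logdet_S00 = np.linalg.slogdet(S00)[1]
        for r in range(0, K + 1):
            loglik_term = logdet_S00 + np.sum(np.log(1 - lam[:r]))
            n_par = 2 * K * r - r * r + K * K * (p - 1)   # approximate
            bic = loglik_term + np.log(Teff) * n_par / Teff
            table[(p, r)] = bic
            if best is None or bic < best[0]:
                best = (bic, p, r)
    return best[1], best[2], table
```

The paper's hybrid two-step procedure combines criteria of this traditional type with criteria whose penalties are data-dependent; the fixed log(Teff) penalty above corresponds only to the traditional (BIC) ingredient.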

Relevance: 80.00%

Abstract:

We study the joint determination of the lag length, the dimension of the cointegrating space and the rank of the matrix of short-run parameters of a vector autoregressive (VAR) model using model selection criteria. We suggest a new two-step model selection procedure which is a hybrid of traditional criteria and criteria with data-dependent penalties, and we prove its consistency. A Monte Carlo study explores the finite-sample performance of this procedure and evaluates the forecasting accuracy of the models it selects. Two empirical applications confirm the usefulness of the proposed model selection procedure for forecasting.

Relevance: 80.00%

Abstract:

Most studies that try to verify the existence of regulatory risk focus on developed countries, but examining regulatory risk in regulated sectors of emerging markets is no less important for improving and increasing investment in those markets. This thesis comprises three papers on regulatory risk. In the first paper I check whether CAPM betas capture information on regulatory risk by using a two-step procedure: in the first step I obtain Kalman filter estimates of the betas, and in the second I use these estimated betas as inputs in a random-effects panel data model. I find evidence of regulatory risk in electricity, telecommunications and all regulated sectors in Brazil. I find further evidence that regulatory changes in the country either do not reduce or even increase the betas of the regulated sectors, the opposite of what the buffering hypothesis of Peltzman (1976) predicts. In the second paper I check whether CAPM alphas say something about regulatory risk. I investigate a methodology similar to those used by some regulatory agencies around the world, such as the Brazilian Electricity Regulatory Agency (ANEEL), which incorporate a specific regulatory-risk component when setting tariffs for regulated sectors. Using SUR estimates, I find negative and significant alphas for all regulated sectors, especially electricity and telecommunications. This runs counter to the theory, which predicts alphas that are not statistically different from zero. I suspect that the significant alphas are related to misspecifications in the traditional CAPM that make it fail to capture true regulatory risk factors. One of the reasons is that the CAPM does not consider factors with proven significant effects on asset pricing, such as the Fama-French size (ME) and price-to-book (ME/BE) factors. In the third paper I use these two additional factors as controls in the estimation of the alphas, and the results are similar. Nevertheless, I find evidence that the negative alphas may result from the regulated sectors' premiums associated with the three Fama-French factors, particularly the market risk premium. When ME and ME/BE are taken together, they diminish the statistical significance of the regulated sectors' market factor premiums, especially for the electricity sector. This shows how important the inclusion of these factors is, something that is unfortunately scarce in studies of emerging markets like Brazil.
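The first step of the first paper, extracting time-varying betas with a Kalman filter, can be illustrated with a random-walk-beta market model. This is a generic scalar filter with fixed noise variances, not the thesis's exact state-space specification (in practice the variances would be estimated, e.g., by maximum likelihood); all names are ours.

```python
import numpy as np

def kalman_beta(r, rm, var_eps=1e-4, var_eta=1e-6, beta0=1.0, p0=1.0):
    """Filtered time-varying CAPM beta under a random-walk state:
        r_t = beta_t * rm_t + eps_t,   beta_t = beta_{t-1} + eta_t.
    var_eps and var_eta are the observation and state noise variances."""
    n = len(r)
    beta, P = np.empty(n), np.empty(n)
    b, p = beta0, p0
    for t in range(n):
        p = p + var_eta                    # prediction (random-walk state)
        h = rm[t]
        s = h * p * h + var_eps            # innovation variance
        k = p * h / s                      # Kalman gain
        b = b + k * (r[t] - h * b)         # state update
        p = (1.0 - k * h) * p
        beta[t], P[t] = b, p
    return beta, P
```

In the thesis's second step, the filtered betas become the inputs of a random-effects panel data model.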

Relevance: 80.00%

Abstract:

The aim of this article is to assess the role of real effective exchange rate volatility in long-run economic growth for a set of 82 advanced and emerging economies, using a panel data set ranging from 1970 to 2009. With an accurate measure of exchange rate volatility, the results for the two-step system GMM panel growth models show that a more (less) volatile RER has a significant negative (positive) impact on economic growth, and the results are robust to different model specifications. In addition, exchange rate stability seems to matter more for fostering long-run economic growth than exchange rate misalignment.

Relevance: 80.00%

Abstract:

We study semiparametric two-step estimators which have the same structure as parametric doubly robust estimators in their second step. The key difference is that we do not impose any parametric restriction on the nuisance functions that are estimated in a first stage, but retain a fully nonparametric model instead. We call these estimators semiparametric doubly robust estimators (SDREs), and show that they possess superior theoretical and practical properties compared to generic semiparametric two-step estimators. In particular, our estimators have substantially smaller first-order bias, allow for a wider range of nonparametric first-stage estimates, rate-optimal choices of smoothing parameters and data-driven estimates thereof, and their stochastic behavior can be well-approximated by classical first-order asymptotics. SDREs exist for a wide range of parameters of interest, particularly in semiparametric missing data and causal inference models. We illustrate our method with a simulation exercise.
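To see the doubly robust structure the abstract refers to, here is a minimal augmented inverse-probability-weighting (AIPW) sketch for estimating a mean with data missing at random, with flexible random-forest first stages standing in for the nonparametric estimates. It illustrates the general doubly robust form only, not the paper's SDRE construction or its smoothing-parameter theory; all names are ours.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

def aipw_mean(y, d, X):
    """Doubly robust (AIPW) estimate of E[Y] when Y is observed only if d == 1.
    y, d, X are NumPy arrays; y may be arbitrary (e.g., NaN) where d == 0.

    First stage (flexible): propensity e(x) = P(d = 1 | x) and outcome
    regression m(x) = E[Y | x, d = 1].
    Second stage: plug both into the doubly robust moment and average.
    """
    e_hat = RandomForestClassifier(n_estimators=200, min_samples_leaf=20,
                                   random_state=0).fit(X, d).predict_proba(X)[:, 1]
    e_hat = np.clip(e_hat, 0.05, 0.95)          # trim extreme propensities

    m_model = RandomForestRegressor(n_estimators=200, min_samples_leaf=20,
                                    random_state=0).fit(X[d == 1], y[d == 1])
    m_hat = m_model.predict(X)

    y_filled = np.where(d == 1, y, 0.0)         # y is unused where d == 0
    psi = m_hat + d * (y_filled - m_hat) / e_hat
    return psi.mean(), psi.std(ddof=1) / np.sqrt(len(psi))
```

The estimate stays consistent if either the propensity score or the outcome regression is estimated consistently, which is the double robustness that the second-step structure inherits from its parametric counterpart.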

Relevance: 80.00%

Abstract:

This paper proposes a two-step procedure to back out the conditional alpha of a given stock using high-frequency data. We first estimate the realized factor loadings of the stocks, and then retrieve their conditional alphas by estimating the conditional expectation of their risk-adjusted returns. We start with the underlying continuous-time stochastic process that governs the dynamics of every stock price and then derive the conditions under which we may consistently estimate the daily factor loadings and the resulting conditional alphas. We also contribute empirically to the conditional CAPM literature by examining the main drivers of the conditional alphas of the S&P 100 index constituents from January 2001 to December 2008. In addition, to confirm whether these conditional alphas indeed relate to pricing errors, we assess the performance of both cross-sectional and time-series momentum strategies based on the conditional alpha estimates. The findings are very promising in that these strategies not only seem to perform pretty well both in absolute and relative terms, but also exhibit virtually no systematic exposure to the usual risk factors (namely, market, size, value and momentum portfolios).
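A single-factor simplification of the two steps, using daily realized moments built from intraday returns and a rolling conditional mean of the risk-adjusted return, might look like the sketch below; the paper works with several factors and a formal conditional-expectation estimator, so this is illustrative only and all names are ours.

```python
import numpy as np

def realized_beta(intraday_ret, intraday_mkt):
    """Step 1: daily realized factor loading from intraday returns.
    Both arrays have shape (days, intraday_periods)."""
    rcov = np.sum(intraday_ret * intraday_mkt, axis=1)   # realized covariance
    rvar = np.sum(intraday_mkt ** 2, axis=1)             # realized market variance
    return rcov / rvar

def conditional_alpha(daily_ret, daily_mkt, beta, window=22):
    """Step 2: conditional alpha as a rolling conditional expectation of the
    risk-adjusted return (here a simple trailing mean over about one month)."""
    adj = daily_ret - beta * daily_mkt                    # risk-adjusted return
    alpha = np.full(len(adj), np.nan)
    for t in range(window, len(adj)):
        alpha[t] = adj[t - window:t].mean()
    return alpha
```

The sums over intraday observations are the usual realized covariance and realized variance estimators for high-frequency returns.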

Relevance: 80.00%

Abstract:

The peculiarities of banking activity, usually seen as fundamental to the pursuit of development and heavily shaped by law, have stimulated the emergence of an international regime for regulating the sector. This came about in the wake of the work of international organizations such as the Basel Committee (BCBS) and the Financial Stability Board (FSB), and from the perception that we live in a world in which markets are highly interconnected but remain nationally regulated. Setting aside the debate over the merit and effectiveness of the regulatory standards proposed by these organizations, in a context in which many countries seek to implement them, this work examines the elements that define the appropriate degree of implementation discretion granted when those standards are formulated. The analysis of this problem suggests two extremes to be avoided: regulatory arbitrage and one-size-fits-all. Avoiding regulatory arbitrage is a concern of the banking regulation literature, which translates into limiting sharp variation among the regulatory regimes of different jurisdictions. This gives rise to three vectors favouring a lower degree of discretion, represented by the goals of greater coordination, greater competitiveness, and avoiding a regulatory race to the bottom among countries. Avoiding one-size-fits-all, in turn, is a recurring concern of the law and development literature, which points to the need to take local peculiarities into account when formulating regulatory policy. This gives rise to three other vectors, this time towards a greater degree of discretion: concerns with the efficiency of the measures adopted, with guaranteeing room for manoeuvre that respects countries' self-determination (at least mitigating possible democratic deficits in the setting of international standards), and with the practical feasibility of experimentalism. To analyse this problem, taking these extremes into account, a two-part strategy is proposed: the construction of a theoretical framework and the testing of a research hypothesis, according to which a specific case of banking regulation can show how these elements interact in defining the degree of discretion. First, after the necessary contextualization and methodological description, a theoretical framework for the problem is built in light of the banking regulation literature and of the tools used in the discussions on the impact of law on development, discussions that have for years addressed the formulation of international standards and their implementation in diverse national contexts. Also in this first stage, and as part of laying the theoretical foundations, an excursus examines the hypothesis that trust in the banking system is a kind of common, as well as its possible consequences. Based on this framework, the segment of banking regulation concerning deposit insurers is chosen for a case study. This analysis, drawing on bibliographic and empirical research, seeks to show with what degree of discretion, and in what way, international standards in this segment were formulated and implemented. Finally, the work analyses how the vectors determining the degree of discretion interact in the case of deposit insurers, and what suggestions can be drawn from this examination for the other segments of banking regulation.

Relevance: 80.00%

Abstract:

The traditional representation of the term structure of interest rates in three latent factors (level, slope and curvature) was originally formulated by Charles R. Nelson and Andrew F. Siegel in 1987. Since then, many applications have been developed by academics and market practitioners on the basis of this class of models, mainly with the intention of anticipating movements in yield curves. At the same time, recent studies such as Diebold, Piazzesi and Rudebusch (2010), Diebold, Rudebusch and Aruoba (2006), Pooter, Ravazallo and van Dijk (2010) and Li, Niu and Zeng (2012) suggest that incorporating macroeconomic information into term-structure (ETTJ) models can provide greater predictive power. In this work, the dynamic version of the Nelson-Siegel model, as proposed by Diebold and Li (2006), is compared with an analogous model that includes exogenous macroeconomic variables. In parallel, two different estimation methods are tested: the traditional two-step approach (Two-Step DNS) and estimation with the Extended Kalman Filter, which allows the parameters to be estimated recursively each time new information is added to the system. Regarding the models tested, the results are rather inconclusive, pointing to only a marginal improvement in the in-sample and out-of-sample estimates when the exogenous variables are included. The Extended Kalman Filter, on the other hand, showed more consistent results than the two-step method for virtually all of the time horizons studied.
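The traditional two-step Diebold-Li estimation mentioned in the abstract can be sketched as follows: fix the decay parameter λ (0.0609 for maturities in months, the value Diebold and Li use), regress each date's yields on the three Nelson-Siegel loadings to obtain the level, slope and curvature factors, and then fit simple AR(1) models to the factor series for forecasting. This is only a sketch: the macro-augmented variant would add the exogenous variables to the factor dynamics, and the Extended Kalman Filter alternative is not shown.

```python
import numpy as np

def ns_loadings(maturities, lam=0.0609):
    """Nelson-Siegel factor loadings (maturities in months, Diebold-Li lambda)."""
    tau = np.asarray(maturities, float)
    slope = (1 - np.exp(-lam * tau)) / (lam * tau)
    curve = slope - np.exp(-lam * tau)
    return np.column_stack([np.ones_like(tau), slope, curve])

def two_step_dns(yields, maturities, lam=0.0609):
    """Step 1: cross-sectional OLS of each date's yield curve on the loadings,
    giving time series of level, slope and curvature factors.
    Step 2: AR(1) fits to each factor, used for a one-step-ahead curve forecast.
    `yields` has shape (T, n_maturities)."""
    X = ns_loadings(maturities, lam)                       # (n_maturities, 3)
    factors, *_ = np.linalg.lstsq(X, np.asarray(yields, float).T, rcond=None)
    factors = factors.T                                     # (T, 3)

    ar1 = []
    for j in range(3):                                      # AR(1) per factor
        f = factors[:, j]
        A = np.column_stack([np.ones(len(f) - 1), f[:-1]])
        c, phi = np.linalg.lstsq(A, f[1:], rcond=None)[0]
        ar1.append((c, phi))

    f_last = factors[-1]
    f_next = np.array([c + phi * f_last[j] for j, (c, phi) in enumerate(ar1)])
    return factors, ar1, X @ f_next                         # forecasted curve
```

Calling two_step_dns(yields, maturities) returns the factor time series, the AR(1) coefficients and a one-step-ahead forecast of the whole curve.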