969 results for Consistent term structure models


Relevance:

100.00%

Publisher:

Abstract:

The paper provides one of the first applications of the double bootstrap procedure (Simar and Wilson 2007) in a two-stage estimation of the effect of environmental variables on non-parametric estimates of technical efficiency. This procedure enables consistent inference within models explaining efficiency scores, while simultaneously producing standard errors and confidence intervals for these scores. The application covers 88 livestock and 256 crop farms in the Czech Republic, split into individual and corporate farms.
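The second-stage idea can be sketched in a few lines. This is a deliberately minimal illustration, not Simar and Wilson's full algorithm: the data are simulated, the first-stage DEA step is skipped (scores are generated directly), and a plain least-squares fit stands in for their truncated maximum-likelihood regression.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(0)

# Hypothetical second-stage data: efficiency scores bounded above by 1,
# explained by one environmental variable z.  In the paper the scores
# would come from a first-stage DEA; here they are simulated directly.
n = 200
z = rng.uniform(0, 1, n)
scores = np.clip(0.4 + 0.3 * z + 0.1 * rng.standard_normal(n), None, 1.0)

X = np.column_stack([np.ones(n), z])

def fit(y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

b_hat = fit(scores)
mu = X @ b_hat
resid_sd = np.std(scores - mu, ddof=2)

# Parametric bootstrap: draw errors from a normal truncated so that the
# regenerated scores respect the upper bound, refit, collect slopes.
B = 1000
slopes = np.empty(B)
upper = (1.0 - mu) / resid_sd            # per-observation truncation point
for b in range(B):
    eps = truncnorm.rvs(-np.inf, upper, scale=resid_sd, random_state=rng)
    slopes[b] = fit(mu + eps)[1]

lo, hi = np.percentile(slopes, [2.5, 97.5])
print(f"slope {b_hat[1]:.3f}, 95% bootstrap CI ({lo:.3f}, {hi:.3f})")
```

The bootstrap distribution of the slope delivers exactly what the abstract claims the procedure provides: a confidence interval for the effect of the environmental variable that respects the bounded nature of the scores.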

Relevance:

100.00%

Publisher:

Abstract:

Greek speakers say "ουρά", Germans "Schwanz" and the French "queue" to describe what English speakers call a 'tail', but all of these languages use a related form of 'two' to describe the number after one. Among more than 100 Indo-European languages and dialects, the words for some meanings (such as 'tail') evolve rapidly, being expressed across languages by dozens of unrelated words, while others evolve much more slowly, such as the number 'two', for which all Indo-European language speakers use the same related word form(1). No general linguistic mechanism has been advanced to explain this striking variation in rates of lexical replacement among meanings. Here we use four large and divergent language corpora (English(2), Spanish(3), Russian(4) and Greek(5)) and a comparative database of 200 fundamental vocabulary meanings in 87 Indo-European languages(6) to show that the frequency with which these words are used in modern language predicts their rate of replacement over thousands of years of Indo-European language evolution. Across all 200 meanings, frequently used words evolve at slower rates and infrequently used words evolve more rapidly. This relationship holds separately and identically across parts of speech for each of the four language corpora, and accounts for approximately 50% of the variation in historical rates of lexical replacement. We propose that the frequency with which specific words are used in everyday language exerts a general and law-like influence on their rates of evolution. Our findings are consistent with social models of word change that emphasize the role of selection, and suggest that owing to the ways that humans use language, some words will evolve slowly and others rapidly across all languages.
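The core claim is a regression of replacement rate on usage frequency. A short simulation makes it concrete; every number below is invented for illustration (the corpora and comparative database are not reproduced), with noise chosen so the fit explains roughly half the variance, as reported.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated stand-in: 200 meanings, log usage frequency negatively
# related to the rate of lexical replacement.
n = 200
log_freq = rng.normal(0.0, 1.0, n)
rate = -0.7 * log_freq + rng.normal(0.0, 0.7, n)

slope, intercept = np.polyfit(log_freq, rate, 1)
pred = slope * log_freq + intercept
r2 = 1 - np.sum((rate - pred) ** 2) / np.sum((rate - rate.mean()) ** 2)
print(f"slope = {slope:.2f}, R^2 = {r2:.2f}")
```

A negative slope means frequent words are replaced more slowly; an R² near 0.5 corresponds to "accounts for approximately 50% of the variation".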

Relevance:

100.00%

Publisher:

Abstract:

Aerosol indirect effects continue to constitute one of the most important uncertainties for anthropogenic climate perturbations. Within the international AEROCOM initiative, the representation of aerosol-cloud-radiation interactions in ten different general circulation models (GCMs) is evaluated using three satellite datasets. The focus is on stratiform liquid water clouds, since most GCMs do not include ice nucleation effects and none of the models explicitly parameterises aerosol effects on convective clouds. We compute statistical relationships between aerosol optical depth (τa) and various cloud and radiation quantities in a manner that is consistent between the models and the satellite data. It is found that the model-simulated influence of aerosols on cloud droplet number concentration (Nd) compares relatively well to the satellite data, at least over the ocean. The relationship between τa and liquid water path is simulated much too strongly by the models. This suggests that the implementation of the second aerosol indirect effect, mainly in terms of an autoconversion parameterisation, has to be revisited in the GCMs. A positive relationship between total cloud fraction (fcld) and τa, as found in the satellite data, is simulated by the majority of the models, albeit less strongly than in the satellite data in most of them. In a discussion of the hypotheses proposed in the literature to explain the satellite-derived strong fcld–τa relationship, our results indicate that none can be identified as a unique explanation. Relationships similar to the ones found in satellite data between τa and cloud top temperature or outgoing long-wave radiation (OLR) are simulated by only a few GCMs. The GCMs that simulate a negative OLR–τa relationship show a strong positive correlation between τa and fcld.
The short-wave total aerosol radiative forcing as simulated by the GCMs is strongly influenced by the simulated anthropogenic fraction of τa and by parameterisation assumptions such as a lower bound on Nd. Nevertheless, the strengths of the statistical relationships are good predictors for the aerosol forcings in the models. An estimate of the total short-wave aerosol forcing, inferred from the combination of these predictors for the modelled forcings with the satellite-derived statistical relationships, yields a global annual mean value of −1.5 ± 0.5 W m−2. In an alternative approach, the radiative flux perturbation due to anthropogenic aerosols can be broken down into a component over the cloud-free portion of the globe (approximately the aerosol direct effect) and a component over the cloudy portion of the globe (approximately the aerosol indirect effect). An estimate obtained by scaling these simulated clear- and cloudy-sky forcings with estimates of anthropogenic τa and satellite-retrieved Nd–τa regression slopes, respectively, yields a global annual-mean aerosol direct effect estimate of −0.4 ± 0.2 W m−2 and a cloudy-sky (aerosol indirect effect) estimate of −0.7 ± 0.5 W m−2, with a total estimate of −1.2 ± 0.4 W m−2.

Relevance:

100.00%

Publisher:

Abstract:

Atmospheric CO2 concentration has varied from minima of 170-200 ppm in glacials to maxima of 280-300 ppm in the recent interglacials. Photosynthesis by C-3 plants is highly sensitive to CO2 concentration variations in this range. Physiological consequences of the CO2 changes should therefore be discernible in palaeodata. Several lines of evidence support this expectation. Reduced terrestrial carbon storage during glacials, indicated by the shift in stable isotope composition of dissolved inorganic carbon in the ocean, cannot be explained by climate or sea-level changes. It is however consistent with predictions of current process-based models that propagate known physiological CO2 effects into net primary production at the ecosystem scale. Restricted forest cover during glacial periods, indicated by pollen assemblages dominated by non-arboreal taxa, cannot be reproduced accurately by palaeoclimate models unless CO2 effects on C-3-C-4 plant competition are also modelled. It follows that methods to reconstruct climate from palaeodata should account for CO2 concentration changes. When they do so, they yield results more consistent with palaeoclimate models. In conclusion, the palaeorecord of the Late Quaternary, interpreted with the help of climate and ecosystem models, provides evidence that CO2 effects at the ecosystem scale are neither trivial nor transient.

Relevance:

100.00%

Publisher:

Abstract:

These findings strongly suggest that CFPE do not generally result from increased bacterial density within the airways. Instead, data presented here are consistent with alternative models of pulmonary exacerbation.

Relevance:

100.00%

Publisher:

Abstract:

Climate change due to anthropogenic greenhouse gas emissions is expected to increase the frequency and intensity of precipitation events, which is likely to affect the probability of flooding into the future. In this paper we use river flow simulations from nine global hydrology and land surface models to explore uncertainties in the potential impacts of climate change on flood hazard at global scale. As an indicator of flood hazard we looked at changes in the 30-y return level of 5-d average peak flows under representative concentration pathway RCP8.5 at the end of this century. Not everywhere does climate change result in an increase in flood hazard: decreases in the magnitude and frequency of the 30-y return level of river flow occur at roughly one-third (20-45%) of the global land grid points, particularly in areas where the hydrograph is dominated by the snowmelt flood peak in spring. In most model experiments, however, an increase in flooding frequency was found in more than half of the grid points. The current 30-y flood peak is projected to occur in more than 1 in 5 y across 5-30% of land grid points. The large-scale patterns of change are remarkably consistent among impact models and even the driving climate models, but at local scale and in individual river basins there can be disagreement even on the sign of change, indicating large modeling uncertainty which needs to be taken into account in local adaptation studies.
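The hazard indicator used here, an N-year return level, is the flow exceeded with probability 1/N in any given year. A minimal sketch, assuming synthetic annual maxima and a Gumbel (extreme-value) fit in place of the models' simulated river flows:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic annual maxima of 5-day average peak flow (m^3/s) stand in
# for the hydrological model output used in the paper.
annual_max = stats.gumbel_r.rvs(loc=1000, scale=200, size=60, random_state=rng)

# Fit a Gumbel distribution and read off the flow exceeded with
# probability 1/30 in any given year: the 30-year return level.
loc, scale = stats.gumbel_r.fit(annual_max)
return_level_30y = stats.gumbel_r.ppf(1 - 1 / 30, loc=loc, scale=scale)
print(f"30-y return level: {return_level_30y:.0f} m^3/s")
```

Repeating this for present-day and end-of-century simulated flows, grid point by grid point, gives the change in flood hazard that the paper maps globally.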

Relevance:

100.00%

Publisher:

Abstract:

The paper discusses how variations in the pattern of convective plasma flows should be included in self-consistent time-dependent models of the coupled ionosphere-thermosphere system. The author shows how these variations depend upon the mechanism by which the solar wind flow excites the convection. The modelling of these effects is not just of relevance to the polar ionosphere. This is because the influence of convection is not confined to high latitudes: the resultant heating and composition changes in the thermosphere are communicated to lower latitudes by the winds, which are also greatly modified by the plasma convection. These thermospheric changes alter the global distribution of plasma by modulating the rates of the chemical reactions which are responsible for the loss of plasma. Hence the modelling of these high-latitude processes is of relevance to the design and operation of HF communication, radar and navigation systems worldwide.

Relevance:

100.00%

Publisher:

Abstract:

The article examines whether commodity risk is priced in the cross-section of global equity returns. We employ a long-only equally-weighted portfolio of commodity futures and a term structure portfolio that captures phases of backwardation and contango as mimicking portfolios for commodity risk. We find that equity-sorted portfolios with greater sensitivities to the excess returns of the backwardation and contango portfolio command higher average excess returns, suggesting that when measured appropriately, commodity risk is pervasive in stocks. Our conclusions are robust to the addition to the pricing model of financial, macroeconomic and business cycle-based risk factors.
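The sorting step described in this abstract can be sketched as follows, with simulated stock returns and a simulated commodity term-structure factor standing in for the actual data: each stock's sensitivity (beta) to the factor is estimated by time-series regression, and stocks are grouped into quintile portfolios by beta.

```python
import numpy as np

rng = np.random.default_rng(3)

T, N = 120, 100                          # months, stocks (hypothetical)
factor = rng.normal(0.005, 0.03, T)      # invented term-structure factor excess return
true_beta = rng.normal(0.0, 1.0, N)
returns = np.outer(factor, true_beta) + rng.normal(0, 0.05, (T, N))

# Time-series regression of each stock on the factor gives its sensitivity.
X = np.column_stack([np.ones(T), factor])
betas = np.linalg.lstsq(X, returns, rcond=None)[0][1]

# Sort stocks into quintile portfolios by estimated beta.
quintile = np.digitize(betas, np.quantile(betas, [0.2, 0.4, 0.6, 0.8]))
port_ret = [returns[:, quintile == q].mean() for q in range(5)]
print("mean monthly return per beta quintile:", np.round(port_ret, 4))
```

The paper's pricing test then asks whether average excess returns rise across these beta-sorted portfolios, which would indicate that commodity risk is priced.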

Relevance:

100.00%

Publisher:

Abstract:

In this paper we deal with a Bayesian analysis for right-censored survival data suitable for populations with a cure rate. We consider a cure rate model based on the negative binomial distribution, encompassing as a special case the promotion time cure model. Bayesian analysis is based on Markov chain Monte Carlo (MCMC) methods. We also present some discussion on model selection and an illustration with a real dataset.
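A minimal sketch of the model family, assuming the standard parametrisation in the cure rate literature: with latent-cause c.d.f. F, the negative binomial cure rate model has population survival S_pop(t) = (1 + φθF(t))^(−1/φ), which tends to the promotion time model exp(−θF(t)) as φ → 0; the cured fraction is the limit of S_pop as t → ∞. Parameter values below are illustrative, not estimates from the paper's dataset.

```python
import numpy as np

def surv_negbin(t, theta, phi, F):
    """Population survival of the negative binomial cure rate model."""
    return (1.0 + phi * theta * F(t)) ** (-1.0 / phi)

def surv_promotion(t, theta, F):
    """Promotion time cure model: the phi -> 0 limit."""
    return np.exp(-theta * F(t))

F = lambda t: 1.0 - np.exp(-0.5 * t)      # latent time-to-event c.d.f. (illustrative)
t = np.linspace(0.0, 20.0, 5)
theta = 1.2                               # mean number of latent causes (illustrative)

print("negative binomial (phi=1):", surv_negbin(t, theta, 1.0, F).round(3))
print("promotion time (phi->0): ", surv_promotion(t, theta, F).round(3))
print("cure fractions:", round((1 + theta) ** -1.0, 3), round(float(np.exp(-theta)), 3))
```

The extra dispersion parameter φ is what makes the promotion time model a special case, as the abstract states; the Bayesian fit in the paper would place priors on (θ, φ) and the parameters of F and sample them by MCMC.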

Relevance:

100.00%

Publisher:

Abstract:

We present a large-scale systematics of charge densities, excitation energies and deformation parameters for hundreds of heavy nuclei. The systematics is based on a generalized rotation-vibration model for the quadrupole and octupole modes and takes into account second-order contributions of the deformations as well as the effects of finite diffuseness values for the nuclear densities. We compare our results with the predictions of classical surface vibrations in the hydrodynamical approximation.

Relevance:

100.00%

Publisher:

Abstract:

Alzheimer's disease is an ultimately fatal neurodegenerative disease, and BACE-1 has become an attractive, validated target for its therapy, with more than a hundred crystal structures deposited in the PDB. In the present study, we present a new methodology that integrates ligand-based methods with structural information derived from the receptor. 128 BACE-1 inhibitors recently disclosed by GlaxoSmithKline R&D were selected specifically because the crystal structures of 9 of these compounds complexed to BACE-1, as well as five closely related analogs, have been made available. A new fragment-guided approach was designed to incorporate this wealth of structural information into a CoMFA study, and the methodology was systematically compared to other popular approaches, such as docking, for generating a molecular alignment. The influence of the partial-charge calculation method was also analyzed. Several consistent and predictive models are reported, including one with r² = 0.88, q² = 0.69 and r²pred = 0.72. The models obtained with the new methodology performed consistently better than those obtained by other methodologies, particularly in terms of external predictive power. The visual analyses of the contour maps in the context of the enzyme drew attention to a number of possible opportunities for the development of analogs with improved potency. These results suggest that 3D-QSAR studies may benefit from the additional structural information added by the presented methodology.
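The q² statistic quoted above is a leave-one-out cross-validated r²: q² = 1 − PRESS/SS, where PRESS sums squared leave-one-out prediction errors. A minimal sketch on simulated descriptors (not the GSK BACE-1 set), using ordinary least squares in place of the PLS fit normally used in CoMFA:

```python
import numpy as np

rng = np.random.default_rng(4)

n, p = 40, 3
X = rng.normal(size=(n, p))                                  # hypothetical descriptors
y = X @ np.array([1.0, -0.5, 0.3]) + rng.normal(0, 0.3, n)   # hypothetical activities

A = np.column_stack([np.ones(n), X])
press = 0.0
for i in range(n):                       # leave-one-out loop
    mask = np.arange(n) != i
    coef = np.linalg.lstsq(A[mask], y[mask], rcond=None)[0]
    press += (y[i] - A[i] @ coef) ** 2

q2 = 1 - press / np.sum((y - y.mean()) ** 2)
print(f"q^2 = {q2:.2f}")
```

Because each prediction is made for a compound excluded from the fit, q² is always below the in-sample r² and is the more honest of the two internal-validation figures.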

Relevance:

100.00%

Publisher:

Abstract:

In the seismic analysis of complex structures, the mathematical model should include not only the irregular distributions of mass and stiffness but also the three-dimensional nature of the seismic excitation. In practice, the large number of degrees of freedom involved restricts this type of analysis to settings where large computers are available. This work presents a simplified procedure for evaluating the amplification of seismic motion in soil layers. Its application would make it possible to establish criteria for deciding when soil-structure interaction models more complex than those habitually used are required. The proposed procedure has the following characteristics: A- Rigid rock motion defined in terms of three orthogonal components, with a vertical direction of propagation. B- The constitutive equation of the soil includes non-linearity, plasticity, load-history dependence, energy dissipation and volume change. C- The soil profile is discretised as a system of lumped masses. An incremental formulation of the equations of motion is used, with direct integration in the time domain. The pseudo-elastic properties of the soil are evaluated at each integration interval as a function of the stress state resulting from the simultaneous action of the three components of the excitation. The correct behaviour of the proposed procedure is verified through one-dimensional analyses (horizontal excitation), including comparative studies against solutions presented by several authors. Three-dimensional analyses (simultaneous action of the three components of the excitation) using real seismic records are also presented. The influence of the dimension of the analysis (one three-dimensional analysis versus three one-dimensional analyses) on the response of soil layers subjected to different levels of excitation is analysed; that is, the limitation of the principle of superposition of effects.
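The discretisation and time stepping in items B-C can be illustrated with a deliberately simplified version: one-dimensional and linear-elastic with constant stiffness (the procedure above is nonlinear and three-component), a lumped-mass shear column excited at the base and integrated with the Newmark average-acceleration scheme. All material values are invented.

```python
import numpy as np

n = 4                                   # lumped masses (soil layers)
m = np.full(n, 2.0e5)                   # mass per layer, kg (invented)
k = np.full(n, 8.0e7)                   # interlayer shear stiffness, N/m (invented)

# Tridiagonal stiffness matrix of a shear column fixed at the rock base.
K = np.zeros((n, n))
for i in range(n):
    K[i, i] = k[i] + (k[i + 1] if i + 1 < n else 0.0)
    if i + 1 < n:
        K[i, i + 1] = K[i + 1, i] = -k[i + 1]
M = np.diag(m)
C = 0.05 * M + 0.002 * K                # Rayleigh damping (assumed)

dt, steps = 0.01, 400
time = np.arange(steps) * dt
a_g = 0.5 * np.sin(2 * np.pi * 2.0 * time) * np.exp(-0.5 * time)  # base accel., m/s^2

# Newmark average-acceleration time stepping (unconditionally stable).
u = np.zeros(n); v = np.zeros(n); a = np.zeros(n)
Keff = K + 2 / dt * C + 4 / dt**2 * M
top = []
for i in range(steps):
    p = -m * a_g[i] + M @ (4 / dt**2 * u + 4 / dt * v + a) + C @ (2 / dt * u + v)
    u_new = np.linalg.solve(Keff, p)
    v_new = 2 / dt * (u_new - u) - v
    a_new = 4 / dt**2 * (u_new - u) - 4 / dt * v - a
    u, v, a = u_new, v_new, a_new
    top.append(u[-1])

peak = float(np.max(np.abs(top)))
print(f"peak relative displacement at the surface: {peak:.4f} m")
```

In the full procedure, the constant K would be replaced at each integration interval by pseudo-elastic stiffnesses evaluated from the current three-component stress state, which is exactly where the nonlinearity and load-history dependence enter.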

Relevance:

100.00%

Publisher:

Abstract:

The aim of this article is to examine how decisions on the basic interest rate in Brazil (a strong signalling mechanism in monetary policy) affect the term structure of the yield curve. Unlike other studies of the Brazilian case, this one evaluates the evolution of the predictability of monetary policy decisions after the introduction of the inflation-targeting regime and also compares this evolution with other countries. The methodology is an event study over two distinct periods: from January 2000 to August 2003, after the introduction of inflation targeting, and from September 2003 to July 2008, when the targeting regime had reached a degree of maturity. The results indicate that: 1) surprise effects on the yield curve have become smaller; 2) the explanatory power of monetary policy actions has increased; 3) the market has been adjusting its expectations about the interest rate decision up to three days in advance; 4) the predictability and transparency of monetary policy decisions in Brazil have increased and are close to those observed in the United States and Germany, and higher than in the Italian and British cases.
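The event-study measurement behind results 1-2 can be sketched as follows: each Selic decision is split into an expected part (priced into a short market rate before the meeting) and a surprise, and yield changes are regressed on the surprise. All figures below are invented for illustration; they are not the paper's data.

```python
import numpy as np

decisions = np.array([0.50, -0.25, 0.00, -0.50, 0.25, 0.00])   # Selic moves, p.p.
expected  = np.array([0.45, -0.10, 0.05, -0.40, 0.10, -0.05])  # priced in pre-meeting
surprise = decisions - expected

d_yield_1y = np.array([0.06, -0.12, -0.04, -0.09, 0.13, 0.05]) # 1-y yield changes, p.p.

X = np.column_stack([np.ones_like(surprise), surprise])
alpha, beta = np.linalg.lstsq(X, d_yield_1y, rcond=None)[0]
resid = d_yield_1y - X @ np.array([alpha, beta])
r2 = 1 - np.sum(resid ** 2) / np.sum((d_yield_1y - d_yield_1y.mean()) ** 2)
print(f"yield response per p.p. of surprise: {beta:.2f} (R^2 = {r2:.2f})")
```

A shrinking average surprise over time corresponds to result 1, and a rising R² of this regression corresponds to result 2.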

Relevance:

100.00%

Publisher:

Abstract:

The main objective of this thesis is to study the relationship between economic activity, inflation and monetary policy with respect to three important aspects. First, the historical perspective of the evolution of the relationship between activity and inflation in economic thought. Second, the analysis of inflation dynamics using a model with microeconomic foundations, namely the New Keynesian Phillips curve, with an application to the Brazilian case. Third, the evaluation of the efficiency of the monetary policy signalling mechanisms used by the Central Bank of Brazil, based on the movements of the term structure of interest rates when the Selic target changes. The central element uniting these essays is the policymaker's need to understand the significant impact of monetary policy actions on the short-run course of the real economy, in order to combine economic growth with price stability. The results of these essays indicate that the New Keynesian model, the outcome of a long development in economic analysis, is a valuable tool for studying the relationship between activity and inflation. A variant of this model was used, with relative success, to study inflation dynamics in Brazil, yielding values for the rigidity of the economy close to the behaviour observed in field surveys. Finally, the predictability of the Central Bank's actions was evaluated, so as to assess the current stage of development of the targeting system in Brazil, through the reaction of the term structure of interest rates to changes in the target for the basic rate (Selic). Comparing the period from 2003 to 2008 with that from 2000 to 2003, the results point to an increase in the predictability of the Central Bank's decisions.
This can be explained by several factors: the public's learning about the Central Bank's behaviour; the lower volatility of the economic environment; and the improvement of the signalling mechanisms and of the operation of the targeting system itself. Comparing the surprise effect in Brazil with those obtained by countries that made significant changes to increase the transparency of monetary policy between 1990 and 1997, the surprise effect on short-term rates in Brazil has fallen significantly. From 2000 to 2003 it was higher than in the United States, Germany and the United Kingdom, and of the same order of magnitude as in Italy. From 2003 to 2008 it is close to the values for the United States and Germany, and lower than those for Italy and the United Kingdom.

Relevance:

100.00%

Publisher:

Abstract:

In this work we develop a strategy for pricing embedded call options. This specific type of option is present in a large number of debentures in the Brazilian market. Because this market offers only a small number of assets, pricing these options is necessary in order to enlarge the set of assets available for analysis. As an intermediate step, one must determine when it is advantageous for the issuer to redeem the debenture early. To this end, we propose a methodology for estimating the term structure of interest rates of the debenture market based on the Nelson-Siegel model.
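The Nelson-Siegel step can be sketched as follows: for a fixed decay parameter λ, the curve y(τ) = β0 + β1·(1−e^(−λτ))/(λτ) + β2·[(1−e^(−λτ))/(λτ) − e^(−λτ)] is linear in its three betas, so estimation reduces to least squares. Maturities and yields below are invented, not debenture market data.

```python
import numpy as np

def ns_basis(tau, lam):
    """Nelson-Siegel regressors: level, slope and curvature loadings."""
    x = lam * tau
    f1 = (1 - np.exp(-x)) / x
    return np.column_stack([np.ones_like(tau), f1, f1 - np.exp(-x)])

tau = np.array([0.25, 0.5, 1.0, 2.0, 3.0, 5.0, 7.0, 10.0])            # years
yields = np.array([11.2, 11.0, 10.8, 10.5, 10.4, 10.3, 10.3, 10.4])   # % p.a. (invented)

lam = 0.6                       # decay parameter held fixed (assumed)
beta, *_ = np.linalg.lstsq(ns_basis(tau, lam), yields, rcond=None)
fitted = ns_basis(tau, lam) @ beta
print("betas (level, slope, curvature):", beta.round(3))
print("max abs fit error (p.p.):", np.abs(fitted - yields).max().round(3))
```

In practice λ is either fixed at a conventional value or chosen by a one-dimensional search over this same least-squares fit; the fitted curve then supplies the discount rates needed to decide when early redemption of the debenture is attractive.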