969 results for Semi-parametric models
Abstract:
This paper explores the changing survival patterns of cereal crop variety innovations in the UK since the introduction of plant breeders’ rights in the mid-1960s. Using non-parametric, semi-parametric and parametric approaches, we examine the determinants of the survival of wheat variety innovations, focusing on the impacts of changes to the Plant Variety Protection (PVP) regime over the last four decades. We find that the period since the introduction of the PVP regime has been characterised by the accelerated development of new varieties and increased private sector participation in the breeding of cereal crop varieties. However, the increased flow of varieties has been accompanied by a sharp decline in the longevity of innovations. These trends may have contributed to a reduction in the returns appropriated by plant breeders from protected variety innovations and may explain the decline of conventional plant breeding in the UK. They may also explain the persistent demand from the seed industry for stronger protection. The strengthening of the PVP regime in conformity with the UPOV Convention of 1991, the introduction of EU-wide protection through the Community Plant Variety Office and the introduction of royalties on farm-saved seed have had a positive effect on the longevity of protected variety innovations, but have not been adequate to offset the long-term decline in survival durations.
Abstract:
We address the problem of automatically identifying and restoring damaged and contaminated images. We suggest a novel approach based on a semi-parametric model. This has two components, a parametric component describing known physical characteristics and a more flexible non-parametric component. The latter avoids the need for a detailed model for the sensor, which is often costly to produce and lacking in robustness. We assess our approach using an analysis of electroencephalographic images contaminated by eye-blink artefacts and highly damaged photographs contaminated by non-uniform lighting. These experiments show that our approach provides an effective solution to problems of this type.
Abstract:
A detailed spectrally-resolved extraterrestrial solar spectrum (ESS) is important for line-by-line radiative transfer modeling in the near-infrared (near-IR). Very few observationally-based high-resolution ESS are available in this spectral region. Consequently, the theoretically-calculated ESS by Kurucz has been widely adopted. We present the CAVIAR (Continuum Absorption at Visible and Infrared Wavelengths and its Atmospheric Relevance) ESS, which is derived using the Langley technique applied to calibrated observations using a ground-based high-resolution Fourier transform spectrometer (FTS) in atmospheric windows from 2000–10000 cm⁻¹ (1–5 μm). There is good agreement between the strengths and positions of solar lines between the CAVIAR and the satellite-based ACE-FTS (Atmospheric Chemistry Experiment-FTS) ESS in the spectral region where they overlap, and good agreement with other ground-based FTS measurements in two near-IR windows. However, there are significant differences in structure between the CAVIAR ESS and spectra from semi-empirical models. In addition, we found a difference of up to 8 % in the absolute (and hence the wavelength-integrated) irradiance between the CAVIAR ESS and that of Thuillier et al., which was based on measurements from the Atmospheric Laboratory for Applications and Science satellite and other sources. In many spectral regions, this difference is significant, as the coverage factor k = 2 (or 95 % confidence limit) uncertainties in the two sets of observations do not overlap. Since the total solar irradiance is relatively well constrained, if the CAVIAR ESS is correct, then this would indicate an integrated “loss” of solar irradiance of about 30 W m⁻² in the near-IR that would have to be compensated by an increase at other wavelengths.
Abstract:
Forecasting wind power is an important part of a successful integration of wind power into the power grid. Forecasts with lead times longer than 6 h are generally made by using statistical methods to post-process forecasts from numerical weather prediction systems. Two major problems that complicate this approach are the non-linear relationship between wind speed and power production and the limited range of power production between zero and nominal power of the turbine. In practice, these problems are often tackled by using non-linear non-parametric regression models. However, such an approach ignores valuable and readily available information: the power curve of the turbine's manufacturer. Much of the non-linearity can be directly accounted for by transforming the observed power production into wind speed via the inverse power curve so that simpler linear regression models can be used. Furthermore, the fact that the transformed power production has a limited range can be taken care of by employing censored regression models. In this study, we evaluate quantile forecasts from a range of methods: (i) using parametric and non-parametric models, (ii) with and without the proposed inverse power curve transformation and (iii) with and without censoring. The results show that with our inverse (power-to-wind) transformation, simpler linear regression models with censoring perform equally or better than non-linear models with or without the frequently used wind-to-power transformation.
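The inverse power-curve (power-to-wind) transformation described above can be sketched as follows; the curve points and turbine ratings below are hypothetical, not taken from the study:

```python
import numpy as np

# Hypothetical manufacturer power curve: wind speed (m/s) -> power (kW).
# Production is censored at 0 below cut-in and at nominal power (2000 kW)
# at and above rated speed.
curve_speed = np.array([3.0, 5.0, 7.0, 9.0, 11.0, 13.0])
curve_power = np.array([0.0, 150.0, 600.0, 1300.0, 1900.0, 2000.0])

def inverse_power_curve(power_kw):
    """Map observed power back to an 'equivalent' wind speed.

    Only well-defined on the strictly increasing part of the curve;
    censored observations (zero or nominal power) map to the boundary
    speeds, which is why censored regression is needed downstream.
    """
    power_kw = np.clip(power_kw, curve_power[0], curve_power[-1])
    return np.interp(power_kw, curve_power, curve_speed)

observed_power = np.array([0.0, 400.0, 1600.0, 2000.0])
equiv_speed = inverse_power_curve(observed_power)
```

After this transformation, the equivalent wind speed can be modelled with simpler linear (censored) regression against forecast wind speed, as the abstract argues.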
Abstract:
A number of recent works have introduced statistical methods for detecting genetic loci that affect phenotypic variability, which we refer to as variability-controlling quantitative trait loci (vQTL). These are genetic variants whose allelic state predicts how much phenotype values will vary about their expected means. Such loci are of great potential interest in both human and non-human genetic studies, one reason being that a detected vQTL could represent a previously undetected interaction with other genes or environmental factors. The simultaneous publication of these new methods in different journals has in many cases precluded opportunity for comparison. We survey some of these methods, the respective trade-offs they imply, and the connections between them. The methods fall into three main groups: classical non-parametric, fully parametric, and semi-parametric two-stage approximations. Choosing between alternatives involves balancing the need for robustness, flexibility, and speed. For each method, we identify important assumptions and limitations, including those of practical importance, such as their scope for including covariates and random effects. We show in simulations that both parametric methods and their semi-parametric approximations can give elevated false positive rates when they ignore mean-variance relationships intrinsic to the data generation process. We conclude that choice of method depends on the trait distribution, the need to include non-genetic covariates, and the population size and structure, coupled with a critical evaluation of how these fit with the assumptions of the statistical model.
Abstract:
The reliable evaluation of flood forecasts is a crucial problem for assessing flood risk and the consequent damages. Different hydrological models (distributed, semi-distributed or lumped) have been proposed in order to deal with this issue. The choice of the proper model structure has been investigated by many authors and is one of the main sources of uncertainty in the correct evaluation of the outflow hydrograph. In addition, the recent increase in data availability makes it possible to update hydrological models in response to real-time observations. For these reasons, the aim of this work is to evaluate the effect of different structures of a semi-distributed hydrological model on the assimilation of distributed, uncertain discharge observations. The study was applied to the Bacchiglione catchment, located in Italy. The first methodological step was to divide the basin into sub-basins according to topographic characteristics. Secondly, two different structures of the semi-distributed hydrological model were implemented in order to estimate the outflow hydrograph. Then, synthetic observations of uncertain discharge were generated as a function of the observed and simulated flow at the basin outlet, and assimilated into the semi-distributed models using a Kalman filter. Finally, different spatial patterns of sensor locations were assumed to update the model state in response to the uncertain discharge observations. The results of this work pointed out that, overall, the assimilation of uncertain observations can improve hydrological model performance. In particular, it was found that the model structure is an important factor, difficult to characterise, since it can induce different forecasts in terms of outflow discharge. This study is partly supported by the FP7 EU Project WeSenseIt.
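A minimal sketch of the Kalman filter analysis step used to assimilate an uncertain discharge observation into a (here, scalar) model state; all numbers are illustrative, not from the Bacchiglione study:

```python
def kalman_update(state, state_var, obs, obs_var):
    """One scalar Kalman filter analysis step.

    `state` is the model forecast (e.g. simulated discharge at a
    sub-basin outlet), `state_var` its error variance, `obs` the
    uncertain discharge observation and `obs_var` its error variance.
    """
    gain = state_var / (state_var + obs_var)   # Kalman gain in [0, 1]
    new_state = state + gain * (obs - state)   # corrected state
    new_var = (1.0 - gain) * state_var         # reduced uncertainty
    return new_state, new_var

# Hypothetical numbers: the model forecasts 120 m^3/s with variance 400;
# an uncertain sensor reports 100 m^3/s with variance 100.
state, var = kalman_update(120.0, 400.0, 100.0, 100.0)
```

The noisier the observation (larger `obs_var`), the smaller the gain and the less the model state is pulled toward it, which is how the uncertainty of the synthetic observations enters the assimilation.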
Abstract:
This work investigates the impact of schooling on income distribution in states/regions of Brazil. Using a semi-parametric model, discussed in DiNardo, Fortin & Lemieux (1996), we measure how much of the income differences between the Northeast and Southeast regions - the country's poorest and richest - and between the states of Ceará and São Paulo in those regions - can be explained by differences in schooling levels of the resident population. Using data from the National Household Survey (PNAD), we construct counterfactual densities by reweighting the distribution of the poorest region/state by the schooling profile of the richest. We conclude that: (i) more than 50% of the income difference is explained by the difference in schooling; (ii) the highest deciles of the income distribution gain more from an increase in schooling, closely approaching the wage distribution of the richest region/state; and (iii) an increase in schooling, holding the wage structure constant, aggravates the wage disparity in the poorest regions/states.
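The reweighting construction behind such counterfactual densities can be sketched for a single discrete covariate; the schooling shares below are invented for illustration and are not PNAD figures:

```python
import numpy as np

def dfl_weights(school_poor, school_rich, levels):
    """DiNardo-Fortin-Lemieux-style reweighting on one discrete
    covariate: weight each poor-region observation by
    P(level | rich) / P(level | poor), so the reweighted poor sample
    matches the rich region's schooling profile."""
    p_poor = np.array([np.mean(school_poor == s) for s in levels])
    p_rich = np.array([np.mean(school_rich == s) for s in levels])
    ratio = p_rich / p_poor
    idx = np.searchsorted(levels, school_poor)
    return ratio[idx]

# Hypothetical samples: schooling levels 0, 1, 2; the rich region has a
# much larger share of the highest level.
rng = np.random.default_rng(0)
levels = np.array([0, 1, 2])
poor = rng.choice(levels, size=10_000, p=[0.5, 0.3, 0.2])
rich = rng.choice(levels, size=10_000, p=[0.2, 0.3, 0.5])
w = dfl_weights(poor, rich, levels)

# Reweighted share of level-2 schooling in the poor sample now matches
# the rich profile (approximately 0.5 instead of 0.2).
reweighted_share_2 = np.average(poor == 2, weights=w)
```

Applying these weights to the poor region's wage observations (not simulated here) yields the counterfactual wage density "poor wages, rich schooling" that the decomposition compares against the actual densities.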
Abstract:
Among the main challenges in computing portfolio risk measures is how to aggregate risks. This aggregation must be done in such a way that it can somehow capture the risk-diversification effect present in an operation or in a portfolio. Much has therefore been done to identify the best way to arrive at such a definition. Some models, such as parametric Value at Risk (VaR), assume that the marginal distribution of each variable in the portfolio follows the same distribution, namely a normal one, and are concerned only with correctly modelling the volatility and the correlation matrix. Models such as historical VaR take the actual distribution of each variable and do not concern themselves with the shape of the resulting multivariate distribution. The theory of copulas is thus a strong alternative, since it allows the construction of multivariate distributions without imposing any restriction on the marginal distributions, let alone on the multivariate ones. In this work we compare this methodology with the other risk-calculation methodologies, namely parametric multivariate VaR - VEC, Diagonal, BEKK, EWMA, CCC and DCC - and historical VaR, for a portfolio with identical positions in four risk factors: Pre252, Cupo252, the Bovespa Index and the Dow Jones Index.
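The copula idea can be sketched as follows, assuming a Gaussian copula with illustrative exponential marginals; the work's actual risk factors, marginals and copula family are not reproduced here:

```python
import numpy as np
from math import erf

def gaussian_copula_sample(n, corr, marginal_ppfs, seed=0):
    """Draw joint samples whose dependence comes from a Gaussian copula
    with correlation matrix `corr`, while each margin follows the
    distribution given by its quantile function (ppf)."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(corr)
    z = rng.standard_normal((n, corr.shape[0])) @ L.T       # correlated N(0,1)
    u = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))   # Phi(z), in (0, 1)
    return np.column_stack([ppf(u[:, j]) for j, ppf in enumerate(marginal_ppfs)])

# Two exponential marginals (quantile function -log(1 - u)) joined by a
# copula correlation of 0.7 -- purely illustrative loss factors.
exp_ppf = lambda u: -np.log1p(-u)
corr = np.array([[1.0, 0.7], [0.7, 1.0]])
samples = gaussian_copula_sample(20_000, corr, [exp_ppf, exp_ppf])

# Historical-simulation-style 99% VaR of an equally weighted loss portfolio.
loss = samples.sum(axis=1)
var_99 = np.quantile(loss, 0.99)
```

The point of the construction is that the marginals are chosen freely while the copula carries the dependence, which is exactly the flexibility the abstract contrasts with the normality assumption of parametric VaR.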
Abstract:
Convex combinations of long memory estimates using the same data observed at different sampling rates can decrease the standard deviation of the estimates, at the cost of inducing a slight bias. The convex combination of such estimates requires a preliminary correction for the bias observed at lower sampling rates, reported by Souza and Smith (2002). Through Monte Carlo simulations, we investigate the bias and the standard deviation of the combined estimates, as well as the root mean squared error (RMSE), which takes both into account. Comparing the standard methods with their combined versions, the latter achieve lower RMSE for the two semi-parametric estimators under study (by about 30% on average for ARFIMA(0,d,0) series).
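The minimum-variance convex combination of two estimates can be sketched as follows, under the simplifying assumption that both estimates are unbiased (after correction) and independent; the numbers are illustrative, not from the paper:

```python
def combine(est_a, var_a, est_b, var_b):
    """Minimum-variance convex combination of two unbiased, independent
    estimates of the long-memory parameter d. The optimal weight on each
    estimate is proportional to the inverse of its variance."""
    w = var_b / (var_a + var_b)               # weight on estimate a
    est = w * est_a + (1.0 - w) * est_b
    var = w**2 * var_a + (1.0 - w)**2 * var_b  # = var_a*var_b/(var_a+var_b)
    return est, var

# Illustrative values: d estimated at the full sampling rate and, after
# the Souza-Smith bias correction, at a lower sampling rate.
d_full, v_full = 0.42, 0.004
d_low, v_low = 0.38, 0.008
d_comb, v_comb = combine(d_full, v_full, d_low, v_low)
```

The combined variance is strictly smaller than either input variance, which is the source of the RMSE gains reported above once the lower-rate bias has been removed.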
Abstract:
Productivity is frequently calculated by approximating the Cobb-Douglas production function. Such an estimate, however, may suffer from simultaneity and input-selection bias. Olley and Pakes (1996) introduced a semi-parametric method that allows us to estimate the parameters of the production function consistently and thus obtain reliable productivity measures, controlling for such bias problems. This study applies the method to a firm in the sugar and ethanol industry and uses Stata's opreg command to estimate the production function, describing the economic intuition behind the results.
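A sketch of the first-stage regression behind the Olley-Pakes approach (not Stata's opreg itself, and omitting the second stage): log output is regressed on log labour plus a polynomial in log capital and log investment that proxies for unobserved productivity. The data below are simulated with a known labour elasticity:

```python
import numpy as np

def op_first_stage(log_y, log_labor, log_capital, log_invest):
    """Olley-Pakes first stage (sketch): the polynomial in capital and
    investment absorbs unobserved productivity, so the labour
    coefficient is estimated consistently. Returns that coefficient."""
    proxy = np.column_stack([
        log_capital, log_invest,
        log_capital**2, log_invest**2, log_capital * log_invest,
    ])
    X = np.column_stack([np.ones_like(log_y), log_labor, proxy])
    beta, *_ = np.linalg.lstsq(X, log_y, rcond=None)
    return beta[1]

# Simulated (hypothetical) data: true labour elasticity 0.6; productivity
# is an unknown-to-the-econometrician function of capital and investment.
rng = np.random.default_rng(42)
n = 5_000
labor = rng.normal(size=n)
capital = rng.normal(size=n)
invest = rng.normal(size=n)
productivity = 0.5 * capital + 0.3 * invest
log_y = 0.6 * labor + 0.4 * capital + productivity + 0.05 * rng.normal(size=n)
beta_labor = op_first_stage(log_y, labor, capital, invest)
```

In the full estimator a second stage recovers the capital coefficient from the estimated proxy function; that step is deliberately left out of this sketch.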
Abstract:
This thesis consists of three papers that analyse the term structure of interest rates using different datasets and models. Chapter 1 proposes a parametric interest-rate model that allows for segmentation and local shocks in the term structure. Using US Treasury data, two versions of this segmented model are implemented. Based on a sequence of 142 forecasting experiments, the proposed models are compared to benchmarks and found to perform better out of sample, especially for short maturities and for the 12-month forecasting horizon. Chapter 2 adds no-arbitrage restrictions to the estimation of a dynamic Gaussian polynomial term-structure model for the Brazilian interest-rate market. This paper proposes an important approximation for the time series of the term-structure risk factors, which allows the extraction of the interest-rate risk premium without the need to optimise a full dynamic model. This methodology has the advantage of being easy to implement and obtains a good approximation for the term-structure risk premium, which can be used in different applications. Chapter 3 models the joint dynamics of nominal and real rates using a no-arbitrage affine term-structure model with macroeconomic variables, in order to decompose the difference between nominal and real rates into an inflation risk premium and inflation expectations in the US market. Versions with and without macroeconomic variables are implemented; the inflation risk premia obtained are small and stable over the period analysed, although they differ between the two models.
Abstract:
This paper presents a methodology to estimate and identify different kinds of economic interaction, whenever these interactions can be established in the form of spatial dependence. First, we apply the semi-parametric approach of Chen and Conley (2001) to the estimation of reaction functions. Then, the methodology is applied to the analysis of financial providers in Thailand. Based on a sample of financial institutions, we provide an economic framework to test whether the actual spatial pattern is compatible with strategic competition (local interactions) or social planning (global interactions). Our estimates suggest that the provision of credit access by commercial banks and suppliers is determined by spatial competition, while the Thai Bank of Agriculture and Agricultural Cooperatives is distributed as in a social planner problem.
Abstract:
After the decline of production driven by the natural energy of the reservoir, enhanced oil recovery methods, which result from the application of special processes such as chemical injection, miscible gas injection, thermal methods and others, can be applied. The water-alternating-gas (WAG) CO2 injection method uses the injection of water and gas, normally miscible, that comes into contact with the stock oil. In Brazil, this gas gained prominence with the discovery of the pre-salt layer. The amount of CO2 present in the oil produced from the pre-salt layer, as in some other reservoirs, is one of the challenges to be overcome for sustainable production, since this gas needs to be processed in some way. Several destinations for the produced CO2 have been proposed by researchers, such as enhanced recovery, storage in depleted fields, storage in salt caverns and even the sale of CO2 to industrial plants. The largest recent oil discoveries in Brazil were made by Petrobras in the pre-salt layer located between the states of Santa Catarina and Espírito Santo, where large volumes of light oil were found, with a density of approximately 28° API, low acidity and low sulphur content. This oil contains a large amount of dissolved CO2, and enhanced recovery thus offers a pioneering solution for the fate of this gas. The objective of this research is to analyse which parameters have the greatest influence on the enhanced recovery process. The simulations were performed using the "GEM" module of the Computer Modelling Group, with the aim of studying the enhanced recovery method in question. For this work, semi-synthetic models were used, with reservoir and fluid data that can be extrapolated to practical situations in the Brazilian Northeast.
The results showed the influence of the alternating injection of water and gas on the recovery factor and on the oil production flow rate, when compared to primary recovery and to continuous water or gas injection.
Abstract:
The growing demand for composite materials requires a better understanding of their behaviour under many loading and service conditions, as well as under the various kinds of connections involved in structural design. Among these design conditions are the presence of geometric discontinuities in the cross and longitudinal sections of structural elements, and environmental working conditions such as UV radiation, moisture and heat, which lead to a decrease in the final mechanical response of the material. In this sense, this thesis develops detailed studies (experimental and semi-empirical models) of the effects caused by the presence of a geometric discontinuity, more specifically a central hole in the longitudinal section (with reduced cross-section), and of the influence of accelerated environmental ageing on the mechanical properties and fracture mechanisms of FGRP composite laminates under uniaxial tensile loads. Studies of the morphological behaviour and structural degradation of the composite laminates are performed by macroscopic and microscopic analysis of the affected surfaces, in addition to evaluation by the mass-variation measurement technique (TMVM). The accelerated environmental ageing conditions are simulated in an ageing chamber. To study the simultaneous influence of ageing and geometric discontinuity on the mechanical properties of the composite laminates, a semi-empirical model, called the IE/FCPM model, is proposed. For the stress concentration due to the central hole, failure analyses were performed using the Average-Stress Criterion (ASC) and the Point-Stress Criterion (PSC). Two industrially manufactured polymer composite laminates were studied: the first reinforced only by short E-glass fibre mats (LM), and the second reinforced by E-glass fibre in the form of a bidirectional fabric (LT).
In the configurations of the laminates, anisotropy is crucial to their final mechanical response. Finally, a comparative study of all parameters was performed for a better understanding of the results. As a concluding study, the characteristics of the final fracture of the laminates under all the conditions to which they were subjected were analysed. These analyses were made at the macroscopic (scanner) and microscopic (optical and scanning electron) levels. It was observed that the degradation process occurs similarly for each composite investigated; however, the LM composite, compared to the LT composite (configurations LT 0/90° and LT ±45°), proved more susceptible to loss of mechanical properties with respect both to the central hole and to accelerated environmental ageing.
Abstract:
Includes bibliography