46 results for VaR Estimation methods, Statistical Methods, Risk management, Investments
in CentAUR: Central Archive University of Reading - UK
Abstract:
Technology involving genetic modification of crops has the potential to make a contribution to rural poverty reduction in many developing countries. Thus far, insecticide-producing 'Bt' varieties of cotton have been the main GM crops under cultivation in developing nations. Several studies have evaluated the farm-level performance of Bt varieties in comparison to conventional ones by estimating production technology, and have mostly found Bt technology to be very successful in raising output and/or reducing insecticide input. However, the production risk properties of this technology have not been studied, although they are likely to be important to risk-averse smallholders. This study investigates the output risk aspects of Bt technology using a three-year farm-level dataset on smallholder cotton production in the Makhathini Flats, KwaZulu-Natal, South Africa. Stochastic dominance and stochastic production function estimation methods are used to examine the risk properties of the two technologies. Results indicate that Bt technology increases output risk by being most effective when crop growth conditions are good but less effective when conditions are less favourable. However, in spite of its risk-increasing effect, the mean output performance of Bt cotton is good enough to make it preferable to conventional technology even for risk-averse smallholders.
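The study's farm-level dataset is not public, so as a rough illustration of the stochastic-dominance comparison described above, the sketch below checks first- and second-degree dominance between two synthetic yield samples standing in for Bt and conventional cotton. All figures are invented.

```python
import numpy as np

def empirical_cdf(sample, grid):
    """Fraction of observations at or below each grid point."""
    return np.searchsorted(np.sort(sample), grid, side="right") / len(sample)

def dominance(bt, conv, n_grid=200):
    """First- and second-degree stochastic dominance of bt over conv."""
    grid = np.linspace(min(bt.min(), conv.min()), max(bt.max(), conv.max()), n_grid)
    F_bt, F_conv = empirical_cdf(bt, grid), empirical_cdf(conv, grid)
    fsd = np.all(F_bt <= F_conv)                 # Bt CDF never above conventional CDF
    step = grid[1] - grid[0]
    ssd = np.all(np.cumsum(F_bt) * step <= np.cumsum(F_conv) * step)  # integrated CDFs
    return fsd, ssd

rng = np.random.default_rng(0)
# Hypothetical yields (kg/ha): Bt with higher mean but larger spread (more output risk).
bt_yield = rng.normal(750, 220, 300)
conv_yield = rng.normal(550, 120, 300)
print("FSD, SSD of Bt over conventional:", dominance(bt_yield, conv_yield))
```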
Abstract:
Many well-established statistical methods in genetics were developed in a climate of severe constraints on computational power. Recent advances in simulation methodology now bring modern, flexible statistical methods within the reach of scientists having access to a desktop workstation. We illustrate the potential advantages now available by considering the problem of assessing departures from Hardy-Weinberg (HW) equilibrium. Several hypothesis tests of HW have been established, as well as a variety of point estimation methods for the parameter which measures departures from HW under the inbreeding model. We propose a computational, Bayesian method for assessing departures from HW, which has a number of important advantages over existing approaches. The method incorporates the effects of uncertainty about the nuisance parameters (the allele frequencies) as well as the boundary constraints on f (which are functions of the nuisance parameters). Results are naturally presented visually, exploiting the graphics capabilities of modern computer environments to allow straightforward interpretation. Perhaps most importantly, the method is founded on a flexible, likelihood-based modelling framework, which can incorporate the inbreeding model if appropriate, but also allows the assumptions of the model to be investigated and, if necessary, relaxed. Under appropriate conditions, information can be shared across loci and, possibly, across populations, leading to more precise estimation. The advantages of the method are illustrated by application both to simulated data and to data analysed by alternative methods in the recent literature.
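The paper's full likelihood framework is not reproduced here; the sketch below is a minimal grid-approximation stand-in for the single-locus, biallelic case, assuming uniform priors on the allele frequency p and on f within its admissible range, with hypothetical genotype counts. It illustrates how the boundary constraints on f and the marginalisation over the nuisance parameter enter the posterior.

```python
import numpy as np

# Hypothetical genotype counts at one biallelic locus: (AA, Aa, aa).
n_AA, n_Aa, n_aa = 30, 40, 30

p_grid = np.linspace(0.001, 0.999, 400)   # allele frequency of A
f_grid = np.linspace(-0.999, 0.999, 400)  # inbreeding coefficient f
P, F = np.meshgrid(p_grid, f_grid)        # P varies along columns, F along rows
Q = 1.0 - P

# Genotype probabilities under the inbreeding model.
pAA = P**2 + F * P * Q
pAa = 2 * P * Q * (1 - F)
paa = Q**2 + F * P * Q

# Boundary constraints on f (functions of p): all genotype probabilities must be >= 0.
valid = (pAA >= 0) & (pAa >= 0) & (paa >= 0)

loglik = np.where(valid,
                  n_AA * np.log(np.clip(pAA, 1e-300, None))
                  + n_Aa * np.log(np.clip(pAa, 1e-300, None))
                  + n_aa * np.log(np.clip(paa, 1e-300, None)),
                  -np.inf)

post = np.exp(loglik - loglik.max())
post /= post.sum()
post_f = post.sum(axis=1)                 # marginalise over the nuisance parameter p
print("posterior mean of f:", np.sum(f_grid * post_f))
```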
Abstract:
This paper discusses how numerical gradient estimation methods may be used in order to reduce the computational demands on a class of multidimensional clustering algorithms. The study is motivated by the recognition that several current point-density based cluster identification algorithms could benefit from a reduction of computational demand if approximate a-priori estimates of the cluster centres present in a given data set could be supplied as starting conditions for these algorithms. In this particular presentation, the algorithm shown to benefit from the technique is the Mean-Tracking (M-T) cluster algorithm, but the results obtained from the gradient estimation approach may also be applied to other clustering algorithms and their related disciplines.
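The Mean-Tracking algorithm itself is not available in standard libraries, so the sketch below only illustrates the seeding idea described above: estimate the point density on a grid, locate its local maxima numerically, and supply those as a-priori starting centres to a generic clustering routine (scikit-learn's KMeans is used here purely as a stand-in).

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.ndimage import maximum_filter
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Synthetic 2-D data with three clusters.
data = np.vstack([rng.normal(c, 0.4, size=(150, 2)) for c in ((0, 0), (3, 3), (0, 4))])

# Kernel density estimate evaluated on a regular grid.
kde = gaussian_kde(data.T)
xg, yg = np.meshgrid(np.linspace(-2, 5, 80), np.linspace(-2, 6, 80))
dens = kde(np.vstack([xg.ravel(), yg.ravel()])).reshape(xg.shape)

# Local maxima of the density surface serve as approximate a-priori centre estimates.
peaks = (dens == maximum_filter(dens, size=9)) & (dens > 0.2 * dens.max())
centres = np.column_stack([xg[peaks], yg[peaks]])
print("initial centre estimates:\n", centres)

# Feed the estimates to a clustering algorithm as starting conditions.
km = KMeans(n_clusters=len(centres), init=centres, n_init=1).fit(data)
print("refined centres:\n", km.cluster_centers_)
```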
Modelled soil organic carbon stocks and changes in the Indo-Gangetic Plains, India from 1980 to 2030
Abstract:
The Global Environment Facility co-financed Soil Organic Carbon (GEFSOC) Project developed a comprehensive modelling system for predicting soil organic carbon (SOC) stocks and changes over time. This research is an effort to predict SOC stocks and changes for the Indian Indo-Gangetic Plains (IGP), an area with a predominantly rice (Oryza sativa)-wheat (Triticum aestivum) cropping system, using the GEFSOC Modelling System and to compare output with stocks generated using mapping approaches based on soil survey data. The GEFSOC Modelling System predicts an SOC stock for the IGP, India of 1.27, 1.32 and 1.27 Pg for 1990, 2000 and 2030, respectively, in the top 20 cm of soil. The SOC stock using a mapping approach based on soil survey data was 0.66 and 0.88 Pg for 1980 and 2000, respectively. The SOC stock estimated using the GEFSOC Modelling System is higher than the stock estimated using the mapping approach. This is due to the fact that while the GEFSOC System accounts for variation in crop input data (crop management), the soil mapping approach only considers regional variation in soil texture and wetness. The trend of overall change in the modelled SOC stock estimates shows that the IGP, India may have reached an equilibrium following 30-40 years of the Green Revolution. This can be seen in the SOC stock change rates. Various estimation methods show SOC stocks of 0.57-1.44 Pg C for the study area. The trend of overall change in C stock assessed from the soil survey data indicates that the soils of the IGP, India may store a projected 1.1 Pg of C in 2030. (C) 2007 Elsevier B.V. All rights reserved.
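For readers unfamiliar with how regional stock figures in petagrams arise, the sketch below shows the standard unit conversion from a per-hectare SOC stock for a 0-20 cm layer to a regional total. The bulk density, SOC concentration and area values are illustrative placeholders, not the GEFSOC or soil-survey inputs.

```python
# SOC stock for a 0-20 cm layer, from per-hectare values to a regional total.
# All input values below are hypothetical, for illustration only.
depth_cm = 20.0          # soil depth considered (cm)
bulk_density = 1.4       # g/cm^3
soc_percent = 0.5        # soil organic carbon concentration (% by mass)
area_ha = 44e6           # assumed regional area in hectares

# Per-hectare stock (Mg C/ha) = SOC(%) x bulk density (g/cm^3) x depth (cm)
stock_mg_per_ha = soc_percent * bulk_density * depth_cm
total_pg = stock_mg_per_ha * area_ha * 1e-9   # 1 Pg = 1e9 Mg
print(f"{stock_mg_per_ha:.1f} Mg C/ha -> {total_pg:.2f} Pg C over the region")
```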
Abstract:
Radiotelemetry is an important tool used to aid the understanding and conservation of cryptic and rare birds. The two bird species of the family Picathartidae are little-known, secretive, forest-dwelling birds endemic to western and central Africa. In 2005, we conducted a radio-tracking trial of Grey-necked Picathartes Picathartes oreas in the Mbam Minkom Mountain Forest, southern Cameroon, using neck-collar (two birds) and tail-mounted (four birds) transmitters to investigate the practicality of radio-tracking Picathartidae. Three birds with tail-mounted transmitters were successfully tracked, and the fourth, though not relocated for radio tracking, was resighted the following breeding season. Two of these were breeding birds that continued to provision young during radio tracking. One neck-collared bird was found dead three days after transmitter attachment, and the other was neither relocated nor resighted. As mortality in one bird was potentially caused by the neck-collar transmitter, we recommend tail-mounted transmitters in future radio-tracking studies of Picathartidae. Home ranges, estimated using minimum convex polygon and kernel estimation methods, were generally small (<0.5 km²) and centred around breeding sites. A minimum of 60 fixes was found to be sufficient for home range estimation.
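As an illustration of the minimum convex polygon (MCP) estimator mentioned above, the sketch below computes a 100% MCP area from a set of hypothetical telemetry fixes; the kernel estimator requires a bandwidth choice and is omitted.

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(2)
# Hypothetical fixes in metres (projected coordinates), clustered around a nest site.
fixes = rng.normal(loc=(0.0, 0.0), scale=250.0, size=(60, 2))

hull = ConvexHull(fixes)
area_m2 = hull.volume          # for 2-D input, ConvexHull.volume is the polygon area
print(f"100% MCP home range: {area_m2 / 1e6:.3f} km^2 from {len(fixes)} fixes")
```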
Abstract:
Objective: This paper presents a detailed study of fractal-based methods for texture characterization of mammographic mass lesions and architectural distortion. The purpose of this study is to explore the use of fractal and lacunarity analysis for the characterization and classification of both tumor lesions and normal breast parenchyma in mammography. Materials and methods: We conducted comparative evaluations of five popular fractal dimension estimation methods for the characterization of the texture of mass lesions and architectural distortion. We applied the concept of lacunarity to the description of the spatial distribution of the pixel intensities in mammographic images. These methods were tested with a set of 57 breast masses and 60 normal breast parenchyma (dataset1), and with another set of 19 architectural distortions and 41 normal breast parenchyma (dataset2). Support vector machines (SVM) were used as a pattern classification method for tumor classification. Results: Experimental results showed that the fractal dimension of regions of interest (ROIs) depicting mass lesions and architectural distortion was statistically significantly lower than that of normal breast parenchyma for all five methods. Receiver operating characteristic (ROC) analysis showed that the fractional Brownian motion (FBM) method generated the highest area under the ROC curve (Az = 0.839 for dataset1 and 0.828 for dataset2) among the five methods for both datasets. Lacunarity analysis showed that the ROIs depicting mass lesions and architectural distortion had higher lacunarities than the ROIs depicting normal breast parenchyma. The combination of FBM fractal dimension and lacunarity yielded higher Az values (0.903 and 0.875, respectively) than those based on any single feature alone for both datasets. The application of the SVM improved the performance of the fractal-based features in differentiating tumor lesions from normal breast parenchyma by generating higher Az values. Conclusion: The FBM texture model is the most appropriate model for characterizing mammographic images because its self-affinity assumption is a better approximation. Lacunarity is an effective counterpart measure of the fractal dimension in texture feature extraction in mammographic images. The classification results obtained in this work suggest that the SVM is an effective method with great potential for classification in mammographic image analysis.
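Neither the mammography datasets nor the FBM estimator are reproduced here; the sketch below only illustrates gliding-box lacunarity, the counterpart texture measure discussed above, computed on a synthetic grayscale patch standing in for an ROI.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def gliding_box_lacunarity(image, box_size):
    """Lacunarity Lambda(r) = E[M^2] / E[M]^2 of gliding-box masses M."""
    boxes = sliding_window_view(image, (box_size, box_size))
    masses = boxes.sum(axis=(-2, -1)).ravel().astype(float)
    return masses.var() / masses.mean() ** 2 + 1.0

rng = np.random.default_rng(3)
# Synthetic "texture" patch standing in for a mammographic ROI.
roi = rng.gamma(shape=2.0, scale=30.0, size=(64, 64))

for r in (2, 4, 8, 16):
    print(f"box size {r:2d}: lacunarity = {gliding_box_lacunarity(roi, r):.3f}")
```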
Abstract:
We present a comparative analysis of projected impacts of climate change on river runoff from two types of distributed hydrological model, a global hydrological model (GHM) and catchment-scale hydrological models (CHMs). Analyses are conducted for six catchments that are global in coverage and feature strong contrasts in spatial scale as well as climatic and development conditions. These include the Liard (Canada), Mekong (SE Asia), Okavango (SW Africa), Rio Grande (Brazil), Xiangu (China) and Harper's Brook (UK). A single GHM (Mac-PDM.09) is applied to all catchments whilst different CHMs are applied for each catchment. The CHMs typically simulate water resources impacts based on a more explicit representation of catchment water resources than that available from the GHM, and the CHMs include river routing. Simulations of average annual runoff, mean monthly runoff and high (Q5) and low (Q95) monthly runoff under baseline (1961-1990) and climate change scenarios are presented. We compare the simulated runoff response of each hydrological model to (1) prescribed increases in global mean temperature from the HadCM3 climate model and (2) a prescribed increase in global mean temperature of 2 °C for seven GCMs, to explore response to climate model and structural uncertainty. We find that differences in projected changes of mean annual runoff between the two types of hydrological model can be substantial for a given GCM, and they are generally larger for indicators of high and low flow. However, they are relatively small in comparison to the range of projections across the seven GCMs. Hence, for the six catchments and seven GCMs we considered, climate model structural uncertainty is greater than the uncertainty associated with the type of hydrological model applied. Moreover, shifts in the seasonal cycle of runoff with climate change are presented similarly by both hydrological models, although for some catchments the monthly timing of high and low flows differs. This implies that for studies that seek to quantify and assess the role of climate model uncertainty on catchment-scale runoff, it may be just as feasible to apply a GHM as it is to apply a CHM, especially when climate modelling uncertainty across the range of available GCMs is as large as it currently is. Whilst the GHM is able to represent the broad climate change signal that is represented by the CHMs, we find that for some catchments there are differences between GHMs and CHMs in mean annual runoff due to differences in potential evaporation estimation methods, in the representation of the seasonality of runoff, and in the magnitude of changes in extreme monthly runoff, all of which have implications for future water management issues.
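As a small worked example of the runoff indicators used above, the sketch below computes mean annual runoff together with the high-flow (Q5) and low-flow (Q95) monthly indicators for a baseline and a perturbed monthly series, and their percentage changes. The series are synthetic and do not come from Mac-PDM.09 or any of the CHMs.

```python
import numpy as np

def flow_indicators(monthly_runoff):
    """Mean annual runoff plus Q5 (high flow) and Q95 (low flow) monthly indicators."""
    q5 = np.percentile(monthly_runoff, 95)   # flow exceeded 5% of the time
    q95 = np.percentile(monthly_runoff, 5)   # flow exceeded 95% of the time
    return monthly_runoff.mean() * 12, q5, q95

rng = np.random.default_rng(4)
baseline = rng.gamma(shape=3.0, scale=20.0, size=30 * 12)        # 30 years of monthly runoff (mm)
scenario = baseline * rng.normal(1.1, 0.15, size=baseline.size)  # crude climate perturbation

for name, (b, s) in zip(("mean annual", "Q5", "Q95"),
                        zip(flow_indicators(baseline), flow_indicators(scenario))):
    print(f"{name:12s}: {100 * (s - b) / b:+.1f}% change")
```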
Abstract:
Modeling the vertical penetration of photosynthetically active radiation (PAR) through the ocean, and its utilization by phytoplankton, is fundamental to simulating marine primary production. The variation of attenuation and absorption of light with wavelength suggests that photosynthesis should be modeled at high spectral resolution, but this is computationally expensive. To model primary production in global 3d models, a balance between computer time and accuracy is necessary. We investigate the effects of varying the spectral resolution of the underwater light field and the photosynthetic efficiency of phytoplankton (α∗), on primary production using a 1d coupled ecosystem ocean turbulence model. The model is applied at three sites in the Atlantic Ocean (CIS (∼60°N), PAP (∼50°N) and ESTOC (∼30°N)) to include the effect of different meteorological forcing and parameter sets. We also investigate three different methods for modeling α∗ – as a fixed constant, varying with both wavelength and chlorophyll concentration [Bricaud, A., Morel, A., Babin, M., Allali, K., Claustre, H., 1998. Variations of light absorption by suspended particles with chlorophyll a concentration in oceanic (case 1) waters. Analysis and implications for bio-optical models. J. Geophys. Res. 103, 31033–31044], and using a non-spectral parameterization [Anderson, T.R., 1993. A spectrally averaged model of light penetration and photosynthesis. Limnol. Oceanogr. 38, 1403–1419]. After selecting the appropriate ecosystem parameters for each of the three sites we vary the spectral resolution of light and α∗ from 1 to 61 wavebands and study the results in conjunction with the three different α∗ estimation methods. The results show modeled estimates of ocean primary productivity are highly sensitive to the degree of spectral resolution and α∗. For accurate simulations of primary production and chlorophyll distribution we recommend a spectral resolution of at least six wavebands if α∗ is a function of wavelength and chlorophyll, and three wavebands if α∗ is a fixed value.
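To make the spectral-resolution argument concrete, the sketch below compares depth-resolved PAR computed with several wavebands, each with its own attenuation coefficient, against a single broadband calculation. The surface irradiances and attenuation coefficients are invented for illustration and are not the optics used in the coupled model.

```python
import numpy as np

# Hypothetical wavebands: surface irradiance E0 (W m^-2) and attenuation k (m^-1).
E0 = np.array([40.0, 60.0, 50.0, 30.0])     # blue .. red bands
k  = np.array([0.04, 0.06, 0.10, 0.35])     # red light attenuates fastest

depths = np.array([0.0, 10.0, 25.0, 50.0])

# Spectral calculation: attenuate each band separately, then sum.
E_spectral = (E0[None, :] * np.exp(-k[None, :] * depths[:, None])).sum(axis=1)

# Broadband calculation: one mean attenuation coefficient for total PAR.
k_mean = np.average(k, weights=E0)
E_broadband = E0.sum() * np.exp(-k_mean * depths)

for z, es, eb in zip(depths, E_spectral, E_broadband):
    print(f"z = {z:4.0f} m: spectral {es:7.2f}  broadband {eb:7.2f} W m^-2")
```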
Abstract:
Sensory thresholds are often collected through ascending forced-choice methods. Group thresholds are important for comparing stimuli or populations; yet, the method has two problems. An individual may guess the correct answer by chance at any concentration step, and might detect correctly at low concentrations but become adapted or fatigued at higher concentrations. The survival analysis method deals with both issues. Individual sequences of incorrect and correct answers are adjusted, taking into account the group performance at each concentration. The adjustment reduces the probability that a run of consecutive correct answers is attributed to chance guessing. Adjusted sequences are submitted to survival analysis to determine group thresholds. The technique was applied to an aroma threshold study and a taste threshold study. It resulted in group thresholds similar to those from ASTM or logarithmic regression procedures. Significant differences in taste thresholds between younger and older adults were determined. The approach provides a more robust technique than previous estimation methods.
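The paper's exact sequence-adjustment rules are not reproduced here; the sketch below only shows the survival-analysis step, under the simplifying assumption that each panelist contributes the concentration step at which their run of correct answers begins (right-censored if they never detect), and that the group threshold is read off where the Kaplan-Meier "not yet detecting" curve crosses 50%. Data and concentrations are hypothetical.

```python
import numpy as np

# Hypothetical ascending forced-choice data: concentration step (1..6) at which each
# panelist's final run of correct answers starts; 0 means never detected (censored).
detect_step = np.array([2, 3, 3, 4, 1, 5, 0, 4, 2, 0, 3, 5])
steps = np.arange(1, 7)
concentrations = 10.0 ** (steps - 4)        # e.g. mg/L on a log scale

n = len(detect_step)
surv, at_risk, km = 1.0, n, []
for s in steps:
    events = np.sum(detect_step == s)       # panelists detecting at this step
    surv *= (1.0 - events / at_risk) if at_risk > 0 else 1.0
    km.append(surv)
    at_risk -= events                       # censored panelists stay in the risk set
km = np.array(km)

# Group threshold: first concentration where the fraction still "not detecting" <= 0.5.
idx = np.argmax(km <= 0.5)
print("Kaplan-Meier survival:", np.round(km, 2))
print("group threshold ~", concentrations[idx], "mg/L")
```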
Abstract:
This paper presents the mathematical development of a body-centric nonlinear dynamic model of a quadrotor UAV that is suitable for the development of biologically inspired navigation strategies. Analytical approximations are used to find an initial guess of the parameters of the nonlinear model, then parameter estimation methods are used to refine the model parameters using the data obtained from onboard sensors during flight. Due to the unstable nature of the quadrotor model, the identification process is performed with the system in closed-loop control of attitude angles. The obtained model parameters are validated using real unseen experimental data. Based on the identified model, a Linear-Quadratic (LQ) optimal tracker is designed to stabilize the quadrotor and facilitate its translational control by tracking body accelerations. The LQ tracker is tested on an experimental quadrotor UAV and the obtained results are a further means to validate the quality of the estimated model. The unique formulation of the control problem in the body frame makes the controller better suited for bio-inspired navigation and guidance strategies than conventional attitude or position based control systems that can be found in the existing literature.
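The identified quadrotor model is not public, so the sketch below only shows the LQ design step on a generic linear state-space model (a double integrator standing in for one translational axis), using the continuous-time algebraic Riccati equation from SciPy. It is not the paper's body-centric model or tracker formulation.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Double-integrator stand-in for one axis: states [position, velocity], input = acceleration.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

Q = np.diag([10.0, 1.0])   # state weights (tracking error, velocity)
R = np.array([[0.1]])      # control effort weight

# Solve the continuous-time algebraic Riccati equation and form the LQ gain.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)      # K = R^{-1} B^T P
print("LQ state-feedback gain:", K)

# Closed-loop eigenvalues should all have negative real parts (stable).
print("closed-loop poles:", np.linalg.eigvals(A - B @ K))
```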
Abstract:
This paper presents an open-source canopy height profile (CHP) toolkit designed for processing small-footprint full-waveform LiDAR data to obtain the estimates of effective leaf area index (LAIe) and CHPs. The use of the toolkit is presented with a case study of LAIe estimation in discontinuous-canopy fruit plantations. The experiments are carried out in two study areas, namely, orange and almond plantations, with different percentages of canopy cover (48% and 40%, respectively). For comparison, two commonly used discrete-point LAIe estimation methods are also tested. The LiDAR LAIe values are first computed for each of the sites and each method as a whole, providing “apparent” site-level LAIe, which disregards the discontinuity of the plantations’ canopies. Since the toolkit allows for the calculation of the study area LAIe at different spatial scales, between-tree-level clumping can be easily accounted for and is then used to illustrate the impact of the discontinuity of canopy cover on LAIe retrieval. The LiDAR LAIe estimates are therefore computed at smaller scales as a mean of LAIe in various grid-cell sizes, providing estimates of “actual” site-level LAIe. Subsequently, the LiDAR LAIe results are compared with theoretical models of “apparent” LAIe versus “actual” LAIe, based on known percent canopy cover in each site. The comparison of those models to LiDAR LAIe derived from the smallest grid-cell sizes against the estimates of LAIe for the whole site has shown that the LAIe estimates obtained from the CHP toolkit provided values that are closest to those of theoretical models.
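The CHP toolkit itself is not reproduced here; the sketch below only illustrates the gap-fraction route to LAIe that underlies such estimates (a Beer-Lambert inversion), computed per grid cell so that between-tree clumping in a discontinuous canopy can be averaged at a chosen scale. The return counts and extinction coefficient are hypothetical.

```python
import numpy as np

def laie_from_returns(ground_returns, total_returns, k=0.5):
    """Effective LAI from LiDAR gap fraction via LAIe = -ln(Pgap) / k."""
    pgap = np.clip(ground_returns / total_returns, 1e-6, 1.0)
    return -np.log(pgap) / k

rng = np.random.default_rng(5)
# Hypothetical per-cell return counts on a 10 x 10 grid over a discontinuous plantation.
total = rng.integers(200, 400, size=(10, 10))
ground = (total * rng.uniform(0.2, 0.9, size=total.shape)).astype(int)

cell_laie = laie_from_returns(ground, total)
print("site-level 'apparent' LAIe   :", laie_from_returns(ground.sum(), total.sum()).round(2))
print("mean of per-cell 'actual' LAIe:", cell_laie.mean().round(2))
```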
Abstract:
It is generally accepted that genetics may be an important factor in explaining the variation between patients’ responses to certain drugs. However, identification and confirmation of the responsible genetic variants is proving to be a challenge in many cases. A number of difficulties that may be encountered in pursuit of these variants, such as non-replication of a true effect, population structure and selection bias, can be mitigated or at least reduced by appropriate statistical methodology. Another major statistical challenge facing pharmacogenetics studies is trying to detect possibly small polygenic effects using large volumes of genetic data, while controlling the number of false positive signals. Here we review statistical design and analysis options available for investigations of genetic resistance to anti-epileptic drugs.
Abstract:
The proportional odds model provides a powerful tool for analysing ordered categorical data and setting sample size, although for many clinical trials its validity is questionable. The purpose of this paper is to present a new class of constrained odds models which includes the proportional odds model. The efficient score and Fisher's information are derived from the profile likelihood for the constrained odds model. These results are new even for the special case of proportional odds where the resulting statistics define the Mann-Whitney test. A strategy is described involving selecting one of these models in advance, requiring assumptions as strong as those underlying proportional odds, but allowing a choice of such models. The accuracy of the new procedure and its power are evaluated.
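As a pointer to the special case mentioned above, the sketch below applies the Mann-Whitney test to ordinal outcomes from two hypothetical trial arms; under proportional odds this is the test that the efficient score statistic reduces to. It does not implement the new constrained-odds class itself.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical ordered categorical outcomes (1 = worst .. 5 = best) for two trial arms.
control   = np.repeat([1, 2, 3, 4, 5], [12, 18, 25, 20, 10])
treatment = np.repeat([1, 2, 3, 4, 5], [ 6, 14, 22, 26, 17])

# Mann-Whitney handles the heavy ties typical of ordinal scales via its normal approximation.
stat, p = mannwhitneyu(treatment, control, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.4f}")
```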