23 results for Robust methods

in CentAUR: Central Archive University of Reading - UK


Relevance:

70.00%

Publisher:

Abstract:

We investigate alternative robust approaches to forecasting, using a new class of robust devices, contrasted with equilibrium-correction models. Their forecasting properties are derived under a range of likely empirical problems at the forecast origin, including measurement errors, impulses, omitted variables, unanticipated location shifts and incorrectly included variables that experience a shift. We derive the resulting forecast biases and error variances, and indicate when the methods are likely to perform well. The robust methods are applied to forecasting US GDP using autoregressive models, and to autoregressive models augmented with factors extracted from a large dataset of macroeconomic variables. We consider forecasting performance over the Great Recession, and over an earlier, more quiescent period.
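
As a hedged illustration of why such devices help (a toy model, not the paper's exact devices), the sketch below simulates an unanticipated location shift just before the forecast origin and compares a fixed-parameter AR(1) forecast with a simple robust device that anchors on the most recent observation:

```python
# Illustrative comparison: after an unanticipated location shift, a forecast
# anchored on the last observation recovers faster than a forecast from an
# AR(1) model whose parameters were estimated on pre-shift data.
import numpy as np

rng = np.random.default_rng(0)
T, rho, mu = 200, 0.7, 0.0
y = np.zeros(T)
for t in range(1, T):
    shift = 5.0 if t >= 180 else 0.0          # location shift near the origin
    y[t] = mu + shift + rho * (y[t - 1] - mu - shift) + rng.normal()

# Estimate the AR(1) on pre-shift data only, so parameters are "stale"
pre = y[:179]
b = np.polyfit(pre[:-1], pre[1:], 1)          # [slope ~ rho, intercept]
ar_fc = b[1] + b[0] * y[179:-1]               # fixed-parameter AR(1) forecast
rob_fc = y[179:-1]                            # robust device: random-walk forecast
actual = y[180:]
print("AR(1) RMSE: ", np.sqrt(np.mean((actual - ar_fc) ** 2)))
print("Robust RMSE:", np.sqrt(np.mean((actual - rob_fc) ** 2)))
```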

Relevance:

60.00%

Publisher:

Abstract:

Fixed transaction costs that prohibit exchange engender bias in supply analysis due to censoring of the sample observations. The associated bias in conventional regression procedures applied to censored data, and the construction of robust methods for mitigating that bias, have been preoccupations of applied economists since Tobin [Econometrica 26 (1958) 24]. This literature assumes that the true point of censoring in the data is zero; when this is not the case, the assumption imparts a bias to parameter estimates of the censored regression model. We conjecture that this bias can be significant; affirm this from experiments; and suggest techniques for mitigating it using Bayesian procedures. The bias-mitigating procedures are based on modifications of the key step that facilitates Bayesian estimation of the censored regression model; are easy to implement; work well in both small and large samples; and lead to significantly improved inference in the censored regression model. These findings are important in light of the widespread use of the zero-censored Tobit regression, and we investigate their consequences using data on milk-market participation in the Ethiopian highlands.
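
A minimal sketch of the core problem, assuming a standard frequentist Tobit likelihood rather than the paper's Bayesian procedure: data censored from below at c = 1 are fitted once with the censoring point wrongly set to zero and once at its true value, exposing the slope bias.

```python
# Simulate a latent supply equation censored from below at c = 1, then fit a
# Tobit MLE under a wrong (c = 0) and the correct (c = 1) censoring point.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)
n, beta, sigma, c_true = 2000, 1.5, 1.0, 1.0
x = rng.normal(1.0, 1.0, n)
y_star = beta * x + rng.normal(0.0, sigma, n)
censored = y_star <= c_true
y = np.where(censored, c_true, y_star)        # observed, piled up at c_true

def negloglik(theta, c):
    b, log_s = theta
    s = np.exp(log_s)                         # enforce sigma > 0
    ll_unc = stats.norm.logpdf(y[~censored], b * x[~censored], s)
    ll_cen = stats.norm.logcdf((c - b * x[censored]) / s)
    return -(ll_unc.sum() + ll_cen.sum())

for c in (0.0, c_true):                       # wrong vs correct censoring point
    res = optimize.minimize(negloglik, x0=[1.0, 0.0], args=(c,), method="BFGS")
    print(f"assumed censoring point {c}: beta_hat = {res.x[0]:.3f}")
```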

Relevance:

60.00%

Publisher:

Abstract:

There is intense scientific and public interest in the Intergovernmental Panel on Climate Change (IPCC) projections of sea level for the twenty-first century and beyond. The Fourth Assessment Report (AR4) projections, obtained by applying standard methods to the results of the World Climate Research Programme Coupled Model Experiment, include estimates of ocean thermal expansion, the melting of glaciers and ice caps (G&ICs), increased melting of the Greenland Ice Sheet, and increased precipitation over Greenland and Antarctica, partially offsetting other contributions. The AR4 recognized the potential for a rapid dynamic ice sheet response, but robust methods for quantifying it were not available. Illustrative scenarios suggested additional sea level rise on the order of 10 to 20 cm or more, giving a wide range in the globally averaged projections of about 20 to 80 cm by 2100. Currently, sea level is rising at a rate near the upper end of these projections. Since publication of the AR4 in 2007, biases in historical ocean temperature observations have been identified and significantly reduced, resulting in improved estimates of ocean thermal expansion. Models that include all climate forcings are in good agreement with these improved observations and indicate the importance of stratospheric aerosol loadings from volcanic eruptions. Estimates of the volumes of G&ICs and their contributions to sea level rise have improved. Results from recent (but possibly incomplete) efforts to develop improved ice sheet models should be available for the 2013 IPCC projections. Improved understanding of sea level rise is paving the way for using observations to constrain projections. Understanding of the regional variations in sea level change as a result of changes in ocean properties, wind-stress patterns, and heat and freshwater inputs into the ocean is improving. Recently, estimates of sea level changes resulting from changes in Earth's gravitational field and the solid Earth response to changes in surface loading have been included in regional projections. While potentially valuable, semi-empirical models have important limitations, and their projections should be treated with caution.

Relevance:

60.00%

Publisher:

Abstract:

Warfarin resistance was first discovered among Norway rat (Rattus norvegicus) populations in Scotland in 1958 and further reports of resistance, both in this species and in others, soon followed from other parts of Europe and the United States. Researchers quickly defined the practical impact of these resistance phenomena and developed robust methods by which to monitor their spread. These tasks were relatively simple because of the high degree of immunity to warfarin conferred by the resistance genes. Later, the second generation anticoagulants were introduced to control rodents resistant to the warfarin-like compounds, but resistance to difenacoum, bromadiolone and brodifacoum is now reported in certain localities in Europe and elsewhere. However, the adoption of test methods designed initially for use with the first generation compounds to identify resistance to compounds of the second generation has led to some practical difficulties in conducting tests and in establishing meaningful resistance baselines. In particular, the results of certain test methodologies are difficult to interpret in terms of the likely impact on practical control treatments of the resistance phenomena they seek to identify. This paper defines rodenticide resistance in the context of both first and second generation anticoagulants. It examines the advantages and disadvantages of existing laboratory and field methods used in the detection of rodent populations resistant to anticoagulants and proposes some improvements in the application of these techniques and in the interpretation of their results.

Relevance:

60.00%

Publisher:

Abstract:

There is controversy about whether traditional medicine can guide drug discovery, and investment in ethnobotanically led research has fluctuated. One view is that traditionally used plants are not necessarily efficacious and there are no robust methods for distinguishing the ones that are most likely to be bioactive when selecting species for further testing. Here, we reconstruct a genus-level molecular phylogeny representing the 20,000 species found in the floras of three disparate biodiversity hotspots: Nepal, New Zealand and the Cape of South Africa. Borrowing phylogenetic methods from community ecology, we reveal significant clustering of the 1,500 traditionally used species, and provide a direct measure of the relatedness of the three medicinal floras. We demonstrate shared phylogenetic patterns across the floras: related plants from these regions are used to treat medical conditions in the same therapeutic areas. This strongly suggests independent discovery of plant efficacy, an interpretation corroborated by the presence of a significantly greater proportion of known bioactive species in these plant groups than in a random sample. Phylogenetic cross-cultural comparison can focus screening efforts on a subset of traditionally used plants that are richer in bioactive compounds, and could revitalise the use of traditional knowledge in bioprospecting.
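
The sketch below illustrates the kind of community-ecology test the study borrows: a permutation test of whether the mean pairwise phylogenetic distance (MPD) among traditionally used species is lower (i.e. more clustered) than expected under random draws from the flora. The distance matrix and species sets are hypothetical stand-ins.

```python
# Permutation test for phylogenetic clustering of a set of 'used' species,
# using mean pairwise distance (MPD) against a null of random draws.
import numpy as np

rng = np.random.default_rng(2)

def mpd(dist, idx):
    """Mean pairwise phylogenetic distance among the species in idx."""
    sub = dist[np.ix_(idx, idx)]
    return sub[np.triu_indices(len(idx), k=1)].mean()

n_species, n_used, n_perm = 300, 40, 999
dist = rng.uniform(0, 1, (n_species, n_species))
dist = (dist + dist.T) / 2.0                   # symmetrise the toy matrix
np.fill_diagonal(dist, 0.0)

used = rng.choice(n_species, n_used, replace=False)   # 'medicinal' species
obs = mpd(dist, used)
null = np.array([mpd(dist, rng.choice(n_species, n_used, replace=False))
                 for _ in range(n_perm)])
p = (1 + np.sum(null <= obs)) / (n_perm + 1)   # one-sided: clustering = low MPD
print(f"observed MPD = {obs:.3f}, p = {p:.3f}")
```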

Relevance:

60.00%

Publisher:

Abstract:

Changes in landscape composition and structure may impact the conservation and management of protected areas. Species that depend on specific habitats are at risk of extinction when these habitats are degraded or lost. Designing robust methods to evaluate landscape composition will assist decision- and policy-making in emerging landscapes. This paper describes a rapid assessment methodology aimed at evaluating land-cover quality for birds, plants, butterflies and bees around seven UK Natura 2000 sites. An expert panel assigned quality values to standard Coordination of Information on the Environment (CORINE) land-cover classes for each taxonomic group. Quality was assessed based on historical (1950, 1990), current (2000) and future (2030) land-cover data, the last projected using three alternative scenarios: a growth applied strategy (GRAS), a business-as-might-be-usual (BAMBU) scenario, and a sustainable European development goal (SEDG) scenario. A quantitative quality index weighted the area of each land-cover parcel by a taxon-specific quality measure. Land parcels with high quality for all taxonomic groups were evaluated for temporal changes in area, size and adjacency. For all sites and taxonomic groups, the rate of deterioration of land-cover quality was greater between 1950 and 1990 than current rates or rates modelled under the alternative future scenarios (2000–2030). Model predictions indicated that land-cover quality stabilized over time under the GRAS scenario, and was close to stable under the BAMBU scenario. The SEDG scenario suggested an ongoing loss of quality, though lower than the historical rate of c. 1% loss per decade. None of the future scenarios showed accelerated fragmentation, but rather increases in the area, adjacency and diversity of high-quality land parcels in the landscape.
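
A minimal sketch of such an area-weighted quality index, with hypothetical CORINE classes, parcel areas and expert scores (none of these values are the study's):

```python
# Area-weighted land-cover quality index: each parcel's area is weighted by
# an expert quality score assigned to its CORINE class, per taxonomic group.
import numpy as np

areas = np.array([120.0, 45.0, 300.0, 80.0])          # parcel areas (ha)
cover = np.array([0, 1, 2, 1])                        # CORINE class per parcel
quality = {"birds":       np.array([0.9, 0.4, 0.2]),  # score per CORINE class
           "butterflies": np.array([0.7, 0.8, 0.1])}

for taxon, q in quality.items():
    index = np.sum(areas * q[cover]) / areas.sum()    # area-weighted mean quality
    print(f"{taxon}: quality index = {index:.3f}")
```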

Relevance:

60.00%

Publisher:

Abstract:

Ethnobotanical relevance: Cancer patients commonly use traditional medicines (TM), and in Thailand these are popular both for self-medication and as prescribed by TM practitioners, yet they are rarely monitored. A study was conducted at Wat Khampramong, a Thai Buddhist temple herbal medicine hospice, to document some of these practices as well as the hospice regime. Materials and methods: Cancer patients (n=286) were surveyed shortly after admission as to which TMs they had previously taken and their perceptions of the effects experienced. They were also asked to describe their current symptoms. Treatment at the hospice is built upon an 11-herb anti-cancer formula, yod-ya-mareng, prescribed for all patients, and ideally its effects would have been evaluated. However, other herbal medicines and holistic practices are integral to the regime, so instead we attempted to assess the value of the patients' stay at the hospice by measuring any change in symptom burden, as they perceived it. Surviving patients (n=270) were therefore asked to describe their symptoms again just before leaving. Results: 42% of patients (120/286; 95% CI 36.4%, 47.8%) had used herbal medicines before their arrival, with 31.7% (38/120; 95% CI 24%, 40.4%) using several at once. Mixed effects were reported for these products. After taking the herbal regime at Khampramong, 77% (208/270; 95% CI 71.7%, 81.7%) reported benefit, and a comparison of the incidence of the most common symptoms (pain, dyspepsia, abdominal or visceral pain, insomnia, fatigue) showed statistical significance (χ² = 57.1, df = 7, p < 0.001). Conclusions: A wide range of TMs is taken by cancer patients in Thailand and considered to provide more benefit than harm, and this perception extends to the temple regime. Patients reported a significant reduction in symptoms after staying at Khampramong, indicating an improvement in quality of life, the aim of hospices everywhere. Based on this evidence it is not possible to justify the use of TM for cancer in general, but this study suggests that further research is warranted. The uncontrolled use of TMs, many of which are uncharacterised, raises concerns, and this work also highlights the fact that validated, robust methods of assessing holistic medical regimes are urgently needed.
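
For illustration only: the reported comparison corresponds to a chi-squared test on a 2 × 8 contingency table of symptom incidence on admission versus departure (df = (2−1)(8−1) = 7, matching the abstract). The counts below are hypothetical, not the study's data.

```python
# Chi-squared test of symptom incidence across eight categories, on admission
# vs. on departure. All counts are hypothetical placeholders.
from scipy.stats import chi2_contingency

symptoms = ["pain", "dyspepsia", "visceral pain", "insomnia",
            "fatigue", "nausea", "anorexia", "dyspnoea"]
admission = [150, 90, 70, 110, 130, 60, 55, 40]   # hypothetical, n = 286
departure = [80, 40, 35, 60, 90, 45, 50, 35]      # hypothetical, n = 270

chi2, p, df, _ = chi2_contingency([admission, departure])
print(f"chi2 = {chi2:.1f}, df = {df}, p = {p:.4g}")
```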

Relevance:

60.00%

Publisher:

Abstract:

Assessments concerning the effects of climate change, water resource availability and water deprivation in West Africa have not frequently considered the positive contribution to be derived from collecting and reusing water for domestic purposes. Where the originating water is taken from a clean water source and has been used a first time for washing or bathing, this water is commonly called "greywater". Greywater is a prolific resource that is generated wherever people live. Treated greywater can be used for domestic cleaning, for flushing toilets where appropriate, for washing cars, sometimes for watering kitchen gardens, and for clothes washing prior to rinsing. A large theoretical potential therefore exists to increase total water resource availability if greywater were to be widely reused. Locally treated greywater reduces the distribution network requirement, lowers construction effort and cost, and, wherever possible, minimises the associated carbon footprint. Such locally treated greywater offers significant practical opportunities for increasing the total available water resources at a local level. The reuse of treated greywater is one important action that will help to mitigate the declining availability of clean water supplies in some areas, and the mitigation expected to be required in future aligns well with WHO/UNICEF (2012) aspirations. The evaluation of potential opportunities for prioritising greywater systems to support water reuse takes into account the availability of water resources, water use indicators and published estimates in order to understand typical patterns of water demand. The approach supports knowledge acquisition regarding local conditions, enabling capacity building for greywater reuse, an understanding of the systems that are most likely to encourage greywater reuse, and practices and future actions to stimulate greywater infrastructure planning, design and implementation. Although reuse might be considered to increase the uncertainty of achieving a specified quality of the water supply, robust methods and technologies are available for local treatment. Resource strategies for greywater reuse have the potential to consistently improve water efficiency and availability in water-impoverished and water-stressed regions of Ghana and West Africa. In this paper, untreated greywater is referred to as "greywater" and treated greywater as "treated greywater".

Relevance:

60.00%

Publisher:

Abstract:

1. The rapid expansion of systematic monitoring schemes necessitates robust methods to reliably assess species' status and trends. Insect monitoring poses a challenge where there are strong seasonal patterns, requiring repeated counts to reliably assess abundance. Butterfly monitoring schemes (BMSs) operate in an increasing number of countries with broadly the same methodology, yet they differ in their observation frequency and in the methods used to compute annual abundance indices. 2. Using simulated and observed data, we performed an extensive comparison of two approaches used to derive abundance indices from count data collected via BMS, under a range of sampling frequencies. Linear interpolation is most commonly used to estimate abundance indices from seasonal count series. A second method, hereafter the regional generalized additive model (GAM), fits a GAM to repeated counts within sites across a climatic region. For the two methods, we estimated bias in abundance indices and the statistical power for detecting trends, given different proportions of missing counts. We also compared the accuracy of trend estimates using systematically degraded observed counts of the Gatekeeper Pyronia tithonus (Linnaeus 1767). 3. The regional GAM method generally outperforms the linear interpolation method. When the proportion of missing counts increased beyond 50%, indices derived via the linear interpolation method showed substantially higher estimation error as well as clear biases, in comparison to the regional GAM method. The regional GAM method also showed higher power to detect trends when the proportion of missing counts was substantial. 4. Synthesis and applications. Monitoring offers invaluable data to support conservation policy and management, but requires robust analysis approaches and guidance for new and expanding schemes. Based on our findings, we recommend the regional generalized additive model approach when conducting integrative analyses across schemes, or when analysing scheme data with reduced sampling efforts. This method enables existing schemes to be expanded or new schemes to be developed with reduced within-year sampling frequency, as well as affording options to adapt protocols to more efficiently assess species status and trends across large geographical scales.
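
A toy version of the comparison, for a single simulated site: the paper's regional GAM is stood in for by a Poisson regression on a fixed-df B-spline basis (via statsmodels and patsy), contrasted with linear interpolation, under 50% missing weekly visits. This is a sketch of the two indexing ideas, not the schemes' production code.

```python
# Compare two ways of building an annual abundance index from weekly butterfly
# counts with missing visits: linear interpolation vs. a spline-based Poisson
# fit (a stand-in for the regional GAM).
import numpy as np
import statsmodels.api as sm
from patsy import dmatrix

rng = np.random.default_rng(3)
weeks = np.arange(1, 27)
lam = 40 * np.exp(-0.5 * ((weeks - 13) / 4.0) ** 2)   # seasonal flight curve
counts = rng.poisson(lam).astype(float)
obs = counts.copy()
obs[rng.choice(26, 13, replace=False)] = np.nan       # 50% missing visits
ok = ~np.isnan(obs)

# Method 1: linear interpolation across missing weeks, then sum
interp_index = np.interp(weeks, weeks[ok], obs[ok]).sum()
# Method 2: Poisson regression on a cubic B-spline basis over week, then sum
basis = dmatrix("bs(w, df=6)", {"w": weeks}, return_type="dataframe")
fit = sm.GLM(obs[ok], basis[ok], family=sm.families.Poisson()).fit()
spline_index = fit.predict(basis).sum()
print(f"true index {counts.sum():.0f}, interpolation {interp_index:.0f}, "
      f"spline {spline_index:.0f}")
```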

Relevance:

40.00%

Publisher:

Abstract:

In this correspondence, new robust nonlinear model construction algorithms for a large class of linear-in-the-parameters models are introduced to enhance model robustness via combined parameter regularization and new robust structural selection criteria. In parallel to parameter regularization, we use two classes of robust model selection criteria: experimental design criteria that optimize model adequacy, and the predicted residual sums of squares (PRESS) statistic that optimizes model generalization capability. Three robust identification algorithms are introduced: combined A-optimality with the regularized orthogonal least squares algorithm; combined D-optimality with the regularized orthogonal least squares algorithm; and the combined PRESS statistic with the regularized orthogonal least squares algorithm. A common characteristic of these algorithms is that the inherent computational efficiency of the orthogonalization scheme in orthogonal least squares, or regularized orthogonal least squares, has been extended such that the new algorithms are computationally efficient. Numerical examples are included to demonstrate the effectiveness of the algorithms.
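
A minimal sketch of the PRESS criterion combined with parameter regularization (the orthogonal-least-squares structure selection itself is not reproduced here): for ridge-regularized least squares with fixed λ, the leave-one-out residuals follow exactly from the hat matrix, so PRESS = Σᵢ (eᵢ / (1 − Hᵢᵢ))² can be computed without refitting.

```python
# PRESS for a regularized linear-in-the-parameters model via the hat matrix
# H = X (X'X + lam I)^{-1} X', using the exact leave-one-out shortcut
# e_loo_i = e_i / (1 - H_ii) for fixed lam.
import numpy as np

rng = np.random.default_rng(4)
n, p = 100, 8
X = rng.normal(size=(n, p))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=n)

def press(X, y, lam):
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T)
    e = y - H @ y                               # ordinary residuals
    return np.sum((e / (1.0 - np.diag(H))) ** 2)

for lam in (1e-4, 1e-2, 1.0, 10.0):             # pick lam minimizing PRESS
    print(f"lambda = {lam:g}: PRESS = {press(X, y, lam):.2f}")
```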

Relevance:

30.00%

Publisher:

Abstract:

Climate change science is increasingly concerned with methods for managing and integrating sources of uncertainty from emission storylines, climate model projections, and ecosystem model parameterizations. In tropical ecosystems, regional climate projections and modeled ecosystem responses vary greatly, leading to a significant source of uncertainty in global biogeochemical accounting and possible future climate feedbacks. Here, we combine an ensemble of IPCC-AR4 climate change projections for the Amazon Basin (eight general circulation models) with alternative ecosystem parameter sets for the dynamic global vegetation model LPJmL. We evaluate LPJmL simulations of carbon stocks and fluxes against flux tower and aboveground biomass datasets for individual sites and the entire basin. Variability in LPJmL model sensitivity to future climate change is primarily related to light and water limitations through biochemical and water-balance-related parameters. Temperature-dependent parameters related to plant respiration and photosynthesis appear to be less important than vegetation dynamics (and their parameters) in determining the magnitude of ecosystem response to climate change. Variance partitioning approaches reveal that the relationship between uncertainty from ecosystem dynamics and from climate projections depends on geographic location and the targeted ecosystem process. Parameter uncertainty from the LPJmL model does not affect the trajectory of ecosystem response for a given climate change scenario, and the primary source of uncertainty for Amazon 'dieback' is the uncertainty among climate projections. Our approach for describing uncertainty is applicable for informing and prioritizing policy options related to mitigation and adaptation where long-term investments are required.
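
A toy sketch of the variance-partitioning idea: a two-way ANOVA decomposition of a hypothetical ensemble (8 GCMs × 5 parameter sets) into climate-model, parameter and residual shares. The values are simulated, not the study's outputs.

```python
# ANOVA-style partitioning of ensemble spread: rows = climate models (GCMs),
# columns = ecosystem parameter sets; cell values are hypothetical
# end-of-century biomass changes.
import numpy as np

rng = np.random.default_rng(5)
gcm_effect = rng.normal(0, 2.0, size=(8, 1))   # 8 GCMs, strong spread
par_effect = rng.normal(0, 0.5, size=(1, 5))   # 5 parameter sets, weaker
y = gcm_effect + par_effect + rng.normal(0, 0.2, size=(8, 5))

grand = y.mean()
ss_gcm = y.shape[1] * np.sum((y.mean(axis=1) - grand) ** 2)
ss_par = y.shape[0] * np.sum((y.mean(axis=0) - grand) ** 2)
ss_tot = np.sum((y - grand) ** 2)
print(f"GCM share {ss_gcm / ss_tot:.0%}, parameter share {ss_par / ss_tot:.0%}, "
      f"residual {(ss_tot - ss_gcm - ss_par) / ss_tot:.0%}")
```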

Relevance:

30.00%

Publisher:

Abstract:

We propose a novel method for scoring the accuracy of protein binding site predictions – the Binding-site Distance Test (BDT) score. Recently, the Matthews Correlation Coefficient (MCC) has been used to evaluate binding site predictions, both by developers of new methods and by the assessors for the community-wide prediction experiment CASP8. Whilst being a rigorous scoring method, the MCC does not take into account the actual 3D location of the predicted residues relative to the observed binding site. Thus, an incorrectly predicted site that is nevertheless close to the observed binding site will obtain a score identical to that of the same number of non-binding residues predicted at random. The MCC is also somewhat affected by the subjectivity of determining observed binding residues and the ambiguity of choosing distance cutoffs. By contrast, the BDT method produces continuous scores ranging between 0 and 1, relating to the distance between the predicted and observed residues. Residues predicted close to the binding site score higher than those more distant, providing a better reflection of the true accuracy of predictions. The CASP8 function predictions were evaluated using both the MCC and BDT methods and the scores were compared. BDT scores were found to correlate strongly with MCC scores whilst being less susceptible to the subjectivity of defining binding residues. We therefore suggest that this new, simple score is a potentially more robust method for future evaluations of protein–ligand binding site predictions.
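
The abstract does not state the BDT formula, so the following is only an illustrative distance-weighted score in the same spirit: each predicted residue earns credit that decays with its distance to the nearest observed binding residue, giving a continuous value in [0, 1]. The decay function and d0 scale are assumptions, not the published definition.

```python
# Illustrative distance-weighted binding-site score: near-miss predictions
# score higher than equally sized random predictions far from the site.
import numpy as np

def distance_score(pred_xyz, obs_xyz, d0=4.0):
    """pred_xyz, obs_xyz: (n, 3) arrays of residue coordinates (angstroms)."""
    d = np.linalg.norm(pred_xyz[:, None, :] - obs_xyz[None, :, :], axis=2)
    nearest = d.min(axis=1)                   # distance to closest observed residue
    return float(np.mean(1.0 / (1.0 + (nearest / d0) ** 2)))

obs = np.array([[0.0, 0, 0], [3.0, 0, 0]])
near_miss = np.array([[1.0, 0, 0], [2.5, 0.5, 0]])   # close to the site
random_far = np.array([[20.0, 0, 0], [25.0, 0, 0]])  # same count, far away
print(distance_score(near_miss, obs), distance_score(random_far, obs))
```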

Relevance:

30.00%

Publisher:

Abstract:

In this work the G_A^0 distribution is assumed as the universal model for amplitude Synthetic Aperture Radar (SAR) imagery data under the Multiplicative Model. The observed data are therefore assumed to obey a G_A^0(α, γ, n) law, where the parameter n is related to the speckle noise and (α, γ) are related to the ground truth, giving information about the background. Maps generated by estimating (α, γ) at each coordinate can therefore be used as input for classification methods. Maximum likelihood estimators are derived and used to form estimated parameter maps. This estimation can be hampered by the presence of corner reflectors, man-made objects used to calibrate SAR images that produce large return values. In order to alleviate this contamination, robust M-estimators are also derived for the universal model. Gaussian maximum likelihood classification is used to obtain maps from challenging simulated data, and the superiority of robust estimation is quantitatively assessed.
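
A maximum-likelihood fitting sketch, assuming the standard G_A^0 amplitude density f(z) = 2 nⁿ Γ(n−α) z^(2n−1) / (γ^α Γ(n) Γ(−α) (γ + n z²)^(n−α)) with α < 0, γ > 0 and the number of looks n known; data are simulated from the multiplicative model (inverse-gamma backscatter × unit-mean gamma speckle). The robust M-estimation step is not reproduced here.

```python
# ML estimation of (alpha, gamma) for the G_A^0 amplitude model, n known.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def negloglik(theta, z, n):
    alpha, gamma = -np.exp(theta[0]), np.exp(theta[1])  # enforce alpha<0, gamma>0
    ll = (np.log(2.0) + n * np.log(n) + gammaln(n - alpha)
          - alpha * np.log(gamma) - gammaln(n) - gammaln(-alpha)
          + (2 * n - 1) * np.log(z) - (n - alpha) * np.log(gamma + n * z**2))
    return -ll.sum()

# Simulate amplitudes: intensity = inverse-gamma backscatter x gamma speckle
rng = np.random.default_rng(6)
n_looks, alpha0, gamma0 = 1, -3.0, 2.0
speckle = rng.gamma(n_looks, 1.0 / n_looks, 5000)       # unit-mean speckle
backscatter = gamma0 / rng.gamma(-alpha0, 1.0, 5000)    # inverse-gamma
z = np.sqrt(backscatter * speckle)                      # G_A^0 amplitude sample

res = minimize(negloglik, x0=[0.0, 0.0], args=(z, n_looks), method="Nelder-Mead")
print("alpha_hat =", -np.exp(res.x[0]), "gamma_hat =", np.exp(res.x[1]))
```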

Relevance:

30.00%

Publisher:

Abstract:

Several pixel-based people counting methods have been developed over the years. Among these, the product of scale-weighted pixel sums and a linear correlation coefficient is a popular people counting approach. However, most approaches have paid little attention to resolving the true background and instead take all foreground pixels into account. With large crowds moving at varying speeds, and in the presence of other moving objects such as vehicles, this approach is prone to problems. In this paper we present a method which concentrates on determining the true foreground, i.e. human-image pixels only. To do this we have proposed, implemented and comparatively evaluated a human detection layer to make people counting more robust in the presence of noise and a lack of empty background sequences. We show the effect of combining human detection with a pixel-map based algorithm to i) count only human-classified pixels and ii) prevent foreground pixels belonging to humans from being absorbed into the background model. We evaluate the performance of this approach on the PETS 2009 dataset using various configurations of the proposed methods. Our evaluation demonstrates that the basic benchmark method we implemented can achieve an accuracy of up to 87% on sequence "S1.L1 13-57 View 001", and that our proposed approach can achieve up to 82% on sequence "S1.L3 14-33 View 001", where the crowd stops and the benchmark accuracy falls to 64%.
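
A sketch of the idea using stock OpenCV components (not the authors' implementation): a HOG person detector gates a MOG2 foreground mask so that only human-classified pixels are counted, and human regions are withheld from the background update by splicing in the current background estimate. The input file name is hypothetical.

```python
# Count only human-classified foreground pixels, and keep people out of the
# background model by updating it on a frame with human regions masked out.
import cv2
import numpy as np

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)

cap = cv2.VideoCapture("crowd.avi")            # hypothetical input sequence
while True:
    ok, frame = cap.read()
    if not ok:
        break
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    human_mask = np.zeros(frame.shape[:2], dtype=bool)
    for (x, y, w, h) in boxes:
        human_mask[y:y + h, x:x + w] = True

    fg = subtractor.apply(frame, learningRate=0) > 127   # classify only (127 = shadow)
    human_pixels = np.count_nonzero(fg & human_mask)     # count human pixels only
    print("human-classified foreground pixels:", human_pixels)

    # Learn the background from a frame whose human regions are replaced by
    # the current background estimate, so people are not absorbed into it.
    bg = subtractor.getBackgroundImage()
    update_frame = frame.copy()
    if bg is not None and bg.shape == frame.shape:
        update_frame[human_mask] = bg[human_mask]
    subtractor.apply(update_frame)
```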