927 results for "Estimation methods"


Relevance:

100.00%

Publisher:

Abstract:

Non-stationary signal modeling is a well-addressed problem in the literature. Many methods have been proposed to model non-stationary signals, such as time-varying linear prediction and AM-FM modeling, the latter being more popular. Estimation techniques that determine the AM-FM components of a narrow-band signal, such as the Hilbert transform, DESA-1, DESA-2, the auditory processing approach, and the zero-crossing (ZC) approach, are prevalent, but their robustness to noise is not clearly addressed in the literature. This is critical for most practical applications, such as communications. We explore the robustness of different AM-FM estimators in the presence of white Gaussian noise. We also propose three new methods for instantaneous frequency (IF) estimation based on non-uniform samples of the signal and multi-resolution analysis. Experimental results show that ZC-based methods give better results than popular methods such as DESA in both clean and noisy conditions.
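To make the ZC idea concrete, here is a minimal Python sketch of zero-crossing-based IF estimation on a synthetic tone. The window length and test frequency are illustrative choices, not the paper's experimental setup, and this is the basic textbook estimator rather than the authors' proposed methods:

```python
import numpy as np

def zc_instantaneous_frequency(x, fs, win_len):
    """Estimate instantaneous frequency (Hz) from zero-crossing counts:
    a sinusoid at f Hz crosses zero about 2*f times per second."""
    signs = np.sign(x)
    crossings = (signs[:-1] * signs[1:]) < 0   # sign change between samples
    ifs = []
    for start in range(0, crossings.size - win_len + 1, win_len):
        n = crossings[start:start + win_len].sum()
        ifs.append(n * fs / (2.0 * win_len))   # f ~= crossings / (2 * T)
    return np.array(ifs)

fs = 8000.0                                # sampling rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 440.0 * t)          # clean 440 Hz test tone
f_hat = zc_instantaneous_frequency(x, fs, win_len=800)  # estimates near 440 Hz
```

Counting crossings in windows trades time resolution for noise robustness, which is why window length matters when comparing estimators under noise.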

Relevance:

100.00%

Publisher:

Abstract:

Fisheries managers are becoming increasingly aware of the need to quantify all forms of harvest, including that by recreational fishers. This need has been driven both by a growing recognition of the potential impact that noncommercial fishers can have on exploited resources and by the requirement in many jurisdictions to allocate catch limits between different sectors of the wider fishing community. Marine recreational fishers are rarely required to report any of their activity, and some form of survey technique is usually required to estimate levels of recreational catch and effort. In this review, we describe and discuss studies that have attempted to estimate the nature and extent of recreational harvests of marine fishes in New Zealand and Australia over the past 20 years. We compare studies by method to show how circumstances dictate their application and to highlight recent developments that other researchers may find of use. Although there has been some convergence of approach, we suggest that context is an important consideration, and many of the techniques discussed here have been adapted to suit local conditions and to address recognized sources of bias. Much of this experience, along with novel improvements to existing approaches, has been reported only in "gray" literature because of an emphasis on providing estimates for immediate management purposes. This paper brings much of that work together for the first time, and we discuss how others might benefit from our experience.

Relevance:

100.00%

Publisher:

Abstract:

Bycatch, or the incidental catch of nontarget organisms during fishing operations, is a major issue in U.S. shrimp trawl fisheries. Because bycatch is typically discarded at sea, total bycatch is usually estimated by extrapolating from an observed bycatch sample to the entire fleet with either mean-per-unit or ratio estimators. Using both field observations of commercial shrimp trawlers and computer simulations, I compared five methods for generating bycatch estimates that were used in past studies, a mean-per-unit estimator and four forms of the ratio estimator: 1) the mean fish catch per unit of effort, where unit effort was a proxy for sample size; 2) the mean of the individual fish-to-shrimp ratios; 3) the ratio of mean fish catch to mean shrimp catch; 4) the mean of the ratios of fish catch per time fished (a variable measure of effort); and 5) the ratio of mean fish catch to mean time fished. For field data, the different methods used to estimate bycatch of Atlantic croaker, spot, and weakfish yielded extremely different results, with no discernible pattern in the estimates by method, geographic region, or species. Simulated fishing fleets were used to compare bycatch estimated by the five methods with "actual" (simulated) bycatch. Simulations were conducted by using both normal and delta-lognormal distributions of fish and shrimp catches and employed a range of values for several parameters, including mean catches of fish and shrimp, variability in the catches of fish and shrimp, variability in fishing effort, number of observations, and correlations between fish and shrimp catches. Results indicated that only the mean-per-unit estimators provided statistically unbiased estimates, while all other methods overestimated bycatch. The mean of the individual fish-to-shrimp ratios, the method used in the South Atlantic Bight before the 1990s, gave the most biased estimates. Because of the statistically significant two- and three-way interactions among parameters, it is unlikely that estimates generated by one method can be converted or corrected to match estimates made by another method; therefore, bycatch estimates obtained with different methods should not be compared directly.
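The five estimators can be sketched directly from their definitions. The per-tow catch and effort data below are synthetic illustrations, not the study's observer data, and extrapolating any of these to a fleet total would still require scaling by total fleet effort, shrimp landings, or tow counts:

```python
import numpy as np

def bycatch_estimators(fish, shrimp, effort):
    """The five per-tow estimators compared in the study, given arrays of
    fish catch, shrimp catch, and time fished for each observed tow."""
    return {
        # 1) mean fish catch per unit of effort (mean-per-unit)
        "mean_per_unit": fish.mean(),
        # 2) mean of the individual fish-to-shrimp ratios
        "mean_of_ratios_shrimp": (fish / shrimp).mean(),
        # 3) ratio of mean fish catch to mean shrimp catch
        "ratio_of_means_shrimp": fish.mean() / shrimp.mean(),
        # 4) mean of the ratios of fish catch per time fished
        "mean_of_ratios_effort": (fish / effort).mean(),
        # 5) ratio of mean fish catch to mean time fished
        "ratio_of_means_effort": fish.mean() / effort.mean(),
    }

rng = np.random.default_rng(0)
fish = rng.lognormal(3.0, 0.5, size=200)    # kg of fish per tow (synthetic)
shrimp = rng.lognormal(2.0, 0.5, size=200)  # kg of shrimp per tow (synthetic)
hours = rng.uniform(1.0, 4.0, size=200)     # time fished per tow (synthetic)
est = bycatch_estimators(fish, shrimp, hours)
```

With independent, skewed catches, the mean of individual ratios (method 2) typically exceeds the ratio of means (method 3), consistent with the bias pattern reported above.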

Relevance:

100.00%

Publisher:

Abstract:

Camera motion estimation is one of the most significant steps in structure from motion (SFM) with a monocular camera. The normalized 8-point, 7-point, and 5-point algorithms are normally adopted to perform the estimation, each of which has distinct performance characteristics. Given the unique needs and challenges associated with civil infrastructure SFM scenarios, selection of the proper algorithm directly impacts the structure reconstruction results. In this paper, a comparison study of the aforementioned algorithms is conducted to identify the most suitable algorithm, in terms of accuracy and reliability, for reconstructing civil infrastructure. The free variables tested are baseline, depth, and motion. A concrete girder bridge was selected as the test bed and reconstructed using an off-the-shelf camera capturing imagery from all possible positions that maximally cover the bridge's features and geometry. The feature points in the images were extracted and matched via the SURF descriptor. Finally, camera motions were estimated from the corresponding image points by applying the aforementioned algorithms, and the results were evaluated.
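As a reference point for the first of these, below is a self-contained sketch of the textbook normalized 8-point algorithm run on noiseless synthetic correspondences with identity camera intrinsics. This is the standard formulation, not the paper's implementation, and the rotation, translation, and point cloud are arbitrary test values:

```python
import numpy as np

def normalize(pts):
    """Hartley normalization: translate to zero mean, scale so the
    mean distance from the origin is sqrt(2)."""
    c = pts.mean(axis=0)
    s = np.sqrt(2) / np.linalg.norm(pts - c, axis=1).mean()
    T = np.array([[s, 0, -s * c[0]], [0, s, -s * c[1]], [0, 0, 1]])
    ph = np.column_stack([pts, np.ones(len(pts))]) @ T.T
    return ph, T

def eight_point(x1, x2):
    """Normalized 8-point estimate of the fundamental matrix F,
    where corresponding points satisfy x2^T F x1 = 0."""
    p1, T1 = normalize(x1)
    p2, T2 = normalize(x2)
    # one row of the homogeneous system A f = 0 per correspondence
    A = np.column_stack([
        p2[:, 0] * p1[:, 0], p2[:, 0] * p1[:, 1], p2[:, 0],
        p2[:, 1] * p1[:, 0], p2[:, 1] * p1[:, 1], p2[:, 1],
        p1[:, 0], p1[:, 1], np.ones(len(p1))])
    F = np.linalg.svd(A)[2][-1].reshape(3, 3)  # right singular vector
    U, S, Vt = np.linalg.svd(F)                # enforce the rank-2 constraint
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt
    F = T2.T @ F @ T1                          # undo the normalization
    return F / F[2, 2]

# synthetic two-view geometry: small y-axis rotation plus translation
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (20, 3)) + [0.0, 0.0, 5.0]   # points in front of camera
th = 0.1
R = np.array([[np.cos(th), 0, np.sin(th)], [0, 1, 0],
              [-np.sin(th), 0, np.cos(th)]])
t = np.array([1.0, 0.2, 0.1])
X2 = X @ R.T + t
x1 = X[:, :2] / X[:, 2:]     # normalized image coordinates, view 1
x2 = X2[:, :2] / X2[:, 2:]   # normalized image coordinates, view 2
F = eight_point(x1, x2)
```

With noiseless data the epipolar residuals x2^T F x1 are near machine precision; the baseline/depth/motion sensitivity studied in the paper shows up once noise is added to the correspondences.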

Relevance:

100.00%

Publisher:

Abstract:

In this paper, Swedish listed companies' use of capital budgeting and cost-of-capital estimation methods in 2005 and 2008 is examined. The relation between company characteristics and choice of methods is investigated, and both within-country longitudinal and cross-country comparisons are made. Larger companies seem to have used capital budgeting methods more frequently than smaller companies. Compared with U.S. and continental European companies, Swedish listed companies employed capital budgeting methods less frequently. In 2005, the most common method for establishing the cost of equity was to ask investors what return they required. By 2008, the CAPM was instead the most utilised method, which could indicate greater sophistication. The use of project risk when evaluating investments also seems to have gained in popularity, while the use of company risk declined. Overall, the use of sophisticated capital budgeting and cost-of-capital estimation methods seems to be rising and the use of less sophisticated methods declining.
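The CAPM calculation referred to here is simple enough to state directly. The risk-free rate, beta, and market return below are hypothetical illustrations, not figures from the study:

```python
def capm_cost_of_equity(risk_free, beta, market_return):
    """CAPM: required return on equity = risk-free rate
    plus beta times the market risk premium."""
    return risk_free + beta * (market_return - risk_free)

# hypothetical inputs: 3% risk-free rate, beta 1.2, 8% expected market return
r_e = capm_cost_of_equity(0.03, 1.2, 0.08)   # 0.03 + 1.2 * 0.05 = 0.09
```

The contrast drawn in the abstract is between this model-based estimate and simply asking investors for their required return.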

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents two novel algorithms for blind channel equalization (BCE) and blind source separation (BSS). Besides these, a general framework for global convergence analysis is proposed. Finally, the open problem of equalizing a non-irreducible system is answered by the algorithm proposed in this thesis.

Relevance:

100.00%

Publisher:

Abstract:

Additive and nonadditive genetic effects on preweaning weight gain (PWG) of a commercial crossbred population were estimated using different genetic models and estimation methods. The data set consisted of 103,445 records on purebred and crossbred Nelore-Hereford calves raised under pasture conditions on farms located in the southern, southeastern, and central-western regions of Brazil. In addition to breed additive and dominance effects, models including different epistasis covariables were tested. Models jointly considering additive effects and environment (latitude) × genetic effects interactions were also applied. In a first step, analyses were carried out under animal models. In a second step, preadjusted records were analyzed using ordinary least squares (OLS) and ridge regression (RR). The results reinforced evidence that breed additive and dominance effects are not sufficient to explain the observed variability in preweaning traits of Bos taurus × Bos indicus calves, and that genotype × environment interaction plays an important role in the evaluation of crossbred calves. The data were ill-conditioned for estimating the effects of genotype × environment interactions, and models including these effects presented multicollinearity problems. In this case, RR seemed to be a powerful tool for obtaining more plausible and stable estimates. Estimated prediction error variances and variance inflation factors were drastically reduced, and many effects that were not significant under OLS became significant under RR. Predictions of PWG based on RR estimates were more acceptable from a biological perspective. In temperate and subtropical regions, calves with intermediate genetic compositions (close to 1/2 Nelore) exhibited greater predicted PWG. In the tropics, predicted PWG increased linearly as the genotype got closer to Nelore. ©2006 American Society of Animal Science. All rights reserved.
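A minimal sketch of what RR does relative to OLS under near-collinearity, using synthetic data rather than the genetic models of the study: the ridge penalty stabilizes coefficients that OLS cannot separate when two covariables carry almost the same information.

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares: solve (X'X) b = X'y."""
    return np.linalg.solve(X.T @ X, X.T @ y)

def ridge(X, y, lam):
    """Ridge regression: solve (X'X + lam*I) b = X'y,
    shrinking coefficients along ill-determined directions."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)     # nearly collinear predictor
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(scale=0.1, size=n)  # true coefficients (1, 1)
b_ols = ols(X, y)            # unstable: only the sum b1 + b2 is well determined
b_ridge = ridge(X, y, 1.0)   # stable: both coefficients shrink toward 1
```

Choosing the penalty `lam` is itself an estimation problem; the study's RR tuning is not reproduced here.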

Relevance:

100.00%

Publisher:

Abstract:

Includes bibliography

Relevance:

100.00%

Publisher:

Abstract:

One approach to verifying the adequacy of estimation methods for reference evapotranspiration (ET0) is comparison with the Penman-Monteith method, recommended by the Food and Agriculture Organization of the United Nations (FAO) as the standard method for estimating ET0. This study compared the Makkink (MK), Hargreaves (HG), and Solar Radiation (RS) methods for estimating ET0 against Penman-Monteith (PM). For this purpose, we used daily data on global solar radiation, air temperature, relative humidity, and wind speed for the year 2010, obtained from the automatic meteorological station of the National Institute of Meteorology (latitude 18° 91' 66 S, longitude 48° 25' 05 W, altitude 869 m) on the campus of the Federal University of Uberlândia, MG, Brazil. Results for the period were analyzed on a daily basis using regression analysis with the linear model y = ax, where the dependent variable was the Penman-Monteith estimate and the independent variable was the ET0 estimated by each evaluated method. The influence of the standard deviation of daily ET0 on the comparison of methods was also examined. The evaluation indicated that the Solar Radiation method cannot be compared with Penman-Monteith, while the Hargreaves method showed the best fit for estimating ET0.
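For reference, the Hargreaves (HG) method is commonly given in its Hargreaves-Samani form, which needs only temperature extremes and extraterrestrial radiation. The sketch below uses hypothetical inputs, not the Uberlândia station data:

```python
def hargreaves_et0(t_min, t_max, ra):
    """Hargreaves-Samani reference evapotranspiration (mm/day).
    t_min, t_max: daily air temperature extremes (deg C).
    ra: extraterrestrial radiation expressed in mm/day of evaporation."""
    t_mean = (t_min + t_max) / 2.0
    return 0.0023 * ra * (t_mean + 17.8) * (t_max - t_min) ** 0.5

# hypothetical summer day: 18-30 deg C, Ra = 15 mm/day
et0 = hargreaves_et0(18.0, 30.0, 15.0)   # about 5.0 mm/day
```

The appeal of HG in comparisons like this one is that it avoids the humidity, wind, and net-radiation inputs Penman-Monteith requires.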

Relevance:

100.00%

Publisher:

Abstract:

The objective of this study was to estimate, in a population of crossbred cattle, non-additive genetic effects for weight at 205 and 390 days of age and for scrotal circumference, and to evaluate the inclusion of these effects in the prediction of breeding values of sires using different estimation methodologies. In method 1, the data were pre-adjusted for the non-additive effects obtained by least squares in a model that considered the direct additive, maternal, and non-additive fixed genetic effects, the direct and total maternal heterozygosities, and epistasis. In method 2, the non-additive effects were included as covariates in the genetic model. Genetic values for adjusted and non-adjusted data were predicted considering direct additive and maternal effects and, for weight at 205 days, also the permanent environmental effect as random effects in the model. The breeding values of the sire categories for weight at 205 days were then compared to verify changes in the magnitude of the predictions and in the ranking of animals between the two methods of correcting the data for non-additive effects. The non-additive effects were not similar in magnitude and direction between the two estimation methods used, nor across the characteristics evaluated. Pearson and Spearman correlations between breeding values were higher than 0.94, and the use of different methods did not imply changes in the selection of animals.
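The Pearson and Spearman checks used above can be sketched as follows. The breeding values are hypothetical, chosen so that one pair of sires swaps rank between the two methods, and the rank computation assumes no ties:

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation coefficient between two vectors."""
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

def spearman(a, b):
    """Spearman rank correlation: Pearson applied to ranks (no ties)."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return pearson(rank(a), rank(b))

# hypothetical breeding values of five sires under two correction methods
bv_method1 = np.array([0.10, 0.25, 0.40, 0.55, 0.70])
bv_method2 = np.array([0.12, 0.24, 0.41, 0.71, 0.56])  # top two sires swap
r_p = pearson(bv_method1, bv_method2)
r_s = spearman(bv_method1, bv_method2)   # 0.9: one adjacent swap in rank
```

Pearson measures agreement in the magnitudes of the predictions, while Spearman measures agreement in the ranking used for selection, which is why the study reports both.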

Relevance:

100.00%

Publisher:

Abstract:

The amount and type of ground cover is an important characteristic to measure when collecting soil disturbance monitoring data after a timber harvest. Estimates of ground cover and bare soil can be used for tracking changes in invasive species, plant growth and regeneration, woody debris loadings, and the risk of surface water runoff and soil erosion. A new method of assessing ground cover and soil disturbance, the Forest Soil Disturbance Monitoring Protocol (FSDMP), was recently published by the U.S. Forest Service. This protocol uses the frequency of cover types in small circular (15 cm) plots to compare the ground surface in pre- and post-harvest conditions. While both frequency and percent cover are common methods of describing vegetation, frequency has rarely been used to measure ground surface cover. In this study, three methods for assessing ground cover percent (step-point, 15 cm dia. circular plot, and 1×5 m visual plot estimates) were compared with the FSDMP frequency method. Results show that the FSDMP method provides significantly higher estimates of ground surface condition for most soil cover types, except coarse wood. The three cover methods had similar estimates for most cover values. The FSDMP method also produced the highest value when bare soil estimates were used to model erosion risk. In a person-hour analysis, estimating ground cover percent in 15 cm dia. plots required the least sampling time, and provided standard errors similar to the other cover estimates even at low sampling intensities (n=18). If ground cover estimates are desired in soil monitoring, a small plot (15 cm dia. circle) or a step-point method can provide a more accurate estimate in less time than the current FSDMP method.
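A stylized toy model (not the FSDMP protocol itself, and with an arbitrary cover fraction and plot size) shows why frequency in small multi-point plots tends to run higher than a point estimate of percent cover: a plot counts as "covered" if any point inside it hits cover, so its expected frequency is 1 - (1 - p)^k rather than p.

```python
import numpy as np

rng = np.random.default_rng(0)
true_cover = 0.20        # 20% of ground points carry the cover type
points_per_plot = 9      # a small plot effectively spans several ground points

# step-point / point-intercept: does a single point hit cover?
hits = rng.random(100_000) < true_cover
point_estimate = hits.mean()              # unbiased for percent cover

# frequency method: a plot counts as "covered" if ANY point in it hits
plots = rng.random((100_000, points_per_plot)) < true_cover
freq_estimate = plots.any(axis=1).mean()  # approaches 1 - (1 - p)^k > p
```

Under these assumptions the frequency estimate sits well above the true percent cover, which is consistent in direction with the higher FSDMP estimates reported above.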