906 results for accuracy of estimation


Relevance: 100.00%

Abstract:

Existing empirical evidence has frequently shown that professional forecasters are conservative and display herding behaviour. Whilst a large number of papers have considered equities as well as macroeconomic series, few have considered the accuracy of forecasts in alternative asset classes such as real estate. We consider the accuracy of forecasts for the UK commercial real estate market over the period 1999-2011. The results illustrate that forecasters tend to under-estimate growth rates during strong market conditions and to over-estimate them when the market is performing poorly. This conservatism not only results in smoothed estimates but also implies that forecasters display herding behaviour. There is also a marked difference in the relative accuracy of capital and total returns versus rental figures: whilst rental growth forecasts are relatively accurate, considerable inaccuracy is observed with respect to capital values and total returns.
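The regime-dependent bias described above can be made concrete in a few lines. The sketch below computes signed forecast errors and splits them by market condition; all figures and the strong/weak split are illustrative assumptions, not data from the study.

```python
import numpy as np

# Hypothetical realised growth rates and one-year-ahead consensus forecasts;
# the study's survey data are not reproduced here.
actual   = np.array([9.8, 10.2, 3.1, -4.5, -22.1, 2.9, 7.6])
forecast = np.array([6.0,  7.1, 3.4, -1.2, -10.3, 1.5, 5.0])

error  = forecast - actual           # negative = under-estimation
strong = actual > actual.mean()      # crude strong/weak market split

print("mean error, strong markets:", error[strong].mean())   # expected < 0
print("mean error, weak markets:  ", error[~strong].mean())  # expected > 0
```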

Relevance: 100.00%

Abstract:

4-Dimensional Variational Data Assimilation (4DVAR) assimilates observations through the minimisation of a least-squares objective function constrained by the model flow. We refer to 4DVAR as strong-constraint 4DVAR (sc4DVAR) in this thesis, as it assumes the model is perfect. Relaxing this assumption gives rise to weak-constraint 4DVAR (wc4DVAR), leading to a different minimisation problem with more degrees of freedom. We consider two wc4DVAR formulations in this thesis: the model error formulation and the state estimation formulation. The 4DVAR objective function is traditionally solved using gradient-based iterative methods. The principal method used in Numerical Weather Prediction today is the Gauss-Newton approach. This method introduces a linearised `inner-loop' objective function which, upon convergence, updates the solution of the non-linear `outer-loop' objective function. This requires many evaluations of the objective function and its gradient, which emphasises the importance of the Hessian. The eigenvalues and eigenvectors of the Hessian provide insight into the degree of convexity of the objective function, while also indicating the difficulty one may encounter when iteratively solving 4DVAR. The condition number of the Hessian is an appropriate measure of the sensitivity of the problem to the input data; it can also indicate the rate of convergence and the solution accuracy of the minimisation algorithm. This thesis investigates the sensitivity of the solution process minimising both wc4DVAR objective functions to the internal assimilation parameters that compose the problem. We gain insight into these sensitivities by bounding the condition number of the Hessians of both objective functions. We also precondition the model error objective function and show improved convergence. Using the bounds, we show that the sensitivities of both formulations are related to the balance of the error variances, the assimilation window length and the correlation length-scales. We further demonstrate this through numerical experiments on the condition number and data assimilation experiments using linear and non-linear chaotic toy models.
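For a linear model and linear observation operator, the sc4DVAR Hessian has the closed form S = B^{-1} + sum_k (H M^k)^T R^{-1} (H M^k), so its condition number can be computed directly. The sketch below does this for a small toy system; all operators and covariances are illustrative choices, not the configurations studied in the thesis.

```python
import numpy as np

# Toy strong-constraint 4DVAR Hessian for a linear model x_{k+1} = M x_k,
# observed at every step. B is the background error covariance, R the
# observation error covariance, H the observation operator.
n, nsteps = 10, 5
M = np.eye(n, k=1) * 0.5 + np.eye(n) * 0.9   # linear model operator
H = np.eye(n)                                # observe the full state
B = 0.5 * np.eye(n)
R = 0.1 * np.eye(n)

S = np.linalg.inv(B)
Mk = np.eye(n)
for _ in range(nsteps):
    S += (H @ Mk).T @ np.linalg.inv(R) @ (H @ Mk)
    Mk = M @ Mk                              # advance M^k

print("condition number of the Hessian:", np.linalg.cond(S))
```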

Relevance: 100.00%

Abstract:

In this paper we present a novel approach to multispectral image contextual classification that combines iterative combinatorial optimization algorithms. The pixel-wise decision rule is defined using a Bayesian approach that combines two MRF models: a Gaussian Markov Random Field (GMRF) for the observations (likelihood) and a Potts model for the a priori knowledge, which regularizes the solution in the presence of noisy data. Hence, the classification problem is stated within a Maximum a Posteriori (MAP) framework. In order to approximate the MAP solution, we apply several combinatorial optimization methods with multiple simultaneous initializations, making the solution less sensitive to the initial conditions and reducing both computational cost and time in comparison with Simulated Annealing, which is often infeasible in real image processing applications. The Markov Random Field model parameters are estimated by the Maximum Pseudo-Likelihood (MPL) approach, avoiding manual adjustment of the regularization parameters. Asymptotic evaluations assess the accuracy of the proposed parameter estimation procedure. To test and evaluate the proposed classification method, we adopt metrics for quantitative performance assessment (Cohen's Kappa coefficient), allowing a robust and accurate statistical analysis. The results clearly show that combining sub-optimal contextual algorithms significantly improves the classification performance, indicating the effectiveness of the proposed methodology.
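Iterated Conditional Modes (ICM) is the classic sub-optimal combinatorial algorithm for this kind of MAP problem. The sketch below runs ICM with a per-pixel Gaussian likelihood standing in for the paper's GMRF and a first-order Potts prior; the class means, the beta parameter and the synthetic image are all illustrative assumptions.

```python
import numpy as np

# Minimal ICM sketch for MAP classification: each pixel greedily takes the
# label minimising (likelihood term) + beta * (number of disagreeing
# 4-neighbours), i.e. a Potts prior.
def icm(obs, means, beta=1.5, n_iter=5):
    labels = np.abs(obs[..., None] - means).argmin(-1)   # ML initialisation
    H, W = obs.shape
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                best, best_e = labels[i, j], np.inf
                for k in range(len(means)):
                    e = (obs[i, j] - means[k]) ** 2       # Gaussian likelihood
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < H and 0 <= nj < W and labels[ni, nj] != k:
                            e += beta                     # Potts penalty
                    if e < best_e:
                        best, best_e = k, e
                labels[i, j] = best
    return labels

rng = np.random.default_rng(0)
obs = rng.normal(np.repeat([0.0, 1.0], 128).reshape(16, 16), 0.4)  # two classes
print(icm(obs, means=np.array([0.0, 1.0])))
```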

Relevance: 100.00%

Abstract:

The issue of smoothing in kriging has been addressed either by estimation or by simulation. The solution via estimation calls for postprocessing the kriging estimates in order to correct the smoothing effect. Stochastic simulation provides equiprobable images that present no smoothing and reproduce the covariance model; consequently, these images reproduce both the sample histogram and the sample semivariogram. The remaining problem, however, is the lack of local accuracy of simulated images. In this paper, a postprocessing algorithm for correcting the smoothing effect of ordinary kriging estimates is compared with sequential Gaussian simulation realizations. Based on samples drawn from exhaustive data sets, the postprocessing algorithm is shown to be superior to any individual simulation realization, albeit at the expense of providing a single deterministic estimate of the random function.
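A simple way to see what such a postprocessing step does is an affine rescaling that restores the sample mean and variance to the kriged values. The generic correction below is an illustration only; it is not necessarily the algorithm compared in the paper.

```python
import numpy as np

# Affine smoothing correction: rescale kriging estimates so that their mean
# and variance match the sample statistics.
def affine_correction(kriged, sample_mean, sample_var):
    m, v = kriged.mean(), kriged.var()
    return sample_mean + (kriged - m) * np.sqrt(sample_var / v)

kriged = np.array([4.8, 5.1, 5.0, 4.9, 5.2])   # smoothed estimates (toy values)
corrected = affine_correction(kriged, sample_mean=5.0, sample_var=0.5)
print(corrected.var())                          # ~0.5 after correction
```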

Relevance: 100.00%

Abstract:

This paper proposes a spatial-temporal downscaling approach to constructing intensity-duration-frequency (IDF) relations at a local site in the context of climate change and variability. More specifically, the proposed approach combines a spatial downscaling method, which links large-scale climate variables given by General Circulation Model (GCM) simulations with daily extreme precipitation at a site, with a temporal downscaling procedure that describes the relationship between daily and sub-daily extreme precipitation based on the scaling Generalized Extreme Value (GEV) distribution. The feasibility and accuracy of the suggested method were assessed using rainfall data available at eight stations in Quebec (Canada) for the 1961-2000 period and climate simulations under four different climate change scenarios provided by the Canadian (CGCM3) and UK (HadCM3) GCMs. The results indicate that it is feasible to link sub-daily extreme rainfall at a local site with large-scale GCM-based daily climate predictors in order to construct IDF relations for present (1961-1990) and future (2020s, 2050s, and 2080s) periods under different climate change scenarios. In addition, annual maximum rainfall downscaled from the HadCM3 displayed smaller changes in the future, while values estimated from the CGCM3 indicated a large increasing trend for future periods; this demonstrates the high uncertainty in climate simulations provided by different GCMs. In summary, the proposed spatial-temporal downscaling method provides an essential tool for estimating the extreme rainfall required by various climate-related impact assessment studies for a given region.
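The temporal step rests on simple scaling of the GEV parameters with duration: location and scale follow a power law while the shape is invariant. The sketch below shows that step in isolation; the scaling exponent and the daily GEV parameters are illustrative assumptions, not values fitted in the paper.

```python
from scipy.stats import genextreme

# Simple-scaling GEV: parameters at duration d follow x_d = (d / D)**eta * x_D,
# with the shape parameter xi unchanged.
eta = 0.7                             # assumed scaling exponent
D, d = 24.0, 1.0                      # daily and target (hourly) durations, hours
mu_D, sigma_D, xi = 50.0, 12.0, 0.1   # assumed daily GEV location, scale, shape

scale = (d / D) ** eta
mu_d, sigma_d = mu_D * scale, sigma_D * scale   # downscaled location and scale

# 100-year 1-hour rainfall from the downscaled GEV (scipy's shape c = -xi)
x100 = genextreme.ppf(1 - 1.0 / 100, c=-xi, loc=mu_d, scale=sigma_d)
print(f"downscaled 100-year 1-hour rainfall: {x100:.1f} mm")
```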

Relevance: 100.00%

Abstract:

We study the joint determination of the lag length, the dimension of the cointegrating space and the rank of the matrix of short-run parameters of a vector autoregressive (VAR) model using model selection criteria. We suggest a new two-step model selection procedure, a hybrid of traditional criteria and criteria with data-dependent penalties, and prove its consistency. A Monte Carlo study explores the finite-sample performance of this procedure and evaluates the forecasting accuracy of the models it selects. Two empirical applications confirm the usefulness of the proposed model selection procedure for forecasting.
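As a flavour of the first step, the sketch below selects the VAR lag length by BIC computed from OLS residual covariances; the cointegration and short-run rank steps of the paper's two-step procedure are omitted, and the data are synthetic.

```python
import numpy as np

# BIC for a VAR(p) with intercept, fitted equation-by-equation via OLS.
def var_bic(y, p):
    T, k = y.shape
    Y = y[p:]                                          # regressand
    X = np.hstack([y[p - i:T - i] for i in range(1, p + 1)])
    X = np.hstack([np.ones((T - p, 1)), X])            # add intercept
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    E = Y - X @ B
    Sigma = E.T @ E / (T - p)                          # residual covariance
    n_par = k * (1 + k * p)                            # coefficients incl. intercepts
    return np.log(np.linalg.det(Sigma)) + n_par * np.log(T - p) / (T - p)

rng = np.random.default_rng(1)
y = rng.normal(size=(200, 2)).cumsum(axis=0)           # toy I(1) data
print(min(range(1, 6), key=lambda p: var_bic(y, p)))   # selected lag length
```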

Relevance: 100.00%

Abstract:

Industrial companies in developing countries are facing rapid growth, and this requires having in place the best organizational processes to cope with market demand. Sales forecasting, as a tool aligned with the general strategy of the company, needs to be as accurate as possible in order to achieve sales targets, making the right information available for purchasing, production planning and control, and ultimately meeting the generated demand on time and in full. The present dissertation is a single case study of the Brazilian subsidiary of Maxam, an international explosives company experiencing high sales growth and therefore facing the challenge of adapting its structure and processes to the rapid growth expected. Several sales forecasting techniques were analyzed to compare the current monthly sales forecast, based on the sales representatives' market knowledge, with forecasts based on the analysis of historical sales data. The findings show how combining qualitative and quantitative forecasts, through a combined forecast that brings together the sales force's knowledge of client demand and time series analysis, improves the accuracy of the company's sales forecast.
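One common way to build such a combined forecast is to weight the judgmental and time-series forecasts inversely to their historical errors. The sketch below shows that idea with invented figures; the dissertation's data and exact combination scheme are not public, so this is an illustration of the general technique only.

```python
import numpy as np

# Hypothetical history: actual sales, the sales force's judgmental forecast,
# and a time-series forecast for the same five months.
actual     = np.array([100, 110, 105, 120, 130])
judgmental = np.array([ 95, 118, 100, 128, 124])
timeseries = np.array([102, 108, 109, 115, 133])

mse_j = np.mean((judgmental - actual) ** 2)
mse_t = np.mean((timeseries - actual) ** 2)
w = (1 / mse_j) / (1 / mse_j + 1 / mse_t)   # inverse-MSE weight on judgment

next_j, next_t = 135.0, 138.0               # next-month forecasts (assumed)
print("combined forecast:", w * next_j + (1 - w) * next_t)
```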

Relevance: 100.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 100.00%

Abstract:

The main target here is to determine the orbit of an artificial satellite using signals of the GPS constellation and least squares algorithms implemented through sequential Givens rotations as the estimation method, with the aim of improving the performance of the orbit estimation process while minimizing the computational cost of the procedure. Geopotential perturbations up to high degree and order and direct solar radiation pressure were taken into account. The position of the GPS antenna on the satellite body was also considered, which ultimately amounts to accounting for the influence of the satellite attitude motion on the orbit determination process. An application was carried out using real data from the Topex/Poseidon satellite, whose ephemerides are available on the Internet. The best accuracy obtained in position was better than 5 meters for short-period (2 hours) and better than 28 meters for long-period (24 hours) orbit determination. In both cases the aforementioned perturbations were taken into account, and the analysis used signal measurements without selective availability.
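The estimation core, sequential least squares via Givens rotations, can be sketched compactly: each new observation row is rotated into the upper-triangular factor of the information matrix, so the estimate can be refreshed at any time without re-forming normal equations. The toy problem below is illustrative and unrelated to real orbit data.

```python
import numpy as np

# Rotate a new observation (a, y) into the triangular system R x = d,
# zeroing a column by column with Givens rotations.
def givens_update(R, d, a, y):
    a, y = a.copy(), float(y)
    for i in range(len(a)):
        if a[i] != 0.0:
            r = np.hypot(R[i, i], a[i])
            c, s = R[i, i] / r, a[i] / r
            Ri, ai = R[i, i:].copy(), a[i:].copy()
            R[i, i:] =  c * Ri + s * ai
            a[i:]    = -s * Ri + c * ai
            d[i], y  =  c * d[i] + s * y, -s * d[i] + c * y
    return R, d

n = 3
R, d = np.eye(n) * 1e-3, np.zeros(n)            # weak prior information
rng = np.random.default_rng(2)
x_true = np.array([1.0, -2.0, 0.5])
for _ in range(20):                              # process observations one by one
    a = rng.normal(size=n)
    R, d = givens_update(R, d, a, a @ x_true + rng.normal(scale=0.01))
print(np.linalg.solve(R, d))                     # estimate close to x_true
```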

Relevance: 100.00%

Abstract:

Statement of problem. A clinically significant incisal pin opening may occur after processing complete dentures if a compression molding technique is used. To recover the proper vertical dimension of occlusion, a time-consuming occlusal adjustment is necessary, which often destroys the anatomy of the artificial teeth. A new injection molding process claims to produce dentures that require few, if any, occlusal adjustments in the laboratory after processing.
Purpose. This laboratory study compared incisal pin opening, dimensional accuracy, and laboratory working time for dentures fabricated with this new injection system and for dentures constructed by the conventional compression molding technique.
Material and methods. Two groups of 6 maxillary and 6 mandibular dentures were evaluated: group 1 (control), Lucitone 199, compression molded with a long cure cycle; and group 2, Lucitone 199, injection molded with a long cure cycle. Incisal pin opening was measured with a micrometer immediately after deflasking. A computerized coordinate measuring machine was used to measure the 3-dimensional variation of selected positions of the artificial teeth at 4 stages of denture fabrication. Analysis of variance (ANOVA) and t tests were performed to compare the groups.
Results. A significant difference was found in pin opening between the groups (t test). Horizontal dimensional changes evaluated with repeated-measures ANOVA revealed no significant differences between the groups; however, the analysis of vertical dimensional changes disclosed significant differences. There was no appreciable difference in laboratory working time for flasking and molding denture bases between the injection and compression molding techniques when polymethyl methacrylate resin was used.
Conclusion. The injection molding method produced a significantly smaller incisal pin opening than the standard compression molding technique. The injection molding technique, using polymethyl methacrylate, was the more accurate method for processing dentures. There were no appreciable differences in laboratory working time between the injection and compression molding techniques.
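For the between-group comparison of pin opening, a two-sample t test such as the one sketched below is the standard tool; the measurement values here are invented for illustration, since the study's raw data are not reproduced in the abstract.

```python
from scipy import stats

# Hypothetical incisal pin-opening measurements (mm) for the two techniques.
compression = [0.42, 0.55, 0.48, 0.60, 0.51, 0.46]
injection   = [0.12, 0.09, 0.15, 0.11, 0.14, 0.10]

t, p = stats.ttest_ind(compression, injection)
print(f"t = {t:.2f}, p = {p:.4f}")   # a small p indicates a significant difference
```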

Relevance: 100.00%

Abstract:

The geometric accuracy of a close-range photogrammetric system is assessed in this paper, with surface reconstruction using structured light as its main purpose. The system is based on an off-the-shelf digital camera and a pattern projector. The mathematical model for reconstruction is based on the parametric equation of the projected straight line combined with the collinearity equations. A sequential approach to system calibration was developed and is presented. Results obtained from real data are also presented and discussed. Experiments with real data using a prototype indicated an accuracy of 0.5 mm in height determination and 0.2 mm in the XY plane, for an application in which the object was 1630 mm from the camera.
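For reference, the collinearity equations mentioned above take the standard textbook form below, relating image coordinates (x, y) to object coordinates (X, Y, Z); the symbols are chosen here for illustration rather than taken from the paper.

```latex
% Collinearity equations: r_ij are elements of the rotation matrix,
% (X_0, Y_0, Z_0) the camera perspective centre, f the focal length.
% The paper intersects these with the parametric equation of the
% projected straight line to reconstruct (X, Y, Z).
\begin{aligned}
x &= -f \,\frac{r_{11}(X - X_0) + r_{12}(Y - Y_0) + r_{13}(Z - Z_0)}
               {r_{31}(X - X_0) + r_{32}(Y - Y_0) + r_{33}(Z - Z_0)},\\[4pt]
y &= -f \,\frac{r_{21}(X - X_0) + r_{22}(Y - Y_0) + r_{23}(Z - Z_0)}
               {r_{31}(X - X_0) + r_{32}(Y - Y_0) + r_{33}(Z - Z_0)}.
\end{aligned}
```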

Relevance: 100.00%

Abstract:

The aim of this work is to evaluate the influence of point measurements in images with subpixel accuracy and their contribution to the calibration of digital cameras. The effect of subpixel measurements on the 3D coordinates of check points in object space is also evaluated. For this purpose, an algorithm allowing subpixel accuracy, based on the Förstner operator, was implemented for the semi-automatic determination of points of interest. Experiments were carried out with a block of images acquired with the DuncanTech MS3100-CIR multispectral camera. The influence of subpixel measurements on the adjustment by the Least Squares Method (LSM) was evaluated by comparing the estimated standard deviations of the parameters in both situations: manual measurement (pixel accuracy) and subpixel estimation. Additionally, the influence of subpixel measurements on the 3D reconstruction was analyzed. Based on the results obtained, i.e., the quantified reduction of the standard deviations of the Interior Orientation Parameters (IOP) and of the relative error of the 3D reconstruction, measurements with subpixel accuracy were shown to be relevant for some photogrammetric tasks, mainly those in which metric quality is of great importance, such as camera calibration.
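The Förstner operator rates each image window by two measures derived from the structure tensor of the gradients. The sketch below computes them for a single window; window search, thresholding and the actual subpixel position fit, which the paper relies on, are omitted.

```python
import numpy as np

# Förstner interest measures for one window: from the structure tensor
# N = sum [gx, gy]^T [gx, gy], compute the corner strength w = det(N)/tr(N)
# and the roundness q = 4 det(N)/tr(N)^2 (q in [0, 1]).
def foerstner_measures(window):
    gy, gx = np.gradient(window.astype(float))
    N = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                  [np.sum(gx * gy), np.sum(gy * gy)]])
    det, tr = np.linalg.det(N), np.trace(N)
    w = det / tr if tr > 0 else 0.0
    q = 4 * det / tr**2 if tr > 0 else 0.0
    return w, q

corner = np.zeros((7, 7))
corner[3:, 3:] = 1.0                    # synthetic corner patch
print(foerstner_measures(corner))
```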

Relevance: 100.00%

Abstract:

Objectives: This in vitro study compared the dimensional accuracy of a stone index (I) and three impression techniques: tapered impression copings (T), squared impression copings (S) and modified squared impression copings (MS) for implant-supported prostheses. Methods: A master cast with four parallel implant abutment analogs and a passive framework were fabricated. Vinyl polysiloxane impression material was used for all impressions, with two metal stock trays (open and closed tray). Four groups (I, T, S and MS) were tested (n = 5). The metallic framework was seated on each of the casts, one abutment screw was tightened, and the gap between the implant analog and the framework was measured with a stereomicroscope. The groups' measurements (80 gap values) were analyzed using software (LeicaQWin - Leica Imaging Systems Ltd.) that received the images from a video camera coupled to a Leica stereomicroscope at 100× magnification. The results were statistically analyzed with the Kruskal-Wallis one-way ANOVA on ranks followed by Dunn's method (α = .05). Results: The mean abutment/framework interface gaps were: Master Cast = 32 μm (SD 2); Group I = 45 μm (SD 3); Group T = 78 μm (SD 25); Group S = 134 μm (SD 30); Group MS = 143 μm (SD 27). No significant difference was detected between the Index group and the Master Cast (P = .05). Conclusion: Within the limitations of this study, it can be suggested that a more accurate working cast is obtained using the tapered impression coping technique or a stone index.
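The Kruskal-Wallis test used here is available directly in scipy; the sketch below applies it to invented gap values consistent with the reported group means (the study's 80 raw measurements are not reproduced). Dunn's pairwise posthoc would follow, e.g. via the scikit-posthocs package.

```python
from scipy import stats

# Hypothetical abutment/framework gap values (micrometres) for the four groups.
I  = [45, 43, 47, 44, 46]
T  = [78, 60, 95, 70, 85]
S  = [134, 110, 160, 125, 140]
MS = [143, 120, 165, 138, 150]

h, p = stats.kruskal(I, T, S, MS)
print(f"H = {h:.2f}, p = {p:.4f}")
```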

Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

Given the importance of Guzera breeding programs for milk production in the tropics, the objective of this study was to compare alternative random regression models for the estimation of genetic parameters and the prediction of breeding values. Test-day milk yield records (TDR) were collected monthly, with a maximum of 10 measurements. The database included 20,524 first-lactation records from 2816 Guzera cows. TDR data were analyzed with random regression models (RRM) considering additive genetic, permanent environmental and residual effects as random, and the contemporary group (CG), calving age as a covariate (linear and quadratic effects) and the mean lactation curve as fixed effects. The additive genetic and permanent environmental effects were modeled in the RRM using the Wilmink function, the Ali and Schaeffer function, cubic B-spline functions and Legendre polynomials. Residual variances were grouped into heterogeneous classes, defined differently according to the model used. For further comparison, a multi-trait analysis of the test-day records using finite-dimensional models (FDM) and a single-trait model for 305-day milk yield (P305) were also carried out, using the restricted maximum likelihood method. According to the statistical criteria adopted, the best RRM was the one using the cubic B-spline function with five random regression coefficients for the additive genetic and permanent environmental effects. However, the models using the Ali and Schaeffer function, or Legendre polynomials of second and fifth order for the additive genetic and permanent environmental effects respectively, can also be adopted, as little variation was observed in the genetic parameter estimates compared with those from the models using the B-spline function. Therefore, owing to the lower complexity of the (co)variance estimation, the model using Legendre polynomials represents the best option for the genetic evaluation of Guzera lactation records. An increase of 3.6% in the accuracy of the estimated breeding values was verified when using RRM. The rankings of animals were very similar regardless of the RRM used to predict breeding values. For P305, the results indicated only small to medium differences in the animals' rankings based on breeding values predicted by the conventional model or by RRM. Therefore, the sum of the RRM-predicted breeding values along the lactation period (RRM305) can be used as a selection criterion for 305-day milk production.
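The covariate construction behind the Legendre-polynomial RRM is straightforward: days in milk are standardised to [-1, 1] and evaluated with Legendre polynomials up to the chosen order. The sketch below shows it for order 2 (as used here for the additive genetic effect), with unnormalised polynomials; animal breeding applications often use the normalised form instead, and the test-day grid is an illustrative assumption.

```python
import numpy as np
from numpy.polynomial import legendre

# Build the matrix of Legendre covariates Phi, where Phi[i, j] = P_j(t_i)
# and t_i is the i-th test day standardised to [-1, 1].
def legendre_covariates(dim, order, dim_min=5, dim_max=305):
    t = 2 * (np.asarray(dim, float) - dim_min) / (dim_max - dim_min) - 1
    return np.column_stack(
        [legendre.legval(t, np.eye(order + 1)[j]) for j in range(order + 1)]
    )

dim = [5, 35, 65, 95, 125, 155, 185, 215, 245, 275]   # 10 monthly test days
Phi = legendre_covariates(dim, order=2)
print(Phi.round(3))   # columns: P0, P1, P2 evaluated at each test day
```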