912 results for Error Correction Models


Relevance:

30.00%

Publisher:

Abstract:

In this paper, we compare three residuals to assess departures from the error assumptions as well as to detect outlying observations in log-Burr XII regression models with censored observations. These residuals can also be used for the log-logistic regression model, which is a special case of the log-Burr XII regression model. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed and the empirical distribution of each residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to the modified martingale-type residual in log-Burr XII regression models with censored data.
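The log-Burr XII machinery itself is not reproduced here, but the core diagnostic of comparing a residual's empirical distribution with the standard normal can be sketched with a Kolmogorov-Smirnov-type distance. This is an illustrative stand-in: the simulated samples, sample size and seed below are assumptions, not the paper's residuals.

```python
import math
import random

def standard_normal_cdf(x):
    """CDF of N(0, 1), via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ks_distance(sample):
    """Largest gap between the sample's empirical CDF and the N(0, 1) CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = standard_normal_cdf(x)
        d = max(d, abs((i + 1) / n - cdf), abs(i / n - cdf))
    return d

random.seed(42)
n = 1000
well_behaved = [random.gauss(0.0, 1.0) for _ in range(n)]   # residuals matching N(0, 1)
skewed = [random.expovariate(1.0) - 1.0 for _ in range(n)]  # residuals departing from it

print(ks_distance(well_behaved))  # small distance: no evidence of departure
print(ks_distance(skewed))        # much larger: clear departure from normality
```

In a simulation study like the paper's, this comparison would be repeated across parameter settings, sample sizes and censoring percentages for each candidate residual.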

Relevance:

30.00%

Publisher:

Abstract:

The zero-inflated negative binomial model is used to account for overdispersion detected in data that are initially analyzed under the zero-inflated Poisson model. A frequentist analysis, a jackknife estimator and a non-parametric bootstrap for parameter estimation of zero-inflated negative binomial regression models are considered. In addition, an EM-type algorithm is developed for performing maximum likelihood estimation. Then, the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and some ways to perform global influence analysis, are derived. In order to study departures from the error assumption, as well as the presence of outliers, residual analysis based on the standardized Pearson residuals is discussed. The relevance of the approach is illustrated with a real data set, where it is shown that the zero-inflated negative binomial regression model seems to fit the data better than its Poisson counterpart. (C) 2010 Elsevier B.V. All rights reserved.
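As a minimal sketch of the distribution underlying such models, the zero-inflated negative binomial pmf mixes a point mass at zero with an ordinary negative binomial. The parameterization and example values below are illustrative choices, not the paper's.

```python
import math

def nb_pmf(y, r, p):
    """Negative binomial pmf: y failures before r successes, success prob p.
    The gamma-function form allows a non-integer dispersion parameter r."""
    log_coef = math.lgamma(y + r) - math.lgamma(r) - math.lgamma(y + 1)
    return math.exp(log_coef + r * math.log(p) + y * math.log(1.0 - p))

def zinb_pmf(y, pi, r, p):
    """Zero-inflated NB: with probability pi the count is a structural zero,
    otherwise it is drawn from NB(r, p)."""
    base = nb_pmf(y, r, p)
    if y == 0:
        return pi + (1.0 - pi) * base
    return (1.0 - pi) * base

# The zero-inflated pmf still sums to one but puts extra mass at zero.
total = sum(zinb_pmf(y, pi=0.3, r=2.5, p=0.4) for y in range(200))
print(total)                                             # ~1.0
print(zinb_pmf(0, 0.3, 2.5, 0.4), nb_pmf(0, 2.5, 0.4))  # inflated vs plain zero mass
```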

Relevance:

30.00%

Publisher:

Abstract:

The truncation errors associated with finite difference solutions of the advection-dispersion equation with first-order reaction are formulated from a Taylor analysis. The error expressions are based on a general form of the corresponding difference equation, and a temporally and spatially weighted parametric approach is used for differentiating among the various finite difference schemes. The numerical truncation errors are defined using Peclet and Courant numbers and a new sink/source dimensionless number. It is shown that all of the finite difference schemes suffer from truncation errors. In particular, it is shown that the Crank-Nicolson approximation scheme does not have second-order accuracy for this case. The effects of these truncation errors on the solution of an advection-dispersion equation with a first-order reaction term are demonstrated by comparison with an analytical solution. The results show that these errors are not negligible and that correcting the finite difference scheme for them results in a more accurate solution. (C) 1999 Elsevier Science B.V. All rights reserved.
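The dimensionless numbers that organize such error expressions can be computed directly from the discretization. A minimal sketch follows; the variable names and grid values are illustrative, and the reaction number is shown in one plausible form, lam*dt, which may differ from the paper's definition.

```python
def grid_numbers(v, D, lam, dx, dt):
    """Dimensionless numbers for an advection-dispersion-reaction grid:
    Peclet  Pe = v*dx/D   (advection vs dispersion across one cell),
    Courant Cr = v*dt/dx  (cell fractions advected per time step),
    and a reaction number lam*dt (one plausible sink/source scaling)."""
    return v * dx / D, v * dt / dx, lam * dt

# Illustrative grid: v = 1 m/d, D = 0.5 m^2/d, lam = 0.01 1/d, dx = 0.1 m, dt = 0.05 d.
pe, cr, rn = grid_numbers(v=1.0, D=0.5, lam=0.01, dx=0.1, dt=0.05)
refine_ok = pe <= 2.0 and cr <= 1.0  # common rules of thumb for limiting numerical error
print(pe, cr, refine_ok)
```

Note that the paper's point is stronger than these rules of thumb: even schemes that satisfy them, including Crank-Nicolson, carry truncation errors that depend on these numbers.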

Relevance:

30.00%

Publisher:

Abstract:

Most previous investigations on tide-induced watertable fluctuations in coastal aquifers have been based on one-dimensional models that describe the processes in the cross-shore direction alone, assuming negligible along-shore variability. A recent study proposed a two-dimensional approximation for tide-induced watertable fluctuations that took into account coastline variations. Here, we further develop this approximation in two ways, by extending the approximation to second order and by taking into account capillary effects. Our results demonstrate that both effects can markedly influence watertable fluctuations. In particular, with the first-order approximation, the local damping rate of the tidal signal could be subject to sizable errors.

Relevance:

30.00%

Publisher:

Abstract:

Background: Most cases of congenital clubfoot treated with the Ponseti technique require percutaneous Achilles tenotomy to correct the residual equinus. Clinical evidence suggests that complete healing occurs between the cut tendon stumps, but there have not yet been any detailed studies investigating this reparative process. This study was performed to assess Achilles tendon repair after percutaneous section to correct the residual equinus of clubfoot treated with the Ponseti method. Method: A prospective study analyzed 37 tenotomies in 26 patients with congenital clubfoot treated with the Ponseti technique, with a minimum follow-up of 1 year after the section. The tenotomy was performed percutaneously with the bevel of a large-bore needle, under patient sedation and local anesthesia. Ultrasonographic scanning was performed after section to ascertain that the tenotomy had been completed and to measure the stump separation. In the follow-up period, the reparative process was followed ultrasonographically and assessed at 3 weeks, 6 months and 1 year post-tenotomy. Results: The ultrasonography performed immediately after the procedure showed that in some cases residual strands between the tendon ends persisted, and these were completely sectioned under ultrasound control. A mean retraction of 5.65 ± 2.26 mm (range, 2.3 to 11.0 mm) between tendon stumps after section was observed. Unusual bleeding occurred in one case and was controlled by digital pressure, with no interference with the final treatment. After 3 weeks, ultrasonography showed tendon repair with the tendon gap filled with irregular hypoechoic tissue, and also with transmission of muscle motion to the heel. Six months after tenotomy, there was structural filling with a fibrillar aspect, mild or moderate hypoechogenicity, and tendon scar thickening when compared with a normal tendon. One year after tenotomy, ultrasound showed a fibrillar structure and echogenicity at the repair site that were similar to those of a normal tendon, but with persistent scar thickening. Conclusions: There is a fast reparative process after percutaneous section of the Achilles tendon that reestablishes continuity between the stumps. The reparative tissue evolved into tendon tissue with a normal ultrasonographic appearance except for mild thickening, suggesting a predominantly intrinsic repair mechanism.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we consider testing for additivity in a class of nonparametric stochastic regression models. Two test statistics are constructed and their asymptotic distributions are established. We also conduct a small sample study for one of the test statistics through a simulated example. (C) 2002 Elsevier Science (USA).

Relevance:

30.00%

Publisher:

Abstract:

The thin-layer drying behaviour of bananas in a heat pump dehumidifier dryer was examined. Four pre-treatments (blanching, chilling, freezing, and combined blanching and freezing) were applied to the bananas, which were dried at 50 °C with an air velocity of 3.1 m s⁻¹ and with the relative humidity of the inlet air at 10-35%. Three drying models were examined: the simple model, the two-term exponential model and the Page model. All models were evaluated using three statistical measures: the correlation coefficient, root mean square error, and mean absolute percent error. Moisture diffusivity was calculated based on the diffusion equation for an infinite cylindrical shape using the slope method. The rate of drying was higher for the pre-treatments involving freezing. The sample that was blanched only did not show any improvement in drying rate; in fact, a longer drying time resulted, owing to water absorption during blanching. There was no change in the rate for the chilled sample compared with the control. While all models closely fitted the drying data, the simple model showed the greatest deviation from the experimental results. The two-term exponential model was found to be the best model for describing the drying curves of bananas because its parameters better represent the physical characteristics of the drying process. Moisture diffusivities of bananas were in the range 4.3-13.2 × 10⁻¹⁰ m² s⁻¹. (C) 2002 Published by Elsevier Science Ltd.
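The Page model, MR = exp(-k t^n), can be fitted by log-linearization: ln(-ln MR) = ln k + n ln t is linear in ln t, so ordinary least squares recovers the parameters. A sketch on synthetic data follows; the parameter values and time grid are hypothetical, not the study's measurements.

```python
import math

def fit_page(times, mr):
    """Fit the Page model MR = exp(-k * t**n) by ordinary least squares on
    the log-linearized form ln(-ln MR) = ln k + n * ln t."""
    xs = [math.log(t) for t in times]
    ys = [math.log(-math.log(m)) for m in mr]
    m_obs = len(xs)
    mx, my = sum(xs) / m_obs, sum(ys) / m_obs
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return math.exp(intercept), slope  # k, n

# Synthetic drying curve with hypothetical parameters k = 0.05, n = 1.3.
k_true, n_true = 0.05, 1.3
times = [5.0, 10.0, 20.0, 40.0, 80.0, 120.0]  # minutes (illustrative)
mr = [math.exp(-k_true * t ** n_true) for t in times]

k_hat, n_hat = fit_page(times, mr)
print(k_hat, n_hat)  # recovers ~0.05 and ~1.3 from the noise-free data
```

With real, noisy moisture-ratio data the fit would in addition be scored with the statistics named above (correlation coefficient, RMSE, mean absolute percent error).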

Relevance:

30.00%

Publisher:

Abstract:

For dynamic simulations to be credible, verification of the computer code must be an integral part of the modelling process. This two-part paper describes a novel approach to verification through program testing and debugging. In Part 1, a methodology is presented for detecting and isolating coding errors using back-to-back testing. Residuals are generated by comparing the outputs of two independent implementations in response to identical inputs. The key feature of the methodology is that a specially modified observer is created using one of the implementations, so as to impose an error-dependent structure on these residuals. Each error can be associated with a fixed and known subspace, permitting errors to be isolated to specific equations in the code. It is shown that the geometric properties extend to multiple errors in either one of the two implementations. Copyright (C) 2003 John Wiley & Sons, Ltd.
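The observer construction is specific to the paper, but the underlying back-to-back idea, running two independent implementations on identical inputs and treating the output difference as a residual, can be sketched with toy code. The functions and the seeded fault below are invented for illustration.

```python
def mean_v1(xs):
    """Reference implementation of the arithmetic mean."""
    return sum(xs) / len(xs)

def mean_v2(xs):
    """Independent implementation with a deliberately seeded coding error:
    the divisor is off by one (a classic fencepost bug)."""
    total = 0.0
    for x in xs:
        total += x
    return total / (len(xs) - 1)  # BUG: should divide by len(xs)

def residuals(impl_a, impl_b, inputs):
    """Back-to-back residuals: identically zero iff the implementations agree."""
    return [impl_a(xs) - impl_b(xs) for xs in inputs]

inputs = [[1.0, 2.0, 3.0], [10.0, 10.0], [4.0, 0.0, 0.0, 0.0]]
res = residuals(mean_v1, mean_v2, inputs)
print(any(abs(r) > 1e-12 for r in res))  # True: the seeded error is detected
```

The paper goes further than this detection step: its observer shapes the residuals so that each error maps to a known subspace, which is what permits isolation rather than mere detection.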

Relevance:

30.00%

Publisher:

Abstract:

In Part 1 of this paper, a methodology for back-to-back testing of simulation software was described. Residuals with error-dependent geometric properties were generated. A set of potential coding errors was enumerated, along with a corresponding set of feature matrices, which describe the geometric properties imposed on the residuals by each of the errors. In this part of the paper, an algorithm is developed to isolate the coding errors present by analysing the residuals. A set of errors is isolated when the subspace spanned by their combined feature matrices corresponds to that of the residuals. Individual feature matrices are compared to the residuals and classified as 'definite', 'possible' or 'impossible'. The status of 'possible' errors is resolved using a dynamic subset testing algorithm. To demonstrate and validate the testing methodology presented in Part 1 and the isolation algorithm presented in Part 2, a case study is presented using a model for biological wastewater treatment. Both single and simultaneous errors that are deliberately introduced into the simulation code are correctly detected and isolated. Copyright (C) 2003 John Wiley & Sons, Ltd.

Relevance:

30.00%

Publisher:

Abstract:

Pectus excavatum is the most common deformity of the thorax. A minimally invasive surgical correction is commonly carried out to remodel the anterior chest wall by using an intrathoracic convex prosthesis in the substernal position. The process of prosthesis modeling and bending still remains an area for improvement. The authors developed a new system, i3DExcavatum, which can automatically model and bend the bar preoperatively based on a thoracic CT scan. This article presents a comparison between automatic and manual bending. The i3DExcavatum was used to personalize prostheses for 41 patients who underwent pectus excavatum surgical correction between 2007 and 2012. Regarding anatomical variation, the soft-tissue thicknesses external to the ribs show that both symmetric and asymmetric patients always have asymmetric variations when their two sides are compared. This highlights that the prosthesis bar should be modeled according to each patient's rib positions and dimensions. The average differences between the skin and costal line curvature lengths were 84 ± 4 mm and 96 ± 11 mm for male and female patients, respectively. On the other hand, the i3DExcavatum ensured a smooth curvature of the surgical prosthesis and was capable of predicting and simulating a virtual shape and size of the bar for asymmetric and symmetric patients. In conclusion, the i3DExcavatum allows preoperative personalization according to the thoracic morphology of each patient. It reduces surgery time and minimizes the error margin introduced by a manually bent bar, which uses only a template that copies the chest wall curvature.


Relevance:

30.00%

Publisher:

Abstract:

A growing number of corporate failure prediction models has emerged since the 1960s. The economic and social consequences of business failure can be dramatic, so it is no surprise that the issue has attracted growing interest in academic research as well as in business practice. The main purpose of this study is to compare the predictive ability of five models: three based on statistical techniques (Discriminant Analysis, Logit and Probit) and two based on Artificial Intelligence (Neural Networks and Rough Sets). The five models were applied to a dataset of 420 non-bankrupt firms and 125 bankrupt firms belonging to the textile and clothing industry, over the period 2003–09. Results show that all the models performed well, with an overall correct classification level higher than 90% and a type II error always less than 2%. The type I error increases as we move away from the year prior to failure. Our models contribute to the discussion of the causes of corporate financial distress. Moreover, they can be used to assist the decisions of creditors, investors and auditors. Additionally, this research can be of great value to devisers of national economic policies that aim to reduce industrial unemployment.
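In the bankruptcy-prediction convention used here, a type I error classifies a failing firm as healthy and a type II error classifies a healthy firm as failing. Computing both rates is elementary; the labels and toy predictions below are illustrative, not the study's data.

```python
def error_rates(actual, predicted):
    """actual/predicted: 1 = bankrupt, 0 = non-bankrupt.
    Type I error rate: bankrupt firms classified as healthy (missed failures).
    Type II error rate: healthy firms classified as bankrupt (false alarms)."""
    pairs = list(zip(actual, predicted))
    bankrupt = [(a, p) for a, p in pairs if a == 1]
    healthy = [(a, p) for a, p in pairs if a == 0]
    type1 = sum(1 for _, p in bankrupt if p == 0) / len(bankrupt)
    type2 = sum(1 for _, p in healthy if p == 1) / len(healthy)
    return type1, type2

actual    = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]  # toy labels
predicted = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]  # one miss, one false alarm
t1, t2 = error_rates(actual, predicted)
print(t1, t2)  # 0.25 and ~0.167
```

Type I errors are usually the costlier kind for creditors, which is why the study tracks how they grow as the prediction horizon moves further from the failure year.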


Relevance:

30.00%

Publisher:

Abstract:

A new data set of daily gridded observations of precipitation, computed from over 400 stations in Portugal, is used to assess the performance of 12 regional climate models at 25 km resolution, from the ENSEMBLES set, all forced by ERA-40 boundary conditions, for the 1961-2000 period. Standard point error statistics, calculated from grid point and basin aggregated data, and precipitation-related climate indices are used to analyze the performance of the different models in representing the main spatial and temporal features of the regional climate, and its extreme events. As a whole, the ENSEMBLES models are found to achieve a good representation of those features, with good spatial correlations with observations. There is a small but relevant negative bias in precipitation, especially in the driest months, leading to systematic errors in related climate indices. The underprediction of precipitation occurs in most percentiles, although this deficiency is partially corrected at the basin level. Interestingly, some of the conclusions concerning the performance of the models differ from what has been found for the contiguous territory of Spain; in particular, the ENSEMBLES models appear too dry over Portugal and too wet over Spain. Finally, the models behave quite differently in the simulation of some important aspects of local climate, from the mean climatology to high-precipitation regimes in localized mountain ranges and in the subsequent drier regions.
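Point error statistics of this kind reduce to elementary formulas, e.g. the bias (negative when a model is too dry) and the spatial correlation between modeled and observed fields. The values below are hypothetical monthly totals, not ENSEMBLES or station data.

```python
import math

def bias(model, obs):
    """Mean error; negative values mean the model is too dry."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def pearson_r(model, obs):
    """Correlation between the modeled and observed fields."""
    n = len(obs)
    mm, mo = sum(model) / n, sum(obs) / n
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    sm = math.sqrt(sum((m - mm) ** 2 for m in model))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    return cov / (sm * so)

obs   = [120.0, 80.0, 60.0, 150.0, 95.0]  # hypothetical monthly totals, mm
model = [110.0, 75.0, 55.0, 140.0, 90.0]  # slightly too dry everywhere

print(bias(model, obs))       # -7.0: a systematic dry bias
print(pearson_r(model, obs))  # close to 1: the spatial pattern is well captured
```

The combination shown, a high spatial correlation alongside a negative bias, mirrors the paper's finding: a good representation of the pattern with a small but systematic dry offset.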