1000 results for Residuals analysis


Relevance: 100.00%

Abstract:

In Part 1 of this paper a methodology for back-to-back testing of simulation software was described. Residuals with error-dependent geometric properties were generated. A set of potential coding errors was enumerated, along with a corresponding set of feature matrices, which describe the geometric properties imposed on the residuals by each of the errors. In this part of the paper, an algorithm is developed to isolate the coding errors present by analysing the residuals. A set of errors is isolated when the subspace spanned by their combined feature matrices corresponds to that of the residuals. Individual feature matrices are compared to the residuals and classified as 'definite', 'possible' or 'impossible'. The status of 'possible' errors is resolved using a dynamic subset testing algorithm. To demonstrate and validate the testing methodology presented in Part 1 and the isolation algorithm presented in Part 2, a case study is presented using a model for biological wastewater treatment. Both single and simultaneous errors that are deliberately introduced into the simulation code are correctly detected and isolated. Copyright (C) 2003 John Wiley & Sons, Ltd.
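To make the subspace comparison concrete, below is a minimal numerical sketch (Python with NumPy) of the kind of three-way labelling the abstract describes; the classification rule, tolerances and toy feature matrices are illustrative assumptions, not the paper's exact algorithm.

import numpy as np

def part_outside(A, B):
    # Frobenius norm of the component of A lying outside the column space of B
    coeffs, *_ = np.linalg.lstsq(B, A, rcond=None)
    return np.linalg.norm(A - B @ coeffs)

def projection_norm(A, B):
    # Frobenius norm of the projection of A onto the column space of B
    coeffs, *_ = np.linalg.lstsq(B, A, rcond=None)
    return np.linalg.norm(B @ coeffs)

def classify(R, F, tol=1e-8):
    # Illustrative labelling of one candidate error's feature matrix F against the residuals R
    if projection_norm(R, F) < tol:
        return "impossible"   # residuals share nothing with this error's geometry
    if part_outside(R, F) < tol:
        return "definite"     # this feature matrix alone explains the residuals
    return "possible"         # partially consistent; the paper resolves these by subset testing

def isolated(R, mats, tol=1e-8):
    # A set of errors is isolated when their combined feature matrices span the residuals
    return part_outside(R, np.hstack(mats)) < tol

# Toy example with disjoint candidate error signatures
eye = np.eye(6)
F1, F2, F3 = eye[:, :2], eye[:, 2:4], eye[:, 4:6]
R = np.column_stack([eye[:, 0] + 2 * eye[:, 1],    # structure contributed by error 1
                     3 * eye[:, 2] - eye[:, 3]])   # structure contributed by error 2

for name, F in [("error_1", F1), ("error_2", F2), ("error_3", F3)]:
    print(name, classify(R, F))                    # possible, possible, impossible
print("errors 1+2 isolated:", isolated(R, [F1, F2]))   # True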

Relevance: 60.00%

Abstract:

The increase in ultraviolet (UV) radiation at the surface, the high incidence of non-melanoma skin cancer (NMSC) on the coast of Northeast Brazil (NEB) and the reduction in total ozone were the motivation for the present study. The overall objective was to identify and understand the variability of UV radiation, expressed as the UV Index, in the capitals of the east coast of the NEB and to fit stochastic models to UV index time series in order to make predictions (interpolations) and forecasts/projections (extrapolations), followed by trend analysis. The methodology consisted of applying multivariate analysis (principal component analysis and cluster analysis), the Predictive Mean Matching method for filling gaps in the data, autoregressive distributed lag (ADL) models and the Mann-Kendall test. The ADL modelling consisted of parameter estimation, diagnostics, residual analysis and evaluation of the quality of the predictions and forecasts via mean squared error and the Pearson correlation coefficient. The results indicated that the annual variability of UV in the capital of Rio Grande do Norte (Natal) has a feature in September and October consisting of a stabilization/reduction of the UV index because of the annual maximum in total ozone concentration; the increased amount of aerosol during this period contributes to this event with lesser intensity. The application of cluster analysis to the east coast of the NEB showed that this event also occurs in the capitals of Paraíba (João Pessoa) and Pernambuco (Recife). Extreme UV events in the NEB were analysed for the city of Natal and were associated with the absence of cloud cover and levels of total ozone below the annual average; they did not occur across the entire region because of the uneven spatial distribution of these variables. The ADL(4, 1) model, fitted with UV index and total ozone data for the period 2001-2012, produced a projection/extrapolation for the next 30 years (2013-2043) indicating, by the end of that period, an increase of approximately one unit in the UV index, provided total ozone maintains the downward trend observed in the study period.
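As a rough illustration of the ADL modelling step, the sketch below fits an ADL(4, 1) regression by ordinary least squares to synthetic monthly series standing in for the UV index and total ozone; the series, lag parameterization and error metrics are assumptions for illustration only, not the thesis data or estimator.

import numpy as np

def fit_adl(y, x, p=4, q=1):
    # ADL(p, q) by OLS: y_t = c + sum_i a_i*y_{t-i} + sum_{j=0..q} b_j*x_{t-j} + e_t
    # (one common parameterization; here y = monthly UV index, x = total ozone)
    n, k = len(y), max(p, q)
    rows = []
    for t in range(k, n):
        rows.append([1.0]
                    + [y[t - i] for i in range(1, p + 1)]
                    + [x[t - j] for j in range(0, q + 1)])
    X = np.asarray(rows)
    coef, *_ = np.linalg.lstsq(X, y[k:], rcond=None)
    resid = y[k:] - X @ coef
    return coef, resid

# Synthetic monthly series (not the thesis data)
rng = np.random.default_rng(1)
months = np.arange(240)
ozone = 270 + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 3, months.size)
uv = 12 - 0.02 * (ozone - 270) + rng.normal(0, 0.5, months.size)

coef, resid = fit_adl(uv, ozone, p=4, q=1)
print("coefficients:", np.round(coef, 3))
print("RMSE:", round(float(np.sqrt(np.mean(resid ** 2))), 3),
      "r:", round(float(np.corrcoef(uv[4:], uv[4:] - resid)[0, 1]), 3))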

Relevance: 60.00%

Abstract:

We present residual analysis techniques to assess the fit of correlated survival data by Accelerated Failure Time Models (AFTM) with random effects. We propose an imputation procedure for censored observations and consider three types of residuals to evaluate different model characteristics. We illustrate the proposal with an analysis, using an AFTM with random effects, of a real data set involving times between failures of oil well equipment.
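A minimal sketch of one possible imputation scheme for right-censored observations is shown below; it assumes normal errors and known parameters purely for illustration and does not reproduce the paper's AFT model with random effects or its actual procedure.

import numpy as np
from scipy import stats

def impute_censored(y, delta, mu, sigma, rng):
    # Replace each censored log-time by a draw from the assumed (here normal)
    # error distribution truncated below at the censoring residual.
    e = (y - mu) / sigma
    u = rng.uniform(size=len(y))
    e_imp = stats.norm.ppf(stats.norm.cdf(e) + u * (1 - stats.norm.cdf(e)))  # inverse-CDF sampling
    return np.where(delta == 1, y, mu + sigma * e_imp)

rng = np.random.default_rng(9)
mu = np.full(8, 2.0)                          # fitted linear predictors (toy values)
y = np.array([1.8, 2.5, 2.1, 1.2, 2.9, 2.0, 1.5, 2.4])
delta = np.array([1, 0, 1, 0, 1, 1, 0, 1])    # 0 = censored observation
print(np.round(impute_censored(y, delta, mu, 0.5, rng), 3))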

Relevance: 60.00%

Abstract:

Survival data analysis has traditionally been based on the Cox regression model (COX, 1972). However, the proportional hazards assumption made for this model may not hold in many practical situations. This restriction of the Cox model has generated interest in alternative approaches, among them dynamic models, which allow covariate effects to vary over time. In this work, the main dynamic survival models with additive and multiplicative structures were reviewed in the non-parametric and semi-parametric contexts. Graphical methods based on residuals were presented in order to assess the goodness of fit of these models. A time-dependent version of the area under the ROC curve, denoted AUC(t), was proposed in order to evaluate and compare the predictive quality of survival models with additive and multiplicative structures. The performance of AUC(t) was evaluated through a simulation study. Data from three studies described in the literature were also analysed to illustrate or complement the scenarios considered in the simulation study. Overall, the results indicated that the graphical methods presented for assessing model adequacy, together with AUC(t), constitute a useful set of statistical tools for evaluating dynamic survival models in the non-parametric and semi-parametric contexts. Moreover, the application of this set of tools to several data sets showed that, while dynamic models are attractive for allowing time-dependent covariates, they may not be appropriate for every data set, since estimation can be problematic for some of them.
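The cumulative/dynamic AUC(t) idea can be sketched as follows; the estimator below handles censoring only crudely and runs on synthetic data, so it illustrates the concept rather than the estimator proposed in the thesis.

import numpy as np

def cumulative_dynamic_auc(event_time, event_observed, risk_score, t):
    # Naive AUC(t): probability that a subject with an observed event by time t
    # has a higher risk score than a subject still at risk at t.
    cases = (event_time <= t) & (event_observed == 1)
    controls = event_time > t
    if cases.sum() == 0 or controls.sum() == 0:
        return np.nan
    case_scores = risk_score[cases][:, None]
    control_scores = risk_score[controls][None, :]
    wins = (case_scores > control_scores).sum() + 0.5 * (case_scores == control_scores).sum()
    return wins / (cases.sum() * controls.sum())

# Toy data: higher risk scores fail earlier
rng = np.random.default_rng(2)
score = rng.normal(size=200)
time = rng.exponential(scale=np.exp(-score))
censor = rng.exponential(scale=2.0, size=200)
obs_time = np.minimum(time, censor)
observed = (time <= censor).astype(int)

for t in (0.5, 1.0, 2.0):
    print(f"AUC({t}) =", round(cumulative_dynamic_auc(obs_time, observed, score, t), 3))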

Relevance: 40.00%

Abstract:

Urban rainfall-runoff residuals contain metals such as Cr, Zn, Cu, As, Pb and Cd and are thus reasonable candidates for treatment using Portland cement-based solidification-stabilization (S/S). This research is a study of S/S of urban storm water runoff solid residuals in Portland cement with quicklime and sodium bentonite additives. The solidified residuals were analyzed after 28 days of hydration time using X-ray powder diffraction (XRD) and solid-state Si-29 nuclear magnetic resonance (NMR) spectroscopy. The XRD results indicate that the main cement hydration products are ettringite, calcium hydroxide and hydrated calcium silicates. Zinc hydroxide and lead and zinc silicates are also present due to the reactions of the waste compounds with the cement and its hydration products. Si-29 NMR analysis shows that the coarse fraction of the waste apparently does not interfere with cement hydration, but the fine fraction retards silica polymerization.

Relevance: 40.00%

Abstract:

In this paper, we compare three residuals to assess departures from the error assumptions as well as to detect outlying observations in log-Burr XII regression models with censored observations. These residuals can also be used for the log-logistic regression model, which is a special case of the log-Burr XII regression model. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed and the empirical distribution of each residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to the modified martingale-type residual in log-Burr XII regression models with censored data.
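For the log-logistic special case mentioned above, a small simulation sketch of martingale and deviance (martingale-type) residuals looks like this; the parameter values are arbitrary, the true coefficients are used in place of estimates, and the deviance transform is the standard one, which may differ from the authors' modified residual.

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, beta0, beta1, sigma = 500, 1.0, 0.5, 0.8
x = rng.normal(size=n)

# Log-logistic AFT model: log T = beta0 + beta1*x + sigma*z, z standard logistic
z = rng.logistic(size=n)
log_t = beta0 + beta1 * x + sigma * z
censor = np.log(rng.exponential(scale=np.exp(beta0 + beta1 * x) * 3))
y = np.minimum(log_t, censor)
delta = (log_t <= censor).astype(float)      # 1 = failure observed, 0 = censored

# Standardized residual and its survival under the standard logistic error
resid = (y - (beta0 + beta1 * x)) / sigma
surv = 1.0 / (1.0 + np.exp(resid))

# Martingale and deviance (martingale-type) residuals
r_mart = delta + np.log(surv)
inner = -2.0 * (r_mart + np.where(delta > 0, delta * np.log(delta - r_mart), 0.0))
r_dev = np.sign(r_mart) * np.sqrt(inner)

print("KS test of deviance residuals vs N(0,1):", stats.kstest(r_dev, "norm"))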

Relevance: 40.00%

Abstract:

Omnibus tests of significance in contingency tables use statistics of the chi-square type. When the null is rejected, residual analyses are conducted to identify cells in which observed frequencies differ significantly from expected frequencies. Residual analyses are thus conditioned on a significant omnibus test. Conditional approaches have been shown to substantially alter type I error rates in cases involving t tests conditional on the results of a test of equality of variances, or tests of regression coefficients conditional on the results of tests of heteroscedasticity. We show that residual analyses conditional on a significant omnibus test are also affected by this problem, yielding type I error rates that can be up to 6 times larger than nominal rates, depending on the size of the table and the form of the marginal distributions. We explored several unconditional approaches in search of a method that maintains the nominal type I error rate and found that a bootstrap correction for multiple testing achieves this goal. The validity of this approach is documented for two-way contingency tables in the contexts of tests of independence, tests of homogeneity, and fitting psychometric functions. Computer code in MATLAB and R to conduct these analyses is provided as Supplementary Material.
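A rough sketch of the unconditional bootstrap idea (not the authors' exact procedure, which is provided in MATLAB and R) is to calibrate a familywise critical value for the adjusted standardized residuals by resampling tables under independence.

import numpy as np

def adjusted_residuals(table):
    # Adjusted standardized residuals for a two-way table under independence
    table = np.asarray(table, dtype=float)
    n = table.sum()
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row @ col / n
    return (table - expected) / np.sqrt(expected * (1 - row / n) * (1 - col / n))

def bootstrap_critical_value(table, n_boot=2000, alpha=0.05, seed=0):
    # Familywise critical value for |adjusted residual| via resampling under independence
    rng = np.random.default_rng(seed)
    table = np.asarray(table, dtype=float)
    n = int(table.sum())
    p = (table.sum(axis=1, keepdims=True) @ table.sum(axis=0, keepdims=True)) / n ** 2
    maxima = np.empty(n_boot)
    for b in range(n_boot):
        counts = rng.multinomial(n, p.ravel()).reshape(table.shape)
        maxima[b] = np.nanmax(np.abs(adjusted_residuals(counts)))
    return np.quantile(maxima, 1 - alpha)

observed = [[25, 15, 10], [10, 20, 20]]        # toy table
adj = adjusted_residuals(observed)
crit = bootstrap_critical_value(observed)
print("adjusted residuals:\n", np.round(adj, 2))
print("bootstrap familywise critical value:", round(crit, 2), "(vs. naive 1.96)")
print("significant cells:", np.argwhere(np.abs(adj) > crit))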

Relevance: 30.00%

Abstract:

Heat transfer and entropy generation analysis of the thermally developing forced convection in a porous-saturated duct of rectangular cross-section, with walls maintained at a constant and uniform heat flux, is investigated based on the Brinkman flow model. The classical Galerkin method is used to obtain the fully developed velocity distribution. To solve the thermal energy equation, with the effects of viscous dissipation being included, the Extended Weighted Residuals Method (EWRM) is applied. The local (three dimensional) temperature field is solved by utilizing the Green’s function solution based on the EWRM where symbolic algebra is being used for convenience in presentation. Following the computation of the temperature field, expressions are presented for the local Nusselt number and the bulk temperature as a function of the dimensionless longitudinal coordinate, the aspect ratio, the Darcy number, the viscosity ratio, and the Brinkman number. With the velocity and temperature field being determined, the Second Law (of Thermodynamics) aspect of the problem is also investigated. Approximate closed form solutions are also presented for two limiting cases of MDa values. It is observed that decreasing the aspect ratio and MDa values increases the entropy generation rate.
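As background for the weighted-residuals machinery named in the abstract, the classical method (of which the paper's EWRM is an extension; the statement below is the generic one, not the paper's specific formulation) approximates the temperature field by a finite expansion and forces the residual of the energy equation to be orthogonal to a set of weight functions:

T \approx \tilde{T} = \sum_{j=1}^{N} c_j \, \varphi_j(x, y), \qquad \int_{\Omega} w_i \left[ \mathcal{L}\tilde{T} - f \right] \, d\Omega = 0, \quad i = 1, \dots, N,

where \mathcal{L} is the governing differential operator, f the source term, and the Galerkin choice takes w_i = \varphi_i. Solving the resulting N equations for the coefficients c_j yields the approximate field used to evaluate quantities such as the Nusselt number.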

Relevance: 30.00%

Abstract:

This paper examines the hysteresis hypothesis in the Brazilian industrialized exports using a time series analysis. This hypothesis finds an empirical representation into the nonlinear adjustments of the exported quantity to relative price changes. Thus, the threshold cointegration analysis proposed by Balke and Fomby [Balke, N.S. and Fomby, T.B. Threshold Cointegration. International Economic Review, 1997; 38; 627-645.] was used for estimating models with asymmetric adjustment of the error correction term. Amongst sixteen industrial sectors selected, there was evidence of nonlinearities in the residuals of long-run relationships of supply or demand for exports in nine of them. These nonlinearities represent asymmetric and/or discontinuous responses of exports to different representative measures of real exchange rates, in addition to other components of long-run demand or supply equations. (C) 2007 Elsevier B.V. All rights reserved.
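In the spirit of Balke and Fomby, asymmetric adjustment can be sketched by fitting a two-regime autoregression to the long-run residuals; the threshold is fixed at zero here for simplicity (the method estimates it), and the data are synthetic rather than the sectoral export series.

import numpy as np

def threshold_adjustment(u, tau=0.0):
    # Two-regime (TAR) adjustment of equilibrium residuals u_t:
    # du_t = rho_pos*u_{t-1}*1{u_{t-1} >= tau} + rho_neg*u_{t-1}*1{u_{t-1} < tau} + e_t
    du = np.diff(u)
    lag = u[:-1]
    above = (lag >= tau).astype(float)
    X = np.column_stack([lag * above, lag * (1 - above)])
    coef, *_ = np.linalg.lstsq(X, du, rcond=None)
    return coef                                   # (rho_pos, rho_neg); asymmetry if they differ

# Synthetic example: exports and relative prices cointegrated with asymmetric adjustment
rng = np.random.default_rng(4)
n = 400
price = np.cumsum(rng.normal(size=n))             # I(1) relative-price proxy
u = np.zeros(n)
for t in range(1, n):                             # residual reverts faster when positive
    rho = -0.4 if u[t - 1] >= 0 else -0.1
    u[t] = (1 + rho) * u[t - 1] + rng.normal()
exports = 2.0 + 0.8 * price + u                   # long-run relation plus residual

u_hat = exports - np.polyval(np.polyfit(price, exports, 1), price)   # OLS long-run residuals
print("rho_pos, rho_neg:", np.round(threshold_adjustment(u_hat), 3))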

Relevance: 30.00%

Abstract:

This paper presents a spatial econometrics analysis of the number of road accidents with victims in the smallest administrative divisions of Lisbon, considering as a baseline a log-Poisson model for environmental factors. Spatial correlation is investigated both in the data alone and in the residuals of the baseline model, with and without spatially autocorrelated and spatially lagged terms. In all cases, no spatial autocorrelation was detected.
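The abstract does not name the spatial-correlation statistic used; a common choice is Moran's I, sketched below on hypothetical parish-level Pearson residuals from a log-Poisson model (the coordinates, weights and residuals are all fabricated).

import numpy as np

def morans_i(values, weights):
    # Moran's I spatial autocorrelation statistic
    x = values - values.mean()
    w = np.asarray(weights, dtype=float)
    num = (w * np.outer(x, x)).sum()
    den = (x ** 2).sum()
    return (len(x) / w.sum()) * (num / den)

rng = np.random.default_rng(5)
n = 25
coords = rng.uniform(size=(n, 2))                       # hypothetical parish centroids
dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
W = ((dist < 0.3) & (dist > 0)).astype(float)           # simple contiguity-style weights

expected = rng.gamma(5.0, 2.0, size=n)                  # fitted means from the baseline model
observed = rng.poisson(expected)
pearson_resid = (observed - expected) / np.sqrt(expected)

print("Moran's I of residuals:", round(morans_i(pearson_resid, W), 3))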

Relevance: 30.00%

Abstract:

This paper deals with the problem of spatial data mapping. A new method based on wavelet interpolation and geostatistical prediction (kriging) is proposed. The method - wavelet analysis residual kriging (WARK) - is developed in order to address the problems arising with highly variable data in the presence of spatial trends. In such cases stationary prediction models have very limited application. Wavelet analysis is used to model large-scale structures, and kriging of the remaining residuals focuses on small-scale peculiarities. WARK is able to model spatial patterns that feature multiscale structure. In the present work WARK is applied to rainfall data and the results of validation are compared with those obtained from neural network residual kriging (NNRK). NNRK is also a residual-based method, which uses an artificial neural network to model large-scale non-linear trends. The comparison of the results demonstrates the high-quality performance of WARK in predicting hot spots, reproducing the global statistical characteristics of the distribution and the spatial correlation structure.
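A one-dimensional toy analogue of the WARK idea, with assumed covariance parameters and the pywt package used for the wavelet trend (not the paper's two-dimensional rainfall implementation), is shown below.

import numpy as np
import pywt

def wavelet_trend(y, wavelet="db4", level=3):
    # Large-scale trend: keep only the approximation coefficients
    coeffs = pywt.wavedec(y, wavelet, level=level)
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(y)]

def simple_krige(x_obs, r_obs, x_new, range_=5.0, sill=1.0, nugget=1e-6):
    # Simple kriging of zero-mean residuals with an assumed exponential covariance
    cov = lambda h: sill * np.exp(-np.abs(h) / range_)
    C = cov(x_obs[:, None] - x_obs[None, :]) + nugget * np.eye(len(x_obs))
    c0 = cov(x_new[:, None] - x_obs[None, :])
    return c0 @ np.linalg.solve(C, r_obs)

# Toy 1-D transect standing in for a rainfall field
rng = np.random.default_rng(6)
x = np.arange(128.0)
signal = 10 + 3 * np.sin(x / 20) + rng.normal(0, 0.8, x.size)   # trend + local variability
trend = wavelet_trend(signal)
resid = signal - trend

x_new = np.array([10.5, 64.25, 100.75])
prediction = np.interp(x_new, x, trend) + simple_krige(x, resid, x_new)
print("WARK-style predictions:", np.round(prediction, 2))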

Relevance: 30.00%

Abstract:

Aim: This study used data from temperate forest communities to assess: (1) five different stepwise selection methods with generalized additive models, (2) the effect of weighting absences to ensure a prevalence of 0.5, (3) the effect of limiting absences beyond the environmental envelope defined by presences, (4) four different methods for incorporating spatial autocorrelation, and (5) the effect of integrating an interaction factor defined by a regression tree on the residuals of an initial environmental model. Location: State of Vaud, western Switzerland. Methods: Generalized additive models (GAMs) were fitted using the grasp package (generalized regression analysis and spatial predictions, http://www.cscf.ch/grasp). Results: Model selection based on cross-validation appeared to be the best compromise between model stability and performance (parsimony) among the five methods tested. Weighting absences returned models that perform better than models fitted with the original sample prevalence. This appeared to be mainly due to the impact of very low prevalence values on evaluation statistics. Removing zeroes beyond the range of presences on main environmental gradients changed the set of selected predictors, and potentially their response curve shape. Moreover, removing zeroes slightly improved model performance and stability when compared with the baseline model on the same data set. Incorporating a spatial trend predictor improved model performance and stability significantly. Even better models were obtained when including local spatial autocorrelation. A novel approach to include interactions proved to be an efficient way to account for interactions between all predictors at once. Main conclusions: Models and spatial predictions of 18 forest communities were significantly improved by using either: (1) cross-validation as a model selection method, (2) weighted absences, (3) limited absences, (4) predictors accounting for spatial autocorrelation, or (5) a factor variable accounting for interactions between all predictors. The final choice of model strategy should depend on the nature of the available data and the specific study aims. Statistical evaluation is useful in searching for the best modelling practice. However, one should not neglect to consider the shapes and interpretability of response curves, as well as the resulting spatial predictions in the final assessment.
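Strategy (2), weighting absences so that the weighted prevalence equals 0.5, can be implemented as simply as the sketch below; the original study did this within the grasp package, so this is only an assumed equivalent.

import numpy as np

def prevalence_weights(presence):
    # Per-observation weights giving presences and absences equal total weight
    # (i.e. a weighted prevalence of 0.5)
    presence = np.asarray(presence)
    n_pres = presence.sum()
    n_abs = (presence == 0).sum()
    return np.where(presence == 1, 1.0, n_pres / n_abs)

y = np.array([1] * 40 + [0] * 360)            # 10% raw prevalence
w = prevalence_weights(y)
weighted_prev = (w * y).sum() / w.sum()
print("raw prevalence:", y.mean(), "weighted prevalence:", weighted_prev)   # 0.1 -> 0.5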

Relevance: 30.00%

Abstract:

To study different temporal components of cancer mortality (age, period and cohort), methods of graphic representation were applied to Swiss mortality data from 1950 to 1984. Maps using continuous slopes ("contour maps") and based on eight tones of grey according to the absolute distribution of rates were used to represent the surfaces defined by the matrix of various age-specific rates. Further, progressively more complex regression surface equations were defined on the basis of two independent variables (age/cohort) and a dependent one (each age-specific mortality rate). General patterns of trends in cancer mortality were thus identified, permitting definition of important cohort (e.g., upwards for lung and other tobacco-related neoplasms, or downwards for stomach) or period (e.g., downwards for intestines or thyroid cancers) effects, besides the major underlying age component. For most cancer sites, even the lower-order (1st to 3rd) models utilised provided excellent fits, allowing immediate identification of the residuals (e.g., high or low mortality points) as well as estimates of first-order interactions between the three factors, although the parameters of the main effects remained undetermined. Thus, the method should essentially be used as a summary guide to illustrate and understand the general patterns of age, period and cohort effects in (cancer) mortality, although it cannot conceptually solve the inherent problem of identifiability of the three components.
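A minimal sketch of fitting a low-order regression surface in age and cohort to age-specific (log) rates and inspecting its residuals is shown below; it uses a fabricated grid rather than the Swiss data and illustrates the idea only.

import numpy as np

def poly_surface_design(age, cohort, order=2):
    # Design matrix for a polynomial regression surface in age and cohort
    cols = [np.ones_like(age)]
    for total in range(1, order + 1):
        for i in range(total + 1):
            cols.append(age ** (total - i) * cohort ** i)
    return np.column_stack(cols)

# Toy age x period grid of log mortality rates (cohort = period - age)
age = np.repeat(np.arange(40, 85, 5), 7).astype(float)
period = np.tile(np.arange(1950, 1985, 5), 9).astype(float)
cohort = period - age
rng = np.random.default_rng(7)
log_rate = -9 + 0.08 * age + 0.01 * (cohort - 1900) + rng.normal(0, 0.05, age.size)

X = poly_surface_design(age, cohort, order=2)
coef, *_ = np.linalg.lstsq(X, log_rate, rcond=None)
residuals = log_rate - X @ coef
print("largest |residual| (candidate high/low mortality point):",
      residuals[np.abs(residuals).argmax()].round(3))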

Relevance: 30.00%

Abstract:

The objective of this study was to evaluate the efficiency of spatial statistical analysis in the selection of genotypes in a plant breeding program and, particularly, to demonstrate the benefits of the approach when experimental observations are not spatially independent. The basic material of this study was a yield trial of soybean lines, with five check varieties (treated as fixed effects) and 110 test lines (treated as random effects), in an augmented block design. The spatial analysis used a random field linear model (RFML), with a covariance function estimated from the residuals of an analysis assuming independent errors. The results showed residual autocorrelation of significant magnitude and extent (range), which allowed better discrimination among genotypes (an increase in the power of statistical tests, a reduction in the standard errors of estimates and predictors, and a greater amplitude of predictor values) when the spatial analysis was applied. Furthermore, the spatial analysis led to a different ranking of the genetic materials in comparison with the non-spatial analysis, and a selection less influenced by local variation effects was obtained.
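One way the spatial dependence of residuals can be characterized before a covariance function is built is an empirical semivariogram; the sketch below uses a fabricated field layout and residuals and is not the RFML fitting procedure used in the study.

import numpy as np

def empirical_semivariogram(coords, resid, n_bins=10):
    # Half the average squared difference between residual pairs, binned by inter-plot distance
    diff = resid[:, None] - resid[None, :]
    dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
    iu = np.triu_indices(len(resid), k=1)
    d, sq = dist[iu], 0.5 * diff[iu] ** 2
    bins = np.linspace(0, d.max(), n_bins + 1)
    idx = np.digitize(d, bins) - 1
    gamma = np.array([sq[idx == b].mean() if np.any(idx == b) else np.nan for b in range(n_bins)])
    return 0.5 * (bins[:-1] + bins[1:]), gamma

# Toy field layout: residuals with a smooth spatial trend plus noise
rng = np.random.default_rng(8)
rows, cols = np.meshgrid(np.arange(10), np.arange(12), indexing="ij")
coords = np.column_stack([rows.ravel(), cols.ravel()]).astype(float)
resid = 0.3 * np.sin(coords[:, 0] / 3) + rng.normal(0, 0.1, len(coords))

lags, gamma = empirical_semivariogram(coords, resid)
print(np.round(lags, 1))
print(np.round(gamma, 4))   # gamma rising with distance indicates spatial dependence (a 'range')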