55 results for Computer Diagnostics

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance: 30.00%

Abstract:

The zero-inflated negative binomial model is used to account for overdispersion detected in data that are initially analyzed under the zero-inflated Poisson model. A frequentist analysis, a jackknife estimator, and a non-parametric bootstrap for parameter estimation of zero-inflated negative binomial regression models are considered. In addition, an EM-type algorithm is developed for performing maximum likelihood estimation. Then, the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes and some ways to perform global influence analysis are derived. In order to study departures from the error assumption as well as the presence of outliers, residual analysis based on the standardized Pearson residuals is discussed. The relevance of the approach is illustrated with a real data set, where it is shown that zero-inflated negative binomial regression models seem to fit the data better than the Poisson counterpart. (C) 2010 Elsevier B.V. All rights reserved.
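The zero-inflated negative binomial distribution described above mixes a point mass at zero with an ordinary negative binomial count distribution. As a rough illustration only (the paper adds regression structure, an EM algorithm, and jackknife/bootstrap estimators, none of which are shown here), a minimal sketch of the ZINB log-pmf and its negative log-likelihood might look like:

```python
import numpy as np
from scipy.stats import nbinom

def zinb_logpmf(y, mu, size, pi):
    """Log-pmf of a zero-inflated negative binomial (intercept-only sketch).

    pi   : probability of a structural zero
    mu   : negative binomial mean
    size : negative binomial dispersion (shape) parameter
    """
    p = size / (size + mu)            # scipy's success-probability parameterization
    nb = nbinom.pmf(y, size, p)       # plain NB component
    # zeros arise either structurally (pi) or from the NB component
    pmf = np.where(np.asarray(y) == 0, pi + (1.0 - pi) * nb, (1.0 - pi) * nb)
    return np.log(pmf)

def zinb_negloglik(params, y):
    """Negative log-likelihood, suitable for a numerical optimizer."""
    mu, size, pi = params
    return -np.sum(zinb_logpmf(y, mu, size, pi))
```

Minimizing `zinb_negloglik` over (mu, size, pi), e.g. with `scipy.optimize.minimize`, would give maximum likelihood estimates in this intercept-only setting.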

Relevance: 30.00%

Abstract:

The purpose of this paper is to develop a Bayesian analysis for nonlinear regression models under scale mixtures of skew-normal distributions. This novel class of models provides a useful generalization of symmetrical nonlinear regression models, since the error distributions cover both skewness and heavy tails, as in the skew-t, skew-slash and skew-contaminated normal distributions. The main advantage of this class of distributions is that it has a nice hierarchical representation that allows the implementation of Markov chain Monte Carlo (MCMC) methods to simulate samples from the joint posterior distribution. In order to examine the robustness of this flexible class against outlying and influential observations, we present Bayesian case-deletion influence diagnostics based on the Kullback-Leibler divergence. Further, some discussion of model selection criteria is given. The newly developed procedures are illustrated with two simulation studies and a real data set previously analyzed under normal and skew-normal nonlinear regression models. (C) 2010 Elsevier B.V. All rights reserved.
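The "nice hierarchical representation" the abstract refers to is, in the simplest skew-normal case, the textbook stochastic representation Z = delta*|U| + sqrt(1 - delta^2)*V with U, V independent standard normals; conditioning on the half-normal latent variable |U| is what makes Gibbs/MCMC schemes tractable. A sketch of sampling from this representation (the base case only, not the paper's full scale-mixture class) follows:

```python
import numpy as np

def rskew_normal(n, loc=0.0, scale=1.0, shape=1.0, rng=None):
    """Draw skew-normal variates via the hierarchical (half-normal) form
    Z = delta*|U| + sqrt(1 - delta**2)*V, where delta = shape/sqrt(1 + shape**2).
    Illustrative sketch of the standard representation."""
    rng = np.random.default_rng(rng)
    delta = shape / np.sqrt(1.0 + shape**2)
    u = np.abs(rng.standard_normal(n))     # half-normal latent variable
    v = rng.standard_normal(n)             # independent normal component
    return loc + scale * (delta * u + np.sqrt(1.0 - delta**2) * v)
```

In a Gibbs sampler for these models, the latent u would be drawn from its truncated-normal full conditional rather than marginally as here.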

Relevance: 30.00%

Abstract:

In this work we propose and analyze nonlinear elliptical models for longitudinal data, which represent an alternative to Gaussian models in the presence of heavy tails, for instance. The elliptical distributions may help to control the influence of the observations on the parameter estimates by naturally attributing different weights to each case. We consider random effects to introduce the within-group correlation and work with the marginal model without requiring numerical integration. An iterative algorithm to obtain maximum likelihood estimates for the parameters is presented, as well as diagnostic results based on residual distances and local influence [Cook, D., 1986. Assessment of local influence. Journal of the Royal Statistical Society, Series B 48 (2), 133-169; Cook, D., 1987. Influence assessment. Journal of Applied Statistics 14 (2), 117-131; Escobar, L.A., Meeker, W.Q., 1992. Assessing influence in regression analysis with censored data. Biometrics 48, 507-528]. As a numerical illustration, we apply the obtained results to a kinetics longitudinal data set presented in [Vonesh, E.F., Carter, R.L., 1992. Mixed-effects nonlinear regression for unbalanced repeated measures. Biometrics 48, 1-17], which was analyzed under the assumption of normality. (C) 2009 Elsevier B.V. All rights reserved.
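The weighting mechanism the abstract mentions can be seen in the simplest elliptical member, the Student-t: its estimating equations downweight cases with large residuals. A sketch of iteratively reweighted least squares for a linear model with t errors follows; it assumes independent observations and a fixed degrees-of-freedom nu, so it illustrates only the weighting idea, not the paper's nonlinear random-effects model:

```python
import numpy as np

def t_irls(X, y, nu=4.0, n_iter=50):
    """IRLS for a linear model with Student-t errors (one member of the
    elliptical family). Cases with large residuals get small weights,
    which tempers the influence of outliers. Sketch: fixed nu, crude
    moment-based scale estimate at each step."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - X @ beta
        s2 = np.mean(r**2)                     # crude scale estimate
        w = (nu + 1.0) / (nu + r**2 / s2)      # t-model case weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta, w
```

Plotting the returned weights against case index is itself a simple influence diagnostic: heavily downweighted cases are the candidate outliers.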

Relevance: 30.00%

Abstract:

The Grubbs' measurement model is frequently used to compare several measuring devices. It is common to assume that the random terms have a normal distribution. However, such an assumption makes the inference vulnerable to outlying observations, whereas scale mixtures of normal distributions have been an interesting alternative for producing robust estimates while keeping the elegance and simplicity of maximum likelihood theory. The aim of this paper is to develop an EM-type algorithm for parameter estimation and to use the local influence method to assess the robustness of these parameter estimates under some usual perturbation schemes. In order to identify outliers and to criticize the model building, we apply the local influence procedure in a study comparing the precision of several thermocouples. (C) 2008 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

We consider the issue of assessing the influence of observations in the class of beta regression models, which is useful for modelling random variables that assume values in the standard unit interval and are affected by independent variables. We propose a Cook-like distance and also measures of local influence under different perturbation schemes. Applications using real data are presented. (C) 2008 Elsevier B.V. All rights reserved.
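The "Cook-like distance" proposed above generalizes the classical Cook's distance, whose ingredients are hat values and standardized residuals. A sketch of the classical quantity for an ordinary linear model follows, purely to fix ideas; the paper's version adapts these ingredients to the beta regression likelihood:

```python
import numpy as np

def cooks_distance(X, y):
    """Classical Cook's distance for a linear model, D_i proportional to
    r_i^2 * h_ii / (1 - h_ii)^2: large residual combined with high
    leverage flags an influential case. Illustrative sketch only."""
    n, p = X.shape
    H = X @ np.linalg.inv(X.T @ X) @ X.T   # hat matrix
    h = np.diag(H)                          # leverages
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    r = y - X @ beta                        # raw residuals
    s2 = r @ r / (n - p)                    # residual variance estimate
    return (r**2 / (p * s2)) * h / (1.0 - h)**2
```

A common informal cutoff is to examine cases with D_i above 4/n, though thresholds are heuristic.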

Relevance: 30.00%

Abstract:

This paper provides general matrix formulas for computing the score function, the (expected and observed) Fisher information and the A matrices (required for the assessment of local influence) for a quite general model which includes the one proposed by Russo et al. (2009). Additionally, we also present an expression for the generalized leverage on fixed and random effects. The matrix formulation has notational advantages, since despite the complexity of the postulated model, all general formulas are compact, clear and have nice forms. (C) 2010 Elsevier B.V. All rights reserved.

Relevance: 20.00%

Abstract:

This paper proposes an architecture for machining process and production monitoring to be applied in machine tools with open computer numerical control (CNC). A brief description of the advantages of using open CNC for machining process and production monitoring is presented, with an emphasis on the CNC architecture using a personal computer (PC)-based human-machine interface. The proposed architecture uses the CNC data and sensors to gather information about the machining process and production. It allows the development of different levels of monitoring systems with minimum investment, minimum need for sensor installation, and low intrusiveness to the process. Successful examples of the utilization of this architecture in a laboratory environment are briefly described. In conclusion, it is shown that a wide range of monitoring solutions can be implemented in production processes using the proposed architecture.

Relevance: 20.00%

Abstract:

Nowadays, digital computer systems and networks are the main engineering tools, being used in the planning, design, operation, and control of all sizes of building, transportation, machinery, business, and life-maintaining devices. Consequently, computer viruses became one of the most important sources of uncertainty, contributing to decreased reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on previous knowledge of the virus code. In spite of having good adaptation capability, these programs work just as vaccines against diseases and are not able to prevent new infections based on the network state. Here, a trial on modeling the propagation dynamics of computer viruses relates it to other notable events occurring in the network, permitting the establishment of preventive policies in network management. Data from three different viruses were collected on the Internet, and two different identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus propagation by using data collected from other viruses that formerly infected the network. Copyright (c) 2008 J. R. C. Piqueira and F. B. Cesar. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
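Autoregressive identification of the kind mentioned above fits each value of a time series as a linear combination of its recent past, then extrapolates. A least-squares sketch follows; the lag order and the synthetic usage are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def fit_ar(series, p):
    """Fit an AR(p) model x_t = c0 + sum_k c_k * x_{t-k} by least squares.
    Returns (intercept, lag coefficients). Sketch of AR identification."""
    series = np.asarray(series, dtype=float)
    Y = series[p:]
    X = np.column_stack([np.ones(len(Y))] +
                        [series[p - k:-k] for k in range(1, p + 1)])
    coef = np.linalg.lstsq(X, Y, rcond=None)[0]
    return coef[0], coef[1:]

def forecast_ar(series, intercept, coefs, steps):
    """Recursive multi-step forecasts from the fitted AR model."""
    hist = list(series)
    out = []
    for _ in range(steps):
        nxt = intercept + sum(c * hist[-k] for k, c in enumerate(coefs, 1))
        hist.append(nxt)
        out.append(nxt)
    return out
```

Fitting such a model to the infection counts of a known virus and forecasting with it is the spirit of the cross-virus prediction the abstract describes.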

Relevance: 20.00%

Abstract:

Context. Compact groups of galaxies are entities that have high densities of galaxies and serve as laboratories to study galaxy interactions, intergalactic star formation and galaxy evolution. Aims. The main goal of this study is to search for young objects in the intragroup medium of seven compact groups of galaxies: HCG 2, 7, 22, 23, 92, 100 and NGC 92, as well as to evaluate the stage of interaction of each group. Methods. We used Fabry-Perot velocity fields and rotation curves together with GALEX NUV and FUV images and optical R-band and HI maps. Results. (i) HCG 7 and HCG 23 are in early stages of interaction; (ii) HCG 2 and HCG 22 are mildly interacting; and (iii) HCG 92, HCG 100 and NGC 92 are in late stages of evolution. We find that all three evolved groups contain populations of young blue objects in the intragroup medium, consistent with ages < 100 Myr, of which several are younger than 10 Myr. We also report the discovery of a tidal dwarf galaxy candidate in the tail of NGC 92. These three groups, besides containing galaxies that have peculiar velocity fields, also show extended HI tails. Conclusions. Our results indicate that the advanced stage of evolution of a group, together with the presence of intragroup HI clouds, may lead to star formation in the intragroup medium. A table containing all intergalactic HII regions and tidal dwarf galaxies confirmed to date is appended.

Relevance: 20.00%

Abstract:

We present the first simultaneous measurements of the Thomson scattering and electron cyclotron emission radiometer diagnostics performed at TCABR tokamak with Alfven wave heating. The Thomson scattering diagnostic is an upgraded version of the one previously installed at the ISTTOK tokamak, while the electron cyclotron emission radiometer employs a heterodyne sweeping radiometer. For purely Ohmic discharges, the electron temperature measurements from both diagnostics are in good agreement. Additional Alfven wave heating does not affect the capability of the Thomson scattering diagnostic to measure the instantaneous electron temperature, whereas measurements from the electron cyclotron emission radiometer become underestimates of the actual temperature values. (C) 2010 American Institute of Physics. [doi:10.1063/1.3494379]

Relevance: 20.00%

Abstract:

Multispectral widefield optical imaging has the potential to improve early detection of oral cancer. The appropriate selection of illumination and collection conditions is required to maximize diagnostic ability. The goals of this study were to (i) evaluate image contrast between oral cancer/precancer and non-neoplastic mucosa for a variety of imaging modalities and illumination/collection conditions, and (ii) use classification algorithms to evaluate and compare the diagnostic utility of these modalities to discriminate cancers and precancers from normal tissue. Narrowband reflectance, autofluorescence, and polarized reflectance images were obtained from 61 patients and 11 normal volunteers. Image contrast was compared to identify modalities and conditions yielding greatest contrast. Image features were extracted and used to train and evaluate classification algorithms to discriminate tissue as non-neoplastic, dysplastic, or cancer; results were compared to histologic diagnosis. Autofluorescence imaging at 405-nm excitation provided the greatest image contrast, and the ratio of red-to-green fluorescence intensity computed from these images provided the best classification of dysplasia/cancer versus non-neoplastic tissue. A sensitivity of 100% and a specificity of 85% were achieved in the validation set. Multispectral widefield images can accurately distinguish neoplastic and non-neoplastic tissue; however, the ability to separate precancerous lesions from cancers with this technique was limited. (C) 2010 Society of Photo-Optical Instrumentation Engineers. [DOI: 10.1117/1.3516593]
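The best-performing feature above is a single scalar, the red-to-green autofluorescence intensity ratio, thresholded to separate dysplasia/cancer from non-neoplastic tissue. A sketch of that kind of ratio classifier, with the sensitivity/specificity bookkeeping used to score it, follows; the threshold value and the array inputs are illustrative assumptions, not the study's calibrated values:

```python
import numpy as np

def rg_ratio_classify(red, green, threshold):
    """Flag a region as suspicious when its red-to-green fluorescence
    intensity ratio exceeds a threshold. Sketch of the ratio feature."""
    ratio = np.asarray(red, dtype=float) / np.asarray(green, dtype=float)
    return ratio > threshold

def sensitivity_specificity(pred, truth):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP)."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    tp = np.sum(pred & truth)
    fn = np.sum(~pred & truth)
    tn = np.sum(~pred & ~truth)
    fp = np.sum(pred & ~truth)
    return tp / (tp + fn), tn / (tn + fp)
```

In practice the threshold would be chosen on a training set (e.g. by sweeping it and picking a point on the ROC curve) and then evaluated on a held-out validation set, as the study does.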

Relevance: 20.00%

Abstract:

A two-dimensional numeric simulator is developed to predict the nonlinear, convective-reactive, oxygen mass exchange in a cross-flow hollow fiber blood oxygenator. The numeric simulator also calculates the carbon dioxide mass exchange, as hemoglobin affinity to oxygen is affected by the local pH value, which depends mostly on the local carbon dioxide content in blood. Blood pH calculation inside the oxygenator is made by the simultaneous solution of an equation that takes into account the blood buffering capacity and the classical Henderson-Hasselbalch equation. The modeling of the mass transfer conductance in the blood comprises a global factor, which is a function of the Reynolds number, and a local factor, which takes into account the amount of oxygen reacted to hemoglobin. The simulator is calibrated against experimental data for an in-line fiber bundle. The results are: (i) the calibration process allows the precise determination of the mass transfer conductance for both oxygen and carbon dioxide; (ii) very alkaline pH values occur in the blood path at the gas inlet side of the fiber bundle; (iii) the parametric analysis of the effect of the blood base excess (BE) shows that V(CO2) is similar in the case of blood metabolic alkalosis, metabolic acidosis, or normal BE, for a similar blood inlet P(CO2), although the condition of metabolic alkalosis is the worst case, as the pH in the vicinity of the gas inlet is the most alkaline; (iv) the parametric analysis of the effect of the gas flow to blood flow ratio (Q(G)/Q(B)) shows that V(CO2) variation with the gas flow is almost linear up to Q(G)/Q(B) = 2.0. V(O2) is not affected by the gas flow, as it was observed that by increasing the gas flow up to eight times, the V(O2) grows only 1%. The mass exchange of carbon dioxide uses the full length of the hollow fiber only if Q(G)/Q(B) > 2.0, as it was observed that only in this condition does the local variation of pH and blood P(CO2) comprise the whole fiber bundle.
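The Henderson-Hasselbalch relation used in the pH calculation above ties blood pH to the bicarbonate concentration and the CO2 partial pressure. A sketch with the standard textbook constants follows (pKa = 6.1, CO2 solubility 0.03 mmol/L per mmHg); the paper solves this simultaneously with a buffering-capacity equation, which is not reproduced here:

```python
import math

def blood_ph(hco3_meq_per_l, pco2_mmhg, pka=6.1, co2_solubility=0.03):
    """Classical Henderson-Hasselbalch relation:
    pH = pKa + log10([HCO3-] / (s * PCO2)),
    with [HCO3-] in mEq/L, PCO2 in mmHg, and s the CO2 solubility
    coefficient in mmol/L/mmHg. Textbook constants, illustrative only."""
    return pka + math.log10(hco3_meq_per_l / (co2_solubility * pco2_mmhg))
```

The formula makes the abstract's observation (ii) intuitive: near the gas inlet the local PCO2 is lowest, so the logarithm's argument is largest and the computed pH is most alkaline.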

Relevance: 20.00%

Abstract:

A gap has been identified in the literature on the diagnosis and monitoring of the degree of strategic alignment. The main objective of this article is to diagnose and analyze the strategic alignment profile using the alignment diagnostic profile (ADP) tool, which enables organizations to show visually their degree of strategic alignment. The methodological approach adopted is multiple-case studies, which were conducted at five organizations in the medical diagnostics sector. The results indicate that the ADP enables organizations to understand the steps required to improve their level of alignment and to identify and locate gaps and conflicts.