26 results for Model Output Statistics
in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
Background and aims: The boundaries between the categories of body composition provided by vectorial analysis of bioimpedance are not well defined. In this paper, fuzzy set theory was used to model such uncertainty. Methods: An Italian database with 179 cases (18-70 years) was divided randomly into a development sample (n = 20) and a testing sample (n = 159). Of the 159 registries in the testing sample, 99 contributed an unequivocal diagnosis. Resistance/height and reactance/height were the input variables of the model. The output variables were the seven categories of body composition of vectorial analysis. For each case, the linguistic model estimated the membership degree of each impedance category. To compare these results with the previously established diagnoses, the Kappa statistic was used. This demanded singling out one among the output set of seven categories of membership degrees. This procedure (defuzzification rule) established that the category with the highest membership degree should be taken as the most likely category for the case. Results: The fuzzy model showed a good fit to the development sample. Excellent agreement was achieved between the defuzzified impedance diagnoses and the clinical diagnoses in the testing sample (Kappa = 0.85, p < 0.001). Conclusions: The fuzzy linguistic model was found to be in good agreement with clinical diagnoses. If the whole model output is considered, information on the extent to which each BIVA category is present better advises clinical practice, with an enlarged nosological framework and diverse therapeutic strategies. (C) 2012 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
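A minimal sketch of the defuzzification rule described above: each category carries a membership degree, and the category with the highest degree is taken as the diagnosis. The triangular membership functions and cutoff values below are illustrative placeholders, not the ones fitted in the paper.

```python
# Sketch of the defuzzification rule: among the membership degrees of the
# BIVA categories, pick the category with the highest degree.
# The triangular membership functions below are illustrative placeholders,
# not the ones developed in the paper.

def triangular(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical membership functions over resistance/height (ohm/m)
CATEGORIES = {
    "dehydration":    lambda r: triangular(r, 500, 650, 800),
    "normal":         lambda r: triangular(r, 300, 450, 600),
    "fluid_overload": lambda r: triangular(r, 100, 250, 400),
}

def defuzzify(r_over_h):
    """Return (most likely category, its membership degree)."""
    degrees = {name: f(r_over_h) for name, f in CATEGORIES.items()}
    best = max(degrees, key=degrees.get)
    return best, degrees[best]
```

Note that the full output (all membership degrees, not just the winner) is what the conclusion argues carries the extra clinical information.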
Abstract:
In this paper we consider an equilibrium last-passage percolation model on an environment given by a compound two-dimensional Poisson process. We prove an L^2 formula relating the initial measure with the last-passage percolation time. This formula turns out to be a useful tool to analyze the fluctuations of the last-passage times along non-characteristic directions.
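For intuition, the last-passage time satisfies the standard corner-growth recursion G(i, j) = w(i, j) + max(G(i-1, j), G(i, j-1)); a toy sketch on a discrete grid of nonnegative weights (a simplification of the compound Poisson environment studied in the paper):

```python
# Last-passage percolation on an n x m grid of nonnegative weights:
# G(i, j) = w(i, j) + max(G(i-1, j), G(i, j-1)),
# the maximal total weight along an up-right path from (0, 0) to (i, j).

def last_passage_time(w):
    n, m = len(w), len(w[0])
    G = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            best_prev = max(G[i - 1][j] if i > 0 else 0.0,
                            G[i][j - 1] if j > 0 else 0.0)
            G[i][j] = w[i][j] + best_prev
    return G[n - 1][m - 1]
```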
Abstract:
Brazilian design code ABNT NBR6118:2003 - Design of Concrete Structures - Procedures [1] proposes the use of simplified models for the consideration of non-linear material behavior in the evaluation of horizontal displacements in buildings. These models penalize the stiffness of columns and beams, representing the effects of concrete cracking and avoiding costly physical non-linear analyses. The objectives of the present paper are to investigate the accuracy and uncertainty of these simplified models, as well as to evaluate the reliabilities of structures designed following ABNT NBR6118:2003 [1] in the service limit state for horizontal displacements. Model error statistics are obtained from 42 representative plane frames. The reliabilities of three typical (4, 8 and 12 floor) buildings are evaluated, using the simplified models and a rigorous, physical and geometrical non-linear analysis. Results show that the 70/70 (column/beam stiffness reduction) model is more accurate and less conservative than the 80/40 model. Results also show that the ABNT NBR6118:2003 [1] design criteria for horizontal displacement limit states (masonry damage according to ACI 435.3R-68(1984) [10]) are conservative, and result in reliability indices larger than those recommended in EUROCODE [2] for irreversible service limit states.
Abstract:
In this paper we propose a hybrid hazard regression model with threshold stress which includes the proportional hazards and the accelerated failure time models as particular cases. To express the behavior of lifetimes the generalized gamma distribution is assumed, and an inverse power law model with a threshold stress is considered. For parameter estimation we develop a sampling-based posterior inference procedure based on Markov chain Monte Carlo techniques. We assume proper but vague priors for the parameters of interest. A simulation study investigates the frequentist properties of the proposed estimators obtained under the assumption of vague priors. Further, some discussion of model selection criteria is given. The methodology is illustrated on simulated and real lifetime data sets.
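As a generic illustration of sampling-based posterior inference with a vague prior (not the paper's hybrid hazard model), a random-walk Metropolis sampler for the rate of exponential lifetimes under a vague Gamma prior:

```python
import math
import random

# Toy sampling-based posterior inference: exponential lifetimes with rate lam,
# vague Gamma(a, b) prior, random-walk Metropolis on log(lam).
# This illustrates only the MCMC machinery; the paper's model is a hybrid
# hazard regression with threshold stress, not this toy.

def log_posterior(lam, data, a=0.01, b=0.01):
    """Log of the unnormalized posterior density of lam."""
    if lam <= 0:
        return -math.inf
    n, s = len(data), sum(data)
    return (a - 1 + n) * math.log(lam) - lam * (b + s)

def metropolis(data, n_iter=20000, step=0.3, seed=1):
    random.seed(seed)
    log_lam = 0.0
    samples = []
    for _ in range(n_iter):
        prop = log_lam + random.gauss(0.0, step)
        # The +prop / -log_lam terms are the Jacobian of the log transform.
        log_acc = (log_posterior(math.exp(prop), data) + prop
                   - log_posterior(math.exp(log_lam), data) - log_lam)
        if math.log(random.random()) < log_acc:
            log_lam = prop
        samples.append(math.exp(log_lam))
    return samples[n_iter // 2:]  # discard burn-in
```

Because the Gamma prior is conjugate here, the sampler's output can be checked against the closed-form posterior mean (a + n) / (b + sum of lifetimes).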
Abstract:
OBJECTIVES: Hemodynamic support aims to provide adequate O2 delivery to the tissues; most interventions target an increase in O2 delivery. Mixed venous O2 saturation is a frequently used parameter to evaluate the adequacy of O2 delivery. METHODS: We describe a mathematical model to compare the effects of increasing O2 delivery on venous oxygen saturation through increases in the inspired O2 fraction versus increases in cardiac output. The model was created based on the lungs, which were divided into shunted and non-shunted areas, and on seven peripheral compartments, each with normal values of perfusion, optimal oxygen consumption, and critical O2 extraction rate. O2 delivery was increased by changing the inspired fraction of oxygen from 0.21 to 1.0 in steps of 0.1 under conditions of low (2.0 L/min) or normal (6.5 L/min) cardiac output. The same O2 delivery values were also obtained by maintaining a fixed inspired O2 fraction of 0.21 while changing cardiac output. RESULTS: Venous oxygen saturation was higher when produced through increases in the inspired O2 fraction than through increases in cardiac output, even at the same O2 delivery and consumption values. Specifically, at high inspired O2 fractions, the measured O2 saturation values failed to detect conditions of low oxygen supply. CONCLUSIONS: The mode of O2 delivery optimization, specifically increases in the fraction of inspired oxygen versus increases in cardiac output, can compromise the capability of the "venous O2 saturation" parameter to measure the adequacy of oxygen supply. Consequently, venous saturation at high inspired O2 fractions should be interpreted with caution.
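The qualitative finding can be reproduced with a bare-bones Fick-principle calculation, far simpler than the seven-compartment model described above; the hemoglobin, saturation, and partial-pressure values below are illustrative assumptions, not the paper's parameters:

```python
# Fick-principle sketch: the same O2 delivery (DO2) reached via high FiO2 at
# low cardiac output yields a HIGHER venous O2 content than via high cardiac
# output at FiO2 = 0.21, because CvO2 = (DO2 - VO2) / (10 * CO).
# All numeric values are illustrative assumptions.

HB = 15.0    # hemoglobin, g/dL (assumed)
VO2 = 200.0  # O2 consumption, mL/min (assumed)

def cao2(sao2, pao2):
    """Arterial O2 content (mL/dL): hemoglobin-bound plus dissolved O2."""
    return 1.34 * HB * sao2 + 0.003 * pao2

def svo2(do2, co):
    """Approximate venous O2 saturation from the Fick principle."""
    cvo2 = (do2 - VO2) / (10.0 * co)
    return cvo2 / (1.34 * HB)  # dissolved venous O2 neglected

# Route A: FiO2 = 1.0, low cardiac output (2.0 L/min)
ca_a = cao2(sao2=1.0, pao2=500.0)
do2 = 10.0 * 2.0 * ca_a             # fix DO2 at this value

# Route B: FiO2 = 0.21, raise cardiac output until DO2 matches
ca_b = cao2(sao2=0.97, pao2=95.0)
co_b = do2 / (10.0 * ca_b)

sv_a = svo2(do2, co=2.0)
sv_b = svo2(do2, co=co_b)
# sv_a > sv_b: at the same DO2 and VO2, the high-FiO2 route shows the
# higher venous saturation, masking the low-flow state.
```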
Abstract:
The purpose of this paper is to develop a Bayesian analysis for right-censored survival data when immune or cured individuals may be present in the population from which the data are taken. In our approach the number of competing causes of the event of interest follows the Conway-Maxwell-Poisson distribution, which generalizes the Poisson distribution. Markov chain Monte Carlo (MCMC) methods are used to develop a Bayesian procedure for the proposed model. Some discussion of model selection and an illustration with a real data set are also provided.
Abstract:
A Bayesian nonparametric model for Taguchi's on-line quality monitoring procedure for attributes is introduced. The proposed model extends the original single-shift setting to the more realistic situation of gradual quality deterioration and allows the incorporation of an expert's opinion on the production process. Based on the number of inspections carried out until a defective item is found, the Bayesian update of the distribution function representing the increasing sequence of defective fractions during a cycle is performed, with a mixture of Dirichlet processes as the prior distribution. Bayes estimates for relevant quantities are also obtained. (C) 2012 Elsevier B.V. All rights reserved.
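A deliberately simplified, fully parametric stand-in for the Bayesian updating idea (not the paper's Dirichlet-process mixture): with a Beta prior on a fixed defective fraction p and a geometric count of inspections until the first defective item, the update is closed-form.

```python
# Simplified conjugate stand-in for the Bayesian update: if the defective
# fraction p has a Beta(a, b) prior and N ~ Geometric(p) counts inspections
# until the first defective item, P(N = n) = (1 - p)**(n - 1) * p, then the
# posterior is Beta(a + 1, b + n - 1). The paper itself uses a mixture of
# Dirichlet processes to model an increasing *sequence* of defective fractions.

def beta_geometric_update(a, b, n):
    """Posterior Beta parameters after the first defective appears at inspection n."""
    return a + 1, b + n - 1

def posterior_mean(a, b):
    return a / (a + b)
```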
Abstract:
In this paper, we propose a cure rate survival model by assuming that the number of competing causes of the event of interest follows the Geometric distribution and the time to event follows a Birnbaum-Saunders distribution. We consider a frequentist analysis for parameter estimation of the Geometric Birnbaum-Saunders model with cure rate. Finally, the model is applied to a data set from the medical area. (C) 2011 Elsevier B.V. All rights reserved.
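Under a geometric number of competing causes, the population survival function has a closed form; a sketch under the usual promotion-time construction (parameter values illustrative), with the standard-normal cdf built from the stdlib error function:

```python
import math

# Cure-rate survival with a Geometric number of competing causes:
# if P(M = m) = (1 - theta) * theta**m, m = 0, 1, ..., and each cause fires at
# a time with survival S(t), then S_pop(t) = E[S(t)**M]
#   = (1 - theta) / (1 - theta * S(t)),
# and the cure fraction is S_pop(infinity) = P(M = 0) = 1 - theta.

def bs_survival(t, alpha, beta):
    """Birnbaum-Saunders survival with shape alpha and scale beta."""
    z = (math.sqrt(t / beta) - math.sqrt(beta / t)) / alpha
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # 1 - Phi(z)

def pop_survival(t, theta, alpha, beta):
    s = bs_survival(t, alpha, beta)
    return (1.0 - theta) / (1.0 - theta * s)
```

At t = beta the Birnbaum-Saunders survival is exactly 1/2, and for large t the population survival flattens at the cure fraction 1 - theta.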
Abstract:
The log-Burr XII regression model for grouped survival data is evaluated in the presence of many ties. The methodology for grouped survival data is based on life tables, where the times are grouped into k intervals, and discrete lifetime regression models are fitted to the data. The model parameters are estimated by maximum likelihood and jackknife methods. To detect influential observations in the proposed model, diagnostic measures based on case deletion, so-called global influence, and influence measures based on small perturbations in the data or in the model, referred to as local influence, are used. In addition to these measures, the total local influence and influential estimates are also used. We conduct Monte Carlo simulation studies to assess the finite-sample behavior of the maximum likelihood estimators of the proposed model for grouped survival data. A real data set is analyzed using the proposed regression model for grouped data.
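A minimal sketch of the jackknife (leave-one-out) variance estimate mentioned above, for a generic estimator; for the sample mean it reproduces the classical s²/n formula exactly:

```python
# Jackknife variance estimation: recompute the estimator with each
# observation deleted in turn, then combine the leave-one-out replicates:
#   var_jack = (n - 1) / n * sum_i (theta_(i) - theta_bar)**2

def jackknife_variance(data, estimator):
    n = len(data)
    replicates = [estimator(data[:i] + data[i + 1:]) for i in range(n)]
    rep_mean = sum(replicates) / n
    return (n - 1) / n * sum((r - rep_mean) ** 2 for r in replicates)

def mean(xs):
    return sum(xs) / len(xs)
```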
Abstract:
Creating high-quality quad meshes from triangulated surfaces is a highly nontrivial task that necessitates consideration of various application-specific quality metrics. In our work, we follow the premise that automatic reconstruction techniques may not generate outputs meeting all the subjective quality expectations of the user. Instead, we put the user at the center of the process by providing a flexible, interactive approach to quadrangulation design. By combining scalar field topology and combinatorial connectivity techniques, we present a new framework, following a coarse-to-fine design philosophy, which allows explicit control of the subjective quality criteria on the output quad mesh at interactive rates. Our quadrangulation framework uses the new notion of Reeb atlas editing to define, with a small number of interactions, a coarse quadrangulation of the model, capturing the main features of the shape, with user-prescribed extraordinary vertices and alignment. Fine-grain tuning is easily achieved with the notion of connectivity texturing, which allows additional extraordinary vertex specification and explicit feature alignment to capture the high-frequency geometries. Experiments demonstrate the interactivity and flexibility of our approach, as well as its ability to generate quad meshes of arbitrary resolution with high-quality statistics, while meeting the user's own subjective requirements.
Abstract:
Many recent survival studies propose modeling data with a cure fraction, i.e., data in which part of the population is not susceptible to the event of interest. This event may occur more than once for the same individual (recurrent event). We then have a scenario of recurrent event data in the presence of a cure fraction, which may appear in various areas such as oncology, finance, and industry, among others. This paper proposes a multiple time scale survival model to analyze recurrent events with a cure fraction. The objective is to analyze the efficiency of certain interventions, in terms of covariates and censoring, so that the studied event will not happen again. All estimates were obtained using a sampling-based approach, which allows prior information to be incorporated with lower computational effort. Simulations were carried out based on a clinical scenario in order to observe some frequentist properties of the estimation procedure for small and moderate sample sizes. An application to a well-known set of real mammary tumor data is provided.
Abstract:
The main goal of this article is to consider influence assessment in models with error-prone observations and measurement-error variances changing across observations. The techniques make it possible to identify potentially influential elements and to quantify the effects of perturbations in these elements on results of interest. The approach is illustrated with data from the WHO MONICA Project on cardiovascular disease.
Abstract:
In this work, an analysis of scientific bibliographic productivity was carried out using the Faculdade de Filosofia e Ciencias, Universidade Estadual Paulista (FFC-UNESP) as an example. It is composed of nine departments which together offer nine undergraduate courses: 1) Archival Science, 2) Library Science, 3) Speech Therapy, 4) Pedagogy, 5) International Relations, 6) Physiotherapy, 7) Occupational Therapy, 8) Philosophy, 9) Social Sciences, as well as six graduate programs leading to M.S. and Ph.D. degrees. Moreover, the courses of FFC-UNESP represent a typical academic organization in Brazil and Latin America, so the institution can be taken as a model for analyzing other Brazilian research institutions. Using data retrieved from the Lattes Platform database (Curriculum Lattes), we quantified the scientific productivity of professors at UNESP. Bibliometric evaluations using the Curriculum Lattes (CL) showed that professors published papers in journals that are not indexed by ISI and SCOPUS. This analysis was made using: 1) the total number of papers (indexed in the Curriculum Lattes database), 2) the number of papers indexed by the Thomson ISI Web of Science and SCOPUS databases, and 3) the Hirsch index (h-index) from ISI and SCOPUS. Bibliometric evaluation of the departments showed a better performance of the Political Science and Economics Department compared to the other departments in relation to the total number of papers (indexed in the Curriculum Lattes database). We also analyzed the academic advising (Master's and Ph.D. theses) of the nine departments of FFC-UNESP. The Administration and School Supervision Department presented more academic advising (concluded and current) than the other departments.
Abstract:
This paper presents a performance analysis of a baseband multiple-input single-output ultra-wideband system over scenarios CM1 and CM3 of the IEEE 802.15.3a channel model, incorporating four different pre-distortion schemes: time reversal, zero-forcing pre-equaliser, constrained least squares pre-equaliser, and minimum mean square error pre-equaliser. For the third case, a simple solution based on the steepest-descent (gradient) algorithm is adopted and compared with theoretical results. The channel estimates at the transmitter are assumed to be truncated and noisy. Results show that the constrained least squares algorithm offers a good trade-off between intersymbol interference reduction and signal-to-noise ratio preservation, providing performance comparable to the minimum mean square error method but with lower computational complexity. Copyright (C) 2011 John Wiley & Sons, Ltd.
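A toy sketch of the zero-forcing pre-equalisation idea on a short discrete channel, using a least-squares solve of the convolution matrix; the channel taps below are illustrative, not drawn from the IEEE 802.15.3a model, and the truncation and noise effects studied in the paper are not reproduced:

```python
import numpy as np

# Zero-forcing pre-equalisation sketch: choose a pre-filter g so that the
# combined response conv(h, g) approximates a delayed unit impulse, pushing
# intersymbol interference toward zero at the receiver.

def zf_prefilter(h, g_len, delay):
    """Least-squares pre-filter g of length g_len so that conv(h, g) ~ delta(delay)."""
    h = np.asarray(h, dtype=float)
    out_len = len(h) + g_len - 1
    # Convolution matrix H such that H @ g == np.convolve(h, g)
    H = np.zeros((out_len, g_len))
    for j in range(g_len):
        H[j:j + len(h), j] = h
    target = np.zeros(out_len)
    target[delay] = 1.0
    g, *_ = np.linalg.lstsq(H, target, rcond=None)
    return g

h = [1.0, 0.5, 0.2]           # illustrative multipath taps (assumed)
g = zf_prefilter(h, g_len=30, delay=15)
combined = np.convolve(h, g)  # close to a unit impulse at index 15
```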
Abstract:
This paper introduces a skewed log-Birnbaum-Saunders regression model based on the skewed sinh-normal distribution proposed by Leiva et al. [A skewed sinh-normal distribution and its properties and application to air pollution, Comm. Statist. Theory Methods 39 (2010), pp. 426-443]. Some influence methods, such as local influence and generalized leverage, are presented. Additionally, we derive the normal curvatures of local influence under some perturbation schemes. An empirical application to a real data set is presented in order to illustrate the usefulness of the proposed model.
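For reference, a sketch of the symmetric sinh-normal density underlying log-Birnbaum-Saunders models; the skewed variant of Leiva et al. adds a skewness parameter that is not implemented here:

```python
import math

# Sinh-normal density SHN(alpha, mu, sigma): if
#   Z = (2 / alpha) * sinh((Y - mu) / sigma)
# is standard normal, then
#   f(y) = phi(z(y)) * (2 / (alpha * sigma)) * cosh((y - mu) / sigma),
# where phi is the standard normal pdf. This is the symmetric base case;
# the paper builds on a *skewed* sinh-normal extension.

def sinh_normal_pdf(y, alpha, mu, sigma):
    z = (2.0 / alpha) * math.sinh((y - mu) / sigma)
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return phi * (2.0 / (alpha * sigma)) * math.cosh((y - mu) / sigma)
```

The density is symmetric about mu, which is exactly what the skewed extension relaxes.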