907 results for Weighted regression
Abstract:
Lateral ventricular volumes based on segmented brain MR images can be significantly underestimated if partial volume effects are not considered. This is because a group of voxels in the neighborhood of the lateral ventricles is often misclassified as gray matter due to partial volume effects. These voxels are actually a mixture of ventricular cerebrospinal fluid and white matter, and therefore a portion of them should be included as part of the lateral ventricular structure. In this note, we describe an automated method for the measurement of lateral ventricular volumes on segmented brain MR images. Image segmentation was carried out using a combination of intensity correction and thresholding. The method features a procedure for handling misclassified voxels in the neighborhood of the lateral ventricles. A detailed analysis showed that lateral ventricular volumes could be underestimated by 10 to 30%, depending on the size of the lateral ventricular structure, if misclassified voxels were not included. The method was validated through comparison with averaged manually traced volumes. Finally, the merit of the method is demonstrated in the evaluation of the rate of lateral ventricular enlargement. (C) 2001 Elsevier Science Inc. All rights reserved.
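A minimal sketch of the correction idea described above, assuming a labelled segmentation and a mask of the already-identified ventricular voxels: gray-matter-labelled voxels that border the ventricles are treated as partial-volume mixtures, and a fraction of them is added to the ventricular volume. The label codes, the 50% inclusion fraction and the helper names are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the partial-volume correction idea: voxels labelled
# as gray matter that border the segmented lateral ventricles are treated as
# CSF/white-matter mixtures and partially counted as ventricular volume.
import numpy as np
from scipy import ndimage

CSF, GM, WM = 1, 2, 3  # assumed label codes in the segmented image

def ventricular_volume(seg, ventricle_mask, voxel_volume_mm3, pv_fraction=0.5):
    """Return ventricular volume with a simple partial-volume correction."""
    # Voxels already labelled as ventricular CSF
    core = ventricle_mask & (seg == CSF)

    # Gray-matter voxels touching the ventricle are likely partial-volume
    # mixtures of CSF and white matter and would otherwise be missed.
    border = ndimage.binary_dilation(core) & (seg == GM)

    n_equivalent = core.sum() + pv_fraction * border.sum()
    return n_equivalent * voxel_volume_mm3
```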
Abstract:
In this paper, we consider testing for additivity in a class of nonparametric stochastic regression models. Two test statistics are constructed and their asymptotic distributions are established. We also conduct a small sample study for one of the test statistics through a simulated example. (C) 2002 Elsevier Science (USA).
Abstract:
We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, in which the component-baseline hazard functions are left completely unspecified. Estimation is based on maximum likelihood using the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than the fully parametric approach and is comparable in efficiency in the estimation of the parameters at all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright (C) 2003 John Wiley & Sons, Ltd.
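As a rough illustration of the E-step/M-step mechanics behind such a mixture fit, the sketch below implements a deliberately simplified, fully parametric two-component competing-risks mixture (constant mixing proportions, exponential component hazards, no covariates). The paper's actual method is semi-parametric, uses logistic and proportional-hazards components, and is fitted by ECM; this toy version only shows how censored observations are fractionally allocated to failure types.

```python
# Simplified EM for a two-component competing-risks mixture with exponential
# component hazards and no covariates (an assumption; not the paper's model).
import numpy as np

def em_competing_risks(t, delta, cause, n_iter=200):
    """t: event/censoring times, delta: 1 if a failure was observed,
    cause: failure type 1 or 2 (ignored when censored); all numpy arrays."""
    pi, lam = np.array([0.5, 0.5]), np.array([1.0, 2.0])
    for _ in range(n_iter):
        # E-step: posterior type probabilities (soft labels for censored cases)
        w = np.zeros((len(t), 2))
        for j in range(2):
            w[:, j] = pi[j] * np.exp(-lam[j] * t)            # pi_j * S_j(t)
        w /= w.sum(axis=1, keepdims=True)
        for j in range(2):                                    # observed failures: hard labels
            w[delta == 1, j] = (cause[delta == 1] == j + 1).astype(float)
        # M-step: closed-form updates for mixing proportions and exponential rates
        pi = w.mean(axis=0)
        lam = (w * delta[:, None]).sum(axis=0) / (w * t[:, None]).sum(axis=0)
    return pi, lam
```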
Abstract:
Medium voltage (MV) load diagrams were defined based on a knowledge discovery in databases process. Clustering techniques were used to support the agents of the electric power retail markets in obtaining specific knowledge of their customers' consumption habits. Each customer class resulting from the clustering operation is represented by its load diagram. The Two-step clustering algorithm and the WEACS approach based on evidence accumulation (EAC) were applied to electricity consumption data from a utility client database in order to form the customer classes and to find a set of representative consumption patterns. The WEACS approach is a clustering ensemble combination approach that uses subsampling and weights the partitions differently in the co-association matrix. As a complementary step to the WEACS approach, all the final data partitions produced by the different variations of the method are combined, and the Ward link algorithm is used to obtain the final data partition. Experimental results showed that the WEACS approach led to better accuracy than many other clustering approaches. In this paper, the WEACS approach separates the customer population better than the Two-step clustering algorithm.
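The core of the evidence-accumulation idea can be sketched as follows: several base partitions vote into a co-association matrix, optionally with per-partition weights, and the final partition is extracted with a Ward-linkage hierarchical clustering. Subsampling and the specific weighting scheme used by WEACS are omitted; the uniform default weights and the k-means base clusterers below are assumptions for illustration only.

```python
# Sketch of evidence accumulation with optional per-partition weights.
import numpy as np
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def weighted_coassociation(X, n_partitions=20, k_range=(2, 8), weights=None, seed=0):
    """Build a (weighted) co-association matrix from k-means base partitions."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    C = np.zeros((n, n))
    weights = np.ones(n_partitions) if weights is None else np.asarray(weights)
    for p in range(n_partitions):
        k = int(rng.integers(k_range[0], k_range[1] + 1))
        labels = KMeans(n_clusters=k, n_init=5, random_state=p).fit_predict(X)
        C += weights[p] * (labels[:, None] == labels[None, :])   # co-membership vote
    return C / weights.sum()

def final_partition(C, n_clusters):
    """Extract the consensus partition from the co-association matrix."""
    D = 1.0 - C                      # turn similarity into distance
    np.fill_diagonal(D, 0.0)
    Z = linkage(squareform(D, checks=False), method='ward')
    return fcluster(Z, t=n_clusters, criterion='maxclust')
```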
Abstract:
With electricity market liberalization, distribution and retail companies are looking for better market strategies based on adequate information about the consumption patterns of their electricity consumers. A fair insight into consumer behavior will permit the definition of specific contract aspects based on the different consumption patterns. In order to form the different consumer classes and find a set of representative consumption patterns, we use electricity consumption data from a utility client database and two approaches: the Two-step clustering algorithm and the WEACS approach based on evidence accumulation (EAC) for combining partitions in a clustering ensemble. While EAC uses a voting mechanism to produce a co-association matrix based on the pairwise associations obtained from N partitions, where each partition has equal weight in the combination process, the WEACS approach uses subsampling and weights the partitions differently. As a complementary step to the WEACS approach, we combine the partitions obtained in the WEACS approach with the ALL clustering ensemble construction method and use the Ward link algorithm to obtain the final data partition. The characterization of the obtained consumer clusters was performed using the C5.0 classification algorithm. Experimental results showed that the WEACS approach leads to better results than many other clustering approaches.
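For the cluster-characterization step, the abstract uses the C5.0 algorithm, which is a commercial tool; as a stand-in, the sketch below uses scikit-learn's CART-style decision tree to produce readable rules that describe each consumer cluster in terms of (hypothetical) load-curve features.

```python
# Characterize consumer clusters with a shallow decision tree (a stand-in for
# C5.0, which is not freely available). Feature names are hypothetical.
from sklearn.tree import DecisionTreeClassifier, export_text

def characterize_clusters(features, cluster_labels, feature_names, max_depth=4):
    """Fit a shallow tree that predicts the cluster label from consumption features."""
    tree = DecisionTreeClassifier(max_depth=max_depth, random_state=0)
    tree.fit(features, cluster_labels)
    # Human-readable rules, e.g. "night_ratio <= 0.2 and peak_hour > 18 -> class 3"
    return export_text(tree, feature_names=list(feature_names))
```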
Abstract:
Long-term contractual decisions are the basis of efficient risk management. However, such decisions have to be supported by a robust price forecasting methodology. This paper reports a different approach to long-term price forecasting that tries to answer that need. Making use of regression models, the proposed methodology has as its main objective to find the maximum and minimum Market Clearing Price (MCP) for a specific programming period, with a desired confidence level α. Due to the complexity of the problem, the meta-heuristic Particle Swarm Optimization (PSO) was used to find the best regression parameters, and the results were compared with those obtained using a Genetic Algorithm (GA). To validate these models, results from realistic data are presented and discussed in detail.
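One way the bound-finding regression could look in code, under assumptions the abstract does not spell out: a linear price model fitted by a simple particle swarm search with a pinball (quantile) loss, so that fits at the α and 1-α quantiles act as lower and upper MCP envelopes. The loss, the model form and the PSO settings are all illustrative choices, not the paper's.

```python
# Toy PSO search for linear regression parameters under a quantile loss.
import numpy as np

def pinball_loss(y, y_hat, q):
    e = y - y_hat
    return np.mean(np.maximum(q * e, (q - 1) * e))

def pso_fit(X, y, q, n_particles=30, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    dim = X.shape[1] + 1                      # weights + intercept
    pos = rng.normal(size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.full(n_particles, np.inf)
    gbest, gbest_cost = pos[0].copy(), np.inf
    for _ in range(n_iter):
        for i in range(n_particles):
            y_hat = X @ pos[i, :-1] + pos[i, -1]
            cost = pinball_loss(y, y_hat, q)
            if cost < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i].copy(), cost
            if cost < gbest_cost:
                gbest, gbest_cost = pos[i].copy(), cost
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
    return gbest  # e.g. q=0.05 for a lower envelope, q=0.95 for an upper envelope
```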
Abstract:
OBJECTIVE: To determine the prevalence and severity of occlusal problems in populations at the ages of deciduous and permanent dentition, and to carry out a meta-analysis to estimate the weighted odds ratio for occlusal problems comparing both groups. METHODS: Data from a probabilistic sample (n=985) of schoolchildren aged 5 and 12 from an epidemiological study in the municipality of São Paulo, Brazil, were analyzed using multiple logistic regression (MLR). Results of cross-sectional studies published in the last 70 years were examined in the meta-analysis. RESULTS: The prevalence of occlusal problems increased from 49.0% (95% CI: 47.4%-50.6%) in the deciduous dentition to 71.3% (95% CI: 70.3%-72.3%) in the permanent dentition (p<0.001). Dentition was the only variable significantly associated with the severity of malocclusion (OR=1.87; 95% CI: 1.43-2.45; p<0.001). The variables sex, type of school and ethnic group were not significant. The meta-analysis showed a weighted OR of 1.95 (95% CI: 1.91-1.98) when the permanent dentition period was compared with the deciduous and mixed dentition periods. CONCLUSIONS: In planning oral health services, some activities are indicated to reduce the proportion of moderate/severe malocclusion to levels that are socially more acceptable and economically sustainable.
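A minimal sketch of how a weighted (pooled) odds ratio such as the one reported above can be computed from per-study 2x2 tables, assuming fixed-effect inverse-variance weighting on the log-OR scale; the abstract does not state which weighting scheme was actually used.

```python
# Fixed-effect inverse-variance pooling of per-study odds ratios (an assumed
# scheme for illustration; studies with zero cells would need a correction).
import numpy as np

def pooled_or(tables):
    """tables: iterable of (a, b, c, d) = exposed-case, exposed-control,
    unexposed-case, unexposed-control counts for each study."""
    log_or, weights = [], []
    for a, b, c, d in tables:
        lor = np.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d        # Woolf variance of log-OR
        log_or.append(lor)
        weights.append(1 / var)                    # inverse-variance weight
    log_or, weights = np.array(log_or), np.array(weights)
    pooled = np.sum(weights * log_or) / np.sum(weights)
    se = np.sqrt(1 / np.sum(weights))
    ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
    return np.exp(pooled), ci
```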
Abstract:
Susceptibility-weighted imaging (SWI) is a relatively new contrast in MR imaging. Previous studies have found an effect of caffeine on the contrast generated in SWI images. The present study investigates the effect of caffeine on the contrast-to-noise ratio (CNR) in SWI.
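For reference, one common definition of CNR, given two regions of interest (for example a vein and adjacent tissue on the SWI image) and a background noise estimate, is sketched below; whether the study used exactly this definition is an assumption.

```python
# One common CNR definition: difference of ROI means over noise standard deviation.
import numpy as np

def cnr(roi_a, roi_b, noise_region):
    """CNR = |mean(A) - mean(B)| / std(noise)."""
    return abs(np.mean(roi_a) - np.mean(roi_b)) / np.std(noise_region)
```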
Abstract:
This paper introduces an approach to the problem of generating a sequence of jobs that minimizes the total weighted tardiness for a set of jobs to be processed on a single machine. An Ant Colony System based algorithm is validated with benchmark problems available in the OR-Library. The results obtained were compared with the best available results and were found to be near-optimal. The computational results allow conclusions to be drawn about the efficiency and effectiveness of the algorithm.
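The objective minimized by the Ant Colony System is the total weighted tardiness of a job sequence on a single machine; the sketch below only evaluates a candidate sequence and does not reproduce the ACS search itself (pheromone trails, state-transition rule). The arrays p, d and w in the usage comment are hypothetical processing times, due dates and weights.

```python
# Evaluate the total weighted tardiness of one candidate job sequence.
def total_weighted_tardiness(sequence, processing, due, weight):
    """sequence: job indices in processing order."""
    t, total = 0, 0
    for j in sequence:
        t += processing[j]                       # completion time of job j
        total += weight[j] * max(0, t - due[j])  # weighted tardiness of job j
    return total

# Example: total_weighted_tardiness([2, 0, 1], p, d, w) scores one candidate
# sequence; ACS ants would build such sequences guided by pheromone and
# heuristic information, keeping the best one found.
```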
Abstract:
Master's degree in Electrical Engineering – Electrical Power Systems
Abstract:
Environmental pollution continues to be an emerging field of study, as there are thousands of anthropogenic compounds mixed in the environment whose possible mechanisms of toxicity and physiological outcomes are of great concern. Developing methods to assess these compounds at trace levels and to prioritize their screening in order to support regulatory efforts is, therefore, very important. A methodology based on solid-phase extraction followed by derivatization and gas chromatography-mass spectrometry (GC-MS) analysis was developed for the assessment of four endocrine disrupting compounds (EDCs) in water matrices: bisphenol A, estrone, 17β-estradiol and 17α-ethinylestradiol. The study was performed simultaneously by two different laboratories in order to evaluate the robustness of the method and to increase the quality control over its application in routine analysis. Validation was done according to the International Conference on Harmonisation recommendations and other international guidelines with specifications for the GC-MS methodology. Matrix-induced chromatographic response enhancement was avoided by using matrix-standard calibration solutions, and heteroscedasticity was overcome by applying a weighted least squares linear regression model. Consistent evaluation of key analytical parameters such as extraction efficiency, sensitivity, specificity, linearity, limits of detection and quantification, precision, accuracy and robustness was done in accordance with the standards established for acceptance. Finally, the application of the optimized method to the assessment of the selected analytes in environmental samples suggested that it is an expeditious methodology for routine analysis of EDC residues in water matrices.
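The weighted least squares calibration step can be sketched as follows: ordinary least squares assumes constant variance, whereas calibration data spanning a wide concentration range are usually heteroscedastic, so each point is weighted by the inverse of an assumed variance model. The 1/x² weighting factor used here is a common empirical choice, not necessarily the one selected in the study.

```python
# Weighted least squares calibration line with an assumed 1/x^2 weighting.
import numpy as np

def wls_calibration(conc, response, weights=None):
    """Return (slope, intercept) of a weighted least squares calibration line;
    conc and response are numpy arrays of standard concentrations and signals."""
    w = 1.0 / conc**2 if weights is None else weights
    W = np.sum(w)
    x_bar = np.sum(w * conc) / W
    y_bar = np.sum(w * response) / W
    slope = np.sum(w * (conc - x_bar) * (response - y_bar)) / np.sum(w * (conc - x_bar) ** 2)
    intercept = y_bar - slope * x_bar
    return slope, intercept
```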
Abstract:
We propose a 3D-2D image registration method that relates image features of 2D projection images to the transformation parameters of the 3D image by nonlinear regression. The method is compared with a conventional registration method based on iterative optimization. For evaluation, simulated X-ray images (digitally reconstructed radiographs, DRRs) were generated from coronary artery tree models derived from 3D CTA scans. Registration of nine vessel trees was performed, and alignment quality was measured by the mean target registration error (mTRE). The regression approach was shown to be slightly less accurate, but much more robust, than the iterative optimization approach.
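A hedged sketch of the regression-based registration idea: a nonlinear regressor maps features extracted from a 2D projection to the transformation parameters, and alignment quality is scored with the mean target registration error over a set of target points. The choice of a random-forest regressor and of 4x4 rigid transforms is an assumption; the abstract does not specify the regressor.

```python
# Illustrative regression-based pose estimation and mTRE evaluation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def train_pose_regressor(projection_features, pose_params):
    """projection_features: (n_samples, n_features); pose_params: (n_samples, 6)."""
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(projection_features, pose_params)
    return model

def mtre(points, transform_est, transform_true):
    """Mean target registration error between two rigid transforms (4x4
    homogeneous matrices) applied to the same 3D target points."""
    pts_h = np.c_[points, np.ones(len(points))]
    diff = (pts_h @ transform_est.T - pts_h @ transform_true.T)[:, :3]
    return np.mean(np.linalg.norm(diff, axis=1))
```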