85 results for CLASSIFICATION PLANS


Relevance: 20.00%

Abstract:

The usefulness of motor subtypes of delirium is unclear due to inconsistency in subtyping methods and a lack of validation against objective measures of activity. The activity of 40 patients was measured over 24 h with a discrete accelerometer-based activity monitor. The continuous wavelet transform (CWT), with various mother wavelets, was applied to accelerometry data from three randomly selected patients with DSM-IV delirium who were readily divided into hyperactive, hypoactive, and mixed motor subtypes. A classification tree used the periods of overall movement, as measured by the discrete accelerometer-based monitor, as the determining factors for classifying these delirious patients. The data used to create the classification tree were the minimum, maximum, standard deviation, and number of coefficient values generated over a range of scales by the CWT. The classification tree was subsequently used to assign motor subtypes to the remaining patients, showing how delirium subtypes can be categorized in relation to overall motoric behavior. Motor subtypes of delirium defined by observed ward behavior differ in electronically measured activity levels.
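As a rough sketch of the pipeline this abstract describes, the Python fragment below extracts CWT summary features from accelerometry-like signals and fits a classification tree. The Morlet mother wavelet, the 1-32 scale range, and the synthetic signals are illustrative assumptions, not details taken from the study:

```python
# Illustrative sketch (not the study's code): CWT summary features from
# accelerometry feeding a classification tree. Assumes pywt and scikit-learn.
import numpy as np
import pywt
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def cwt_features(signal, scales=np.arange(1, 33), wavelet="morl"):
    """Summarise CWT coefficients over a range of scales, mirroring the
    abstract's min/max/std/count feature set (exact definitions assumed)."""
    coefs, _ = pywt.cwt(signal, scales, wavelet)
    return np.array([coefs.min(), coefs.max(), coefs.std(), coefs.size])

# Synthetic stand-ins for the 24 h traces of the three labelled patients.
hyper = rng.normal(0, 3.0, 2048)   # large, frequent movement
hypo  = rng.normal(0, 0.3, 2048)   # sparse, low-amplitude movement
mixed = np.concatenate([rng.normal(0, 3.0, 1024), rng.normal(0, 0.3, 1024)])

X = np.array([cwt_features(s) for s in (hyper, hypo, mixed)])
y = ["hyperactive", "hypoactive", "mixed"]

tree = DecisionTreeClassifier(random_state=0).fit(X, y)
# Classify an unseen trace, as the study did for the remaining patients.
print(tree.predict([cwt_features(rng.normal(0, 2.5, 2048))]))
```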

Relevance: 20.00%

Abstract:

Objective: This paper presents a detailed study of fractal-based methods for texture characterization of mammographic mass lesions and architectural distortion. The purpose of this study is to explore the use of fractal and lacunarity analysis for the characterization and classification of both tumor lesions and normal breast parenchyma in mammography. Materials and methods: We conducted comparative evaluations of five popular fractal dimension estimation methods for characterizing the texture of mass lesions and architectural distortion. We applied the concept of lacunarity to describe the spatial distribution of pixel intensities in mammographic images. These methods were tested on a set of 57 breast masses and 60 normal breast parenchyma regions (dataset1), and on another set of 19 architectural distortions and 41 normal breast parenchyma regions (dataset2). Support vector machines (SVM) were used as the pattern classification method for tumor classification. Results: Experimental results showed that the fractal dimension of regions of interest (ROIs) depicting mass lesions and architectural distortion was statistically significantly lower than that of normal breast parenchyma for all five methods. Receiver operating characteristic (ROC) analysis showed that the fractional Brownian motion (FBM) method generated the highest area under the ROC curve of the five methods on both datasets (Az = 0.839 for dataset1 and 0.828 for dataset2). Lacunarity analysis showed that ROIs depicting mass lesions and architectural distortion had higher lacunarity than ROIs depicting normal breast parenchyma. The combination of FBM fractal dimension and lacunarity yielded higher Az values (0.903 and 0.875, respectively) than either feature alone on both datasets. Applying the SVM improved the performance of the fractal-based features in differentiating tumor lesions from normal breast parenchyma, generating higher Az values. Conclusion: The FBM texture model is the most appropriate model for characterizing mammographic images because its self-affinity assumption is the better approximation. Lacunarity is an effective counterpart to the fractal dimension in texture feature extraction from mammographic images. The classification results obtained in this work suggest that the SVM is an effective method with great potential for classification in mammographic image analysis.
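A hedged sketch of the feature pipeline: a simple box-counting estimator stands in for the paper's five fractal-dimension methods (the FBM estimator itself is not reproduced), a non-overlapping-box lacunarity replaces the gliding-box variant, and the toy ROIs, threshold, and SVM settings are all assumptions:

```python
# Sketch: fractal dimension + lacunarity features feeding an SVM.
import numpy as np
from sklearn.svm import SVC

def box_counting_dimension(binary_img, sizes=(2, 4, 8, 16, 32)):
    """Slope of log(box count) vs log(1/box size) on a binarised ROI."""
    h, w = binary_img.shape
    counts = []
    for s in sizes:
        grid = binary_img[: h - h % s, : w - w % s].reshape(h // s, s, w // s, s)
        counts.append((grid.sum(axis=(1, 3)) > 0).sum())  # non-empty boxes
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

def lacunarity(img, box=8):
    """Second moment over squared first moment of box masses (+1);
    non-overlapping boxes here, unlike the usual gliding-box scheme."""
    h, w = img.shape
    masses = [img[i:i + box, j:j + box].sum()
              for i in range(0, h - box + 1, box)
              for j in range(0, w - box + 1, box)]
    m = np.asarray(masses, dtype=float)
    return m.var() / m.mean() ** 2 + 1.0

# Toy ROIs: 'mass' ROIs sparser after thresholding, 'normal' ROIs denser.
rng = np.random.default_rng(1)
rois = [rng.random((64, 64)) ** p for p in (3, 3, 1, 1)]
labels = [1, 1, 0, 0]
X = [[box_counting_dimension(r > 0.5), lacunarity(r)] for r in rois]
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict([[box_counting_dimension(rois[0] > 0.5), lacunarity(rois[0])]]))
```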

Relevance: 20.00%

Abstract:

A fundamental principle in practical nonlinear data modeling is the parsimonious principle of constructing the minimal model that explains the training data well. Leave-one-out (LOO) cross validation is often used to estimate generalization error when choosing among different network architectures (M. Stone, "Cross-validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 111-147, 1974). Based on the minimization of LOO criteria (the mean square of the LOO errors for regression and the LOO misclassification rate for classification), we present two backward elimination algorithms as model post-processing procedures for regression and classification problems. The proposed backward elimination procedures exploit an orthogonalization procedure to ensure orthogonality between the subspace spanned by the pruned model and the deleted regressor. It is then shown that the LOO criteria used in both algorithms can be calculated via analytic recursive formulas, derived in this contribution, without actually splitting the estimation data set, thereby reducing the computational expense. Compared with most other model construction methods, the proposed algorithms are advantageous in several respects: (i) there are no tuning parameters to be optimized using an extra validation data set; (ii) the procedure is fully automatic, without an additional stopping criterion; and (iii) model structure selection is based directly on model generalization performance. Illustrative examples on regression and classification demonstrate that the proposed algorithms are viable post-processing methods for pruning a model to gain extra sparsity and improved generalization.
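The analytic shortcut the abstract alludes to is easiest to see in the linear-in-parameters case, where the LOO residuals follow from a single fit via the hat matrix: e_loo_i = e_i / (1 - h_ii), with H = X (X^T X)^{-1} X^T. The sketch below uses that identity inside a naive backward elimination; the paper's orthogonalised recursive formulas, which avoid even this amount of refitting, are not reproduced:

```python
# Analytic LOO mean-squared error for a linear-in-parameters model,
# used to drive a simple (non-orthogonalised) backward elimination.
import numpy as np

def loo_mse(X, y):
    H = X @ np.linalg.solve(X.T @ X, X.T)       # hat matrix
    e = y - H @ y                                # ordinary residuals
    return np.mean((e / (1.0 - np.diag(H))) ** 2)  # LOO residuals, one fit

def backward_eliminate(X, y):
    """Greedily drop the regressor whose removal lowers LOO MSE most."""
    cols = list(range(X.shape[1]))
    best = loo_mse(X, y)
    improved = True
    while improved and len(cols) > 1:
        improved = False
        for c in list(cols):
            trial = [k for k in cols if k != c]
            score = loo_mse(X[:, trial], y)
            if score < best:
                best, cols, improved = score, trial, True
                break
    return cols, best

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 6))
y = X[:, 0] - 2 * X[:, 2] + 0.1 * rng.normal(size=100)  # only cols 0, 2 matter
print(backward_eliminate(X, y))
```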

Relevance: 20.00%

Abstract:

We introduce a classification-based approach to finding occluding texture boundaries. The classifier is composed of a set of weak learners that operate on discriminative image-intensity features defined on small patches and fast to compute. A database designed to simulate digitized occluding contours of textured objects in natural images is used to train the weak learners. The trained classifier's score is then used to obtain a probabilistic model for the presence of texture transitions, which can readily be used for line-search texture boundary detection in the direction normal to an initial boundary estimate. The method is fast and therefore suitable for real-time and interactive applications. It works as a robust estimator, requiring only a ribbon-like search region, and can handle complex texture structures without requiring a large number of observations. We demonstrate results in the contexts of both interactive 2D delineation and fast 3D tracking, and compare the method's performance with other existing methods for line-search boundary detection.
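A minimal sketch of the idea, assuming boosted decision stumps as the weak learners and three hand-picked patch-intensity features; the paper's actual feature set, training database, and classifier details are not reproduced (the `estimator` keyword requires scikit-learn >= 1.2):

```python
# Sketch: boosted weak learners over cheap patch-intensity features,
# scoring boundary vs. non-boundary patches.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

def patch_features(patch):
    """Fast intensity features: overall mean/std and the left-right gap."""
    half = patch.shape[1] // 2
    left, right = patch[:, :half], patch[:, half:]
    return [patch.mean(), patch.std(), abs(left.mean() - right.mean())]

rng = np.random.default_rng(3)
# Simulated training patches: a texture transition across the patch vs. none.
boundary = [np.hstack([rng.normal(0.2, 0.05, (8, 4)),
                       rng.normal(0.8, 0.05, (8, 4))]) for _ in range(200)]
uniform = [rng.normal(0.5, 0.05, (8, 8)) for _ in range(200)]

X = np.array([patch_features(p) for p in boundary + uniform])
y = np.array([1] * 200 + [0] * 200)

clf = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=1),
                         n_estimators=50, random_state=0).fit(X, y)
# predict_proba yields the probabilistic score a line search would maximise
# along the normal to an initial boundary estimate.
print(clf.predict_proba(X[:2])[:, 1])
```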

Relevance: 20.00%

Abstract:

Stochastic discrimination (SD) depends on a discriminant function for classification. In this paper, an improved SD is introduced to reduce the error rate of the standard SD in the context of a two-class classification problem. The learning procedure of the improved SD consists of two stages. First, a standard SD with a shorter learning period is carried out to identify an important space in which all the misclassified samples are located. The standard SD is then modified by (1) restricting sampling to the important space, and (2) introducing a new discriminant function for samples in the important space. It is shown by mathematical derivation that the new discriminant function has the same mean as that of the standard SD but a smaller variance for samples in the important space. It is also shown analytically that the smaller the variance of the discriminant function, the lower the error rate of the classifier. Consequently, the proposed improved SD improves on the standard SD by achieving higher classification accuracy. Illustrative examples demonstrate the effectiveness of the proposed improved SD.
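For readers unfamiliar with SD, the toy implementation of the standard method below shows the role of the discriminant function: random axis-aligned boxes act as weak regions, and a point's score averages the class-1 enrichment of the boxes covering it. The improved SD's second stage (resampling the important space with a modified, lower-variance discriminant) is omitted here:

```python
# Minimal sketch of standard stochastic discrimination for two classes.
import numpy as np

rng = np.random.default_rng(4)

def random_boxes(X, n_boxes=500):
    """Random axis-aligned boxes inside the data bounding box."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    a = rng.uniform(lo, hi, size=(n_boxes, X.shape[1]))
    b = rng.uniform(lo, hi, size=(n_boxes, X.shape[1]))
    return np.minimum(a, b), np.maximum(a, b)

def covers(lo, hi, X):
    """(n_boxes, n_samples) mask: which boxes contain which points."""
    return ((X[None] >= lo[:, None]) & (X[None] <= hi[:, None])).all(-1)

X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(+1, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

lo, hi = random_boxes(X)
C = covers(lo, hi, X)
keep = C.sum(1) > 0                    # discard boxes covering no samples
lo, hi, C = lo[keep], hi[keep], C[keep]
frac1 = (C & (y == 1)).sum(1) / C.sum(1)   # class-1 enrichment per box

def discriminant(x):
    m = covers(lo, hi, x[None])[:, 0]
    return frac1[m].mean() if m.any() else 0.5

pred = np.array([discriminant(x) > 0.5 for x in X])
print("training accuracy:", (pred == y).mean())
```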

Relevance: 20.00%

Abstract:

Climate change is expected to reduce water availability in England, potentially necessitating adaptive action by the water industry to maintain supplies. As part of Ofwat's fifth Periodic Review (PR09), water companies recently released their draft Water Resources Management Plans, setting out how each company intends to maintain the balance between the supply and demand for water over the next 25 years, following Environment Agency guidelines. This paper reviews these plans to determine company estimates of the impact of climate change on water supply relative to other resource pressures. The approaches adopted for incorporating that impact in the plans, and the proposed management solutions, are also identified. Climate change impacts on individual resource zones range from no reduction in deployable output to reductions greater than 50% over the planning period. The estimated national aggregated loss of deployable output under a "core" climate scenario is ~520 Ml/d (3% of deployable output) by 2034/35, equivalent to the supply of one entire water company (South West Water). Climate change is the largest single driver of change in water supplies over the planning period, and over half of its impact is concentrated in southern England. In extreme cases, climate change uncertainty is of the same magnitude as the change under the core scenario (up to a loss of ~475 Ml/d). Of the 68 resource zones with available data, 44 are estimated to face a climate change impact; in 35 of these climate change is the greatest pressure, although in 10 zones sustainability reductions have a greater impact. Of the overall increase in downward pressure on the supply-demand balance over the planning period, ~56% is accounted for by increased demand (620 Ml/d) and ~37% by supply-side climate change (407 Ml/d). Climate change impacts act cumulatively with other supply-reducing components, increasing the national pressure on the supply-demand balance. Whilst the magnitude of climate change appears to justify its explicit consideration, adaptation options are rarely planned solely in response to climate change; rather, they form part of a suite of options intended to provide a supply resilient to a range of pressures, including significant demand-side pressures. Supply-side measures still tend to be considered more reliable by water companies than demand-side measures.

Relevance: 20.00%

Abstract:

Airborne Light Detection and Ranging (LIDAR) provides accurate height information for objects on the Earth's surface, which has made LIDAR increasingly popular in terrain and land surveying. In particular, LIDAR data offer vital and significant features for land-cover classification, an important task in many application domains. In this paper, an unsupervised approach based on an improved fuzzy Markov random field (FMRF) model is developed, by which the LIDAR data, its co-registered images acquired by optical sensors (aerial color and near-infrared images), and other derived features are fused effectively to improve the accuracy of land-cover classification from the LIDAR system. In the proposed FMRF model-based approach, spatial contextual information is exploited by modeling the image as a Markov random field (MRF), and fuzzy logic is introduced to reduce the errors caused by hard classification. Moreover, a Lagrange multiplier (LM) algorithm is employed to compute a maximum a posteriori (MAP) estimate for the classification. The experimental results show that fusing the height data with optical images is particularly well suited to land-cover classification. The proposed approach works very well for classification from airborne LIDAR data fused with its co-registered optical images, improving the average accuracy to 88.9%.
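A heavily simplified stand-in for the fusion scheme, assuming Gaussian per-pixel likelihoods on stacked height and optical channels and a mean-field relaxation in place of the paper's Lagrange-multiplier MAP solver; the class means and smoothness weight are arbitrary:

```python
# Sketch: fuzzy MRF-style relaxation of per-pixel class memberships
# over fused LIDAR-height + optical channels.
import numpy as np

rng = np.random.default_rng(5)
H, W, K = 32, 32, 2                        # image size, number of classes

# Toy fused data: channel 0 = LIDAR height, channels 1-2 = optical bands.
truth = (np.arange(W)[None, :] > W // 2).astype(int) * np.ones((H, 1), int)
data = np.stack([truth + rng.normal(0, 0.6, (H, W)) for _ in range(3)], -1)

means = np.array([[0.0] * 3, [1.0] * 3])   # assumed class means per channel
logp = -((data[..., None, :] - means) ** 2).sum(-1)   # (H, W, K) log-likelihood
u = np.exp(logp - logp.max(-1, keepdims=True))
u /= u.sum(-1, keepdims=True)              # initial fuzzy memberships

beta = 2.0                                 # smoothness weight (assumed)
for _ in range(10):
    nb = np.zeros_like(u)                  # 4-neighbour membership sum
    nb[1:] += u[:-1]; nb[:-1] += u[1:]
    nb[:, 1:] += u[:, :-1]; nb[:, :-1] += u[:, 1:]
    u = np.exp(logp + beta * nb / 4)       # likelihood + spatial prior
    u /= u.sum(-1, keepdims=True)

print("accuracy:", (u.argmax(-1) == truth).mean())
```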

Relevance: 20.00%

Abstract:

A unified view of the interfacial instability in a model of aluminium reduction cells in the presence of a uniform, vertical background magnetic field is presented. The classification of instability modes is based on the asymptotic theory for high values of the parameter β, which characterises the ratio of the Lorentz force, based on the disturbance current, to gravity. It is shown that the spectrum of the travelling waves consists of two parts, independent of the horizontal cross-section of the cell: highly unstable wall modes and stable or weakly unstable centre (Sele's) modes. The wall modes, with the disturbance of the interface localised at the sidewalls of the cell, dominate the dynamics of the instability. Sele's modes are characterised by a disturbance distributed over the whole horizontal extent of the cell. As β increases these modes are stabilised by the field.
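For orientation, β in this literature is typically a dimensionless group of the schematic form β ~ I B / (Δρ g h₁ h₂), where I is the total cell current, B the vertical field strength, Δρ the electrolyte-aluminium density difference, g gravity, and h₁, h₂ the depths of the two fluid layers; the exact definition varies between authors and is not given in the abstract. The high-β asymptotics then correspond to electromagnetic forcing dominating the gravitational restoring force.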

Relevance: 20.00%

Abstract:

Real-world text classification tasks often suffer from poor class structure, with many overlapping classes and blurred boundaries. Training data pooled from multiple sources tend to be inconsistent and to contain erroneous labelling, leading to poor performance of standard text classifiers. The classification of health service products into specialized procurement classes is used to examine and quantify the extent of these problems. A novel method is presented to analyse the labelled data by selectively merging classes where there is not enough information for the classifier to distinguish them. Initial results show that the method can identify the most problematic classes, which can then be used either as a focus for improving the training data or as candidates for merging to increase confidence in the classifier's predictions.
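A sketch of one way to realise the merging idea, assuming a bag-of-words baseline classifier and a symmetric confusion-rate threshold; the paper's actual selection criterion may differ:

```python
# Sketch: flag procurement classes a baseline classifier persistently
# confuses on held-out predictions as candidates for merging.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict
from sklearn.naive_bayes import MultinomialNB

# Toy product descriptions and procurement class labels (illustrative only).
texts = ["latex gloves small", "latex gloves large", "nitrile exam gloves",
         "surgical mask type ii", "ffp3 respirator mask", "latex free face mask",
         "syringe 5ml luer", "syringe 10ml luer lock", "needle 21g for syringe"] * 10
labels = [0, 0, 0, 1, 1, 1, 2, 2, 2] * 10

X = CountVectorizer().fit_transform(texts)
pred = cross_val_predict(MultinomialNB(), X, labels, cv=5)
cm = confusion_matrix(labels, pred).astype(float)
cm /= cm.sum(axis=1, keepdims=True)        # row-normalised confusion rates

merge_threshold = 0.2                       # assumed
pairs = [(i, j) for i in range(cm.shape[0]) for j in range(i + 1, cm.shape[0])
         if cm[i, j] + cm[j, i] > merge_threshold]
print("candidate merges:", pairs)
```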

Relevance: 20.00%

Abstract:

This paper is concerned with the use of a genetic algorithm to select financial ratios for corporate distress classification models. For this purpose, the fitness value associated with a set of ratios is made to reflect the requirements of maximizing the amount of information available to the model and minimizing the collinearity between the model inputs. A case study involving 60 failed and continuing British firms over the period 1997-2000 is used for illustration. The classification model based on the ratios selected by the genetic algorithm compares favorably with a model employing ratios commonly found in the financial distress literature.
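A toy version of the selection scheme (truncation selection and mutation only, no crossover), with fitness rewarding correlation with the distress label and penalising pairwise collinearity among the chosen ratios; the penalty weight and GA settings are assumptions, not the paper's values:

```python
# Sketch: genetic search over binary masks selecting financial ratios.
import numpy as np

rng = np.random.default_rng(6)
n, p = 120, 12
R = rng.normal(size=(n, p))                    # candidate financial ratios
R[:, 1] = R[:, 0] + 0.1 * rng.normal(size=n)   # ratio 1 nearly collinear with 0
y = (R[:, 0] + R[:, 4] + rng.normal(size=n) > 0).astype(float)  # distress label

def fitness(mask):
    idx = np.flatnonzero(mask)
    if len(idx) < 2:
        return -1.0
    info = np.mean([abs(np.corrcoef(R[:, j], y)[0, 1]) for j in idx])
    cc = np.corrcoef(R[:, idx].T)
    collin = np.mean(np.abs(cc[np.triu_indices(len(idx), k=1)]))
    return info - 0.5 * collin                 # 0.5 penalty weight: assumed

pop = rng.integers(0, 2, size=(30, p))
for _ in range(40):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]            # keep the 10 fittest
    kids = parents[rng.integers(0, 10, 30)].copy()     # clone, then mutate
    flip = rng.random(kids.shape) < 0.05
    kids[flip] ^= 1
    pop = kids

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected ratios:", np.flatnonzero(best))
```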