163 results for User classification


Relevance: 20.00%

Abstract:

We consider a fully complex-valued radial basis function (RBF) network for regression and classification applications. For regression problems, the locally regularised orthogonal least squares (LROLS) algorithm aided with the D-optimality experimental design, originally derived for constructing parsimonious real-valued RBF models, is extended to the fully complex-valued RBF (CVRBF) network. Like its real-valued counterpart, the proposed algorithm aims to achieve maximised model robustness and sparsity by combining two effective and complementary approaches. The LROLS algorithm alone is capable of producing a very parsimonious model with excellent generalisation performance, while the D-optimality design criterion further enhances model efficiency and robustness. By specifying an appropriate weighting for the D-optimality cost in the combined model selection criterion, the entire model construction procedure becomes automatic. An example of identifying a complex-valued nonlinear channel is used to illustrate the regression application of the proposed fully CVRBF network. The proposed fully CVRBF network is also applied to four-class classification problems that are typically encountered in communication systems. A complex-valued orthogonal forward selection algorithm based on the multi-class Fisher ratio of class separability measure is derived for constructing sparse CVRBF classifiers that generalise well. The effectiveness of the proposed algorithm is demonstrated using the example of nonlinear beamforming for multiple-antenna-aided communication systems that employ the complex-valued quadrature phase-shift keying modulation scheme. (C) 2007 Elsevier B.V. All rights reserved.
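As a minimal illustration of the model class in this abstract, the following sketch evaluates a fully complex-valued Gaussian RBF network, in which both the centres and the output weights are complex-valued. This is only the forward model, not the authors' LROLS construction procedure:

```python
import numpy as np

def cvrbf_predict(x, centres, widths, weights):
    """Evaluate a fully complex-valued RBF network at input vector x.

    centres and weights are complex-valued; each hidden node responds
    to the squared modulus of the distance between x and its centre,
    so the basis responses are real and the complex weights produce a
    complex-valued output.
    """
    # |x - c_k|^2 for each centre (complex difference, real-valued norm)
    d2 = np.array([np.sum(np.abs(x - c) ** 2) for c in centres])
    phi = np.exp(-d2 / widths)      # real-valued Gaussian basis responses
    return np.dot(weights, phi)     # complex output via complex weights
```

With the input placed exactly on a single centre, the basis response is 1 and the output equals the corresponding complex weight, which makes the forward model easy to check by hand.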

Relevance: 20.00%

Abstract:

An orthogonal forward selection (OFS) algorithm based on leave-one-out (LOO) criteria is proposed for the construction of radial basis function (RBF) networks with tunable nodes. Each stage of the construction process determines an RBF node, namely, its center vector and diagonal covariance matrix, by minimizing the LOO statistics. For regression applications, the LOO criterion is chosen to be the LOO mean-square error, while the LOO misclassification rate is adopted in two-class classification applications. This OFS-LOO algorithm is computationally efficient, and it is capable of constructing parsimonious RBF networks that generalize well. Moreover, the proposed algorithm is fully automatic, and the user does not need to specify a termination criterion for the construction process. The effectiveness of the proposed RBF network construction procedure is demonstrated using examples taken from both regression and classification applications.
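For a fixed linear-in-the-parameters model such as an RBF network, the LOO mean-square error can be computed without refitting via the standard identity e_i(loo) = e_i / (1 - h_ii), where h_ii is the i-th leverage. The sketch below illustrates that identity only; the paper's algorithm uses an efficient recursive orthogonal formulation instead:

```python
import numpy as np

def loo_mse(Phi, y):
    """LOO mean-square error of a least-squares fit y ~ Phi @ w,
    computed without refitting via e_i / (1 - h_ii)."""
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    resid = y - Phi @ w
    # leverages: diagonal of the hat matrix Phi (Phi^T Phi)^-1 Phi^T
    h = np.diag(Phi @ np.linalg.pinv(Phi.T @ Phi) @ Phi.T)
    loo_err = resid / (1.0 - h)
    return np.mean(loo_err ** 2)
```

Because the identity is exact for ordinary least squares, the result agrees with a brute-force loop that refits the model n times, at a fraction of the cost.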

Relevance: 20.00%

Abstract:

The SPE taxonomy of evolving software systems, first proposed by Lehman in 1980, is re-examined in this work. The primary concepts of software evolution are related to generic theories of evolution, particularly Dawkins' concept of a replicator, to the hermeneutic tradition in philosophy and to Kuhn's concept of paradigm. These concepts provide the foundations that are needed for understanding the phenomenon of software evolution and for refining the definitions of the SPE categories. In particular, this work argues that a software system should be defined as type P if its controlling stakeholders have made a strategic decision that the system must comply with a single paradigm in its representation of domain knowledge. The proposed refinement of SPE is expected to provide a more productive basis for developing testable hypotheses and models about possible differences in the evolution of E- and P-type systems than is provided by the original scheme. Copyright (C) 2005 John Wiley & Sons, Ltd.

Relevance: 20.00%

Abstract:

This paper is concerned with the selection of inputs for classification models based on ratios of measured quantities. For this purpose, all possible ratios are built from the quantities involved and variable selection techniques are used to choose a convenient subset of ratios. In this context, two selection techniques are proposed: one based on a pre-selection procedure and another based on a genetic algorithm. In an example involving the financial distress prediction of companies, the models obtained from ratios selected by the proposed techniques compare favorably to a model using ratios usually found in the financial distress literature.
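The ratio-building step can be sketched as follows; the correlation-based pre-selection shown here is a hypothetical stand-in for the paper's procedure, and the function and column names are illustrative:

```python
import numpy as np
from itertools import permutations

def build_ratios(X, names):
    """Form every pairwise ratio x_i / x_j (i != j) from the measured
    quantities in the columns of X."""
    cols, labels = [], []
    for i, j in permutations(range(X.shape[1]), 2):
        cols.append(X[:, i] / X[:, j])
        labels.append(f"{names[i]}/{names[j]}")
    return np.column_stack(cols), labels

def preselect(R, y, k):
    """Keep the k ratios most correlated (in absolute value) with the
    target -- a simple stand-in for a proper selection technique."""
    corr = [abs(np.corrcoef(R[:, j], y)[0, 1]) for j in range(R.shape[1])]
    return np.argsort(corr)[::-1][:k]
```

With p measured quantities this produces p(p-1) candidate ratios, which is why a selection step (pre-selection or a genetic algorithm, as in the paper) is needed before model fitting.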

Relevance: 20.00%

Abstract:

In rapid scan Fourier transform spectrometry, we show that the noise in the wavelet coefficients resulting from the filter bank decomposition of the complex insertion loss function is linearly related to the noise power in the sample interferogram by a noise amplification factor. By maximizing an objective function composed of the power of the wavelet coefficients divided by the noise amplification factor, optimal feature extraction in the wavelet domain is performed. The performance of a classifier based on the output of a filter bank is shown to be considerably better than that of a Euclidean distance classifier in the original spectral domain. An optimization procedure results in a further improvement of the wavelet classifier. The procedure is suitable for enhancing the contrast or classifying spectra acquired by either continuous wave or THz transient spectrometers as well as for increasing the dynamic range of THz imaging systems. (C) 2003 Optical Society of America.
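A simplified single-filter analogue of the criterion described above: for white input noise, the noise power at the output of an FIR filter scales with the sum of the squared taps, which plays the role of the noise amplification factor. The single-channel reduction and the function names are assumptions for illustration:

```python
import numpy as np

def noise_amplification(h):
    """For white input noise of power sigma^2, the output noise power of
    an FIR filter with taps h is sigma^2 * sum(h**2); the sum of squared
    taps is therefore the noise amplification factor."""
    return float(np.sum(np.asarray(h) ** 2))

def feature_objective(signal, h):
    """Power of the filtered coefficients divided by the filter's noise
    amplification factor (a single-channel analogue of the criterion)."""
    coeffs = np.convolve(signal, h, mode="valid")
    return float(np.sum(coeffs ** 2)) / noise_amplification(h)
```

A useful sanity check: scaling the filter taps scales the numerator and denominator identically, so the objective depends only on the filter's shape, not its gain.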

Relevance: 20.00%

Abstract:

The usefulness of motor subtypes of delirium is unclear due to inconsistency in subtyping methods and a lack of validation with objective measures of activity. The activity of 40 patients was measured over 24 h with a commercial accelerometer-based activity monitor. Accelerometry data from patients with DSM-IV delirium that were readily divided into hyperactive, hypoactive and mixed motor subtypes were used to create classification trees that were subsequently applied to the remaining cohort to define motoric subtypes. The classification trees used the periods of sitting/lying, standing and stepping and the number of postural transitions, as measured by the activity monitor, as determining factors from which to classify the delirious cohort. The use of a classification system shows how delirium subtypes can be categorised in relation to overall activity and postural changes, which was one of the most discriminating measures examined. The classification system was also implemented to successfully define other patient motoric subtypes. Motor subtypes of delirium defined by observed ward behaviour differ in electronically measured activity levels. Crown Copyright (C) 2009 Published by Elsevier B.V. All rights reserved.
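A toy version of such a classification tree might look as follows. The split variables mirror those named above (posture periods and postural transitions), but the threshold values are illustrative, not those learned from the cohort:

```python
def classify_motor_subtype(lying_h, standing_h, stepping_h, transitions):
    """Assign a delirium motor subtype from 24-h accelerometry summaries
    using a hand-written two-level decision tree. All thresholds are
    hypothetical, chosen only to illustrate the tree structure."""
    active_h = standing_h + stepping_h
    if transitions > 40 and active_h > 6:
        return "hyperactive"       # many postural changes, much upright time
    if transitions < 15 and lying_h > 20:
        return "hypoactive"        # few postural changes, mostly lying
    return "mixed"                 # everything in between
```

A learned tree would pick the split variables and thresholds from data; the point here is only that postural transitions and posture durations are the kinds of features such a tree splits on.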

Relevance: 20.00%

Abstract:

The usefulness of motor subtypes of delirium is unclear due to inconsistency in subtyping methods and a lack of validation with objective measures of activity. The activity of 40 patients was measured over 24 h with a discrete accelerometer-based activity monitor. The continuous wavelet transform (CWT) with various mother wavelets was applied to accelerometry data from three randomly selected patients with DSM-IV delirium that were readily divided into hyperactive, hypoactive, and mixed motor subtypes. A classification tree used the periods of overall movement, as measured by the discrete accelerometer-based monitor, as determining factors from which to classify these delirious patients. The data used to create the classification tree were based upon the minimum, maximum, standard deviation, and number of coefficient values generated over a range of scales by the CWT. The classification tree was subsequently used to define the remaining motoric subtypes. The use of a classification system shows how delirium subtypes can be categorized in relation to overall motoric behavior. The classification system was also implemented to successfully define other patient motoric subtypes. Motor subtypes of delirium defined by observed ward behavior differ in electronically measured activity levels.
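The CWT feature extraction described above can be sketched as follows, using a Ricker (Mexican-hat) mother wavelet for illustration (the study compared various mother wavelets, and the threshold used for the coefficient count is an assumption):

```python
import numpy as np

def ricker(points, a):
    """Ricker (Mexican-hat) mother wavelet sampled at `points` positions
    for scale parameter a."""
    t = np.arange(points) - (points - 1) / 2.0
    A = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
    return A * (1 - (t / a) ** 2) * np.exp(-(t ** 2) / (2 * a ** 2))

def cwt_features(x, scales, thresh=0.0):
    """Summarise CWT coefficients over a range of scales by the four
    statistics named in the abstract: minimum, maximum, standard
    deviation, and a count of coefficients (here, those exceeding a
    magnitude threshold)."""
    rows = [np.convolve(x, ricker(min(10 * a, len(x)), a), mode="same")
            for a in scales]
    C = np.vstack(rows)             # scales x time coefficient matrix
    return {"min": C.min(), "max": C.max(),
            "std": C.std(), "count": int((np.abs(C) > thresh).sum())}
```

The resulting four-number summary per recording is what a classification tree would then split on.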

Relevance: 20.00%

Abstract:

Objective: This paper presents a detailed study of fractal-based methods for texture characterization of mammographic mass lesions and architectural distortion. The purpose of this study is to explore the use of fractal and lacunarity analysis for the characterization and classification of both tumor lesions and normal breast parenchyma in mammography. Materials and methods: We conducted comparative evaluations of five popular fractal dimension estimation methods for the characterization of the texture of mass lesions and architectural distortion. We applied the concept of lacunarity to the description of the spatial distribution of the pixel intensities in mammographic images. These methods were tested with a set of 57 breast masses and 60 normal breast parenchyma (dataset1), and with another set of 19 architectural distortions and 41 normal breast parenchyma (dataset2). Support vector machines (SVM) were used as a pattern classification method for tumor classification. Results: Experimental results showed that the fractal dimension of regions of interest (ROIs) depicting mass lesions and architectural distortion was statistically significantly lower than that of normal breast parenchyma for all five methods. Receiver operating characteristic (ROC) analysis showed that the fractional Brownian motion (FBM) method generated the highest area under the ROC curve (Az = 0.839 for dataset1 and 0.828 for dataset2) of the five methods on both datasets. Lacunarity analysis showed that ROIs depicting mass lesions and architectural distortion had higher lacunarities than ROIs depicting normal breast parenchyma. The combination of FBM fractal dimension and lacunarity yielded higher Az values (0.903 and 0.875, respectively) than either feature alone on both datasets. The application of the SVM improved the performance of the fractal-based features in differentiating tumor lesions from normal breast parenchyma, generating a higher Az value. Conclusion: The FBM texture model is the most appropriate model for characterizing mammographic images, as its self-affinity assumption is a better approximation. Lacunarity is an effective counterpart measure to the fractal dimension in texture feature extraction from mammographic images. The classification results obtained in this work suggest that the SVM is an effective method with great potential for classification in mammographic image analysis.
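The two texture measures can be sketched on binary images as follows: a box-counting fractal dimension (one of several estimators the paper compares; FBM estimation itself is more involved) and gliding-box lacunarity:

```python
import numpy as np

def box_counting_dimension(img, sizes=(1, 2, 4, 8)):
    """Estimate the fractal dimension of a binary image by box counting:
    the slope of log N(s) against log(1/s), where N(s) is the number of
    s-by-s boxes containing at least one foreground pixel."""
    counts = []
    for s in sizes:
        h, w = img.shape
        grid = img[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(grid.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

def lacunarity(img, box):
    """Gliding-box lacunarity: second moment of the box masses divided
    by the squared first moment. Uniform textures give 1; gappy
    textures give larger values."""
    h, w = img.shape
    masses = np.asarray([img[i:i + box, j:j + box].sum()
                         for i in range(h - box + 1)
                         for j in range(w - box + 1)], dtype=float)
    return float(np.mean(masses ** 2) / np.mean(masses) ** 2)
```

A completely filled image is the degenerate check: its box-counting dimension is 2 and its lacunarity is exactly 1, matching the intuition that lesion ROIs (rougher, gappier) score lower dimension and higher lacunarity than that limit in opposite directions.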

Relevance: 20.00%

Abstract:

A fundamental principle in practical nonlinear data modeling is the parsimonious principle of constructing the minimal model that explains the training data well. Leave-one-out (LOO) cross validation is often used to estimate generalization errors by choosing amongst different network architectures (M. Stone, "Cross-validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 117-147, 1974). Based upon the minimization of LOO criteria, either the mean squares of the LOO errors or the LOO misclassification rate, we present two backward elimination algorithms as model post-processing procedures for regression and classification problems, respectively. The proposed backward elimination procedures exploit an orthogonalization procedure to maintain orthogonality between the subspace spanned by the pruned model and the deleted regressor. Subsequently, it is shown that the LOO criteria used in both algorithms can be calculated via analytic recursive formulae, as derived in this contribution, without actually splitting the estimation data set, so as to reduce computational expense. Compared to most other model construction methods, the proposed algorithms are advantageous in several aspects: (i) there are no tuning parameters to be optimized through an extra validation data set; (ii) the procedure is fully automatic, without an additional stopping criterion; and (iii) the model structure selection is directly based on model generalization performance. Illustrative examples on regression and classification demonstrate that the proposed algorithms are viable post-processing methods to prune a model to gain extra sparsity and improved generalization.
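A direct-refit sketch of the backward elimination idea, using the leverage-based LOO identity e_i/(1 - h_ii) in place of the paper's recursive orthogonal update (so each candidate deletion is scored by an actual refit; inefficient but transparent):

```python
import numpy as np

def loo_mse(Phi, y):
    """LOO mean-square error of the least-squares fit, via e_i/(1-h_ii)."""
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    h = np.diag(Phi @ np.linalg.pinv(Phi.T @ Phi) @ Phi.T)
    return np.mean(((y - Phi @ w) / (1.0 - h)) ** 2)

def backward_eliminate(Phi, y):
    """Greedily delete the regressor whose removal reduces the LOO MSE;
    stop when no single deletion improves it. The stopping rule is
    automatic: generalization performance itself decides."""
    keep = list(range(Phi.shape[1]))
    best = loo_mse(Phi, y)
    improved = True
    while improved and len(keep) > 1:
        improved = False
        for j in list(keep):
            trial = [k for k in keep if k != j]
            score = loo_mse(Phi[:, trial], y)
            if score < best:
                best, keep, improved = score, trial, True
                break
    return keep, best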

Relevance: 20.00%

Abstract:

We introduce a classification-based approach to finding occluding texture boundaries. The classifier is composed of a set of weak learners, which operate on image intensity discriminative features that are defined on small patches and are fast to compute. A database that is designed to simulate digitized occluding contours of textured objects in natural images is used to train the weak learners. The trained classifier score is then used to obtain a probabilistic model for the presence of texture transitions, which can readily be used for line search texture boundary detection in the direction normal to an initial boundary estimate. This method is fast and therefore suitable for real-time and interactive applications. It works as a robust estimator, which requires a ribbon-like search region and can handle complex texture structures without requiring a large number of observations. We demonstrate results both in the context of interactive 2D delineation and of fast 3D tracking and compare its performance with other existing methods for line search boundary detection.
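The final stage described above can be sketched as follows; the logistic calibration of the classifier score and the argmax decision rule are illustrative stand-ins for the paper's probabilistic model:

```python
import numpy as np

def boundary_probability(score):
    """Map a summed weak-learner score to a probability of a texture
    transition with a logistic link (a common calibration choice,
    assumed here rather than taken from the paper)."""
    return 1.0 / (1.0 + np.exp(-np.asarray(score, dtype=float)))

def line_search_boundary(scores_along_normal, positions):
    """Given classifier scores sampled at positions along the normal to
    an initial boundary estimate, return the position with the highest
    transition probability and that probability."""
    p = boundary_probability(scores_along_normal)
    i = int(np.argmax(p))
    return positions[i], float(p[i])
```

Because the search is a 1-D scan over a handful of samples per normal, it is cheap enough for the interactive and real-time tracking settings the abstract mentions.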

Relevance: 20.00%

Abstract:

The general packet radio service (GPRS) has been developed to allow packet data to be transported efficiently over an existing circuit-switched radio network, such as GSM. The main applications of GPRS are in transporting Internet protocol (IP) datagrams from web servers (for telemetry or for mobile Internet browsers). Four GPRS baseband coding schemes are defined to offer a trade-off between requested data rates and propagation channel conditions. However, data rates of the order of >100 kbit/s are only achievable if the simplest coding scheme (CS-4) is used, which offers little error detection and correction (EDC) (requiring excellent SNR), and if the receiver hardware is capable of full duplex, which is not currently available in the consumer market. A simple EDC scheme to improve the GPRS block error rate (BLER) performance is presented, particularly for CS-4; however, gains in other coding schemes are also seen. For every GPRS radio block that is corrected by the EDC scheme, the block does not need to be retransmitted, releasing bandwidth in the channel and improving the user's application data rate. As GPRS requires intensive processing in the baseband, a viable field programmable gate array (FPGA) solution is presented in this paper.
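The bandwidth argument can be made concrete with a simple retransmission model (an assumption; the abstract does not specify the retransmission protocol): correcting a fraction of the errored blocks lowers the residual BLER and raises the effective user data rate:

```python
def effective_data_rate(raw_rate_kbps, bler, corrected_fraction):
    """Effective user data rate under block retransmission.

    Each block in error must be resent unless the EDC scheme corrects
    it. With block error rate `bler` and a fraction `corrected_fraction`
    of errored blocks repaired by EDC, the residual BLER is
    bler * (1 - corrected_fraction), and throughput scales by
    (1 - residual) under a simple geometric-retransmission model.
    """
    residual = bler * (1.0 - corrected_fraction)
    return raw_rate_kbps * (1.0 - residual)
```

For example, at a nominal 100 kbit/s and 20% BLER (both numbers illustrative, not CS-4 measurements), correcting half of the errored blocks lifts the effective rate from 80 to 90 kbit/s.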

Relevance: 20.00%

Abstract:

The General Packet Radio Service (GPRS) was developed to allow packet data to be transported efficiently over an existing circuit-switched radio network. The main applications for GPRS are in transporting IP datagrams from the user's mobile Internet browser to and from the Internet, or in telemetry equipment. A simple Error Detection and Correction (EDC) scheme to improve the GPRS Block Error Rate (BLER) performance is presented, particularly for coding scheme 4 (CS-4); however, gains in other coding schemes are also seen. For every GPRS radio block that is corrected by the EDC scheme, the block does not need to be retransmitted, releasing bandwidth in the channel and improving throughput and the user's application data rate. As GPRS requires intensive processing in the baseband, a viable hardware solution for a GPRS BLER co-processor, implemented in a Field Programmable Gate Array (FPGA), is presented in this paper.

Relevance: 20.00%

Abstract:

The knowledge economy offers a broad and diverse community of information systems users the opportunity to efficiently gain information and know-how for improving qualifications and enhancing productivity in the workplace. Such demand will continue, and users will frequently require optimised and personalised information content. The advancement of information technology and the wide dissemination of information support individual users in constructing new knowledge from their experience in a real-world context. However, designing personalised information provision is challenging because users' requirements and information provision specifications are complex to represent, and existing methods cannot effectively support this analysis process. This paper presents a mechanism that can holistically facilitate customisation of information provision based on individual users' goals, level of knowledge, and cognitive style preferences. An ontology model with embedded norms represents the domain knowledge of information provision in a specific context, where users' needs can be articulated and represented in a user profile. These formal requirements can then be transformed into information provision specifications, which are used to discover suitable information content from repositories and to organise the selected content pedagogically to meet the users' needs. The method is adaptive, enabling an appropriate response to changes in users' requirements during the process of acquiring knowledge and skills.
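A minimal sketch of the content-matching step, with hypothetical field names mirroring the three personalisation axes described above (goals, knowledge level, cognitive style); the real mechanism is ontology- and norm-driven, not a flat filter:

```python
def select_content(profile, repository):
    """Filter a content repository against a user profile: items must
    match the user's goal and not exceed their knowledge level, and are
    ordered so items matching the preferred cognitive style come first.
    All field names ('goal', 'level', 'style') are illustrative."""
    candidates = [c for c in repository
                  if c["goal"] == profile["goal"]
                  and c["level"] <= profile["level"]]
    # stable sort: style matches (key False) sort before mismatches (True)
    return sorted(candidates, key=lambda c: c["style"] != profile["style"])
```

The hard filters (goal, level) and soft preference (style) correspond to the distinction the paper draws between requirements that must be met and preferences that shape the pedagogical ordering of the selected content.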

Relevance: 20.00%

Abstract:

This article describes work undertaken by the VERA project to investigate how archaeologists work with information technology (IT) on excavation sites. We used a diary study to research the usual patterns of behaviour of archaeologists digging the Silchester Roman town site during the summer of 2007. Although recording had previously been undertaken using pen and paper, during the 2007 season part of the dig was dedicated to trials of IT, and archaeologists used digital pens and paper and Nokia N800 handheld PDAs to record their work. The goal of the trial was to see whether it was possible to record data from the dig while still on site, rather than waiting until after the excavation to enter it into the Integrated Archaeological Database (IADB), and to determine whether the archaeologists found the new technology helpful. The digital pens were a success; the N800s, however, were not, given the extreme conditions on site. Our findings confirmed that it is important for technology to fit in well with the work being undertaken, rather than being used for its own sake, and to respect established workflows. We also found that the quality of the data being entered was a recurrent concern, as was the reliability of the infrastructure and equipment.