83 results for Minimal-complexity classifier

in CentAUR: Central Archive at the University of Reading - UK


Relevance: 30.00%

Abstract:

Many kernel classifier construction algorithms adopt classification accuracy as the performance metric in model evaluation. Moreover, equal weighting is often applied to each data sample in parameter estimation. These modeling practices become problematic if the data sets are imbalanced. We present a kernel classifier construction algorithm using orthogonal forward selection (OFS) to optimize model generalization for imbalanced two-class data sets. The algorithm is based on a new regularized orthogonal weighted least squares (ROWLS) estimator and a model selection criterion of maximal leave-one-out area under the curve (LOO-AUC) of the receiver operating characteristic (ROC). It is shown that, owing to the orthogonalization procedure, the LOO-AUC can be calculated via an analytic formula based on the new ROWLS parameter estimator, without actually splitting the estimation data set. The proposed algorithm achieves minimal computational expense via a set of forward recursive updating formulas for searching model terms with maximal incremental LOO-AUC value. Numerical examples demonstrate the efficacy of the algorithm.
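
A rough Python sketch of the core idea, greedy forward selection of kernel terms scored by leave-one-out AUC, is given below. It is not the paper's algorithm: the LOO predictions are obtained by explicit refitting rather than the analytic ROWLS-based formula, and the Gaussian kernel width, ridge term and class-balancing weights are illustrative assumptions.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def gaussian_kernel(X, centres, width=1.0):
    # Gaussian kernel responses of every sample in X to each candidate centre.
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def forward_select_loo_auc(X, y, n_terms=5, width=1.0, ridge=1e-3):
    """Greedy forward selection of kernel centres scored by leave-one-out AUC.
    LOO predictions are obtained by explicit refitting; the paper instead uses
    an analytic LOO-AUC derived from the orthogonalised ROWLS estimator."""
    n = len(X)
    # class-balancing sample weights (the "weighted" aspect of the estimator)
    w = np.where(y == 1, 0.5 / max(y.sum(), 1), 0.5 / max(n - y.sum(), 1))
    selected, remaining = [], list(range(n))
    for _ in range(n_terms):
        best_auc, best_idx = -np.inf, None
        for idx in remaining:
            P = gaussian_kernel(X, X[selected + [idx]], width)
            loo = np.empty(n)
            for i in range(n):                      # leave sample i out, refit
                m = np.arange(n) != i
                A = P[m].T @ (w[m, None] * P[m]) + ridge * np.eye(P.shape[1])
                theta = np.linalg.solve(A, P[m].T @ (w[m] * y[m]))
                loo[i] = P[i] @ theta
            auc = roc_auc_score(y, loo)
            if auc > best_auc:
                best_auc, best_idx = auc, idx
        selected.append(best_idx)
        remaining.remove(best_idx)
    return selected, best_auc
```

With a feature matrix X and a binary 0/1 NumPy label vector y, `forward_select_loo_auc(X, y)` returns the indices of the selected kernel centres and the final LOO-AUC of the resulting model.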

Relevance: 30.00%

Abstract:

We develop a particle swarm optimisation (PSO) aided orthogonal forward regression (OFR) approach for constructing radial basis function (RBF) classifiers with tunable nodes. At each stage of the OFR construction process, the centre vector and diagonal covariance matrix of one RBF node are determined efficiently by minimising the leave-one-out (LOO) misclassification rate (MR) using a PSO algorithm. Compared with the state-of-the-art regularisation-assisted orthogonal least squares algorithm based on the LOO MR for selecting fixed-node RBF classifiers, the proposed PSO-aided OFR algorithm for constructing tunable-node RBF classifiers offers significant advantages in terms of better generalisation performance and smaller model size, as well as lower computational complexity in the classifier construction process. Moreover, the proposed algorithm does not have any hyperparameter that requires costly tuning based on cross-validation.
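
As a loose illustration of tuning one RBF node's centre and diagonal widths with PSO against a leave-one-out misclassification criterion, the Python sketch below appends a candidate node to already-selected basis columns `P_prev` and searches its parameters with a basic global-best PSO. The swarm size, inertia and acceleration coefficients, and ridge term are assumptions for illustration, and the LOO rate is computed by explicit refitting rather than the closed-form updates an OFR implementation would use.

```python
import numpy as np

def rbf_col(X, params):
    # One Gaussian node with a tunable centre and diagonal widths.
    d = X.shape[1]
    centre, log_width = params[:d], params[d:]
    return np.exp(-0.5 * (((X - centre) / np.exp(log_width)) ** 2).sum(axis=1))

def loo_misclassification(P, y, ridge=1e-3):
    # LOO misclassification rate of a ridge-regularised linear output layer.
    n, errs = len(y), 0
    t = np.where(y == 1, 1.0, -1.0)
    for i in range(n):
        m = np.arange(n) != i
        A = P[m].T @ P[m] + ridge * np.eye(P.shape[1])
        w = np.linalg.solve(A, P[m].T @ t[m])
        errs += (np.sign(P[i] @ w) != t[i])
    return errs / n

def pso_tune_node(X, y, P_prev, n_particles=20, n_iter=30, seed=0):
    """Toy global-best PSO search for one RBF node (centre + log widths) appended
    to the already-selected columns P_prev, minimising the LOO misclassification rate."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    pos = np.hstack([X[rng.integers(len(X), size=n_particles)],
                     rng.normal(0, 0.5, (n_particles, d))])
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.full(n_particles, np.inf)
    gbest, gbest_cost = pos[0].copy(), np.inf
    for _ in range(n_iter):
        for k in range(n_particles):
            P = np.column_stack([P_prev, rbf_col(X, pos[k])])
            cost = loo_misclassification(P, y)
            if cost < pbest_cost[k]:
                pbest[k], pbest_cost[k] = pos[k].copy(), cost
            if cost < gbest_cost:
                gbest, gbest_cost = pos[k].copy(), cost
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
    return gbest, gbest_cost
```

For the first OFR stage, `P_prev` can simply be a bias column such as `np.ones((len(X), 1))`; subsequent stages would append each tuned node's response column before calling `pso_tune_node` again.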

Relevance: 20.00%

Abstract:

With the rapid development in technology over recent years, construction, in common with many areas of industry, has become increasingly complex. It would, therefore, seem important to develop and extend the understanding of complexity so that industry in general, and the construction industry in particular, can work with greater accuracy and efficiency to provide clients with a better service. This paper aims to generate a definition of complexity and a method for its measurement in order to assess its influence upon the accuracy of the quantity surveying profession in UK new-build office construction. Quantitative data came from an analysis of twenty projects of varying size and value, and qualitative data came from interviews with professional quantity surveyors. The findings highlight the difficulty in defining and measuring project complexity. The correlation between accuracy and complexity was not straightforward, being subject to many extraneous variables, particularly the impact of project size. Further research is required to develop a better measure of complexity, in order to improve the response of quantity surveyors so that an appropriate level of effort can be applied to individual projects, permitting greater accuracy and enabling better resource planning within the profession.

Relevance: 20.00%

Abstract:

Runoff, sediment, total phosphorus and total dissolved phosphorus losses in overland flow were measured for two years on unbounded plots cropped with wheat and oats. Half of the field was cultivated with minimum tillage (shallow tillage with a tine cultivator) and half was conventionally ploughed. Within each cultivation treatment there were different treatment areas (TAs). In the first year of the experiment, one TA was cultivated up and down the slope, one TA was cultivated on the contour, with a beetle bank acting as a vegetative barrier partway up the slope, and one had a mixed direction cultivation treatment, with cultivation and drilling conducted up and down the slope and all subsequent operations conducted on the contour. In the second year, this mixed treatment was replaced with contour cultivation. Results showed no significant reduction in runoff, sediment losses or total phosphorus losses from minimum tillage when compared to the conventional plough treatment, but there were increased losses of total dissolved phosphorus with minimum tillage. The mixed direction cultivation treatment increased surface runoff and losses of sediment and phosphorus. Increasing surface roughness with contour cultivation reduced surface runoff compared to up and down slope cultivation in both the plough and minimum tillage treatment areas, but this trend was not significant. Sediment and phosphorus losses in the contour cultivation treatment followed a very similar pattern to runoff. Combining contour cultivation with a vegetative barrier in the form of a beetle bank to reduce slope length resulted in a non-significant reduction in surface runoff, sediment and total phosphorus when compared to up and down slope cultivation, but there was a clear trend towards reduced losses. However, the addition of a beetle bank did not provide a significant reduction in runoff, sediment losses or total phosphorus losses when compared to contour cultivation, suggesting only a marginal additional benefit. The economic implications for farmers of the different treatment options are investigated in order to assess their suitability for implementation at a field scale.

Relevance: 20.00%

Abstract:

1. Although the importance of plant community assemblages in structuring invertebrate assemblages is well known, the role that architectural complexity plays is less well understood. In particular, direct empirical data for a range of invertebrate taxa showing how functional groups respond to plant architecture are largely absent from the literature.
2. The significance of sward architectural complexity in determining the species richness of predatory and phytophagous functional groups of spiders, beetles, and true bugs, sampled from 135 field margin plots over 2 years, was tested. The present study compares the relative importance of sward architectural complexity with that of plant community assemblage.
3. Sward architectural complexity was found to be a determinant of species richness for all phytophagous and predatory functional groups. When individual species responses were investigated, 62.5% of the spider and beetle species and 50.0% of the true bugs responded to sward architectural complexity.
4. Interactions between sward architectural complexity and plant community assemblage indicate that the number of invertebrate species supported by the plant community alone could be increased by modification of sward architecture. Management practices could therefore play a key role in diversifying the architectural structure of existing floral assemblages for the benefit of invertebrate assemblages.
5. The contrasting effects of sward architecture on invertebrate functional groups characterised by either direct (phytophagous species) or indirect (predatory species) dependence on plant communities are discussed. It is suggested that for phytophagous taxa, plant community assemblage alone is likely to be insufficient to ensure successful species colonisation or persistence without appropriate development of sward architecture.

Relevance: 20.00%

Abstract:

This article describes two studies. The first study was designed to investigate the ways in which the statutory assessments of reading for 11-year-old children in England assess inferential abilities. The second study was designed to investigate the levels of performance achieved in these tests in 2001 and 2002 by 11-year-old children attending state-funded local authority schools in one London borough. In the first study, content and questions used in the reading papers for the Standard Assessment Tasks (SATs) in the years 2001 and 2002 were analysed to see what types of inference were being assessed. This analysis suggested that the complexity involved in inference making and the variety of inference types that are made during the reading process are not adequately sampled in the SATs. Similar inadequacies are evident in the ways in which the programmes of study for literacy recommended by central government deal with inference. In the second study, scripts of completed SATs reading papers for 2001 and 2002 were analysed to investigate the levels of inferential ability evident in scripts of children achieving different SATs levels. The analysis in this article suggests that children who only just achieve the 'target' Level 4 do so with minimal use of inference skills. They are particularly weak in making inferences that require the application of background knowledge. Thus, many children who achieve the reading level (Level 4) expected of 11-year-olds are entering secondary education with insecure inference-making skills that have not been recognised.

Relevance: 20.00%

Abstract:

OBJECTIVE: To compare insulin sensitivity (Si) from a frequently sampled intravenous glucose tolerance test (FSIGT) and subsequent minimal model analysis with surrogate measures of insulin sensitivity and resistance, and to compare features of the metabolic syndrome between Caucasians and Indian Asians living in the UK. SUBJECTS: In all, 27 healthy male volunteers (14 UK Caucasians and 13 UK Indian Asians), with a mean age of 51.2 +/- 1.5 y, BMI of 25.8 +/- 0.6 kg/m(2) and Si of 2.85 +/- 0.37. MEASUREMENTS: Si was determined from an FSIGT with subsequent minimal model analysis. The concentrations of insulin, glucose and nonesterified fatty acids (NEFA) were analysed in fasting plasma and used to calculate surrogate measures of insulin sensitivity (quantitative insulin sensitivity check index (QUICKI), revised QUICKI) and resistance (homeostasis model assessment of insulin resistance (HOMA-IR), fasting insulin resistance index (FIRI), Bennett's index, fasting insulin, insulin-to-glucose ratio). Plasma concentrations of triacylglycerol (TAG), total cholesterol, HDL cholesterol (HDL-C) and LDL cholesterol (LDL-C) were also measured in the fasted state. Anthropometric measurements were conducted to determine body-fat distribution. RESULTS: Correlation analysis identified the strongest relationship between Si and the revised QUICKI (r = 0.67; P = 0.000). Significant associations were also observed between Si and QUICKI (r = 0.51; P = 0.007), HOMA-IR (r = -0.50; P = 0.009), FIRI and fasting insulin. The Indian Asian group had lower HDL-C (P = 0.001), a higher waist-to-hip ratio (P = 0.01) and were significantly less insulin sensitive (Si) than the Caucasian group (P = 0.02). CONCLUSION: The revised QUICKI demonstrated a statistically strong relationship with the minimal model. However, it was unable to differentiate between insulin-sensitive and -resistant groups in this study. Future larger studies in population groups with varying degrees of insulin sensitivity are recommended to investigate the general applicability of the revised QUICKI surrogate technique.
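
For reference, the surrogate indices named above are simple functions of fasting measurements. The sketch below uses the standard published formulas for QUICKI, the revised QUICKI and HOMA-IR (with the units conventionally used for each); the numerical values in the example are illustrative and are not taken from this study.

```python
import math

def quicki(ins_uU_ml, glu_mg_dl):
    # QUICKI = 1 / (log10 fasting insulin [uU/mL] + log10 fasting glucose [mg/dL])
    return 1.0 / (math.log10(ins_uU_ml) + math.log10(glu_mg_dl))

def revised_quicki(ins_uU_ml, glu_mg_dl, nefa_mmol_l):
    # Revised QUICKI adds log10 of fasting NEFA [mmol/L] to the denominator.
    return 1.0 / (math.log10(ins_uU_ml) + math.log10(glu_mg_dl) + math.log10(nefa_mmol_l))

def homa_ir(ins_uU_ml, glu_mmol_l):
    # HOMA-IR = fasting insulin [uU/mL] * fasting glucose [mmol/L] / 22.5
    return ins_uU_ml * glu_mmol_l / 22.5

# Worked example with illustrative (not study) values:
print(quicki(8.0, 95.0))               # ~0.347
print(revised_quicki(8.0, 95.0, 0.5))  # ~0.388
print(homa_ir(8.0, 5.3))               # ~1.88
```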

Relevance: 20.00%

Abstract:

The human electroencephalogram (EEG) is globally characterized by a 1/f power spectrum superimposed with certain peaks, whereby the "alpha peak" in the 8-14 Hz frequency range is the most prominent one for relaxed states of wakefulness. We present simulations of a minimal dynamical network model of leaky integrator neurons attached to the nodes of an evolving directed and weighted random graph (an Erdős-Rényi graph). We derive a model of the dendritic field potential (DFP) for the neurons, leading to a simulated EEG that describes the global activity of the network. Depending on the network size, we find an oscillatory transition of the simulated EEG when the network reaches a critical connectivity. This transition, indicated by a suitably defined order parameter, is reflected by a sudden change of the network's topology when super-cycles are formed from merging isolated loops. After the oscillatory transition, the power spectra of simulated EEG time series exhibit a 1/f continuum superimposed with certain peaks.
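
A toy Python sketch of this model class, leaky integrator units coupled through a directed, weighted Erdős-Rényi adjacency matrix with the simulated EEG taken as the mean state, is shown below. The coupling strengths, noise level, time constant and the use of the mean potential as the "EEG" are assumptions for illustration, not the authors' DFP derivation or parameter settings.

```python
import numpy as np

def simulate_eeg(n=100, p=0.05, steps=5000, dt=1e-3, tau=0.02, gain=1.0, seed=0):
    """Leaky-integrator network on a directed, weighted Erdos-Renyi graph;
    the simulated 'EEG' is the mean membrane potential (illustrative only)."""
    rng = np.random.default_rng(seed)
    A = (rng.random((n, n)) < p) * rng.normal(0, 1, (n, n))  # weighted adjacency
    np.fill_diagonal(A, 0.0)
    x = rng.normal(0, 0.1, n)
    eeg = np.empty(steps)
    for t in range(steps):
        inp = A @ np.tanh(gain * x) + rng.normal(0, 0.05, n)  # recurrent drive + noise
        x = x + dt * (-x + inp) / tau                          # leaky integration
        eeg[t] = x.mean()
    return eeg

eeg = simulate_eeg()
freqs = np.fft.rfftfreq(len(eeg), d=1e-3)
power = np.abs(np.fft.rfft(eeg - eeg.mean())) ** 2
# Inspect power vs. freqs for a 1/f-like continuum and any emergent spectral peaks.
```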

Relevance: 20.00%

Abstract:

We present a novel intrusion detection approach based on an intelligent data classifier using a self-organizing map (SOM). We survey other unsupervised intrusion detection methods, alternative SOM-based techniques, and the KDD Cup winning IDS methods. This paper describes the design and implementation of an intelligent data classifier based on a single large (30x30) self-organizing map capable of detecting all attack types in the 1999 DARPA archive, achieving a detection rate of 99.73% with a false positive rate of 0.04% when tested on the full KDD data sets, and a comparable detection rate of 89.54% with a false positive rate of 0.18% when tested on the corrected data sets.
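
The sketch below illustrates a generic SOM-based classifier of the kind described: train a map, label each unit by the majority class of the training samples it wins, then classify new samples by their best-matching unit. It is not the paper's pipeline (no KDD preprocessing, and a far smaller grid than the 30x30 map); the grid size, learning-rate and neighbourhood schedules are illustrative assumptions.

```python
import numpy as np

def train_som(X, grid=(10, 10), iters=5000, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal self-organising map trained with the classic online rule."""
    rng = np.random.default_rng(seed)
    gy, gx = grid
    W = rng.normal(0, 0.1, (gy, gx, X.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(gy), np.arange(gx), indexing="ij"), axis=-1)
    for t in range(iters):
        x = X[rng.integers(len(X))]
        bmu = np.unravel_index(np.argmin(((W - x) ** 2).sum(-1)), (gy, gx))
        lr = lr0 * np.exp(-t / iters)                 # decaying learning rate
        sigma = sigma0 * np.exp(-t / iters)           # shrinking neighbourhood
        h = np.exp(-((coords - np.array(bmu)) ** 2).sum(-1) / (2 * sigma ** 2))
        W += lr * h[..., None] * (x - W)
    return W

def label_units(W, X, y):
    # Assign each SOM unit the majority label of the training samples it wins.
    gy, gx, _ = W.shape
    votes = np.zeros((gy, gx, int(y.max()) + 1))
    for xi, yi in zip(X, y):
        bmu = np.unravel_index(np.argmin(((W - xi) ** 2).sum(-1)), (gy, gx))
        votes[bmu][int(yi)] += 1
    return votes.argmax(-1)

def predict(W, unit_labels, X):
    # Classify each sample by the label of its best-matching unit.
    gy, gx, _ = W.shape
    out = []
    for xi in X:
        bmu = np.unravel_index(np.argmin(((W - xi) ** 2).sum(-1)), (gy, gx))
        out.append(unit_labels[bmu])
    return np.array(out)
```

With integer class labels y (e.g. 0 = normal, 1..k = attack categories), `W = train_som(X_train)`, `labels = label_units(W, X_train, y_train)` and `predict(W, labels, X_test)` give a simple detection pipeline to evaluate against held-out data.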