197 results for feature based modelling


Abstract:

A unified approach is proposed for data modelling that includes supervised regression and classification applications as well as unsupervised probability density function estimation. Orthogonal-least-squares regression based on the leave-one-out test criterion is formulated within this unified data-modelling framework to construct sparse kernel models that generalise well. Examples from regression, classification and density estimation applications are used to illustrate the effectiveness of this generic data-modelling approach for constructing parsimonious kernel models with excellent generalisation capability.
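
As an illustration of the leave-one-out idea, the sketch below greedily selects Gaussian kernel regressors by the PRESS (leave-one-out) error. It is a minimal stand-in, not the authors' orthogonal-least-squares formulation; the kernel width, stopping rule and toy data are assumptions.

```python
# Minimal sketch: forward selection of kernel regressors by LOO (PRESS) error.
import numpy as np

def loo_press(X, y):
    """Leave-one-out mean squared error of least squares on columns of X."""
    H = X @ np.linalg.pinv(X)                      # hat matrix
    resid = y - H @ y
    press = resid / (1.0 - np.clip(np.diag(H), 0.0, 1.0 - 1e-9))
    return float(np.mean(press ** 2))

def forward_select(K, y, max_terms=10):
    """Greedily add kernel columns while the LOO error keeps decreasing."""
    selected, best = [], np.inf
    for _ in range(max_terms):
        score, j = min((loo_press(K[:, selected + [j]], y), j)
                       for j in range(K.shape[1]) if j not in selected)
        if score >= best:                          # stop when LOO error stops improving
            break
        best, selected = score, selected + [j]
    return selected, best

# toy usage: kernel matrix built from 1-D data
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 60)
y = np.sin(x) + 0.1 * rng.standard_normal(60)
K = np.exp(-(x[:, None] - x[None, :]) ** 2)        # Gaussian kernel, width 1
terms, err = forward_select(K, y)
print(len(terms), "kernels selected, LOO MSE =", round(err, 4))
```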

Abstract:

Objective: This paper presents a detailed study of fractal-based methods for texture characterization of mammographic mass lesions and architectural distortion. The purpose of this study is to explore the use of fractal and lacunarity analysis for the characterization and classification of both tumor lesions and normal breast parenchyma in mammography. Materials and methods: We conducted comparative evaluations of five popular fractal dimension estimation methods for characterizing the texture of mass lesions and architectural distortion. We applied the concept of lacunarity to describe the spatial distribution of pixel intensities in mammographic images. These methods were tested on a set of 57 breast masses and 60 normal breast parenchyma samples (dataset1), and on another set of 19 architectural distortions and 41 normal breast parenchyma samples (dataset2). Support vector machines (SVM) were used as the pattern classification method for tumor classification. Results: Experimental results showed that, for all five methods, the fractal dimension of regions of interest (ROIs) depicting mass lesions and architectural distortion was statistically significantly lower than that of normal breast parenchyma. Receiver operating characteristic (ROC) analysis showed that the fractional Brownian motion (FBM) method generated the highest area under the ROC curve of the five methods for both datasets (Az = 0.839 for dataset1 and 0.828 for dataset2). Lacunarity analysis showed that ROIs depicting mass lesions and architectural distortion had higher lacunarities than ROIs depicting normal breast parenchyma. The combination of FBM fractal dimension and lacunarity yielded higher Az values (0.903 and 0.875, respectively) than either feature alone for both datasets. Applying the SVM improved the performance of the fractal-based features in differentiating tumor lesions from normal breast parenchyma, yielding higher Az values. Conclusion: The FBM texture model is the most appropriate model for characterizing mammographic images, because its self-affinity assumption is a better approximation of such textures. Lacunarity is an effective counterpart to the fractal dimension in texture feature extraction from mammographic images. The classification results obtained in this work suggest that the SVM is an effective method with great potential for classification in mammographic image analysis.
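
To make the lacunarity feature concrete, here is a minimal gliding-box sketch (a standard formulation, not the paper's implementation); the box sizes and the random stand-in ROI are illustrative.

```python
# Gliding-box lacunarity of a grayscale patch: var/mean^2 + 1 of the box "mass".
import numpy as np

def lacunarity(img, box):
    """Gliding-box lacunarity for one square box size (img: 2-D array)."""
    h, w = img.shape
    masses = np.array([img[i:i + box, j:j + box].sum()
                       for i in range(h - box + 1)
                       for j in range(w - box + 1)], dtype=float)
    m = masses.mean()
    return masses.var() / (m * m) + 1.0

rng = np.random.default_rng(1)
roi = rng.random((64, 64))            # stand-in for a mammographic ROI
print([round(lacunarity(roi, b), 3) for b in (2, 4, 8, 16)])
```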

Abstract:

The knowledge economy offers a broad and diverse community of information systems users the opportunity to efficiently gain information and know-how for improving qualifications and enhancing productivity in the workplace. Such demand will continue, and users will frequently require optimised and personalised information content. The advancement of information technology and the wide dissemination of information support individual users in constructing new knowledge from their experience in a real-world context. However, designing personalised information provision is challenging because users' requirements and information provision specifications are complex to represent. Existing methods cannot effectively support this analysis process. This paper presents a mechanism that can holistically facilitate the customisation of information provision based on individual users' goals, level of knowledge and cognitive style preferences. An ontology model with embedded norms represents the domain knowledge of information provision in a specific context, where users' needs can be articulated and represented in a user profile. These formal requirements can then be transformed into information provision specifications, which are used to discover suitable information content from repositories and pedagogically organise the selected content to meet the users' needs. The method is adaptable, enabling an appropriate response to changes in users' requirements during the process of acquiring knowledge and skills.
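
A hypothetical sketch of the profile-to-content matching idea follows: a user profile (goal, knowledge level, cognitive style) filters and orders candidate content. All field names and the ranking rule are illustrative, not the paper's ontology.

```python
# Toy profile-driven content provision: filter by goal/level, order by fit.
from dataclasses import dataclass

@dataclass
class Profile:
    goal: str            # e.g. "spreadsheet basics"
    level: int           # 1 = novice .. 3 = advanced
    style: str           # e.g. "visual" or "verbal"

@dataclass
class Content:
    topic: str
    level: int
    style: str

def provision(profile, repository):
    """Select items matching the goal, then order easiest and best-fit first."""
    hits = [c for c in repository
            if c.topic == profile.goal and c.level <= profile.level]
    hits.sort(key=lambda c: (c.level, c.style != profile.style))
    return hits

repo = [Content("spreadsheet basics", 1, "visual"),
        Content("spreadsheet basics", 2, "verbal"),
        Content("databases", 1, "visual")]
print(provision(Profile("spreadsheet basics", 2, "visual"), repo))
```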

Abstract:

The development of eutrophication in river systems is poorly understood, given the complex relationships between fixed plants, algae, hydrodynamics, water chemistry and solar radiation. However, there is a pressing need to understand the relationship between the ecological status of rivers and the controlling environmental factors, to support the reasoned implementation of the Water Framework Directive and Catchment Sensitive Farming in the UK. This research aims to create a dynamic, process-based, mathematical in-stream model to simulate the growth and competition of different vegetation types (macrophytes, phytoplankton and benthic algae) in rivers. The model, applied to the River Frome (Dorset, UK), captured the seasonality of the simulated vegetation types (suspended algae, macrophytes, epiphytes, sediment biofilm) well. Macrophyte results showed that local knowledge is important for explaining unusual changes in biomass. Fixed algae simulations indicated the need for a more detailed representation of the various herbivorous grazer groups; however, this would increase the model complexity, the number of model parameters and the observation data required to constrain the model. The model results also highlighted that simulating only phytoplankton is insufficient in river systems, because in rivers with short retention times the majority of the suspended algae are of benthic origin. There is therefore a need for modelling tools that link the benthic and free-floating habitats.
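
As a flavour of what a process-based growth-and-competition model computes, here is a minimal sketch of two vegetation types sharing one resource capacity, stepped with Euler integration. The equations, rates and capacity are assumed toy values, not the paper's model.

```python
# Toy competition: logistic growth of two vegetation types under a shared capacity.
def step(b1, b2, dt=1.0, r1=0.08, r2=0.12, K=100.0):
    """One Euler step; both types are limited by the same light/nutrient capacity K."""
    shade = (b1 + b2) / K                       # joint resource limitation
    b1 += dt * r1 * b1 * (1.0 - shade)          # macrophytes (slower grower)
    b2 += dt * r2 * b2 * (1.0 - shade)          # epiphytes (faster grower)
    return b1, b2

b1, b2 = 1.0, 0.1
for day in range(200):
    b1, b2 = step(b1, b2)
print(round(b1, 1), round(b2, 1))   # the faster grower takes a larger share of K
```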

Abstract:

The River Lugg has particular problems with high sediment loads, which have had detrimental impacts on ecology and fisheries. A new dynamic, process-based model of hydrology and sediments (INCA-SED) has been developed and applied to the River Lugg system using an extensive data set from 1995 to 2008. The model simulates sediment sources and sinks throughout the catchment and gives a good representation of the sediment response at 22 reaches along the River Lugg. A key question considered in using the model is how to manage sediment sources so that concentrations and bed loads can be reduced in the river system. Altogether, five sediment management scenarios were selected for testing on the River Lugg, including land use change, contour tillage, hedging and buffer strips. Running the model with parameters altered to simulate these five scenarios produced some interesting results. All scenarios achieved some reduction in sediment levels, with the 40% land use change achieving the best result, a 19% reduction. The other scenarios also achieved significant reductions, of between 7% and 9%; buffer strips produced the best of these, at close to 9%. The results suggest that if hedge introduction, contour tillage and buffer strips were all applied, sediment reductions would total 24%, considerably improving the current sediment situation. We present a novel cost-effectiveness analysis of our results in which the percentage of land removed from production is used as the cost function. Given the minimal loss of land associated with contour tillage, hedges and buffer strips, we suggest that these management practices are the most cost-effective combination for reducing sediment loads.
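
The cost-effectiveness comparison can be sketched in a few lines. The sediment reductions below come from the abstract (the 8% and 7% figures are placeholders within the reported 7-9% range), and the land-take percentages are assumptions for illustration, not the paper's data.

```python
# Illustrative cost-effectiveness: sediment reduction per % of land taken out of production.
scenarios = {
    # name: (sediment reduction %, % land removed from production - assumed)
    "40% land use change": (19.0, 40.0),
    "contour tillage":     (8.0,  0.5),
    "hedges":              (7.0,  1.0),
    "buffer strips":       (9.0,  2.0),
}
for name, (reduction, land_cost) in scenarios.items():
    print(f"{name:20s} {reduction:4.1f}% reduction, "
          f"{reduction / land_cost:5.2f} per % land lost")
```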

Abstract:

An elastomeric, healable, supramolecular polymer blend comprising a chain-folding polyimide and a telechelic polyurethane with pyrenyl end groups is compatibilized by aromatic π–π stacking between the π-electron-deficient diimide groups and the π-electron-rich pyrenyl units. This interpolymer interaction is the key to forming a tough, healable, elastomeric material. Variable-temperature FTIR analysis of the bulk material also conclusively demonstrates the presence of hydrogen bonding, which complements the π–π stacking interactions. Variable-temperature SAXS analysis shows that the healable polymeric blend has a nanophase-separated morphology and that the X-ray contrast between the two types of domain increases with increasing temperature, a feature that is repeatable over several heating and cooling cycles. A fractured sample of this material reproducibly regains more than 95% of the tensile modulus, 91% of the elongation to break, and 77% of the modulus of toughness of the pristine material.

Abstract:

In this paper, a forward-looking infrared (FLIR) video surveillance system is presented for avoiding collisions between moving ships and bridge piers. An image preprocessing algorithm is proposed to suppress background clutter by multi-scale fractal analysis, in which the blanket method is used to compute the fractal feature. A moving-ship detection algorithm is then developed from differences in the fractal feature within the region of surveillance between regularly spaced frames. When moving ships are detected in the region of surveillance, a safety alert device is triggered. Experimental results have shown that the approach is feasible and effective, achieving reliable real-time alerts that help avoid collisions between moving ships and bridge piers.
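
For reference, a minimal sketch of the blanket method follows, using its standard formulation (upper and lower "blanket" surfaces grown around the intensity surface, with fractal dimension read from the area-versus-scale slope). Scale range and filter size are illustrative choices.

```python
# Blanket method sketch: estimate a fractal dimension feature of an image patch.
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def blanket_dimension(img, scales=(1, 2, 3, 4, 5)):
    """Fractal dimension from blanket surface area A(e) ~ c * e^(2 - D)."""
    u = img.astype(float).copy()       # upper blanket surface
    b = img.astype(float).copy()       # lower blanket surface
    areas = []
    for e in scales:                   # grow the blankets one unit per scale step
        u = np.maximum(u + 1, maximum_filter(u, size=3))
        b = np.minimum(b - 1, minimum_filter(b, size=3))
        areas.append((u - b).sum() / (2.0 * e))
    slope = np.polyfit(np.log(scales), np.log(areas), 1)[0]
    return 2.0 - slope                 # slope of log A vs log e gives 2 - D

rng = np.random.default_rng(2)
patch = rng.random((64, 64)) * 255     # stand-in for a FLIR image region
print(round(blanket_dimension(patch), 2))
```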

Abstract:

The budgets of seven halogenated gases (CFC-11, CFC-12, CFC-113, CFC-114, CFC-115, CCl4 and SF6) are studied by comparing measurements of polar firn air from two Arctic and three Antarctic sites with the simulation results of two numerical models: a 2-D atmospheric chemistry model and a 1-D firn diffusion model. The first is used to calculate atmospheric concentrations from emission trends based on industrial inventories; the second uses the calculated concentration trends to produce depth concentration profiles in the firn. The 2-D atmospheric model is validated in the boundary layer by comparison with atmospheric station measurements, and vertically for CFC-12 by comparison with balloon and FTIR measurements. Firn air measurements provide constraints on historical atmospheric concentrations over the last century. Age distributions in the firn are discussed using a Green function approach. Finally, our results are used as input to a radiative model in order to evaluate the radiative forcing of our target gases. Multi-species and multi-site firn air studies make it possible to better constrain atmospheric trends. The low concentrations of all studied gases at the bottom of the firn, and their consistency with our model results, confirm that their natural sources are small. Our results indicate that the emissions, sinks and trends of CFC-11, CFC-12, CFC-113, CFC-115 and SF6 are well constrained, whereas this is not the case for CFC-114 and CCl4. Significant emission-dependent changes in the lifetimes of halocarbons destroyed in the stratosphere were obtained; these result from the time needed for transport from the surface, where the gases are emitted, to the stratosphere, where they are destroyed. Efforts should be made to update and reduce the large uncertainties in CFC lifetimes.
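
The Green function idea can be illustrated schematically: the concentration at a given firn depth is the atmospheric history weighted by an age distribution. The lognormal-shaped distribution and the toy CFC-like growth curve below are purely illustrative, not the paper's results.

```python
# Schematic Green function: firn air = atmospheric history convolved with an age distribution.
import numpy as np

years = np.arange(1900, 2009)                     # atmospheric record, annual
atmos = np.maximum(0.0, (years - 1950) * 5.0)     # toy CFC-like growth (ppt)

age = np.arange(0, 60)                            # years since air left the surface
G = np.exp(-0.5 * ((np.log(age + 1) - 2.5) / 0.5) ** 2)
G /= G.sum()                                      # normalised age distribution

# concentration sampled in 2008 at one depth = age-weighted sum of past values
firn_2008 = sum(G[a] * atmos[years == 2008 - a][0] for a in age)
print(round(firn_2008, 1), "vs surface", atmos[-1])   # firn lags the surface trend
```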

Abstract:

A multivariable hyperstable robust adaptive decoupling control algorithm based on a neural network is presented for the control of nonlinear multivariable coupled systems with unknown parameters and structure. The Popov theorem is used in the design of the controller. The modelling errors, coupling action and other uncertainties of the system are identified on-line by a neural network, and the identified results are used as compensation signals so that robust adaptive control of the nonlinear system is realised. Simulation results are given.
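
A schematic toy of the compensation idea (not the paper's algorithm): an online linear model stands in for the neural network, learns the plant's unmodelled term from the one-step prediction error, and its estimate is subtracted from the control signal. The plant, features and all gains are illustrative assumptions.

```python
# Toy online compensation: learn the unmodelled residual, cancel it in the control law.
import numpy as np

w = np.zeros(2)                       # weights of a toy linear "network"
x, ref, lr = 0.0, 1.0, 0.05

for k in range(300):
    phi = np.array([np.sin(x), 1.0])  # features for the residual model
    d_hat = w @ phi                   # estimated unmodelled disturbance
    u = 1.2 * (ref - x) - d_hat / 0.5 # feedback plus learned compensation
    x_next = 0.9 * x + 0.5 * u + 0.3 * np.sin(x)   # plant with unknown nonlinearity
    err = x_next - (0.9 * x + 0.5 * u)             # one-step model residual
    w += lr * err * phi               # LMS update of the compensator
    x = x_next
# the learned term cancels the sin(x) disturbance; proportional-only
# feedback still leaves a constant offset from the reference
print(round(x, 3))
```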

Abstract:

The last decade has seen the re-emergence of artificial neural networks as an alternative to traditional modelling techniques for the control of nonlinear systems. Numerous control schemes have been proposed and have been shown to work in simulations. However, very few analyses have been made of the workings of these networks. The authors show that a receding horizon control strategy based on a class of recurrent networks can stabilise nonlinear systems.
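
To make "receding horizon" concrete, the sketch below runs a minimal receding-horizon loop: optimise a short predicted-cost horizon, apply only the first input, then re-plan. A known scalar nonlinear model stands in for the paper's recurrent network predictor, and the crude grid search is an illustrative simplification.

```python
# Minimal receding-horizon control loop with a toy scalar nonlinear plant.
import numpy as np

def plant(x, u):
    return 0.8 * x + 0.4 * np.tanh(x) + 0.5 * u

def horizon_cost(x0, us, ref):
    """Predicted quadratic cost over the horizon for a candidate input sequence."""
    x, cost = x0, 0.0
    for u in us:
        x = plant(x, u)
        cost += (x - ref) ** 2 + 0.01 * u ** 2
    return cost

x, ref, H = 3.0, 0.0, 5
grid = np.linspace(-2, 2, 21)          # coarse search over the first input only
for k in range(30):
    u0 = min(grid, key=lambda u: horizon_cost(x, [u] + [0.0] * (H - 1), ref))
    x = plant(x, u0)                   # apply the first input, then re-plan
print(round(x, 3))                     # state driven towards the reference
```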

Abstract:

Variations on the standard Kohonen feature map can enable an ordering of the map state space using only a limited subset of the complete input vector. It is also possible to order the map with merely a local adaptation procedure, rather than having to rely on global variables and objectives. Such variations have been included as part of a hybrid learning system (HLS) which has arisen out of a genetic-based classifier system. The paper describes the modified feature map, which constitutes the HLS's long-term memory, and presents results on the control of a simple maze-running task, thereby demonstrating the value of goal-related feedback within the overall network.
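
For contrast with the standard algorithm, here is a minimal Kohonen-style update using a purely local rule: only the winner and its immediate neighbours adapt, with a fixed learning rate instead of global decay schedules. This is an assumed simplification for illustration, not the HLS variant itself.

```python
# Kohonen-style map with a purely local adaptation rule (fixed rate, 1-unit neighbourhood).
import numpy as np

rng = np.random.default_rng(4)
W = rng.random((10, 2))               # 1-D map of 10 units in a 2-D input space

def update(W, x, lr=0.2):
    win = int(np.argmin(((W - x) ** 2).sum(axis=1)))   # best-matching unit
    for j in (win - 1, win, win + 1):                  # local neighbourhood only
        if 0 <= j < len(W):
            W[j] += lr * (x - W[j])
    return W

for _ in range(2000):
    W = update(W, rng.random(2))
print(np.round(W[:3], 2))             # neighbouring units end up close in input space
```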

Abstract:

There is growing concern worldwide about reducing greenhouse gas emissions. In its recent Post-Copenhagen Report on Climate Change, the UK set targets of a 34% reduction in emissions by 2020 and an 80% reduction by 2050, relative to 1990 levels. In practice, Life Cycle Cost (LCC) and Life Cycle Assessment (LCA) tools have been introduced to the construction industry to help achieve this. However, there is a clear disconnection between costs and environmental impacts over the life cycle of a built asset when using these two tools. Moreover, changes in Information and Communication Technologies (ICTs) have changed the way information is represented; in particular, information is fed more easily and distributed more quickly to different stakeholders through tools such as Building Information Modelling (BIM), yet little consideration has been given to incorporating LCC and LCA and maximising their usage within the BIM environment. The aim of this paper is to propose the development of a model-based LCC and LCA tool that supports sustainable building design decisions for clients, architects and quantity surveyors, so that an optimal investment decision can be made by studying the trade-off between costs and environmental impacts. Finally, an application framework is proposed as future work, showing how the proposed model can be incorporated into the BIM environment in practice.
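
The cost/impact trade-off can be sketched very simply: each design option carries a life cycle cost and an environmental figure, and a weighted score ranks them. The option names, numbers and weighting scheme below are invented for illustration, not the paper's data or method.

```python
# Toy LCC/LCA trade-off: rank design options by a weighted cost-plus-carbon score.
options = {
    # name: (normalised life cycle cost, normalised kgCO2e) - placeholder values
    "standard facade": (1.00, 1.00),
    "high insulation": (1.15, 0.70),
    "timber frame":    (1.05, 0.60),
}
w_cost, w_carbon = 0.5, 0.5            # client-chosen weights
best = min(options, key=lambda k: w_cost * options[k][0]
                                  + w_carbon * options[k][1])
print("preferred option:", best)
```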

Abstract:

Quadratic programming techniques were applied to household food consumption data in England and Wales to estimate likely changes in diet under healthy eating guidelines, and the consequences this would have for agriculture and land use in England and Wales. The first step entailed imposing nutrient restrictions on food consumption, following dietary recommendations of the UK Department of Health. The resulting diet was used, in a second step, as a proxy for demand in agricultural commodities, to test the impact of such a scenario on food production and land use in England and Wales, and the impacts of this on agricultural landscapes. Results of the diet optimisation indicated a large drop in consumption of foods rich in saturated fats and sugar, essentially cheese and sugar-based products, along with smaller reductions in fats and meat products. Conversely, consumption of fruit and vegetables, cereals, and flour would increase to meet dietary fibre recommendations. Such a shift in demand would dramatically affect production patterns: the financial net margin of England and Wales agriculture would rise, owing to increased production of high-market-value, high-margin crops. Some regions would, however, be negatively affected, mostly those dependent on beef cattle and sheep production that could not benefit from an increased demand for cereals and horticultural crops. The effects of these changes would also be felt in upstream industries, such as animal feed suppliers. While arable-dominated landscapes would be little affected, pastoral landscapes would suffer through loss of grazing management and, possibly, land abandonment, especially in upland areas.
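
The quadratic-programming step has a simple shape: find the diet closest (in a least-squares sense) to the current one that satisfies nutrient bounds. The sketch below illustrates it with three foods and two toy nutrient constraints; the foods, coefficients and limits are assumptions, not the study's data.

```python
# Toy diet QP: minimise squared deviation from the current diet subject to nutrient bounds.
import numpy as np
from scipy.optimize import minimize

x0 = np.array([3.0, 2.0, 1.0])        # current weekly servings: veg, bread, cheese
sat_fat = np.array([0.1, 0.5, 3.0])   # saturated fat per serving (toy units)
fibre   = np.array([2.0, 1.5, 0.0])   # fibre per serving (toy units)

cons = [{"type": "ineq", "fun": lambda x: 4.0 - sat_fat @ x},   # fat ceiling
        {"type": "ineq", "fun": lambda x: fibre @ x - 9.0}]     # fibre floor

res = minimize(lambda x: ((x - x0) ** 2).sum(), x0, constraints=cons,
               bounds=[(0, None)] * 3, method="SLSQP")
print(np.round(res.x, 2))             # less cheese, more veg/bread than x0
```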

Abstract:

This paper introduces a new fast, effective and practical model structure construction algorithm for a mixture-of-experts network system, utilising only process data. The algorithm is based on a novel forward constrained regression procedure. Given a full set of experts as potential model bases, the structure construction algorithm selects the most significant model base one at a time so as to minimise the overall system approximation error at each iteration, while the gate parameters of the mixture-of-experts network are adjusted accordingly to satisfy the convex constraints required in the derivation of the forward constrained regression procedure. The procedure continues until a proper system model is constructed that utilises some or all of the experts. A pruning algorithm for the resulting mixture-of-experts network is also derived, yielding an overall parsimonious construction algorithm. Numerical examples are provided to demonstrate the effectiveness of the new algorithms. The mixture-of-experts framework can be applied to a wide variety of applications, ranging from multiple-model controller synthesis to multi-sensor data fusion.
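
A minimal sketch of the forward, convexly constrained selection idea follows: expert outputs are mixed with non-negative weights summing to one, and experts are added greedily while the fit improves. This is a simplified stand-in for the paper's forward constrained regression procedure, with toy experts and data.

```python
# Greedy forward selection of experts under a convex-mixing constraint.
import numpy as np

def forward_convex(P, y, n_terms=3):
    """P: (n_samples, n_experts) expert outputs; mix f <- (1-lam)*f + lam*p_j."""
    f, chosen = np.zeros(len(y)), []
    for _ in range(n_terms):
        best = None
        for j in range(P.shape[1]):
            d = P[:, j] - f
            # closed-form optimal mixing weight, clipped to [0, 1] for convexity
            lam = np.clip(d @ (y - f) / max(d @ d, 1e-12), 0.0, 1.0)
            err = ((y - (f + lam * d)) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, lam)
        err, j, lam = best
        f = f + lam * (P[:, j] - f)    # update the convex combination
        chosen.append((j, round(float(lam), 3)))
    return chosen, f

# toy usage: three "experts" approximating a noisy sine
x = np.linspace(0, 6, 50)
P = np.column_stack([np.sin(x), np.cos(x), x / 6.0])
chosen, f = forward_convex(P, np.sin(x) + 0.05)
print(chosen, round(float(((np.sin(x) + 0.05 - f) ** 2).mean()), 4))
```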

Abstract:

This paper presents a new image data fusion scheme combining median filtering with self-organizing feature map (SOFM) neural networks. The scheme consists of three steps: (1) pre-processing of the images, in which weighted median filtering removes part of the noise corrupting each image; (2) pixel clustering for each image using self-organizing feature map neural networks; and (3) fusion of the images obtained in step (2), which suppresses the residual noise components and thus further improves image quality. Simulations involving three image sensors, each with a different noise structure, confirm that this three-step combination delivers a marked improvement in effectiveness and performance.
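
A schematic version of the three-step pipeline is sketched below. A plain median filter and a crude intensity quantiser stand in for the weighted median filter and the SOFM clustering step respectively, so this shows the structure of the scheme rather than the paper's implementation.

```python
# Three-step fusion sketch: (1) median filter, (2) per-image clustering, (3) fuse.
import numpy as np
from scipy.ndimage import median_filter

def cluster_quantise(img, levels=8):
    """Map each pixel to the centroid of its intensity bin (toy stand-in for SOFM)."""
    edges = np.quantile(img, np.linspace(0, 1, levels + 1))
    idx = np.clip(np.digitize(img, edges[1:-1]), 0, levels - 1)
    centroids = np.array([img[idx == k].mean() for k in range(levels)])
    return centroids[idx]

def fuse(images):
    smoothed = [median_filter(im, size=3) for im in images]       # step 1
    quantised = [cluster_quantise(im) for im in smoothed]         # step 2
    return np.mean(quantised, axis=0)                             # step 3

rng = np.random.default_rng(5)
truth = np.tile(np.linspace(0, 1, 32), (32, 1))
sensors = [truth + 0.2 * rng.standard_normal(truth.shape) for _ in range(3)]
print(round(float(np.abs(fuse(sensors) - truth).mean()), 3))      # residual error
```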