931 results for model-based reasoning


Relevance: 90.00%

Abstract:

Restoring a scene distorted by atmospheric turbulence is a challenging problem in video surveillance. The effect, caused by random, spatially varying perturbations, makes a model-based solution difficult and, in most cases, impractical. In this paper, we propose a novel method for mitigating the effects of atmospheric distortion on observed images, particularly airborne turbulence, which can severely degrade a region of interest (ROI). In order to extract accurate detail about objects behind the distorting layer, a simple and efficient frame selection method is proposed to select informative ROIs only from good-quality frames. The ROIs in each frame are then registered to further reduce offsets and distortions. We solve the space-varying distortion problem using region-level fusion based on the dual-tree complex wavelet transform. Finally, contrast enhancement is applied. We further propose a learning-based metric specifically for image quality assessment in the presence of atmospheric distortion. This is capable of estimating quality in both full- and no-reference scenarios. The proposed method is shown to significantly outperform existing methods, providing enhanced situational awareness in a range of surveillance scenarios. © 1992-2012 IEEE.
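The abstract describes the frame selection step only as choosing "good-quality frames"; a minimal sketch of that idea in Python, assuming a Laplacian-variance sharpness score as the quality proxy (the scoring function and the `keep_fraction` parameter are illustrative assumptions, not the paper's method):

```python
def laplacian_variance(frame):
    """Sharpness proxy: variance of a 4-neighbour Laplacian response.
    `frame` is a 2-D list of grey levels. Illustrative only."""
    h, w = len(frame), len(frame[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (frame[y - 1][x] + frame[y + 1][x] + frame[y][x - 1]
                   + frame[y][x + 1] - 4 * frame[y][x])
            responses.append(lap)
    m = sum(responses) / len(responses)
    return sum((r - m) ** 2 for r in responses) / len(responses)

def select_informative_frames(frames, keep_fraction=0.5):
    """Rank frames by sharpness and keep the best fraction."""
    ranked = sorted(frames, key=laplacian_variance, reverse=True)
    return ranked[:max(1, int(len(ranked) * keep_fraction))]
```

Frames with low sharpness (e.g. heavily blurred by turbulence) sort to the bottom and are discarded before registration and fusion.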

Relevance: 90.00%

Abstract:

In order to account for the interfacial friction of composite materials, an analytical model based on contact geometry and local friction is proposed. A contact area includes several types of microcontacts, depending on the reinforcement materials and their shape. The proportion between these areas is defined by the in-plane contact geometry. The model, applied to a fibre-reinforced composite, results in the dependence of friction on surface fibre fraction and local friction coefficients. To validate this analytical model, an experimental study on carbon fibre-reinforced epoxy composites under low normal pressure was performed. The effects of fibre volume fraction and fibre orientation were studied, discussed and compared with the analytical model results. © Springer Science+Business Media, LLC 2012.
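The dependence of composite friction on surface fibre fraction and local friction coefficients can be illustrated with a simple area-weighted mixing rule (a simplified reading of the contact-geometry model; the paper's weighting is derived from the full in-plane contact geometry, and the numbers below are invented):

```python
def effective_friction(contacts):
    """Effective friction coefficient of a composite surface, taken as
    the area-fraction-weighted mean of the local friction coefficients.
    `contacts` maps a phase name to (area_fraction, local_mu).
    NOTE: simplified illustration, not the paper's exact formulation."""
    total = sum(a for a, _ in contacts.values())
    if abs(total - 1.0) > 1e-9:
        raise ValueError("area fractions must sum to 1")
    return sum(a * mu for a, mu in contacts.values())

# Hypothetical surface: 60% fibre area at mu=0.2, 40% matrix at mu=0.5
mu_eff = effective_friction({"fibre": (0.6, 0.2), "matrix": (0.4, 0.5)})
```

Increasing the surface fibre fraction shifts the effective coefficient towards the fibre's local value, which is the qualitative trend the experiments probe.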

Relevance: 90.00%

Abstract:

To investigate the seasonal and interannual variations in biological productivity in the South China Sea (SCS), a Pacific basin-wide physical-biogeochemical model has been developed and used to estimate the biological productivity and export flux in the SCS. The Pacific circulation model, based on the Regional Ocean Model Systems (ROMS), is forced with daily air-sea fluxes derived from the NCEP (National Centers for Environmental Prediction) reanalysis between 1990 and 2004. The biogeochemical processes are simulated with a carbon, Si(OH)4, and nitrogen ecosystem (CoSiNE) model consisting of silicate, nitrate, ammonium, two phytoplankton groups (small and large phytoplankton), two zooplankton grazers (small micrograzers and large mesozooplankton), and two detritus pools. The ROMS-CoSiNE model favourably reproduces many of the observed features, such as Chl a, nutrients, and primary production (PP) in the SCS. The modelled depth-integrated PP over the euphotic zone (0-125 m) varies seasonally, with the highest value of 386 mg C m⁻² d⁻¹ during winter and the lowest value of 156 mg C m⁻² d⁻¹ during early summer. The annual mean value is 196 mg C m⁻² d⁻¹. The model-integrated annual mean new production (uptake of nitrate), in carbon units, is 64.4 mg C m⁻² d⁻¹, which yields an f-ratio of 0.33 for the entire SCS. The modelled export ratio (e-ratio: the ratio of export to PP) is 0.24 for the basin-wide SCS. The year-to-year variation of biological productivity in the SCS is weaker than the seasonal variation. The large phytoplankton group tends to dominate the smaller phytoplankton group and likely plays an important role in determining the interannual variability of primary and new production.
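The quoted ratios can be checked directly from the reported fluxes; a short arithmetic sketch (values copied from the abstract):

```python
# Basin-mean fluxes quoted in the abstract (mg C per m^2 per day)
primary_production = 196.0   # annual-mean depth-integrated PP
new_production = 64.4        # annual-mean nitrate uptake, carbon units
e_ratio = 0.24               # modelled export / PP

f_ratio = new_production / primary_production   # f-ratio = new / total production
implied_export = e_ratio * primary_production   # export flux implied by the e-ratio
```

Rounding `f_ratio` to two decimals reproduces the quoted 0.33, and the e-ratio implies a basin-mean export flux of about 47 mg C m⁻² d⁻¹.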

Relevance: 90.00%

Abstract:

This paper consists of two major parts. First, we present the outline of a simple approach to a very-low-bandwidth video-conferencing system relying on an example-based hierarchical image compression scheme. In particular, we discuss the use of example images as a model, the number of required examples, faces as a class of semi-rigid objects, a hierarchical model based on decomposition into different time-scales, and the decomposition of face images into patches of interest. In the second part, we present several algorithms for image processing and animation, as well as experimental evaluations. Among the original contributions of this paper is an automatic algorithm for pose estimation and normalization. We also review and compare different algorithms for finding the nearest neighbors in a database for a new input, as well as a generalized algorithm for blending patches of interest in order to synthesize new images. Finally, we outline the possible integration of several algorithms to illustrate a simple model-based video-conference system.

Relevance: 90.00%

Abstract:

Joern Fischer, David B. Lindenmayer, and Ioan Fazey (2004). Appreciating Ecological Complexity: Habitat Contours as a Conceptual Landscape Model. Conservation Biology, 18(5), pp. 1245-1253.

Relevance: 90.00%

Abstract:

An improved method for deformable shape-based image segmentation is described. Image regions are merged together and/or split apart, based on their agreement with an a priori distribution on the global deformation parameters for a shape template. The quality of a candidate region merging is evaluated by a cost measure that includes: homogeneity of image properties within the combined region, degree of overlap with a deformed shape model, and a deformation likelihood term. Perceptually motivated criteria are used to determine where and how to split regions, based on the local shape properties of the region group's bounding contour. A globally consistent interpretation is determined in part by the minimum description length principle. Experiments show that the model-based splitting strategy yields a significant improvement in segmentation over a method that uses merging alone.
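The three-term merge cost can be sketched as a weighted sum (the weights, field names and exact functional form here are illustrative assumptions; the paper evaluates the terms within a minimum-description-length framework):

```python
def merge_cost(candidate, weights=(1.0, 1.0, 1.0)):
    """Cost of a candidate region merging, sketched as a weighted sum
    of the three terms named in the abstract. Illustrative only.
    `candidate` supplies:
      homogeneity - within-region intensity variance (lower is better)
      overlap     - fraction of the merged region covered by the
                    deformed shape model, in [0, 1] (higher is better)
      deform_nll  - negative log-likelihood of the deformation
                    parameters under the prior (lower is better)"""
    w_h, w_o, w_d = weights
    return (w_h * candidate["homogeneity"]
            + w_o * (1.0 - candidate["overlap"])     # penalise poor overlap
            + w_d * candidate["deform_nll"])
```

A merging that is homogeneous, matches the deformed template well, and needs only a likely deformation receives a low cost and is preferred.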

Relevance: 90.00%

Abstract:

Constraint programming has emerged as a successful paradigm for modelling combinatorial problems arising from practical situations. In many of those situations, we are not provided with an immutable set of constraints. Instead, a user modifies their requirements, in an interactive fashion, until they are satisfied with a solution. Examples of such applications include, amongst others, model-based diagnosis, expert systems, and product configurators. The system the user interacts with must be able to assist them by showing the consequences of their requirements. Explanations are the ideal tool for providing this assistance. However, existing notions of explanation fail to provide sufficient information. We define new forms of explanation that aim to be more informative. Although explanation generation is a very hard task, in the applications we consider we must provide a satisfactory level of interactivity and, therefore, cannot afford long computation times. We introduce the concept of representative sets of relaxations: a compact set of relaxations that shows the user at least one way to satisfy each of their requirements and at least one way to relax them; we present an algorithm that efficiently computes such sets. We also introduce the concept of most soluble relaxations, which maximise the number of products they allow, and present algorithms to compute such relaxations in times compatible with interactivity, making use, interchangeably, of different types of compiled representations. We propose to generalise the concept of prime implicates to constraint problems via the concept of domain consequences, and suggest generating them as a compilation strategy. This sets out a new approach to compilation and allows explanation-related queries to be addressed efficiently. Finally, we define ordered automata to compactly represent large sets of domain consequences, orthogonally to existing compilation techniques that represent large sets of solutions.

Relevance: 90.00%

Abstract:

BACKGROUND: Sharing of epidemiological and clinical data sets among researchers is poor at best, to the detriment of science and the community at large. The purpose of this paper is therefore to (1) describe a novel Web application designed to share information on study data sets, focusing on epidemiological clinical research in a collaborative environment, and (2) create a policy model placing this collaborative environment into the current scientific social context. METHODOLOGY: The Database of Databases application was developed based on feedback from epidemiologists and clinical researchers requiring a Web-based platform that would allow for sharing of information about epidemiological and clinical study data sets in a collaborative environment. This platform should ensure that researchers can modify the information. Model-based predictions of the number of publications and of funding resulting from combinations of different policy implementation strategies (for metadata and data sharing) were generated using System Dynamics modeling. PRINCIPAL FINDINGS: The application allows researchers to easily upload information about clinical study data sets, which is searchable and modifiable by other users in a wiki environment. All modifications are filtered by the database principal investigator in order to maintain quality control. The application has been extensively tested and currently contains 130 clinical study data sets from the United States, Australia, China and Singapore. Model results indicated that any policy implementation would be better than the current strategy, that metadata sharing is better than data sharing, and that combined policies achieve the best results in terms of publications.
CONCLUSIONS: Based on our empirical observations and the resulting model, the social network environment surrounding the application can assist epidemiologists and clinical researchers in contributing and searching for metadata in a collaborative environment, thus potentially facilitating collaboration efforts among research communities distributed around the globe.

Relevance: 90.00%

Abstract:

Software-based control of life-critical embedded systems has become increasingly complex and, to a large extent, has come to determine the safety of human beings. For example, implantable cardiac pacemakers have over 80,000 lines of code which are responsible for maintaining the heart within safe operating limits. As firmware-related recalls accounted for over 41% of the 600,000 devices recalled in the last decade, there is a need for rigorous model-driven design tools to generate verified code from verified software models. To this effect, we have developed the UPP2SF model-translation tool, which facilitates automatic conversion of verified models (in UPPAAL) to models that may be simulated and tested (in Simulink/Stateflow). We describe the translation rules that ensure correct model conversion, applicable to a large class of models. We demonstrate how UPP2SF is used in the model-driven design of a pacemaker whose model is (a) designed and verified in UPPAAL (using timed automata), (b) automatically translated to Stateflow for simulation-based testing, and then (c) automatically generated into modular code for hardware-level integration testing of timing-related errors. In addition, we show how UPP2SF may be used for worst-case execution time estimation early in the design stage. Using UPP2SF, we demonstrate the value of an integrated end-to-end modeling, verification, code-generation and testing process for complex software-controlled embedded systems. © 2014 ACM.

Relevance: 90.00%

Abstract:

This paper describes an approach to the modelling of experiential knowledge in an industrial application of Case-Based Reasoning (CBR). The CBR system uses retrieval techniques in conjunction with a relational database. The database is especially designed as a repository of experiential knowledge and includes qualitative search indices. The system is intended to help design engineers and materials engineers in the submarine cable industry. It consists of three parts: a materials database; a database of experiential knowledge; and a CBR system used to retrieve similar past designs based upon qualitative descriptions of components and materials. The system is currently undergoing user testing at the Alcatel Submarine Networks site in Greenwich.
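Retrieval of similar past designs over qualitative search indices can be sketched as weighted feature matching (the feature names and weights below are invented for illustration; the real system's indices come from its materials and experience databases):

```python
def retrieve_similar(cases, query, weights):
    """Rank past designs by weighted similarity over qualitative
    search indices. Each case and the query are dicts of qualitative
    feature values; `weights` gives each index's importance."""
    def similarity(case):
        matched = sum(w for feat, w in weights.items()
                      if case.get(feat) == query.get(feat))
        return matched / sum(weights.values())
    return sorted(cases, key=similarity, reverse=True)

# Hypothetical past cable designs and a new design query
past_designs = [
    {"id": 1, "conductor": "copper", "insulation": "PE"},
    {"id": 2, "conductor": "steel", "insulation": "PVC"},
]
best = retrieve_similar(past_designs,
                        {"conductor": "copper", "insulation": "PVC"},
                        weights={"conductor": 2.0, "insulation": 1.0})[0]
```

The most similar past design is returned first, so an engineer can reuse or adapt its component and material choices.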

Relevance: 90.00%

Abstract:

Most air quality modelling work has so far been oriented towards deterministic simulations of ambient pollutant concentrations. This traditional approach, which is based on the use of one selected model and one data set of discrete input values, does not reflect the uncertainties due to errors in model formulation and input data. Given the complexities of urban environments and the inherent limitations of mathematical modelling, it is unlikely that a single model based on routinely available meteorological and emission data will give satisfactory short-term predictions. In this study, different methods involving the use of more than one dispersion model, in association with different emission simulation methodologies and meteorological data sets, were explored for predicting best CO and benzene estimates, and related confidence bounds. The different approaches were tested using experimental data obtained during intensive monitoring campaigns in busy street canyons in Paris, France. Three relatively simple dispersion models (STREET, OSPM and AEOLIUS) that are likely to be used for regulatory purposes were selected for this application. A sensitivity analysis was conducted in order to identify internal model parameters that might significantly affect results. Finally, a probabilistic methodology for assessing urban air quality was proposed.
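Combining predictions from several model/input combinations into a best estimate with confidence bounds can be sketched as follows (purely illustrative: a mean with a plus/minus two-standard-deviation band, whereas the study develops a fuller probabilistic methodology; the concentration values are invented):

```python
from statistics import mean, stdev

def ensemble_estimate(predictions):
    """Best estimate and a simple +/- 2-sigma confidence band from an
    ensemble of dispersion model predictions. Illustrative only."""
    m = mean(predictions)
    s = stdev(predictions)          # sample standard deviation
    return m, (m - 2 * s, m + 2 * s)

# Hypothetical CO concentrations (mg/m^3) from three model runs,
# e.g. STREET, OSPM and AEOLIUS driven by the same input data
estimate, (low, high) = ensemble_estimate([1.8, 2.2, 2.0])
```

The spread of the ensemble gives a crude picture of the uncertainty that a single deterministic run hides.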

Relevance: 90.00%

Abstract:

A complete model of particle impact degradation during dilute-phase pneumatic conveying is developed, which combines a degradation model, based on the experimental determination of breakage matrices, with a physical model of solids and gas flow in the pipeline. The solids flow in a straight pipe element is represented by a model consisting of two zones: a strand-type flow zone immediately downstream of a bend, followed by a fully suspended flow region after dispersion of the strand. The breakage matrices constructed from data on 90° single-impact tests are shown to give a good representation of the degradation occurring in a 90° pipe bend. Numerical results are presented for degradation of granulated sugar in a large-scale pneumatic conveyor.
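Applying a breakage matrix is a matrix-vector product: the outlet particle size distribution is the matrix applied to the inlet distribution. A minimal sketch (the matrix values below are invented; the paper's matrices are measured in 90° single-impact tests):

```python
def apply_breakage(breakage_matrix, feed):
    """One impact event: propagate a particle size distribution through
    a breakage matrix. breakage_matrix[i][j] is the mass fraction of
    feed size class j reporting to product size class i, so each
    column must sum to 1 (mass conservation)."""
    n = len(feed)
    for j in range(n):
        col = sum(breakage_matrix[i][j] for i in range(n))
        if abs(col - 1.0) > 1e-9:
            raise ValueError("column %d is not mass-conserving" % j)
    return [sum(breakage_matrix[i][j] * feed[j] for j in range(n))
            for i in range(n)]

# Two size classes: coarse (0) and fine (1); 30% of coarse breaks to fine
B = [[0.7, 0.0],
     [0.3, 1.0]]
product = apply_breakage(B, [1.0, 0.0])
```

Chaining this product once per bend, with the flow model supplying the impact conditions, propagates degradation along the whole pipeline.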

Relevance: 90.00%

Abstract:

In this paper we propose a method for interpolation over a set of retrieved cases in the adaptation phase of the case-based reasoning cycle. The method has two advantages over traditional systems: first, it can predict “new” instances not yet present in the case base; second, it can predict solutions not present in the retrieval set. The method is a generalisation of Shepard’s interpolation method, formulated as the minimisation of an error function defined in terms of distance metrics in the solution and problem spaces. We term the retrieval algorithm the Generalised Shepard Nearest Neighbour (GSNN) method. A novel aspect of GSNN is that it provides a general method for interpolation over nominal solution domains. The method is illustrated in the paper with reference to the Iris classification problem. It is evaluated with reference to a simulated nominal-value test problem and to a benchmark case base from the travel domain. The algorithm is shown to outperform conventional nearest neighbour methods on these problems. Finally, GSNN is shown to improve in efficiency when used in conjunction with a diverse retrieval algorithm.
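The GSNN idea (pick the nominal solution minimising an inverse-distance-weighted error over the retrieved cases) can be sketched as follows; the Euclidean problem metric, the 0/1 solution metric and the exponent `p` are illustrative assumptions, not the paper's exact choices:

```python
def gsnn_predict(cases, query, solution_space, p=2):
    """Sketch of Generalised Shepard Nearest Neighbour over a nominal
    solution space: return the candidate solution minimising an
    inverse-distance-weighted error over the retrieved cases.
    cases: list of (problem_vector, solution_label) pairs."""
    def d_problem(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def d_solution(a, b):          # crude nominal-domain metric
        return 0.0 if a == b else 1.0

    best, best_err = None, float("inf")
    for cand in solution_space:
        err = 0.0
        for prob, sol in cases:
            dist = d_problem(query, prob)
            if dist == 0.0:        # query coincides with a stored case
                err = d_solution(cand, sol) ** 2
                break
            err += d_solution(cand, sol) ** 2 / dist ** p
        if err < best_err:
            best, best_err = cand, err
    return best
```

Because the search runs over the whole `solution_space`, the method can in principle return a solution that none of the retrieved cases contains, which is the second advantage claimed above.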

Relevance: 90.00%

Abstract:

Two studies investigated participants' sensitivity to the amount and diversity of the evidence when reasoning inductively about categories. Both showed that participants are more sensitive to characteristics of the evidence for arguments with general rather than specific conclusions. Both showed an association between cognitive ability and sensitivity to these evidence characteristics, particularly when the conclusion category was general. These results suggest that a simple associative process may not be sufficient to capture some key phenomena of category-based induction. They also support the claim that the need to generate a superordinate category is a complicating factor in category-based reasoning and that adults' tendency to generate such categories while reasoning has been overestimated.

Relevance: 90.00%

Abstract:

Contemporary medical science is reliant upon the rational selection and utilization of devices and, therefore, an increasing need has developed for in vitro systems aimed at replicating the conditions to which urological devices will be subjected during their use in vivo. We report the development and validation of a novel continuous-flow encrustation model based on the commercially available CDC biofilm reactor. Proteus mirabilis-induced encrustation formation on test biomaterial sections under varying experimental parameters was analyzed by X-ray diffraction, infrared and Raman spectroscopy, and by scanning electron microscopy. The model system produced encrusted deposits similar to those observed in archived clinical samples. Results obtained for the system are highly reproducible, with encrustation being rapidly deposited on test biomaterial sections. This model will have utility in the rapid screening of the encrustation behavior of biomaterials for use in urological applications. © 2010 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater 93B: 128-140, 2010.