957 results for model-based reasoning processes


Relevance: 100.00%

Abstract:

This paper consists of two major parts. First, we outline a simple approach to a very-low-bandwidth video-conferencing system relying on an example-based hierarchical image compression scheme. In particular, we discuss the use of example images as a model, the number of required examples, faces as a class of semi-rigid objects, a hierarchical model based on decomposition into different time-scales, and the decomposition of face images into patches of interest. In the second part, we present several algorithms for image processing and animation, together with experimental evaluations. Among the original contributions of this paper is an automatic algorithm for pose estimation and normalization. We also review and compare different algorithms for finding the nearest neighbors in a database for a new input, as well as a generalized algorithm for blending patches of interest in order to synthesize new images. Finally, we outline a possible integration of several algorithms to illustrate a simple model-based video-conferencing system.
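The nearest-neighbour lookup described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation; it assumes example patches are stored as flattened grayscale vectors, and the name find_nearest is hypothetical.

```python
import numpy as np

def find_nearest(database: np.ndarray, query: np.ndarray, k: int = 1) -> np.ndarray:
    """Return indices of the k example patches closest to the query.

    database: (n_examples, n_pixels) array of flattened example patches.
    query:    (n_pixels,) flattened input patch.
    """
    # Euclidean distance from the query to every stored example.
    dists = np.linalg.norm(database - query, axis=1)
    return np.argsort(dists)[:k]

# Toy usage: 100 random 8x8 patches, one query patch.
db = np.random.rand(100, 64)
idx = find_nearest(db, np.random.rand(64), k=5)
print(idx)
```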

Relevance: 100.00%

Abstract:

Joern Fischer, David B. Lindenmayer, and Ioan Fazey (2004). Appreciating Ecological Complexity: Habitat Contours as a Conceptual Landscape Model. Conservation Biology 18(5), pp. 1245-1253.

Relevance: 100.00%

Abstract:

An improved method for deformable shape-based image segmentation is described. Image regions are merged together and/or split apart based on their agreement with an a priori distribution over the global deformation parameters of a shape template. The quality of a candidate region merging is evaluated by a cost measure that includes: homogeneity of image properties within the combined region, degree of overlap with a deformed shape model, and a deformation likelihood term. Perceptually motivated criteria are used to determine where and how to split regions, based on the local shape properties of the region group's bounding contour. A globally consistent interpretation is determined in part by the minimum description length principle. Experiments show that the model-based splitting strategy yields a significant improvement in segmentation over a method that uses merging alone.
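A sketch of what such a merge cost could look like. The three terms follow the abstract, but their concrete forms here (intensity variance for homogeneity, one minus the Dice coefficient for overlap, and a Mahalanobis distance under an assumed Gaussian prior on the deformation parameters) and the weights are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def merge_cost(intensities, region_mask, deformed_mask,
               deform_params, prior_mean, prior_cov,
               w_homog=1.0, w_overlap=1.0, w_deform=1.0):
    """Illustrative cost for a candidate region merging (weights are hypothetical)."""
    # Homogeneity: intensity variance inside the combined region.
    homogeneity = np.var(intensities[region_mask])

    # Overlap: 1 - Dice coefficient between region and deformed template.
    inter = np.logical_and(region_mask, deformed_mask).sum()
    dice = 2.0 * inter / (region_mask.sum() + deformed_mask.sum())
    overlap = 1.0 - dice

    # Deformation likelihood: Mahalanobis distance of the global deformation
    # parameters under an assumed Gaussian a priori model.
    diff = deform_params - prior_mean
    deform = float(diff @ np.linalg.solve(prior_cov, diff))

    return w_homog * homogeneity + w_overlap * overlap + w_deform * deform
```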

Relevance: 100.00%

Abstract:

This study considered the optimisation of granola breakfast-cereal manufacturing by wet granulation and pneumatic conveying. Granola is an aggregated food product used as a breakfast cereal and in cereal bars. Processing involves mixing the dry ingredients (typically oats, nuts, etc.), followed by the addition of a binder, which can contain honey, water and/or oil. In this work, two parallel wet granulation processes for producing aggregate granola products were designed and operated: (a) high shear mixing granulation followed by drying/toasting in an oven, and (b) continuous fluidised bed granulation followed by drying/toasting in an oven.

In high shear granulation, the influence of process parameters on key aggregate quality attributes, such as granule size distribution and the textural properties of granola, was investigated. The experimental results show that impeller rotational speed is the single most important process parameter influencing the physical and textural properties of granola; binder addition rate and wet massing time also have significant effects. Increasing the impeller speed and wet massing time increases the median granule size and correlates positively with density. The combination of high impeller speed and low binder addition rate produced granules with the highest hardness and crispness. In the fluidised bed process, the effects of nozzle air pressure and binder spray rate on key aggregate quality attributes were studied. The results show that decreasing the nozzle air pressure increases the mean granule size, and that the combination of the lowest nozzle air pressure and the lowest binder spray rate produces granules with the highest hardness and crispness. Overall, high shear granulation produced larger, denser, less porous and stronger (less likely to break) aggregates than the fluidised bed process.

The study also examined particle breakage of granola produced by both processes during pneumatic conveying. Products were conveyed in a purpose-built rig designed to mimic product conveying and packaging, using three configurations: a straight pipe, a rig with two 45° bends, and one with a 90° bend. Particle breakage increases with applied pressure drop, and the 90° bend causes more attrition than the other pipe geometries at all conveying velocities. For granules produced in the high shear granulator, those made at the highest impeller speed, while being the largest, show the lowest proportional breakage, whereas the smaller granules made at the lowest impeller speed show the highest breakage. This clearly shows the importance of shear history (during granule production) for breakage during subsequent processing. For fluidised bed granulation, no single operating parameter was found to have a significant effect on breakage during subsequent conveying.

Finally, a simple power-law breakage model based on process input parameters was developed for both manufacturing processes. It proved suitable for predicting the breakage of granola breakfast cereal at various applied air velocities and pipe configurations, taking shear history into account.
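The abstract does not specify the power-law model, but a plausible form is a breakage fraction scaling as k * v^n in the conveying air velocity, fitted to measurements. The sketch below fits that form to hypothetical data; both the functional form and the numbers are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def breakage(velocity, k, n):
    """Hypothetical power-law form: breakage fraction = k * v**n."""
    return k * velocity**n

# Illustrative data: conveying air velocity (m/s) vs. measured breakage fraction.
v = np.array([10.0, 15.0, 20.0, 25.0, 30.0])
b = np.array([0.02, 0.05, 0.09, 0.15, 0.22])

(k, n), _ = curve_fit(breakage, v, b, p0=(1e-3, 2.0))
print(f"fitted: breakage ~ {k:.2e} * v^{n:.2f}")
```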

Relevance: 100.00%

Abstract:

The objective of this thesis is to develop methods for forming and interfacing nanocrystal-molecule nanostructures in order to explore their electrical transport properties in various controlled environments. The work demonstrates the potential of nanocrystal assemblies for laterally contacting molecules for electronic transport measurements. We first propose a phenomenological model, based on rate equations, for the formation of hybrid nanocrystal-molecule (respectively ~20 nm and ~1.2 nm) nanostructures in solution. We then concentrate on nanocrystals (~60 nm) assembled between nano-gaps (~40 nm) as a contacting strategy for measuring the electronic transport properties of thiophene-terminated conjugated molecules (1.5 nm long) in a two-terminal configuration under vacuum. Similar devices were also probed in a three-terminal configuration, using thiophene-terminated redox-active molecules (1.8 nm long) in a liquid medium, to demonstrate the electrolytic gating technique. The experimental and modelling work presented in this thesis brings to light the physical and chemical processes taking place at the extremely narrow (~1 nm separation) and curved interface between two nanocrystals, or between a nanocrystal and a grain of a metallic electrode. The formation of molecular bridges at such an interface first requires molecules to diffuse from a large liquid reservoir into the gap. Both molecular ends must then bond to the surfaces; this is in itself a low-yield statistical process, as it depends on the orientation of the surfaces, on steric hindrance at the surface, and on binding energies. The experimental work also touched on the importance of competition between potentially immiscible liquids, for example when (organo)metallic molecules solvated by an organic solvent encounter water, or when an organic solvent contacts hydrated, citrate-stabilised nanocrystals dispersed in solution or assembled between electrodes, examined from both experimental and simulation points of view.
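As an illustration of a rate-equation approach to nanostructure formation, the sketch below integrates a hypothetical two-step scheme: reversible molecule attachment to a nanocrystal, then bridging to a second nanocrystal. The scheme and rate constants are assumptions, not the thesis's actual model.

```python
from scipy.integrate import solve_ivp

def rates(t, y, k_on, k_off, k_bridge):
    """Hypothetical scheme: NC + M <-> NC-M, then NC-M + NC -> bridged dimer.

    y = [M, NC, NCM, dimer] concentrations (arbitrary units).
    """
    M, NC, NCM, dimer = y
    f1 = k_on * M * NC - k_off * NCM   # reversible molecule attachment
    f2 = k_bridge * NCM * NC           # bridging to a second nanocrystal
    return [-f1, -f1 - f2, f1 - f2, f2]

sol = solve_ivp(rates, (0.0, 100.0), [1.0, 1.0, 0.0, 0.0],
                args=(0.5, 0.1, 0.05), dense_output=True)
print("final dimer fraction:", sol.y[3, -1])
```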

Relevance: 100.00%

Abstract:

Constraint programming has emerged as a successful paradigm for modelling combinatorial problems arising in practical situations. In many such situations, we are not given an immutable set of constraints; instead, a user modifies his requirements interactively until he is satisfied with a solution. Examples of such applications include, among others, model-based diagnosis, expert systems, and product configurators. The system the user interacts with must be able to assist him by showing the consequences of his requirements, and explanations are the ideal tool for providing this assistance. However, existing notions of explanation fail to provide sufficient information, so we define new forms of explanation that aim to be more informative. Although explanation generation is a very hard task, the applications we consider demand a satisfactory level of interactivity, and we therefore cannot afford long computation times. We introduce the concept of representative sets of relaxations: compact sets of relaxations that show the user at least one way to satisfy each of his requirements and at least one way to relax each of them, and we present an algorithm that computes such sets efficiently. We introduce the concept of most soluble relaxations, which maximise the number of products they allow, and present algorithms that compute such relaxations in times compatible with interactivity by making use, interchangeably, of different types of compiled representations. We propose to generalise the concept of prime implicates to constraint problems via the concept of domain consequences, and suggest generating them as a compilation strategy. This sets out a new approach to compilation and allows explanation-related queries to be addressed efficiently. Finally, we define ordered automata to compactly represent large sets of domain consequences, in a way orthogonal to existing compilation techniques that represent large sets of solutions.
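For intuition, a brute-force way to enumerate maximal satisfiable subsets of a user's requirements over a small product catalogue is sketched below. This exponential enumeration is only illustrative; the thesis's contribution is precisely to compute compact, representative relaxation sets efficiently.

```python
from itertools import combinations

def maximal_relaxations(constraints, solutions):
    """Enumerate maximal satisfiable subsets of `constraints` by brute force.

    constraints: list of predicates solution -> bool (user requirements).
    solutions:   iterable of candidate solutions (e.g. a product catalogue).
    """
    def satisfiable(subset):
        return any(all(c(s) for c in subset) for s in solutions)

    n = len(constraints)
    found = []
    # Try larger subsets first, so every kept subset is maximal.
    for size in range(n, 0, -1):
        for idx in combinations(range(n), size):
            subset = [constraints[i] for i in idx]
            if satisfiable(subset) and not any(set(idx) <= kept for kept in found):
                found.append(set(idx))
    return found

# Toy catalogue: products as dicts; requirements as predicates.
products = [{"colour": "red", "price": 10}, {"colour": "blue", "price": 5}]
reqs = [lambda p: p["colour"] == "red", lambda p: p["price"] < 8]
print(maximal_relaxations(reqs, products))  # the two requirements conflict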

Relevance: 100.00%

Abstract:

In this thesis a novel theory of electrocatalysis at metal (especially noble metal)/solution interfaces is developed, based on the assumption of metal adatom/incipient hydrous oxide cyclic redox transitions. Adatoms are considered metastable, low-coverage species that oxidise in situ at potentials often significantly cathodic to the regular metal/metal oxide transition. Because the adatom coverage is so low, the electrochemical or spectroscopic response for this oxidation is frequently overlooked; however, the product of such oxidation, referred to here as incipient hydrous oxide, appears to be the important mediator in a wide variety of electrocatalytically demanding oxidation processes. Conversely, electrocatalytically demanding reductions apparently occur only at adatom sites at the metal/solution interface; such reactions generally occur only at potentials below, i.e. more cathodic than, the adatom/hydrous oxide transition. It was established that while silver in base oxidises in a regular manner (initially forming OHads species) at potentials above 1.0 V (RHE), there is a minor redox transition at much lower potentials, ca. 0.35 V (RHE). The latter process is assumed to be an adatom/hydrous oxide transition, and the low-coverage Ag(I) hydrous oxide (or hydroxide) species was shown to trigger or mediate the oxidation of aldehydes, e.g. HCHO. The results of a study of this system were shown to be in good agreement with a kinetic model based on the above assumptions; the similarity between this type of behaviour and enzyme-catalysed processes - both systems involve interfacial active sites - was pointed out. Similar behaviour was established for gold, where both Au(I) and Au(III) hydrous oxide mediators were shown to be the effective oxidants for different organic species. One of the most active electrocatalytic materials known at present is platinum. While the classical view of this high activity is based on the concept of activated chemisorption (and the important role of the latter is not discounted here), a vital role is attributed to the adatom/hydrous oxide transition. It is suggested that the well-known intermediate (or anomalous) peak in the hydrogen region of the cyclic voltammogram for platinum is in fact due to an adatom/hydrous oxide transition. Using potential-stepping procedures to minimise the effect of deactivating (COads) species, it was shown that the onset (anodic sweep) and termination (cathodic sweep) potentials for the oxidation of a wide variety of organics coincide with the potential of the intermediate peak. The converse was also shown to apply: sluggish reduction reactions, which involve interaction with metal adatoms, occur at significant rates only in the region below the hydrous oxide/adatom transition.

Relevance: 100.00%

Abstract:

We develop a model for stochastic processes with random marginal distributions. Our model relies on a stick-breaking construction for the marginal distribution of the process, and introduces dependence across locations by using a latent Gaussian copula model as the mechanism for selecting the atoms. The resulting latent stick-breaking process (LaSBP) induces a random partition of the index space, with points closer in space having a higher probability of being in the same cluster. We develop an efficient and straightforward Markov chain Monte Carlo (MCMC) algorithm for computation and discuss applications in financial econometrics and ecology. This article has supplementary material online.
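A minimal sketch of the atom-selection mechanism, assuming a probit-style construction: each atom gets a spatially correlated Gaussian field whose uniformised value is thresholded against the atom's stick-breaking weight, so nearby locations tend to pick the same atom. The squared-exponential kernel and all names are illustrative assumptions, not the authors' exact specification.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def lasbp_sample(locs, n_atoms=20, alpha=1.0, length_scale=0.5):
    """Draw one realisation of a latent stick-breaking process at `locs`."""
    n = len(locs)
    # Stick-breaking proportions: v_k ~ Beta(1, alpha).
    v = rng.beta(1.0, alpha, size=n_atoms)
    # Atom values (the support of the random marginal distribution).
    atoms = rng.normal(size=n_atoms)

    # Squared-exponential covariance across locations (illustrative choice).
    d = locs[:, None] - locs[None, :]
    cov = np.exp(-0.5 * (d / length_scale) ** 2) + 1e-8 * np.eye(n)
    L = np.linalg.cholesky(cov)

    # For each atom, draw a latent Gaussian field; a location selects the
    # first atom whose uniformised field value falls below its stick weight.
    z = np.full(n, n_atoms - 1)
    unassigned = np.ones(n, dtype=bool)
    for k in range(n_atoms):
        u = norm.cdf(L @ rng.normal(size=n))
        hit = unassigned & (u < v[k])
        z[hit] = k
        unassigned &= ~hit
    return atoms[z]

print(lasbp_sample(np.linspace(0.0, 1.0, 10)))
```

Marginally each location selects atom k with probability v_k * prod_{j<k}(1 - v_j), the usual stick-breaking weights, while the shared spatial kernel induces the clustering of nearby locations described above.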

Relevance: 100.00%

Abstract:

BACKGROUND: Sharing of epidemiological and clinical data sets among researchers is poor at best, to the detriment of science and the community at large. The purpose of this paper is therefore to (1) describe a novel Web application designed to share information on study data sets, focusing on epidemiological clinical research, in a collaborative environment, and (2) create a policy model placing this collaborative environment in the current scientific social context. METHODOLOGY: The Database of Databases application was developed based on feedback from epidemiologists and clinical researchers requiring a Web-based platform for sharing information about epidemiological and clinical study data sets in a collaborative environment; the platform should ensure that researchers can modify the information. Model-based predictions of the number of publications and the funding resulting from combinations of different policy implementation strategies (for metadata and data sharing) were generated using System Dynamics modeling. PRINCIPAL FINDINGS: The application allows researchers to easily upload information about clinical study data sets, which is searchable and modifiable by other users in a wiki environment. All modifications are filtered by the database principal investigator in order to maintain quality control. The application has been extensively tested and currently contains 130 clinical study data sets from the United States, Australia, China and Singapore. The model results indicated that any policy implementation would be better than the current strategy, that metadata sharing is better than data sharing, and that combined policies achieve the best results in terms of publications. CONCLUSIONS: Based on our empirical observations and the resulting model, the social network environment surrounding the application can help epidemiologists and clinical researchers contribute and search for metadata in a collaborative environment, potentially facilitating collaboration among research communities distributed around the globe.

Relevance: 100.00%

Abstract:

Software-based control of life-critical embedded systems has become increasingly complex, and to a large extent now determines human safety. For example, implantable cardiac pacemakers have over 80,000 lines of code responsible for maintaining the heart within safe operating limits. As firmware-related recalls accounted for over 41% of the 600,000 devices recalled in the last decade, there is a need for rigorous model-driven design tools to generate verified code from verified software models. To this end, we have developed the UPP2SF model-translation tool, which facilitates automatic conversion of verified models (in UPPAAL) to models that may be simulated and tested (in Simulink/Stateflow). We describe the translation rules that ensure correct model conversion, applicable to a large class of models. We demonstrate how UPP2SF is used in the model-driven design of a pacemaker whose model is (a) designed and verified in UPPAAL (using timed automata), (b) automatically translated to Stateflow for simulation-based testing, and then (c) automatically generated into modular code for hardware-level integration testing of timing-related errors. In addition, we show how UPP2SF may be used for worst-case execution time estimation early in the design stage. Using UPP2SF, we demonstrate the value of an integrated end-to-end modeling, verification, code-generation and testing process for complex software-controlled embedded systems.

Relevance: 100.00%

Abstract:

A commercial pyrometallurgical process for the extraction of platinum-group metals (PGM) from a feedstock slag was analysed using a model based on computational fluid dynamics. The modelling results indicate that recovery depends on the behaviour of the collector phase. A possible method is proposed for estimating the rate at which PGM particles in the slag are absorbed into an iron collector droplet falling through it. Nanoscale modelling techniques (for particle migration and capture) are combined with a diffusion-controlled mass-transfer model to determine the iron collector droplet size needed for >95% PGM recovery in a typical process bath (70 mm deep) within a realistic time-scale (<1 h). The results show that an iron droplet with a diameter in the range 0.1-0.3 mm gives good recovery (>90%) within a reasonable time, a finding compatible with published experimental data. Pyrometallurgical processes similar to the one investigated should be applicable to other types of waste that contain low levels of potentially valuable metals.
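A back-of-envelope check on the droplet-size/time trade-off, using Stokes' law for an iron droplet settling through a 70 mm slag bath. The slag viscosity and the densities below are illustrative assumptions, not values from the paper.

```python
def settling_time(d_droplet, depth=0.070, rho_fe=7000.0, rho_slag=2800.0,
                  mu_slag=0.5, g=9.81):
    """Stokes settling time (s) of an iron droplet through a slag bath.

    d_droplet: droplet diameter (m). Material properties are illustrative
    assumptions (molten slag viscosity ~0.5 Pa.s, densities in kg/m^3).
    """
    r = d_droplet / 2.0
    v = 2.0 * (rho_fe - rho_slag) * g * r**2 / (9.0 * mu_slag)  # Stokes velocity
    return depth / v

for d_mm in (0.1, 0.2, 0.3):
    t = settling_time(d_mm * 1e-3)
    print(f"d = {d_mm} mm -> settling time ~ {t:.0f} s")
```

Under these assumed properties a 0.1 mm droplet takes roughly 25 minutes to traverse the bath, consistent with the <1 h time-scale quoted above.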

Relevance: 100.00%

Abstract:

This paper describes an approach to modelling experiential knowledge in an industrial application of Case-Based Reasoning (CBR). The CBR system combines retrieval techniques with a relational database, specially designed as a repository of experiential knowledge and including qualitative search indices. The system is intended to help design engineers and materials engineers in the submarine cable industry. It consists of three parts: a materials database; a database of experiential knowledge; and a CBR system used to retrieve similar past designs based on qualitative descriptions of components and materials. The system is currently undergoing user testing at the Alcatel Submarine Networks site in Greenwich.
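A minimal sketch of retrieval over qualitative search indices, using a weighted exact-match similarity. The attributes, weights, and helper names are hypothetical, not the system's actual schema.

```python
def similarity(case_a, case_b, weights):
    """Weighted exact-match similarity over qualitative indices (illustrative)."""
    total = sum(weights.values())
    score = sum(w for attr, w in weights.items()
                if case_a.get(attr) == case_b.get(attr))
    return score / total

def retrieve(case_base, query, weights, k=3):
    """Return the k past designs most similar to the query description."""
    ranked = sorted(case_base, key=lambda c: similarity(c, query, weights),
                    reverse=True)
    return ranked[:k]

# Hypothetical qualitative descriptions of past cable designs.
cases = [
    {"insulation": "polyethylene", "armour": "double", "depth": "deep"},
    {"insulation": "polyethylene", "armour": "single", "depth": "shallow"},
]
weights = {"insulation": 2.0, "armour": 1.0, "depth": 1.0}
print(retrieve(cases, {"insulation": "polyethylene", "depth": "deep"}, weights, k=1))
```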

Relevance: 100.00%

Abstract:

Most air quality modelling work has so far been oriented towards deterministic simulations of ambient pollutant concentrations. This traditional approach, based on the use of one selected model and one set of discrete input values, does not reflect the uncertainties due to errors in model formulation and input data. Given the complexity of urban environments and the inherent limitations of mathematical modelling, it is unlikely that a single model based on routinely available meteorological and emission data will give satisfactory short-term predictions. In this study, different methods involving the use of more than one dispersion model, in association with different emission simulation methodologies and meteorological data sets, were explored for predicting best estimates of CO and benzene concentrations, and related confidence bounds. The different approaches were tested using experimental data obtained during intensive monitoring campaigns in busy street canyons in Paris, France. Three relatively simple dispersion models (STREET, OSPM and AEOLIUS) that are likely to be used for regulatory purposes were selected for this application. A sensitivity analysis was conducted to identify internal model parameters that might significantly affect results. Finally, a probabilistic methodology for assessing urban air quality was proposed.
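One simple way to turn a multi-model, multi-data-set ensemble into a best estimate with confidence bounds is sketched below. The prediction values are placeholders, and the median/percentile summary is an illustrative choice, not necessarily the study's exact methodology.

```python
import numpy as np

# Hypothetical hourly CO predictions (ppm) from three dispersion models,
# each run with two meteorological data sets: a 6-member ensemble.
predictions = np.array([
    [2.1, 2.4],   # STREET  (met set A, met set B)
    [1.8, 2.0],   # OSPM
    [2.6, 2.9],   # AEOLIUS
]).ravel()

best_estimate = np.median(predictions)
lower, upper = np.percentile(predictions, [5, 95])
print(f"CO estimate: {best_estimate:.2f} ppm "
      f"(90% confidence bounds: {lower:.2f}-{upper:.2f})")
```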

Relevance: 100.00%

Abstract:

A complete model of particle impact degradation during dilute-phase pneumatic conveying is developed, combining a degradation model based on the experimental determination of breakage matrices with a physical model of solids and gas flow in the pipeline. The solids flow in a straight pipe element is represented by a two-zone model: a strand-type flow zone immediately downstream of a bend, followed by a fully suspended flow region after dispersion of the strand. Breakage matrices constructed from data on 90° single-impact tests are shown to give a good representation of the degradation occurring in a 90° pipe bend. Numerical results are presented for the degradation of granulated sugar in a large-scale pneumatic conveyor.
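The core breakage-matrix operation can be illustrated compactly: the post-impact size distribution is the product of a column-stochastic breakage matrix with the inlet distribution, and successive bends chain the product. The 3x3 matrix values below are illustrative, not the experimentally determined matrices.

```python
import numpy as np

# Illustrative breakage matrix B for three size classes (coarse, medium, fine):
# column j gives the mass fractions into which class-j particles break on a
# 90-degree impact, so each column sums to 1.
B = np.array([
    [0.70, 0.00, 0.00],   # mass staying coarse
    [0.20, 0.80, 0.00],   # mass broken into medium
    [0.10, 0.20, 1.00],   # mass broken into fines
])

inlet = np.array([0.6, 0.3, 0.1])   # inlet mass fractions by size class
outlet = B @ inlet                  # distribution after one bend impact
print("after one bend:", outlet)

# Repeated impacts (e.g. several bends) chain the matrix product.
after_three_bends = np.linalg.matrix_power(B, 3) @ inlet
print("after three bends:", after_three_bends)
```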

Relevance: 100.00%

Abstract:

In this paper we propose a method for interpolation over a set of retrieved cases in the adaptation phase of the case-based reasoning cycle. The method has two advantages over traditional systems: first, it can predict "new" instances not yet present in the case base; second, it can predict solutions not present in the retrieval set. The method is a generalisation of Shepard's interpolation method, formulated as the minimisation of an error function defined in terms of distance metrics in the solution and problem spaces. We term the retrieval algorithm the Generalised Shepard Nearest Neighbour (GSNN) method. A novel aspect of GSNN is that it provides a general method for interpolation over nominal solution domains. The method is illustrated in the paper with reference to the Iris classification problem. It is evaluated on a simulated nominal-value test problem and on a benchmark case base from the travel domain. The algorithm is shown to outperform conventional nearest neighbour methods on these problems. Finally, GSNN is shown to improve in efficiency when used in conjunction with a diverse retrieval algorithm.
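A sketch of the GSNN idea under simple assumptions: Shepard weights derived from problem-space distances, and a candidate solution chosen to minimise the weighted squared solution-space error. The distance functions, the exponent p, and the toy data are illustrative.

```python
def gsnn_predict(query, retrieved, d_prob, d_sol, solution_space, p=2):
    """Generalised Shepard Nearest Neighbour prediction (illustrative sketch).

    retrieved:      list of (problem, solution) pairs.
    d_prob, d_sol:  distance metrics on the problem and solution spaces.
    solution_space: candidate solutions, which may include values absent
                    from the retrieval set (a key property of GSNN).
    Returns the candidate minimising the Shepard-weighted squared error.
    """
    weights = [1.0 / max(d_prob(query, prob), 1e-9) ** p
               for prob, _ in retrieved]

    def error(candidate):
        return sum(w * d_sol(candidate, sol) ** 2
                   for w, (_, sol) in zip(weights, retrieved))

    return min(solution_space, key=error)

# Toy nominal example: classes ordered low < medium < high,
# with solution-space distance given by rank difference.
rank = {"low": 0, "medium": 1, "high": 2}
d_sol = lambda a, b: abs(rank[a] - rank[b])
d_prob = lambda a, b: abs(a - b)

cases = [(1.0, "low"), (3.0, "high")]
print(gsnn_predict(2.0, cases, d_prob, d_sol, ["low", "medium", "high"]))
```

Note that the toy query returns "medium", a solution absent from the retrieval set, which is exactly the behaviour the abstract highlights.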