824 results for decentralised data fusion framework


Relevance:

30.00%

Publisher:

Abstract:

Fault diagnosis has become an important component of intelligent systems, such as intelligent control systems and intelligent eLearning systems. Reiter's diagnosis theory, described by first-order sentences, has attracted much attention in this field. However, descriptions and observations of most real-world situations involve fuzziness because of the incompleteness and uncertainty of knowledge, e.g., the fault diagnosis of student behaviors in eLearning processes. In this paper, an extension of Reiter's consistency-based diagnosis methodology, Fuzzy Diagnosis, is proposed, which is able to deal with incomplete or fuzzy knowledge. A number of important properties of the Fuzzy Diagnosis scheme have also been established. The computation of fuzzy diagnoses is reduced to solving a system of inequalities. Some special cases, abstracted from real-world situations, are discussed. In particular, the fuzzy diagnosis problem in which fuzzy observations are represented by clause-style fuzzy theories is presented, and a method for solving it is given. A student fault-diagnosis problem abstracted from a simplified real-world eLearning case is described to demonstrate the application of our diagnostic framework.
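
As a sketch of how such a consistency-based scheme might be computed, the Python fragment below enumerates minimal diagnoses under a fuzzy (min-based) consistency check; the rule base, observation degrees and threshold are illustrative assumptions, not the paper's exact formulation.

```python
# A minimal sketch of consistency-based diagnosis in the spirit of
# Reiter's theory, extended with fuzzy truth degrees (min/max
# semantics). Rules, degrees and the threshold are assumptions.
from itertools import combinations

def fuzzy_consistency(assumed_ok, rules, observations):
    """Degree to which 'all components in assumed_ok are normal'
    is compatible with the observations (min-conjunction)."""
    degree = 1.0
    for premise, conclusion in rules:        # rule: premise -> conclusion
        if premise <= assumed_ok:            # rule fires
            degree = min(degree, observations.get(conclusion, 1.0))
    return degree

def fuzzy_diagnoses(components, rules, observations, threshold=0.5):
    """Minimal sets D such that assuming COMP \\ D normal remains
    consistent with the observations to at least `threshold`."""
    found = []
    for size in range(len(components) + 1):
        for delta in combinations(sorted(components), size):
            if any(set(d) <= set(delta) for d in found):
                continue                     # keep diagnoses minimal
            rest = components - set(delta)
            if fuzzy_consistency(rest, rules, observations) >= threshold:
                found.append(delta)
    return found

# Toy eLearning case: normal 'reading' and 'practice' behaviour should
# yield good quiz results; quiz_ok is observed only to degree 0.3.
components = {"reading", "practice", "attendance"}
rules = [({"reading", "practice"}, "quiz_ok"),
         ({"attendance"}, "participation_ok")]
observations = {"quiz_ok": 0.3, "participation_ok": 0.8}
print(fuzzy_diagnoses(components, rules, observations))
# -> [('practice',), ('reading',)]
```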

Relevance:

30.00%

Publisher:

Abstract:

Fundamental principles of precaution are legal maxims that ask for preventive actions, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, to make current decisions while recognizing that future information can contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: how can society manage potentially severe or irreversible environmental outcomes when variability, uncertainty, and limited causal knowledge characterize its decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes and new information, and are consistent and replicable. Rational choice of an action from among various alternatives, defined as a choice that makes preferred consequences more likely, requires accounting for costs, benefits and the change in risks associated with each candidate action. Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of expected value of information (VOI), to show the relevance of new information relative to the initial (and smaller) set of data on which the decision was based. We exemplify this seemingly simple situation using risk management of BSE. As an integral aspect of causal analysis under risk, the methods developed in this paper permit the addition of non-linear, hormetic dose-response models to the current set of regulatory defaults, such as the linear, non-threshold models. This increase in the number of defaults is an important improvement because most variants of the precautionary principle require cost-benefit balancing. Specifically, increasing the set of causal defaults accounts for beneficial effects at very low doses. We also show and conclude that quantitative risk assessment dominates qualitative risk assessment, supporting the extension of the set of default causal models.
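
The VOI link can be made concrete with a small worked example: for a two-action, two-state precautionary decision, the expected value of perfect information is the gain from learning the hazard state before acting. The payoffs and prior below are illustrative assumptions, not values from the paper.

```python
# A minimal sketch of the expected-value-of-information (VOI) idea
# linked to precautionary decisions. Actions, states and payoffs
# are illustrative assumptions.
import numpy as np

payoff = np.array([             # rows: actions, cols: states
    [-10.0, -10.0],             # act now (costly, but caps the harm)
    [  0.0, -100.0],            # wait (free if safe, severe if not)
])
prior = np.array([0.7, 0.3])    # P(hazard absent), P(hazard present)

# Best expected payoff deciding under the prior alone.
best_without_info = (payoff @ prior).max()

# With perfect information we learn the state first, then pick the
# best action for that state; average over states.
best_with_info = (payoff.max(axis=0) * prior).sum()

evpi = best_with_info - best_without_info
print(f"Decide now: {best_without_info:.1f}  "
      f"With perfect info: {best_with_info:.1f}  EVPI: {evpi:.1f}")
# -> Decide now: -10.0  With perfect info: -3.0  EVPI: 7.0
```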

Relevance:

30.00%

Publisher:

Abstract:

The cyclotides are a family of disulfide-rich proteins from plants. They have the characteristic structural features of a circular protein backbone and a knotted arrangement of disulfide bonds. Structural and biochemical studies of the cyclotides suggest that their unique physiological stability can be lent to bioactive peptide fragments for pharmaceutical and agricultural development. In particular, the cyclotides incorporate a number of solvent-exposed loops that are potentially suitable for epitope-grafting applications. Here, we determine the structure of the largest known cyclotide, palicourein, which has an atypical size and composition within one of the surface-exposed loops. The structural data show that an increase in the size of a palicourein loop does not perturb the core fold, to which the thermodynamic and chemical stability has been attributed. The cyclotide core fold can thus, in principle, be used as a framework for the development of useful pharmaceutical and agricultural bioactivities.

Relevance:

30.00%

Publisher:

Abstract:

The paper presents a framework for small area population estimation that enables users to select a method that is fit for purpose. The adjustments to input data that are needed before use are outlined, with emphasis on developing consistent time series of inputs. We show how geographical harmonization of small areas, which is crucial for comparisons over time, can be achieved. For two study regions, the East of England and Yorkshire and the Humber, the differences in output and the consequences of adopting different methods are illustrated. The paper concludes with a discussion of how data sources that have come on stream since 1998 might be included in future small area estimates.
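
One common way to achieve the geographical harmonization described above is areal weighting, which re-apportions counts published on superseded zones onto a consistent target geography in proportion to overlap. A minimal sketch, with hypothetical zone names and overlap shares:

```python
# A minimal sketch of geographical harmonisation by areal weighting.
# Zone names, counts and overlap shares are illustrative assumptions.
overlap = {                      # share of each source zone's area
    ("old_A", "tract_1"): 0.6,   # falling in each target zone
    ("old_A", "tract_2"): 0.4,
    ("old_B", "tract_2"): 1.0,
}
counts_1998 = {"old_A": 5000, "old_B": 3000}

harmonised = {}
for (src, dst), w in overlap.items():
    harmonised[dst] = harmonised.get(dst, 0.0) + counts_1998[src] * w

print(harmonised)   # {'tract_1': 3000.0, 'tract_2': 5000.0}
```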

Relevance:

30.00%

Publisher:

Abstract:

We present results of the application of density functional theory (DFT) to adsorption and desorption in finite and infinite cylindrical pores, accounting for the density distribution in the radial and axial directions. Capillary condensation via the formation of bridges is considered using canonical and grand canonical versions of 2D DFT. The potential barrier of nucleation is determined as a function of the bulk pressure and the pore diameter. In the framework of the conventional assumptions on intermolecular interactions, both the 1D and 2D DFT versions lead to the same results and confirm the classical scenario of condensation and evaporation: condensation occurs at the vapor-like spinodal point, and evaporation corresponds to the equilibrium transition pressure. The analysis of experimental data on argon and nitrogen adsorption on MCM-41 samples does not seem to fully corroborate this scenario, with the adsorption branch being better described by the equilibrium pressure-diameter dependence. This points to the need for further development of the basic understanding of hysteresis phenomena.
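
For reference, the variational principle underlying such DFT calculations can be stated compactly (standard notation, not specific to this paper): the equilibrium density profile minimizes the grand potential functional,

```latex
\Omega[\rho] = F[\rho] + \int \rho(\mathbf{r})\,
  \bigl(V_{\mathrm{ext}}(\mathbf{r}) - \mu\bigr)\,d\mathbf{r},
\qquad
\frac{\delta \Omega[\rho]}{\delta \rho(\mathbf{r})} = 0,
```

where F[rho] is the intrinsic Helmholtz free energy functional, V_ext the pore-wall potential, and mu the chemical potential fixed by the bulk pressure.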

Relevance:

30.00%

Publisher:

Abstract:

Adsorption of pure nitrogen, argon, acetone, chloroform and an acetone-chloroform mixture on graphitized thermal carbon black is considered at sub-critical conditions by means of the molecular layer structure theory (MLST). In the present version of the MLST, an adsorbed fluid is treated as a sequence of 2D molecular layers whose Helmholtz free energies are obtained directly from the analysis of experimental adsorption isotherms of the pure components. The interaction of the nearest layers is accounted for in the framework of a mean-field approximation. This approach allows quantitative correlation of experimental nitrogen and argon adsorption isotherms both in the monolayer region and in the range of multi-layer coverage up to 10 molecular layers. In the case of acetone and chloroform, the approach also leads to excellent quantitative correlation of adsorption isotherms, while molecular approaches such as the non-local density functional theory (NLDFT) fail to describe those isotherms. We extend the method to calculate the Helmholtz free energy of an adsorbed mixture using a simple mixing rule, which allows us to predict mixture adsorption isotherms from pure-component adsorption isotherms. The approach, which accounts for the difference in composition between molecular layers, is tested against experimental data for acetone-chloroform mixture (a non-ideal mixture) adsorption on graphitized thermal carbon black at 50 degrees C. (C) 2005 Elsevier Ltd. All rights reserved.
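
As an illustration of predicting mixture layer properties from pure-component data, the sketch below combines hypothetical pure-layer Helmholtz free energies with a linear mixing rule plus an ideal mixing-entropy term; the abstract does not spell out the paper's exact rule, so this form is an assumption.

```python
# A minimal sketch of a simple mixing rule for the Helmholtz free
# energy of one molecular layer of a binary adsorbed mixture.
# The rule and all numerical values are illustrative assumptions.
import math

R = 8.314          # J/(mol K)
T = 323.15         # 50 degrees C

def layer_free_energy_mix(x1, f1_pure, f2_pure):
    """Free energy of a 1-2 mixture layer: linear combination of the
    pure-layer values plus ideal mixing entropy."""
    x2 = 1.0 - x1
    ideal = R * T * sum(x * math.log(x) for x in (x1, x2) if x > 0)
    return x1 * f1_pure + x2 * f2_pure + ideal

# Acetone-chloroform layer with hypothetical pure-layer free energies.
print(layer_free_energy_mix(0.4, f1_pure=-21e3, f2_pure=-24e3))
```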

Relevance:

30.00%

Publisher:

Abstract:

New tools derived from advances in molecular biology have not been widely adopted in plant breeding for complex traits because of the inability to connect information at the gene level to the phenotype in a manner that is useful for selection. In this study, we explored whether physiological dissection and integrative modelling of complex traits could link phenotype complexity to underlying genetic systems in a way that enhances the power of molecular breeding strategies. A crop and breeding system simulation study on sorghum was used, involving variation in 4 key adaptive traits (phenology, osmotic adjustment, transpiration efficiency, and stay-green) and a broad range of production environments in north-eastern Australia. The full matrix of simulated phenotypes, which consisted of 547 location-season combinations and 4235 genotypic expression states, was analysed for genetic and environmental effects. The analysis was conducted in stages, assuming gradually increased understanding of gene-to-phenotype relationships, as would arise from physiological dissection and modelling. It was found that environmental characterisation and physiological knowledge helped to explain and unravel gene and environment context dependencies in the data. Based on the analyses of gene effects, a range of marker-assisted selection breeding strategies was simulated. It was shown that the inclusion of knowledge resulting from trait physiology and modelling generated an enhanced rate of yield advance over cycles of selection. This occurred because the knowledge associated with component-trait physiology, extrapolated to the target population of environments by modelling, removed confounding effects associated with environment and gene context dependencies for the markers used. Developing and implementing this gene-to-phenotype capability in crop improvement requires enhanced attention to phenotyping, ecophysiological modelling, and validation studies to test the stability of candidate genetic regions.
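
A highly simplified sketch of such a breeding-system simulation follows, with additive marker-tagged gene effects and environmental noise; all effect sizes, population settings and the selection scheme are illustrative assumptions, not the study's configuration.

```python
# A minimal sketch of marker-assisted selection over breeding cycles:
# additive gene effects plus environmental noise, selection on a
# marker score, and random mating of the selected parents.
import numpy as np

rng = np.random.default_rng(1)
n_loci, pop, cycles, keep = 4, 200, 5, 20

effects = np.array([0.5, 0.3, 0.2, 0.1])      # additive yield effects
geno = rng.integers(0, 2, size=(pop, n_loci)) # favourable allele = 1

for cycle in range(cycles):
    env = rng.normal(0, 0.5, size=pop)        # environment / GxE noise
    yield_ = geno @ effects + env
    marker_score = geno @ effects             # markers tag known loci
    parents = geno[np.argsort(marker_score)[-keep:]]
    # Next generation: random mating of the selected parents.
    mum = parents[rng.integers(0, keep, pop)]
    dad = parents[rng.integers(0, keep, pop)]
    mask = rng.integers(0, 2, size=(pop, n_loci)).astype(bool)
    geno = np.where(mask, mum, dad)
    print(f"cycle {cycle}: mean yield {yield_.mean():.2f}")
```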

Relevance:

30.00%

Publisher:

Abstract:

Beyond the inherent technical challenges, current research into the three-dimensional surface correspondence problem is hampered by a lack of uniform terminology, an abundance of application-specific algorithms, and the absence of a consistent model for comparing existing approaches and developing new ones. This paper addresses these challenges by presenting a framework for analysing, comparing, developing, and implementing surface correspondence algorithms. The framework uses five distinct stages to establish correspondence between surfaces. It is general, encompassing a wide variety of existing techniques, and flexible, facilitating the synthesis of new correspondence algorithms. This paper presents a review of existing surface correspondence algorithms and shows how they fit into the correspondence framework. It also shows how the framework can be used to analyse and compare existing algorithms and to develop new algorithms using the framework's modular structure. Six algorithms, four existing and two new, are implemented using the framework. Each implemented algorithm is used to match a number of surface pairs. Results demonstrate that the correspondence framework implementations are faithful to the existing algorithms and that powerful new surface correspondence algorithms can be created. (C) 2004 Elsevier Inc. All rights reserved.
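
The modular, staged character of such a framework can be sketched as composable stages, where swapping one stage yields a new correspondence algorithm. The five stage names and their signatures below are hypothetical, not the paper's definitions.

```python
# A minimal sketch of a staged surface-correspondence pipeline.
# Stage names and data shapes are illustrative assumptions.
from typing import Any, Callable

Stage = Callable[[Any], Any]

def make_pipeline(*stages: Stage) -> Stage:
    """Compose stages; replacing one stage yields a new algorithm."""
    def run(data: Any) -> Any:
        for stage in stages:
            data = stage(data)
        return data
    return run

# Hypothetical stages for matching a pair of surfaces.
preprocess = lambda pair: pair                        # e.g. smoothing
features   = lambda pair: {"pair": pair, "desc": []}  # descriptors
candidates = lambda s: {**s, "cand": []}              # putative matches
refine     = lambda s: {**s, "matches": s["cand"]}    # prune/optimise
output     = lambda s: s["matches"]                   # final mapping

correspond = make_pipeline(preprocess, features, candidates,
                           refine, output)
print(correspond(("surface_A", "surface_B")))   # -> []
```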

Relevance:

30.00%

Publisher:

Abstract:

Electricity market price forecasting is a challenging yet very important task for electricity market managers and participants. Due to the complexity and uncertainties in the power grid, electricity prices are highly volatile and normally carry spikes, which may be tens or even hundreds of times higher than the normal price. Such electricity price spikes are very difficult to predict. So far, most research on electricity price forecasting has been based on normal-range electricity prices. This paper proposes a data-mining-based electricity price forecast framework, which can predict the normal price as well as price spikes. The normal price is predicted by a previously proposed wavelet and neural network based forecast model, while the spikes are forecast using a data mining approach. This paper focuses on spike prediction and explores the reasons for price spikes based on the measurement of a proposed composite supply-demand balance index (SDI) and a relative demand index (RDI). These indices are able to reflect the relationship among electricity demand, electricity supply and electricity reserve capacity. The proposed model is based on a mining database including market clearing price, trading hour, electricity demand, electricity supply and reserve. Bayesian classification and similarity searching techniques are used to mine the database and find the internal relationships between electricity price spikes and the proposed indices. The mining results are used to form the price spike forecast model. The proposed model is able to generate the forecasted price spike, the level of the spike, and an associated forecast confidence level. The model is tested with Queensland electricity market data with promising results. Crown Copyright (C) 2004 Published by Elsevier B.V. All rights reserved.
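
A minimal sketch of the Bayesian-classification step on supply-demand features such as SDI and RDI is given below; the training rows and feature values are fabricated for illustration, and scikit-learn's GaussianNB stands in for the paper's classifier.

```python
# A minimal sketch of a Bayesian spike classifier on supply-demand
# features like SDI and RDI. All data values are illustrative.
import numpy as np
from sklearn.naive_bayes import GaussianNB

# columns: [SDI, RDI]; label 1 = price spike in that trading hour
X = np.array([[0.95, 0.70], [0.99, 0.60], [0.70, 0.98],
              [0.65, 0.95], [0.90, 0.80], [0.60, 0.99]])
y = np.array([0, 0, 1, 1, 0, 1])

clf = GaussianNB().fit(X, y)

hour = np.array([[0.68, 0.97]])           # tight supply, high demand
prob = clf.predict_proba(hour)[0, 1]
print(f"spike probability: {prob:.2f}")   # forecast confidence level
```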

Relevance:

30.00%

Publisher:

Abstract:

With mixed feature data, problems arise in modeling the gating network of normalized Gaussian (NG) networks, as the assumption of multivariate Gaussianity becomes invalid. In this paper, we propose an independence model to handle mixed feature data within the framework of NG networks. The method is illustrated using a real example of breast cancer data.
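
A minimal sketch of such an independence model for one gating unit follows: each feature gets its own one-dimensional density (Gaussian for continuous features, a categorical table for discrete ones) and the joint density is their product. The features and parameters are illustrative assumptions.

```python
# A minimal sketch of an independence model for mixed feature data:
# p(x | unit) = prod_j p(x_j | unit). All parameters are assumptions.
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / \
           (sigma * math.sqrt(2 * math.pi))

def unit_density(x_cont, x_disc, unit):
    """Joint density of mixed features under independence."""
    p = 1.0
    for x, (mu, sigma) in zip(x_cont, unit["gaussians"]):
        p *= gauss(x, mu, sigma)
    for x, table in zip(x_disc, unit["categoricals"]):
        p *= table[x]
    return p

# One gating unit over (tumour_size, age) and a binary marker.
unit = {"gaussians": [(2.5, 1.0), (55.0, 10.0)],
        "categoricals": [{0: 0.8, 1: 0.2}]}
print(unit_density([3.0, 60.0], [1], unit))
```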

Relevance:

30.00%

Publisher:

Abstract:

Reliable, comparable information about the main causes of disease and injury in populations, and how these are changing, is a critical input for debates about priorities in the health sector. Traditional sources of information about the descriptive epidemiology of diseases, injuries and risk factors are generally incomplete, fragmented and of uncertain reliability and comparability. The lack of a standardized measurement framework to permit comparisons across diseases, injuries and risk factors, and the failure to systematically evaluate data quality, have impeded comparative analyses of the true public health importance of various conditions and risk factors. As a consequence, the impact of major conditions and hazards on population health has been poorly appreciated, often leading to a lack of public health investment. Global disease and risk factor quantification improved dramatically in the early 1990s with the completion of the first Global Burden of Disease Study. For the first time, the comparative importance of over 100 diseases and injuries, and ten major risk factors, for global and regional health status could be assessed using a common metric (Disability-Adjusted Life Years) that simultaneously accounted for both premature mortality and the prevalence, duration and severity of the non-fatal consequences of disease and injury. As a consequence, mental health conditions and injuries, for which non-fatal outcomes are of particular significance, were identified as being among the leading causes of disease/injury burden worldwide, with clear implications for policy, particularly prevention. A major achievement of the Study was the complete global descriptive epidemiology, including incidence, prevalence and mortality, by age, sex and region, of over 100 diseases and injuries. National applications, further methodological research and an increase in data availability have led to improved national, regional and global estimates for 2000, but substantial uncertainty around the disease burden caused by major conditions, including HIV, remains. The rapid implementation of cost-effective data collection systems in developing countries is a key priority if global public policy to promote health is to be more effectively informed.
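
For reference, the DALY metric in its basic (undiscounted, unweighted) form sums years of life lost to premature mortality and years lived with disability:

```latex
\mathrm{DALY} = \mathrm{YLL} + \mathrm{YLD},
\qquad
\mathrm{YLL} = N \cdot L,
\qquad
\mathrm{YLD} = I \cdot DW \cdot L_d,
```

where N is the number of deaths, L the standard life expectancy at the age of death, I the number of incident cases, DW the disability weight, and L_d the average duration of disability.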

Relevance:

30.00%

Publisher:

Abstract:

Equilibrium adsorption data for nitrogen on a series of nongraphitized carbon blacks and nonporous silica at 77 K were analyzed by means of classical density functional theory to determine the solid-fluid potential. Particular attention is paid to the behavior of this potential profile at large distances. The analysis of nitrogen adsorption isotherms seems to indicate that adsorption in the first molecular layer is localized and controlled mainly by short-range forces due to surface roughness, crystalline defects, and functional groups. At distances larger than approximately 1.3-1.5 molecular diameters, the adsorption is nonlocalized and appears as a thickening of the adsorbed film with increasing bulk pressure in a relatively weak adsorption potential field. It has been found that the asymptotic decay of the potential obeys a power law with exponent -3 for carbon blacks and -4 for the silica surface, which signifies that in the latter case the adsorption potential is exerted mainly by surface oxygen atoms. In all cases, the absolute value of the solid-fluid potential is much smaller than that predicted by the Lennard-Jones pair potential with commonly used solid-fluid molecular parameters. The effect of surface heterogeneity on the heat of adsorption is also discussed.
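
The reported exponents follow from integrating the attractive dispersion tail of the pair potential, -C6/r^6, over the adsorbent: a half-space of atomic density rho_s gives a z^-3 decay, while a single surface plane of areal density sigma_s gives z^-4 (standard results, not taken from the paper):

```latex
u_{\mathrm{sf}}(z) \sim -\frac{\pi \rho_s C_6}{6\,z^{3}}
\quad\text{(half-space)},
\qquad
u_{\mathrm{sf}}(z) \sim -\frac{\pi \sigma_s C_6}{2\,z^{4}}
\quad\text{(single plane)}.
```

The z^-4 case corresponds to the finding that the silica potential is exerted mainly by a layer of surface oxygen atoms rather than by the bulk solid.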

Relevance:

30.00%

Publisher:

Abstract:

Aim: To present an evidence-based framework to improve the quality of occupational therapy expert opinions on work capacity for litigation, compensation and insurance purposes. Methods: Grounded theory methodology was used to collect and analyse data from a sample of 31 participants, comprising 19 occupational therapists, 6 medical specialists and 6 lawyers. A focused semistructured interview was completed with each participant. In addition, 20 participants verified the key findings. Results: The framework is contextualised within a medicolegal system requiring increasing expertise. The framework consists of (i) broad professional development strategies and principles, and (ii) specific strategies and principles for improving opinions through reporting and assessment practices. Conclusions: The synthesis of the participants' recommendations provides systematic guidelines for improving occupational therapy expert opinion on work capacity.

Relevance:

30.00%

Publisher:

Abstract:

Evolutionary change results from selection acting on genetic variation. For migration to be successful, many different aspects of an animal's physiology and behaviour need to function in a coordinated way. Changes in one migratory trait are therefore likely to be accompanied by changes in other migratory and life-history traits. At present, we have some knowledge of the pressures that operate at the various stages of migration, but we know very little about the extent of genetic variation in the various aspects of the migratory syndrome. As a consequence, our ability to predict which species is capable of what kind of evolutionary change, and at what rate, is limited. Here, we review how our evolutionary understanding of migration may benefit from taking a quantitative-genetic approach and present a framework for studying the causes of phenotypic variation. We review past research, which has mainly studied single migratory traits in captive birds, and discuss how this work could be extended to study genetic variation in the wild and to account for genetic correlations and correlated selection. In the future, reaction-norm approaches may become very important, as they allow the study of genetic and environmental effects on phenotypic expression, as well as their interactions, within a single framework. We advocate making more use of repeated measurements on single individuals to study the causes of among-individual variation in the wild, as they are easier to obtain than data on relatives and can provide valuable information for identifying and selecting traits. This approach will be particularly informative if it involves systematic testing of individuals under different environmental conditions. We propose extending this research agenda by using optimality models to predict levels of variation and covariation among traits and constraints. This may help us to select traits in which we might expect genetic variation and to identify the most informative environmental axes. We also recommend expanding this work beyond the passerine model, which does not apply to birds, such as geese, in which cultural transmission of spatio-temporal information is an important determinant of migration patterns and their variation.
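
The quantitative-genetic reasoning above rests on the standard decomposition of phenotypic variation and the breeder's equation (textbook relations in standard notation, not specific to this review):

```latex
P = G + E,
\qquad
h^{2} = \frac{V_A}{V_P},
\qquad
R = h^{2}\,S,
```

where P, G and E are the phenotypic, genotypic and environmental values, V_A the additive genetic variance, V_P the total phenotypic variance, h^2 the narrow-sense heritability, R the response to selection, and S the selection differential. Predicting rates of evolutionary change in migratory traits thus requires estimates of V_A, which is precisely what is lacking for most aspects of the migratory syndrome.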