63 results for Artificial Information Models

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance:

90.00%

Publisher:

Abstract:

Motivated by recent models involving off-centre ignition of Type Ia supernova explosions, we undertake three-dimensional time-dependent radiation transport simulations to investigate the range of bolometric light-curve properties that could be observed from supernovae in which there is a lop-sided distribution of the products from nuclear burning. We consider both a grid of artificial toy models which illustrate the conceivable range of effects and a recent three-dimensional hydrodynamical explosion model. We find that observationally significant viewing angle effects are likely to arise in such supernovae and that these may have important ramifications for the interpretation of the observed diversity of Type Ia supernovae and the systematic uncertainties which relate to their use as standard candles in contemporary cosmology. © 2007 RAS.

Relevance:

40.00%

Publisher:

Abstract:

This paper presents two new approaches for use in complete process monitoring. The first concerns the identification of nonlinear principal component models. This involves the application of linear principal component analysis (PCA), prior to the identification of a modified autoassociative neural network (AAN) as the required nonlinear PCA (NLPCA) model. The benefits are that (i) the number of the reduced set of linear principal components (PCs) is smaller than the number of recorded process variables, and (ii) the set of PCs is better conditioned as redundant information is removed. The result is a new set of input data for a modified neural representation, referred to as a T2T network. The T2T NLPCA model is then used for complete process monitoring, involving fault detection, identification and isolation. The second approach introduces a new variable reconstruction algorithm, developed from the T2T NLPCA model. Variable reconstruction can enhance the findings of the contribution charts still widely used in industry by reconstructing the outputs from faulty sensors to produce more accurate fault isolation. These ideas are illustrated using recorded industrial data relating to developing cracks in an industrial glass melter process. A comparison of linear and nonlinear models, together with the combined use of contribution charts and variable reconstruction, is presented.
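
A minimal sketch of the general scheme described above, assuming a scikit-learn implementation: linear PCA compresses the recorded variables, an autoassociative network is trained on the retained scores, and a squared-prediction-error statistic flags faults. The data, variable names and control limit are illustrative; this is not the authors' exact T2T architecture.

```python
# Sketch: PCA compression of the process variables, an autoassociative network
# on the retained scores, and a squared prediction error (SPE) statistic for
# fault detection. Illustrative only; not the authors' exact T2T network.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 3))                              # hidden process drivers
X_train = latent @ rng.normal(size=(3, 12)) + 0.1 * rng.normal(size=(500, 12))
X_fault = X_train[:50].copy()
X_fault[:, 0] += 3.0                                            # simulated sensor bias fault

scaler = StandardScaler().fit(X_train)
Z_train = scaler.transform(X_train)

# Step 1: linear PCA removes redundancy and reduces the number of variables.
pca = PCA(n_components=5).fit(Z_train)
T_train = pca.transform(Z_train)

# Step 2: autoassociative network (scores -> scores) models the remaining nonlinearity.
ann = MLPRegressor(hidden_layer_sizes=(3,), activation="tanh",
                   max_iter=5000, random_state=0).fit(T_train, T_train)

def spe(X):
    """Squared prediction error of the score reconstruction (monitoring statistic)."""
    T = pca.transform(scaler.transform(X))
    return np.sum((T - ann.predict(T)) ** 2, axis=1)

limit = np.percentile(spe(X_train), 99)                         # crude control limit
print(f"{(spe(X_fault) > limit).sum()} of {len(X_fault)} faulty samples flagged")
```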

Relevance:

40.00%

Publisher:

Abstract:

The problem of model selection for a univariate long memory time series is investigated once a semiparametric estimator for the long memory parameter has been used. Standard information criteria are not consistent in this case. A Modified Information Criterion (MIC) that overcomes these difficulties is introduced and proofs that show its asymptotic validity are provided. The results are general and cover a wide range of short memory processes. Simulation evidence compares the new and existing methodologies, and empirical applications to monthly inflation and daily realized volatility are presented.
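
An illustrative pipeline for this setting, assuming a log-periodogram (GPH) estimator as the semiparametric step and BIC as a stand-in for the paper's modified penalty: estimate d, fractionally difference the series, then choose the short-memory AR order by the criterion.

```python
# Illustrative pipeline: GPH estimate of d, fractional differencing, then AR
# order selection by an information criterion (BIC here stands in for the
# paper's modified criterion).
import numpy as np

def gph_estimate(x, m=None):
    """Log-periodogram (GPH) regression estimate of the long-memory parameter d."""
    n = len(x)
    m = m or int(np.sqrt(n))
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    periodogram = np.abs(np.fft.fft(x - np.mean(x))[1:m + 1]) ** 2 / (2 * np.pi * n)
    regressor = np.log(4 * np.sin(freqs / 2) ** 2)
    return -np.polyfit(regressor, np.log(periodogram), 1)[0]

def frac_diff(x, d):
    """Apply (1 - B)^d through its (truncated) binomial expansion."""
    n = len(x)
    pi_k = np.empty(n)
    pi_k[0] = 1.0
    for k in range(1, n):
        pi_k[k] = pi_k[k - 1] * (k - 1 - d) / k
    return np.array([pi_k[:t + 1] @ x[t::-1] for t in range(n)])

def ar_bic(y, p):
    """BIC of an AR(p) model fitted by least squares."""
    n = len(y)
    if p == 0:
        resid = y - np.mean(y)
    else:
        lags = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
        design = np.column_stack([np.ones(n - p), lags])
        beta, *_ = np.linalg.lstsq(design, y[p:], rcond=None)
        resid = y[p:] - design @ beta
    return len(resid) * np.log(np.mean(resid ** 2)) + (p + 1) * np.log(len(resid))

rng = np.random.default_rng(1)
x = frac_diff(rng.normal(size=2000), -0.3)   # simulate a long-memory (d = 0.3) series
d_hat = gph_estimate(x)
y = frac_diff(x, d_hat)                      # remove the estimated long memory
bics = {p: ar_bic(y, p) for p in range(6)}
print(f"estimated d = {d_hat:.3f}, selected AR order = {min(bics, key=bics.get)}")
```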

Relevance:

40.00%

Publisher:

Abstract:

The BDI architecture, where agents are modelled based on their beliefs, desires and intentions, provides a practical approach to develop large scale systems. However, it is not well suited to model complex Supervisory Control And Data Acquisition (SCADA) systems pervaded by uncertainty. In this paper we address this issue by extending the operational semantics of Can(Plan) into Can(Plan)+. We start by modelling the beliefs of an agent as a set of epistemic states where each state, possibly using a different representation, models part of the agent's beliefs. These epistemic states are stratified to make them commensurable and to reason about the uncertain beliefs of the agent. The syntax and semantics of a BDI agent are extended accordingly and we identify fragments with computationally efficient semantics. Finally, we examine how primitive actions are affected by uncertainty and we define an appropriate form of lookahead planning.
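
A toy sketch of stratified epistemic states, not the CAN(Plan)+ semantics: each state keeps beliefs in its own representation, a stratification function maps local confidences onto a shared ordinal scale, and a belief query (as might be used when checking a plan's context condition) succeeds when some state supports the literal strongly enough. All names, scales and thresholds are illustrative assumptions.

```python
# Toy illustration: heterogeneous epistemic states made commensurable by
# mapping their local confidences onto a shared ordinal scale (strata).
# Names, scales and the thresholded query are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class EpistemicState:
    name: str
    beliefs: dict              # literal -> confidence in this state's own scale
    to_stratum: callable       # maps a local confidence onto a shared 0..k scale

    def support(self, literal):
        """Stratum (on the shared scale) supporting a literal, or 0 if it is absent."""
        return self.to_stratum(self.beliefs[literal]) if literal in self.beliefs else 0

@dataclass
class Agent:
    states: list = field(default_factory=list)

    def believes(self, literal, min_stratum=1):
        """Belief query: succeeds if some epistemic state supports the literal strongly enough."""
        return max((s.support(literal) for s in self.states), default=0) >= min_stratum

# Two heterogeneous sources: a probabilistic sensor model and a qualitative rule base.
sensor = EpistemicState("sensor", {"valve_open": 0.92},
                        to_stratum=lambda p: 2 if p > 0.9 else 1 if p > 0.6 else 0)
rules = EpistemicState("rules", {"pump_on": "high"},
                       to_stratum=lambda v: {"high": 2, "low": 1}.get(v, 0))

agent = Agent([sensor, rules])
# A plan's context condition might require strong support for both literals.
print("plan applicable:", agent.believes("valve_open", min_stratum=2) and agent.believes("pump_on"))
```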

Relevance:

30.00%

Publisher:

Abstract:

Our objective was to study whether “compensatory” models provide better descriptions of clinical judgment than fast and frugal models, according to expertise and experience. Fifty practitioners appraised 60 vignettes describing a child with an exacerbation of asthma and rated their propensities to admit the child. Linear logistic (LL) models of their judgments were compared with a matching heuristic (MH) model that searched available cues in order of importance for a critical value indicating an admission decision. There was a small difference between the two models in the proportion of patients allocated correctly (admit or not-admit decisions), 91.2% and 87.8%, respectively. The proportion allocated correctly by the LL model was lower for consultants than juniors, whereas the MH model performed equally well for both. In this vignette study, neither model described the judgments of consultants, or of pediatricians, better than those of other grades and specialties.
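
A small sketch of the comparison on synthetic vignettes, with invented cue names, weights and critical values: a logistic (compensatory) model weighs all cues, while the matching heuristic searches cues in order of importance and admits at the first critical value it finds.

```python
# Sketch: a compensatory logistic model versus a fast and frugal matching
# heuristic on synthetic asthma vignettes. Cues, weights and the critical
# values are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
cues = rng.integers(0, 2, size=(60, 3))     # e.g. severe wheeze, low SpO2, poor response
judgments = (cues @ np.array([2.0, 1.5, 1.0]) + rng.normal(scale=0.5, size=60)) > 2.0

# Compensatory model: every cue contributes to the admission propensity.
ll = LogisticRegression().fit(cues, judgments)
ll_pred = ll.predict(cues)

def matching_heuristic(x, order=(0, 1, 2), critical=1):
    """Admit as soon as any cue, searched in order of importance, shows its critical value."""
    return any(x[i] == critical for i in order)

mh_pred = np.array([matching_heuristic(x) for x in cues])
print("LL agreement with judgments:", (ll_pred == judgments).mean())
print("MH agreement with judgments:", (mh_pred == judgments).mean())
```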

Relevance:

30.00%

Publisher:

Abstract:

This study examines the relation between selection power and selection labor for information retrieval (IR). It is the first part of the development of a labor theoretic approach to IR. Existing models for evaluation of IR systems are reviewed and the distinction of operational from experimental systems partly dissolved. The often covert, but powerful, influence from technology on practice and theory is rendered explicit. Selection power is understood as the human ability to make informed choices between objects or representations of objects and is adopted as the primary value for IR. Selection power is conceived as a property of human consciousness, which can be assisted or frustrated by system design. The concept of selection power is further elucidated, and its value supported, by an example of the discrimination enabled by index descriptions, the discovery of analogous concepts in partly independent scholarly and wider public discourses, and its embodiment in the design and use of systems. Selection power is regarded as produced by selection labor, with the nature of that labor changing with different historical conditions and concurrent information technologies. Selection labor can itself be decomposed into description and search labor. Selection labor and its decomposition into description and search labor will be treated in a subsequent article, in a further development of a labor theoretic approach to information retrieval.

Relevance:

30.00%

Publisher:

Abstract:

This paper provides algorithms that use an information-theoretic analysis to learn Bayesian network structures from data. Based on our three-phase learning framework, we develop efficient algorithms that can effectively learn Bayesian networks, requiring only polynomial numbers of conditional independence (CI) tests in typical cases. We provide precise conditions that specify when these algorithms are guaranteed to be correct as well as empirical evidence (from real world applications and simulation tests) that demonstrates that these systems work efficiently and reliably in practice.
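
A minimal illustration of the kind of information-theoretic conditional-independence test such algorithms rely on, using conditional mutual information with a fixed threshold on a tiny simulated chain A -> B -> C; the data and threshold are illustrative, and this is not the full three-phase algorithm.

```python
# Minimal CI-test illustration: keep an edge between X and Y only if their
# conditional mutual information given Z exceeds a threshold.
import numpy as np
from collections import Counter

def cond_mutual_info(data, x, y, z=()):
    """Estimate I(X; Y | Z) in nats from discrete data (columns x, y and tuple z)."""
    n = len(data)
    z = tuple(z)
    count = lambda cols: Counter(map(tuple, data[:, list(cols)]))
    xyz, xz, yz = count((x, y) + z), count((x,) + z), count((y,) + z)
    zc = count(z) if z else Counter({(): n})
    cmi = 0.0
    for key, c in xyz.items():
        xk, yk, zk = key[0], key[1], key[2:]
        p_xyz, p_z = c / n, zc[zk] / n
        p_xz, p_yz = xz[(xk,) + zk] / n, yz[(yk,) + zk] / n
        cmi += p_xyz * np.log(p_xyz * p_z / (p_xz * p_yz))
    return cmi

rng = np.random.default_rng(0)
a = rng.integers(0, 2, 2000)
b = (a ^ (rng.random(2000) < 0.2)).astype(int)   # B depends on A (20% of values flipped)
c = (b ^ (rng.random(2000) < 0.2)).astype(int)   # C depends on B only
data = np.column_stack([a, b, c])

eps = 0.02   # illustrative threshold; the paper gives precise correctness conditions
print("A-B dependent:          ", cond_mutual_info(data, 0, 1) > eps)
print("A-C dependent:          ", cond_mutual_info(data, 0, 2) > eps)
print("A-C independent given B:", cond_mutual_info(data, 0, 2, z=(1,)) < eps)
```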

Relevance:

30.00%

Publisher:

Abstract:

Purpose
Information science has been conceptualized as a partly unreflexive response to developments in information and computer technology, and, most powerfully, as part of the gestalt of the computer. The computer was viewed as an historical accident in the original formulation of the gestalt. An alternative, and timely, approach to understanding, and then dissolving, the gestalt would be to address the motivating technology directly, fully recognizing it as a radical human construction. This paper aims to address these issues.

Design/methodology/approach
The paper adopts a social epistemological perspective and is concerned with collective, rather than primarily individual, ways of knowing.

Findings
In the language of discussions in information science, information technology tends to be received as objectively given, autonomously developing, and as causing but not itself caused. It has also been characterized as artificial, in the sense of unnatural, and sometimes as threatening. Attitudes to technology are implied rather than explicit, and can appear weak when articulated, corresponding to collective repression.

Research limitations/implications
Receiving technology as objectively given has an analogy with the Platonist view of mathematical propositions as discovered, in its exclusion of human activity, opening up the possibility of a comparable critique which insists on human agency.

Originality/value
Apprehensions of information technology have been raised to consciousness, exposing their limitations.

Relevance:

30.00%

Publisher:

Abstract:

Recently, several belief negotiation models have been introduced to deal with the problem of belief merging. A negotiation model usually consists of two functions: a negotiation function and a weakening function. A negotiation function is defined to choose the weakest sources, and these sources then weaken their point of view using a weakening function. However, the currently available belief negotiation models are based on classical logic, which makes it difficult to define weakening functions. In this paper, we define a prioritized belief negotiation model in the framework of possibilistic logic. The priority between formulae provides us with important information to decide which beliefs should be discarded. The problem of merging uncertain information from different sources is then solved in two steps. First, beliefs in the original knowledge bases are weakened to resolve inconsistencies among them. This step is based on a prioritized belief negotiation model. Second, the knowledge bases obtained in the first step are combined using a conjunctive operator which may have a reinforcement effect in possibilistic logic.
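
A toy sketch of the two-step idea under strong simplifying assumptions (formulas restricted to weighted literals, the weakest conflicting belief is weakened first, and a product-style conjunction supplies the reinforcement effect); the negotiation, weakening and combination operators in the paper are defined more generally.

```python
# Toy sketch of prioritized belief negotiation over literal-only knowledge
# bases: formulas are literals ("a" or "~a") weighted by a necessity degree.
# While the union is inconsistent, the source holding the least certain
# conflicting belief is weakened by dropping that belief; the surviving bases
# are then merged with a product-style conjunction that reinforces formulas
# asserted by several sources. Illustrative choices only.

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def conflicts(bases):
    """Triples (source index, literal, weight) involved in a contradiction."""
    return [(i, lit, w)
            for i, base in enumerate(bases)
            for lit, w in base.items()
            if any(negate(lit) in other for other in bases)]

def negotiate(bases):
    """Repeatedly weaken the least certain conflicting belief until consistency holds."""
    bases = [dict(b) for b in bases]
    while (cs := conflicts(bases)):
        i, lit, _ = min(cs, key=lambda c: c[2])   # weakest conflicting belief loses
        del bases[i][lit]
    return bases

def merge(bases):
    """Conjunctive combination with reinforcement: repeated support raises certainty."""
    merged = {}
    for base in bases:
        for lit, w in base.items():
            merged[lit] = 1 - (1 - merged.get(lit, 0.0)) * (1 - w)
    return merged

k1 = {"leak_detected": 0.8, "valve_open": 0.6}
k2 = {"~valve_open": 0.4, "pump_on": 0.7}
k3 = {"pump_on": 0.5}
print(merge(negotiate([k1, k2, k3])))
```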

Relevance:

30.00%

Publisher:

Abstract:

A problem with use of the geostatistical Kriging error for optimal sampling design is that the design does not adapt locally to the character of spatial variation. This is because a stationary variogram or covariance function is a parameter of the geostatistical model. The objective of this paper was to investigate the utility of non-stationary geostatistics for optimal sampling design. First, a contour data set of Wiltshire was split into 25 equal sub-regions and a local variogram was predicted for each. These variograms were fitted with models and the coefficients used in Kriging to select optimal sample spacings for each sub-region. Large differences existed between the designs for the whole region (based on the global variogram) and for the sub-regions (based on the local variograms). Second, a segmentation approach was used to divide a digital terrain model into separate segments. Segment-based variograms were predicted and fitted with models. Optimal sample spacings were then determined for the whole region and for the sub-regions. It was demonstrated that the global design was inadequate, grossly over-sampling some segments while under-sampling others.
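
A sketch of how a locally fitted variogram translates into a sample spacing, assuming an exponential variogram model and an ordinary-kriging variance evaluated at the centre of a square cell of four samples; the variogram coefficients and tolerance below are illustrative placeholders, not values fitted to the Wiltshire or terrain data.

```python
# Sketch: ordinary kriging variance at the centre of a square cell of four
# samples under an exponential variogram; the widest spacing keeping the
# variance under a tolerance is chosen for each sub-region.
import numpy as np

def exp_variogram(h, nugget, sill, vrange):
    return np.where(h > 0, nugget + (sill - nugget) * (1 - np.exp(-3 * h / vrange)), 0.0)

def ok_variance(spacing, nugget, sill, vrange):
    """Ordinary kriging variance at the centre of a square of four samples."""
    pts = np.array([[0, 0], [spacing, 0], [0, spacing], [spacing, spacing]], float)
    target = np.array([spacing / 2, spacing / 2])
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    # Ordinary kriging system in semivariance form, with a Lagrange multiplier.
    A = np.ones((5, 5))
    A[:4, :4] = exp_variogram(d, nugget, sill, vrange)
    A[4, 4] = 0.0
    b = np.ones(5)
    b[:4] = exp_variogram(np.linalg.norm(pts - target, axis=1), nugget, sill, vrange)
    return float(np.linalg.solve(A, b) @ b)

def max_spacing(nugget, sill, vrange, tol, candidates=np.arange(50, 2050, 50)):
    """Widest candidate grid spacing whose kriging variance stays within tolerance."""
    ok = [s for s in candidates if ok_variance(s, nugget, sill, vrange) <= tol]
    return max(ok) if ok else None

# Two sub-regions with different locally fitted variograms need different designs.
print("smooth sub-region spacing:", max_spacing(nugget=1, sill=20, vrange=2000, tol=15))
print("rough sub-region spacing: ", max_spacing(nugget=4, sill=40, vrange=600, tol=15))
```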

Relevance:

30.00%

Publisher:

Abstract:

Models of currency competition focus on the 5% of trading attributable to balance-of-payments flows. We introduce an information approach that focuses on the other 95%. Important departures from traditional models arise when transactions convey information. First, prices reveal different information depending on whether trades are direct or through vehicle currencies. Second, missing markets arise due to insufficiently symmetric information, rather than insufficient transactions scale. Third, the indeterminacy of equilibrium that arises in traditional models is resolved: currency trade patterns no longer concentrate arbitrarily on market size. Empirically, we provide a first analysis of transactions across a full market triangle: the euro, yen and US dollar. The estimated transaction effects on prices support the information approach.

Relevance:

30.00%

Publisher:

Abstract:

The eng-genes concept involves the use of fundamental known system functions as activation functions in a neural model to create a 'grey-box' neural network. One of the main issues in eng-genes modelling is to produce a parsimonious model given a model construction criterion. The challenges are that (1) the eng-genes model in most cases is a heterogeneous network consisting of more than one type of nonlinear basis function, and each basis function may have a different set of parameters to be optimised; and (2) the number of hidden nodes has to be chosen based on a model selection criterion. This is a hard mixed-integer problem, and this paper investigates the use of a forward selection algorithm to optimise both the network structure and the parameters of the system-derived activation functions. Results are included from case studies performed on a simulated continuously stirred tank reactor process and on actual data from a pH neutralisation plant. The resulting eng-genes networks demonstrate superior simulation performance and transparency over a range of network sizes when compared to conventional neural models. (c) 2007 Elsevier B.V. All rights reserved.
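
A sketch of forward selection over a heterogeneous pool of candidate hidden nodes, under simplifying assumptions: candidate inner parameters are drawn at random, output weights are refitted by least squares at each step, and a BIC-like criterion decides when to stop growing; the paper's algorithm also optimises the parameters of the system-derived activation functions themselves.

```python
# Sketch: greedy forward selection from a heterogeneous pool of candidate
# hidden nodes (tanh, Gaussian and sigmoid activations with random inner
# parameters); output weights are refitted by least squares and growth stops
# when a BIC-like criterion no longer improves.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(400, 2))
y = np.exp(-X[:, 0] ** 2) + 0.5 * np.tanh(2 * X[:, 1]) + rng.normal(scale=0.05, size=400)

activations = [np.tanh, lambda u: np.exp(-u ** 2), lambda u: 1 / (1 + np.exp(-u))]

def make_candidates(n_per_type=30):
    """Candidate nodes: (activation, random input weights, random bias)."""
    return [(act, rng.normal(size=X.shape[1]), rng.normal())
            for act in activations for _ in range(n_per_type)]

def bic(residual, k):
    n = len(residual)
    return n * np.log(np.mean(residual ** 2)) + k * np.log(n)

def forward_select(candidates, max_nodes=10):
    chosen, best = [], bic(y - y.mean(), 1)
    while len(chosen) < max_nodes and candidates:
        scored = []
        for idx, cand in enumerate(candidates):
            cols = [act(X @ w + b) for act, w, b in chosen + [cand]]
            H = np.column_stack([np.ones(len(X))] + cols)
            theta, *_ = np.linalg.lstsq(H, y, rcond=None)
            scored.append((bic(y - H @ theta, H.shape[1]), idx))
        cand_bic, best_idx = min(scored, key=lambda s: s[0])
        if cand_bic >= best:
            break                     # criterion no longer improves: stop growing
        best = cand_bic
        chosen.append(candidates.pop(best_idx))
    return chosen, best

nodes, score = forward_select(make_candidates())
print(f"selected {len(nodes)} hidden nodes, criterion value = {score:.1f}")
```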