63 results for Artificial Information Models


Relevance:

30.00%

Publisher:

Abstract:

A conceptual model is described for generating distributions of grazing animals, according to their searching behavior, to investigate the mechanisms animals may use to achieve their distributions. The model simulates behaviors ranging from random diffusion, through taxis and cognitively aided navigation (i.e., using memory), to the optimization extreme of the Ideal Free Distribution. These behaviors are generated from simulation of biased diffusion that operates at multiple scales simultaneously, formalizing ideas of multiple-scale foraging behavior. It uses probabilistic bias to represent decisions, allowing multiple search goals to be combined (e.g., foraging and social goals) and the representation of suboptimal behavior. By allowing bias to arise at multiple scales within the environment, each weighted relative to the others, the model can represent different scales of simultaneous decision-making and scale-dependent behavior. The model also allows different constraints to be applied to the animal's ability (e.g., applying food-patch accessibility and information limits). Simulations show that foraging-decision randomness and spatial scale of decision bias have potentially profound effects on both animal intake rate and the distribution of resources in the environment. Spatial variograms show that foraging strategies can differentially change the spatial pattern of resource abundance in the environment to one characteristic of the foraging strategy.
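The core decision rule described in this abstract, a diffusion step probabilistically biased by cues evaluated at more than one spatial scale, can be sketched as a weighted choice between neighbouring cells. All names and parameters here (`w_local`, `w_patch`, `beta`) are illustrative assumptions, not the paper's implementation.

```python
import random

def biased_step(pos, resource, w_local=0.7, w_patch=0.3, beta=2.0, patch=5):
    """One biased-diffusion step on a 1-D ring: the move left or right is
    weighted by resource seen at two scales (cell and patch)."""
    n = len(resource)
    left, right = (pos - 1) % n, (pos + 1) % n

    def patch_mean(i):
        # coarse-scale cue: mean resource of the patch the neighbour lies in
        # (assumes the grid length is divisible by `patch`)
        start = (i // patch) * patch
        return sum(resource[start:start + patch]) / patch

    # combine fine- and coarse-scale cues, each weighted relative to the other
    score_l = w_local * resource[left] + w_patch * patch_mean(left)
    score_r = w_local * resource[right] + w_patch * patch_mean(right)
    # beta controls decision randomness: 0 = pure diffusion,
    # large beta = near-deterministic taxis toward the better cue
    pl, pr = score_l ** beta, score_r ** beta
    return left if random.random() < pl / (pl + pr) else right
```

With a flat resource field the step reduces to an unbiased random walk; raising `beta` pushes the behaviour toward the taxis end of the spectrum the abstract describes.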

Relevance:

30.00%

Publisher:

Abstract:

The ideal free distribution model which relates the spatial distribution of mobile consumers to that of their resource is shown to be a limiting case of a more general model which we develop using simple concepts of diffusion. We show how the ideal free distribution model can be derived from a more general model and extended by incorporating simple models of social influences on predator spacing. First, a free distribution model based on patch switching rules, with a power-law interference term, which represents instantaneous biased diffusion is derived. A social bias term is then introduced to represent the effect of predator aggregation on predator fitness, separate from any effects which act through intake rate. The social bias term is expanded to express an optimum spacing for predators and example solutions of the resulting biased diffusion models are shown. The model demonstrates how an empirical interference coefficient, derived from measurements of predator and prey densities, may include factors expressing the impact of social spacing behaviour on fitness. We conclude that empirical values of log predator/log prey ratio may contain information about more than the relationship between consumer and resource densities. Unlike many previous models, the model shown here applies to conditions without continual input. © 1997 Academic Press Limited.
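The limiting ideal free distribution with a power-law interference term can be illustrated numerically: equalising intake `prey_i / predators_i**m` across patches implies predator numbers proportional to `prey_i**(1/m)`. This is a standard textbook sketch of the limiting case, not the authors' diffusion model itself.

```python
def ifd_with_interference(prey, n_predators, m=1.0):
    """Equilibrium predator numbers per patch under the ideal free
    distribution with interference exponent m: intake is equalised,
    so predators_i is proportional to prey_i ** (1 / m)."""
    weights = [q ** (1.0 / m) for q in prey]
    total = sum(weights)
    return [n_predators * w / total for w in weights]
```

With `m = 1` the predators simply match the resource distribution; `m != 1` captures how an empirical interference coefficient reshapes the log predator/log prey ratio.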

Relevance:

30.00%

Publisher:

Abstract:

Conditional Gaussian (CG) distributions allow the inclusion of both discrete and continuous variables in a model, assuming that the continuous variable is normally distributed. However, CG distributions have proved to be unsuitable for survival data, which tend to be highly skewed. A new method of analysis is required to take into account continuous variables which are not normally distributed. The aim of this paper is to introduce the more appropriate conditional phase-type (C-Ph) distribution for representing a continuous non-normal variable while also incorporating the causal information in the form of a Bayesian network.
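A phase-type distribution generalises the exponential by passing through latent Markov phases before absorption, which is what lets it capture the skew of survival data; its survival function is S(t) = α exp(Tt) 1. A minimal sketch (the two-phase Coxian parameters below are hypothetical, not from the paper):

```python
import numpy as np
from scipy.linalg import expm

def phase_type_survival(t, alpha, T):
    """S(t) = alpha @ expm(T*t) @ 1 for initial phase probabilities alpha
    and sub-generator matrix T (transient-state transition rates)."""
    return float(alpha @ expm(T * t) @ np.ones(len(alpha)))

# hypothetical two-phase Coxian: leave phase 1 at rate 2 (all flow to
# phase 2), then absorb from phase 2 at rate 1
alpha = np.array([1.0, 0.0])
T = np.array([[-2.0, 2.0],
              [0.0, -1.0]])
```

For these rates the closed form is S(t) = 2e^(-t) - e^(-2t), which the matrix exponential reproduces.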

Relevance:

30.00%

Publisher:

Abstract:

A general approach to information correction and fusion for belief functions is proposed, where not only may the information items be irrelevant, but sources may lie as well. We introduce a new correction scheme, which takes into account uncertain metaknowledge on the source's relevance and truthfulness and which generalizes Shafer's discounting operation. We then show how to reinterpret all connectives of Boolean logic in terms of source behavior assumptions with respect to relevance and truthfulness. We are led to generalize the unnormalized Dempster's rule to all Boolean connectives, while taking into account the uncertainties pertaining to assumptions concerning the behavior of sources. Finally, we further extend this approach to an even more general setting, where source behavior assumptions do not have to be restricted to relevance and truthfulness. We also establish the commutativity property between correction and fusion processes, when the behaviors of the sources are independent.
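Shafer's discounting, which the proposed correction scheme generalizes, reallocates a source's mass in proportion to its assumed reliability: with probability r the source is relevant and its masses are kept, and with probability 1 - r all its mass moves to the whole frame (total ignorance). A minimal dictionary-based sketch; the extension to uncertain truthfulness (lying sources) is not shown.

```python
def discount(m, r, frame):
    """Shafer's discounting of a mass function m (frozenset -> mass)
    by reliability r in [0, 1]: each focal element keeps r of its mass,
    and the remaining 1 - r is assigned to the whole frame."""
    omega = frozenset(frame)
    out = {}
    for focal, mass in m.items():
        out[focal] = out.get(focal, 0.0) + r * mass
    # the discounted-away mass expresses ignorance, not contradiction
    out[omega] = out.get(omega, 0.0) + (1.0 - r)
    return out
```

With r = 1 the source is fully trusted and the mass function is unchanged; with r = 0 the output is vacuous.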

Relevance:

30.00%

Publisher:

Abstract:

When multiple sources provide information about the same unknown quantity, their fusion into a synthetic interpretable message is often a tedious problem, especially when sources are conflicting. In this paper, we propose to use possibility theory and the notion of maximal coherent subsets, often used in logic-based representations, to build a fuzzy belief structure that will be instrumental both for extracting useful information about various features of the information conveyed by the sources and for compressing this information into a unique possibility distribution. Extensions and properties of the basic fusion rule are also studied.
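The notion of maximal coherent subsets can be illustrated on interval-valued sources: in one dimension pairwise overlap implies a common point (Helly's theorem), so the maximal groups of mutually consistent sources can be found by scanning interval endpoints. This is a sketch of the combinatorial notion only, not the paper's possibilistic fusion machinery.

```python
def maximal_coherent_subsets(intervals):
    """Maximal groups of sources (given as (lo, hi) intervals) whose
    intervals share a common point.  Candidate points are the left
    endpoints; subset-dominated groups are pruned."""
    groups = []
    for lo, _ in intervals:
        g = frozenset(i for i, (a, b) in enumerate(intervals) if a <= lo <= b)
        if not any(g < h for h in groups) and g not in groups:
            # drop groups strictly contained in the new one, then keep it
            groups = [h for h in groups if not h < g] + [g]
    return groups
```

A conjunctive fusion rule would then be applied inside each group and a disjunction taken across groups, which is the spirit of the maximal-coherent-subsets approach.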

Relevance:

30.00%

Publisher:

Abstract:

We present results for a suite of 14 three-dimensional, high-resolution hydrodynamical simulations of delayed-detonation models of Type Ia supernova (SN Ia) explosions. This model suite comprises the first set of three-dimensional SN Ia simulations with detailed isotopic yield information. As such, it may serve as a database for Chandrasekhar-mass delayed-detonation model nucleosynthetic yields and for deriving synthetic observables such as spectra and light curves. We employ a physically motivated, stochastic model based on turbulent velocity fluctuations and fuel density to calculate in situ the deflagration-to-detonation transition probabilities. To obtain different strengths of the deflagration phase and thereby different degrees of pre-expansion, we have chosen a sequence of initial models with 1, 3, 5, 10, 20, 40, 100, 150, 200, 300 and 1600 (two different realizations) ignition kernels in a hydrostatic white dwarf with a central density of 2.9 × 10⁹ g cm⁻³, as well as one high central density (5.5 × 10⁹ g cm⁻³) and one low central density (1.0 × 10⁹ g cm⁻³) rendition of the 100 ignition kernel configuration. For each simulation, we determined detailed nucleosynthetic yields by postprocessing 10⁶ tracer particles with a 384-nuclide reaction network. All delayed-detonation models result in explosions unbinding the white dwarf, producing a range of ⁵⁶Ni masses from 0.32 to 1.11 M⊙. As a general trend, the models predict that the stable neutron-rich iron-group isotopes are not found at the lowest velocities, but rather at intermediate velocities (~3000–10 000 km s⁻¹) in a shell surrounding a Ni-rich core. The models further predict relatively low-velocity oxygen and carbon, with typical minimum velocities around 4000 and 10 000 km s⁻¹, respectively. © 2012 The Authors. Published by Oxford University Press on behalf of the Royal Astronomical Society.

Relevance:

30.00%

Publisher:

Abstract:

Well planned natural ventilation strategies and systems in the built environments may provide healthy and comfortable indoor conditions, while contributing to a significant reduction in the energy consumed by buildings. Computational Fluid Dynamics (CFD) is particularly suited for modelling indoor conditions in naturally ventilated spaces, which are difficult to predict using other types of building simulation tools. Hence, accurate and reliable CFD models of naturally ventilated indoor spaces are necessary to support the effective design and operation of indoor environments in buildings. This paper presents a formal calibration methodology for the development of CFD models of naturally ventilated indoor environments. The methodology explains how to qualitatively and quantitatively verify and validate CFD models, including parametric analysis utilising the response surface technique to support a robust calibration process. The proposed methodology is demonstrated on a naturally ventilated study zone in the library building at the National University of Ireland in Galway. The calibration process is supported by the on-site measurements performed in a normally operating building. The measurement of outdoor weather data provided boundary conditions for the CFD model, while a network of wireless sensors supplied air speeds and air temperatures inside the room for the model calibration. The concepts and techniques developed here will enhance the process of achieving reliable CFD models that represent indoor spaces and provide new and valuable information for estimating the effect of the boundary conditions on the CFD model results in indoor environments. © 2012 Elsevier Ltd.

Relevance:

30.00%

Publisher:

Abstract:

Paradoxical kinesia describes the motor improvement in Parkinson's disease (PD) triggered by the presence of external sensory information relevant for the movement. This phenomenon has been puzzling scientists for over 60 years, both in neurological and motor control research, with the underpinning mechanism still being the subject of fierce debate. In this paper we present novel evidence supporting the idea that the key to understanding paradoxical kinesia lies in both spatial and temporal information conveyed by the cues and the coupling between perception and action. We tested a group of 7 idiopathic PD patients in an upper limb mediolateral movement task. Movements were performed with and without a visual point light display, travelling at 3 different speeds. The dynamic information presented in the visual point light display depicted three different movement speeds of the same amplitude performed by a healthy adult. The displays were tested and validated on a group of neurologically healthy participants before being tested on the PD group. Our data show that the temporal aspects of the movement (kinematics) in PD can be moderated by the prescribed temporal information presented in a dynamic environmental cue. Patients demonstrated a significant improvement in terms of movement time and peak velocity when executing movement in accordance with the information afforded by the point light display, compared to when the movement of the same amplitude and direction was performed without the display. In all patients we observed the effect of paradoxical kinesia, with a strong relationship between the perceptual information prescribed by the biological motion display and the observed motor performance of the patients. © 2013 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

In most previous research on distributional semantics, Vector Space Models (VSMs) of words are built either from topical information (e.g., documents in which a word is present), or from syntactic/semantic types of words (e.g., dependency parse links of a word in sentences), but not both. In this paper, we explore the utility of combining these two representations to build VSM for the task of semantic composition of adjective-noun phrases. Through extensive experiments on benchmark datasets, we find that even though a type-based VSM is effective for semantic composition, it is often outperformed by a VSM built using a combination of topic- and type-based statistics. We also introduce a new evaluation task wherein we predict the composed vector representation of a phrase from the brain activity of a human subject reading that phrase. We exploit a large syntactically parsed corpus of 16 billion tokens to build our VSMs, with vectors for both phrases and words, and make them publicly available.
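The combination of topic- and type-based statistics, and a baseline composition of an adjective-noun phrase vector, can be sketched as follows. Concatenation of the two views and additive/multiplicative composition are common baselines used here as assumptions, not necessarily the paper's exact operators.

```python
import numpy as np

def combined_vector(topic_vec, type_vec):
    """Join the topical view (e.g. document co-occurrence) and the
    type view (e.g. dependency links) of a word by concatenation."""
    return np.concatenate([topic_vec, type_vec])

def compose(adj_vec, noun_vec, op="add"):
    """Baseline vector composition for an adjective-noun phrase:
    elementwise addition or multiplication."""
    return adj_vec + noun_vec if op == "add" else adj_vec * noun_vec
```

The combined representation simply doubles the feature space; any reweighting between the two views before concatenation would be a further design choice.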

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates sub-integer implementations of the adaptive Gaussian mixture model (GMM) for background/foreground segmentation to allow the deployment of the method on low cost/low power processors that lack Floating Point Unit (FPU). We propose two novel integer computer arithmetic techniques to update Gaussian parameters. Specifically, the mean value and the variance of each Gaussian are updated by a redefined and generalised "round'' operation that emulates the original updating rules for a large set of learning rates. Weights are represented by counters that are updated following stochastic rules to allow a wider range of learning rates and the weight trend is approximated by a line or a staircase. We demonstrate that the memory footprint and computational cost of GMM are significantly reduced, without significantly affecting the performance of background/foreground segmentation.
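The floating-point GMM mean update mu += rho * (x - mu) can be emulated in pure integer arithmetic by shifting (for power-of-two learning rates) and recovering the truncated fraction stochastically. This is a sketch of the idea, not the paper's exact redefined "round" operator.

```python
import random

def int_mean_update(mu, x, rho_shift):
    """Integer emulation of mu += rho * (x - mu) with rho = 2**-rho_shift:
    keep the truncated increment and add the lost fraction back with
    probability proportional to its size (stochastic rounding sketch)."""
    diff = x - mu
    # arithmetic shift toward zero, sign handled explicitly
    inc = diff >> rho_shift if diff >= 0 else -((-diff) >> rho_shift)
    frac = abs(diff) - (abs(inc) << rho_shift)  # remainder lost by the shift
    if random.random() < frac / (1 << rho_shift):
        inc += 1 if diff >= 0 else -1
    return mu + inc
```

On average the update matches the floating-point rule, which is what allows the method to track the original learning-rate behaviour on FPU-less processors.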

Relevance:

30.00%

Publisher:

Abstract:

Base rate neglect on the mammography problem can be overcome by explicitly presenting a causal basis for the typically vague false-positive statistic. One account of this causal facilitation effect is that people make probabilistic judgements over intuitive causal models parameterized with the evidence in the problem. Poorly defined or difficult-to-map evidence interferes with this process, leading to errors in statistical reasoning. To assess whether the construction of parameterized causal representations is an intuitive or deliberative process, in Experiment 1 we combined a secondary load paradigm with manipulations of the presence or absence of an alternative cause in typical statistical reasoning problems. We found limited effects of a secondary load, no evidence that information about an alternative cause improves statistical reasoning, but some evidence that it reduces base rate neglect errors. In Experiments 2 and 3 where we did not impose a load, we observed causal facilitation effects. The amount of Bayesian responding in the causal conditions was impervious to the presence of a load (Experiment 1) and to the precise statistical information that was presented (Experiment 3). However, we found less Bayesian responding in the causal condition than previously reported. We conclude with a discussion of the implications of our findings and the suggestion that there may be population effects in the accuracy of statistical reasoning.
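The mammography problem underlying these experiments reduces to one application of Bayes' rule. The statistics used below (1% base rate, 80% hit rate, 9.6% false-positive rate) are the widely used textbook version, assumed here rather than taken from this paper.

```python
def posterior(base_rate, sensitivity, false_pos):
    """P(disease | positive test) via Bayes' rule."""
    p_positive = base_rate * sensitivity + (1 - base_rate) * false_pos
    return base_rate * sensitivity / p_positive
```

With the textbook numbers, `posterior(0.01, 0.8, 0.096)` is roughly 0.078, far below the intuitive answer near 0.8 that characterises base rate neglect.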

Relevance:

30.00%

Publisher:

Abstract:

When studying heterogeneous aquifer systems, especially at regional scale, a degree of generalization is anticipated. This can be due to sparse sampling regimes, complex depositional environments or lack of accessibility to measure the subsurface. This can lead to an inaccurate conceptualization which can be detrimental when applied to groundwater flow models. It is important that numerical models are based on observed and accurate geological information and do not rely on the distribution of artificial aquifer properties. This can still be problematic as data will be modelled at a different scale to which it was collected. It is proposed here that integrating geophysics and upscaling techniques can assist in a more realistic and deterministic groundwater flow model. In this study, the sedimentary aquifer of the Lagan Valley in Northern Ireland is chosen due to intruding sub-vertical dolerite dykes. These dykes are of a lower permeability than the sandstone aquifer. The use of airborne magnetics allows the delineation of heterogeneities, confirmed by field analysis. Permeability measured at the field scale is then upscaled to different levels using a correlation with the geophysical data, creating equivalent parameters that can be directly imported into numerical groundwater flow models. These parameters include directional equivalent permeabilities and anisotropy. Several stages of upscaling are modelled using the finite element method. Initial modelling provides promising results, especially at the intermediate scale, suggesting an accurate distribution of aquifer properties. This deterministic methodology is being expanded to include stochastic methods of locating heterogeneities based on airborne geophysical data, through the Direct Sampling method of Multiple-Point Statistics (MPS). This method uses the magnetics as a training image to computationally determine a probabilistic occurrence of heterogeneity. There is also a need to apply the method to alternate geological contexts where the heterogeneity is of a higher permeability than the host rock.
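The directional equivalent permeabilities mentioned above are commonly computed from layered-medium averages: arithmetic mean for flow parallel to the dykes, harmonic mean for flow across them. A sketch with hypothetical values, not the study's calibrated numbers:

```python
def equivalent_permeability(k_host, k_dyke, dyke_fraction):
    """Directional equivalent permeabilities for low-permeability dykes
    in a host rock, treated as a layered medium: arithmetic mean along
    the dykes, harmonic mean across them.  Returns (k_parallel,
    k_perpendicular, anisotropy_ratio)."""
    f = dyke_fraction
    k_par = (1 - f) * k_host + f * k_dyke           # along the dykes
    k_perp = 1.0 / ((1 - f) / k_host + f / k_dyke)  # across the dykes
    return k_par, k_perp, k_par / k_perp
```

Even a small volume fraction of low-permeability dykes collapses the cross-dyke equivalent permeability, which is why the anisotropy term matters at the regional scale.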

Relevance:

30.00%

Publisher:

Abstract:

Power capping is an essential function for efficient power budgeting and cost management on modern server systems. Contemporary server processors operate under power caps by using dynamic voltage and frequency scaling (DVFS). However, these processors are often deployed in non-uniform memory access (NUMA) architectures, where thread allocation between cores may significantly affect performance and power consumption. This paper proposes a method which maximizes performance under power caps on NUMA systems by dynamically optimizing two knobs: DVFS and thread allocation. The method selects the optimal combination of the two knobs using models based on an artificial neural network (ANN) that captures the nonlinear effect of thread allocation on performance. We implement the proposed method as a runtime system and evaluate it with twelve multithreaded benchmarks on a real AMD Opteron based NUMA system. The evaluation results show that our method outperforms a naive technique optimizing only DVFS by up to 67.1% under a power cap.
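The selection step, choosing the DVFS level and thread allocation that maximize predicted performance while respecting the cap, can be sketched as an exhaustive search over the model's predictions. `perf_model` and `power_model` below are placeholder callables standing in for the paper's ANN predictors.

```python
def best_config(freqs, allocations, perf_model, power_model, power_cap):
    """Return the (frequency, allocation) pair with the highest predicted
    performance among those whose predicted power is under the cap,
    or None if no configuration is feasible."""
    feasible = [(f, a) for f in freqs for a in allocations
                if power_model(f, a) <= power_cap]
    return max(feasible, key=lambda fa: perf_model(*fa), default=None)
```

In the paper the models are evaluated at runtime, so the search cost over the small knob space is negligible compared with running a configuration to measure it.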

Relevance:

30.00%

Publisher:

Abstract:

Product Line Software Engineering depends on capturing the commonality and variability within a family of products, typically using feature modeling, and using this information to evolve a generic reference architecture for the family. For embedded systems, possible variability in hardware and operating system platforms is an added complication. The design process can be facilitated by first exploring the behavior associated with features. In this paper we outline a bidirectional feature modeling scheme that supports the capture of commonality and variability in the platform environment as well as within the required software. Additionally, 'behavior' associated with features can be included in the overall model. This is achieved by integrating the UCM path notation in a way that exploits UCM's static and dynamic stubs to capture behavioral variability and link it to the feature model structure. The resulting model is a richer source of information to support the architecture development process.

Relevance:

30.00%

Publisher:

Abstract:

In highly heterogeneous aquifer systems, conceptualization of regional groundwater flow models frequently results in the generalization or negligence of aquifer heterogeneities, both of which may result in erroneous model outputs. The calculation of equivalence related to hydrogeological parameters and applied to upscaling provides a means of accounting for measurement-scale information but at regional scale. In this study, the Permo-Triassic Lagan Valley strategic aquifer in Northern Ireland is observed to be heterogeneous, if not discontinuous, due to subvertical trending low-permeability Tertiary dolerite dykes. Interpretation of ground and aerial magnetic surveys produces a deterministic solution to dyke locations. By measuring relative permeabilities of both the dykes and the sedimentary host rock, equivalent directional permeabilities, and hence anisotropy, are calculated as a function of dyke density. This provides parameters for larger scale equivalent blocks, which can be directly imported to numerical groundwater flow models. Different conceptual models with different degrees of upscaling are numerically tested and results compared to regional flow observations. Simulation results show that the upscaled permeabilities from geophysical data allow one to properly account for the observed spatial variations of groundwater flow, without requiring artificial distribution of aquifer properties. It is also found that an intermediate degree of upscaling, between accounting for mapped field-scale dykes and accounting for one regional anisotropy value (maximum upscaling), provides results closest to the observations at the regional scale.