138 results for approximated inference
Abstract:
In this paper, we study the approximation of solutions of the homogeneous Helmholtz equation Δu + ω²u = 0 by linear combinations of plane waves with different directions. We combine approximation estimates for homogeneous Helmholtz solutions by generalized harmonic polynomials, obtained from Vekua's theory, with estimates for the approximation of generalized harmonic polynomials by plane waves. The latter is the focus of this paper. We establish best approximation error estimates in Sobolev norms, which are explicit in terms of the degree of the generalized polynomial to be approximated, the domain size, and the number of plane waves used in the approximations.
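As a concrete illustration of the approximating functions involved (my own sketch, not taken from the paper): a plane wave e^{iω d·x} with unit direction vector d solves the homogeneous Helmholtz equation exactly, which can be checked numerically with a second-order finite-difference Laplacian.

```python
import numpy as np

# Illustrative check: u(x, y) = exp(i*omega*(d1*x + d2*y)) with |d| = 1
# satisfies Delta u + omega^2 u = 0; the 5-point Laplacian residual should
# be O(h^2), i.e. negligible compared with omega^2 * |u| = 25.
omega = 5.0
d = np.array([np.cos(0.3), np.sin(0.3)])   # unit propagation direction

h = 1e-3
x, y = np.meshgrid(np.arange(0, 1, h), np.arange(0, 1, h), indexing="ij")
u = np.exp(1j * omega * (d[0] * x + d[1] * y))

# 5-point finite-difference Laplacian on the interior of the grid
lap = (u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
       - 4.0 * u[1:-1, 1:-1]) / h**2

residual = np.abs(lap + omega**2 * u[1:-1, 1:-1]).max()
print(residual)   # tiny: the plane wave is an exact Helmholtz solution
```

The same check works for any unit direction d, which is what makes linear combinations of such waves natural trial spaces for Helmholtz problems.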
Abstract:
Atmospheric aerosol acts both to reduce the background concentration of natural cluster ions and to attenuate optical propagation. Hence, the presence of aerosol has two consequences: a reduction in the air's electrical conductivity and a reduction in the visual range. Ion-aerosol theory and Koschmieder's visibility theory are combined here to derive the related non-linear variation of the atmospheric electric potential gradient with visual range. A substantial sensitivity is found under poor visual range conditions, but for good visual range conditions the sensitivity diminishes and local aerosol has little influence on the fair weather potential gradient. This allows visual range measurements, made simply and routinely at many meteorological sites, to provide inference about the local air's electrical properties.
Abstract:
Individual differences in cognitive style can be characterized along two dimensions: ‘systemizing’ (S, the drive to analyze or build ‘rule-based’ systems) and ‘empathizing’ (E, the drive to identify another's mental state and respond to this with an appropriate emotion). Discrepancies between these two dimensions in one direction (S > E) or the other (E > S) are associated with sex differences in cognition: on average more males show an S > E cognitive style, while on average more females show an E > S profile. The neurobiological basis of these different profiles remains unknown. Since individuals may be typical or atypical for their sex, it is important to move away from the study of sex differences and towards the study of differences in cognitive style. Using structural magnetic resonance imaging we examined how neuroanatomy varies as a function of the discrepancy between E and S in 88 adult males from the general population. Selecting just males allows us to study discrepant E-S profiles in a pure way, unconfounded by other factors related to sex and gender. An increasing S > E profile was associated with increased gray matter volume in cingulate and dorsal medial prefrontal areas which have been implicated in processes related to cognitive control, monitoring, error detection, and probabilistic inference. An increasing E > S profile was associated with larger hypothalamic and ventral basal ganglia regions which have been implicated in neuroendocrine control, motivation and reward. These results suggest an underlying neuroanatomical basis linked to the discrepancy between these two important dimensions of individual differences in cognitive style.
Abstract:
Policy makers in the European Union are envisioning the introduction of a community farm animal welfare label which would allow consumers to align their consumption habits with their farm animal welfare preferences. For welfare labelling to be viable, the market for livestock products produced to higher welfare standards has to be sufficiently segmented, with consumers having sufficiently distinct and behaviourally consistent preferences. The present study investigates consumers' preferences for meat produced to different welfare standards using a hypothetical welfare score. Data are obtained from a contingent valuation study carried out in Britain. An ordered probit model was estimated using Bayesian inference to obtain mean willingness to pay (WTP). We find decreasing marginal WTP as animal welfare levels increase, and that people's preferences for different levels of farm animal welfare are sufficiently differentiated, making the introduction of a labelling scheme in the form of a certified rating system appear feasible.
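To make the estimation step concrete, here is a minimal sketch of Bayesian inference for an ordered probit model; the data, cutpoints, and single "bid" covariate are hypothetical stand-ins, not the study's actual design, and the study's WTP computation is not reproduced.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
Phi = np.vectorize(lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0))))  # standard normal CDF

# Hypothetical synthetic data: respondents place a product on an ordered
# 0..3 preference scale; the latent index is beta * bid plus N(0, 1) noise.
n = 300
bid = rng.uniform(0.5, 3.0, n)
cuts = np.array([-np.inf, 0.5, 1.5, 2.5, np.inf])  # cutpoints, assumed known here
beta_true = 1.2
y = np.searchsorted(cuts, beta_true * bid + rng.standard_normal(n)) - 1

def loglik(beta):
    # Ordered probit: P(y = k) = Phi(c_{k+1} - beta*bid) - Phi(c_k - beta*bid)
    p = Phi(cuts[y + 1] - beta * bid) - Phi(cuts[y] - beta * bid)
    return np.log(np.clip(p, 1e-300, None)).sum()

# Random-walk Metropolis with a flat prior on beta
beta, ll, samples = 0.0, loglik(0.0), []
for _ in range(3000):
    prop = beta + 0.1 * rng.standard_normal()
    llp = loglik(prop)
    if np.log(rng.uniform()) < llp - ll:
        beta, ll = prop, llp
    samples.append(beta)

post_mean = np.mean(samples[1000:])
print(post_mean)   # posterior mean of beta, close to beta_true
```

In a full analysis the cutpoints would be sampled as well, and mean WTP would be derived from the posterior draws of the coefficients.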
Abstract:
Statistical methods of inference typically require the likelihood function to be computable in a reasonable amount of time. The class of “likelihood-free” methods termed Approximate Bayesian Computation (ABC) is able to eliminate this requirement, replacing the evaluation of the likelihood with simulation from it. Likelihood-free methods have gained in efficiency and popularity in the past few years, following their integration with Markov Chain Monte Carlo (MCMC) and Sequential Monte Carlo (SMC) in order to better explore the parameter space. They have been applied primarily to estimating the parameters of a given model, but can also be used to compare models. Here we present novel likelihood-free approaches to model comparison, based upon the independent estimation of the evidence of each model under study. Key advantages of these approaches over previous techniques are that they allow the exploitation of MCMC or SMC algorithms for exploring the parameter space, and that they do not require a sampler able to mix between models. We validate the proposed methods using a simple exponential family problem before providing a realistic problem from human population genetics: the comparison of different demographic models based upon genetic data from the Y chromosome.
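The core likelihood-free idea can be shown in a few lines. This is plain ABC rejection on a toy problem (illustrative only; the paper's methods build on MCMC/SMC refinements of this scheme and on model-evidence estimation).

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy problem: infer the mean mu of a Normal(mu, 1) by simulation alone,
# never evaluating the likelihood itself.
observed = rng.normal(2.0, 1.0, size=100)
s_obs = observed.mean()                      # summary statistic of the data

accepted, eps = [], 0.1
while len(accepted) < 1000:
    mu = rng.uniform(-5.0, 5.0)              # draw a candidate from the prior
    s_sim = rng.normal(mu, 1.0, size=100).mean()   # simulate data under mu
    if abs(s_sim - s_obs) < eps:             # keep mu if simulation matches data
        accepted.append(mu)

posterior_mean = np.mean(accepted)
print(posterior_mean)   # approximates E[mu | data]; close to observed.mean()
```

The accepted draws approximate the posterior; embedding the same accept/reject kernel inside MCMC or SMC, as the abstract describes, is what makes the approach efficient in higher-dimensional parameter spaces.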
Abstract:
The characteristics of the boundary layer separating a turbulence region from an irrotational (or non-turbulent) flow region are investigated using rapid distortion theory (RDT). The turbulence region is approximated as homogeneous and isotropic far away from the bounding turbulent/non-turbulent (T/NT) interface, which is assumed to remain approximately flat. Inviscid effects resulting from the continuity of the normal velocity and pressure at the interface, in addition to viscous effects resulting from the continuity of the tangential velocity and shear stress, are taken into account by considering a sudden insertion of the T/NT interface, in the absence of mean shear. Profiles of the velocity variances, turbulent kinetic energy (TKE), viscous dissipation rate (ε), turbulence length scales, and pressure statistics are derived, showing an excellent agreement with results from direct numerical simulations (DNS). Interestingly, the normalized inviscid flow statistics at the T/NT interface do not depend on the form of the assumed TKE spectrum. Outside the turbulent region, where the flow is irrotational (except inside a thin viscous boundary layer), ε decays as z⁻⁶, where z is the distance from the T/NT interface. The mean pressure distribution is calculated using RDT, and exhibits a decrease towards the turbulence region due to the associated velocity fluctuations, consistent with the generation of a mean entrainment velocity. The vorticity variance and ε display large maxima at the T/NT interface due to the inviscid discontinuities of the tangential velocity variances existing there, and these maxima are quantitatively related to the thickness δ of the viscous boundary layer (VBL). For an equilibrium VBL, the RDT analysis suggests that δ ~ η (where η is the Kolmogorov microscale), which is consistent with the scaling law identified in a very recent DNS study for shear-free T/NT interfaces.
Abstract:
This paper presents the notion of Context-based Activity Design (CoBAD) that represents context with its dynamic changes and normative activities in an interactive system design. The development of CoBAD requires an appropriate context ontology model and inference mechanisms. The incorporation of norms and information field theory into Context State Transition Model, and the implementation of new conflict resolution strategies based on the specific situation are discussed. A demonstration of CoBAD using a human agent scenario in a smart home is also presented. Finally, a method of treating conflicting norms in multiple information fields is proposed.
Abstract:
This paper proposes and demonstrates an approach, Skilloscopy, to the assessment of decision makers. In an increasingly sophisticated, connected and information-rich world, decision making is becoming both more important and more difficult. At the same time, modelling decision-making on computers is becoming more feasible and of interest, partly because the information-input to those decisions is increasingly on record. The aims of Skilloscopy are to rate and rank decision makers in a domain relative to each other: the aims do not include an analysis of why a decision is wrong or suboptimal, nor the modelling of the underlying cognitive process of making the decisions. In the proposed method a decision-maker is characterised by a probability distribution of their competence in choosing among quantifiable alternatives. This probability distribution is derived by classic Bayesian inference from a combination of prior belief and the evidence of the decisions. Thus, decision-makers' skills may be better compared, rated and ranked. The proposed method is applied and evaluated in the game domain of Chess. A large set of games by players across a broad range of the World Chess Federation (FIDE) Elo ratings has been used to infer the distribution of players' ratings directly from the moves they play rather than from game outcomes. Demonstration applications address questions frequently asked by the Chess community regarding the stability of the Elo rating scale, the comparison of players of different eras and/or leagues, and controversial incidents possibly involving fraud. The method of Skilloscopy may be applied in any decision domain where the value of the decision-options can be quantified.
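The inference step described above can be sketched as follows. This is an illustrative model of my own, not the paper's: skill is treated as the inverse temperature of a softmax over hypothetical engine evaluations of the legal moves, and a posterior over a discrete skill grid is updated from the moves actually chosen.

```python
import numpy as np

rng = np.random.default_rng(2)

# Candidate skill levels and a uniform prior over them
skills = np.linspace(0.1, 5.0, 50)
log_post = np.log(np.full(skills.size, 1.0 / skills.size))

def move_probs(values, s):
    # Probability of each move for skill s: softmax over move values, so
    # higher-skill players pick better-valued moves more reliably.
    z = np.exp(s * (values - values.max()))
    return z / z.sum()

# Simulate 80 positions: a player of true skill 2.5 chooses among 10 moves
true_skill = 2.5
for _ in range(80):
    values = rng.normal(0.0, 1.0, size=10)   # hypothetical engine scores
    choice = rng.choice(10, p=move_probs(values, true_skill))
    # Bayesian update: likelihood of the observed choice at each skill level
    log_post += np.log([move_probs(values, s)[choice] for s in skills])

post = np.exp(log_post - log_post.max())
post /= post.sum()
estimate = (skills * post).sum()
print(estimate)   # posterior mean skill, near true_skill
```

The full posterior distribution, not just its mean, is what allows the rating, ranking, and era comparisons the abstract mentions.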
Abstract:
This article considers ideas about the suitability of experimental, non-naturalist, narrative forms in theatre and television, through the example of a 1965 BBC2 adaptation of J. B. Priestley's 1939 play Johnson over Jordan. Using both textual analysis of the programme and research into the BBC production documentation, this essay explains how the circumstances and conditions of 1960s television adaptation and the star casting of Sir Ralph Richardson transformed Priestley's stage play. The TV adaptation achieved cosmic effects on an intimate scale, through inference and the imaginative integration of the studio space with dubbed sound.
Abstract:
Seamless phase II/III clinical trials combine traditional phases II and III into a single trial that is conducted in two stages, with stage 1 used to answer phase II objectives such as treatment selection and stage 2 used for the confirmatory analysis, which is a phase III objective. Although seamless phase II/III clinical trials are efficient because the confirmatory analysis includes phase II data from stage 1, inference can pose statistical challenges. In this paper, we consider point estimation following seamless phase II/III clinical trials in which stage 1 is used to select the most effective experimental treatment and to decide if, compared with a control, the trial should stop at stage 1 for futility. If the trial is not stopped, then the phase III confirmatory part of the trial involves evaluation of the selected most effective experimental treatment and the control. We have developed two new estimators for the treatment difference between these two treatments with the aim of reducing bias conditional on the treatment selection made and on the fact that the trial continues to stage 2. We have demonstrated the properties of these estimators using simulations.
Abstract:
This paper presents a video surveillance framework that robustly and efficiently detects abandoned objects in surveillance scenes. The framework is based on a novel threat assessment algorithm which combines the concept of ownership with automatic understanding of social relations in order to infer abandonment of objects. Implementation is achieved through development of a logic-based inference engine based on Prolog. Threat detection performance is conducted by testing against a range of datasets describing realistic situations and demonstrates a reduction in the number of false alarms generated. The proposed system represents the approach employed in the EU SUBITO project (Surveillance of Unattended Baggage and the Identification and Tracking of the Owner).
Abstract:
By modelling the average activity of large neuronal populations, continuum mean field models (MFMs) have become an increasingly important theoretical tool for understanding the emergent activity of cortical tissue. In order to be computationally tractable, long-range propagation of activity in MFMs is often approximated with partial differential equations (PDEs). However, PDE approximations in current use correspond to underlying axonal velocity distributions incompatible with experimental measurements. In order to rectify this deficiency, we here introduce novel propagation PDEs that give rise to smooth unimodal distributions of axonal conduction velocities. We also argue that velocities estimated from fibre diameters in slice and from latency measurements, respectively, relate quite differently to such distributions, a significant point for any phenomenological description. Our PDEs are then successfully fit to fibre diameter data from human corpus callosum and rat subcortical white matter. This allows, for the first time, the simulation of long-range conduction in the mammalian brain with realistic, convenient PDEs. Furthermore, the obtained results suggest that the propagation of activity in rat and human differs significantly beyond mere scaling. The dynamical consequences of our new formulation are investigated in the context of a well known neural field model. On the basis of Turing instability analyses, we conclude that pattern formation is more easily initiated using our more realistic propagator. By increasing characteristic conduction velocities, a smooth transition can occur from self-sustaining bulk oscillations to travelling waves of various wavelengths, which may influence axonal growth during development. Our analytic results are also corroborated numerically using simulations on a large spatial grid. Thus we provide here a comprehensive analysis of empirically constrained activity propagation in the context of MFMs, which will allow more realistic studies of mammalian brain activity in the future.
Abstract:
In this paper, we develop a method, termed the Interaction Distribution (ID) method, for analysis of quantitative ecological network data. In many cases, quantitative network data sets are under-sampled, i.e. many interactions are poorly sampled or remain unobserved. Hence, the output of statistical analyses may fail to differentiate between patterns that are statistical artefacts and those which are real characteristics of ecological networks. The ID method can support assessment and inference of under-sampled ecological network data. In the current paper, we illustrate and discuss the ID method based on the properties of plant-animal pollination data sets of flower visitation frequencies. However, the ID method may be applied to other types of ecological networks. The method can supplement existing network analyses based on two definitions of the underlying probabilities for each combination of pollinator and plant species: (1) p_{i,j}, the probability for a visit made by the i'th pollinator species to take place on the j'th plant species; (2) q_{i,j}, the probability for a visit received by the j'th plant species to be made by the i'th pollinator. The method applies the Dirichlet distribution to estimate these two probabilities, based on a given empirical data set. The estimated mean values for p_{i,j} and q_{i,j} reflect the relative differences between recorded numbers of visits for different pollinator and plant species, and the estimated uncertainty of p_{i,j} and q_{i,j} decreases with higher numbers of recorded visits.
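The Dirichlet posterior means described above are straightforward to compute. The sketch below follows the abstract's notation but uses a hypothetical count matrix and a flat prior of my own choosing.

```python
import numpy as np

# Given visit counts n[i, j] (pollinator i observed on plant j) and a Dirichlet
# prior with concentration alpha on each row (or column), the posterior mean of
# p_{i,j} is (n[i,j] + alpha) / (n_i. + J*alpha), and analogously for q_{i,j}
# with column totals. Uncertainty shrinks as counts grow.
counts = np.array([[12, 3, 0],
                   [ 1, 8, 4],
                   [ 0, 0, 2]])              # hypothetical visitation data
alpha = 1.0                                  # uniform (flat) Dirichlet prior

I, J = counts.shape
p = (counts + alpha) / (counts.sum(axis=1, keepdims=True) + J * alpha)  # p_{i,j}
q = (counts + alpha) / (counts.sum(axis=0, keepdims=True) + I * alpha)  # q_{i,j}

print(p.round(3))   # each row sums to 1: visits *made by* each pollinator
print(q.round(3))   # each column sums to 1: visits *received by* each plant
```

Note how the zero-count cells receive small but non-zero probabilities, which is exactly how the method keeps unobserved interactions distinguishable from impossible ones in under-sampled networks.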
Abstract:
The recovery of the Arctic polar vortex following stratospheric sudden warmings is found to take upward of 3 months in a particular subset of cases, termed here polar-night jet oscillation (PJO) events. The anomalous zonal-mean circulation above the pole during this recovery is characterized by a persistently warm lower stratosphere, and above this a cold midstratosphere and anomalously high stratopause, which descends as the event unfolds. Composites of these events in the Canadian Middle Atmosphere Model show the persistence of the lower-stratospheric anomaly is a result of strongly suppressed wave driving and weak radiative cooling at these heights. The upper-stratospheric and lower-mesospheric anomalies are driven immediately following the warming by anomalous planetary-scale eddies, following which, anomalous parameterized nonorographic and orographic gravity waves play an important role. These details are found to be robust for PJO events (as opposed to sudden warmings in general) in that many details of individual PJO events match the composite mean. A zonal-mean quasigeostrophic model on the sphere is shown to reproduce the response to the thermal and mechanical forcings produced during a PJO event. The former is well approximated by Newtonian cooling. The response can thus be considered as a transient approach to the steady-state, downward control limit. In this context, the time scale of the lower-stratospheric anomaly is determined by the transient, radiative response to the extended absence of wave driving. The extent to which the dynamics of the wave-driven descent of the stratopause can be considered analogous to the descending phases of the quasi-biennial oscillation (QBO) is also discussed.
Abstract:
Atmospheric CO2 concentration is hypothesized to influence vegetation distribution via tree–grass competition, with higher CO2 concentrations favouring trees. The stable carbon isotope (δ13C) signature of vegetation is influenced by the relative importance of C4 plants (including most tropical grasses) and C3 plants (including nearly all trees), and the degree of stomatal closure – a response to aridity – in C3 plants. Compound-specific δ13C analyses of leaf-wax biomarkers in sediment cores of an offshore South Atlantic transect are used here as a record of vegetation changes in subequatorial Africa. These data suggest a large increase in C3 relative to C4 plant dominance after the Last Glacial Maximum. Using a process-based biogeography model that explicitly simulates 13C discrimination, it is shown that precipitation and temperature changes cannot explain the observed shift in δ13C values. The physiological effect of increasing CO2 concentration is decisive, altering the C3/C4 balance and bringing the simulated and observed δ13C values into line. It is concluded that CO2 concentration itself was a key agent of vegetation change in tropical southern Africa during the last glacial–interglacial transition. Two additional inferences follow. First, long-term variations in terrestrial δ13C values are not simply a proxy for regional rainfall, as has sometimes been assumed. Although precipitation and temperature changes have had major effects on vegetation in many regions of the world during the period between the Last Glacial Maximum and recent times, CO2 effects must also be taken into account, especially when reconstructing changes in climate between glacial and interglacial states. Second, rising CO2 concentration today is likely to be influencing tree–grass competition in a similar way, and thus contributing to the "woody thickening" observed in savannas worldwide.
This second inference points to the importance of experiments to determine how vegetation composition in savannas is likely to be influenced by the continuing rise of CO2 concentration.