32 results for Random-set theory

in CentAUR: Central Archive, University of Reading - UK


Relevance:

90.00%

Publisher:

Abstract:

Models used in neoclassical economics assume human behaviour to be purely rational. On the other hand, models adopted in social and behavioural psychology are founded on the ‘black box’ of human cognition. In view of these observations, this paper aims to bridge this gap by introducing psychological constructs into the well-established microeconomic framework of choice behaviour based on random utility theory. In particular, it combines constructs developed employing Ajzen’s theory of planned behaviour with Lancaster’s theory of consumer demand for product characteristics to explain stated preferences over certified animal-friendly foods. To reach this objective a web survey was administered in the five largest EU-25 countries: France, Germany, Italy, Spain and the UK. Findings identify some salient cross-cultural differences between northern and southern Europe and suggest that psychological constructs developed using the Ajzen model are useful in explaining heterogeneity of preferences. Implications for policy makers and marketers involved with certified animal-friendly foods are discussed.
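The stated-preference analysis described above rests on random utility theory, in which choice probabilities take a multinomial logit form. A minimal sketch of that form (the attributes and taste parameters below are hypothetical, not the paper's survey data):

```python
import numpy as np

def logit_probs(X, beta):
    """Multinomial logit choice probabilities: P_j = exp(V_j) / sum_k exp(V_k),
    where V_j = X[j] @ beta is the deterministic utility of alternative j."""
    v = X @ beta
    v = v - v.max()           # subtract max for numerical stability
    e = np.exp(v)
    return e / e.sum()

# Two hypothetical food alternatives described by (price, welfare-certified flag)
X = np.array([[3.0, 0.0],     # conventional product
              [4.0, 1.0]])    # certified animal-friendly product
beta = np.array([-1.0, 1.5])  # illustrative taste parameters
p = logit_probs(X, beta)      # choice probabilities over the two products
```

With these illustrative parameters the certification premium outweighs the extra price, so the certified alternative gets the larger choice probability.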

Relevance:

90.00%

Publisher:

Abstract:

The problem of calculating the probability of error in a DS/SSMA system has been extensively studied for more than two decades. When random sequences are employed, some conditioning must be done before the central limit theorem can be applied, leading to a Gaussian distribution. The authors seek to characterise the multiple-access interference as a random walk with a random number of steps, for random and deterministic sequences. Using results from random-walk theory, they model the interference as a K-distributed random variable and use it to calculate the probability of error in the form of a series, for a DS/SSMA system with a coherent correlation receiver and BPSK modulation under Gaussian noise. The asymptotic properties of the proposed distribution agree with other analyses. This is, to the best of the authors' knowledge, the first attempt to propose a non-Gaussian distribution for the interference. The modelling can be extended to consider multipath fading and general modulation schemes.
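A quick way to see why a random walk with a random number of steps departs from Gaussianity is to simulate one: mixing over the step count produces positive excess kurtosis, i.e. tails heavier than a Gaussian of the same variance, qualitatively consistent with a compound (K-type) interference model. A minimal Monte Carlo sketch with illustrative parameters (not the paper's system model):

```python
import numpy as np

rng = np.random.default_rng(0)

def mai_samples(n, mean_steps=5):
    """Samples of multiple-access interference modelled as a random walk
    with a random (Poisson) number of +/-1 chip steps."""
    k = rng.poisson(mean_steps, size=n)        # random number of steps
    # a sum of k Rademacher (+/-1) steps equals 2*Binomial(k, 1/2) - k
    return 2.0 * rng.binomial(k, 0.5) - k

z = mai_samples(200_000)
# a compound model has heavier-than-Gaussian tails: positive excess kurtosis
excess_kurtosis = ((z - z.mean()) ** 4).mean() / z.var() ** 2 - 3.0
```

For a compound Poisson walk the theoretical excess kurtosis is 1/mean_steps, so the estimate should sit near 0.2 here; a Gaussian would give 0.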

Relevance:

80.00%

Publisher:

Abstract:

Associative memory networks such as Radial Basis Functions, Neurofuzzy and Fuzzy Logic used for modelling nonlinear processes suffer from the curse of dimensionality (COD): as the input dimension increases, the parameterization, computation cost, training data requirements, etc. increase exponentially. Here a new algorithm is introduced for the construction of optimal piecewise locally linear models over a Delaunay partition of the input space, both to overcome the COD and to generate locally linear models directly amenable to linear control and estimation algorithms. The training of the model is configured as a new mixture-of-experts network with a new fast decision rule derived using convex set theory. A very fast simulated reannealing (VFSR) algorithm is utilized to search for a globally optimal solution of the Delaunay input space partition. A benchmark non-linear time series is used to demonstrate the new approach.
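The core construction, locally linear models over a Delaunay partition of the input space, can be sketched as follows. This toy version uses plain barycentric interpolation of vertex targets in place of the paper's mixture-of-experts training and VFSR search, and all data are synthetic:

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(40, 2))     # training inputs
y = np.sin(X[:, 0]) + X[:, 1] ** 2           # nonlinear target values

tri = Delaunay(X)                            # Delaunay partition of input space

def predict(x):
    """Piecewise locally linear prediction: locate the simplex containing x
    and interpolate the vertex targets with barycentric weights (this is
    exactly the local linear model on that simplex)."""
    s = int(tri.find_simplex(x))
    T = tri.transform[s]                     # affine map to barycentric coords
    b = T[:2] @ (x - T[2])
    w = np.append(b, 1.0 - b.sum())          # barycentric weights, sum to 1
    return w @ y[tri.simplices[s]]
```

Because the prediction is a convex combination of the three vertex targets, it is automatically bounded by the observed target range and is linear within each simplex.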

Relevance:

80.00%

Publisher:

Abstract:

A new autonomous ship collision-free (ASCF) trajectory navigation and control system is introduced, with a new recursive navigation algorithm based on analytic geometry and convex set theory for collision-free guidance. The underlying assumption is that geometric information about the ship's environment is available in the form of a polygon-shaped free space, which may be easily generated from a 2D image or from plots relating to physical hazards or other constraints such as collision avoidance regulations. The navigation command is given as a heading command sequence based on generating a waypoint which falls within a small neighbourhood of the current position, and the sequence of waypoints along the trajectory is guaranteed, using convex set theory, to lie within a bounded obstacle-free region. A neurofuzzy network predictor, which in practice uses only observed input/output data generated by on-board sensors or external sensors (or a sensor fusion algorithm), and which uses rudder deflection angle to control the ship heading angle, is utilised in the simulation of an ESSO 190000 dwt tanker model to demonstrate the effectiveness of the system.
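The convex-set containment check at the heart of the waypoint guarantee can be sketched as a half-plane test against a convex polygonal free space (the region and waypoint below are hypothetical):

```python
import numpy as np

def in_convex_polygon(p, verts):
    """Check that point p lies inside a convex polygon whose vertices are
    given in counter-clockwise order, using the half-plane (cross-product)
    test: p must lie on or left of every directed edge."""
    v = np.asarray(verts, dtype=float)
    for a, b in zip(v, np.roll(v, -1, axis=0)):
        edge, rel = b - a, p - a
        if edge[0] * rel[1] - edge[1] * rel[0] < 0:  # p strictly right of edge
            return False
    return True

# Hypothetical convex obstacle-free region and a candidate waypoint
free_space = [(0, 0), (10, 0), (10, 6), (0, 6)]
waypoint = np.array([2.0, 3.0])
ok = in_convex_polygon(waypoint, free_space)
```

A waypoint generator would accept a candidate only when this test passes for the current obstacle-free polygon, which is what makes the whole trajectory provably collision-free.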

Relevance:

80.00%

Publisher:

Abstract:

What constitutes a baseline level of success for protein fold recognition methods? As fold recognition benchmarks are often presented without any thought to the results that might be expected from a purely random set of predictions, an analysis of fold recognition baselines is long overdue. Given varying amounts of basic information about a protein—ranging from the length of the sequence to a knowledge of its secondary structure—to what extent can the fold be determined by intelligent guesswork? Can simple methods that make use of secondary structure information assign folds more accurately than purely random methods and could these methods be used to construct viable hierarchical classifications?
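One such baseline is easy to make precise: if predictions are drawn at random in proportion to the observed fold frequencies, the expected accuracy is the sum of squared frequencies. A sketch with a hypothetical toy fold library (not a real classification):

```python
from collections import Counter

def random_baseline_accuracy(fold_labels):
    """Expected accuracy when each prediction is drawn at random from the
    observed fold frequencies: sum over folds of f_i squared."""
    counts = Counter(fold_labels)
    n = len(fold_labels)
    return sum((c / n) ** 2 for c in counts.values())

# Hypothetical toy library: fold A is common, folds B and C are rare
labels = ["A"] * 6 + ["B"] * 2 + ["C"] * 2
base = random_baseline_accuracy(labels)   # 0.6^2 + 0.2^2 + 0.2^2 = 0.44
```

Skewed fold populations push this baseline well above 1/(number of folds), which is why benchmarks reported without it can flatter a method.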

Relevance:

80.00%

Publisher:

Abstract:

Statistical diagnostics of mixing and transport are computed for a numerical model of forced shallow-water flow on the sphere and a middle-atmosphere general circulation model. In particular, particle dispersion statistics, transport fluxes, Liapunov exponents (probability density functions and ensemble averages), and tracer concentration statistics are considered. It is shown that the behavior of the diagnostics is in accord with that of kinematic chaotic advection models so long as stochasticity is sufficiently weak. Comparisons with random-strain theory are made.
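As an illustration of one of these diagnostics, a finite-time Lyapunov exponent can be estimated by evolving a tangent vector along a trajectory of a kinematic chaotic-advection model; the Chirikov standard map stands in here for the paper's flows, and all parameters are illustrative:

```python
import math

def ftle_standard_map(K=6.0, theta=0.3, p=0.2, n=5000):
    """Finite-time Lyapunov exponent for the Chirikov standard map
    (p' = p + K sin(theta), theta' = theta + p'), estimated by pushing a
    tangent vector through the Jacobian and renormalising each step."""
    dth, dp = 1.0, 0.0                    # tangent vector
    s = 0.0
    for _ in range(n):
        c = K * math.cos(theta)           # Jacobian entry at the OLD theta
        p = (p + K * math.sin(theta)) % (2 * math.pi)
        theta = (theta + p) % (2 * math.pi)
        dp = dp + c * dth                 # tangent update mirrors the map
        dth = dth + dp
        norm = math.hypot(dth, dp)
        s += math.log(norm)
        dth, dp = dth / norm, dp / norm
    return s / n

lam = ftle_standard_map()                 # positive => chaotic advection
```

For strongly chaotic K the exponent is roughly ln(K/2), so K = 6 should give a value near 1.1; a near-zero result would indicate a regular (non-chaotic) orbit.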

Relevance:

80.00%

Publisher:

Abstract:

This paper presents a new method to calculate sky view factors (SVFs) from high-resolution urban digital elevation models using a shadow casting algorithm. By utilizing weighted annuli to derive the SVF from hemispherical images, the discrete light source positions can be predefined and uniformly spread over the whole hemisphere, whereas an alternative method applies a random set of light source positions with a cosine-weighted distribution of sun altitude angles. The two methods give similar results based on a large number of SVF images. However, when comparing variations at pixel level between an image generated using the new method presented in this paper and an image from the random method, anisotropic patterns occur. The absolute mean difference between the two methods is 0.002, ranging up to 0.040. The maximum difference can be as much as 0.122. Since the SVF is a geometrically derived parameter, the anisotropic errors created by the random method must be considered significant.
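The weighted-annuli idea can be sketched directly: under an isotropic sky, the annulus between zenith angles t1 and t2 contributes sin²(t2) − sin²(t1) to the view factor, so the SVF is the weight-sum of per-annulus sky fractions. A minimal version for a synthetic hemispherical sky mask, assuming an equi-angular (zenith angle proportional to radius) projection, which is one common fisheye model rather than the paper's exact geometry:

```python
import numpy as np

def sky_view_factor(sky, n_annuli=36):
    """SVF from a square hemispherical sky mask (1 = sky, 0 = obstructed),
    zenith at the image centre, equi-angular projection. Each annulus of
    zenith angle gets weight sin^2(t2) - sin^2(t1), the isotropic-sky
    contribution of that ring; the weights sum to 1."""
    h, w = sky.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w]
    theta = np.hypot(yy - cy, xx - cx) / min(cx, cy) * (np.pi / 2)
    edges = np.linspace(0.0, np.pi / 2, n_annuli + 1)
    svf = 0.0
    for t1, t2 in zip(edges[:-1], edges[1:]):
        ring = (theta >= t1) & (theta < t2)
        if ring.any():
            svf += (np.sin(t2) ** 2 - np.sin(t1) ** 2) * sky[ring].mean()
    return svf

svf_open = sky_view_factor(np.ones((201, 201)))   # fully open sky -> 1
```

Because the weights telescope to sin²(90°) = 1, a completely unobstructed hemisphere returns an SVF of exactly 1, which is a useful sanity check for any implementation.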

Relevance:

80.00%

Publisher:

Abstract:

This paper finds preference reversals in measurements of ambiguity aversion, even when psychological and informational circumstances are kept constant. The reversals are of a fundamentally different nature from those found before, because they cannot be explained by context-dependent weightings of attributes. We offer an explanation based on Sugden's random-reference theory, with different elicitation methods generating different random reference points. Measurements of ambiguity aversion that use willingness to pay are then confounded by loss aversion and hence overestimate ambiguity aversion.

Relevance:

40.00%

Publisher:

Abstract:

In the first half of this memoir we explore the interrelationships between the abstract theory of limit operators (see e.g. the recent monographs of Rabinovich, Roch and Silbermann (2004) and Lindner (2006)) and the concepts and results of the generalised collectively compact operator theory introduced by Chandler-Wilde and Zhang (2002). We build up to results obtained by applying this generalised collectively compact operator theory to the set of limit operators of an operator (its operator spectrum). In the second half of this memoir we study bounded linear operators on the generalised sequence space ℓ^p(Z^N, X), where 1 ≤ p ≤ ∞ and X is some complex Banach space. We make what seems to be a more complete study than hitherto of the connections between Fredholmness, invertibility, invertibility at infinity, and invertibility or injectivity of the set of limit operators, with some emphasis on the case when the operator is a locally compact perturbation of the identity. In particular, we obtain stronger results than previously known for the subtle limiting cases of p = 1 and p = ∞. Our tools in this study are the results from the first half of the memoir and an exploitation of the partial duality between ℓ^1 and ℓ^∞ and its implications for bounded linear operators which are also continuous with respect to the weaker topology (the strict topology) introduced in the first half of the memoir. Results in this second half of the memoir include a new proof that injectivity of all limit operators (the classic Favard condition) implies invertibility for a general class of almost periodic operators, and characterisations of invertibility at infinity and Fredholmness for operators in the so-called Wiener algebra. In two final chapters our results are illustrated by and applied to concrete examples. Firstly, we study the spectra and essential spectra of discrete Schrödinger operators (both self-adjoint and non-self-adjoint), including operators with almost periodic and random potentials. In the final chapter we apply our results to integral operators on L^p(R^N).
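The discrete Schrödinger operators studied in the final chapters are easy to explore numerically: a finite truncation with a random potential of amplitude amp has all eigenvalues inside the Gershgorin band [−2 − amp, 2 + amp]. A small numerical sketch using finite sections only, not the memoir's limit-operator machinery:

```python
import numpy as np

rng = np.random.default_rng(0)

def schrodinger_eigs(n=400, amp=1.0):
    """Eigenvalues of a finite section of the 1-D discrete Schrödinger
    operator (Au)(k) = u(k-1) + u(k+1) + v(k) u(k), with i.i.d. potential
    v(k) uniform on [-amp, amp] and Dirichlet truncation at the ends."""
    v = rng.uniform(-amp, amp, n)
    A = np.diag(np.ones(n - 1), -1) + np.diag(np.ones(n - 1), 1) + np.diag(v)
    return np.linalg.eigvalsh(A)              # A is symmetric (self-adjoint)

e = schrodinger_eigs()
# every eigenvalue lies in [-3, 3] by the Gershgorin circle theorem
```

Plotting histograms of such finite-section spectra for growing n is a standard way to visualise the spectrum of the infinite random operator, though relating the two limits rigorously is precisely what the memoir's machinery is for.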

Relevance:

30.00%

Publisher:

Abstract:

The perturbed Hartree–Fock theory developed in the preceding paper is applied to LiH, BH, and HF, using limited basis-set SCF–MO wavefunctions derived by previous workers. The calculated values for the force constant ke and the dipole-moment derivative μ(1) are (experimental values in parentheses): LiH, ke = 1.618 (1.026) mdyn/Å, μ(1) = −18.77 (−2.0 ± 0.3) D/Å; BH, ke = 5.199 (3.032) mdyn/Å, μ(1) = −1.03 (−) D/Å; HF, ke = 12.90 (9.651) mdyn/Å, μ(1) = −2.15 (+1.50) D/Å. The values of the force on the proton were calculated exactly and according to the Hellmann–Feynman theorem in each case, and the discrepancies show that none of the wavefunctions used is close to the Hartree–Fock limit, so the large errors in ke and μ(1) are not surprising. However, no difficulties arose in the perturbed Hartree–Fock calculation, so the application of the theory to more accurate wavefunctions appears quite feasible.
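The quantities tabulated above can be illustrated with finite differences: ke is the second derivative of the energy curve, and μ(1) the first derivative of the dipole function, both at the equilibrium bond length. A sketch using a model Morse curve with arbitrary parameters (not the LiH/BH/HF surfaces of the paper):

```python
import math

def ke_and_mu1(E, mu, Re, h=1e-4):
    """Central finite differences for the force constant k_e = E''(Re)
    and the dipole-moment derivative mu(1) = mu'(Re)."""
    ke = (E(Re + h) - 2.0 * E(Re) + E(Re - h)) / h ** 2
    mu1 = (mu(Re + h) - mu(Re - h)) / (2.0 * h)
    return ke, mu1

# Illustrative Morse energy curve and dipole function (arbitrary units and
# parameters); for a Morse curve the exact force constant is 2*D*a**2.
D, a, Re = 2.5, 1.2, 1.6
E = lambda R: D * (1.0 - math.exp(-a * (R - Re))) ** 2
mu_fn = lambda R: 0.8 * R * math.exp(-0.1 * R)

ke, mu1 = ke_and_mu1(E, mu_fn, Re)
```

The analytic Morse value 2Da² gives an immediate check on the second-difference formula, which is useful before applying the same stencil to an ab initio energy curve.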

Relevance:

30.00%

Publisher:

Abstract:

The theory of harmonic force constant refinement calculations is reviewed, and a general-purpose program for force constant and normal coordinate calculations is described. The program, called ASYM20, is available through Quantum Chemistry Program Exchange. It will work on molecules of any symmetry containing up to 20 atoms and will produce results on a series of isotopomers as desired. The vibrational secular equations are solved in either nonredundant valence internal coordinates or symmetry coordinates. As well as calculating the (harmonic) vibrational wavenumbers and normal coordinates, the program will calculate centrifugal distortion constants, Coriolis zeta constants, harmonic contributions to the α's, root-mean-square amplitudes of vibration, and other quantities related to gas electron-diffraction studies and thermodynamic properties. The program will work in either a predict mode, in which it calculates results from an input force field, or in a refine mode, in which it refines an input force field by least squares to fit observed data on the quantities mentioned above. Predicate values of the force constants may be included in the data set for a least-squares refinement. The program is written in FORTRAN for use on a PC or a mainframe computer. Operation is mainly controlled by steering indices in the input data file, but some interactive control is also implemented.
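In predict mode, the simplest case of such a calculation is a diatomic molecule, where the GF secular problem collapses to a single eigenvalue λ = F/μ and the harmonic wavenumber is √(F/μ)/(2πc). A sketch (CO is chosen as a familiar example with a textbook stretching force constant; this is an illustration, not ASYM20 itself):

```python
import math

AMU = 1.66053906660e-27      # atomic mass unit, kg
C_CM = 2.99792458e10         # speed of light, cm/s

def diatomic_wavenumber(f_mdyn_per_A, m1, m2):
    """Harmonic wavenumber (cm^-1) of a diatomic from the stretching force
    constant (in mdyn/Å, i.e. 100 N/m) and the atomic masses in amu."""
    k = f_mdyn_per_A * 100.0                  # convert mdyn/Å to N/m
    mu = m1 * m2 / (m1 + m2) * AMU            # reduced mass, kg
    return math.sqrt(k / mu) / (2.0 * math.pi * C_CM)

w = diatomic_wavenumber(19.02, 12.0, 15.9949)  # CO, k ≈ 19.02 mdyn/Å
```

With the CO force constant of roughly 19 mdyn/Å this reproduces the well-known harmonic wavenumber near 2170 cm⁻¹; a polyatomic calculation solves the same eigenvalue problem with full G and F matrices.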

Relevance:

30.00%

Publisher:

Abstract:

1. The management of threatened species is an important practical way in which conservationists can intervene in the extinction process and reduce the loss of biodiversity. Understanding the causes of population declines (past, present and future) is pivotal to designing effective practical management. This is the declining-population paradigm identified by Caughley. 2. There are three broad classes of ecological tool used by conservationists to guide management decisions for threatened species: statistical models of habitat use, demographic models and behaviour-based models. Each of these is described here, illustrated with a case study and evaluated critically in terms of its practical application. 3. These tools are fundamentally different. Statistical models of habitat use and demographic models both use descriptions of patterns in abundance and demography, in relation to a range of factors, to inform management decisions. In contrast, behaviour-based models describe the evolutionary processes underlying these patterns, and derive such patterns from the strategies employed by individuals when competing for resources under a specific set of environmental conditions. 4. Statistical models of habitat use and demographic models have been used successfully to make management recommendations for declining populations. To do this, assumptions are made about population growth or vital rates that will apply when environmental conditions are restored, based on either past data collected under favourable environmental conditions or estimates of these parameters when the agent of decline is removed. As a result, they can only be used to make reliable quantitative predictions about future environments when a comparable environment has been experienced by the population of interest in the past. 5. Many future changes in the environment driven by management will not have been experienced by a population in the past.
Under these circumstances, vital rates and their relationship with population density will change in the future in a way that is not predictable from past patterns. Reliable quantitative predictions about population-level responses then need to be based on an explicit consideration of the evolutionary processes operating at the individual level. 6. Synthesis and applications. It is argued that evolutionary theory underpins Caughley’s declining-population paradigm, and that it needs to become much more widely used within mainstream conservation biology. This will help conservationists examine critically the reliability of the tools they have traditionally used to aid management decision-making. It will also give them access to alternative tools, particularly when predictions are required for changes in the environment that have not been experienced by a population in the past.
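Of the three tool classes above, the demographic one is the most compact to illustrate: the dominant eigenvalue of a matrix projection (Leslie) model gives the asymptotic population growth rate, with a value below 1 indicating decline. A sketch with entirely hypothetical vital rates:

```python
import numpy as np

# Leslie matrix for three age classes: top row holds fecundities, the
# sub-diagonal holds survival probabilities between age classes.
# All vital rates are hypothetical, for illustration only.
L = np.array([[0.0, 1.2, 1.5],
              [0.5, 0.0, 0.0],
              [0.0, 0.4, 0.0]])

lam = max(abs(np.linalg.eigvals(L)))   # asymptotic growth rate (lambda)
declining = lam < 1.0                  # lambda < 1 => population decline
```

As point 4 notes, the management value of such a projection hinges on whether the vital rates were estimated under conditions comparable to the future environment being managed for.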

Relevance:

30.00%

Publisher:

Abstract:

Population subdivision complicates analysis of molecular variation. Even if neutrality is assumed, three evolutionary forces need to be considered: migration, mutation, and drift. Simplification can be achieved by assuming that the process of migration among, and drift within, subpopulations occurs fast compared to mutation and drift in the entire population. This allows a two-step approach in the analysis: (i) analysis of population subdivision and (ii) analysis of molecular variation in the migrant pool. We model population subdivision using an infinite island model, where we allow the migration/drift parameter Theta to vary among populations. Thus, central and peripheral populations can be differentiated. For inference of Theta, we use a coalescence approach, implemented via a Markov chain Monte Carlo (MCMC) integration method that allows estimation of allele frequencies in the migrant pool. The second step of this approach (analysis of molecular variation in the migrant pool) uses the estimated allele frequencies in the migrant pool for the study of molecular variation. We apply this method to a Drosophila ananassae sequence data set. We find little indication of isolation by distance, but large differences in the migration parameter among populations. The population as a whole seems to be expanding. A population from Bogor (Java, Indonesia) shows the highest variation and seems closest to the species center.
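The infinite island model underlying the analysis can be simulated directly: each deme drifts by binomial sampling and receives a fraction m of migrants from a common pool, and at equilibrium the among-deme differentiation is approximately 1/(1 + 2Nm) in the haploid case. A forward-simulation sketch (all parameter values are illustrative, and this is not the paper's coalescent MCMC):

```python
import numpy as np

rng = np.random.default_rng(2)

# Haploid Wright island model: N genes per deme, migration fraction m per
# generation from a migrant pool with allele frequency p.
N, m, p, demes, gens = 100, 0.02, 0.5, 2000, 400

x = np.full(demes, p)                   # allele frequency in each deme
for _ in range(gens):
    y = (1.0 - m) * x + m * p           # deterministic pull toward the pool
    x = rng.binomial(N, y) / N          # binomial drift within each deme

# among-deme differentiation, expected ≈ 1/(1 + 2*N*m) = 0.2 here
fst = x.var() / (p * (1.0 - p))
```

Running this with different m per deme reproduces the paper's point that a single migration/drift parameter cannot describe central and peripheral populations at once.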

Relevance:

30.00%

Publisher:

Abstract:

Formal and analytical models that contractors can use to assess and price project risk at the tender stage have proliferated in recent years. However, they are rarely used in practice. Introducing more models would, therefore, not necessarily help. A better understanding is needed of how contractors arrive at a bid price in practice, and how, and in what circumstances, risk apportionment actually influences pricing levels. More than 60 proposed risk models for contractors that are published in journals were examined and classified. Then exploratory interviews with five UK contractors and documentary analyses on how contractors price work generally and risk specifically were carried out to help in comparing the propositions from the literature to what contractors actually do. No comprehensive literature on the real bidding processes used in practice was found, and there is no evidence that pricing is systematic. Hence, systematic risk and pricing models for contractors may have no justifiable basis. Contractors process their bids through certain tendering gateways. They acknowledge the risk that they should price. However, the final settlement depends on a set of complex, micro-economic factors. Hence, risk accountability may be smaller than its true cost to the contractor. Risk apportionment occurs at three stages of the whole bid-pricing process. However, analytical approaches tend not to incorporate this, although they could.