967 results for Function theory


Relevance: 30.00%

Abstract:

This thesis studies decision making under uncertainty and how economic agents respond to information. The classic model of subjective expected utility and Bayesian updating is often at odds with empirical and experimental results: people exhibit systematic biases in information processing and are often averse to ambiguity. The aim of this work is to develop simple models that capture observed biases and study their economic implications.

In the first chapter I present an axiomatic model of cognitive dissonance, in which an agent's response to information explicitly depends upon past actions. I introduce novel behavioral axioms and derive a representation in which beliefs are directionally updated. The agent distorts the information, overweighting states in which his past actions provide a higher payoff. I then characterize two special cases of the representation. In the first case, the agent distorts the likelihood ratio of two states by a function of the utility values of the previous action in those states. In the second case, the agent's posterior beliefs are a convex combination of the Bayesian belief and the one which maximizes the conditional value of the previous action. Within the second case a unique parameter captures the agent's sensitivity to dissonance, and I characterize a way to compare sensitivity to dissonance across individuals. Lastly, I develop several simple applications and show that cognitive dissonance contributes to the equity premium and price volatility, asymmetric reaction to news, and belief polarization.
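
In LaTeX notation, a minimal sketch of the second special case (the symbols are my own labels, not necessarily the thesis's):

$$\mu_E \;=\; (1-\delta)\,\mu_E^{B} \;+\; \delta\,\nu_E^{a}, \qquad \delta \in [0,1],$$

where $\mu_E^{B}$ is the Bayesian posterior after observing event $E$, $\nu_E^{a}$ is the belief on $E$ that maximizes the conditional expected value of the past action $a$, and $\delta$ is the sensitivity-to-dissonance parameter.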

The second chapter characterizes a decision maker with sticky beliefs: one who does not update as much in response to information as a Bayesian decision maker would. This chapter provides axiomatic foundations for sticky beliefs by weakening the standard axioms of dynamic consistency and consequentialism. I derive a representation in which updated beliefs are a convex combination of the prior and the Bayesian posterior. A unique parameter captures the weight on the prior and is interpreted as the agent's degree of belief stickiness or conservatism bias. This parameter is endogenously identified from preferences and is easily elicited from experimental data.
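
A hedged sketch of the representation in LaTeX notation (the parameter name $\lambda$ is mine):

$$\mu_E \;=\; \lambda\,\mu \;+\; (1-\lambda)\,\mathrm{Bayes}(\mu, E), \qquad \lambda \in [0,1],$$

where $\mu$ is the prior, $\mathrm{Bayes}(\mu, E)$ is the Bayesian posterior given event $E$, and $\lambda$ is the stickiness weight: $\lambda = 0$ recovers standard Bayesian updating, while $\lambda = 1$ describes an agent whose beliefs never move.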

The third chapter deals with updating in the face of ambiguity, using the framework of Gilboa and Schmeidler. There is no consensus on the correct way to update a set of priors. Current methods either do not allow a decision maker to make an inference about her priors or require an extreme level of inference. In this chapter I propose and axiomatize a general model of updating a set of priors. A decision maker who updates her beliefs in accordance with the model can be thought of as one who chooses a threshold that is used to determine whether a prior is plausible, given some observation. She retains the plausible priors and applies Bayes' rule to each. This model includes generalized Bayesian updating and maximum likelihood updating as special cases.
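
The threshold rule is simple enough to sketch in code. The following Python fragment is an illustrative reading of the model over a finite state space; the function name and parameterization are hypothetical, not the thesis's notation:

```python
import numpy as np

def update_prior_set(priors, likelihood, alpha):
    """Threshold updating of a set of priors (illustrative sketch).

    priors:     list of probability vectors over a finite state space
    likelihood: P(observation | state), one entry per state
    alpha:      threshold in (0, 1]; alpha -> 0 approaches generalized
                Bayesian updating (keep every prior consistent with the
                data), alpha = 1 gives maximum likelihood updating.
    """
    # Probability each prior assigns to the observation.
    p_obs = np.array([p @ likelihood for p in priors])
    # A prior is plausible if it assigns the observation a probability
    # close enough to the best such probability in the set.
    plausible = (p_obs > 0) & (p_obs >= alpha * p_obs.max())
    # Retain the plausible priors and apply Bayes' rule to each.
    return [p * likelihood / po
            for p, po, keep in zip(priors, p_obs, plausible) if keep]
```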

Relevance: 30.00%

Abstract:

The study of codes, classically motivated by the need to communicate information reliably in the presence of error, has found new life in fields as diverse as network communication and distributed storage of data, and even has connections to the design of linear measurements used in compressive sensing. But in all contexts, a code typically involves exploiting the algebraic or geometric structure underlying an application. In this thesis, we examine several problems in coding theory and try to gain some insight into the algebraic structure behind them.

The first is the study of the entropy region - the space of all possible vectors of joint entropies which can arise from a set of discrete random variables. Understanding this region is essentially the key to optimizing network codes for a given network. To this end, we employ a group-theoretic method of constructing random variables producing so-called "group-characterizable" entropy vectors, which are capable of approximating any point in the entropy region. We show how small groups can be used to produce entropy vectors which violate the Ingleton inequality, a fundamental bound on entropy vectors arising from the random variables involved in linear network codes. We discuss the suitability of these groups to design codes for networks which could potentially outperform linear coding.
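
For concreteness, here is a small Python sketch of the group-characterizable construction and the four-variable Ingleton check; the function names are mine, and the toy abelian example shown cannot itself violate Ingleton (violations require larger nonabelian groups):

```python
import itertools, math

def entropy_vector(G, subgroups):
    """h_S = log2(|G| / |intersection of G_i for i in S|) for each
    nonempty S -- the group-characterizable construction."""
    h = {}
    n = len(subgroups)
    for r in range(1, n + 1):
        for S in itertools.combinations(range(n), r):
            inter = set(G)
            for i in S:
                inter &= set(subgroups[i])
            h[S] = math.log2(len(G) / len(inter))
    return h

def ingleton_gap(h):
    """LHS - RHS of the Ingleton inequality for four variables,
    h01+h02+h03+h12+h13 >= h0+h1+h23+h012+h013;
    a negative value is a violation."""
    lhs = h[(0, 1)] + h[(0, 2)] + h[(0, 3)] + h[(1, 2)] + h[(1, 3)]
    rhs = h[(0,)] + h[(1,)] + h[(2, 3)] + h[(0, 1, 2)] + h[(0, 1, 3)]
    return lhs - rhs

# Toy example: Z6 with four subgroups (abelian, so the gap is >= 0).
G = range(6)
subs = [{0, 3}, {0, 2, 4}, {0}, set(range(6))]
print(ingleton_gap(entropy_vector(G, subs)))   # -> 0.0 here
```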

The second topic we discuss is the design of frames with low coherence, closely related to finding spherical codes in which the codewords are unit vectors spaced out around the unit sphere so as to minimize the magnitudes of their mutual inner products. We show how to build frames by selecting a cleverly chosen set of representations of a finite group to produce a "group code" as described by Slepian decades ago. We go on to reinterpret our method as selecting a subset of rows of a group Fourier matrix, allowing us to study and bound our frames' coherences using character theory. We discuss the usefulness of our frames in sparse signal recovery using linear measurements.
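
As an illustration of the Fourier-matrix viewpoint, the sketch below builds a frame from rows of the cyclic-group (DFT) Fourier matrix and computes its coherence. This is only the abelian special case, whereas the thesis works with representations of general finite groups, and the row choice here (quadratic residues mod 7) is one classical example rather than the thesis's construction:

```python
import numpy as np

def coherence(F):
    """Largest |<f_i, f_j>| over distinct unit-normalized frame
    vectors (the rows of F)."""
    F = F / np.linalg.norm(F, axis=1, keepdims=True)
    G = np.abs(F @ F.conj().T)      # Gram matrix magnitudes
    np.fill_diagonal(G, 0.0)
    return G.max()

# Frame from a subset of rows of the Fourier matrix of Z_7; the rows
# {1, 2, 4} are the quadratic residues mod 7.
n, rows = 7, [1, 2, 4]
W = np.exp(2j * np.pi * np.outer(rows, np.arange(n)) / n)
print(coherence(W.T))               # ~0.471 = sqrt(2)/3 for this frame
```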

The final problem we investigate is that of coding with constraints, most recently motivated by the demand for ways to encode large amounts of data using error-correcting codes so that any small loss can be recovered from a small set of surviving data. Most often, this involves using a systematic linear error-correcting code in which each parity symbol is constrained to be a function of some subset of the message symbols. We derive bounds on the minimum distance of such a code based on its constraints, and characterize when these bounds can be achieved using subcodes of Reed-Solomon codes.
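
A brute-force minimum-distance check for a small constrained code makes these objects concrete. The sketch below is binary and entirely illustrative; the thesis's constructions use subcodes of Reed-Solomon codes over larger fields:

```python
import itertools
import numpy as np

def min_distance(G):
    """Brute-force minimum distance of a binary linear code with
    generator matrix G (rows are basis codewords)."""
    k, n = G.shape
    best = n
    for msg in itertools.product([0, 1], repeat=k):
        if any(msg):
            w = int(((np.array(msg) @ G) % 2).sum())
            best = min(best, w)
    return best

# Systematic code in which each parity symbol is a function of a
# subset of the message symbols (an arbitrary toy constraint pattern).
G = np.array([[1, 0, 0, 1, 1, 0],
              [0, 1, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1]])
print(min_distance(G))   # -> 3 for this toy example
```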

Relevance: 30.00%

Abstract:

The effect on the scattering amplitude of the existence of a pole in the angular momentum plane near J = 1 in the channel with the quantum numbers of the vacuum is calculated. This is then compared with a fourth-order calculation of the scattering of neutral vector mesons from a fermion pair field in the limit of large momentum transfer. The presence of the third double spectral function in the perturbation amplitude complicates the identification of pole trajectory parameters, and the limitations of previous methods of treating this are discussed. A gauge-invariant scheme for extracting the contribution of the vacuum trajectory is presented which gives agreement with unitarity predictions, but further calculations must be done to determine the position and slope of the trajectory at s = 0. The residual portion of the amplitude is compared with the Gribov singularity.
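
For orientation, the contribution of a single Regge pole with trajectory $\alpha(t)$ takes the textbook asymptotic form (stated here as background, not quoted from the thesis):

$$A(s,t) \;\sim\; \beta(t)\,\frac{1 + e^{-i\pi\alpha(t)}}{\sin\pi\alpha(t)}\,s^{\alpha(t)}, \qquad s \to \infty,\ t \text{ fixed},$$

with $\alpha \approx 1$ at zero invariant for the vacuum (Pomeranchuk) trajectory; the task in the thesis is to isolate such a contribution in the fourth-order amplitude and determine the position and slope of the trajectory at s = 0.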

Relevance: 30.00%

Abstract:

The propagation of waves in an extended, irregular medium is studied under the "quasi-optics" and the "Markov random process" approximations. Under these assumptions, a Fokker-Planck equation satisfied by the characteristic functional of the random wave field is derived. A complete set of moment equations, with different transverse coordinates and different wavenumbers, is then obtained from the characteristic functional. The derivation does not require Gaussian statistics of the random medium, and the result can be applied to the time-dependent problem. We then solve the moment equations for the phase correlation function, angular broadening, temporal pulse smearing, intensity correlation function, and the probability distribution of the random waves. The necessary and sufficient conditions for strong scintillation are also given.
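
The starting point can be stated in a standard textbook form (notation mine, not quoted from the thesis): under the quasi-optics approximation, a monochromatic field $u$ propagating along $z$ through dielectric fluctuations $\epsilon_1$ obeys the parabolic equation

$$2ik\,\frac{\partial u}{\partial z} \;+\; \nabla_{\perp}^{2} u \;+\; k^{2}\epsilon_1(\mathbf{x}_{\perp}, z)\,u \;=\; 0,$$

and the Markov approximation treats $\epsilon_1$ as delta-correlated in $z$, which is what closes the hierarchy of moment equations.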

We also consider the problem of diffraction of waves by a random, phase-changing screen. The intensity correlation function is solved in the whole Fresnel diffraction region and the temporal pulse broadening function is derived rigorously from the wave equation.

The method of smooth perturbations is applied to interplanetary scintillations. We formulate and calculate the effects of the solar-wind velocity fluctuations on the observed intensity power spectrum and on the ratio of the observed "pattern" velocity to the true velocity of the solar wind in the three-dimensional spherical model. The r.m.s. solar-wind velocity fluctuations are found to be ~200 km/sec in the region about 20 solar radii from the Sun.

We then interpret the observed interstellar scintillation data using the theories derived under the Markov approximation, which are also valid for strong scintillation. We find that the Kolmogorov power-law spectrum with an outer scale of 10 to 100 pc fits the scintillation data and that the ambient averaged electron density in the interstellar medium is about 0.025 cm^-3. It is also found that there exists a region of strong electron density fluctuation, with thickness ~10 pc and mean electron density ~7 cm^-3, between the pulsar PSR 0833-45 and the Earth.

Relevance: 30.00%

Abstract:

The Maxwell integral equations of transfer are applied to a series of problems involving flows of arbitrary-density gases about spheres. As suggested by Lees, a two-sided Maxwellian-like weighting function containing a number of free parameters is utilized, and a sufficient number of partial differential moment equations is used to determine these parameters. Maxwell's inverse fifth-power force law is used to simplify the evaluation of the collision integrals appearing in the moment equations. All flow quantities are then determined by integration of the weighting function which results from the solution of the differential moment system. Three problems are treated: the heat flux from a slightly heated sphere at rest in an infinite gas; the velocity field and drag of a slowly moving sphere in an unbounded space; and the velocity field and drag torque on a slowly rotating sphere. Solutions to the third problem are found to both first and second order in surface Mach number, with the secondary centrifugal fan motion being of particular interest. Singular aspects of the moment method are encountered in the last two problems, and an asymptotic study of these difficulties leads to a formal criterion for a "well posed" moment system. The previously unanswered question of just how many moments must be used in a specific problem is now clarified to a great extent.
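
Lees' two-sided weighting function can be sketched in a standard form (notation mine): the distribution is taken to be Maxwellian-like, with different free parameters on the two sides of the cone of velocities subtended by the sphere,

$$f(\mathbf{r}, \mathbf{v}) \;=\; \frac{n_i(\mathbf{r})}{[2\pi R\,T_i(\mathbf{r})]^{3/2}} \exp\!\left(-\frac{|\mathbf{v} - \mathbf{u}_i(\mathbf{r})|^2}{2 R\,T_i(\mathbf{r})}\right), \qquad i = 1, 2,$$

and the parameters $n_i$, $\mathbf{u}_i$, $T_i$ are fixed by a matching number of steady-state moment equations $\int Q(\mathbf{v})\,(\mathbf{v}\cdot\nabla f)\,d^3v = \Delta[Q]$, where $\Delta[Q]$ is the collision integral for the moment $Q$.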

Relevance: 30.00%

Abstract:

The object of this report is to calculate the electron density profile of plane-stratified inhomogeneous plasmas. The profile is obtained through a numerical solution of the inverse scattering algorithm.

The inverse scattering algorithm connects the time-dependent reflected field, resulting from a δ-function field incident normally on the plasma, to the inhomogeneous plasma density.
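
In standard (SI) form, the link between the scattering data and the density is that a wave of frequency $\omega$ at normal incidence obeys

$$\frac{d^2 E}{dz^2} + \frac{\omega^2 - \omega_p^2(z)}{c^2}\,E = 0, \qquad \omega_p^2(z) = \frac{n_e(z)\,e^2}{\epsilon_0 m_e},$$

so the stratified plasma acts as a Schrödinger-type potential $q(z) = \omega_p^2(z)/c^2$, and reconstructing $q$ from the impulse reflection response yields the electron density profile $n_e(z)$.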

Examples show that the method uniquely determines the electron density at or behind maxima of the plasma frequency.

It is shown that the δ-function incident field used in the inverse scattering algorithm can be replaced by a thin square pulse.

Relevance: 30.00%

Abstract:

The objective of this dissertation is to study the theory of distributions and some of its applications. Certain concepts which we would nowadays include in the theory of distributions have long been widely used in several fields of mathematics and physics. It was Dirac who first introduced the delta function as we know it, in an attempt to keep a convenient notation in his work on quantum mechanics. His work helped open a new path in mathematics, as new objects, similar to functions but different in nature, came into systematic use. Distributions are believed to have been first formally introduced by the Soviet mathematician Sergei Sobolev and by Laurent Schwartz. The aim of this project is to show how distribution theory can be used to obtain what we call fundamental solutions of partial differential equations.
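
The central definition can be stated briefly: a fundamental solution of a constant-coefficient operator $P(\partial)$ is a distribution $E$ satisfying

$$P(\partial)\,E = \delta,$$

so that $u = E * f$ solves $P(\partial)\,u = f$ whenever the convolution is defined. For example, for the Laplacian on $\mathbb{R}^3$ one has $E(x) = -\dfrac{1}{4\pi |x|}$.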

Relevance: 30.00%

Abstract:

Prefrontal function is an important topic in neurobiology. Drawing on experimental results from neuroanatomy, neurophysiology, and the behavioral sciences, together with principles from cybernetics and information theory, this paper constructs a simple model of prefrontal control function and uses it to simulate the behavior of Macaca mulatta performing delayed-response tasks both before and after damage to the prefrontal cortex. The simulations show a clear difference in the capacity to complete delayed-response tasks between normal monkeys and those with the prefrontal cortex removed, in agreement with experiment. The authors suggest that the factors affecting delayed-response performance lie in the keeping and retrieval of information in memory, rather than in the information storage process itself.

Relevance: 30.00%

Abstract:

A fundamental problem in the analysis of structured relational data like graphs, networks, databases, and matrices is to extract a summary of the common structure underlying relations between individual entities. Relational data are typically encoded in the form of arrays; invariance to the ordering of rows and columns corresponds to exchangeable arrays. Results in probability theory due to Aldous, Hoover and Kallenberg show that exchangeable arrays can be represented in terms of a random measurable function which constitutes the natural model parameter in a Bayesian model. We obtain a flexible yet simple Bayesian nonparametric model by placing a Gaussian process prior on the parameter function. Efficient inference utilises elliptical slice sampling combined with a random sparse approximation to the Gaussian process. We demonstrate applications of the model to network data and clarify its relation to models in the literature, several of which emerge as special cases.
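
A minimal generative sketch of this representation, assuming a squared-exponential kernel and a logistic link (both illustrative choices of mine; the paper's inference via elliptical slice sampling is not shown):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20

# Aldous-Hoover: each node gets a latent uniform coordinate, and
# edges depend on a random function evaluated at coordinate pairs.
u = rng.uniform(size=n)
pairs = np.array([(a, b) for a in u for b in u])

# Draw the function values on the grid from a GP prior (RBF kernel).
d2 = ((pairs[:, None, :] - pairs[None, :, :]) ** 2).sum(-1)
K = np.exp(-d2 / 0.1) + 1e-8 * np.eye(n * n)
f = rng.multivariate_normal(np.zeros(n * n), K).reshape(n, n)
f = (f + f.T) / 2                  # symmetrize for an undirected graph

# Logistic link turns function values into edge probabilities.
P = 1.0 / (1.0 + np.exp(-f))
X = (rng.uniform(size=(n, n)) < P).astype(int)
X = np.triu(X, 1)
X = X + X.T                        # exchangeable adjacency matrix
```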

Relevance: 30.00%

Abstract:

Concepts of function are central to design, but statements about a device's functions can be interpreted in different ways. This raises problems for researchers trying to clarify the foundations of design theory and for those developing design support tools that can represent and reason about function. By showing how functions relate systems to their sub-systems and super-systems, this article illustrates some limitations of existing function terminology and some problems with existing function statements. To address these issues, a system-relative function terminology is introduced. This is used to demonstrate that systems function not only with respect to their most local super-system, but also with respect to their more global super-systems. © 2012 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

Electron multiplication charge-coupled devices (EMCCD) are widely used for photon counting experiments and measurements of low intensity light sources, and are extensively employed in biological fluorescence imaging applications. These devices have a complex statistical behaviour that is often not fully considered in the analysis of EMCCD data. Robust and optimal analysis of EMCCD images requires an understanding of their noise properties, in particular to exploit fully the advantages of Bayesian and maximum-likelihood analysis techniques, whose value is increasingly recognised in biological imaging for obtaining robust quantitative measurements from challenging data. To improve our own EMCCD analysis and as an effort to aid that of the wider bioimaging community, we present, explain and discuss a detailed physical model for EMCCD noise properties, giving a likelihood function for image counts in each pixel for a given incident intensity, and we explain how to measure the parameters for this model from various calibration images. © 2013 Hirsch et al.
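
As a rough sketch of what such a likelihood looks like, the Python fragment below implements a common simplification (Poisson photoelectrons, Gamma-distributed EM gain, Gaussian readout); the paper's full model and calibration procedure are more detailed, and the approximation of ignoring readout noise for nonzero counts is mine:

```python
import numpy as np
from scipy import stats

def emccd_loglike(counts, lam, gain, sigma, offset=0.0, nmax=30):
    """Approximate log-likelihood of EMCCD pixel counts.

    Simplified model:
      n ~ Poisson(lam)                      photoelectrons
      x | n > 0 ~ Gamma(shape=n, scale=gain)  EM-register amplification
      c = x + offset + Normal(0, sigma)       readout noise
    The sum over n is truncated at nmax.
    """
    c = np.asarray(counts, dtype=float) - offset
    # n = 0: only readout noise around the offset.
    like = stats.poisson.pmf(0, lam) * stats.norm.pdf(c, 0.0, sigma)
    for n in range(1, nmax + 1):
        w = stats.poisson.pmf(n, lam)
        # Gamma evaluated at c directly, i.e. readout noise neglected
        # for amplified counts (valid when sigma << gain).
        like += w * stats.gamma.pdf(c, a=n, scale=gain)
    return np.log(like).sum()
```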

Relevance: 30.00%

Abstract:

We develop an analytical theory of high-power passively mode-locked lasers with a slow absorber; the theory is valid at pulse energies well exceeding the saturation energy. We analyze the Haus modelocking master equation in the pulse-energy-domain representation, approximating the intensity profile function by a series in the vicinity of its peak value. We consider the high-power operation regime of subpicosecond blue-violet GaN mode-locked diode lasers, using the approach developed. © 2010 Springer Science+Business Media, Inc.
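
The Haus master equation being analyzed can be written in its standard form (notation mine, not quoted from the paper):

$$T_R\,\frac{\partial A}{\partial T} \;=\; \Big[\,g - \ell + D_g\,\frac{\partial^2}{\partial t^2} - q(t)\,\Big]\,A,$$

where $A(T,t)$ is the slowly varying pulse envelope, $T_R$ the round-trip time, $g$ and $\ell$ the saturated gain and linear loss, $D_g$ the gain-bandwidth curvature, and $q(t)$ the saturable absorption, which for a slow absorber obeys $\partial_t q = -(q - q_0)/\tau_A - q\,|A|^2/E_A$ with recovery time $\tau_A$ and saturation energy $E_A$.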

Relevance: 30.00%

Abstract:

Division of labour is a marked feature of multicellular organisms. Margulis proposed that the ancestors of metazoans had only one microtubule organizing center (MTOC), so they could not move and divide simultaneously; selection for simultaneous movement and cell division would then have driven the division of labour between cells. However, no evidence or explanation for this assumption was provided. Why could the unicellular ancestors not have multiple MTOCs? The gains and losses of three possible strategies are discussed. It was found that the advantage of one or two MTOCs per cell is environment-dependent. Unicellular organisms with only one MTOC per cell are favored only in resource-limited environments without strong predatory pressure. If division of labour in a bicellular organism merely makes simultaneous movement and cell division possible, the probability of its fixation by natural selection is very low, because a somatic cell performing the function of an MTOC is obviously wasting resources. Evolutionary biologists should search for other selective forces behind the division of labour between cells.

Relevance: 30.00%

Abstract:

Biomimetic pattern recognition (BPR), which is based on "cognition" instead of "classification", is much closer to the way human beings recognize patterns. The basis of BPR is the principle of homology-continuity (PHC): the difference between two samples of the same class must change gradually. The aim of BPR is to find an optimal covering in the feature space, which emphasizes the "similarity" among homologous group members rather than the "division" of traditional pattern recognition. Some applications of BPR are surveyed, in which the results of BPR are much better than those of the Support Vector Machine. A novel neuron model, the hyper-sausage neuron (HSN), is presented as a kind of covering unit in BPR. The mathematical description of HSN is given and its two-dimensional discriminant boundary is shown. In two special cases, in which samples are distributed along a line segment and along a circle, both HSN networks and RBF networks are used for covering. The results show that HSN networks generalize better than RBF networks, especially for small sample sets, which is consonant with the results of the applications of BPR. A brief explanation of the HSN networks' advantage in covering generally distributed samples is also given.
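
A minimal geometric reading of the covering unit in Python (my own illustration of the "sausage" idea, not the authors' network implementation): a sample is covered when it lies within radius r of the line segment joining two training samples.

```python
import numpy as np

def sausage_membership(x, p, q, r):
    """True if x lies within distance r of the segment from p to q."""
    x, p, q = map(np.asarray, (x, p, q))
    d = q - p
    # Project x onto the segment, clamping to the endpoints.
    t = np.clip(np.dot(x - p, d) / np.dot(d, d), 0.0, 1.0)
    nearest = p + t * d
    return np.linalg.norm(x - nearest) <= r

# Cover a one-dimensional manifold of samples (points on a curve)
# with sausages between consecutive training samples.
train = [np.array([np.cos(a), np.sin(a)]) for a in np.linspace(0, np.pi, 8)]
covered = any(sausage_membership([0.7, 0.72], train[i], train[i + 1], 0.15)
              for i in range(len(train) - 1))
print(covered)   # True: the test point lies near the covered curve
```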