959 results for Systematic theory
Abstract:
A molecular theory of underdamped dielectric relaxation of a dense dipolar liquid is presented. This theory properly takes into account the collective effects that are present in a dipolar liquid due to strong intermolecular correlations. For small rigid molecules, the theory again leads to a three-variable description which, however, differs somewhat from the traditional version. In particular, two of the three parameters are collective in nature and are determined by the orientational pair correlation function. A detailed comparison between the theory and the computer simulation results of Neria and Nitzan is performed, and excellent agreement is obtained without the use of any adjustable or free parameter: the calculation is fully microscopic. The theory can also provide a systematic description of the Poley absorption often observed in dipolar liquids in the high-frequency regime.
Abstract:
In this paper, we present a kinematic theory for Hoberman and other similar foldable linkages. By recognizing that the building blocks of such linkages can be modeled as planar linkages, different classes of possible solutions are systematically obtained, including some novel arrangements. Criteria for foldability are arrived at by analyzing the algebraic locus of the coupler curve of a PRRP linkage. They help explain generalized Hoberman and other mechanisms reported in the literature. New properties of such mechanisms are derived, including the extent of foldability, shape preservation of the inner and outer profiles, multi-segmented assemblies, and heterogeneous circumferential arrangements. The design equations derived here make the conception of even complex planar radially foldable mechanisms systematic and easy. Representative examples illustrate the use of the design equations and the kinematic theory.
Abstract:
The van der Waals and Platteeuw (vdWP) theory has been successfully used to model the thermodynamics of gas hydrates. However, earlier studies have shown that this success could be due to the presence of a large number of adjustable parameters whose values are obtained through regression against experimental data. To test this assertion, we carry out a systematic and rigorous study of the performance of the various models of vdWP theory that have been proposed over the years. The hydrate phase equilibrium data used for this study are obtained from Monte Carlo molecular simulations of methane hydrates. The parameters of the vdWP theory are regressed from these equilibrium data and compared with their true values obtained directly from simulations. This comparison reveals that (i) methane-water interactions beyond the first cage and methane-methane interactions make a significant contribution to the partition function and thus cannot be neglected, (ii) rigorous Monte Carlo integration should be used to evaluate the Langmuir constant instead of the spherically smoothed cell approximation, (iii) the parameter values describing the methane-water interactions cannot be correctly regressed from the equilibrium data using the vdWP theory in its present form, (iv) the regressed empty hydrate property values closely match their true values irrespective of the level of rigor in the theory, and (v) the flexibility of the water lattice forming the hydrate phase needs to be incorporated into the vdWP theory. Since methane is among the simplest of hydrate-forming molecules, these conclusions should also hold for more complicated hydrate guest molecules.
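Point (ii) concerns the Langmuir constant, which in the vdWP framework is a configurational integral of the guest-water interaction energy over the cage volume. A sketch in standard textbook notation (not necessarily this paper's symbols):

```latex
% Langmuir constant of guest J in cage m: configurational integral of
% the guest--water interaction energy w(\mathbf{r}) over the cage
C_{Jm} = \frac{1}{k_B T}
         \int_{V_{\text{cage}}} \exp\!\left(-\frac{w(\mathbf{r})}{k_B T}\right) d\mathbf{r}

% Spherically smoothed cell approximation: w is averaged so that it
% depends only on the radial distance r from the cage center
C_{Jm} \approx \frac{4\pi}{k_B T}
         \int_0^{R} \exp\!\left(-\frac{w(r)}{k_B T}\right) r^2\, dr
```

The paper's point (ii) is that the first (full three-dimensional) integral should be evaluated by Monte Carlo integration rather than replaced by the second, spherically averaged form.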
Abstract:
This thesis studies decision making under uncertainty and how economic agents respond to information. The classic model of subjective expected utility and Bayesian updating is often at odds with empirical and experimental results: people exhibit systematic biases in information processing and are often averse to ambiguity. The aim of this work is to develop simple models that capture observed biases and to study their economic implications.
In the first chapter I present an axiomatic model of cognitive dissonance, in which an agent's response to information explicitly depends upon past actions. I introduce novel behavioral axioms and derive a representation in which beliefs are directionally updated. The agent twists the information and overweights states in which his past actions provide a higher payoff. I then characterize two special cases of the representation. In the first case, the agent distorts the likelihood ratio of two states by a function of the utility values of the previous action in those states. In the second case, the agent's posterior beliefs are a convex combination of the Bayesian belief and the one which maximizes the conditional value of the previous action. Within the second case a unique parameter captures the agent's sensitivity to dissonance, and I characterize a way to compare sensitivity to dissonance between individuals. Lastly, I develop several simple applications and show that cognitive dissonance contributes to the equity premium and price volatility, asymmetric reaction to news, and belief polarization.
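The first special case can be sketched numerically. A minimal sketch, assuming a finite state space and an exponential tilt of the likelihood ratio; the function name `dissonant_update`, the exponential form, and the sensitivity parameter `kappa` are illustrative assumptions, not the chapter's exact specification:

```python
import numpy as np

def dissonant_update(prior, likelihood, past_utility, kappa):
    """Directionally distorted updating (illustrative sketch): the
    likelihood ratio between any two states is tilted by an increasing
    function of the past action's payoff in each state. The tilt
    exp(kappa * u) and the parameter kappa are assumptions chosen for
    illustration; kappa = 0 recovers Bayesian updating."""
    tilted = likelihood * np.exp(kappa * past_utility)
    post = prior * tilted
    return post / post.sum()

prior = np.array([0.5, 0.5])        # two states
lik = np.array([0.8, 0.4])          # signal twice as likely in state 0
u_past = np.array([0.0, 1.0])       # past action pays off in state 1
print(dissonant_update(prior, lik, u_past, 0.0))  # Bayesian: [2/3, 1/3]
print(dissonant_update(prior, lik, u_past, 1.0))  # overweights state 1
```

With kappa > 0 the agent overweights the state in which the past action pays more, as in the representation described above.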
The second chapter characterizes a decision maker with sticky beliefs, that is, one who does not update enough in response to information, where "enough" means as much as a Bayesian decision maker would. This chapter provides axiomatic foundations for sticky beliefs by weakening the standard axioms of dynamic consistency and consequentialism. I derive a representation in which updated beliefs are a convex combination of the prior and the Bayesian posterior. A unique parameter captures the weight on the prior and is interpreted as the agent's measure of belief stickiness or conservatism bias. This parameter is endogenously identified from preferences and is easily elicited from experimental data.
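The convex-combination representation is simple enough to state in a few lines. A minimal sketch (the name `sticky_update` and the two-state example are mine; the functional form is the one described above):

```python
import numpy as np

def sticky_update(prior, likelihood, stickiness):
    """Sticky-belief update: a convex combination of the prior and the
    Bayesian posterior. `stickiness` in [0, 1] is the weight on the
    prior; 0 recovers standard Bayesian updating, 1 means beliefs
    never move."""
    bayes = prior * likelihood
    bayes = bayes / bayes.sum()              # Bayesian posterior
    return stickiness * prior + (1 - stickiness) * bayes

# Two states, a signal twice as likely in state 0 as in state 1
prior = np.array([0.5, 0.5])
lik = np.array([0.8, 0.4])
print(sticky_update(prior, lik, 0.0))        # pure Bayesian: [2/3, 1/3]
print(sticky_update(prior, lik, 0.5))        # pulled halfway back to the prior
```

The single weight parameter is exactly what the chapter identifies from preferences.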
The third chapter deals with updating in the face of ambiguity, using the framework of Gilboa and Schmeidler. There is no consensus on the correct way to update a set of priors. Current methods either do not allow a decision maker to make an inference about her priors or require an extreme level of inference. In this chapter I propose and axiomatize a general model of updating a set of priors. A decision maker who updates her beliefs in accordance with the model can be thought of as choosing a threshold that determines whether a prior is plausible given some observation. She retains the plausible priors and applies Bayes' rule to each. This model includes generalized Bayesian updating and maximum likelihood updating as special cases.
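The threshold rule can be sketched concretely. A minimal sketch under my own reading of the model (the function name, the relative-threshold convention, and the toy priors are illustrative assumptions):

```python
import numpy as np

def update_prior_set(priors, event, threshold):
    """Threshold updating of a set of priors (sketch). A prior is
    deemed plausible if the probability it assigns to the observed
    event is at least `threshold` times the maximum such probability
    over the set; plausible priors are conditioned on the event by
    Bayes' rule. threshold -> 0 recovers generalized (full) Bayesian
    updating; threshold = 1 recovers maximum likelihood updating."""
    probs = [p[event].sum() for p in priors]
    cutoff = threshold * max(probs)
    posteriors = []
    for p, pr in zip(priors, probs):
        if pr >= cutoff and pr > 0:
            q = np.zeros_like(p)
            q[event] = p[event] / pr          # Bayes' rule on the event
            posteriors.append(q)
    return posteriors

# Three states; the event {0, 1} is observed
priors = [np.array([0.2, 0.2, 0.6]), np.array([0.5, 0.4, 0.1])]
print(len(update_prior_set(priors, [0, 1], 1.0)))   # ML updating keeps 1 prior
print(len(update_prior_set(priors, [0, 1], 0.0)))   # full Bayesian keeps both
```

Intermediate thresholds interpolate between the two named special cases, which is the sense in which the model generalizes both.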
Abstract:
The study of codes, classically motivated by the need to communicate information reliably in the presence of error, has found new life in fields as diverse as network communication, distributed storage of data, and even has connections to the design of linear measurements used in compressive sensing. But in all contexts, a code typically involves exploiting the algebraic or geometric structure underlying an application. In this thesis, we examine several problems in coding theory, and try to gain some insight into the algebraic structure behind them.
The first is the study of the entropy region - the space of all possible vectors of joint entropies which can arise from a set of discrete random variables. Understanding this region is essentially the key to optimizing network codes for a given network. To this end, we employ a group-theoretic method of constructing random variables producing so-called "group-characterizable" entropy vectors, which are capable of approximating any point in the entropy region. We show how small groups can be used to produce entropy vectors which violate the Ingleton inequality, a fundamental bound on entropy vectors arising from the random variables involved in linear network codes. We discuss the suitability of these groups to design codes for networks which could potentially outperform linear coding.
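The Ingleton inequality can be checked numerically for the entropy vector of any four discrete random variables. A minimal sketch (helper names `H`, `I`, and `ingleton_gap` are mine; the inequality is stated in its standard mutual-information form):

```python
import numpy as np

def H(joint, axes):
    """Shannon entropy (bits) of the marginal of the pmf `joint` on `axes`."""
    drop = tuple(ax for ax in range(joint.ndim) if ax not in axes)
    m = np.atleast_1d(joint.sum(axis=drop)).ravel()
    m = m[m > 0]
    return float(-(m * np.log2(m)).sum())

def I(joint, a, b, c=()):
    """Conditional mutual information I(Xa; Xb | Xc) in bits."""
    return H(joint, a + c) + H(joint, b + c) - H(joint, c) - H(joint, a + b + c)

def ingleton_gap(joint):
    """RHS minus LHS of the Ingleton inequality
    I(X1;X2) <= I(X1;X2|X3) + I(X1;X2|X4) + I(X3;X4);
    nonnegative for entropy vectors arising from linear network codes,
    but violated by some general (e.g. group-characterizable) vectors."""
    return (I(joint, (0,), (1,), (2,)) + I(joint, (0,), (1,), (3,))
            + I(joint, (2,), (3,)) - I(joint, (0,), (1,)))

iid = np.full((2, 2, 2, 2), 1 / 16)   # four independent fair bits
print(ingleton_gap(iid))               # → 0.0
```

A negative gap for some joint distribution certifies an entropy vector that no linear network code can realize, which is why such violations matter for network coding.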
The second topic we discuss is the design of frames with low coherence, closely related to finding spherical codes in which the codewords are unit vectors spaced out around the unit sphere so as to minimize the magnitudes of their mutual inner products. We show how to build frames by selecting a cleverly chosen set of representations of a finite group to produce a "group code" as described by Slepian decades ago. We go on to reinterpret our method as selecting a subset of rows of a group Fourier matrix, allowing us to study and bound our frames' coherences using character theory. We discuss the usefulness of our frames in sparse signal recovery using linear measurements.
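The coherence objective described above is easy to compute directly. A minimal sketch (the `coherence` helper and the planar example are mine, not the thesis's group-theoretic construction):

```python
import numpy as np

def coherence(F):
    """Coherence of a frame: the largest magnitude of the inner product
    between two distinct unit-norm columns of F."""
    F = F / np.linalg.norm(F, axis=0)    # normalize columns
    G = np.abs(F.conj().T @ F)           # magnitudes of the Gram matrix
    np.fill_diagonal(G, 0)               # ignore self inner products
    return G.max()

# Three unit vectors in the plane, 120 degrees apart ("Mercedes-Benz"
# frame); their pairwise inner products all have magnitude 1/2, which
# meets the Welch lower bound for 3 vectors in R^2
angles = np.pi / 2 + np.array([0, 2 * np.pi / 3, 4 * np.pi / 3])
F = np.vstack([np.cos(angles), np.sin(angles)])
print(coherence(F))                      # ≈ 0.5
```

Low coherence is exactly the property that makes such frames useful as measurement matrices in sparse signal recovery.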
The final problem we investigate is that of coding with constraints, most recently motivated by the demand for ways to encode large amounts of data using error-correcting codes so that any small loss can be recovered from a small set of surviving data. Most often, this involves using a systematic linear error-correcting code in which each parity symbol is constrained to be a function of some subset of the message symbols. We derive bounds on the minimum distance of such a code based on its constraints, and characterize when these bounds can be achieved using subcodes of Reed-Solomon codes.
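The objects in this last problem are concrete enough for a toy example. A minimal sketch of a systematic code and a brute-force minimum-distance check (this is only an illustration of the setting, not the distance bounds derived in the thesis):

```python
import itertools
import numpy as np

def min_distance(G):
    """Minimum distance of a binary linear code with generator matrix G,
    found by brute force over all nonzero messages (fine for small k)."""
    k, n = G.shape
    best = n + 1
    for msg in itertools.product([0, 1], repeat=k):
        if any(msg):
            cw = np.dot(msg, G) % 2          # encode the message
            best = min(best, int(cw.sum()))  # Hamming weight of codeword
    return best

# Systematic generator of the [7,4] Hamming code, G = [I | P]: each of
# the three parity symbols is a function of a subset of the four
# message symbols, as in the constrained-coding setting above
P = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])
print(min_distance(G))   # → 3
```

The question the thesis studies is how such parity constraints limit the achievable minimum distance, and when Reed-Solomon subcodes meet that limit.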
Abstract:
The ground-state properties of Hs nuclei are studied in the framework of relativistic mean-field theory. We find that the relatively more stable isotopes are located on the proton-abundant side of the isotopic chain. The last stable nucleus near the proton drip line is probably (255)Hs. The alpha-decay half-lives of Hs nuclei are predicted, and together with the evaluation of the spontaneous-fission half-lives it is shown that the nuclei possibly stable against spontaneous fission are (263-274)Hs. This coincides with their larger binding energies per nucleon. If (271-274)Hs can be synthesized and identified, only those nuclei of the Z = 118 isotopic chain that are lighter than (294)118, together with the nuclei in the corresponding alpha-decay chains, lead to Hs nuclei. The most stable unknown Hs nucleus is (268)Hs. The density-dependent delta interaction pairing is used to improve the BCS pairing correction, which results in more reasonable single-particle energy level distributions and nucleon occupation probabilities. It is shown that the properties of nuclei in the superheavy region can be described with this interaction.
Abstract:
Facilitated alkali metal ion (M+ = Li+, Na+, K+, Rb+, and Cs+) transfers across micro- and nano-water/1,2-dichloroethane (W/DCE) interfaces supported at the tips of micro- and nanopipets by dibenzo-18-crown-6 (DB18C6) have been investigated systematically using cyclic voltammetry. The theory developed by Matsuda et al. was applied to estimate the association constants of DB18C6 and M+ in the DCE phase from the experimental voltammetric results. Kinetic measurements of alkali metal ion transfer across the W/DCE interface facilitated by DB18C6 were conducted using nanopipets or submicropipets, and the standard rate constants (k(0)) were evaluated by analysis of the experimental voltammetric data. They increase in the following order: k(Cs+)(0) < k(Li+)(0) < k(Rb+)(0) < k(Na+)(0) < k(K+)(0), which is in accordance with the order of their association constants except for Cs+ and Li+.
Abstract:
Background: Accommodating Interruptions is a theory that emerged in the context of young people who have asthma. A background to the prevalence and management of asthma in Ireland is given to situate the theory. Ireland has the fourth highest incidence of asthma in the world, with almost one in five Irish young people having asthma. Although national and international asthma management guidelines exist, it is accepted that symptom control of asthma among young people is poor. Aim: The aim of this research is to investigate the lives of young people who have asthma, allowing a deeper understanding of the issues affecting them. Methods: This research was undertaken using a Classic Grounded Theory approach, a systematic approach that allows concepts to emerge from the data, generating a theory that explains the behaviour through which participants resolve their main concern. The data were collected through in-depth interviews with young people aged 11-16 years who had had asthma for over one year. Data were also collected from participant diaries. Constant comparative analysis, theoretical coding and memo writing were used to develop the theory. Results: The theory explains how young people resolve their main concern of being restricted by maximizing their participation and inclusion in activities, events and relationships in spite of their asthma. They achieve this by accommodating interruptions in their lives, minimizing the effects of asthma on their everyday lives. Conclusion: The theory of accommodating interruptions explains young people's asthma management behaviours in a new way. It allows us to understand how and why young people behave the way they do in order to minimize the effect of asthma on their lives. The theory adds to the body of knowledge on young people with asthma and challenges some viewpoints regarding their behaviours.
Abstract:
We present an improved nonlinear theory for the perpendicular transport of charged particles. This approach is based on an improved nonlinear treatment of field-line random walk in combination with a generalized compound diffusion model. The generalized compound diffusion model employed is more systematic and reliable than previous theories. Furthermore, the theory shows remarkably good agreement with test-particle simulations and solar wind observations.
Abstract:
Prior research has argued that use of optional properties in conceptual models results in loss of information about the semantics of the domains represented by the models. Empirical research undertaken to date supports this argument. Nevertheless, no systematic analysis has been done of whether use of optional properties is always problematic. Furthermore, prior empirical research might have deliberately or unwittingly employed models where use of optionality always causes problems. Accordingly, we examine analytically whether use of optional properties is always problematic. We employ our analytical results to inform the design of an experiment where we systematically examined the impact of optionality on users’ ability to understand domains represented by different types of conceptual models. We found evidence that use of optionality undermines users’ ability to understand the domain represented by a model but that this effect weakens when use of mandatory properties to replace optional properties leads to more-complex models.
Abstract:
Many children are cared for on a full-time basis by relatives or adult friends rather than their biological parents, often in response to family crises. These kinship care arrangements have received increasing attention from the social science academy and the social care professions. However, more information is needed on informal kinship care that is undertaken without official ratification by welfare agencies and is often unsupported by the state. This article presents a comprehensive narrative review of the international research literature on informal kinship care to address this gap. Using systematic search and review protocols, it synthesises findings regarding: (i) the way that informal kinship care is defined and conceptualised; (ii) the needs of the carers and children; and (iii) ways of supporting this type of care. A number of prominent themes are highlighted, including the lack of definitional clarity, the various adversities experienced by the families, and the requirement to understand the interface between formal and informal supports. Key messages are identified to inform the development of family-friendly policies, interventions, and future research.
Abstract:
An analytic method to evaluate nuclear contributions to the electrical properties of polyatomic molecules is presented. Such contributions control the changes induced by an electric field on the equilibrium geometry (nuclear relaxation contribution) and vibrational motion (vibrational contribution) of a molecular system. Expressions to compute the nuclear contributions have been derived from a power series expansion of the potential energy. These contributions to the electrical properties are given in terms of energy derivatives with respect to normal coordinates, electric field intensity, or both. Only one calculation of such derivatives at the field-free equilibrium geometry is required. To demonstrate the efficiency of the analytical evaluation of electrical properties (the AEEP method), results of calculations on water and pyridine at the SCF/TZ2P and MP2/TZ2P levels of theory are reported. The results are compared with previous theoretical calculations and with experimental values.
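The power-series setting can be sketched in standard notation (a generic Taylor expansion of the energy in the field, not this paper's exact working equations):

```latex
% Expansion of the molecular energy in a uniform static field F:
E(\mathbf{F}) = E(0) - \sum_i \mu_i^{0} F_i
              - \tfrac{1}{2}\sum_{ij} \alpha_{ij} F_i F_j
              - \tfrac{1}{6}\sum_{ijk} \beta_{ijk} F_i F_j F_k - \cdots

% so that the dipole moment, polarizability, and first
% hyperpolarizability are field derivatives of the energy evaluated
% at the field-free equilibrium geometry:
\mu_i = -\left.\frac{\partial E}{\partial F_i}\right|_{0}, \qquad
\alpha_{ij} = -\left.\frac{\partial^2 E}{\partial F_i\,\partial F_j}\right|_{0}, \qquad
\beta_{ijk} = -\left.\frac{\partial^3 E}{\partial F_i\,\partial F_j\,\partial F_k}\right|_{0}
```

The nuclear relaxation and vibrational contributions described above arise when these derivatives are taken with the geometry and the normal-mode motion allowed to respond to the field.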
Abstract:
A comparative systematic study of the CrO2F2 compound has been performed using different conventional ab initio methodologies and density functional procedures. Two points have been analyzed: first, the accuracy of the results yielded by each method under study, and second, the computational cost required to reach such results. Weighing up both aspects, density functional theory has been found to be more appropriate than the Hartree-Fock (HF) and the analyzed post-HF methods. Hence, the structural characterization and spectroscopic elucidation of the full CrO2X2 series (X = F, Cl, Br, I) has been carried out at this level of theory. Emphasis has been given to the unknown CrO2I2 species, and especially to the UV/visible spectra of all four compounds. Furthermore, a topological analysis in terms of charge density distributions has revealed why the valence shell electron pair repulsion model fails in predicting the molecular shape of such CrO2X2 complexes.