991 results for Analytic theory


Abstract:

This essay reviews the decision-making process that led to India exploding a nuclear device in May 1974. An examination of the Analytic, Cybernetic and Cognitive theories of decision enables a greater understanding of the events that led up to the 1974 test. Each theory is seen to be only partially useful; it is only by synthesising the three that a comprehensive account of the 1974 test can be given. To achieve this analysis, the literature on decision-making in national security issues is reviewed, as is the domestic and international environment in which the decision-makers involved operated. Finally, the rationale for the test is examined. The conclusion is that India's explosion of a nuclear device in 1974 was primarily related to improving Indian international prestige among Third World countries and uniting a rapidly disintegrating Indian societal consensus. In themselves, individual decision-making theories were found to be of little use, but a combination of their various elements allowed a greater comprehension of the events leading up to the test than might otherwise have been the case.

Abstract:

The object of this thesis is to formulate a basic commutative difference operator theory for functions defined on a basic sequence, and a bibasic commutative difference operator theory for functions defined on a bibasic sequence of points, which can be applied to the solution of basic and bibasic difference equations. The thesis also contains a brief survey of the work done in this field in the classical case; a review of the development of q-difference equations, q-analytic function theory, bibasic analytic function theory, bianalytic function theory and discrete pseudoanalytic function theory; and, finally, a summary of the results of the thesis.
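For orientation, the classical (basic) difference operator underlying such theories is the Jackson q-difference operator; the definition below is the standard one from the q-calculus literature, not a formula quoted from the thesis.

% Jackson q-difference operator on a basic (geometric) sequence, 0 < q < 1:
\[
  (D_q f)(x) = \frac{f(qx) - f(x)}{(q - 1)\,x}, \qquad x \neq 0,
\]
% which recovers the ordinary derivative in the limit:
\[
  \lim_{q \to 1} (D_q f)(x) = f'(x) \quad \text{for differentiable } f.
\]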

Abstract:

There is a recent trend to describe physical phenomena without the use of infinitesimals or infinities. This has been accomplished by replacing differential calculus with finite difference theory. Discrete function theory, first introduced in 1941, is concerned with the study of functions defined on a discrete set of points in the complex plane; the theory was extensively developed for functions defined on a Gaussian lattice. In 1972 a very suitable geometric lattice H = {(q^m x_0, q^n y_0) : x_0 > 0, y_0 > 0, 0 < q < 1, m, n ∈ Z} was found, and discrete analytic function theory was developed on it. Very recently some work has been done in discrete monodiffric function theory for functions defined on H. The theory of pseudoanalytic functions is a generalisation of the theory of analytic functions: when the generator becomes the identity, i.e. (1, i), the theory of pseudoanalytic functions reduces to the theory of analytic functions. Though the theory of pseudoanalytic functions plays an important role in analysis, no discrete theory is available in the literature. This thesis is an attempt in that direction: a discrete pseudoanalytic theory is derived for functions defined on H.
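For reference, the continuous notion being discretised is Bers' pseudoanalytic function theory; the sketch below states the standard defining condition for a generating pair, and is general background rather than material from the thesis.

% A generating pair (F, G) satisfies Im(\bar{F} G) > 0 on the domain.
% A function w = \phi F + \psi G, with \phi, \psi real-valued, is
% pseudoanalytic (of the first kind) when
\[
  \phi_{\bar z}\, F + \psi_{\bar z}\, G = 0,
  \qquad
  \partial_{\bar z} = \tfrac{1}{2}\,(\partial_x + i\,\partial_y).
\]
% With the identity generator (F, G) = (1, i) this becomes
% \partial_{\bar z}(\phi + i\psi) = 0, the Cauchy--Riemann equations,
% so the pseudoanalytic theory reduces to the analytic one.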

Abstract:

By eliminating the short range negative divergence of the Debye–Hückel pair distribution function, but retaining the exponential charge screening known to operate at large interparticle separation, the thermodynamic properties of one-component plasmas of point ions or charged hard spheres can be well represented even in the strong coupling regime. Predicted electrostatic free energies agree within 5% of simulation data for typical Coulomb interactions up to a factor of 10 times the average kinetic energy. Here, this idea is extended to the general case of a uniform ionic mixture, comprising an arbitrary number of components, embedded in a rigid neutralizing background. The new theory is implemented in two ways: (i) by an unambiguous iterative algorithm that requires numerical methods and breaks the symmetry of cross correlation functions; and (ii) by invoking generalized matrix inverses that maintain symmetry and yield completely analytic solutions, but which are not uniquely determined. The extreme computational simplicity of the theory is attractive when considering applications to complex inhomogeneous fluids of charged particles.
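For context, the linearised Debye–Hückel pair distribution function referred to above has the standard textbook form shown below (not reproduced from the paper); the offending short-range behaviour is visible directly.

% Linearised Debye--Hueckel pair distribution for ionic species i, j:
\[
  g_{ij}(r) = 1 - \frac{Z_i Z_j e^2}{4\pi\varepsilon_0 k_B T}\,\frac{e^{-\kappa r}}{r},
  \qquad
  \kappa^2 = \frac{e^2}{\varepsilon_0 k_B T} \sum_k n_k Z_k^2 .
\]
% As r -> 0 the screened Coulomb term diverges, driving g_{ij} negative
% (the unphysical short-range divergence that is eliminated), while the
% exponential screening e^{-\kappa r} at large separation is retained.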

Abstract:

We review the theory and observations related to the "superhump" precession of eccentric accretion discs in close binary systems. We agree with earlier work, although for different reasons, that the discrepancy between observation and dynamical theory implies that the effect of pressure in the disc cannot be neglected. We extend earlier work investigating this effect to include the correct expression for the radius at which resonant orbits occur. Using analytic expressions for the accretion disc structure, we derive a relationship between the period excess and the mass ratio with the pressure effects included. This is compared with the observed data, with recently derived results from detailed integration of the disc equations and with the equivalent empirically derived relations, and is then used to predict mass ratios from measured period excesses for 88 systems.
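For background, the period excess and the pressure-free (dynamical) precession rate at the 3:1 resonance are conventionally written as follows; these are the standard leading-order expressions, not the paper's pressure-corrected result.

% Period excess and its relation to the disc precession period:
\[
  \epsilon = \frac{P_{\mathrm{sh}} - P_{\mathrm{orb}}}{P_{\mathrm{orb}}},
  \qquad
  \frac{1}{P_{\mathrm{sh}}} = \frac{1}{P_{\mathrm{orb}}} - \frac{1}{P_{\mathrm{prec}}},
\]
% and, to leading order for mass ratio q = M_2/M_1,
\[
  \frac{\omega_{\mathrm{prec}}}{\omega_{\mathrm{orb}}}
  \simeq \frac{3}{4}\,\frac{q}{\sqrt{1+q}} \left(\frac{R}{a}\right)^{3/2},
  \qquad
  \frac{R_{3:1}}{a} = 3^{-2/3}\,(1+q)^{-1/3}.
\]
% The pressure contribution is retrograde, lowering the predicted
% precession rate below this purely dynamical estimate.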

Abstract:

Retrieving a subset of items can cause the forgetting of other items, a phenomenon referred to as retrieval-induced forgetting. According to some theorists, retrieval-induced forgetting is the consequence of an inhibitory mechanism that acts to reduce the accessibility of non-target items that interfere with the retrieval of target items. Other theorists argue that inhibition is unnecessary to account for retrieval-induced forgetting, contending instead that the phenomenon can be best explained by non-inhibitory mechanisms, such as strength-based competition or blocking. The current paper provides the first major meta-analysis of retrieval-induced forgetting, conducted with the primary purpose of quantitatively evaluating the multitude of findings that have been used to contrast these two theoretical viewpoints. The results largely supported inhibition accounts, but also provided some challenging evidence, with the nature of the results often varying as a function of how retrieval-induced forgetting was assessed. Implications for further research and theory development are discussed.
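To make the meta-analytic machinery concrete, here is a minimal random-effects pooling sketch in Python (the DerSimonian–Laird estimator, a standard choice; the effect sizes below are made up for illustration, and this is not the authors' analysis code):

import numpy as np

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled effect size."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w = 1.0 / variances                      # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)  # fixed-effect mean
    q = np.sum(w * (effects - fixed) ** 2)   # Cochran's Q heterogeneity
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance
    w_star = 1.0 / (variances + tau2)        # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical retrieval-induced-forgetting effect sizes (Hedges' g):
g = [0.45, 0.30, 0.62, 0.15, 0.50]
v = [0.02, 0.05, 0.04, 0.03, 0.06]
pooled, se, tau2 = random_effects_pool(g, v)
print(f"pooled g = {pooled:.3f} +/- {1.96*se:.3f} (tau^2 = {tau2:.3f})")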

Abstract:

Patient perspectives on how therapeutic letters contributed to their experience of cognitive analytic therapy (CAT) were investigated. Eight patients took part in semistructured interviews. A grounded, thematic analysis of their accounts suggested four general processes. First, letters offered a tangible, lasting framework for the assimilation of a new perspective about themselves and their relationships and facilitated coping with a complex range of emotions and risks this awareness required. Second, they demonstrated therapists’ commitment to patients’ growth. Third, they helped to teach participants about the therapy process as an example of an interpersonal exchange. Fourth, they helped participants consider how they wished to share personal information. These data offer a more complex understanding of this standard CAT intervention. Although some findings are consistent with CAT theory, the range of emotional dilemmas associated with letters has not received specific attention. Clinical implications are discussed.

Abstract:

The logic of proofs (lp) was proposed as Gödel's missed link between Intuitionistic and S4 proofs, but so far the tableau-based methods proposed for lp have not explored this closeness with S4 and contain rules whose analyticity is not immediately evident. We study possible formulations of analytic tableau proof methods for lp that preserve the subformula property. Two sound and complete tableau decision methods of increasing degree of analyticity are proposed, KELP and preKELP; the latter is particularly inspired by S4 proofs. The crucial role of proof constants in the structure of lp proof methods is analysed. In particular, a method for the abduction of proof constant specifications in strongly analytic preKELP proofs is presented; abduction heuristics and the complexity of the method are discussed.
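For readers unfamiliar with lp, the standard axiomatisation (Artemov's, as usually presented in the literature; not quoted from this paper) shows the proof-term structure that tableau rules and constant specifications must respect.

% Proof terms: t ::= x | c | t.s | t+s | !t
\begin{align*}
  &\text{A1: } t{:}F \to F && \text{(reflexivity)}\\
  &\text{A2: } s{:}(F \to G) \to (t{:}F \to (s \cdot t){:}G) && \text{(application)}\\
  &\text{A3: } t{:}F \to\ !t{:}(t{:}F) && \text{(proof checker)}\\
  &\text{A4: } s{:}F \to (s + t){:}F, \quad t{:}F \to (s + t){:}F && \text{(sum)}
\end{align*}
% plus classical tautologies, modus ponens, and a constant
% specification assigning constants c to axioms A via c:A.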

Abstract:

We discuss an algebraic theory of generalized Jordan chains and partial signatures, which are invariants associated with sequences of symmetric bilinear forms on a vector space. We introduce an intrinsic notion of partial signatures in the Lagrangian Grassmannian of a symplectic space that does not use local coordinates, and we give a formula for the Maslov index of arbitrary real analytic paths in terms of partial signatures.
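As a point of reference, a generalized Jordan chain for a sequence of symmetric bilinear forms can be stated schematically as below; this follows the usual convention in the literature, and the paper's precise definitions may differ in detail.

% For a real analytic path of symmetric bilinear forms on V,
\[
  B(t) = \sum_{k \ge 0} t^k B_k ,
\]
% a generalized Jordan chain of length n is a tuple
% (u_0, ..., u_{n-1}) in V, with u_0 \neq 0, satisfying
\[
  \sum_{j=0}^{k} B_j\, u_{k-j} = 0 \qquad \text{for } k = 0, \dots, n-1 ,
\]
% so in particular u_0 \in \ker B_0. The partial signatures are
% signatures of bilinear forms built from chains of each length, and
% the Maslov index of the path is assembled from these data at each
% degeneracy instant.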

Abstract:

Contents:
1. Role of multi-criteria decision making in natural resource management / Gamini Herath and Tony Prato
2. Analysis of forest policy using multi-attribute value theory / Jayanath Ananda and Gamini Herath
3. Comparing riparian revegetation policy options using the analytic hierarchy process / M. E. Qureshi and S. R. Harrison
4. Managing environmental and health risks from a lead and zinc smelter: an application of deliberative multi-criteria evaluation / Wendy Proctor, Chris McQuade and Anne Dekker
5. Multiple attribute evaluation of management alternatives for the Missouri River System / Tony Prato
6. Multi-criteria decision analysis for integrated watershed management / Zeyuan Qiu
7. Fuzzy multiple attribute evaluation of agricultural systems / Leonie A. Marks and Elizabeth G. Dunn
8. Multi-criteria decision support for energy supply assessment / Bram Noble
9. Seaport development in Vietnam: evaluation using the analytic hierarchy process / Tran Phuong Dong and David M. Chapman
10. Valuing wetland aquatic resources using the analytic hierarchy process / Premachandra Wattage and Simon Mardle
11. Multiple attribute evaluation for national park management / Tony Prato
12. The future of MCDA in natural resource management: some generalizations / Gamini Herath and Tony Prato.
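Several of the chapters above apply the analytic hierarchy process (AHP). As generic background (a minimal sketch of Saaty's standard eigenvector method, not code from the book), criterion weights are obtained from the principal eigenvector of a pairwise comparison matrix:

import numpy as np

def ahp_priorities(pairwise):
    """Priority weights and consistency ratio for an AHP
    pairwise-comparison matrix (Saaty's eigenvector method)."""
    a = np.asarray(pairwise, dtype=float)
    n = a.shape[0]
    eigvals, eigvecs = np.linalg.eig(a)
    k = np.argmax(eigvals.real)              # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                             # normalised priorities
    ci = (eigvals[k].real - n) / (n - 1)     # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n)  # Saaty's random indices
    cr = ci / ri if ri else float("nan")     # consistency ratio
    return w, cr

# Hypothetical comparison of three wetland-use criteria:
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, cr = ahp_priorities(A)
print(weights, f"CR = {cr:.3f}")  # CR < 0.10 is conventionally acceptable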


Abstract:

Nanostructured and ultra-fine grained metals have higher strength but extremely limited ductility compared with coarse grained metals. However, their ductility can be greatly improved by introducing a specific range of grain sizes into the microstructure. In this paper, a multiscale unit cell approach (UCA) is developed and applied to predict the averaged stress-strain relations of metals with multiscale microstructures. The unit cell models are three-phase structures at length scales of 100 nm, 1 μm and 10 μm, with different volume fractions and periodic boundary conditions. The contributions of the multiscale microstructures to the macroscopic structural properties of the metals are also studied using an analytic approach, the two-step mean-field method (TSMF), in which three microstructural parameters are introduced so that mechanical properties such as strength and ductility can be expressed as functions of these parameters. To verify the proposed numerical and theoretical algorithms, the structural properties of pure nickel with a three-grain microstructure are studied; the results from FEA and the proposed theory are in good agreement.
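As a toy illustration of the volume-fraction-weighted homogenisation that such mean-field approaches build on (a bare-bones rule-of-mixtures sketch with Hall–Petch phase strengths; the constants are illustrative assumptions, and this is not the paper's TSMF method):

import numpy as np

def mixture_flow_stress(grain_sizes_m, volume_fractions,
                        sigma0=20e6, k_hp=0.16e6):
    """Rule-of-mixtures flow stress for a multiphase microstructure.
    Each phase obeys Hall-Petch: sigma_i = sigma0 + k_hp / sqrt(d_i).
    sigma0 [Pa] and k_hp [Pa*m^0.5] are illustrative, not fitted."""
    d = np.asarray(grain_sizes_m, dtype=float)
    f = np.asarray(volume_fractions, dtype=float)
    assert np.isclose(f.sum(), 1.0), "volume fractions must sum to 1"
    sigma = sigma0 + k_hp / np.sqrt(d)   # per-phase strength, Pa
    return float(np.sum(f * sigma))      # volume-weighted average, Pa

# Three phases at the paper's length scales: 100 nm, 1 um, 10 um
print(mixture_flow_stress([100e-9, 1e-6, 10e-6], [0.3, 0.4, 0.3]) / 1e6, "MPa")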

Abstract:

Despite the applied importance of cohesion within organisational settings, researchers have yet to reach consensus about the dimensionality of group cohesion, and therefore appropriate tools for its measurement. The way that cohesion has generally been conceptualised has changed over time, but the measures appear not to reflect the underlying theory. This deficiency has impeded attempts to explore the relationship between co-worker cohesion and group performance (Beal et al., 2003; Mullen & Copper, 1994). Given inconsistent findings from previous factor analyses of cohesion, the present study employed exploratory means to help clarify the factor structure of cohesion within the workplace. Potential participants were recruited via the researchers' social networks. This snowballing technique led to 236 participants completing the online questionnaire. Exploratory factor analysis revealed four first-order factors of team commitment, friendliness, interpersonal conflict and communication that collectively accounted for 55.17% of the variance shared among the 75 cohesion items. Subsequently, a single higher-order factor was extracted which accounted for over half of the co-variation among the first order factors. This higher-order factor seems to reflect a general cohesion factor, as it was loaded by a diffuse collection of items, including those from the four lower-order factors as well as items that failed to load onto these lower-order factors. While there were similarities between these results and those of previous studies, the present factor structure did not map perfectly onto any of the existing conceptual models of cohesion. This finding highlights the need to incorporate some alternate factors that have previously been given little consideration.
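As an illustration of the analysis pipeline (a generic sketch, not the study's code: the data here are random placeholders standing in for the 236 x 75 questionnaire responses, and the rotation choice is arbitrary):

import numpy as np
from sklearn.decomposition import FactorAnalysis

# Placeholder Likert-style data: 236 respondents x 75 cohesion items.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(236, 75)).astype(float)

# Extract four factors, mirroring the study's first-order solution.
fa = FactorAnalysis(n_components=4, rotation="varimax")
scores = fa.fit_transform(responses)   # respondent factor scores
loadings = fa.components_.T            # 75 items x 4 factors

# Inspect the items loading most strongly on each factor.
for j in range(4):
    top = np.argsort(-np.abs(loadings[:, j]))[:5]
    print(f"factor {j}: items {top.tolist()}")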

Abstract:

Any attempt to model an economy requires foundational assumptions about the relations between prices, values and the distribution of wealth. These assumptions exert a profound influence over the results of any model. Unfortunately, there are few areas in economics as vexed as the theory of value. I argue in this paper that the fundamental problem with past theories of value is that it is simply not possible to model the determination of value, the formation of prices and the distribution of income in a real economy with analytic mathematical models. All such attempts leave out crucial processes or make unrealistic assumptions which significantly affect the results.

There have been two primary approaches to the theory of value. The first, associated with classical economists such as Ricardo and Marx, comprises substance theories of value, which view value as a substance inherent in an object and conserved in exchange. For Marxists, the value of a commodity derives solely from the value of the labour power used to produce it, and therefore any profit is due to the exploitation of the workers. The labour theory of value has been discredited because of its assumption that labour was the only 'factor' that contributed to the creation of value, and because of its fundamentally circular argument. Neoclassical theorists argued that price was identical with value and was determined purely by the interaction of supply and demand; value, then, was completely subjective. Returns to labour (wages) and capital (profits) were determined solely by their marginal contribution to production, so that each factor received its just reward by definition. Problems with the neoclassical approach include its assumptions concerning representative agents, perfect competition, perfect and costless information and contract enforcement, complete markets for credit and risk, aggregate production functions and infinite, smooth substitution between factors, distribution according to marginal products, firms always on the production possibility frontier, firms' pricing decisions, the neglect of money and credit, and perfectly rational agents with infinite computational capacity.

Two critical areas stand out. The first is the underappreciated Sonnenschein-Mantel-Debreu results, which showed that the foundational assumptions of the Walrasian general-equilibrium model imply arbitrary excess demand functions and therefore arbitrary equilibrium price sets. The second is that in real economies there is no equilibrium, only continuous change. Equilibrium is never reached because of constant changes in preferences and tastes; technological and organisational innovations; discoveries of new resources and new markets; inaccurate and evolving expectations of businesses, consumers, governments and speculators; changing demand for credit; the entry and exit of firms; the birth, learning and death of citizens; changes in laws and government policies; imperfect information; generalized increasing returns to scale; random acts of impulse; weather and climate events; changes in disease patterns, and so on.

The problem is not the use of mathematical modelling, but the kind of mathematical modelling used. However, agent-based models (ABMs), object-oriented programming and greatly increased computer power are opening up a new frontier. Here a dynamic bargaining ABM is outlined as a basis for an alternative theory of value. A large but finite number of heterogeneous commodities and agents with differing degrees of market power are set in a spatial network. Returns to buyers and sellers are decided at each step in the value chain, and in each factor market, through the process of bargaining. Market power, and its potential abuse against the poor and vulnerable, is fundamental to how the bargaining dynamics play out. Ethics therefore lie at the very heart of economic analysis, the determination of prices and the distribution of wealth. The neoclassicals are right, then, that price is the enumeration of value at a particular time and place, but wrong to downplay the critical roles of bargaining, power and ethics in determining those same prices.
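As a concrete sketch of the bargaining mechanism described (a deliberately minimal toy, not the author's model: the agent counts, random matching scheme and power-proportional surplus split are all assumptions for illustration):

import random

class Agent:
    def __init__(self, power, reservation):
        self.power = power              # bargaining power in (0, 1)
        self.reservation = reservation  # walk-away price
        self.wealth = 0.0

def bargain(seller, buyer):
    """Split the surplus between reservation prices in proportion
    to relative bargaining power (a generalised Nash-style split)."""
    surplus = buyer.reservation - seller.reservation
    if surplus <= 0:
        return None                     # no mutually beneficial trade
    share = seller.power / (seller.power + buyer.power)
    price = seller.reservation + share * surplus
    seller.wealth += price - seller.reservation
    buyer.wealth += buyer.reservation - price
    return price

random.seed(1)
sellers = [Agent(random.uniform(0.1, 0.9), random.uniform(1, 5))
           for _ in range(50)]
buyers = [Agent(random.uniform(0.1, 0.9), random.uniform(3, 8))
          for _ in range(50)]
for step in range(100):                 # repeated random matching
    bargain(random.choice(sellers), random.choice(buyers))
# Wealth accumulates with high-power agents, illustrating how market
# power shapes the distribution the abstract describes.
print(max(a.wealth for a in sellers + buyers))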

Abstract:

This chapter locates knowledge mapping within the theoretical framework of cultural historical activity theory. Cultural historical activity theory provides an analytic tool for understanding how knowledge maps can act as "stimuli-means": a cultural artefact that can mediate the performance of subjects (Vygotsky, 1978). Knowledge maps possess Vygotsky's double nature: they not only enable students to enact academic practice but also allow reflection on that practice. They enable students to build an "internal cognitive schematisation of that practice" (Guile, 2005, p. 127). Further, cultural historical activity theory gives us the tools to analyse the social context of our use of knowledge maps and thus to consider the mediating rules (tacit and explicit) and division of labour that shape that use. Knowledge maps can be viewed as acting within Brandom's (2000) space of reasons, which allows learners to use reasons to develop and exchange judgements based on shareable, theoretically articulated concepts and collectively develop the ability to restructure their knowledge and enact these judgements (Guile, 2011). In particular, multimodal collaborative knowledge maps can act as Vygotsky's (1978) zone of proximal development, where teacher and peer-to-peer interaction allow students to solve problems and learn concepts and skills that they would otherwise be unable to tackle.