959 results for metric number theory
Abstract:
The extension of density functional theory (DFT) to include pairing correlations without formal violation of the particle-number conservation condition is described. This version of the theory can be considered a foundation for applying existing DFT-plus-pairing approaches to atoms, molecules, ultracold and magnetically trapped atomic Fermi gases, and atomic nuclei, where the number of particles is conserved exactly. The connection with Hartree-Fock-Bogoliubov (HFB) theory is discussed, and the method of quasilocal reduction of the nonlocal theory is also described. This quasilocal reduction yields equations of motion that are much simpler to solve numerically than those of the nonlocal case. Our theory is applied to the study of some even Sn isotopes, and the results are compared with those obtained in the standard HFB theory and with experimental data.
Abstract:
In order to estimate the motion of an object, the visual system needs to combine multiple local measurements, each of which carries some degree of ambiguity. We present a model of motion perception whereby measurements from different image regions are combined according to a Bayesian estimator: the estimated motion maximizes the posterior probability under a prior favoring slow and smooth velocities. In reviewing a large number of previously published phenomena, we find that the Bayesian estimator predicts a wide range of psychophysical results. This suggests that this seemingly complex set of illusions arises from a single computational strategy that is optimal under reasonable assumptions.
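The estimator described in this abstract can be sketched numerically. The following is a minimal illustration, not the authors' model: a MAP velocity estimate on a 1-D velocity grid, combining two hypothetical ambiguous Gaussian measurement likelihoods with a zero-mean Gaussian prior standing in for the "slow" preference. All numerical values are made-up assumptions.

```python
import numpy as np

v = np.linspace(-5.0, 5.0, 1001)  # candidate velocities on a 1-D grid

def gaussian_log(x, mu, sigma):
    """Unnormalized Gaussian log-density."""
    return -0.5 * ((x - mu) / sigma) ** 2

# Two ambiguous local measurements, each contributing a broad likelihood.
log_like = gaussian_log(v, 2.0, 1.5) + gaussian_log(v, 1.0, 1.0)
# Prior favoring slow speeds: zero-mean Gaussian over velocity.
log_prior = gaussian_log(v, 0.0, 1.2)

# MAP estimate: maximize the log-posterior over the grid.
v_map = v[np.argmax(log_like + log_prior)]
print(round(v_map, 2))  # 0.88: pulled below both measurements toward zero
```

The estimate lands below both measurement means because the slow prior trades off against the likelihoods, which is the qualitative effect the model uses to explain motion illusions.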
Abstract:
One of the disadvantages of old age is that there is more past than future: this, however, may be turned into an advantage if the wealth of experience and, hopefully, wisdom gained in the past can be reflected upon and throw some light on possible future trends. To an extent, then, this talk is necessarily personal, certainly nostalgic, but also self-critical and inquisitive about our understanding of the discipline of statistics. A number of almost philosophical themes will run through the talk: the search for appropriate modelling in relation to the real problem envisaged, emphasis on a sensible balance between simplicity and complexity, the relative roles of theory and practice, the nature of communication of inferential ideas to the statistical layman, and the inter-related roles of teaching, consultation and research. A list of keywords might be: identification of the sample space and its mathematical structure, choices between transform and stay, the role of parametric modelling, the role of a sample space metric, the underused hypothesis lattice, and the nature of compositional change, particularly in relation to the modelling of processes. While the main theme will be relevance to compositional data analysis, we shall point to substantial implications for general multivariate analysis arising from experience of the development of compositional data analysis…
Abstract:
Temperature and ozone observations from the Microwave Limb Sounder (MLS) on the EOS Aura satellite are used to study equatorial wave activity in the autumn of 2005. In contrast to previous observations for the same season in other years, the temperature anomalies in the middle and lower tropical stratosphere are found to be characterized by a strong wave-like eastward progression with zonal wave number 3. Extended empirical orthogonal function (EOF) analysis reveals that the wave-3 components detected in the temperature anomalies correspond to a slow Kelvin wave with a period of 8 days and a phase speed of 19 m/s. Fluctuations associated with this Kelvin wave mode are also apparent in ozone profiles. Moreover, as expected from linear theory, the ozone fluctuations observed in the lower stratosphere are in phase with the temperature perturbations, and peak around 20–30 hPa, where the mean ozone mixing ratios have the steepest vertical gradient. A search for other Kelvin wave modes has also been made using both the MLS observations and the analyses from an experiment in which MLS ozone profiles are assimilated into the European Centre for Medium-Range Weather Forecasts (ECMWF) data assimilation system via a 6-hourly 3D-Var scheme. Our results show that the characteristics of the wave activity detected in the ECMWF temperature and ozone analyses are in good agreement with MLS data.
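The reported wave parameters are mutually consistent, which can be checked with simple arithmetic: a zonal wavenumber-3 wave spans one third of the equatorial circumference, and dividing that wavelength by the 8-day period recovers the quoted phase speed of about 19 m/s (the circumference value below is a standard round figure, assumed here).

```python
# Consistency check for the abstract's Kelvin wave numbers:
# zonal wavenumber 3, 8-day period, ~19 m/s phase speed.
EARTH_CIRCUMFERENCE_M = 40_075_000   # equatorial circumference (m), assumed
ZONAL_WAVENUMBER = 3
PERIOD_S = 8 * 24 * 3600             # 8 days in seconds

wavelength_m = EARTH_CIRCUMFERENCE_M / ZONAL_WAVENUMBER
phase_speed = wavelength_m / PERIOD_S  # m/s
print(round(phase_speed, 1))  # 19.3, consistent with the reported 19 m/s
```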
Abstract:
We compare laboratory observations of equilibrated baroclinic waves in the rotating two-layer annulus, with numerical simulations from a quasi-geostrophic model. The laboratory experiments lie well outside the quasi-geostrophic regime: the Rossby number reaches unity; the depth-to-width aspect ratio is large; and the fluid contains ageostrophic inertia–gravity waves. Despite being formally inapplicable, the quasi-geostrophic model captures the laboratory flows reasonably well. The model displays several systematic biases, which are consequences of its treatment of boundary layers and neglect of interfacial surface tension and which may be explained without invoking the dynamical effects of the moderate Rossby number, large aspect ratio or inertia–gravity waves. We conclude that quasi-geostrophic theory appears to continue to apply well outside its formal bounds.
Abstract:
1. Life-history theory assumes that trade-offs exist between an individual's life-history components, such that an increased allocation of a resource to one fitness trait might be expected to result in a cost for a conflicting fitness trait. Recent evidence from experimental manipulations of wild individuals supports this assumption. 2. The management of many bird populations involves harvesting for both commercial and conservation purposes. One frequently harvested life-history stage is the egg, but the consequences of repeated egg harvesting for the individual and the long-term dynamics of the population remain poorly understood. 3. We used a well-documented restored population of the Mauritius kestrel Falco punctatus as a model system to explore the consequences of egg harvesting (and associated management practices) for an individual within the context of life-history theory. 4. Our analysis indicated that management practices enhanced both the size and number of clutches laid by managed females, and improved mid-life male and female adult survival relative to unmanaged adult kestrels. 5. Although management resulted in an increased effort in egg production, it reduced parental effort during incubation and the rearing of offspring, which could account for these observed changes. 6. Synthesis and applications. This study demonstrates how a commonly applied harvesting strategy, when examined within the context of life-history theory, can identify improvements in particular fitness traits that might alleviate some of the perceived negative impact of harvesting on the long-term dynamics of a managed population.
Abstract:
A large number of processes are involved in the pathogenesis of atherosclerosis but it is unclear which of them play a rate-limiting role. One way of resolving this problem is to investigate the highly non-uniform distribution of disease within the arterial system; critical steps in lesion development should be revealed by identifying arterial properties that differ between susceptible and protected sites. Although the localisation of atherosclerotic lesions has been investigated intensively over much of the 20th century, this review argues that the factor determining the distribution of human disease has only recently been identified. Recognition that the distribution changes with age has, for the first time, allowed it to be explained by variation in transport properties of the arterial wall; hitherto, this view could only be applied to experimental atherosclerosis in animals. The newly discovered transport variations which appear to play a critical role in the development of adult disease have underlying mechanisms that differ from those elucidated for the transport variations relevant to experimental atherosclerosis: they depend on endogenous NO synthesis and on blood flow. Manipulation of transport properties might have therapeutic potential. Copyright (C) 2004 S. Karger AG, Basel.
Abstract:
Firms form consortia in order to win contracts. Once a project has been awarded to a consortium, each member then concentrates on his or her own contract with the client. Consortia are therefore marketing devices, which present the impression of teamworking, but the production process is just as fragmented as under conventional procurement methods. In this way, the consortium forms a barrier between the client and the actual construction production process. Firms form consortia not as a simple development of normal ways of working, but because the circumstances of specific projects make them a necessary vehicle. These circumstances include projects that are too large or too complex to undertake alone, or projects that require ongoing services which cannot be provided by the individual firms in-house. It is not a preferred way of working, because participants carry extra risk in the form of liability for the actions of their partners in the consortium. The behaviour of members of consortia is determined by their relative power, based on several factors including financial commitment and ease of replacement. The level of supply chain visibility to the public sector client and to the industry is reduced by the existence of a consortium, because the consortium forms an additional obstacle between the client and the firms undertaking the actual construction work. Supply chain visibility matters to the client, who otherwise loses control over the process of construction or service provision while remaining accountable for cost overruns. To overcome this separation there is a convincing argument in favour of adopting the approach put forward in the Project Partnering Contract 2000 (PPC2000) Agreement. Members of consortia do not necessarily go on to work in the same consortia again, because members need to respond flexibly to opportunities as and when they arise. Decision-making processes within consortia tend to be ad hoc. Construction risk is taken by the contractor and the construction supply chain, but the reputational risk is carried by all the firms associated with a consortium. There is wide variation in the manner in which consortia are formed, determined by the individual circumstances of each project: its requirements, size and complexity, and the attitude of individual project leaders. However, there are a number of close working relationships based on generic models of consortia-like arrangements for the purpose of building production, such as the Housing Corporation Guidance Notes and the PPC2000.
Abstract:
Experimentally and theoretically determined infrared spectra are reported for a series of straight-chain perfluorocarbons: C2F6, C3F8, C4F10, C5F12, C6F14, and C8F18. Theoretical spectra were determined using both density functional (DFT) and ab initio methods. Radiative efficiencies (REs) were determined using the method of Pinnock et al. (1995) and combined with atmospheric lifetimes from the literature to determine global warming potentials (GWPs). Theoretically determined absorption cross sections were within 10% of experimentally determined values. Despite being much less computationally expensive, DFT calculations were generally found to perform better than ab initio methods. There is a strong wavenumber dependence of radiative forcing in the region of the fundamental C-F vibration, and small differences in wavelength between band positions determined by theory and experiment have a significant impact on the REs. We apply an empirical correction to the theoretical spectra and then test this correction on a number of branched chain and cyclic perfluoroalkanes. We then compute absorption cross sections, REs, and GWPs for an additional set of perfluoroalkenes.
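The radiative efficiency calculation described above (the Pinnock et al. method) is, in outline, a band-by-band weighted sum: the absorption cross section in each wavenumber interval is multiplied by a pre-tabulated instantaneous forcing per unit cross section and summed. A schematic sketch follows; the arrays are flat placeholders, not real perfluorocarbon spectra or Pinnock coefficients, and the units are only indicative.

```python
import numpy as np

# Schematic band-by-band radiative efficiency sum (placeholder data).
wavenumbers = np.arange(600.0, 1500.0, 10.0)      # band centres, cm^-1
cross_section = np.full_like(wavenumbers, 1e-18)  # cm^2 molecule^-1 (placeholder)
forcing_per_xs = np.full_like(wavenumbers, 3e14)  # forcing per unit cross
                                                  # section per band (placeholder)

# RE is the cross section weighted by the forcing function, summed over bands.
radiative_efficiency = float(np.sum(cross_section * forcing_per_xs))
print(radiative_efficiency > 0)  # True
```

This structure is also why the abstract stresses the strong wavenumber dependence near the C-F fundamental: a small shift in a theoretical band position changes which forcing weights the band overlaps, and hence the sum.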
Abstract:
This paper compares a number of different extreme value models for determining the value at risk (VaR) of three LIFFE futures contracts. A semi-nonparametric approach is also proposed, where the tail events are modeled using the generalised Pareto distribution, and normal market conditions are captured by the empirical distribution function. The value at risk estimates from this approach are compared with those of standard nonparametric extreme value tail estimation approaches, with a small sample bias-corrected extreme value approach, and with those calculated from bootstrapping the unconditional density and bootstrapping from a GARCH(1,1) model. The results indicate that, for a holdout sample, the proposed semi-nonparametric extreme value approach yields superior results to other methods, but the small sample tail index technique is also accurate.
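The semi-nonparametric construction described, an empirical distribution function for the body with a generalised Pareto tail beyond a high threshold, can be sketched as follows. The simulated loss series, the 95% threshold choice, and the sample size are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
losses = rng.standard_t(df=4, size=5000)  # fat-tailed stand-in for returns

u = np.quantile(losses, 0.95)             # high threshold for the tail
excesses = losses[losses > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0.0)  # GPD shape and scale

def var(p):
    """Value at risk at confidence level p (loss quantile)."""
    if p <= 0.95:
        # Normal market conditions: empirical distribution function.
        return np.quantile(losses, p)
    # Tail events: standard GPD quantile formula above the threshold.
    n, n_u = len(losses), len(excesses)
    return u + (beta / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1)

print(var(0.99) > var(0.95))  # True: deeper tail quantile, larger VaR
```

The design choice is exactly the one the abstract motivates: the empirical distribution is reliable where data are plentiful, while the parametric GPD extrapolates into the tail where they are not.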
Abstract:
Recent research has shown that Lighthill–Ford spontaneous gravity wave generation theory, when applied to numerical model data, can help predict areas of clear-air turbulence. It is hypothesized that this is the case because spontaneously generated atmospheric gravity waves may initiate turbulence by locally modifying the stability and wind shear. As an improvement on the original research, this paper describes the creation of an ‘operational’ algorithm (ULTURB) with three modifications to the original method: (1) extending the altitude range for which the method is effective downward to the top of the boundary layer, (2) adding turbulent kinetic energy production from the environment to the locally produced turbulent kinetic energy production, and, (3) transforming turbulent kinetic energy dissipation to eddy dissipation rate, the turbulence metric becoming the worldwide ‘standard’. In a comparison of ULTURB with the original method and with the Graphical Turbulence Guidance second version (GTG2) automated procedure for forecasting mid- and upper-level aircraft turbulence ULTURB performed better for all turbulence intensities. Since ULTURB, unlike GTG2, is founded on a self-consistent dynamical theory, it may offer forecasters better insight into the causes of the clear-air turbulence and may ultimately enhance its predictability.
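Modification (3) above, converting turbulent kinetic energy dissipation to eddy dissipation rate, is the cube root of the dissipation, EDR = ε^(1/3). A minimal sketch (the sample ε value is an assumption for illustration):

```python
def eddy_dissipation_rate(epsilon: float) -> float:
    """EDR (m^(2/3) s^-1) from TKE dissipation rate epsilon (m^2 s^-3)."""
    return epsilon ** (1.0 / 3.0)

# Illustrative value: epsilon = 0.008 m^2 s^-3 maps to EDR = 0.2.
print(round(eddy_dissipation_rate(0.008), 2))  # 0.2
```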
Abstract:
By eliminating the short range negative divergence of the Debye–Hückel pair distribution function, but retaining the exponential charge screening known to operate at large interparticle separation, the thermodynamic properties of one-component plasmas of point ions or charged hard spheres can be well represented even in the strong coupling regime. Predicted electrostatic free energies agree within 5% of simulation data for typical Coulomb interactions up to a factor of 10 times the average kinetic energy. Here, this idea is extended to the general case of a uniform ionic mixture, comprising an arbitrary number of components, embedded in a rigid neutralizing background. The new theory is implemented in two ways: (i) by an unambiguous iterative algorithm that requires numerical methods and breaks the symmetry of cross correlation functions; and (ii) by invoking generalized matrix inverses that maintain symmetry and yield completely analytic solutions, but which are not uniquely determined. The extreme computational simplicity of the theory is attractive when considering applications to complex inhomogeneous fluids of charged particles.
Abstract:
The Fredholm properties of Toeplitz operators on the Bergman space A^2 have been well known for continuous symbols since the 1970s. We investigate the case p = 1 with continuous symbols under a mild additional condition, namely that of logarithmic vanishing mean oscillation in the Bergman metric. Most differences are related to boundedness properties of Toeplitz operators acting on A^p that arise when we no longer have 1 < p < ∞.
Abstract:
In 'Avalanche', an object is lowered by a group of players, all staying in contact with it throughout. Normally the task is easily accomplished; with larger groups, however, counter-intuitive behaviours appear. The paper proposes a formal theory of the underlying causal mechanisms. The aim is not only to provide an explicit, testable hypothesis for the source of the observed modes of behaviour, but also to exemplify the contribution that formal theory building can make to understanding complex social phenomena. Mapping reveals the importance of geometry to the Avalanche game; each player has a pair of balancing loops, one involved in lowering the object, the other ensuring contact. With more players, sets of balancing loops interact, and these can allow dominance by reinforcing loops, causing the system to chase upwards towards an ever-increasing goal. However, a series of other effects concerning human physiology and behaviour (HPB) is posited as playing a role. The hypothesis is therefore rigorously tested using simulation. For simplicity a 'One Degree of Freedom' case is examined, allowing all of the effects to be included whilst rendering the analysis more transparent. Formulation and experimentation with the model gives insight into the behaviours. Multi-dimensional rate/level analysis indicates that there is only a narrow region in which the system is able to move downwards. Model runs reproduce the single 'desired' mode of behaviour and all three of the observed 'problematic' ones. Sensitivity analysis gives further insight into the system's modes and their causes. Behaviour is seen to arise only when the geometric effects apply (number of players greater than degrees of freedom of the object) in combination with a range of HPB effects. An analogy exists between the co-operative behaviour required here and various examples: conflicting strategic objectives in organizations, the Prisoners' Dilemma, and integrated bargaining situations. Additionally, the game may be relatable in more direct algebraic terms to situations involving companies whose resulting behaviours are mediated by market regulations. Finally, comment is offered on the inadequacy of some forms of theory building, and the case is made for formal theory building involving the use of models, analysis and plausible explanations to create deep understanding of social phenomena.
Abstract:
The assumption that ‘states' primary goal is survival’ lies at the heart of the neorealist paradigm. A careful examination of the assumption, however, reveals that neorealists draw upon a number of distinct interpretations of the ‘survival assumption’ that are then treated as if they were the same, pointing towards conceptual problems that surround the treatment of state preferences. This article offers a specification that focuses on two questions highlighting the role and function of the survival assumption in the neorealist logic: (i) what do states have to lose if they fail to adopt self-help strategies?; and (ii) how does concern for relevant losses motivate state behaviour and affect international outcomes? Answering these questions through the exploration of governing elites' sensitivity towards the regime stability and territorial integrity of the state, in turn, addresses the aforementioned conceptual problems. This specification has further implications for the debates between defensive and offensive realists, potential extensions of the neorealist logic beyond the Westphalian states, and the relationship between neorealist theory and policy analysis.