932 results for Random-set theory


Relevance: 30.00%

Abstract:

This thesis deals with the so-called Basis Set Superposition Error (BSSE) from both a methodological and a practical point of view. The purpose of the thesis is twofold: (a) to take a step forward in the correct characterization of weakly bound complexes and (b) to shed light on the actual implications of basis set extension effects in ab initio calculations and thus contribute to the BSSE debate. The existing BSSE-correction procedures are analyzed in depth, compared, validated and, where necessary, improved. A new interpretation of the counterpoise (CP) method is used to define counterpoise-corrected descriptions of molecular complexes. This novel point of view allows the BSSE effects to be studied not only in the interaction energy but also on the potential energy surface and, in general, in any property derived from the molecular energy and its derivatives. A program has been developed for CP-corrected geometry optimizations and vibrational frequencies, including several counterpoise schemes for the case of molecular clusters; the method has also been implemented in the Gaussian98 revA10 package. The Chemical Hamiltonian Approach (CHA) methodology has also been implemented at the RHF and UHF levels of theory for an arbitrary number of interacting systems, using an algorithm based on block-diagonal matrices. Alongside the methodological development, the effects of the BSSE on the properties of molecular complexes are discussed in detail. The CP and CHA methodologies are used to determine BSSE-corrected properties of molecular complexes related to the potential energy surface and the molecular wavefunction, respectively. First, the behaviour of both BSSE-correction schemes is systematically compared at different levels of theory and basis sets for a number of hydrogen-bonded complexes. The Complete Basis Set (CBS) limit of both uncorrected and CP-corrected molecular properties, such as stabilization energies and intermolecular distances, is also determined, showing the capital importance of the BSSE correction. Several controversial aspects of the BSSE correction are addressed as well: the counterpoise method is applied to internal rotational barriers, the importance of the nuclear relaxation term is pointed out, and the viability of the CP method for charged complexes and the BSSE effects on the double-well potential energy surfaces of blue-shifted hydrogen bonds are studied in detail. For molecular clusters, the effect of the high-order BSSE terms introduced with the hierarchical counterpoise scheme is also determined. The effect of the BSSE on electron-density-related properties is addressed as well. The first-order electron density obtained with the CHA/F and CHA/DFT methodologies is used to assess, both graphically and numerically, the redistribution of the charge density upon BSSE correction. Tools such as Atoms in Molecules topological analysis, density difference maps, Quantum Molecular Similarity, and Chemical Energy Component Analysis are used to analyze in depth, for the first time, the BSSE effects on the electron density of several hydrogen-bonded complexes of increasing size. The indirect effect of the BSSE on intermolecular perturbation theory results is also pointed out: for a BSSE-free SAPT study of hydrogen fluoride clusters, the use of a counterpoise-corrected PES is shown to be essential in order to determine the proper molecular geometry at which to perform the SAPT analysis.
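
As context for the counterpoise discussion above, the sketch below implements the standard Boys-Bernardi counterpoise correction for a dimer from precomputed single-point energies; the function names and the example numbers are illustrative and are not taken from the thesis.

```python
def cp_corrected_interaction_energy(e_dimer_ab, e_a_in_ab, e_b_in_ab):
    """Boys-Bernardi counterpoise-corrected interaction energy.

    e_dimer_ab : energy of the complex AB in the full dimer basis
    e_a_in_ab  : energy of monomer A computed in the dimer basis (B as ghost atoms)
    e_b_in_ab  : energy of monomer B computed in the dimer basis (A as ghost atoms)
    All energies in the same units (e.g. hartree).
    """
    return e_dimer_ab - e_a_in_ab - e_b_in_ab


def bsse_estimate(e_a_in_ab, e_b_in_ab, e_a_alone, e_b_alone):
    """BSSE = lowering of the monomer energies caused by 'borrowing' the partner's basis."""
    return (e_a_alone - e_a_in_ab) + (e_b_alone - e_b_in_ab)


if __name__ == "__main__":
    # Illustrative numbers only (hartree), not results from the thesis.
    e_ab, e_a_ghost, e_b_ghost = -152.060, -76.027, -76.025
    e_a, e_b = -76.026, -76.024
    print("CP-corrected E_int:", cp_corrected_interaction_energy(e_ab, e_a_ghost, e_b_ghost))
    print("BSSE estimate     :", bsse_estimate(e_a_ghost, e_b_ghost, e_a, e_b))
```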

Relevance: 30.00%

Abstract:

The perturbed Hartree–Fock theory developed in the preceding paper is applied to LiH, BH, and HF, using limited basis-set SCF–MO wavefunctions derived by previous workers. The calculated values for the force constant ke and the dipole-moment derivative μ(1) are (experimental values in parentheses): LiH, ke = 1.618 (1.026) mdyn/Å, μ(1) = −18.77 (−2.0 ± 0.3) D/Å; BH, ke = 5.199 (3.032) mdyn/Å, μ(1) = −1.03 (—) D/Å; HF, ke = 12.90 (9.651) mdyn/Å, μ(1) = −2.15 (+1.50) D/Å. The values of the force on the proton were calculated exactly and according to the Hellmann–Feynman theorem in each case, and the discrepancies show that none of the wavefunctions used is close to the Hartree–Fock limit, so the large errors in ke and μ(1) are not surprising. However, no difficulties arose in the perturbed Hartree–Fock calculation, so the application of the theory to more accurate wavefunctions appears quite feasible.
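
For reference, the two quantities tabulated above are the curvature of the potential curve and the dipole-moment derivative at the equilibrium bond length; the following is the standard definition, not a formula reproduced from the paper:

```latex
k_e = \left.\frac{\partial^2 E}{\partial R^2}\right|_{R=R_e},
\qquad
\mu^{(1)} = \left.\frac{\partial \mu}{\partial R}\right|_{R=R_e}
```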

Relevance: 30.00%

Abstract:

The theory of harmonic force constant refinement calculations is reviewed, and a general-purpose program for force constant and normal coordinate calculations is described. The program, called ASYM20, is available through the Quantum Chemistry Program Exchange. It will work on molecules of any symmetry containing up to 20 atoms and will produce results for a series of isotopomers as desired. The vibrational secular equations are solved in either nonredundant valence internal coordinates or symmetry coordinates. As well as calculating the (harmonic) vibrational wavenumbers and normal coordinates, the program will calculate centrifugal distortion constants, Coriolis zeta constants, harmonic contributions to the α's, root-mean-square amplitudes of vibration, and other quantities related to gas electron-diffraction studies and thermodynamic properties. The program will work in either a predict mode, in which it calculates results from an input force field, or a refine mode, in which it refines an input force field by least squares to fit observed data on the quantities mentioned above. Predicate values of the force constants may be included in the data set for a least-squares refinement. The program is written in FORTRAN for use on a PC or a mainframe computer. Operation is mainly controlled by steering indices in the input data file, but some interactive control is also implemented.
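
The vibrational secular equations mentioned above are conventionally solved in the Wilson GF formulation. The following minimal sketch (our illustration, not ASYM20 itself) diagonalises GF for a force field in mdyn/Å and a G matrix in amu⁻¹, and converts the eigenvalues to harmonic wavenumbers:

```python
import numpy as np

def harmonic_wavenumbers(F, G):
    """Solve the vibrational secular equation |GF - lambda*I| = 0.

    F : force-constant matrix in internal coordinates, mdyn/Angstrom (= 100 N/m)
    G : Wilson G matrix (inverse-mass weighted), 1/amu
    Returns harmonic wavenumbers in cm^-1.
    """
    amu = 1.66053906660e-27   # kg
    c = 2.99792458e10         # speed of light, cm/s
    lam = np.linalg.eigvals(G @ F).real      # (mdyn/Angstrom)/amu
    lam_si = lam * 100.0 / amu               # s^-2
    return np.sqrt(np.clip(lam_si, 0.0, None)) / (2.0 * np.pi * c)

if __name__ == "__main__":
    # Illustrative 1x1 case: a diatomic stretch with F = 9.65 mdyn/A and a
    # reduced mass of 0.9571 amu (roughly HF) gives a wavenumber near 4100 cm^-1.
    F = np.array([[9.65]])
    G = np.array([[1.0 / 0.9571]])
    print(harmonic_wavenumbers(F, G))
```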

Relevance: 30.00%

Abstract:

1. The management of threatened species is an important practical way in which conservationists can intervene in the extinction process and reduce the loss of biodiversity. Understanding the causes of population declines (past, present and future) is pivotal to designing effective practical management. This is the declining-population paradigm identified by Caughley. 2. There are three broad classes of ecological tool used by conservationists to guide management decisions for threatened species: statistical models of habitat use, demographic models and behaviour-based models. Each of these is described here, illustrated with a case study and evaluated critically in terms of its practical application. 3. These tools are fundamentally different. Statistical models of habitat use and demographic models both use descriptions of patterns in abundance and demography, in relation to a range of factors, to inform management decisions. In contrast, behaviour-based models describe the evolutionary processes underlying these patterns, and derive such patterns from the strategies employed by individuals when competing for resources under a specific set of environmental conditions. 4. Statistical models of habitat use and demographic models have been used successfully to make management recommendations for declining populations. To do this, assumptions are made about population growth or vital rates that will apply when environmental conditions are restored, based on either past data collected under favourable environmental conditions or estimates of these parameters when the agent of decline is removed. As a result, they can only be used to make reliable quantitative predictions about future environments when a comparable environment has been experienced by the population of interest in the past. 5. Many future changes in the environment driven by management will not have been experienced by a population in the past. Under these circumstances, vital rates and their relationship with population density will change in the future in a way that is not predictable from past patterns. Reliable quantitative predictions about population-level responses then need to be based on an explicit consideration of the evolutionary processes operating at the individual level. 6. Synthesis and applications. It is argued that evolutionary theory underpins Caughley’s declining-population paradigm, and that it needs to become much more widely used within mainstream conservation biology. This will help conservationists examine critically the reliability of the tools they have traditionally used to aid management decision-making. It will also give them access to alternative tools, particularly when predictions are required for changes in the environment that have not been experienced by a population in the past.
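
As a concrete illustration of the "demographic model" class referred to above (our own toy example, not one from the paper), a stage-structured projection can be written as a matrix iteration whose dominant eigenvalue gives the asymptotic population growth rate:

```python
import numpy as np

# Hypothetical 3-stage Lefkovitch matrix: fecundities on the first row,
# survival/transition probabilities below. Values are illustrative only.
A = np.array([
    [0.0, 1.5, 2.0],
    [0.3, 0.0, 0.0],
    [0.0, 0.5, 0.8],
])

n = np.array([100.0, 20.0, 10.0])   # initial stage abundances
for _ in range(10):                  # project ten time steps
    n = A @ n

lam = np.max(np.abs(np.linalg.eigvals(A)))  # asymptotic growth rate
print("abundances after 10 steps:", n.round(1))
print("dominant eigenvalue (lambda):", round(lam, 3))
```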


Relevance: 30.00%

Abstract:

Population subdivision complicates analysis of molecular variation. Even if neutrality is assumed, three evolutionary forces need to be considered: migration, mutation, and drift. Simplification can be achieved by assuming that the process of migration among and drift within subpopulations is occurring fast compared to mutation and drift in the entire population. This allows a two-step approach in the analysis: (i) analysis of population subdivision and (ii) analysis of molecular variation in the migrant pool. We model population subdivision using an infinite island model, where we allow the migration/drift parameter Theta to vary among populations. Thus, central and peripheral populations can be differentiated. For inference of Theta, we use a coalescence approach, implemented via a Markov chain Monte Carlo (MCMC) integration method that allows estimation of allele frequencies in the migrant pool. The second step of this approach (analysis of molecular variation in the migrant pool) uses the estimated allele frequencies in the migrant pool for the study of molecular variation. We apply this method to a Drosophila ananassae sequence data set. We find little indication of isolation by distance, but large differences in the migration parameter among populations. The population as a whole seems to be expanding. A population from Bogor (Java, Indonesia) shows the highest variation and seems closest to the species center.
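
One stylised way to picture a per-population migration/drift parameter Theta (a sketch under the Beta/beta-binomial approximation to the infinite island model, not the coalescent MCMC machinery used in the paper) is to maximise the likelihood of allele counts around an assumed migrant-pool frequency; all data below are hypothetical.

```python
import numpy as np
from scipy.stats import betabinom

def theta_loglik(theta, counts, sizes, p_migrant):
    """Log-likelihood of allele counts in one subpopulation under the island model.

    Subpopulation allele frequencies ~ Beta(theta*p, theta*(1-p)) around the
    migrant-pool frequency p; sampled counts are then beta-binomial.
    """
    ll = 0.0
    for k, n, p in zip(counts, sizes, p_migrant):
        ll += betabinom.logpmf(k, n, theta * p, theta * (1.0 - p))
    return ll

if __name__ == "__main__":
    # Illustrative data: allele counts k out of n sampled chromosomes at 4 loci,
    # with assumed migrant-pool frequencies p (all numbers hypothetical).
    counts = [12, 3, 20, 7]
    sizes = [30, 30, 30, 30]
    p_migrant = [0.4, 0.1, 0.7, 0.25]
    grid = np.linspace(0.5, 50.0, 200)
    best = max(grid, key=lambda t: theta_loglik(t, counts, sizes, p_migrant))
    print("grid-MLE of Theta:", round(best, 2))
```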

Relevance: 30.00%

Abstract:

Formal and analytical models that contractors can use to assess and price project risk at the tender stage have proliferated in recent years. However, they are rarely used in practice. Introducing more models would, therefore, not necessarily help. A better understanding is needed of how contractors arrive at a bid price in practice, and how, and in what circumstances, risk apportionment actually influences pricing levels. More than 60 risk models for contractors proposed in the published literature were examined and classified. Exploratory interviews with five UK contractors, and documentary analyses of how contractors price work generally and risk specifically, were then carried out to compare the propositions from the literature with what contractors actually do. No comprehensive literature on the real bidding processes used in practice was found, and there is no evidence that pricing is systematic. Hence, systematic risk and pricing models for contractors may have no justifiable basis. Contractors process their bids through certain tendering gateways. They acknowledge the risk that they should price. However, the final settlement depends on a set of complex, micro-economic factors. Hence, risk accountability may be smaller than its true cost to the contractor. Risk apportionment occurs at three stages of the whole bid-pricing process. However, analytical approaches tend not to incorporate this, although they could.

Relevance: 30.00%

Abstract:

Experimentally and theoretically determined infrared spectra are reported for a series of straight-chain perfluorocarbons: C2F6, C3F8, C4F10, C5F12, C6F14, and C8F18. Theoretical spectra were determined using both density functional theory (DFT) and ab initio methods. Radiative efficiencies (REs) were determined using the method of Pinnock et al. (1995) and combined with atmospheric lifetimes from the literature to determine global warming potentials (GWPs). Theoretically determined absorption cross sections were within 10% of experimentally determined values. Despite being much less computationally expensive, DFT calculations were generally found to perform better than ab initio methods. There is a strong wavenumber dependence of radiative forcing in the region of the fundamental C-F vibration, so small differences between theoretically and experimentally determined band positions have a significant impact on the REs. We apply an empirical correction to the theoretical spectra and then test this correction on a number of branched-chain and cyclic perfluoroalkanes. We then compute absorption cross sections, REs, and GWPs for an additional set of perfluoroalkenes.
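
The RE-to-GWP chain described above can be summarised in a short sketch: the absorption cross section is integrated against a per-wavenumber instantaneous-forcing function in the spirit of Pinnock et al. (1995), and the resulting RE is combined with the atmospheric lifetime to give an absolute GWP relative to CO2. The forcing function is assumed to be supplied, unit conversions are omitted, and nothing here is the authors' code.

```python
import numpy as np

def radiative_efficiency(wavenumbers, cross_section, forcing_per_cross_section):
    """Radiative efficiency from an absorption spectrum.

    wavenumbers               : spectral grid in cm^-1
    cross_section             : absorption cross section on that grid
    forcing_per_cross_section : instantaneous forcing per unit cross section on the
                                same grid (Pinnock-style lookup, assumed to be given)
    """
    return np.trapz(cross_section * forcing_per_cross_section, wavenumbers)

def gwp(re_x, lifetime_x, horizon, agwp_co2):
    """GWP over `horizon` years for a gas decaying exponentially with the given lifetime.

    agwp_co2 : absolute GWP of CO2 over the same horizon, in matching units
               (mass/mixing-ratio conversions are omitted for brevity).
    """
    agwp_x = re_x * lifetime_x * (1.0 - np.exp(-horizon / lifetime_x))
    return agwp_x / agwp_co2
```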

Relevance: 30.00%

Abstract:

Real estate development appraisal is a quantification of future expectations. The appraisal model relies upon the valuer/developer having an understanding of the future in terms of the future marketability of the completed development and the future cost of development. In some cases the developer has some degree of control over the possible variation in the variables, as with the cost of construction through the choice of specification. However, other variables, such as the sale price of the final product, are totally dependent upon the vagaries of the market at the completion date. To try to address the risk of an outcome different from the one expected (modelled), the developer will often carry out a sensitivity analysis on the development. However, traditional sensitivity analysis has generally only looked at the best and worst scenarios and has focused on the anticipated or expected outcomes. This does not take into account uncertainty and the range of outcomes that can occur. A fuller analysis should include examination of the uncertainties in each of the components of the appraisal and account for the appropriate distributions of the variables. Similarly, as many of the variables in the model are not independent, the variables need to be correlated. This requires a standardised approach, and we suggest that the use of a generic forecasting software package, in this case Crystal Ball, allows the analyst to work with an existing development appraisal model set up in Excel (or another spreadsheet) and with a predetermined set of probability distributions. Without a full knowledge of risk, developers are unable to determine the anticipated level of return that should be sought to compensate for that risk. This model allows the user a better understanding of the possible outcomes for the development. Ultimately the final decision will be made relative to current expectations and current business constraints, but by assessing the upside and downside risks more appropriately, the decision maker should be better placed to make a more informed and “better” decision.
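
The distribution-based analysis advocated above does not depend on a particular add-in; the sketch below reproduces the idea in plain Python, with correlated lognormal draws for sale value and build cost feeding a simple residual appraisal, and the full profit distribution examined rather than a best/worst pair. All distributions, correlations, and figures are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical inputs: completed sale value and construction cost for the scheme,
# lognormally distributed and positively correlated (rho = 0.4).
mu = np.array([np.log(12_000_000), np.log(7_000_000)])
sigma = np.array([0.10, 0.08])
rho = 0.4
cov = np.array([[sigma[0]**2, rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1], sigma[1]**2]])
draws = np.exp(rng.multivariate_normal(mu, cov, size=n))
sale_value, build_cost = draws[:, 0], draws[:, 1]

fees = 0.12 * build_cost            # professional fees as a share of cost
finance = 0.05 * (build_cost + fees)
land_price = 2_500_000              # assumed fixed for this illustration

profit = sale_value - (build_cost + fees + finance + land_price)
print("mean profit   :", round(profit.mean()))
print("5th percentile:", round(np.percentile(profit, 5)))
print("P(profit < 0) :", round((profit < 0).mean(), 3))
```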

Relevance: 30.00%

Abstract:

We propose and analyse a class of evolving network models suitable for describing a dynamic topological structure. Applications include telecommunication, on-line social behaviour and information processing in neuroscience. We model the evolving network as a discrete time Markov chain, and study a very general framework where, conditioned on the current state, edges appear or disappear independently at the next timestep. We show how to exploit symmetries in the microscopic, localized rules in order to obtain conjugate classes of random graphs that simplify analysis and calibration of a model. Further, we develop a mean field theory for describing network evolution. For a simple but realistic scenario incorporating the triadic closure effect that has been empirically observed by social scientists (friends of friends tend to become friends), the mean field theory predicts bistable dynamics, and computational results confirm this prediction. We also discuss the calibration issue for a set of real cell phone data, and find support for a stratified model, where individuals are assigned to one of two distinct groups having different within-group and across-group dynamics.
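
A minimal simulation in the spirit of the model class described above: conditioned on the current graph, each absent edge appears with a probability boosted by the number of common neighbours (triadic closure) and each existing edge disappears independently. The parameter values are arbitrary and the code is an illustration, not the authors' implementation.

```python
import numpy as np

def step(adj, rng, p_base=0.001, p_triad=0.05, q_death=0.1):
    """One Markov-chain step: conditioned on the current adjacency matrix, each
    absent edge appears with probability p_base + p_triad * (#common neighbours),
    each existing edge disappears with probability q_death, all independently."""
    n = adj.shape[0]
    common = adj @ adj                       # common-neighbour counts per pair
    p_birth = np.clip(p_base + p_triad * common, 0.0, 1.0)
    r = rng.random((n, n))
    r = np.triu(r, 1); r = r + r.T           # one symmetric random draw per pair
    new = np.where(adj == 1,
                   (r >= q_death).astype(int),   # survival of existing edges
                   (r < p_birth).astype(int))    # birth of new edges
    np.fill_diagonal(new, 0)
    return new

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 100
    adj = (rng.random((n, n)) < 0.02).astype(int)
    adj = np.triu(adj, 1); adj = adj + adj.T
    for t in range(200):
        adj = step(adj, rng)
    print("final edge density:", adj.sum() / (n * (n - 1)))
```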

Relevance: 30.00%

Abstract:

We consider the general response theory recently proposed by Ruelle for describing the impact of small perturbations to the non-equilibrium steady states resulting from Axiom A dynamical systems. We show that the causality of the response functions entails the possibility of writing a set of Kramers-Kronig (K-K) relations for the corresponding susceptibilities at all orders of nonlinearity. Nonetheless, only a special class of directly observable susceptibilities obey K-K relations. Specific results are provided for the case of arbitrary-order harmonic response, which allows for a very comprehensive K-K analysis and the establishment of sum rules connecting the asymptotic behavior of the harmonic generation susceptibility to the short-time response of the perturbed system. These results place previous findings obtained for optical systems and simple mechanical models in a more general theoretical framework, and shed light on the very general impact of considering the principle of causality for testing self-consistency: the described dispersion relations constitute unavoidable benchmarks that any experimental and model-generated dataset must obey. The theory exposed in the present paper is dual to the time-dependent theory of perturbations to equilibrium states and to non-equilibrium steady states, and has in principle a similar range of applicability and limitations. In order to connect the equilibrium and the non-equilibrium steady state cases, we show how to rewrite the classical response theory by Kubo so that response functions formally identical to those proposed by Ruelle, apart from the measure involved in the phase space integration, are obtained. These results, taking into account the chaotic hypothesis by Gallavotti and Cohen, might be relevant in several fields, including climate research. In particular, whereas the fluctuation-dissipation theorem does not work for non-equilibrium systems, because of the non-equivalence between internal and external fluctuations, K-K relations might be robust tools for the definition of a self-consistent theory of climate change.
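
For orientation, the first-order Kramers-Kronig relation implied by causality of the response function has the familiar form below (a standard statement, not transcribed from the paper); the higher-order and harmonic-response relations discussed above are structurally analogous:

```latex
\mathrm{Re}\,\chi(\omega) \;=\; \frac{2}{\pi}\,\mathrm{P}\!\int_{0}^{\infty}
\frac{\omega'\,\mathrm{Im}\,\chi(\omega')}{\omega'^{2}-\omega^{2}}\,\mathrm{d}\omega'
```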

Relevance: 30.00%

Abstract:

Using the formalism of the Ruelle response theory, we study how the invariant measure of an Axiom A dynamical system changes as a result of adding noise, and describe how the stochastic perturbation can be used to explore the properties of the underlying deterministic dynamics. We first find the expression for the change in the expectation value of a general observable when a white noise forcing is introduced in the system, both in the additive and in the multiplicative case. We also show that the difference between the expectation value of the power spectrum of an observable in the stochastically perturbed case and of the same observable in the unperturbed case is equal to the variance of the noise times the square of the modulus of the linear susceptibility describing the frequency-dependent response of the system to perturbations with the same spatial patterns as the considered stochastic forcing. This provides a conceptual bridge between the change in the fluctuation properties of the system due to the presence of noise and the response of the unperturbed system to deterministic forcings. Using Kramers-Kronig theory, it is then possible to derive the real and imaginary part of the susceptibility and thus deduce the Green function of the system for any desired observable. We then extend our results to rather general patterns of random forcing, from the case of several white noise forcings, to noise terms with memory, up to the case of a space-time random field. Explicit formulas are provided for each relevant case analysed. As a general result, we find, using an argument of positive-definiteness, that the power spectrum of the stochastically perturbed system is larger at all frequencies than the power spectrum of the unperturbed system. We provide an example of application of our results by considering the spatially extended chaotic Lorenz 96 model. These results clarify the property of stochastic stability of SRB measures in Axiom A flows, provide tools for analysing stochastic parameterisations and related closure ansatz to be implemented in modelling studies, and introduce new ways to study the response of a system to external perturbations. Taking into account the chaotic hypothesis, we expect that our results have practical relevance for a more general class of system than those belonging to Axiom A.
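
The central relation stated above can be written schematically as follows for a single white-noise forcing of variance ε² with a fixed spatial pattern, where S denotes the power spectrum of the chosen observable and χ the corresponding linear susceptibility (the notation is ours, not the paper's):

```latex
S_{\mathrm{perturbed}}(\omega) \;-\; S_{\mathrm{unperturbed}}(\omega)
\;=\; \varepsilon^{2}\,\lvert \chi(\omega)\rvert^{2} \;\ge\; 0
```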

Relevance: 30.00%

Abstract:

This paper provides a new proof of a theorem of Chandler-Wilde, Chonchaiya, and Lindner that the spectra of a certain class of infinite, random, tridiagonal matrices contain the unit disc almost surely. It also obtains an analogous result for a more general class of random matrices whose spectra contain a hole around the origin. The presence of the hole forces substantial changes to the analysis.
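
A quick numerical illustration of the kind of operator involved (a tridiagonal "random hopping" matrix with ones on one off-diagonal and i.i.d. ±1 entries on the other, which we take as a stand-in for the class studied in the paper) is easy to set up; eigenvalues of a finite section only hint at the spectrum of the infinite random operator that the theorem concerns.

```python
import numpy as np

def random_hopping_matrix(n, rng):
    """Finite section of a tridiagonal random hopping matrix: zeros on the
    diagonal, ones on the superdiagonal, i.i.d. +/-1 entries on the subdiagonal
    (an assumed stand-in for the class discussed above)."""
    sub = rng.choice([-1.0, 1.0], size=n - 1)
    return np.diag(np.ones(n - 1), 1) + np.diag(sub, -1)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    eigs = np.linalg.eigvals(random_hopping_matrix(500, rng))
    # Fraction of eigenvalues of this finite section lying in the closed unit disc.
    print("inside unit disc:", np.mean(np.abs(eigs) <= 1.0))
```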

Relevance: 30.00%

Abstract:

We review and structure some of the mathematical and statistical models that have been developed over the past half century to grapple with theoretical and experimental questions about the stochastic development of aging over the life course. We suggest that the mathematical models are in large part addressing the problem of partitioning the randomness in aging: How does aging vary between individuals, and within an individual over the life course? How much of the variation is inherently related to some qualities of the individual, and how much is entirely random? How much of the randomness is cumulative, and how much is merely short-term flutter? We propose that recent lines of statistical inquiry in survival analysis could usefully grapple with these questions, all the more so if they were more explicitly linked to the relevant mathematical and biological models of aging. To this end, we describe points of contact among the various lines of mathematical and statistical research. We suggest some directions for future work, including the exploration of information-theoretic measures for evaluating components of stochastic models as the basis for analyzing experiments and anchoring theoretical discussions of aging.
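
One way to make the "partitioning of randomness" concrete (our illustration, not a model taken from the review) is a Gompertz mortality model with gamma-distributed frailty: the frailty term carries the variation that is fixed within an individual, while the remaining waiting-time randomness is purely stochastic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Hypothetical Gompertz baseline hazard h(t) = a*exp(b*t), with a multiplicative
# gamma-distributed frailty z per individual (mean 1, variance 1/k).
a, b, k = 1e-4, 0.09, 4.0
z = rng.gamma(shape=k, scale=1.0 / k, size=n)

# Inverse-transform sampling from S(t | z) = exp(-z*(a/b)*(exp(b*t) - 1)).
u = rng.random(n)
ages = np.log(1.0 - b * np.log(u) / (a * z)) / b

print("median age at death            :", round(np.median(ages), 1))
print("between-individual SD of frailty:", round(z.std(), 2))
```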

Relevance: 30.00%

Abstract:

In this paper I analyze the general equilibrium in a random Walrasian economy. Dependence among agents is introduced in the form of dependency neighborhoods. Under uncertainty, an agent may fail to survive due to a meager endowment in a particular state (a direct effect), as well as due to an unfavorable equilibrium price system at which the value of the endowment falls short of the minimum needed for survival (an indirect terms-of-trade effect). To illustrate the main result, I compute the stochastic limit of the equilibrium price and the probability of survival of an agent in a large Cobb-Douglas economy.
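
The survival question described above can be illustrated with a deterministic toy version (not the paper's stochastic framework): equilibrium prices of a small Cobb-Douglas exchange economy are found by fixed-point iteration, and an agent "survives" if the equilibrium value of its endowment exceeds a subsistence level. All expenditure shares, endowments, and the threshold below are made up.

```python
import numpy as np

def equilibrium_prices(alpha, endow, iters=500):
    """Fixed-point iteration for a Cobb-Douglas exchange economy.

    alpha : (agents x goods) expenditure shares, rows sum to 1
    endow : (agents x goods) endowments
    Returns prices normalised to sum to 1. Market clearing requires
    p_j = sum_i alpha_ij * (p . e_i) / total supply of good j.
    """
    n_goods = alpha.shape[1]
    p = np.ones(n_goods) / n_goods
    supply = endow.sum(axis=0)
    for _ in range(iters):
        wealth = endow @ p                        # value of each agent's endowment
        p_new = (alpha * wealth[:, None]).sum(axis=0) / supply
        p = p_new / p_new.sum()                   # prices are homogeneous of degree 0
    return p

if __name__ == "__main__":
    alpha = np.array([[0.6, 0.4],
                      [0.2, 0.8],
                      [0.5, 0.5]])
    endow = np.array([[1.0, 0.1],
                      [0.2, 2.0],
                      [0.05, 0.1]])
    p = equilibrium_prices(alpha, endow)
    wealth = endow @ p
    subsistence = 0.08                            # hypothetical minimum wealth
    print("prices  :", p.round(3))
    print("wealth  :", wealth.round(3))
    print("survives:", wealth > subsistence)
```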