107 results for 010201 Approximation Theory and Asymptotic Methods
Abstract:
Population subdivision complicates the analysis of molecular variation. Even if neutrality is assumed, three evolutionary forces need to be considered: migration, mutation, and drift. Simplification can be achieved by assuming that migration among, and drift within, subpopulations occur fast compared with mutation and drift in the entire population. This allows a two-step analysis: (i) analysis of population subdivision and (ii) analysis of molecular variation in the migrant pool. We model population subdivision using an infinite island model in which the migration/drift parameter Theta is allowed to vary among populations, so that central and peripheral populations can be differentiated. For inference of Theta, we use a coalescent approach, implemented via Markov chain Monte Carlo (MCMC) integration, which also yields estimates of allele frequencies in the migrant pool. The second step then uses these estimated migrant-pool allele frequencies to study molecular variation. We apply the method to a Drosophila ananassae sequence data set. We find little indication of isolation by distance, but large differences in the migration parameter among populations. The population as a whole seems to be expanding, and a population from Bogor (Java, Indonesia) shows the highest variation and seems closest to the species center.
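The coalescent MCMC machinery the abstract describes is far richer than can be reproduced here, but the core move, sampling a migration/drift parameter Theta by Markov chain Monte Carlo, can be sketched in a few lines. Everything below is a simplified, hypothetical stand-in, not the authors' method: a beta-binomial island-model likelihood with the migrant-pool allele frequency assumed known, a weak prior on log(Theta), and a random-walk Metropolis sampler. The data values (12 copies of an allele in 40 sampled chromosomes) are invented for illustration.

```python
import math
import random

def log_beta(a, b):
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def log_lik(theta, k, n, p):
    # Beta-binomial likelihood: subpopulation allele frequencies drift
    # around the migrant-pool frequency p, with theta acting as a
    # precision (large theta = strong migration, little drift).
    a, b = p * theta, (1.0 - p) * theta
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + log_beta(k + a, n - k + b) - log_beta(a, b))

def log_post(log_theta, k, n, p):
    # Weak Normal(0, 2^2) prior on log(theta) keeps the posterior proper.
    return log_lik(math.exp(log_theta), k, n, p) - log_theta ** 2 / 8.0

def mcmc_theta(k, n, p, steps=20000, seed=1):
    """Random-walk Metropolis sampler for theta on the log scale."""
    rng = random.Random(seed)
    cur_lt = 0.0
    cur_lp = log_post(cur_lt, k, n, p)
    samples = []
    for _ in range(steps):
        prop = cur_lt + rng.gauss(0.0, 0.5)
        prop_lp = log_post(prop, k, n, p)
        # Symmetric proposal on log(theta): accept with the posterior ratio.
        if math.log(rng.random()) < prop_lp - cur_lp:
            cur_lt, cur_lp = prop, prop_lp
        samples.append(math.exp(cur_lt))
    return samples

# Toy data: 12 copies of an allele in a sample of 40 chromosomes,
# migrant-pool frequency assumed known at 0.5.
samples = mcmc_theta(k=12, n=40, p=0.5)
burned = samples[len(samples) // 2:]
print(sum(burned) / len(burned))  # posterior-mean estimate of theta
```

In the paper's two-step scheme the migrant-pool frequencies are themselves estimated within the MCMC rather than fixed, and Theta varies per population; the sketch collapses both simplifications into one population with a known pool frequency.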
Abstract:
G3B3 and G2MP2 calculations using Gaussian 03 have been carried out to investigate the protonation preferences for phenylboronic acid. All nine heavy atoms have been protonated in turn. With both methodologies, the two lowest protonation energies are obtained with the proton located either at the ipso carbon atom or at a hydroxyl oxygen atom. Within the G3B3 formalism, the lowest-energy configuration, by 4.3 kcal·mol⁻¹, is found when the proton is located at the ipso carbon rather than at the electronegative oxygen atom. In the resulting structure, the phenyl ring has lost a significant amount of aromaticity. By contrast, calculations with G2MP2 show that protonation at the hydroxyl oxygen atom is favored by 7.7 kcal·mol⁻¹. Calculations using the polarizable continuum model (PCM) solvent method also give preference to protonation at the oxygen atom when water is used as the solvent. The preference for protonation at the ipso carbon found by the more accurate G3B3 method is unexpected, and its implications for Suzuki coupling are discussed. (C) 2006 Wiley Periodicals, Inc.
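To get a sense of scale for the reported gaps (this back-of-the-envelope conversion is not part of the paper), a Boltzmann factor turns an energy difference between two protonation sites into an equilibrium population ratio, under the simplifying assumption that the quoted protonation-energy differences can be treated as free-energy differences at 298 K:

```python
import math

R = 1.987204e-3   # gas constant, kcal·mol⁻¹·K⁻¹
T = 298.15        # temperature, K

def boltzmann_ratio(delta_e_kcal):
    """Equilibrium population ratio (minor/major site) for two sites
    separated by delta_e_kcal, assuming the energy gap approximates a
    free-energy gap at temperature T."""
    return math.exp(-delta_e_kcal / (R * T))

for label, gap in [("G3B3 gap (ipso carbon favored)", 4.3),
                   ("G2MP2 gap (hydroxyl oxygen favored)", 7.7)]:
    print(f"{label}: minor/major ratio ~ {boltzmann_ratio(gap):.1e}")
```

Either gap would leave the disfavored site essentially unpopulated at equilibrium (roughly 1 in 10³ for 4.3 kcal·mol⁻¹ and 1 in 10⁶ for 7.7 kcal·mol⁻¹), which is why the disagreement between the two methods over *which* site wins matters more than the numerical size of the gaps.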
Abstract:
The constructivist model of 'soft' value management (VM) is contrasted with the VM discourse appropriated by cost consultants who operate from within UK quantity surveying (QS) practices. The enactment of VM by cost consultants is shaped by the institutional context within which they operate and is not necessarily representative of VM practice per se. Opportunities to perform VM during the formative stages of design are further constrained by the positivistic rhetoric that such practitioners use to conceptualize and promote their services. The complex interplay between VM theory and practice is highlighted and analysed from a non-deterministic perspective. Codified models of 'best practice' are seen to be socially constructed and legitimized through human interaction in the context of interorganizational networks. Published methodologies are seen to inform practice in only a loose and indirect manner, with extensive scope for localized improvisation. New insights into the relationship between VM theory and practice are derived from the dramaturgical metaphor. The social reality of VM is seen to be constituted through scripts and performances, both of which are continuously contested across organizational arenas. It is concluded that VM defies universal definition and is conceptualized and enacted differently across different localized contexts.
Abstract:
Formal and analytical models that contractors can use to assess and price project risk at the tender stage have proliferated in recent years. However, they are rarely used in practice, so introducing more models would not necessarily help. A better understanding is needed of how contractors arrive at a bid price in practice, and of how, and in what circumstances, risk apportionment actually influences pricing levels. More than 60 risk models proposed for contractors and published in journals were examined and classified. Exploratory interviews with five UK contractors and documentary analyses of how contractors price work generally, and risk specifically, were then carried out to compare the propositions from the literature with what contractors actually do. No comprehensive literature on the real bidding processes used in practice was found, and there is no evidence that pricing is systematic. Hence, systematic risk and pricing models for contractors may have no justifiable basis. Contractors process their bids through certain tendering gateways, and they acknowledge the risk that they should price. The final settlement, however, depends on a set of complex, microeconomic factors, so the risk accounted for may be smaller than its true cost to the contractor. Risk apportionment occurs at three stages of the whole bid-pricing process, yet analytical approaches tend not to incorporate this, although they could.
Abstract:
Syntactic theory provides a rich array of representational assumptions about linguistic knowledge and processes. Such detailed and independently motivated constraints on grammatical knowledge ought to play a role in sentence comprehension. However, most grammar-based explanations of processing difficulty in the literature have attempted to use grammatical representations and processes per se to explain processing difficulty. They did not take into account that the description of higher cognition in mind and brain spans two levels: at the macrolevel, symbolic computation is performed; at the microlevel, computation is achieved through processes within a dynamical system. One critical question is therefore how linguistic theory and dynamical systems can be unified to provide an explanation for processing effects. Here, we present such a unification for a particular account of syntactic theory: a parser for Stabler's Minimalist Grammars, in the framework of Smolensky's Integrated Connectionist/Symbolic architectures. In simulations we demonstrate that the connectionist minimalist parser produces predictions which mirror global empirical findings from psycholinguistic research.
Abstract:
The aim of this introductory paper, and of this special issue of Cognition and Emotion, is to stimulate debate about theoretical issues that will inform child anxiety research in the coming years. Papers included in this special issue have arisen from an Economic and Social Research Council (ESRC, UK) funded seminar series, which we called Child Anxiety Theory and Treatment (CATTS). We begin with an overview of the CATTS project before discussing (1) the application of adult models of anxiety to children, and (2) the role of parents in child anxiety. We explore the utility of adult models of anxiety for child populations before discussing the problems that are associated with employing them uncritically in this context. The study of anxiety in children provides the opportunity to observe the trajectory of anxiety and to identify variables that causally influence its development. Parental influences are of particular interest and new and imaginative strategies are required to isolate the complex network of causal relationships therein. We conclude by suggesting that research into the causes and developmental course of anxiety in children should be developed further. We also propose that, although much is known about the role of parents in the development of anxiety, it would be useful for research in this area to move towards an examination of the specific processes involved. We hope that these views represent a constructive agenda for people in the field to consider when planning future research.
Abstract:
This paper formally derives a blocked version, with blocks of size two, of a path-based neural branch prediction algorithm (FPP), yielding a lower-cost hardware solution while maintaining input-output characteristics similar to the original algorithm. The blocked solution, referred to here as the B2P algorithm, is obtained using graph theory and retiming methods. Verification showed that the prediction performance of the FPP and B2P algorithms differs by within one misprediction per thousand instructions, using a known framework for branch prediction evaluation. For a chosen FPGA device, circuits generated from the B2P algorithm showed average area savings of over 25% compared with circuits for the FPP algorithm, with similar timing performance, making the proposed blocked predictor superior from a practical viewpoint.
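The FPP and B2P designs themselves are hardware algorithms not reproduced in this listing, but the family they belong to, path-based neural branch prediction, is easy to sketch in software. The following is a generic, hypothetical illustration of the idea (each recent branch address on the path selects a weight, and the signed sum of those weights gives the prediction), not the paper's FPP or B2P circuit; table sizes, history length, and the training threshold are arbitrary choices.

```python
class PathNeuralPredictor:
    """Minimal path-based neural branch predictor (a generic sketch):
    each of the last H branch addresses indexes its own weight table,
    and the signed sum of the selected weights predicts the outcome."""

    def __init__(self, history=8, tables=64):
        self.H = history
        self.tables = tables
        # One weight table per history position, indexed by branch address.
        self.w = [[0] * tables for _ in range(history + 1)]
        self.path = [0] * history    # addresses of recent branches
        self.hist = [0] * history    # their outcomes, encoded +1 / -1

    def predict(self, pc):
        y = self.w[0][pc % self.tables]  # bias weight for this branch
        for i in range(self.H):
            y += self.hist[i] * self.w[i + 1][self.path[i] % self.tables]
        return y >= 0, y

    def update(self, pc, taken, threshold=10):
        pred, y = self.predict(pc)
        t = 1 if taken else -1
        # Perceptron training rule: adjust on a miss or a low-confidence hit.
        if pred != taken or abs(y) <= threshold:
            self.w[0][pc % self.tables] += t
            for i in range(self.H):
                self.w[i + 1][self.path[i] % self.tables] += t * self.hist[i]
        # Shift this branch into the path and outcome histories.
        self.path = [pc] + self.path[:-1]
        self.hist = [t] + self.hist[:-1]

# Quick demo on a loop-like pattern: taken, taken, taken, not-taken, ...
p = PathNeuralPredictor()
hits = 0
pattern = [True, True, True, False] * 500
for taken in pattern:
    pred, _ = p.predict(0x40)
    hits += (pred == taken)
    p.update(0x40, taken)
print(hits / len(pattern))  # accuracy after learning the pattern
```

With a single static branch the path degenerates to one address, so the demo reduces to an ordinary perceptron over outcome history; the path indexing only pays off across many distinct branches, which is exactly the regime the paper's hardware evaluation targets.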