65 results for Elementary Methods In Number Theory
Abstract:
This paper examines the potential of using Participatory Farm Management methods to assess the suitability of a technology with farmers prior to on-farm trials. A study examining the suitability of green manuring as a technology for use with wet season tomato producers in Ghana is described. Findings from this case-study demonstrate that Participatory Budgeting can be used by farmers and researchers to analyse current cultivation practices, identify the options for including green manures into the system and explore the direct and wider resource implications of the technology. Scored-Causal Diagrams can be used to identify farmers' perceptions of the relative importance of the problem that the technology seeks to address. The use of the methods in this ex ante evaluation process appears to have the potential to improve the effectiveness and efficiency of the adaptive research process. This ensures that technologies subsequently examined in trials are relevant to farmers' interests, existing systems and resources, thereby increasing the chances of farmer adoption. It is concluded that this process has potential for use with other technologies and in other farming systems. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
In conventional phylogeographic studies, historical demographic processes are elucidated from the geographical distribution of individuals represented on an inferred gene tree. However, the interpretation of gene trees in this context can be difficult as the same demographic/geographical process can randomly lead to multiple different genealogies. Likewise, the same gene trees can arise under different demographic models. This problem has led to the emergence of many statistical methods for making phylogeographic inferences. A popular phylogeographic approach based on nested clade analysis is challenged by the fact that a certain amount of the interpretation of the data is left to the subjective choices of the user, and it has been argued that the method performs poorly in simulation studies. More rigorous statistical methods based on coalescence theory have been developed. However, these methods may also be challenged by computational problems or poor model choice. In this review, we will describe the development of statistical methods in phylogeographic analysis, and discuss some of the challenges facing these methods.
Abstract:
An alternative approach to research is described that has been developed through a succession of significant construction management research projects. The approach follows the principles of iterative grounded theory, whereby researchers iterate between alternative theoretical frameworks and emergent empirical data. Of particular importance is an orientation toward mixing methods, thereby overcoming the existing tendency to dichotomize quantitative and qualitative approaches. The approach is positioned against the existing contested literature on grounded theory, and the possibility of engaging with empirical data in a “theory free” manner is discounted. Emphasis instead is given to the way in which researchers must be theoretically sensitive as a result of being steeped in relevant literatures. Knowledge of existing literatures therefore shapes the initial research design; but emergent empirical findings cause fresh theoretical perspectives to be mobilized. The advocated approach is further aligned with notions of knowledge coproduction and the underlying principles of contextualist research. It is this unique combination of ideas which characterizes the paper's contribution to the research methodology literature within the field of construction management. Examples are provided and consideration is given to the extent to which the emergent findings are generalizable beyond the specific context from which they are derived.
Abstract:
The commercial process in construction projects is an expensive and highly variable overhead. Collaborative working practices carry many benefits, which are widely disseminated, but little information is available about their costs. Transaction Cost Economics is a theoretical framework that seeks explanations for why there are firms and how the boundaries of firms are defined through the “make-or-buy” decision. However, it is not a framework that offers explanations for the relative costs of procuring construction projects in different ways. The idea that different methods of procurement will have characteristically different costs is tested by way of a survey. The relevance of transaction cost economics to the study of commercial costs in procurement is doubtful. The survey shows that collaborative working methods cost neither more nor less than traditional methods. But the benefits of collaboration mean that there is a great deal of enthusiasm for collaboration rather than competition.
Abstract:
With the rapid development of proteomics, a number of different methods have appeared for the basic task of protein identification. We made a simple comparison between a common liquid chromatography-tandem mass spectrometry (LC-MS/MS) workflow using an ion trap mass spectrometer and a combined LC-MS and LC-MS/MS method using Fourier transform ion cyclotron resonance (FTICR) mass spectrometry and accurate peptide masses. To compare the two methods for protein identification, we grew and extracted proteins from E. coli using established protocols. Cysteines were reduced and alkylated, and proteins were digested by trypsin. The resulting peptide mixtures were separated by reversed-phase liquid chromatography using a 4 h gradient from 0 to 50% acetonitrile over a C18 reversed-phase column. The LC separation was coupled on-line to either a Bruker Esquire HCT ion trap or a Bruker 7 tesla APEX-Qe Qh-FTICR hybrid mass spectrometer. Data-dependent Qh-FTICR-MS/MS spectra were acquired using the quadrupole mass filter and collisionally induced dissociation in the external hexapole trap. In both schemes, proteins were identified by Mascot MS/MS ion searches, and the peptides identified from these proteins in the FTICR MS/MS data were used for automatic internal calibration of the FTICR-MS data, together with ambient polydimethylcyclosiloxane ions.
Abstract:
Summary
1. In recent decades there have been population declines of many UK bird species, which have become the focus of intense research and debate. Recently, as the populations of potential predators have increased, there is concern that increased rates of predation may be contributing to the declines. In this review, we assess the methodologies behind the current published science on the impacts of predators on avian prey in the UK.
2. We identified suitable studies, classified these according to study design (experimental/observational) and assessed the quantity and quality of the data upon which any variation in predation rates was inferred. We then explored whether the underlying study methodology had implications for study outcome.
3. We reviewed 32 published studies and found that observational studies typically monitored comprehensively significantly fewer predator species than experimental studies. Data for a difference in predator abundance from targeted (i.e. bespoke) census techniques were available for less than half of the 32 predator species studied.
4. The probability of a study detecting an impact on prey abundance was strongly, positively related to the quality and quantity of data upon which the gradient in predation rates was inferred.
5. The findings suggest that if a study is based on good quality abundance data for a range of predator species then it is more likely to detect an effect than if it relies on opportunistic data for a smaller number of predators.
6. We recommend that the findings from studies which use opportunistic data, for a limited number of predator species, should be treated with caution and that future studies employ bespoke census techniques to monitor predator abundance for an appropriate suite of predators.
Abstract:
Some 50,000 Win Studies in Chess challenge White to find an effectively unique route to a win. Judging the impact of less than absolute uniqueness requires both technical analysis and artistic judgment. Here, for the first time, an algorithm is defined to help analyse uniqueness in endgame positions objectively. The key idea is to examine how critical certain positions are to White in achieving the win. The algorithm uses sub-n-man endgame tables (EGTs) for both Chess and relevant, adjacent variants of Chess. It challenges authors of EGT generators to generalise them to create EGTs for these chess variants. It has already proved efficient and effective in an implementation for Starchess, itself a variant of chess. The approach also addresses a number of similar questions arising in endgame theory, games and compositions.
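The core idea — judging whether White's winning move is effectively unique by querying an endgame table — can be caricatured as a table lookup. A minimal sketch, where the `moves` generator and the `dtm` (depth-to-mate) mapping are hypothetical stand-ins for a real EGT interface, not the paper's algorithm:

```python
def winning_moves(position, moves, dtm):
    """Moves from `position` after which the table still shows a White win.

    `moves(position)` maps move labels to successor positions; `dtm`
    maps positions to a depth-to-mate value (absent = not a win).
    Both are hypothetical stand-ins for a real EGT interface.
    """
    return [m for m, child in moves(position).items() if child in dtm]

def is_effectively_unique(position, moves, dtm):
    # The winning move is "effectively unique" when exactly one
    # move preserves the win according to the table.
    return len(winning_moves(position, moves, dtm)) == 1
```

A study position would then be scored by how many of its critical positions pass this test; the artistic judgment the abstract mentions enters when near-duplicate wins must be weighed.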
Abstract:
DNA G-quadruplexes are one of the targets being actively explored for anti-cancer therapy by inhibiting them through small molecules. This computational study was conducted to predict the binding strengths and orientations of a set of novel dimethyl-amino-ethyl-acridine (DACA) analogues that were designed and synthesized in our laboratory but did not diffract in synchrotron light. The crystal structure of the DNA G-quadruplex (TGGGGT)4 (PDB: 1O0K) was used as the target for their binding properties in our studies. We used both the force field (FF) and QM/MM derived atomic charge schemes simultaneously for comparing the predictions of drug binding modes and their energetics. This study evaluates the comparative performance of fixed point charge based Glide XP docking and the quantum polarized ligand docking schemes. These results will provide insights into the effects of including or ignoring drug-receptor interfacial polarization events in molecular docking simulations, which in turn will aid the rational selection of computational methods at different levels of theory in future drug design programs. Plenty of molecular modelling tools and methods currently exist for modelling drug-receptor, protein-protein, or DNA-protein interactions at different levels of complexity. Yet the capacity of such tools to describe various physico-chemical properties more accurately is the next step ahead in current research. In particular, the use of the most accurate quantum mechanics (QM) methods is severely restricted by their tedious nature. Though the use of massively parallel supercomputing environments has resulted in a tremendous improvement in molecular mechanics (MM) calculations such as molecular dynamics, QM methods are still capable of dealing with only a couple of tens to hundreds of atoms. One efficient strategy that utilizes the powers of both MM and QM is the QM/MM hybrid method.
Lately, attempts have been directed towards deploying several different QM methods for the betterment of force field based simulations, but with practical restrictions in place. One such method utilizes the inclusion of charge polarization events at the drug-receptor interface, which are not explicitly present in the MM FF.
Abstract:
Assimilation of temperature observations into an ocean model near the equator often results in a dynamically unbalanced state with unrealistic overturning circulations. The way in which these circulations arise from systematic errors in the model or its forcing is discussed. A scheme is proposed, based on the theory of state augmentation, which uses the departures of the model state from the observations to update slowly evolving bias fields. Results are summarized from an experiment applying this bias correction scheme to an ocean general circulation model. They show that the method produces more balanced analyses and a better fit to the temperature observations.
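The update described — nudging slowly evolving bias fields by the departures of the model state from the observations — can be caricatured in a few lines. A minimal sketch, in which the function names, the relaxation factor `gamma` and the blending `weight` are all illustrative assumptions rather than the paper's actual scheme:

```python
def update_bias(bias, model_state, observations, gamma=0.05):
    # Accumulate a small fraction of the observation-minus-model
    # departure into the slowly evolving bias estimate.
    return bias + gamma * (observations - model_state)

def analysis(model_state, observations, bias, weight=0.5):
    # Correct the background with the current bias estimate, then
    # blend the corrected state with the observations.
    corrected = model_state + bias
    return corrected + weight * (observations - corrected)
```

Because `gamma` is small, the bias absorbs only the persistent, systematic part of the departures, while the analysis step handles the remaining random error; this is the sense in which the bias fields evolve slowly.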
Abstract:
Salmonella enterica serotypes Derby, Mbandaka, Montevideo, Livingstone, and Senftenberg were among the 10 most prevalent serotypes isolated from farm animals in England and Wales in 1999. These serotypes are of potential zoonotic relevance; however, there is currently no "gold standard" fingerprinting method for them. A collection of isolates representing the former serotypes and serotype Gold Coast were analyzed using plasmid profiling, pulsed-field gel electrophoresis (PFGE), and ribotyping. The success of the molecular methods in identifying DNA polymorphisms was different for each serotype. Plasmid profiling was particularly useful for serotype Derby isolates, and it also provided a good level of discrimination for serotype Senftenberg. For most serotypes, we observed a number of nontypeable plasmid-free strains, which represents a limitation of this technique. Fingerprinting of genomic DNA by ribotyping and PFGE produced a significant variation in results, depending on the serotype of the strain. Both PstI/SphI ribotyping and XbaI-PFGE provided a similar degree of strain differentiation for serotype Derby and serotype Senftenberg, only marginally lower than that achieved by plasmid profiling. Ribotyping was less sensitive than PFGE when applied to serotype Mbandaka or serotype Montevideo. Serotype Gold Coast isolates were found to be nontypeable by XbaI-PFGE, and a significant proportion of them were found to be plasmid free. A similar situation applies to a number of serotype Livingstone isolates which were nontypeable by plasmid profiling and/or PFGE. In summary, the serotype of the isolates has a considerable influence in deciding the best typing strategy; a single method cannot be relied upon for discriminating between strains, and a combination of typing methods allows further discrimination.
Abstract:
The paper considers second kind integral equations of the form x(s) = y(s) + ∫ k(s − t) z(t) x(t) dt (abbreviated x = y + Kz x), in which the factor z is bounded but otherwise arbitrary, so that equations of Wiener-Hopf type are included as a special case. Conditions on a set W are obtained such that a generalized Fredholm alternative is valid: if W satisfies these conditions and I − Kz is injective for each z ∈ W, then I − Kz is invertible for each z ∈ W and the operators (I − Kz)−1 are uniformly bounded. As a special case some classical results relating to Wiener-Hopf operators are reproduced. A finite section version of the above equation (with the range of integration reduced to [−a, a]) is considered, as are projection and iterated projection methods for its solution. The operators (I − Kz,a)−1 (where Kz,a denotes the finite section version of Kz) are shown to be uniformly bounded (in z and a) for all a sufficiently large. Uniform stability and convergence results, for the projection and iterated projection methods, are obtained. The argument generalizes an idea in collectively compact operator theory. Some new results in this theory are obtained and applied to the analysis of projection methods for the above equation when z is compactly supported and k(s − t) is replaced by the general kernel k(s,t). A boundary integral equation of the above type, which models outdoor sound propagation over inhomogeneous level terrain, illustrates the application of the theoretical results developed.
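The finite section equation on [−a, a] can be illustrated with a simple quadrature discretisation. A minimal sketch, assuming the concrete form x(s) = y(s) + z(s) ∫ k(s − t) x(t) dt and trapezoidal weights; the names and the Nyström-style scheme are illustrative, not the paper's projection method:

```python
import numpy as np

def solve_finite_section(y, k, z, a, n=200):
    """Discretise x(s) = y(s) + z(s) * Integral_{-a}^{a} k(s - t) x(t) dt.

    Trapezoidal quadrature on n points turns the integral operator into
    an n x n matrix; the second kind equation becomes (I - K) x = y.
    """
    s = np.linspace(-a, a, n)
    h = s[1] - s[0]
    w = np.full(n, h)
    w[0] = w[-1] = h / 2.0           # trapezoidal rule weights
    K = z(s)[:, None] * k(s[:, None] - s[None, :]) * w[None, :]
    x = np.linalg.solve(np.eye(n) - K, y(s))
    return s, x
```

The uniform boundedness of (I − Kz,a)−1 claimed in the abstract is exactly what guarantees that such discrete solutions remain stable as a grows.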
Abstract:
Building assessment methods have become a popular research field since the early 1990s. An international tool which allows the assessment of buildings in all regions, taking into account differences in climates, topographies and cultures, does not yet exist. This paper aims to demonstrate the importance of criteria and sub-criteria in developing a new potential building assessment method for Saudi Arabia. Recently, awareness of sustainability has been increasing in developing countries due to high energy consumption, pollution and a high carbon footprint. There is no debate that assessment criteria have an important role in identifying a tool's orientation. However, various aspects influence the criteria and sub-criteria of assessment tools, such as environmental, economic, social and cultural factors, to mention but a few. The author provides an investigation of the most popular and globally used schemes, BREEAM, LEED, Green Star, CASBEE and Estidama, in order to identify the effectiveness of the different aspects of the assessment criteria and the impacts of these criteria on the assessment results; this will provide a solid foundation for developing an effective sustainable assessment method for buildings in Saudi Arabia. Initial results of the investigation suggest that each country needs to develop its own assessment method in order to achieve desired results, while focusing upon the indigenous environmental, economic, social and cultural conditions.
Keywords: assessment methods, BREEAM, LEED, Green Star, CASBEE, Estidama, sustainability, sustainable buildings, environment, Saudi Arabia.
Abstract:
Interpersonal interaction in public goods contexts is very different in character to its depiction in economic theory, despite the fact that the standard model is based on a small number of apparently plausible assumptions. Approaches to the problem are reviewed both from within and outside economics. It is argued that quick fixes such as a taste for giving do not provide a way forward. An improved understanding of why people contribute to such goods seems to require a different picture of the relationships between individuals than obtains in standard microeconomic theory, where they are usually depicted as asocial. No single economic model at present is consistent with all the relevant field and laboratory data. It is argued that there are defensible ideas from outside the discipline which ought to be explored, relying on different conceptions of rationality and/or more radically social agents. Three such suggestions are considered, one concerning the expressive/communicative aspect of behaviour, a second the possibility of a part-whole relationship between interacting agents and the third a version of conformism.