84 results for Conservative Bidder
Abstract:
This article describes a novel algorithmic development extending the contour advective semi-Lagrangian model to include nonconservative effects. The Lagrangian contour representation of finescale tracer fields, such as potential vorticity, allows for a conservative, nondiffusive treatment of sharp gradients, permitting very high numerical Reynolds numbers. It has been widely employed in accurate geostrophic turbulence and tracer advection simulations. In the present, diabatic version of the model, the constraint of conservative dynamics is overcome by including a parallel Eulerian field that absorbs the nonconservative (diabatic) tendencies. The diabatic buildup in this Eulerian field is limited through regular, controlled transfers of this field to the contour representation. This transfer is done with a fast, newly developed contouring algorithm. The model has been implemented for several idealized geometries. In this paper a single-layer doubly periodic geometry is used to demonstrate the validity of the model. The present model converges faster than the analogous semi-Lagrangian models as resolution is increased. At the same nominal spatial resolution the new model is 40 times faster than the analogous semi-Lagrangian model. Results of an orographically forced idealized storm track show a nontrivial dependence of storm-track statistics on resolution and on the numerical model employed. If this result is more generally applicable, it may have important consequences for future high-resolution climate modeling.
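The hybrid update cycle summarized above lends itself to a compact description. The following is a minimal sketch, not the authors' implementation: contours are advected conservatively, the diabatic tendency accumulates on an auxiliary Eulerian grid, and the accumulated residual is periodically folded back into the contour representation by recontouring. All function names and parameters here are hypothetical placeholders.

```python
# Minimal sketch (not the authors' code) of one step of a contour/Eulerian
# hybrid for potential vorticity q with a nonconservative (diabatic) tendency.
# advect_contours, grid_from_contours, recontour and heating are placeholders.

import numpy as np

def diabatic_casl_step(contours, q_residual, u, v, heating, dt,
                       advect_contours, grid_from_contours, recontour,
                       transfer_interval, step):
    """One time step of the hybrid scheme."""
    # 1. Conservative part: advect the contours in the (u, v) flow.
    contours = advect_contours(contours, u, v, dt)

    # 2. Nonconservative part: accumulate the diabatic tendency on the grid.
    q_residual = q_residual + dt * heating

    # 3. Periodic transfer: fold the residual back into the contours so that
    #    the diabatic buildup on the Eulerian grid stays small.
    if step % transfer_interval == 0:
        q_total = grid_from_contours(contours) + q_residual
        contours = recontour(q_total)            # fast contouring algorithm
        q_residual = np.zeros_like(q_residual)

    return contours, q_residual
```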
Abstract:
The basic premise of transaction-cost theory is that the decision to outsource, rather than to undertake work in-house, is determined by the relative costs incurred in each of these forms of economic organization. In construction the "make or buy" decision invariably leads to a contract. Reducing the costs of entering into a contractual relationship (transaction costs) raises the value of production and is therefore desirable. Commonly applied methods of contractor selection may not minimise the costs of contracting. Research evidence suggests that although competitive tendering typically results in the lowest bidder winning the contract, this may not represent the lowest project cost at completion. Multi-parameter and quantitative models for contractor selection have been developed to identify the best (or least risky) among bidders. A major area in which research is still needed is the investigation of the impact of different methods of contractor selection on the costs of entering into a contract and on the decision to outsource.
Abstract:
Smallholdings in the rural areas of northwest Syria are a result of land fragmentation that is due to inheritance. Because of rapid population growth combined with land fragmentation, these smallholdings are increasing and cannot sustain the rural households whose sizes and needs are also increasing rapidly. This situation has led to increasing numbers of males migrating to urban areas in Syria and to neighbouring countries looking for work opportunities. In addition, recent agricultural intensification trends seem to have led to the emergence of a waged labour force which, in the absence of male workers owing to significant rates of migration, is now predominantly female. Agricultural labour use depends upon household characteristics and resources (the type of labour used, the gender of the labour, and whether it is waged, exchanged or familial). The article attempts to present a comprehensive analysis of household labour use in distinctive farming systems in one region of Syria that has undergone great change in recent decades, and examines the changes in the composition of the agricultural labour force. Secondary information, rapid rural appraisals and formal farm surveys were used to gather information on the households in a study area where different farming systems coexist. The results show that the decrease in landholding size, the resulting male migration, and land intensification have resulted in the expansion of female labour in agricultural production, which has been termed in this research a 'feminization of agricultural labour'. This suggests that agricultural research and extension services will have to work more with women farmers and farm workers, seek their wisdom and involve them in technology development and transfer. This is not easy in conservative societies, and it requires research and extension institutions to take this reality into consideration in their programmes.
Abstract:
R. H. Whittaker's idea that plant diversity can be divided into a hierarchy of spatial components, from alpha at the within-habitat scale through beta for the turnover of species between habitats to gamma along regional gradients, implies the underlying existence of alpha, beta, and gamma niches. We explore the hypothesis that the evolution of alpha, beta, and gamma niches is also hierarchical, with traits that define the alpha niche being labile, while those defining beta and gamma niches are conservative. At the alpha level we find support for the hypothesis in the lack of a close significant phylogenetic relationship between meadow species that have similar alpha niches. In a second test, alpha niche overlap based on a variety of traits is compared between congeners and noncongeners in several communities; here, too, there is no evidence of a correlation between alpha niche and phylogeny. To test whether beta and gamma niches evolve conservatively, we reconstructed the evolution of relevant traits on evolutionary trees for 14 different clades. Tests against null models revealed a number of instances, including some in island radiations, in which habitat (beta niche) and elevational maximum (an aspect of the gamma niche) showed evolutionary conservatism.
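As background to the congener comparison described above, the following is an illustrative sketch, not the authors' analysis, of a permutation test asking whether alpha-niche overlap among congeneric species differs from that expected when genus membership is shuffled. The trait values, genus labels, and overlap measure are made-up assumptions.

```python
# Toy permutation test: is mean alpha-niche overlap among congeners higher
# than expected when genus labels are randomised? All data are synthetic.

import itertools
import numpy as np

rng = np.random.default_rng(0)

def pair_overlap(t1, t2):
    """Toy niche overlap: 1 minus normalised distance between trait vectors."""
    return 1.0 - np.linalg.norm(t1 - t2) / (np.linalg.norm(t1) + np.linalg.norm(t2) + 1e-12)

def mean_congener_overlap(traits, genera):
    vals = [pair_overlap(traits[i], traits[j])
            for i, j in itertools.combinations(range(len(genera)), 2)
            if genera[i] == genera[j]]
    return np.mean(vals) if vals else np.nan

# Hypothetical data: 12 species, 3 genera, 2 niche trait axes.
traits = rng.normal(size=(12, 2))
genera = np.repeat(["A", "B", "C"], 4)

observed = mean_congener_overlap(traits, genera)
null = [mean_congener_overlap(traits, rng.permutation(genera)) for _ in range(999)]
p_value = (1 + sum(n >= observed for n in null)) / (1 + len(null))
print(f"observed congener overlap = {observed:.3f}, permutation p = {p_value:.3f}")
```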
Abstract:
We investigate the impact of past climates on plant diversification by tracking the "footprint" of climate change on a phylogenetic tree. Diversity within the cosmopolitan carnivorous plant genus Drosera (Droseraceae) is focused within Mediterranean climate regions. We explore whether this diversity is temporally linked to Mediterranean-type climatic shifts of the mid-Miocene and whether climate preferences are conservative over phylogenetic timescales. Phyloclimatic modeling combines environmental niche (bioclimatic) modeling with phylogenetics in order to study evolutionary patterns in relation to climate change. We present the largest and most complete such example to date, using Drosera. The bioclimatic models of extant species demonstrate clear phylogenetic patterns; this is particularly evident for the tuberous sundews from southwestern Australia (subgenus Ergaleium). We employ a method for establishing confidence intervals of node ages on a phylogeny using replicates from a Bayesian phylogenetic analysis. The resulting chronogram shows that many clades, including subgenus Ergaleium and section Bryastrum, diversified during the establishment of the Mediterranean-type climate. Ancestral reconstructions of bioclimatic models demonstrate a pattern of preference for this climate type within these groups. Ancestral bioclimatic models are projected into palaeoclimate reconstructions for the time periods indicated by the chronogram. We present two such examples, each of which generates plausible estimates of ancestral lineage distribution that are similar to current distributions. This is the first study to attempt bioclimatic projections on evolutionary time scales. The sundews appear to have diversified in response to local climate development. Some groups are specialized for Mediterranean climates, while others show wide-ranging generalism. This demonstrates that phyloclimatic modeling could be repeated for other plant groups and is fundamental to the understanding of evolutionary responses to climate change.
Abstract:
We describe and evaluate a new estimator of the effective population size (N-e), a critical parameter in evolutionary and conservation biology. This new "SummStat" N-e estimator is based upon the use of summary statistics in an approximate Bayesian computation framework to infer N-e. Simulations of a Wright-Fisher population with known N-e show that the SummStat estimator is useful across a realistic range of individuals and loci sampled, generations between samples, and N-e values. We also address the paucity of information about the relative performance of N-e estimators by comparing the SummStat estimator to two recently developed likelihood-based estimators and a traditional moment-based estimator. The SummStat estimator is the least biased of the four estimators compared. In 32 of 36 parameter combinations investigated using initial allele frequencies drawn from a Dirichlet distribution, it has the lowest bias. The relative mean square error (RMSE) of the SummStat estimator was generally intermediate to the others. All of the estimators had RMSE > 1 when small samples (n = 20, five loci) were collected a generation apart. In contrast, when samples were separated by three or more generations and N-e ≤ 50, the SummStat and likelihood-based estimators all had greatly reduced RMSE. Under the conditions simulated, SummStat confidence intervals were more conservative than those of the likelihood-based estimators and more likely to include the true N-e. The greatest strength of the SummStat estimator is its flexible structure. This flexibility allows it to incorporate any potentially informative summary statistic from population genetic data.
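To make the approach concrete, here is a minimal sketch, not the SummStat implementation itself, of a summary-statistic ABC estimator of N-e from two temporal samples: candidate N-e values are drawn from a prior, Wright-Fisher drift is simulated over the known number of generations, and candidates whose simulated summary statistic falls close to the observed value are retained. The particular summary statistic, prior, and tolerance below are illustrative assumptions.

```python
# Sketch of a summary-statistic / ABC estimator of N_e from temporal samples.
# The statistic (standardised temporal variance in allele frequency), the
# uniform prior and the tolerance are illustrative choices, not the paper's.

import numpy as np

rng = np.random.default_rng(1)

def temporal_f(p0, pt):
    """Simple temporal variance statistic averaged over loci."""
    pbar = 0.5 * (p0 + pt)
    return np.mean((p0 - pt) ** 2 / (pbar * (1 - pbar) + 1e-9))

def simulate_drift(p0, ne, generations):
    p = p0.copy()
    for _ in range(generations):
        p = rng.binomial(2 * ne, p) / (2 * ne)   # sample 2*Ne gene copies
    return p

def abc_estimate_ne(p0, pt, generations, n_draws=20000, tol=0.05):
    obs = temporal_f(p0, pt)
    accepted = []
    for _ in range(n_draws):
        ne = int(rng.integers(10, 501))          # uniform prior on N_e
        sim = simulate_drift(p0, ne, generations)
        if abs(temporal_f(p0, sim) - obs) < tol * obs:
            accepted.append(ne)
    return np.median(accepted) if accepted else np.nan

# Toy data: 5 loci, true N_e = 50, samples 3 generations apart.
p0 = rng.uniform(0.2, 0.8, size=5)
pt = simulate_drift(p0, 50, 3)
print("ABC estimate of N_e:", abc_estimate_ne(p0, pt, 3))
```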
Abstract:
Cardiovascular disease (CVD), which includes coronary heart disease and stroke, remains the major killer in the EU, being responsible for 42% of total mortality. The amount and composition of dietary fat is arguably the most important dietary factor contributing to disease risk. A significant body of consistent evidence indicates that a decrease in the dietary saturated:unsaturated (polyunsaturated + monounsaturated) fat ratio and an increased intake of the long-chain n-3 polyunsaturated fatty acids (LC n-3 PUFA) found in fish are cardioprotective. Furthermore, although the evidence is currently less convincing, such a strategy is also likely to improve insulin sensitivity, the central metabolic defect in diabetes. Currently in the UK only 12% of men, 17% of women and 8% of children have saturated fatty acid (SFA) intakes below 10% of energy. The average intake of LC n-3 PUFA is <0.2 g/day, which is less than half the current conservative recommendation of a minimum of 0.45 g/day. Public health strategies to reverse these dietary fatty acid imbalances, aimed at educating and motivating the consumer and making affordable and acceptable food products with an ‘enhanced’ fatty acid profile more widely available, must remain a public health priority in the ‘fight’ against CVD.
Abstract:
In the 1990s the Message Passing Interface Forum defined MPI bindings for Fortran, C, and C++. With the success of MPI, these relatively conservative languages have continued to dominate in the parallel computing community. There are compelling arguments in favour of more modern languages like Java. These include portability, better runtime error checking, modularity, and multi-threading. But these arguments have not converted many HPC programmers, perhaps due to the scarcity of full-scale scientific Java codes, and the lack of evidence for performance competitive with C or Fortran. This paper tries to redress this situation by porting two scientific applications to Java. Both of these applications are parallelized using our thread-safe Java messaging system, MPJ Express. The first application is the Gadget-2 code, a massively parallel structure formation code for cosmological simulations. The second application uses the finite-difference time-domain (FDTD) method for simulations in the area of computational electromagnetics. We evaluate and compare the performance of the Java and C versions of these two scientific applications, and demonstrate that the Java codes can achieve performance comparable with legacy applications written in conventional HPC languages.
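For orientation, the second application is built around the FDTD method; the following is a minimal serial sketch, written in Python rather than the paper's parallel Java, of the core leapfrog field update on a 1D staggered grid in normalised units. Grid size, step count, and the source are arbitrary illustrative choices.

```python
# Minimal serial 1D FDTD sketch (not the paper's parallel Java code): leapfrog
# updates of the electric and magnetic fields on a staggered grid.

import numpy as np

nx, nt = 200, 500
ez = np.zeros(nx)        # electric field
hy = np.zeros(nx - 1)    # magnetic field on the staggered (half-cell) grid
courant = 0.5            # Courant number <= 1 for stability in 1D

for n in range(nt):
    # Update H from the curl of E, then E from the curl of H (leapfrog).
    hy += courant * (ez[1:] - ez[:-1])
    ez[1:-1] += courant * (hy[1:] - hy[:-1])
    # Soft point source: a Gaussian pulse injected at the centre of the grid.
    ez[nx // 2] += np.exp(-((n - 30) ** 2) / 100.0)

print("peak |Ez| after", nt, "steps:", np.abs(ez).max())
```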
Abstract:
This paper seeks to illustrate the point that physical inconsistencies between thermodynamics and dynamics usually introduce nonconservative production/destruction terms in the local total energy balance equation in numerical ocean general circulation models (OGCMs). Such terms potentially give rise to undesirable forces and/or diabatic terms in the momentum and thermodynamic equations, respectively, which could explain some of the observed errors in simulated ocean currents and water masses. In this paper, a theoretical framework is developed to provide a practical method for determining such nonconservative terms, illustrated in the context of a relatively simple form of the hydrostatic Boussinesq primitive equations used in early versions of OGCMs, for which at least four main potential sources of energy nonconservation are identified. They arise from: (1) the “hanging” kinetic energy dissipation term; (2) assuming potential or conservative temperature to be a conservative quantity; (3) the interaction of the Boussinesq approximation with the parameterizations of turbulent mixing of temperature and salinity; and (4) some adiabatic compressibility effects due to the Boussinesq approximation. In practice, OGCMs also possess spurious numerical energy sources and sinks, but these are not explicitly addressed here. Apart from (1), the identified nonconservative energy sources/sinks are not sign definite, allowing for possible widespread cancellation when integrated globally. Locally, however, these terms may be of the same order of magnitude as actual energy conversion terms thought to occur in the oceans. Although the actual impact of these nonconservative energy terms on the overall accuracy and physical realism of the simulated oceans is difficult to ascertain, an important issue is whether they could affect transient simulations, and the transition toward different circulation regimes associated with a significant reorganization of the different energy reservoirs. Some possible solutions for improvement are examined. It is found that term (2) can be reduced by at least one order of magnitude by using conservative temperature instead of potential temperature. Using the anelastic approximation, however, which was initially thought to be a possible way to greatly improve the accuracy of the energy budget, would only marginally reduce term (4), with no impact on terms (1), (2) and (3).
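Schematically, and without reproducing the paper's derivation, the issue can be stated in terms of the local total energy balance: physical consistency requires the time tendency of the total energy density to be balanced by a pure flux divergence, and any residual measures the nonconservative production/destruction introduced by the approximations listed above.

```latex
% Schematic statement (not the paper's exact equations) of the local total
% energy balance: a residual S_nc that is not identically zero corresponds to
% spurious sources/sinks such as terms (1)-(4) above.
\[
  \frac{\partial E}{\partial t} + \nabla \cdot \mathbf{F}_E = S_{\mathrm{nc}},
  \qquad
  E = \rho\left(\tfrac{1}{2}\,|\mathbf{u}|^{2} + gz + e\right),
\]
\[
  S_{\mathrm{nc}} = 0 \ \text{(energetically consistent model)}, \qquad
  S_{\mathrm{nc}} \neq 0 \ \text{(spurious energy production/destruction)}.
\]
```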
Abstract:
A two-locus match probability is presented that incorporates the effects of within-subpopulation inbreeding (consanguinity) in addition to population subdivision. The usual practice of calculating multi-locus match probabilities as the product of single-locus probabilities assumes independence between loci. There are a number of population genetics phenomena that can violate this assumption: in addition to consanguinity, which increases homozygosity at all loci simultaneously, gametic disequilibrium will introduce dependence into DNA profiles. However, in forensics the latter problem is usually addressed in part by the careful choice of unlinked loci. Hence, as is conventional, we assume gametic equilibrium here, and focus instead on between-locus dependence due to consanguinity. The resulting match probability formulae are an extension of existing methods in the literature, and are shown to be more conservative than these methods in the case of double homozygote matches. For two-locus profiles involving one or more heterozygous genotypes, the results are similar to, or smaller than, those of the existing approaches.
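For background, the standard single-locus match probabilities with the usual population-subdivision correction θ (the Balding–Nichols formulae), which the conventional product rule multiplies across loci, are given below; the paper's two-locus formulae, which additionally incorporate within-subpopulation inbreeding, are not reproduced here.

```latex
% Background only: standard theta-corrected single-locus match probabilities
% used by the conventional product rule (not the paper's two-locus formulae).
\[
  P(A_iA_i \mid A_iA_i) =
    \frac{\bigl[2\theta + (1-\theta)p_i\bigr]\bigl[3\theta + (1-\theta)p_i\bigr]}
         {(1+\theta)(1+2\theta)},
\]
\[
  P(A_iA_j \mid A_iA_j) =
    \frac{2\bigl[\theta + (1-\theta)p_i\bigr]\bigl[\theta + (1-\theta)p_j\bigr]}
         {(1+\theta)(1+2\theta)}, \qquad i \neq j,
\]
where $p_i$ is the frequency of allele $A_i$ and $\theta$ is the coancestry coefficient.
```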
Abstract:
Techniques for the coherent generation and detection of electromagnetic radiation in the far infrared, or terahertz, region of the electromagnetic spectrum have recently developed rapidly and may soon be applied for in vivo medical imaging. Both continuous wave and pulsed imaging systems are under development, with terahertz pulsed imaging being the more common method. Typically a pump and probe technique is used, with picosecond pulses of terahertz radiation generated from femtosecond infrared laser pulses, using an antenna or nonlinear crystal. After interaction with the subject either by transmission or reflection, coherent detection is achieved when the terahertz beam is combined with the probe laser beam. Raster scanning of the subject leads to an image data set comprising a time series representing the pulse at each pixel. A set of parametric images may be calculated, mapping the values of various parameters calculated from the shape of the pulses. A safety analysis has been performed, based on current guidelines for skin exposure to radiation of wavelengths 2.6 µm–20 mm (15 GHz–115 THz), to determine the maximum permissible exposure (MPE) for such a terahertz imaging system. The international guidelines for this range of wavelengths are drawn from two U.S. standards documents. The method for this analysis was taken from the American National Standard for the Safe Use of Lasers (ANSI Z136.1), and to ensure a conservative analysis, parameters were drawn from both this standard and from the IEEE Standard for Safety Levels with Respect to Human Exposure to Radio Frequency Electromagnetic Fields (C95.1). The calculated maximum permissible average beam power was 3 mW, indicating that typical terahertz imaging systems are safe according to the current guidelines. Further developments may however result in systems that will exceed the calculated limit. Furthermore, the published MPEs for pulsed exposures are based on measurements at shorter wavelengths and with pulses of longer duration than those used in terahertz pulsed imaging systems, so the results should be treated with caution.
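As an illustration of the parametric-image idea, not code from the paper, the sketch below builds a synthetic raster-scanned data cube (a time series per pixel) and maps two simple pulse-shape parameters, peak amplitude and arrival time, into images. The synthetic data and parameter choices are assumptions for demonstration only.

```python
# Toy parametric images from a terahertz pulsed imaging data cube (synthetic).

import numpy as np

ny, nx, nt = 32, 32, 256                 # raster grid and samples per pulse
t = np.linspace(0.0, 20.0, nt)           # time axis, picoseconds (arbitrary)

# Synthetic cube: a pulse whose delay and amplitude vary smoothly over the image.
delay = 8.0 + 2.0 * np.random.default_rng(2).random((ny, nx))
amplitude = 1.0 + 0.5 * np.random.default_rng(3).random((ny, nx))
cube = amplitude[..., None] * np.exp(-((t - delay[..., None]) ** 2) / 0.5)

# Parametric images computed from the pulse shape at each pixel.
peak_image = cube.max(axis=-1)                       # peak amplitude map
arrival_image = t[cube.argmax(axis=-1)]              # time-of-flight map

print("peak image shape:", peak_image.shape,
      "arrival range (ps):", arrival_image.min().round(2), "to", arrival_image.max().round(2))
```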
Abstract:
In this chapter we described how the inclusion of a model of a human arm, combined with the measurement of its neural input and a predictor, can provide robustness under time delay to a previously proposed teleoperator design. Our trials gave clear indications of the superiority of the NPT scheme over the traditional as well as the modified Yokokohji and Yoshikawa architectures. Its fundamental advantages are the time-lead of the slave, the more efficient and more natural-feeling manipulation it provides, and the fact that incorporating an operator arm model leads to more credible stability results. Finally, its simplicity allows local control techniques that are less likely to fail to be employed. However, a significant advantage of the enhanced Yokokohji and Yoshikawa architecture arises from the very fact that it is a conservative modification of current designs. Under large prediction errors, it can provide robustness by directing the master and slave states to their means and, since it relies on the passivity of the mechanical part of the system, it would not confuse the operator. An experimental implementation of the techniques will provide further evidence for the performance of the proposed architectures. The employment of neural networks and fuzzy logic, which will provide an adaptive model of the human arm and robustifying control terms, is scheduled for the near future.
Abstract:
Valuation is the process of estimating price. The methods used to determine value attempt to model the thought processes of the market and thus estimate price by reference to observed historic data. This can be done using either an explicit model, which models the worth calculation of the most likely bidder, or an implicit model, which uses historic data, suitably adjusted, as a short cut to determine value by reference to previous similar sales. The former is generally referred to as the Discounted Cash Flow (DCF) model and the latter as the capitalisation (or All Risk Yield) model. However, regardless of the technique used, the valuation will be affected by uncertainties: uncertainty in the comparable data available, uncertainty in the current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties translate into uncertainty in the output figure, the estimate of price. In a previous paper, we considered the way in which uncertainty is allowed for in the capitalisation model in the UK. In this paper, we extend the analysis to look at the way in which uncertainty can be incorporated into the explicit DCF model. This is done by recognising that the input variables are uncertain and will each have a probability distribution pertaining to them. Thus, by utilising a probability-based valuation model (using Crystal Ball), it is possible to incorporate uncertainty into the analysis and address the shortcomings of the current model. Although the capitalisation model is discussed, the paper concentrates upon the application of Crystal Ball to the Discounted Cash Flow approach.
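A minimal sketch, not the authors' Crystal Ball model, of the probability-based DCF described above: each uncertain input is assigned a distribution, the explicit DCF is evaluated over many random draws, and the estimate of price is reported as a distribution rather than a single figure. All distributions and parameter values below are illustrative assumptions.

```python
# Monte Carlo propagation of input uncertainty through an explicit DCF
# valuation (illustrative only; distributions and values are made up).

import numpy as np

rng = np.random.default_rng(42)
n_sims, years = 10_000, 5

# Uncertain inputs (illustrative distributions).
rent0 = rng.normal(100_000, 5_000, n_sims)             # current net rent per year
rental_growth = rng.normal(0.02, 0.01, n_sims)          # annual rental growth rate
discount_rate = rng.triangular(0.06, 0.08, 0.10, n_sims)
exit_yield = rng.triangular(0.05, 0.06, 0.08, n_sims)   # cap rate for the exit value

# Explicit DCF: discount each year's rent, then the capitalised exit value.
values = np.zeros(n_sims)
for year in range(1, years + 1):
    cashflow = rent0 * (1 + rental_growth) ** year
    values += cashflow / (1 + discount_rate) ** year
exit_value = rent0 * (1 + rental_growth) ** years / exit_yield
values += exit_value / (1 + discount_rate) ** years

print(f"mean value: {values.mean():,.0f}")
print(f"5th to 95th percentile: {np.percentile(values, 5):,.0f} to {np.percentile(values, 95):,.0f}")
```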
Abstract:
Lean construction is considered from a human resource management (HRM) perspective. It is contended that the UK construction sector is characterised by an institutionalised regressive approach to HRM. In the face of rapidly declining recruitment rates for built environment courses, the dominant HRM philosophy of utilitarian instrumentalism does little to attract the intelligent and creative young people that the industry so badly needs. Given this broader context, there is a danger that an uncritical acceptance of lean construction will exacerbate the industry's reputation for unrewarding jobs. Construction academics have strangely ignored the extensive literature that equates lean production to a HRM regime of control, exploitation and surveillance. The emphasis of lean thinking on eliminating waste and improving efficiency makes it easy to absorb into the best practice agenda because it conforms to the existing dominant way of thinking. 'Best practice' is seemingly judged by the extent to which it serves the interests of the industry's technocratic elite. Hence it acts as a conservative force in favour of maintaining the status quo. In this respect, lean construction is the latest manifestation of a long established trend. In common with countless other improvement initiatives, the rhetoric is heavy in the machine metaphor whilst exhorting others to be more efficient. If current trends in lean construction are extrapolated into the future the ultimate destination may be uncomfortably close to Aldous Huxley's apocalyptic vision of a Brave New World. In the face of these trends, the lean construction research community pleads neutrality whilst confining its attention to the rational high ground. The future of lean construction is not yet predetermined. Many choices remain to be made. The challenge for the research community is to improve practice whilst avoiding the dehumanising tendencies of high utilitarianism.
Abstract:
For much of the 1990s and 2000s, the emphasis of urban policy in many global cities was on managing and mitigating the social and environmental effects of rapid economic growth. The credit crunch of 2008 and the subsequent recession have undermined some of the core assumptions on which such policies were based. It is in this context that the concept of resilience planning has taken on a new significance. Drawing on contemporary research in London and Hong Kong, the paper shows how resilience and recovery planning has become a key area of political debate. It examines what is meant by conservative and radical interpretations of resilience and how conservative views have come to dominate ‘recovery’ thinking, with élite groups unwilling to accept the limits to the neo-liberal orthodoxies that helped to precipitate the economic crisis. The paper explores the implications of such thinking for the politics of urban development.