73 results for Conservative


Relevance: 10.00%

Abstract:

R. H. Whittaker's idea that plant diversity can be divided into a hierarchy of spatial components, from alpha at the within-habitat scale, through beta for the turnover of species between habitats, to gamma along regional gradients, implies the underlying existence of alpha, beta, and gamma niches. We explore the hypothesis that the evolution of alpha, beta, and gamma niches is also hierarchical, with traits that define the alpha niche being labile, while those defining the beta and gamma niches are conservative. At the alpha level we find support for the hypothesis in the lack of a close significant phylogenetic relationship between meadow species that have similar alpha niches. In a second test, alpha niche overlap based on a variety of traits is compared between congeners and noncongeners in several communities; here, too, there is no evidence of a correlation between alpha niche and phylogeny. To test whether beta and gamma niches evolve conservatively, we reconstructed the evolution of relevant traits on evolutionary trees for 14 different clades. Tests against null models revealed a number of instances, including some in island radiations, in which habitat (beta niche) and elevational maximum (an aspect of the gamma niche) showed evolutionary conservatism.
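Whittaker's partition referenced above can be sketched numerically: in the multiplicative form, regional (gamma) diversity equals mean within-habitat (alpha) diversity times between-habitat turnover (beta). The habitat species lists below are hypothetical, for illustration only.

```python
# Minimal sketch of Whittaker's multiplicative diversity partition:
# gamma = mean alpha x beta. Species lists are hypothetical.

habitats = {
    "meadow":   {"sp_a", "sp_b", "sp_c", "sp_d"},
    "woodland": {"sp_c", "sp_d", "sp_e", "sp_f"},
    "wetland":  {"sp_f", "sp_g", "sp_h", "sp_i"},
}

# gamma: total regional species richness (union over habitats)
gamma = len(set().union(*habitats.values()))

# alpha: mean within-habitat richness
alpha = sum(len(s) for s in habitats.values()) / len(habitats)

# beta: turnover between habitats
beta = gamma / alpha

print(gamma, alpha, beta)  # -> 9 4.0 2.25
```

With identical species lists in every habitat, beta would fall to 1 (no turnover); fully disjoint lists maximise it.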

Relevance: 10.00%

Abstract:

We investigate the impact of past climates on plant diversification by tracking the "footprint" of climate change on a phylogenetic tree. Diversity within the cosmopolitan carnivorous plant genus Drosera (Droseraceae) is focused within Mediterranean climate regions. We explore whether this diversity is temporally linked to Mediterranean-type climatic shifts of the mid-Miocene and whether climate preferences are conservative over phylogenetic timescales. Phyloclimatic modeling combines environmental niche (bioclimatic) modeling with phylogenetics in order to study evolutionary patterns in relation to climate change. We present the largest and most complete such example to date using Drosera. The bioclimatic models of extant species demonstrate clear phylogenetic patterns; this is particularly evident for the tuberous sundews from southwestern Australia (subgenus Ergaleium). We employ a method for establishing confidence intervals of node ages on a phylogeny using replicates from a Bayesian phylogenetic analysis. This chronogram shows that many clades, including subgenus Ergaleium and section Bryastrum, diversified during the establishment of the Mediterranean-type climate. Ancestral reconstructions of bioclimatic models demonstrate a pattern of preference for this climate type within these groups. Ancestral bioclimatic models are projected into palaeo-climate reconstructions for the time periods indicated by the chronogram. We present two such examples that each generate plausible estimates of ancestral lineage distribution, which are similar to their current distributions. This is the first study to attempt bioclimatic projections on evolutionary time scales. The sundews appear to have diversified in response to local climate development. Some groups are specialized for Mediterranean climates, others show wide-ranging generalism. 
This demonstrates that phyloclimatic modeling could be repeated for other plant groups and is fundamental to understanding evolutionary responses to climate change.

Relevance: 10.00%

Abstract:

We describe and evaluate a new estimator of the effective population size (N-e), a critical parameter in evolutionary and conservation biology. This new "SummStat" N-e estimator is based upon the use of summary statistics in an approximate Bayesian computation framework to infer N-e. Simulations of a Wright-Fisher population with known N-e show that the SummStat estimator is useful across a realistic range of individuals and loci sampled, generations between samples, and N-e values. We also address the paucity of information about the relative performance of N-e estimators by comparing the SummStat estimator to two recently developed likelihood-based estimators and a traditional moment-based estimator. The SummStat estimator is the least biased of the four estimators compared: in 32 of 36 parameter combinations investigated using initial allele frequencies drawn from a Dirichlet distribution, it has the lowest bias. The relative mean square error (RMSE) of the SummStat estimator was generally intermediate to the others. All of the estimators had RMSE > 1 when small samples (n = 20, five loci) were collected a generation apart. In contrast, when samples were separated by three or more generations and N-e ≤ 50, the SummStat and likelihood-based estimators all had greatly reduced RMSE. Under the conditions simulated, SummStat confidence intervals were more conservative than those of the likelihood-based estimators and more likely to include the true N-e. The greatest strength of the SummStat estimator is its flexible structure, which allows it to incorporate any potentially informative summary statistic from population genetic data.
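The general approach the abstract describes, summary statistics inside an approximate Bayesian computation framework, can be sketched minimally. The rejection-ABC toy below uses a single-locus temporal F statistic as the summary, a discrete uniform prior over N-e, and a naive Wright-Fisher simulator; it illustrates the idea only and is not the authors' SummStat implementation.

```python
import random

def wf_drift(p0, ne, gens, rng):
    """Propagate an allele frequency through `gens` generations of
    Wright-Fisher binomial drift in a diploid population of size ne."""
    p = p0
    for _ in range(gens):
        # resample 2*ne gene copies binomially
        p = sum(rng.random() < p for _ in range(2 * ne)) / (2 * ne)
    return p

def summary(p0, p1):
    """Temporal F statistic: standardized variance of frequency change."""
    return (p1 - p0) ** 2 / (p0 * (1 - p0)) if 0 < p0 < 1 else 0.0

def abc_ne(obs_stat, prior, n_sims, tol, p0=0.5, gens=5, seed=1):
    """Rejection ABC: keep N-e draws whose simulated summary statistic
    falls within `tol` of the observed one."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_sims):
        ne = rng.choice(prior)
        stat = summary(p0, wf_drift(p0, ne, gens, rng))
        if abs(stat - obs_stat) < tol:
            accepted.append(ne)
    return accepted  # a sample from the approximate posterior on N-e
```

The accepted draws approximate the posterior; their mean or mode serves as the point estimate. In practice many loci and several summary statistics would be combined, which is exactly the flexibility the abstract highlights.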

Relevance: 10.00%

Abstract:

Cardiovascular disease (CVD), which includes coronary heart disease and stroke, remains the major killer in the EU, being responsible for 42% of total mortality. The amount and composition of dietary fat is arguably the most important dietary factor contributing to disease risk. A significant body of consistent evidence indicates that a decrease in the dietary saturated:unsaturated (polyunsaturated + monounsaturated) fat ratio and an increased intake of the long-chain n-3 polyunsaturated fatty acids (LC n-3 PUFA) found in fish are cardioprotective. Furthermore, although the evidence is currently less convincing, such a strategy is also likely to improve insulin sensitivity, the central metabolic defect in diabetes. Currently in the UK only 12% of men, 17% of women and 8% of children have SFA intakes below 10% of energy. The average intake of LC n-3 PUFA is <0.2 g/day, less than half the current conservative recommendation of a minimum of 0.45 g/day. Public health strategies to reverse these dietary fatty acid imbalances, aimed at educating and motivating the consumer and at making affordable and acceptable food products with an 'enhanced' fatty acid profile more widely available, must remain a public health priority in the 'fight' against CVD.
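The "10% of energy" threshold is simple arithmetic once an energy density for fat is assumed; the sketch below uses the standard Atwater factor of 9 kcal per gram of fat, and the intake figures are hypothetical.

```python
# Convert a hypothetical daily saturated-fat (SFA) intake into a
# percentage of total energy, the unit of the <10%-of-energy target.
# Assumes the standard Atwater factor of 9 kcal per gram of fat.

def sfa_percent_energy(sfa_g_per_day, energy_kcal_per_day):
    return 100 * sfa_g_per_day * 9 / energy_kcal_per_day

# e.g. a hypothetical 30 g/day SFA on a 2000 kcal/day diet:
print(round(sfa_percent_energy(30, 2000), 1))  # -> 13.5, above the 10% target
```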

Relevance: 10.00%

Abstract:

In the 1990s the Message Passing Interface Forum defined MPI bindings for Fortran, C, and C++. With the success of MPI, these relatively conservative languages have continued to dominate in the parallel computing community. There are compelling arguments in favour of more modern languages like Java, including portability, better runtime error checking, modularity, and multi-threading. But these arguments have not converted many HPC programmers, perhaps due to the scarcity of full-scale scientific Java codes and the lack of evidence for performance competitive with C or Fortran. This paper tries to redress this situation by porting two scientific applications to Java. Both applications are parallelized using our thread-safe Java messaging system, MPJ Express. The first application is the Gadget-2 code, a massively parallel structure-formation code for cosmological simulations. The second uses the finite-difference time-domain (FDTD) method for simulations in computational electromagnetics. We evaluate and compare the performance of the Java and C versions of these two scientific applications, and demonstrate that the Java codes can achieve performance comparable with legacy applications written in conventional HPC languages. Copyright © 2009 John Wiley & Sons, Ltd.

Relevance: 10.00%

Abstract:

This paper seeks to illustrate the point that physical inconsistencies between thermodynamics and dynamics usually introduce nonconservative production/destruction terms in the local total energy balance equation in numerical ocean general circulation models (OGCMs). Such terms potentially give rise to undesirable forces and/or diabatic terms in the momentum and thermodynamic equations, respectively, which could explain some of the observed errors in simulated ocean currents and water masses. In this paper, a theoretical framework is developed to provide a practical method to determine such nonconservative terms, which is illustrated in the context of a relatively simple form of the hydrostatic Boussinesq primitive equation used in early versions of OGCMs, for which at least four main potential sources of energy nonconservation are identified; they arise from: (1) the “hanging” kinetic energy dissipation term; (2) assuming potential or conservative temperature to be a conservative quantity; (3) the interaction of the Boussinesq approximation with the parameterizations of turbulent mixing of temperature and salinity; (4) some adiabatic compressibility effects due to the Boussinesq approximation. In practice, OGCMs also possess spurious numerical energy sources and sinks, but they are not explicitly addressed here. Apart from (1), the identified nonconservative energy sources/sinks are not sign definite, allowing for possible widespread cancellation when integrated globally. Locally, however, these terms may be of the same order of magnitude as actual energy conversion terms thought to occur in the oceans. Although the actual impact of these nonconservative energy terms on the overall accuracy and physical realism of the oceans is difficult to ascertain, an important issue is whether they could impact on transient simulations, and on the transition toward different circulation regimes associated with a significant reorganization of the different energy reservoirs. 
Some possible solutions for improvement are examined. It is found that term (2) can be reduced by at least one order of magnitude by using conservative temperature instead of potential temperature. Using the anelastic approximation, however, which was initially thought of as a possible way to greatly improve the accuracy of the energy budget, would only marginally reduce term (4), with no impact on terms (1), (2) and (3).
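The bookkeeping at issue can be written schematically. In the sketch below (our notation, not the paper's), exact local energy conservation corresponds to a vanishing residual source term:

```latex
% Schematic local total-energy balance; E is the local total energy,
% \mathbf{F}_E its flux, and S_{nc} the spurious nonconservative
% production/destruction term discussed in the abstract:
\frac{\partial E}{\partial t} + \nabla \cdot \mathbf{F}_E = S_{nc}
% Exact conservation requires S_{nc} = 0; the paper's terms (1)-(4)
% are distinct physical contributions to S_{nc}.
```

Global cancellation of a sign-indefinite S_{nc} is consistent with locally large values, which is why the abstract distinguishes the globally integrated budget from local conversion terms.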

Relevance: 10.00%

Abstract:

A two-locus match probability is presented that incorporates the effects of within-subpopulation inbreeding (consanguinity) in addition to population subdivision. The usual practice of calculating multi-locus match probabilities as the product of single-locus probabilities assumes independence between loci. There are a number of population genetics phenomena that can violate this assumption: in addition to consanguinity, which increases homozygosity at all loci simultaneously, gametic disequilibrium will introduce dependence into DNA profiles. However, in forensics the latter problem is usually addressed in part by the careful choice of unlinked loci. Hence, as is conventional, we assume gametic equilibrium here, and focus instead on between-locus dependence due to consanguinity. The resulting match probability formulae are an extension of existing methods in the literature, and are shown to be more conservative than these methods in the case of double homozygote matches. For two-locus profiles involving one or more heterozygous genotypes, results are similar to, or smaller than, the existing approaches.
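The conventional product rule that the abstract takes as its starting point can be sketched with the standard Balding-Nichols single-locus match probabilities, in which theta corrects for population subdivision but, as the abstract notes, not for between-locus dependence due to consanguinity. Allele frequencies and theta below are illustrative.

```python
# Product-rule multi-locus match probability from standard
# Balding-Nichols single-locus formulae (theta = subdivision
# correction; loci treated as independent).

def homozygote_mp(p, theta):
    """Single-locus match probability for a homozygous (AA) genotype."""
    return ((2 * theta + (1 - theta) * p) * (3 * theta + (1 - theta) * p)) / \
           ((1 + theta) * (1 + 2 * theta))

def heterozygote_mp(p, q, theta):
    """Single-locus match probability for a heterozygous (AB) genotype."""
    return 2 * (theta + (1 - theta) * p) * (theta + (1 - theta) * q) / \
           ((1 + theta) * (1 + 2 * theta))

theta = 0.03  # illustrative subdivision parameter
# Two-locus profile: homozygote (p = 0.1) times heterozygote (p = 0.2, q = 0.3)
two_locus = homozygote_mp(0.1, theta) * heterozygote_mp(0.2, 0.3, theta)
```

At theta = 0 these reduce to the Hardy-Weinberg values p² and 2pq; increasing theta inflates the homozygote probability, which is the conservative direction the abstract's extension pushes further for double-homozygote matches.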

Relevance: 10.00%

Abstract:

Techniques for the coherent generation and detection of electromagnetic radiation in the far infrared, or terahertz, region of the electromagnetic spectrum have recently developed rapidly and may soon be applied for in vivo medical imaging. Both continuous wave and pulsed imaging systems are under development, with terahertz pulsed imaging being the more common method. Typically a pump and probe technique is used, with picosecond pulses of terahertz radiation generated from femtosecond infrared laser pulses, using an antenna or nonlinear crystal. After interaction with the subject either by transmission or reflection, coherent detection is achieved when the terahertz beam is combined with the probe laser beam. Raster scanning of the subject leads to an image data set comprising a time series representing the pulse at each pixel. A set of parametric images may be calculated, mapping the values of various parameters calculated from the shape of the pulses. A safety analysis has been performed, based on current guidelines for skin exposure to radiation of wavelengths 2.6 µm–20 mm (15 GHz–115 THz), to determine the maximum permissible exposure (MPE) for such a terahertz imaging system. The international guidelines for this range of wavelengths are drawn from two U.S. standards documents. The method for this analysis was taken from the American National Standard for the Safe Use of Lasers (ANSI Z136.1), and to ensure a conservative analysis, parameters were drawn from both this standard and from the IEEE Standard for Safety Levels with Respect to Human Exposure to Radio Frequency Electromagnetic Fields (C95.1). The calculated maximum permissible average beam power was 3 mW, indicating that typical terahertz imaging systems are safe according to the current guidelines. Further developments may however result in systems that will exceed the calculated limit. 
Furthermore, the published MPEs for pulsed exposures are based on measurements at shorter wavelengths and with pulses of longer duration than those used in terahertz pulsed imaging systems, so the results should be treated with caution.
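Whether a given pulsed system stays under the 3 mW limit derived above reduces to an average-power calculation, pulse energy times repetition rate; the pulse parameters below are hypothetical.

```python
# Compare a hypothetical pulsed terahertz system against the 3 mW
# maximum permissible average beam power derived in the text.

def average_power_mw(pulse_energy_nj, rep_rate_khz):
    """Average power (mW) = energy per pulse (J) x pulses per second,
    converted back to milliwatts."""
    return pulse_energy_nj * 1e-9 * rep_rate_khz * 1e3 * 1e3

# e.g. hypothetical 10 nJ pulses at an 80 kHz repetition rate:
p_avg = average_power_mw(pulse_energy_nj=10, rep_rate_khz=80)
print(p_avg, p_avg <= 3.0)  # -> 0.8 True (within the 3 mW limit)
```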

Relevance: 10.00%

Abstract:

In this chapter we described how the inclusion of a model of a human arm, combined with the measurement of its neural input and a predictor, can provide robustness under time delay to a previously proposed teleoperator design. Our trials gave clear indications of the superiority of the NPT scheme over traditional as well as the modified Yokokohji and Yoshikawa architectures. Its fundamental advantages are the time-lead of the slave, the more efficient and more natural-feeling manipulation it provides, and the fact that incorporating an operator arm model leads to more credible stability results. Finally, its simplicity allows local control techniques that are less likely to fail to be employed. However, a significant advantage of the enhanced Yokokohji and Yoshikawa architecture stems from the very fact that it is a conservative modification of current designs. Under large prediction errors, it can provide robustness by directing the master and slave states to their means and, since it relies on the passivity of the mechanical part of the system, it would not confuse the operator. An experimental implementation of the techniques will provide further evidence for the performance of the proposed architectures. The employment of neural networks and fuzzy logic, which will provide an adaptive model of the human arm and robustifying control terms, is scheduled for the near future.

Relevance: 10.00%

Abstract:

Lean construction is considered from a human resource management (HRM) perspective. It is contended that the UK construction sector is characterised by an institutionalised regressive approach to HRM. In the face of rapidly declining recruitment rates for built environment courses, the dominant HRM philosophy of utilitarian instrumentalism does little to attract the intelligent and creative young people that the industry so badly needs. Given this broader context, there is a danger that an uncritical acceptance of lean construction will exacerbate the industry's reputation for unrewarding jobs. Construction academics have strangely ignored the extensive literature that equates lean production to a HRM regime of control, exploitation and surveillance. The emphasis of lean thinking on eliminating waste and improving efficiency makes it easy to absorb into the best practice agenda because it conforms to the existing dominant way of thinking. 'Best practice' is seemingly judged by the extent to which it serves the interests of the industry's technocratic elite. Hence it acts as a conservative force in favour of maintaining the status quo. In this respect, lean construction is the latest manifestation of a long established trend. In common with countless other improvement initiatives, the rhetoric is heavy in the machine metaphor whilst exhorting others to be more efficient. If current trends in lean construction are extrapolated into the future the ultimate destination may be uncomfortably close to Aldous Huxley's apocalyptic vision of a Brave New World. In the face of these trends, the lean construction research community pleads neutrality whilst confining its attention to the rational high ground. The future of lean construction is not yet predetermined. Many choices remain to be made. The challenge for the research community is to improve practice whilst avoiding the dehumanising tendencies of high utilitarianism.

Relevance: 10.00%

Abstract:

For much of the 1990s and 2000s, the emphasis of urban policy in many global cities was on managing and mitigating the social and environmental effects of rapid economic growth. The credit crunch of 2008 and the subsequent recession have undermined some of the core assumptions on which such policies were based. It is in this context that the concept of resilience planning has taken on a new significance. Drawing on contemporary research in London and Hong Kong, the paper shows how resilience and recovery planning has become a key area of political debate. It examines what is meant by conservative and radical interpretations of resilience and how conservative views have come to dominate ‘recovery’ thinking, with élite groups unwilling to accept the limits to the neo-liberal orthodoxies that helped to precipitate the economic crisis. The paper explores the implications of such thinking for the politics of urban development.

Relevance: 10.00%

Abstract:

Established following the Conservative Party's election victory in April 1992, the Department of National Heritage has been heralded as an important stage in the growing recognition of the significance of the leisure industry to Britain. By combining, for the first time, responsibility for sport, tourism, the arts, libraries, heritage, broadcasting and film, and by providing them with Cabinet representation, a unique opportunity has, seemingly, been provided to develop and promote the interests of leisure in Britain. This paper takes the view that although this initiative has been broadly welcomed, there are important inconsistencies which require attention. On the one hand the selection of the portfolio appears somewhat eclectic. On the other hand, it is questionable why such a department should have been developed at all. An inspection of the implicit ideology suggests that rather than the traditional use of the state to promote leisure interests, the introduction of the department signifies a shift to the use of leisure to promote the Government's interests. Thus the new Department of National Heritage is to be used as a central feature in the legitimation of the government's political programme. Rather than emphasising its traditional quasi-welfare role, the new place for leisure and heritage is firmly in the market economy. Whilst a leisured society may be the epitome of post-industrialism, therefore, the citizen rights claim for access to leisure activities can only be secured by engaging with the market. This legitimised construction of post-modern citizenship is at the centre of a new political order where choice has been replaced by means and where the classless paradigm championed by the Prime Minister will be a classlessness of constructed omission.

Relevance: 10.00%

Abstract:

Although much has been written about the effect on services of public sector restructuring, little is yet available on public leisure provision. This omission is addressed by considering how the delivery of public leisure services in Britain has been affected by the imposition of Compulsory Competitive Tendering (CCT). In particular, it focuses on the changing relationship between the central and local levels of government recognising, on the part of local government, a continuum of structural responses to central initiatives which have, in some cases, conspired to reduce the impact of CCT on public leisure provision. The paper concludes that although attempts have been made to protect local services, the outcome of the CCT process has been the regeneration of public leisure provision away from its service roots, but within an enduring ideological paradigm of conservative professionalism.

Relevance: 10.00%

Abstract:

Vertebral compression fractures are a common clinical problem and the incidence of them will increase with the ageing population. Traditionally management has been conservative; however, there has been a growing trend towards vertebroplasty as an alternative therapy in patients with persisting severe pain. NICE produced guidance in 2003 recommending the procedure after 4 weeks of conservative management. Recent high-quality studies have been contradictory and there is currently a debate surrounding the role of the procedure with no agreement in the literature. We examine the evidence in both osteoporotic and malignant vertebral compression fractures; we also describe the benefits and side effects, alternative treatment options and the cost of the procedure. Finally, we recommend when vertebroplasty is most appropriately used based on the best available evidence.

Relevance: 10.00%

Abstract:

Results from an idealized three-dimensional baroclinic life-cycle model are interpreted in a potential vorticity (PV) framework to identify the physical mechanisms by which frictional processes acting in the atmospheric boundary layer modify and reduce the baroclinic development of a midlatitude storm. Considering a life cycle where the only non-conservative process acting is boundary-layer friction, the rate of change of depth-averaged PV within the boundary layer is governed by frictional generation of PV and the flux of PV into the free troposphere. Frictional generation of PV has two contributions: Ekman generation, which is directly analogous to the well-known Ekman-pumping mechanism for barotropic vortices, and baroclinic generation, which depends on the turning of the wind in the boundary layer and low-level horizontal temperature gradients. It is usually assumed, at least implicitly, that an Ekman process of negative PV generation is the mechanism whereby friction reduces the strength and growth rates of baroclinic systems. Although there is evidence for this mechanism, it is shown that baroclinic generation of PV dominates, producing positive PV anomalies downstream of the low centre, close to developing warm and cold fronts. These PV anomalies are advected by the large-scale warm conveyor belt flow upwards and polewards, fluxed into the troposphere near the warm front, and then advected westwards relative to the system. The result is a thin band of positive PV in the lower troposphere above the surface low centre. This PV is shown to be associated with a positive static stability anomaly, which Rossby edge wave theory suggests reduces the strength of the coupling between the upper- and lower-level PV anomalies, thereby reducing the rate of baroclinic development. This mechanism, which is a result of the baroclinic dynamics in the frontal regions, is in marked contrast with simple barotropic spin-down ideas. 
Finally we note the implications of these frictionally generated PV anomalies for cyclone forecasting.
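The split between the Ekman and baroclinic contributions described above can be seen in the standard friction-only form of the Ertel PV tendency (notation illustrative):

```latex
% Ertel PV and its tendency under friction alone, with \mathbf{F} the
% frictional force per unit mass and \boldsymbol{\zeta}_a the absolute
% vorticity:
q = \frac{1}{\rho}\,\boldsymbol{\zeta}_a \cdot \nabla\theta ,
\qquad
\frac{Dq}{Dt} = \frac{1}{\rho}\,(\nabla \times \mathbf{F}) \cdot \nabla\theta
% The vertical component of \nabla \times \mathbf{F} acting on
% \partial\theta/\partial z gives the Ekman contribution; the horizontal
% components acting on horizontal temperature gradients give the
% baroclinic contribution that the study finds to dominate.
```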