16 results for Long-term Use

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

1. Species' distributions are likely to be affected by a combination of environmental drivers. We used a data set of 11 million species occurrence records over the period 1970–2010 to assess changes in the frequency of occurrence of 673 macro-moth species in Great Britain. Groups of species with different predicted sensitivities showed divergent trends, which we interpret in the context of land-use and climatic changes.
2. A diversity of responses was revealed: 260 moth species declined significantly, whereas 160 increased significantly. Overall, frequencies of occurrence declined, mirroring trends in less species-rich, yet more intensively studied, taxa.
3. Geographically widespread species, which were predicted to be more sensitive to land use than to climate change, declined significantly in southern Britain, where the cover of urban and arable land has increased.
4. Moths associated with low-nitrogen and open environments (based on their larval host plant characteristics) declined most strongly, which is also consistent with a land-use change explanation.
5. Some moths that reach their northern (leading-edge) range limit in southern Britain increased, whereas species restricted to northern Britain (trailing edge) declined significantly, consistent with a climate change explanation.
6. Not all species of a given type behaved similarly, suggesting that complex interactions between species' attributes and different combinations of environmental drivers determine changes in frequency of occurrence.
7. Synthesis and applications. Our findings are consistent with large-scale responses to climatic and land-use changes, with some species increasing and others decreasing. We suggest that land-use change (e.g. habitat loss, nitrogen deposition) and climate change are both major drivers of moth biodiversity change, acting independently and in combination. Importantly, the diverse responses revealed in this species-rich taxon show that multifaceted conservation strategies are needed to minimize the negative biodiversity impacts of multiple environmental changes. We suggest that habitat protection, management and ecological restoration can mitigate the combined impacts of land-use change and climate change by providing environments that are suitable for existing populations and that also enable species to shift their ranges.

Relevance:

100.00%

Publisher:

Abstract:

RothC and Century are two of the most widely used soil organic matter (SOM) models, but there are few examples of parameterising them specifically for environmental conditions in East Africa. The aim of this study was therefore to evaluate the ability of RothC and Century to estimate changes in soil organic carbon (SOC) resulting from varying land use/management practices under the climate and soil conditions found in Kenya. The study used climate, soils and crop data from a long-term experiment (1976-2001) carried out at the Kabete site at the Kenya National Agricultural Research Laboratories (NARL, located in a semi-humid region) and data from a 13-year experiment carried out at Machang'a (Embu District, located in a semi-arid region). The NARL experiment included various fertiliser (0, 60 and 120 kg of N and P2O5 ha⁻¹), farmyard manure (FYM; 5 and 10 t ha⁻¹) and plant residue treatments, in a variety of combinations. The Machang'a experiment involved a fertiliser (51 kg N ha⁻¹) and a FYM (0, 5 and 10 t ha⁻¹) treatment with both monocropping and intercropping. At Kabete both models showed a fair to good fit to measured data, although Century simulations for treatments with high levels of FYM were better than those without. At the Machang'a site with monocrops, both models showed a fair to good fit to measured data for all treatments. However, the fit of both models (especially RothC) to measured data for the intercropping treatments at Machang'a was much poorer. Further model development for intercrop systems is recommended. Both models can be useful tools in soil C predictions, provided time series of measured soil C and crop production data are available for validating model performance against local or regional agricultural crops.
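Both RothC and Century represent SOM turnover as a set of carbon pools decaying at first-order rates, modified by climate. The sketch below shows that shared principle only; the pool names, rate constants and single rate modifier are illustrative values, not the calibrated parameters used for the Kenyan sites.

```python
import math

def step_pools(pools, rates, inputs, rate_modifier=1.0):
    """Advance soil-carbon pools (t C/ha) by one year of first-order decay,
    then add that year's carbon inputs (e.g. residues, farmyard manure).
    `rates` holds per-year decay constants; `rate_modifier` stands in for
    the combined climate/soil-cover factor both models apply."""
    return {name: c * math.exp(-rates[name] * rate_modifier) + inputs.get(name, 0.0)
            for name, c in pools.items()}

# Illustrative pools and rate constants (not a site calibration)
pools = {"fast": 2.0, "slow": 15.0, "humus": 30.0}
rates = {"fast": 10.0, "slow": 0.3, "humus": 0.02}

for year in range(10):                # a decade with no organic inputs
    pools = step_pools(pools, rates, inputs={})
print(sum(pools.values()))            # total SOC declines year on year
```

The per-treatment differences in the experiments above enter such models through the `inputs` term (FYM and residue additions), which is why time series of measured crop production are needed for validation.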

Relevance:

100.00%

Publisher:

Abstract:

Land plants have had the reputation of being problematic for DNA barcoding for two general reasons: (i) the standard DNA regions used in algae, animals and fungi have exceedingly low levels of variability and (ii) the typically used land plant plastid phylogenetic markers (e.g. rbcL, trnL-F, etc.) appear to have too little variation. However, no one has assessed how well current phylogenetic resources might work in the context of identification (versus phylogeny reconstruction). In this paper, we make such an assessment, particularly with two of the markers commonly sequenced in land plant phylogenetic studies, plastid rbcL and internal transcribed spacers of the large subunits of nuclear ribosomal DNA (ITS), and find that both of these DNA regions perform well even though the data currently available in GenBank/EBI were not produced to be used as barcodes and BLAST searches are not an ideal tool for this purpose. These results bode well for the use of even more variable regions of plastid DNA (such as, for example, psbA-trnH) as barcodes, once they have been widely sequenced. In the short term, efforts to bring land plant barcoding up to the standards being used now in other organisms should make swift progress. There are two categories of DNA barcode users, scientists in fields other than taxonomy and taxonomists. For the former, the use of mitochondrial and plastid DNA, the two most easily assessed genomes, is at least in the short term a useful tool that permits them to get on with their studies, which depend on knowing roughly which species or species groups they are dealing with, but these same DNA regions have important drawbacks for use in taxonomic studies (i.e. studies designed to elucidate species limits). For these purposes, DNA markers from uniparentally (usually maternally) inherited genomes can only provide half of the story required to improve taxonomic standards being used in DNA barcoding. 
In the long term, we will need to develop more sophisticated barcoding tools, which would be multiple, low-copy nuclear markers with sufficient genetic variability and PCR-reliability; these would permit the detection of hybrids and permit researchers to identify the 'genetic gaps' that are useful in assessing species limits.

Relevance:

100.00%

Publisher:

Abstract:

Objective: To describe the calculations and approaches used to design experimental diets of differing saturated fatty acid (SFA) and monounsaturated fatty acid (MUFA) compositions for use in a long-term dietary intervention study, and to evaluate the degree to which the dietary targets were met. Design, setting and subjects: Fifty-one students living in a university hall of residence consumed a reference (SFA) diet for 8 weeks followed by either a moderate MUFA (MM) diet or a high MUFA (HM) diet for 16 weeks. The three diets were designed to differ only in their proportions of SFA and MUFA, while keeping total fat, polyunsaturated fatty acids (PUFA), trans-fatty acids, and the ratio of palmitic to stearic acid, and n-6 to n-3 PUFA, unchanged. Results: Using habitual diet records and a standardised database for food fatty acid compositions, a sequential process of theoretical fat substitutions enabled suitable fat sources for use in the three diets to be identified, and experimental margarines for baking, spreading and the manufacture of snack foods to be designed. The dietary intervention was largely successful in achieving the fatty acid targets of the three diets, although unintended differences between the original target and the analysed fatty acid composition of the experimental margarines resulted in a lower than anticipated MUFA intake on the HM diet, and a lower ratio of palmitic to stearic acid compared with the reference or MM diet. Conclusions: This study has revealed important theoretical considerations that should be taken into account when designing diets of specific fatty acid composition, as well as practical issues of implementation.

Relevance:

100.00%

Publisher:

Abstract:

Using a free-air CO2 enrichment (FACE) experiment, poplar trees (Populus × euramericana clone I214) were exposed to either ambient or elevated [CO2] from planting, for a 5-year period spanning canopy development, closure, coppice and re-growth. In each year, measurements were taken of stomatal density (SD, number mm⁻²) and stomatal index (SI, the proportion of epidermal cells forming stomata). In year 5, measurements were also taken of leaf stomatal conductance (gs, µmol m⁻² s⁻¹), photosynthetic CO2 fixation (A, mmol m⁻² s⁻¹), instantaneous water-use efficiency (A/E) and the ratio of intercellular to atmospheric CO2 (Ci:Ca). Elevated [CO2] caused reductions in SI in the first year, and in SD in the first 2 years, when the canopy was largely open. In following years, when the canopy had closed, elevated [CO2] had no detectable effects on stomatal numbers or index. In contrast, even after 5 years of exposure to elevated [CO2], gs was reduced, A/E was stimulated, and Ci:Ca was reduced relative to ambient [CO2]. These outcomes from the long-term realistic field conditions of this forest FACE experiment suggest that stomatal numbers (SD and SI) had no role in determining the improved instantaneous leaf-level efficiency of water use under elevated [CO2]. We propose that altered cuticular development during canopy closure may partially explain the changing response of stomata to elevated [CO2], although the mechanism for this remains obscure.
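Stomatal index normalises the stomatal count by total epidermal cell number, which is why it separates changes in stomatal initiation from changes in cell expansion while density (SD) confounds the two. A minimal calculation, with made-up counts for illustration:

```python
def stomatal_index(stomata_per_mm2, epidermal_cells_per_mm2):
    """SI: fraction of epidermal cells (counting each stoma as one unit)
    that are stomata. Dimensionless, so insensitive to leaf expansion."""
    return stomata_per_mm2 / (stomata_per_mm2 + epidermal_cells_per_mm2)

# Hypothetical counts: identical SD can give different SI if
# epidermal cell density shifts (e.g. through cell expansion).
print(stomatal_index(120, 480))    # 0.2
print(stomatal_index(120, 1080))   # 0.1
```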

Relevance:

100.00%

Publisher:

Abstract:

Background: The effects of landscape modifications on the long-term persistence of wild animal populations are of crucial importance to wildlife managers and conservation biologists, but obtaining experimental evidence using real landscapes is usually impossible. To circumvent this problem we used individual-based models (IBMs) of interacting animals in experimental modifications of a real Danish landscape. The models incorporate as much as possible of the behaviour and ecology of four species with contrasting life-history characteristics: skylark (Alauda arvensis), vole (Microtus agrestis), a ground beetle (Bembidion lampros) and a linyphiid spider (Erigone atra). This allows us to quantify the population implications of experimental modifications of landscape configuration and composition. Methodology/Principal Findings: Starting with a real agricultural landscape, we progressively reduced landscape complexity by (i) homogenizing habitat patch shapes, (ii) randomizing the locations of the patches, and (iii) randomizing the size of the patches. The first two steps increased landscape fragmentation. We assessed the effects of these manipulations on the long-term persistence of animal populations by measuring equilibrium population sizes and time to recovery after disturbance. Patch rearrangement and the presence of corridors had a large effect on the population dynamics of species whose local success depends on the surrounding terrain. Landscape modifications that reduced population sizes increased recovery times in the short-dispersing species, making small populations vulnerable to increasing disturbance. The species that were most strongly affected by large disturbances fluctuated little in population sizes in years when no perturbations took place.
Significance: Traditional approaches to the management and conservation of populations use either classical methods of population analysis, which fail to adequately account for the spatial configurations of landscapes, or landscape ecology, which accounts for landscape structure but has difficulty predicting the dynamics of populations living in them. Here we show how realistic and replicable individual-based models can bridge the gap between non-spatial population theory and non-dynamic landscape ecology. A major strength of the approach is its ability to identify population vulnerabilities not detected by standard population viability analyses.

Relevance:

100.00%

Publisher:

Abstract:

The estimation of the long-term wind resource at a prospective site based on a relatively short on-site measurement campaign is an indispensable task in the development of a commercial wind farm. The typical industry approach is based on the measure-correlate-predict (MCP) method, where a relational model between the site wind velocity data and the data obtained from a suitable reference site is built from concurrent records. In a subsequent step, a long-term prediction for the prospective site is obtained from a combination of the relational model and the historic reference data. In the present paper, a systematic study is presented where three new MCP models, together with two published reference models (a simple linear regression and the variance ratio method), have been evaluated based on concurrent synthetic wind speed time series for two sites, simulating the prospective and the reference site. The synthetic method has the advantage of generating time series with the desired statistical properties, including Weibull scale and shape factors, required to evaluate the five methods under all plausible conditions. In this work, first a systematic discussion of the statistical fundamentals behind MCP methods is provided and three new models, one based on a nonlinear regression and two (termed kernel methods) derived from the use of conditional probability density functions, are proposed. All models are evaluated by using five metrics under a wide range of values of the correlation coefficient, the Weibull scale, and the Weibull shape factor. Only one of all models, a kernel method based on bivariate Weibull probability functions, is capable of accurately predicting all performance metrics studied.
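Of the two published reference models, the variance ratio method is easy to state: it maps reference speeds to site speeds with the linear transform that reproduces the site's mean and standard deviation over the concurrent period. A minimal sketch, using invented concurrent records rather than the paper's synthetic series:

```python
from statistics import mean, stdev

def variance_ratio_mcp(site, ref):
    """Fit the variance-ratio MCP model on concurrent site/reference wind
    speeds; returns a predictor to apply to the long-term reference record."""
    m_site, m_ref = mean(site), mean(ref)
    slope = stdev(site) / stdev(ref)          # preserves the site variance
    return lambda v_ref: m_site + slope * (v_ref - m_ref)

# Illustrative concurrent records (m/s)
predict = variance_ratio_mcp(site=[4.0, 5.0, 6.0], ref=[6.0, 8.0, 10.0])
print(predict(9.0))   # 5.5
```

A simple linear regression would instead fit slope and intercept by least squares, which matches the conditional mean but systematically under-predicts the variance of the site distribution; that difference is one motivation for the variance-ratio and kernel approaches compared in the paper.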

Relevance:

100.00%

Publisher:

Abstract:

The UK has a target for an 80% reduction in CO2 emissions by 2050 from a 1990 base. Domestic energy use accounts for around 30% of total emissions. This paper presents a comprehensive review of existing models and modelling techniques and indicates how they might be improved by considering individual buying behaviour. Macro (top-down) and micro (bottom-up) models have been reviewed and analysed. It is found that bottom-up models can project technology diffusion due to their higher resolution. The weakness of existing bottom-up models at capturing individual green technology buying behaviour has been identified. Consequently, Markov chains, neural networks and agent-based modelling are proposed as possible methods to incorporate buying behaviour within a domestic energy forecast model. Among the three methods, agent-based models are found to be the most promising, although a successful agent approach requires large amounts of input data. A prototype agent-based model has been developed and tested, which demonstrates the feasibility of an agent approach. This model shows that an agent-based approach is promising as a means to predict the effectiveness of various policy measures.
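As a toy illustration of why an agent-based approach can capture adoption dynamics that aggregate (top-down) models miss, here is a minimal threshold model in which each household adopts a green technology once peer uptake plus a policy incentive exceeds its private threshold. Every parameter is invented for the sketch; it is not the prototype model described above.

```python
import random

def simulate_adoption(n_agents=200, subsidy=0.2, steps=20, seed=1):
    """Toy agent-based model of green-technology uptake: a household adopts
    once the adopted fraction plus a subsidy incentive exceeds its private
    threshold. Returns the final adopted fraction."""
    rng = random.Random(seed)
    thresholds = [rng.random() for _ in range(n_agents)]  # heterogeneous agents
    adopted = [t < subsidy for t in thresholds]           # initial adopters
    for _ in range(steps):
        frac = sum(adopted) / n_agents                    # social influence
        adopted = [a or thresholds[i] < frac + subsidy
                   for i, a in enumerate(adopted)]
    return sum(adopted) / n_agents

# Compare two hypothetical policy levels on the same population
print(simulate_adoption(subsidy=0.1), simulate_adoption(subsidy=0.3))
```

Even this toy exhibits the threshold cascades (a small subsidy change tipping a large share of agents) that motivate agent-based modelling of policy measures, and it also shows the data burden: the threshold distribution is exactly the kind of individual-level input a serious model must estimate.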

Relevance:

100.00%

Publisher:

Abstract:

In the European Union, first-tier assessment of the long-term risk to birds and mammals from pesticides is based on calculation of a deterministic long-term toxicity/exposure ratio (TERlt). The ratio is developed from generic herbivores and insectivores and applied to all species. This paper describes two case studies that implement proposed improvements to the way long-term risk is assessed. These refined methods require calculation of a TER for each of five identified phases of reproduction (phase-specific TERs) and use of adjusted No Observed Effect Levels (NOELs) to incorporate variation in species sensitivity to pesticides. They also involve progressive refinement of the exposure estimate so that it applies to particular species, rather than generic indicators, and relates spraying date to onset of reproduction. The effect of using these new methods on the assessment of risk is described. Each refinement did not necessarily alter the calculated TER value in a way that was either predictable or consistent across both case studies. However, use of adjusted NOELs always reduced TERs, and relating spraying date to onset of reproduction increased most phase-specific TERs. The case studies suggested that the current first-tier TERlt assessment may underestimate risk in some circumstances and that phase-specific assessments can help identify appropriate risk-reduction measures. The way in which deterministic phase-specific assessments can currently be implemented to enhance first-tier assessment is outlined.
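The phase-specific refinement keeps the familiar TER arithmetic but applies it once per reproductive phase, so risk can be localised in time. A sketch with invented NOEL and exposure values; the trigger value of 5 is assumed here as a typical EU long-term criterion, not taken from the paper:

```python
def phase_specific_ters(noel, exposure_by_phase):
    """TER = NOEL / estimated exposure, computed per reproductive phase.
    Both quantities in mg a.s./kg body weight/day, so the ratio is unitless."""
    return {phase: noel / ete for phase, ete in exposure_by_phase.items()}

# Invented illustrative values
ters = phase_specific_ters(noel=50.0,
                           exposure_by_phase={"pair formation": 2.5,
                                              "egg laying": 5.0,
                                              "chick rearing": 12.5})
risky = {p for p, ter in ters.items() if ter < 5}  # below assumed trigger
print(ters, risky)
```

Because each phase gets its own exposure estimate, a single phase can fail the trigger while the whole-season TERlt passes, which is how phase-specific assessment points at targeted risk-reduction measures (e.g. shifting spray dates relative to onset of reproduction).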

Relevance:

100.00%

Publisher:

Abstract:

The incidence of breast cancer has risen worldwide to unprecedented levels in recent decades, making it now the major cancer of women in many parts of the world.1 Although diet, alcohol, radiation and inherited loss of BRCA1/2 genes have all been associated with increased incidence, the main identified risk factors are life exposure to hormones including physiological variations associated with puberty/pregnancy/menopause,1 personal choice of use of hormonal contraceptives2 and/or hormone replacement therapy.3–6 On this basis, exposure of the human breast to the many environmental pollutant chemicals capable of mimicking or interfering with oestrogen action7 should also be of concern.8 Hundreds of such environmental chemicals have now been measured in human breast tissue from a range of dietary and domestic exposure sources7,9 including persistent organochlorine pollutants (POPs),10 polybrominated diphenylethers and polybromobiphenyls,11 polychlorinated biphenyls,12 dioxins,13 alkyl phenols,14 bisphenol-A and chlorinated derivatives,15 as well as other less lipophilic compounds such as parabens (alkyl esters of p-hydroxybenzoic acid),16 but studies investigating any association between raised levels of such compounds and the development of breast cancer remain inconclusive.7–16 However, the functionality of these chemicals has continued to be assessed on the basis of individual chemicals rather than the environmental reality of long-term low-dose exposure to complex mixtures. This misses the potential for individuals to have high concentrations of different compounds but with a common mechanism of action. It also misses the complex interactions between chemicals and physiological hormones which together may act to alter the internal homeostasis of the oestrogenic environment of mammary tissue.

Relevance:

100.00%

Publisher:

Abstract:

Soluble reactive phosphorus (SRP) plays a key role in eutrophication, a global problem decreasing habitat quality and in-stream biodiversity. Mitigation strategies are required to prevent SRP fluxes from exceeding critical levels, and must be robust in the face of potential changes in climate, land use and a myriad of other influences. To establish the longevity of these strategies it is therefore crucial to consider the sensitivity of catchments to multiple future stressors. This study evaluates how the water quality and hydrology of a major river system in the UK (the River Thames) respond to alterations in climate, land use and water resource allocations, and investigates how these changes impact the relative performance of management strategies over an 80-year period. In the River Thames, the relative contributions of SRP from diffuse and point sources vary seasonally. Diffuse sources of SRP from agriculture dominate during periods of high runoff, and point sources during low flow periods. SRP concentrations rose under any future scenario which either increased a) surface runoff or b) the area of cultivated land. Under these conditions, SRP was sourced from agriculture, and the most effective single mitigation measures were those which addressed diffuse SRP sources. Conversely, where future scenarios reduced flow e.g. during winters of reservoir construction, the significance of point source inputs increased, and mitigation measures addressing these issues became more effective. In catchments with multiple point and diffuse sources of SRP, an all-encompassing effective mitigation approach is difficult to achieve with a single strategy. In order to attain maximum efficiency, multiple strategies might therefore be employed at different times and locations, to target the variable nature of dominant SRP sources and pathways.

Relevance:

100.00%

Publisher:

Abstract:

This paper explores the long term development of networks of glia and neurons on patterns of Parylene-C on a SiO2 substrate. We harvested glia and neurons from the Sprague-Dawley (P1–P7) rat hippocampus and utilized an established cell patterning technique in order to investigate cellular migration, over the course of 3 weeks. This work demonstrates that uncontrolled glial mitosis gradually disrupts cellular patterns that are established early during culture. This effect is not attributed to a loss of protein from the Parylene-C surface, as nitrogen levels on the substrate remain stable over 3 weeks. The inclusion of the anti-mitotic cytarabine (Ara-C) in the culture medium moderates glial division and thus, adequately preserves initial glial and neuronal conformity to underlying patterns. Neuronal apoptosis, often associated with the use of Ara-C, is mitigated by the addition of brain derived neurotrophic factor (BDNF). We believe that with the right combination of glial inhibitors and neuronal promoters, the Parylene-C based cell patterning method can generate structured, active neural networks that can be sustained and investigated over extended periods of time. To our knowledge this is the first report on the concurrent application of Ara-C and BDNF on patterned cell cultures.

Relevance:

100.00%

Publisher:

Abstract:

Many studies evaluating model boundary-layer schemes focus either on near-surface parameters or on short-term observational campaigns. This reflects the observational datasets that are widely available for use in model evaluation. In this paper we show how surface and long-term Doppler lidar observations, combined in a way to match model representation of the boundary layer as closely as possible, can be used to evaluate the skill of boundary-layer forecasts. We use a 2-year observational dataset from a rural site in the UK to evaluate a climatology of boundary-layer type forecast by the UK Met Office Unified Model. In addition, we demonstrate the use of a binary skill score (the Symmetric Extremal Dependence Index, SEDI) to investigate the dependence of forecast skill on season, horizontal resolution and forecast lead time. A clear diurnal and seasonal cycle can be seen in the climatology of both the model and observations, with the main discrepancies being the model overpredicting cumulus-capped and decoupled stratocumulus-capped boundary layers and underpredicting well-mixed boundary layers. Using the SEDI skill score, the model is most skillful at predicting surface stability. The model's skill in predicting cumulus-capped and stratocumulus-capped stable boundary layers is low but greater than that of a 24 hr persistence forecast. In contrast, the prediction of decoupled boundary layers and boundary layers with multiple cloud layers is lower than persistence. This process-based evaluation approach has the potential to be applied to other boundary-layer parameterisation schemes with similar decision structures.
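The Symmetric Extremal Dependence Index is computed from the 2×2 contingency table of forecast versus observed occurrence of a boundary-layer type; a sketch with invented counts (the formula follows the standard SEDI definition, not data from the paper):

```python
from math import log

def sedi(hits, misses, false_alarms, correct_negatives):
    """Symmetric Extremal Dependence Index from a 2x2 contingency table.
    Ranges over (-1, 1); 0 for a random forecast, 1 for a perfect one."""
    h = hits / (hits + misses)                             # hit rate
    f = false_alarms / (false_alarms + correct_negatives)  # false-alarm rate
    num = log(f) - log(h) - log(1 - f) + log(1 - h)
    den = log(f) + log(h) + log(1 - f) + log(1 - h)
    return num / den

# Invented counts for one boundary-layer type over a season of forecasts
print(sedi(90, 10, 10, 90))   # high skill, close to 1
print(sedi(50, 50, 50, 50))   # no skill: 0
```

SEDI is popular for rare-event verification because, unlike simpler scores, it does not degenerate as the event base rate shrinks, which matters when some boundary-layer types occur on only a few days per season.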

Relevance:

100.00%

Publisher:

Abstract:

Recent studies of the variation of geomagnetic activity over the past 140 years have quantified the "coronal source" magnetic flux F-s that leaves the solar atmosphere and enters the heliosphere and have shown that it has risen, on average, by an estimated 34% since 1963 and by 140% since 1900. This variation of open solar flux has been reproduced by Solanki et al. [2000] using a model which demonstrates how the open flux accumulates and decays, depending on the rate of flux emergence in active regions and on the length of the solar cycle. We here use a new technique to evaluate solar cycle length and find that it does vary in association with the rate of change of F-s in the way predicted. The long-term variation of the rate of flux emergence is found to be very similar in form to that in F-s, which may offer a potential explanation of why F-s appears to be a useful proxy for extrapolating solar total irradiance back in time. We also find that most of the variation of cosmic ray fluxes incident on Earth is explained by the strength of the heliospheric field (quantified by F-s) and use observations of the abundance of the isotope Be-10 (produced by cosmic rays and deposited in ice sheets) to study the decrease in F-s during the Maunder minimum. The interior motions at the base of the convection zone, where the solar dynamo is probably located, have recently been revealed using the helioseismology technique and found to exhibit a 1.3-year oscillation. This periodicity is here reported in observations of the interplanetary magnetic field and geomagnetic activity but is only present after 1940. When present, it shows a strong 22-year variation, peaking near the maximum of even-numbered sunspot cycles and showing minima at the peaks of odd-numbered cycles. We discuss the implications of these long-term solar and heliospheric variations for Earth's environment.