71 results for Echinacea (Plants) Therapeutic use
Abstract:
Alpine grasslands are ecosystems with a great diversity of plant species. However, little is known about other levels of biodiversity, such as landscape diversity, diversity of biological interactions of plants with herbivores or fungal pathogens, and genetic diversity. We therefore explored natural and anthropogenic determinants of grassland biodiversity at several levels of biological integration, from the genetic to the landscape level, in the Swiss Alps. Differences between cultural traditions (Romanic, Germanic, and Walser) still affect land use diversity and thus landscape diversity. Increasing land use diversity, in turn, increased plant species diversity per village. However, recent land use changes have reduced this diversity. Within grassland parcels, plant species diversity was higher on unfertilized mown grasslands than on fertilized or grazed ones. Most individual plants were affected by herbivores and fungal leaf pathogens, reflecting the great diversity of herbivores and pathogens the parcels harbored. However, as plant damage by herbivores and pathogens was not severe, conserving these biological interactions is unlikely to compromise agricultural goals. A common-garden experiment revealed genetic differentiation of the important fodder grass Poa alpina between mown and grazed sites, suggesting adaptation. Per-village genetic diversity of Poa alpina was greater in villages with higher land use diversity, analogous to the higher plant species diversity there. Overall, landscape diversity and biodiversity within grassland parcels are currently declining. As this contradicts the intention of Swiss law and international agreements, financial incentives need to be re-allocated to focus on promoting high biodiversity at both the local and the landscape level. This will also benefit landscape attractiveness for tourists and help preserve a precious cultural heritage in the Swiss Alps.
Abstract:
Cell death is essential for a plethora of physiological processes, and its deregulation characterizes numerous human diseases. Thus, the in-depth investigation of cell death and its mechanisms constitutes a formidable challenge for fundamental and applied biomedical research, and has tremendous implications for the development of novel therapeutic strategies. It is, therefore, of utmost importance to standardize the experimental procedures that identify dying and dead cells in cell cultures and/or in tissues, from model organisms and/or humans, in healthy and/or pathological scenarios. Thus far, dozens of methods have been proposed to quantify cell death-related parameters. However, no guidelines exist regarding their use and interpretation, and nobody has thoroughly annotated the experimental settings for which each of these techniques is most appropriate. Here, we provide a nonexhaustive comparison of methods to detect cell death with apoptotic or nonapoptotic morphologies, their advantages and pitfalls. These guidelines are intended for investigators who study cell death, as well as for reviewers who need to constructively critique scientific reports that deal with cellular demise. Given the difficulties in determining the exact number of cells that have passed the point-of-no-return of the signaling cascades leading to cell death, we emphasize the importance of performing multiple, methodologically unrelated assays to quantify dying and dead cells.
Abstract:
In the Andean highlands, indigenous environmental knowledge is currently undergoing major changes as a result of various external and internal factors. As in other parts of the world, an overall process of erosion of local knowledge can be observed. In response to this trend, some initiatives that adopt a biocultural approach aim at actively strengthening local identities and revalorizing indigenous environmental knowledge and practices, assuming that such practices can contribute to more sustainable management of biodiversity. However, these initiatives usually lack a sound research basis, as few studies have focused on the dynamics of indigenous environmental knowledge in the Andes and on its links with biodiversity management. Against this background, the general objective of this research project was to contribute to the understanding of the dynamics of indigenous environmental knowledge in the Andean highlands of Peru and Bolivia by investigating how local medicinal knowledge is socially differentiated within rural communities, how it is transformed, and which external and internal factors influence these transformation processes. The project adopted an actor-oriented perspective and emphasized the concept of knowledge dialogue by analyzing the integration of traditional and formal medicinal systems within family therapeutic strategies. It also aimed at grasping some of the links between the dynamics of medicinal knowledge and the types of land use systems and biodiversity management. Research was conducted in two case study areas of the Andes, both Quechua-speaking and situated in comparable agro-ecological production belts - Pitumarca District, Department of Cusco (Southern Peruvian Highlands) and the Tunari National Park, Department of Cochabamba (Bolivian inner-Andean valleys). 
In each case study area, we assessed the land use systems and strategies of 18 families from two rural communities, their environmental knowledge related to medicine and the local therapeutic flora, and the dynamics of this knowledge. Data were collected through a combination of disciplinary and participatory action-research methods and were mostly analyzed using qualitative methods, though some quantitative ethnobotanical methods were also used. In both case studies, traditional medicine still constitutes the preferred option for the families interviewed, independently of their age, education level, economic status, religion, or migration status. Surprisingly, and contrary to general assertions among local NGOs and researchers, the results show that there is a revival of Andean medicine within the younger generation, who have greater knowledge of medicinal plants than the previous generation, value this knowledge as an important element of their way of life and relationship with “Mother Earth” (Pachamama), and, at least in the Bolivian case, prefer to consult the traditional healer rather than go to the health post. Migration to urban centres and the Amazon lowlands, commonly thought to be an important factor in the loss of local medicinal knowledge, only affects people’s knowledge in the case of families who migrate for over half of the year or permanently. Migration does not influence the knowledge of medicinal plants or the therapeutic strategies of families who migrate temporarily for shorter periods of time. Finally, economic status influences neither the status of people’s medicinal knowledge nor families’ therapeutic strategies, even though the financial factor is often mentioned by practitioners and local people as the main reason for not using the formal health system. The influence of the formal health system on traditional medicinal knowledge varies between the two case study areas.
In the Bolivian case, where it was only introduced in the 1990s and access to it is still very limited, the main impact was to give local communities access to contraceptive methods and to vaccination. In the Peruvian case, the formal system had a much greater impact on families’ health practices, due to local and national policies that, for instance, practically prohibit some traditional practices such as home birth. But in both cases, biomedicine is not considered capable of responding to cultural illnesses such as “fear” (susto), “bad air” (malviento), or “anger” (colerina). As a consequence, Andean farmers integrate the traditional medicinal system and the formal one within their multiple therapeutic strategies, reflecting an inter-ontological dialogue between different conceptions of health and illness. These findings reflect a more general trend in the Andes, where indigenous communities are currently actively revalorizing their knowledge and taking up traditional practices, thus strengthening their indigenous collective identities in a process of cultural resistance.
Abstract:
Forests near the Mediterranean coast have been shaped by millennia of human disturbance. Consequently, ecological studies relying on modern observations or historical records may have difficulty assessing natural vegetation dynamics under current and future climate. We combined a sedimentary pollen record from Lago di Massaciuccoli, Tuscany, Italy, with simulations from the LandClim dynamic vegetation model to determine what vegetation preceded intense human disturbance, how past changes in vegetation relate to fire and browsing, and the potential of an extinct vegetation type under present climate. We simulated vegetation dynamics near Lago di Massaciuccoli for the last 7,000 years using a local chironomid-inferred temperature reconstruction with combinations of three fire regimes (small infrequent, large infrequent, small frequent) and three browsing intensities (no browsing, light browsing, and moderate browsing), and compared model output to pollen data. Simulations with low disturbance support pollen-inferred evidence for a mixed forest dominated by Quercus ilex (a Mediterranean species) and Abies alba (a montane species). Whereas pollen data record the collapse of A. alba after 6,000 cal yr BP, simulated populations expanded with declining summer temperatures during the late Holocene. Simulations with increased fire and browsing are consistent with evidence for expansion by deciduous species after A. alba collapsed. According to our combined paleo-environmental and modeling evidence, mixed Q. ilex and A. alba forests remain possible under current climate and limited disturbance, and provide a viable management objective for ecosystems near the Mediterranean coast and in regions expected to experience a mediterranean-type climate in the future.
Abstract:
Intensification of land use in semi-natural hay meadows has resulted in a decrease in species diversity. This is often thought to be caused by the reduced establishment of plant species due to high competition for light under conditions of increased productivity. Sowing experiments in grasslands have found reliable evidence that diversity can also be constrained by seed availability, implying that processes influencing the production and persistence of seeds may be important for the functioning of ecosystems. So far, the effects of land-use intensification on the seed rain and the persistence of seeds in the soil have been unclear. We selected six pairs of extensively managed (Festuco-Brometea) and intensively managed (Arrhenatheretalia) grasslands with traditional late cutting regimes across Switzerland, covering an annual productivity gradient of 176–1211 g m⁻². In each grassland community, we estimated seed rain and seed bank using eight pooled seed-trap or topsoil samples of 89 cm² in each of six plots representing an area of c. 150 m². The seed traps were established in spring 2010 and collected simultaneously with soil cores after an exposure of c. three months. We applied the emergence method in a cold frame over eight months to estimate the density of viable seeds. With community productivity reflecting land-use intensification, density and species richness in the seed rain increased, while mean seed size diminished and the proportions of persistent seeds and of species with persistent seeds in the topsoil declined. Stronger seed limitation in extensively managed semi-natural grasslands can explain why such grasslands are not always richer in species than more intensively managed ones.
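The sampling design above (eight pooled samples of 89 cm² per plot, emerged seedlings counted after a cold-frame period) implies a simple conversion from emergence counts to seed density. A minimal sketch of that arithmetic; the function name and the example count are ours, not the authors':

```python
# Hypothetical sketch: scale a pooled seedling-emergence count to a
# per-square-metre seed density, following the sampling design described
# above: eight pooled samples of 89 cm^2 each per plot.

SAMPLES_PER_PLOT = 8
SAMPLE_AREA_CM2 = 89.0

def seeds_per_m2(emerged_count: int) -> float:
    """Density = emerged count / pooled sampled area (converted from cm^2 to m^2)."""
    pooled_area_m2 = SAMPLES_PER_PLOT * SAMPLE_AREA_CM2 / 10_000.0
    return emerged_count / pooled_area_m2

# 178 seedlings from one plot's pooled samples (0.0712 m^2 sampled)
print(round(seeds_per_m2(178), 6))  # 2500.0 viable seeds per m^2
```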
Abstract:
Changes in agricultural practices in semi-natural mountain grasslands are expected to modify plant community structure and shift dominance patterns. Using vegetation surveys of 11 sites in semi-natural grasslands of the Swiss Jura and the Swiss and French Alps, we determined the relative contribution of dominant, subordinate and transient plant species in grazed and abandoned communities and observed their changes along a productivity gradient and in response to the abandonment of pasturing. The results confirm the humpbacked diversity–productivity relationship in semi-natural grassland, which is driven by an increase in the number of subordinate species at intermediate productivity levels. Grazed communities at the lower or higher end of the species diversity gradient suffered greater species loss after grazing abandonment. Species loss after the abandonment of pasturing was mainly due to a stronger reduction in the number of subordinate species, as a consequence of the increasing proportion of dominant species. When the maintenance of plant biodiversity is the aim, our results have direct implications for the way grasslands should be managed. Indeed, while intensification and abandonment have accelerated in recent decades, our findings in this multi-site analysis confirm the importance of maintaining intermediate levels of pasturing to preserve biodiversity.
Abstract:
Semi-natural grasslands, biodiversity hotspots in Central Europe, suffer from the cessation of traditional land use. The magnitude and pace of these changes challenge current monitoring frameworks, which are typically based on classic indicators such as selected target species or diversity indices. Indicators based on plant functional traits provide an interesting extension, since they reflect ecological strategies at the individual level and ecological processes at the community level. They typically show convergent responses to gradients of land-use intensity across scales and regions, are more directly related to environmental drivers than diversity components themselves, and enable the detection of directional changes in whole-community dynamics. However, probably because of their labor- and cost-intensive assessment in the field, they have rarely been applied as indicators so far. Here we suggest overcoming these limitations by calculating indicators from plant traits derived from online accessible databases. Aiming to provide a minimal trait set for monitoring the effects of land-use intensification on plant diversity, we investigated relationships between 12 community mean traits, 2 diversity indices and 6 predictors of land-use intensity within grassland communities of 3 regions in Germany (part of the German ‘Biodiversity Exploratories’ research network). By standardizing traits and diversity measures and using null models and linear mixed models, we confirmed (i) strong links between functional community composition and plant diversity, (ii) that traits are closely related to land-use intensity, and (iii) that functional indicators are equally or even more sensitive to land-use intensity than traditional diversity indices. The deduced trait set consisted of 5 traits: specific leaf area (SLA), leaf dry matter content (LDMC), seed release height, leaf distribution, and onset of flowering. These database-derived traits enable the early detection of changes in community structure indicative of future diversity loss. As an addition to current monitoring measures, they allow environmental drivers to be linked more directly to the processes controlling community dynamics.
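The "community mean traits" described above are conventionally computed as community-weighted means (CWM): each species' trait value, weighted by its relative abundance in the plot. A minimal sketch under that assumption; species names, cover values and SLA values are invented, and real analyses would pull trait values from the databases the authors mention:

```python
# Minimal sketch of a community-weighted mean (CWM) trait indicator,
# here for specific leaf area (SLA). All input values are hypothetical.

def community_weighted_mean(abundances, trait_values):
    """CWM = sum over species of (relative abundance x trait value)."""
    species = [s for s in abundances if s in trait_values]  # skip species lacking trait data
    total = sum(abundances[s] for s in species)
    return sum(abundances[s] / total * trait_values[s] for s in species)

# Hypothetical plot: percent cover, and SLA (mm^2/mg) from a trait database
cover = {"Arrhenatherum elatius": 40, "Trifolium pratense": 25, "Plantago lanceolata": 35}
sla_db = {"Arrhenatherum elatius": 22.0, "Trifolium pratense": 28.0, "Plantago lanceolata": 18.0}

print(round(community_weighted_mean(cover, sla_db), 2))  # 22.1
```

Species missing from the trait database are simply dropped before reweighting, which is a common (if debatable) practical choice when trait coverage is incomplete.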
Abstract:
There is increasing evidence that species can evolve rapidly in response to environmental change. However, although land use is one of the key drivers of current environmental change, studies of its evolutionary consequences are still fairly scarce, in particular studies that examine land-use effects across large numbers of populations, and discriminate between different aspects of land use. Here, we investigated genetic differentiation in relation to land use in the annual grass Bromus hordeaceus. A common garden study with offspring from 51 populations from three regions and a broad range of land-use types and intensities showed that there was indeed systematic population differentiation of ecologically important plant traits in relation to land use, in particular due to increasing mowing and grazing intensities. We also found strong land-use-related genetic differentiation in plant phenology, where the onset of flowering consistently shifted away from the typical time of management. In addition, increased grazing intensity significantly increased the genetic variability within populations. Our study suggests that land use can cause considerable genetic differentiation among plant populations, and that the timing of land use may select for phenological escape strategies, particularly in monocarpic plant species.
Abstract:
Understanding the factors driving the ecology of N-cycling microbial communities is of central importance for sustainable land use. In this study we report changes in the abundance of denitrifiers, nitrifiers and nitrogen-fixing microorganisms (based on qPCR data for selected functional genes) in response to different levels of land use intensity, and the consequences for potential turnover rates. We investigated grassland sites comparable with respect to soil type and climatic conditions that had been continuously managed for many years as intensively used meadows (IM), intensively used mown pastures (IP) or extensively used pastures (EP). The data were linked to aboveground biodiversity patterns as well as to water-extractable fractions of nitrogen and carbon in soil. Shifts in land use intensity changed plant community composition from systems dominated by S-strategists in extensively managed grasslands to communities dominated by C-strategists in intensively managed grasslands. Across the different levels of land use intensity, the availability of inorganic nitrogen regulated the abundance of bacterial and archaeal ammonia oxidizers. In contrast, the amount of dissolved organic nitrogen determined the abundance of denitrifiers (nirS and nirK). The high abundance of nifH-carrying bacteria at intensively managed sites suggests that the supply of substrates as an energy source outweighs the high availability of inorganic nitrogen at these sites. Overall, we show that the abundance and function of microorganisms involved in key processes of inorganic N cycling (nitrification, denitrification and N fixation) may be independently regulated by different abiotic and biotic factors in response to land use intensity.
Abstract:
Although temporal heterogeneity is a well-accepted driver of biodiversity, the effects of interannual variation in land-use intensity (LUI) have not yet been addressed. Additionally, responses to land use can differ greatly among organisms; the overall effects of land use on total local biodiversity are therefore poorly known. To test for effects of LUI (quantified as the combined intensity of fertilization, grazing, and mowing) and of interannual variation in LUI (the SD in LUI across time), we introduce a unique measure of whole-ecosystem biodiversity, multidiversity. This synthesizes individual diversity measures across up to 49 taxonomic groups of plants, animals, fungi, and bacteria from 150 grasslands. Multidiversity declined with increasing LUI among grasslands, particularly for rarer species and aboveground organisms, whereas common species and belowground groups were less sensitive. However, a high level of interannual variation in LUI increased overall multidiversity at low LUI and was even more beneficial for rarer species, because it slowed the rate at which their multidiversity declined with increasing LUI. In more intensively managed grasslands, the diversity of rarer species was, on average, 18% of the maximum diversity across all grasslands when LUI was static over time, but increased to 31% of the maximum when LUI changed maximally over time. In addition to decreasing overall LUI, we suggest varying LUI across years as a complementary strategy to promote biodiversity conservation.
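The two quantities above can be sketched in a few lines. The abstract does not give exact formulas, so this assumes one plausible reading: LUI sums fertilization, grazing and mowing intensities after standardizing each by its mean across sites; interannual variation is the SD of LUI across years; and multidiversity averages per-group diversities, each scaled to its maximum across sites. All numbers are invented:

```python
# Hedged sketch of LUI, its interannual variation, and multidiversity.
# Formulas are assumed readings of the text, not the authors' definitions.

from statistics import mean, pstdev

def lui(fert, graze, mow, site_means):
    """Combined land-use intensity for one site-year (components standardized by across-site means)."""
    return (fert / site_means["fert"]
            + graze / site_means["graze"]
            + mow / site_means["mow"])

def interannual_variation(lui_by_year):
    """SD of LUI across years: the temporal-heterogeneity measure."""
    return pstdev(lui_by_year)

def multidiversity(diversity_by_group, max_by_group):
    """Mean of per-group diversities, each scaled to its maximum across sites."""
    return mean(diversity_by_group[g] / max_by_group[g] for g in diversity_by_group)

means = {"fert": 30.0, "graze": 100.0, "mow": 1.5}  # hypothetical across-site means
print(lui(60.0, 50.0, 1.5, means))                  # 2.0 + 0.5 + 1.0 = 3.5
print(multidiversity({"plants": 30, "fungi": 10}, {"plants": 60, "fungi": 40}))  # 0.375
```

Scaling each group to its own maximum before averaging keeps species-rich groups (e.g. bacteria) from swamping species-poor ones, which is the point of aggregating 49 taxonomic groups into one number.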
Abstract:
BACKGROUND Mechanical autotransfusion systems for washed shed blood (WSB) were introduced to reduce the need for postoperative allogeneic blood transfusions (ABTs). Although some authors have postulated decreased requirements for ABT with autologous retransfusion devices, other trials, mostly evaluating retransfusion devices for unwashed shed blood (USB), found little or no benefit in reducing the need for postoperative ABT. Because of these contradictory findings, it is still unclear whether autologous retransfusion systems for WSB can reduce transfusion requirements. QUESTIONS/PURPOSES We therefore asked whether one such autologous transfusion system for WSB can reduce the requirements for postoperative ABT. METHODS In a prospective, randomized, controlled study, we enrolled 151 patients undergoing total knee arthroplasty (TKA). In Group A (n=76 patients), the autotransfusion system was used for a total of 6 hours (intraoperatively and postoperatively) and the WSB was retransfused after processing. In Control Group B (n=75 patients), a regular drain without suction was used. We used signs of anemia and/or a hemoglobin value less than 8 g/dL as indications for transfusion. If necessary, we administered one or two units of allogeneic blood. RESULTS Twenty-three patients (33%) in Group A, who received an average of 283 mL (range, 160-406 mL) of salvaged blood, needed a mean of 2.1 units of allogeneic blood, compared with 23 patients (33%) in Control Group B who needed a mean of 2.1 units of allogeneic blood. CONCLUSIONS We found that the use of an autotransfusion system did not reduce the rate of postoperative ABTs. LEVEL OF EVIDENCE Level II, therapeutic study. See the Guidelines for Authors for a complete description of levels of evidence.
Abstract:
Atrial fibrillation (AF) is associated with an increased risk of thromboembolism, and is the most prevalent factor for cardioembolic stroke. Vitamin K antagonists (VKAs) have been the standard of care for stroke prevention in patients with AF since the early 1990s. They are very effective for the prevention of cardioembolic stroke, but are limited by factors such as drug-drug interactions, food interactions, slow onset and offset of action, haemorrhage and need for routine anticoagulation monitoring to maintain a therapeutic international normalised ratio (INR). Multiple new oral anticoagulants have been developed as potential replacements for VKAs for stroke prevention in AF. Most are small synthetic molecules that target thrombin (e.g. dabigatran etexilate) or factor Xa (e.g. rivaroxaban, apixaban, edoxaban, betrixaban, YM150). These drugs have predictable pharmacokinetics that allow fixed dosing without routine laboratory monitoring. Dabigatran etexilate, the first of these new oral anticoagulants to be approved by the United States Food and Drug Administration and the European Medicines Agency for stroke prevention in patients with non-valvular AF, represents an effective and safe alternative to VKAs. Under the auspices of the Regional Anticoagulation Working Group, a multidisciplinary group of experts in thrombosis and haemostasis from Central and Eastern Europe, an expert panel with expertise in AF convened to discuss practical, clinically important issues related to the long-term use of dabigatran for stroke prevention in non-valvular AF. The practical information reviewed in this article will help clinicians make appropriate use of this new therapeutic option in daily clinical practice.
Abstract:
In the 1980s, leukaemia clusters were discovered around the nuclear fuel reprocessing plants at Sellafield and Dounreay in the United Kingdom. This raised public concern about the risk of childhood leukaemia near nuclear power plants (NPPs). Since then, the topic has been well studied, but methodological limitations make the results difficult to interpret. Our review aims to: (1) summarise current evidence on the relationship between NPPs and the risk of childhood leukaemia, with a focus on the Swiss CANUPIS (Childhood cancer and nuclear power plants in Switzerland) study; (2) discuss the limitations of previous research; and (3) suggest directions for future research. There are various reasons why previous studies produced inconclusive results. These include: inadequate study designs and limited statistical power due to the low prevalence of exposure (living near an NPP) and outcome (leukaemia); lack of accurate exposure estimates; limited knowledge of the aetiology of childhood leukaemia, particularly of vulnerable time windows and latent periods; use of residential location at the time of diagnosis only and lack of data on address histories; and inability to adjust for potential confounders. We conclude that the risk of childhood leukaemia around NPPs should continue to be monitored and that study designs should be improved and standardised. Data should be pooled internationally to increase statistical power. More research is needed on other putative risk factors for childhood cancer, such as low-dose ionising radiation, exposure to certain chemicals and exposure to infections. Studies should be designed to allow the examination of multiple exposures.
Abstract:
We hypothesized that biodiversity improves ecosystem functioning and services such as nutrient cycling because of increased complementarity. We examined N canopy budgets of 27 Central European forests of varying dominant tree species, stand density, and tree and shrub species diversity (Shannon index) in three study regions by quantifying bulk and fine particulate dry deposition and dissolved below-canopy N fluxes. Average regional canopy N retention ranged from 16% to 51%, owing to differences in the N status of the ecosystems. Canopy N budgets of coniferous forests differed from those of deciduous forests, which we attribute to differences in biogeochemical N cycling, tree functional traits and canopy surface area. The canopy N budgets were related to the Shannon index, which explained 14% of their variance, suggesting complementary aboveground N use by trees and diverse understorey vegetation. The relationship between plant diversity and canopy N retention varied among regional site conditions and forest types. Our results suggest that the traditional view of belowground complementarity of nutrient uptake by roots in diverse plant communities can be transferred to foliar uptake in forest canopies.
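Two quantities above have standard formulas worth making explicit: the Shannon diversity index H′ = −Σ pᵢ ln pᵢ over species relative abundances, and canopy N retention, which we take here as the fraction of deposited N not recovered in the below-canopy flux. A minimal sketch; the abundance and flux values are invented, and the retention definition is an assumption consistent with, but not stated in, the text:

```python
# Shannon diversity index and a plausible canopy N retention fraction.
# Input values are hypothetical illustrations.

import math

def shannon_index(abundances):
    """H' = -sum(p_i * ln p_i) over species relative abundances."""
    total = sum(abundances)
    props = [a / total for a in abundances if a > 0]
    return -sum(p * math.log(p) for p in props)

def canopy_n_retention(total_deposition, below_canopy_flux):
    """Fraction of deposited N retained in the canopy (deposition minus throughfall flux)."""
    return (total_deposition - below_canopy_flux) / total_deposition

# Four equally abundant tree species -> H' = ln(4)
print(round(shannon_index([5, 5, 5, 5]), 3))  # 1.386
# 20 kg N/ha/yr deposited, 12 kg N/ha/yr reaching the forest floor -> 40% retained
print(canopy_n_retention(20.0, 12.0))         # 0.4
```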