60 results for Ecology, conservation, and management of grouse
Abstract:
Wildfires are very rare in central Europe, which is probably why fire effects on vegetation have been neglected by most central European ecologists and palaeoecologists. Presently, reconstructions of fire history and fire ecology are almost absent. We analysed sediment cores from lakes on the Swiss Plateau (Lobsigensee and Soppensee) for pollen and charcoal to investigate the relationship between vegetation and fire. Microscopic charcoal evidence suggests increasing regional fire frequencies during the Neolithic (7350-4150 cal. BP, 5400-2200 BC) and the subsequent prehistoric epochs at Lobsigensee, whereas at Soppensee burnings remained rather rare until modern times. Neolithic peaks of charcoal at 6200 and 5500 cal. BP (4250 and 3550 BC) coincided with declines of pollen of fire-sensitive taxa at both sites (e.g., Ulmus, Tilia, Hedera, Fagus), suggesting synchronous vegetational responses to fire at regional scales. However, correlation analysis between charcoal and pollen for the period 6600-4400 cal. BP (4650-2650 BC) revealed no significant link between fire and vegetation at Soppensee, whereas at Lobsigensee increases of Corylus and decreases of Fagus were related to fire events. Fire impact on vegetation increased during the subsequent epochs at both sites. Correlation analyses of charcoal and pollen data for the period 4250-1150 cal. BP (2300 BC-AD 800) suggest that fires were intentionally set to disrupt forests and to provide open areas for arable and pastoral farming (e.g., significant positive correlations between charcoal and Cerealia, Plantago lanceolata, Asteroideae). These results are compared with southern European records (Lago di Origlio, Lago di Muzzano), which are situated in particularly fire-prone environments. After the Mesolithic period (11 200-7350 cal. BP, 9250-5400 BC), charcoal influx was higher by an order of magnitude in the south, suggesting more frequent fires. Neolithic fires caused similar though more pronounced responses of vegetation in the south (e.g., expansions of Corylus). Post-Neolithic land-use practices involving (controlled) burning culminated in both regions at about 2550 cal. BP (c. 600 BC). However, fire-caused disappearances of entire forest communities were confined to the southern sites. Such differences in fire effects among the sites are explained by the dissimilar importance of fire as a result of different climatic conditions and cultural activities. Our results imply that the remaining (fire-sensitive) fragments of central European vegetation north of the Alps are especially endangered by increasing fire frequencies resulting from predicted climatic change.
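The fire-vegetation link in this abstract rests on correlating charcoal influx with pollen curves over defined time windows. The sketch below illustrates that kind of analysis in Python; the column names, the choice of Spearman rank correlation and the sample values are illustrative assumptions, not the authors' data or code.

```python
# Illustrative sketch: rank-correlate charcoal influx with pollen values of
# selected taxa within a chosen age window (ages in cal. BP).
import pandas as pd
from scipy.stats import spearmanr

def charcoal_pollen_correlations(df, taxa, t_min, t_max,
                                 age_col="age_cal_bp", charcoal_col="charcoal_influx"):
    """Spearman correlation of charcoal influx vs. each taxon within [t_min, t_max]."""
    window = df[(df[age_col] >= t_min) & (df[age_col] <= t_max)]
    rows = {}
    for taxon in taxa:
        rho, p = spearmanr(window[charcoal_col], window[taxon])
        rows[taxon] = {"rho": rho, "p_value": p}
    return pd.DataFrame(rows).T

# Hypothetical sample data (pollen as percentages)
data = pd.DataFrame({
    "age_cal_bp":      [6600, 6200, 5800, 5500, 5100, 4700, 4400],
    "charcoal_influx": [12, 85, 30, 90, 25, 40, 18],
    "Corylus":         [18, 27, 20, 31, 22, 24, 19],
    "Fagus":           [25, 14, 22, 11, 21, 19, 24],
})
print(charcoal_pollen_correlations(data, ["Corylus", "Fagus"], 4400, 6600))
```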
Abstract:
AIMS We aimed to assess the prevalence and management of clinical familial hypercholesterolaemia (FH) among patients with acute coronary syndrome (ACS). METHODS AND RESULTS We studied 4778 patients with ACS from a multi-centre cohort study in Switzerland. Based on personal and familial history of premature cardiovascular disease and LDL-cholesterol levels, two validated algorithms for diagnosis of clinical FH were used: the Dutch Lipid Clinic Network algorithm to assess possible (score 3-5 points) or probable/definite FH (>5 points), and the Simon Broome Register algorithm to assess possible FH. At the time of hospitalization for ACS, 1.6% had probable/definite FH [95% confidence interval (CI) 1.3-2.0%, n = 78] and 17.8% had possible FH (95% CI 16.8-18.9%, n = 852) according to the Dutch Lipid Clinic algorithm. The Simon Broome algorithm identified 5.4% (95% CI 4.8-6.1%, n = 259) of patients as having possible FH. Among 1451 young patients with premature ACS, the Dutch Lipid Clinic algorithm identified 70 (4.8%, 95% CI 3.8-6.1%) patients with probable/definite FH, and 684 (47.1%, 95% CI 44.6-49.7%) patients had possible FH. Excluding patients with secondary causes of dyslipidaemia such as alcohol consumption, acute renal failure, or hyperglycaemia did not change the prevalence. One year after ACS, among 69 survivors with probable/definite FH and available follow-up information, 64.7% were using high-dose statins, 69.0% had decreased LDL-cholesterol by at least 50%, and 4.6% had LDL-cholesterol ≤1.8 mmol/L. CONCLUSION A phenotypic diagnosis of possible FH is common in patients hospitalized with ACS, particularly among those with premature ACS. Optimizing long-term lipid treatment of patients with FH after ACS is required.
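As a rough illustration of the score-to-category mapping described for the Dutch Lipid Clinic Network algorithm (3-5 points = possible FH, >5 points = probable/definite FH), the following minimal sketch encodes only those thresholds; the scoring of the individual clinical criteria is not reproduced and the function name is hypothetical.

```python
def classify_dlcn(total_score: int) -> str:
    """Map a DLCN total score to the FH categories used in the study."""
    if total_score > 5:
        return "probable/definite FH"
    if 3 <= total_score <= 5:
        return "possible FH"
    return "unlikely FH"

# Example usage with arbitrary scores
for score in (2, 4, 7):
    print(score, "->", classify_dlcn(score))
```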
Abstract:
The authors present the case of an 81-year-old patient with severe aortic stenosis who experienced left ventricular embolization of an aortic bioprosthesis during transapical aortic valve implantation. The authors discuss the reasons for prosthesis embolization and underscore the importance of attention to technical details and of the widespread use of multimodality imaging techniques. In this context, transesophageal echocardiography appears indispensable for the detection and management of procedure-related complications.
Abstract:
Solid organ transplant recipients (SOTR) have an increased risk of skin cancer due to their long-term immunosuppressive state. As both the number of these patients and their life expectancy are increasing, it is important to discuss the screening and management of skin cancer in this group of patients. The role of the dermatologist, in collaboration with the transplant team, is important both before transplantation, when patients are screened for skin lesions and the individual risk of skin cancer development is assessed, and after transplantation. Posttransplant management consists of regular dermatological consultations (the frequency depends on different factors discussed below), during which skin cancer is screened for and managed early and patient education on sun-protective behavior is provided and reinforced. Indeed, SOTR are very sensitive to sun exposure due to their immunosuppressive state; the resulting cumulative sun damage leads to field cancerization with numerous lesions such as in situ squamous cell carcinoma, actinic keratosis and Bowen's disease. These lesions should be recognized and treated as early as possible. Therapeutic options discussed include topical therapy, surgical management, adjustment of the patient's immunosuppressive therapy (i.e. reduction of immunosuppression and/or switch to mammalian target of rapamycin inhibitors) and chemoprevention with the retinoid acitretin, which reduces the recurrence rate of squamous cell carcinoma. The dermatological follow-up of SOTR should be integrated into comprehensive posttransplant care.
Abstract:
BACKGROUND Psoriatic arthritis (PsA) and co-morbidities of psoriasis represent a significant clinical and economic burden for patients with moderate-to-severe psoriasis. Often these co-morbidities may go unrecognized or undertreated. While published data are available on the incidence and impact of some of them, practical guidance for dermatologists on detection and management of these co-morbidities is lacking. OBJECTIVE To prepare expert recommendations to improve the detection and management of common co-morbidities in patients with moderate-to-severe psoriasis. METHODS A systematic literature review was conducted on some common co-morbidities of psoriasis, namely cardiovascular (CV) diseases (including obesity, hypertension, hyperglycaemia and dyslipidaemia), psychological co-morbidities (including depression, alcohol abuse and smoking) and PsA, to establish the incidence and impact of each. Data gaps were identified and a Delphi survey was carried out to obtain consensus on the detection and management of each co-morbidity. The expert panel members for the Delphi survey comprised 10 dermatologists with substantial clinical expertise in managing moderate-to-severe psoriasis patients, as well as a cardiologist and a psychologist (see appendix) with an interest in dermatology. Agreement was rated on a Likert scale of 1-7. Consensus regarding agreement for each statement was defined as ≥75% of respondents scoring either 1 (strongly agree) or 2 (agree). RESULTS The expert panel members addressed several topics including screening, intervention, monitoring frequency, and the effects of anti-psoriatic treatment on each co-morbidity. Consensus was achieved on 12 statements out of 22 (3 relating to PsA, 4 relating to psychological factors, 5 relating to CV factors). The panel members felt that dermatologists have an important role in screening their psoriasis patients for PsA and in assessing them for psychological and CV co-morbidities. In most cases, however, patients should be referred for specialist management if other co-morbidities are detected. CONCLUSION This article provides useful and practical guidance for the detection and management of common co-morbidities in patients with moderate-to-severe psoriasis.
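To make the stated consensus rule concrete (a 1-7 Likert scale, with consensus defined as ≥75% of respondents scoring 1 or 2), here is a minimal sketch; the statement labels and panel scores are hypothetical examples, not the survey data.

```python
def has_consensus(scores, threshold=0.75):
    """True if at least `threshold` of panel scores are 1 (strongly agree) or 2 (agree)."""
    agree = sum(1 for s in scores if s in (1, 2))
    return agree / len(scores) >= threshold

# Hypothetical panel responses on a 1-7 Likert scale
panel_scores = {
    "dermatologists should screen psoriasis patients for PsA": [1, 2, 1, 2, 2, 1, 2, 1, 3, 2],  # 9/10 agree
    "dermatologists should initiate CV drug treatment":        [2, 3, 4, 1, 2, 5, 2, 3, 1, 4],  # 5/10 agree
}
for statement, scores in panel_scores.items():
    print(statement, "->", "consensus" if has_consensus(scores) else "no consensus")
```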
Abstract:
OBJECTIVE Hunger strikers resuming nutritional intake may develop a life-threatening refeeding syndrome (RFS). Consequently, hunger strikers represent a core challenge for the medical staff. The objective of the study was to test the effectiveness and safety of evidence-based recommendations for prevention and management of RFS during the refeeding phase. METHODS This was a retrospective, observational data analysis of 37 consecutive, unselected cases of prisoners on a hunger strike during a 5-y period. The sample consisted of 37 cases representing 33 individual patients. RESULTS In seven cases (18.9%), the hunger strike was continued during the hospital stay; in 16 episodes (43.2%), cessation of the hunger strike occurred immediately after admission to the security ward; and in 14 episodes (37.9%), cessation occurred during the hospital stay. In the refed cases (n = 30), nutritional replenishment occurred orally, and in 25 (83.3%) micronutrient substitutions were made based on the recommendations. Gradual refeeding with fluid restriction occurred over 10 days. Uncomplicated dyselectrolytemia was documented in 12 cases (40%) within the refeeding phase. One case (3.3%) presented with bilateral ankle edema as a clinical manifestation of moderate RFS. Intensive medical treatment was not necessary and none of the patients died. Seven episodes of continued hunger strike were observed during the entire hospital stay without medical complications. CONCLUSIONS Our data suggested that the seriousness and rate of medical complications during the refeeding phase can be kept to a minimum in a hunger strike population. This study supported the use of the recommendations to optimize risk management and to improve treatment quality and patient safety in this vulnerable population.
Abstract:
Currently no pharmacogenomics-based criteria exist to guide clinicians in identifying individuals who are at risk of hearing loss from cisplatin-based chemotherapy. This review summarizes findings from pharmacogenomic studies that report genetic polymorphisms associated with cisplatin-induced hearing loss and aims to (1) provide up-to-date information on new developments in the field; (2) provide recommendations for the use of pharmacogenetic testing in the prevention, assessment and management of cisplatin-induced hearing loss in children and adults; and (3) identify knowledge gaps to direct and prioritize future research. These practice recommendations for pharmacogenetic testing in the context of cisplatin-induced hearing loss reflect a review and evaluation of recent literature and are designed to assist clinicians in providing optimal clinical care for patients receiving cisplatin-based chemotherapy.
Abstract:
Despite improvements in prevention and management of colorectal cancer (CRC), uncontrolled tumor growth with metastatic spread to distant organs remains an important clinical concern. Genetic deletion of CD39, the dominant vascular and immune cell ectonucleotidase, has been shown to delay tumor growth and blunt angiogenesis in mouse models of melanoma, lung and colonic malignancy. Here, we tested the influence of CD39 on CRC tumor progression and metastasis by investigating orthotopic transplanted and metastatic cancer models in wild-type BALB/c, human CD39 transgenic and CD39-deficient mice. We also investigated CD39 and P2 receptor expression patterns in human CRC biopsies. Murine CD39 was expressed by endothelium, stromal and mononuclear cells infiltrating the experimental MC-26 tumors. In the primary CRC model, volumes of tumors in the subserosa of the colon and/or rectum did not differ amongst the treatment groups at day 10, although these tumors rarely metastasized to the liver. In the dissemination model, MC-26 cell line-derived hepatic metastases grew significantly faster in CD39 over-expressing transgenics when compared with CD39-deficient mice. Murine P2Y2 was significantly elevated at both mRNA and protein levels within the larger liver metastases obtained from CD39 transgenic mice, where changes in P2X7 levels were also noted. In clinical samples, lower levels of CD39 mRNA in malignant CRC tissues appeared to be associated with longer survival and could be linked to less invasive tumors. The modulatory effects of CD39 on tumor dissemination and the differential levels of CD39, P2Y2 and P2X7 expression in tumors suggest the involvement of purinergic signalling in these processes. Our studies also suggest potential roles for purinergic-based therapies in clinical CRC.
Abstract:
Purpose The purpose of this study is to explore the periodic patterns of events and deaths related to cardiovascular disease (CVD), acute myocardial infarction (AMI) and stroke in Swiss adults (≥18 years). Methods Mortality data for the period 1969–2007 (N=869,863 CVD events) and hospitalization data for the period 1997–2008 (N=959,990 CVD events) were used. The annual, weekly and circadian distributions of CVD-related deaths and events were assessed. Multivariate analysis was conducted using multinomial logistic regression, adjusting for age, gender and calendar year and considering deaths from respiratory diseases, accidents or other causes as competing events. Results CVD deaths and hospitalizations occurred less frequently in the summer months. Similar patterns were found for AMI and stroke. No significant weekly variation for CVD deaths was found. Stratification by age and gender showed that subjects aged <65 years had a higher probability of dying on Mondays and Saturdays, but only among men. This finding was confirmed after multivariate adjustment. Finally, a circadian variation in CVD mortality was observed, with a first peak in the morning (8 a.m.–12 p.m.) and a smaller second peak in the late afternoon (2–6 p.m.). This pattern persisted after multivariate adjustment and was more pronounced for AMI than for stroke. Conclusion There is a periodicity of hospitalizations and deaths related to CVD, AMI and stroke in Switzerland. This pattern changes slightly according to the age and sex of the subjects. Although the underlying mechanisms are not fully identified, preventive measures should take these aspects into account to develop better strategies for the prevention and management of CVD.
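The multivariate analysis described (multinomial logistic regression of cause of death on weekday, adjusted for age, gender and calendar year, with non-cardiovascular causes as competing outcomes) could be set up roughly as sketched below; the synthetic data, variable names and coding are assumptions for illustration only, not the study dataset or code.

```python
# Sketch: multinomial logistic regression of cause of death on weekday,
# adjusted for age, sex and calendar year (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    # 0 = CVD, 1 = respiratory disease, 2 = accident, 3 = other cause
    "cause":   rng.integers(0, 4, n),
    "weekday": rng.integers(0, 7, n),   # 0 = Monday ... 6 = Sunday
    "age":     rng.integers(40, 95, n),
    "male":    rng.integers(0, 2, n),
    "year":    rng.integers(1969, 2008, n),
})

weekday = pd.get_dummies(df["weekday"], prefix="wd", drop_first=True, dtype=float)
X = sm.add_constant(pd.concat([df[["age", "male", "year"]].astype(float), weekday], axis=1))
fit = sm.MNLogit(df["cause"], X).fit(disp=False)
print(fit.summary())
```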
Abstract:
Mapping ecosystem services (ES) and their trade-offs is a key requirement for informed decision making in land use planning and the management of natural resources that aims to increase the sustainability of landscapes. Negotiating the purposes of landscapes and the services they should provide is difficult, as an increasing number of stakeholders are active at different levels, with a variety of interests present in one particular landscape. Traditionally, land cover data are the basis for mapping and spatial monitoring of ecosystem services. In light of complex landscapes, it is however questionable whether land cover per se, used as a spatial base unit, is suitable for monitoring and management at the meso-scale. Often the characteristics of a landscape are defined by the prevalence, composition and specific spatial and temporal patterns of different land cover types. The spatial delineation of shifting cultivation agriculture is a prominent example of a land use system, with its different land use intensities, that requires alternative methodologies going beyond the common remote sensing approach of pixel-based land cover analysis, owing to the spatial and temporal dynamics of rotating cultivated and fallow fields. Against this background, we advocate that adopting a landscape perspective on spatial planning and decision making offers new space for negotiation and collaboration, taking into account the needs of local resource users and of the global community. For this purpose we introduce landscape mosaics, defined as a new spatial unit describing generalized land use types. Landscape mosaics have allowed us to chart different land use systems and land use intensities and have permitted us to delineate changes in these land use systems based on changes of external claims on these landscapes. The underlying idea behind the landscape mosaics is to use land cover data typically derived from remote sensing and to analyse and classify spatial patterns of this land cover using a moving window approach. We developed the landscape mosaics approach in tropical, forest-dominated landscapes, particularly shifting cultivation areas, and present examples of our work from northern Laos, eastern Madagascar and Yunnan Province in China.
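A conceptual sketch of the moving-window idea behind landscape mosaics: for every cell of a land cover raster, tally the class composition within a square window and assign a generalized mosaic type from that local mixture. The class codes, window size and labelling rules below are illustrative assumptions and not the published classification.

```python
# Sketch: label each raster cell by the land cover mixture inside a moving window.
import numpy as np

FOREST, FALLOW, CROP = 0, 1, 2  # hypothetical land cover codes

def landscape_mosaic(land_cover: np.ndarray, window: int = 5) -> np.ndarray:
    """Assign a generalized mosaic type from the class shares in a window x window box."""
    pad = window // 2
    padded = np.pad(land_cover, pad, mode="edge")
    labels = np.empty(land_cover.shape, dtype=object)
    for i in range(land_cover.shape[0]):
        for j in range(land_cover.shape[1]):
            box = padded[i:i + window, j:j + window]
            shares = np.bincount(box.ravel(), minlength=3) / box.size
            if shares[FOREST] > 0.8:
                labels[i, j] = "forest-dominated"
            elif shares[CROP] + shares[FALLOW] > 0.5 and shares[FALLOW] > 0.2:
                labels[i, j] = "shifting cultivation mosaic"
            else:
                labels[i, j] = "mixed/other"
    return labels

# Example usage on a synthetic 20 x 20 land cover grid
rng = np.random.default_rng(1)
demo = rng.choice([FOREST, FALLOW, CROP], size=(20, 20), p=[0.6, 0.25, 0.15])
print(landscape_mosaic(demo)[:3, :3])
```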
Abstract:
Within the past 15 years, significant advances in the imaging of multiorgan and complex trauma, primarily due to the improvement of cross-sectional imaging, have optimized the expedient diagnosis and management of the polytrauma patient. At the forefront, multidetector computed tomography (MDCT) has become the cornerstone of modern emergency departments and trauma centers. In many institutions, MDCT is the de facto diagnostic tool upon trauma activation. In the setting of pelvic imaging, MDCT (with its high spatial resolution and sensitivity as well as short acquisition times) allows for rapid identification and assessment of pelvic hemorrhage, leading to faster triage and definitive management. In trauma centers throughout the world, angiography and minimally invasive catheter-based embolization techniques performed by interventional radiologists have become the standard of care for patients with acute pelvic trauma and related multiorgan hemorrhage. In an interdisciplinary setting, embolization may be performed either alone or as an adjunct procedure with open or closed reduction and stabilization techniques. A team-based approach involving multiple disciplines (e.g., radiology, traumatology, orthopedic surgery, intensive care medicine) is crucial to monitor and treat the actively bleeding patient appropriately.
Abstract:
A large body of empirical research shows that psychosocial risk factors (PSRFs) such as low socio-economic status, social isolation, stress, type-D personality, depression and anxiety increase the risk of incident coronary heart disease (CHD) and also contribute to poorer health-related quality of life (HRQoL) and prognosis in patients with established CHD. PSRFs may also act as barriers to lifestyle changes and treatment adherence and may moderate the effects of cardiac rehabilitation (CR). Furthermore, there appears to be a bidirectional interaction between PSRFs and the cardiovascular system. Stress, anxiety and depression affect the cardiovascular system through immune, neuroendocrine and behavioural pathways. In turn, CHD and its associated treatments may lead to distress in patients, including anxiety and depression. In clinical practice, PSRFs can be assessed with single-item screening questions, standardised questionnaires, or structured clinical interviews. Psychotherapy and medication can be considered to alleviate any PSRF-related symptoms and to enhance HRQoL, but the evidence for a definite beneficial effect on cardiac endpoints is inconclusive. A multimodal behavioural intervention, integrating counselling for PSRFs and coping with illness, should be included within comprehensive CR. Patients with clinically significant symptoms of distress should be referred for psychological counselling or psychologically focused interventions and/or psychopharmacological treatment. To conclude, the success of CR may critically depend on the interdependence of body and mind, and this interaction needs to be reflected in the assessment and management of PSRFs, in line with robust scientific evidence, by trained staff integrated within the core CR team.
Abstract:
Since European settlement, there has been a dramatic increase in the density, cover and distribution of woody plants in former grassland and open woodland. There is a widespread belief that shrub encroachment is synonymous with declines in ecosystem functions, and often it is associated with landscape degradation or desertification. Indeed, this decline in ecosystem functioning is considered to be driven largely by the presence of the shrubs themselves. This prevailing paradigm has been the basis for an extensive program of shrub removal, based on the view that it is necessary to reinstate the original open woodland or grassland structure from which shrublands are thought to have been derived. We review existing scientific evidence, particularly focussed on eastern Australia, to question the notion that shrub encroachment leads to declines in ecosystem functions. We then summarise this scientific evidence into two conceptual models aimed at optimising landscape management to maximise the services provided by shrub-encroached areas. The first model seeks to reconcile the apparent conflicts between the patch- and landscape-level effects of shrubs. The second model identifies the ecosystem services derived from different stages of shrub encroachment. We also examined six ecosystem services provided by shrublands (biodiversity, soil C, hydrology, nutrient provision, grass growth and soil fertility) by using published and unpublished data. We demonstrated the following: (1) shrub effects on ecosystems are strongly scale-, species- and environment-dependent and, therefore, no standardised management should be applied to every case; (2) overgrazing dampens the generally positive effect of shrubs, leading to the misleading relationship between encroachment and degradation; (3) woody encroachment per se does not hinder any of the functions or services described above, rather it enhances many of them; (4) no single shrub-encroachment state (including grasslands without shrubs) will maximise all services; rather, the provision of ecosystem goods and services by shrublands requires a mixture of different states; and (5) there has been little rigorous assessment of the long-term effectiveness of removal and no evidence that this improves land condition in most cases. Our review provides the basis for an improved, scientifically based understanding and management of shrublands, so as to balance the competing goals of providing functional habitats, maintaining soil processes and sustaining pastoral livelihoods.
Abstract:
Tef, Eragrostis tef (Zucc.) Trotter, is the most important cereal in Ethiopia. Tef is cultivated by more than five million small-scale farmers annually and constitutes the staple food for more than half of the population of 80 million. The crop is preferred by both farmers and consumers due to its beneficial traits associated with its agronomy and utilization. The genetic and phenotypic diversity of tef in Ethiopia is a national treasure of potentially global importance. In order for this diversity to be effectively conserved and utilized, a better understanding at the genomic level is necessary. In recent years, tef has become the subject of genomic research in Ethiopia and abroad. Genomic-assisted tef improvement holds tremendous potential for improving productivity, thereby benefiting the smallholder farmers who have cultivated and relied on the crop for thousands of years. It is hoped that such research endeavours will provide solutions to some of the age-old problems of tef husbandry. In this review, we provide a brief description of the genesis and progress of tef genomic research to date, suggest ways to utilize the genomic tools developed so far, discuss the potential of genomics to enable sustainable conservation and use of tef genetic diversity and suggest opportunities for future research.