Abstract:
This article draws on empirical material to reflect on what drives rapid change in flood risk management practice, reflecting wider interest in the way that scientific practices make risk landscapes and a specific focus on extreme events as drivers of rapid change. Such events are commonly described as a form of creative destruction: events that both reveal the composition of socioenvironmental assemblages and provide a creative opportunity to remake those assemblages in alternate ways, thereby rapidly changing policy and practice. Drawing on wider thinking in complexity theory, we argue that what happens between events might be as important as, if not more important than, the events themselves. We use two empirical examples concerned with flood risk management practice: a rapid shift in the dominant technologies used to map flood risk in the United Kingdom, and an experimental approach to public participation tested in two different locations with dramatically different consequences. Both show that the state of the socioenvironmental assemblage in which the events take place matters as much as the magnitude of the events themselves. The periods between rapid changes are not simply periods of discursive consolidation but involve the ongoing mutation of such assemblages, which can either sensitize or desensitize them to rapid change. Understanding these intervening periods matters as much as understanding the events themselves. If events matter, it is because of the ways in which they might bring into sharp focus the coding or framing of a socioenvironmental assemblage in policy or scientific practice, irrespective of whether those events evolve the assemblage in subtle or more radical ways.
Abstract:
Background: Providing support for research is one of the key issues in the ongoing attempts to improve Primary Care. However, when patient care takes up a significant part of a GP's time, conducting research is difficult. In this study we examine the working conditions and profile of GPs who publish in three leading medical journals and propose possible remedial policy actions. Findings: The authors of all articles published in 2006 and 2007 in three international Family Medicine journals - Annals of Family Medicine, Family Practice, and Journal of Family Practice - were contacted by e-mail. They were asked to complete a questionnaire investigating the following variables: availability of specific time for research, time devoted to research, number of patients attended, and university affiliation. Only GPs were included in the study. Three hundred and ten relevant articles published between 2006 and 2007 were identified and the authors contacted using a survey tool. A total of 124 researchers responded to our questionnaire; the 45% of respondents who were not GPs were excluded. On average, GPs spent 2.52 days per week and 6.9 hours per day on patient care, seeing 45 patients per week. Seventy-five per cent of GPs had specific time assigned to research, on average 13 hours per week; 79% were affiliated with a university and 69% held teaching positions. Conclusions: Most GPs who publish original articles in leading journals have time specifically assigned to research as part of their normal working schedule. They see a relatively small number of patients. Improving the working conditions of family physicians who intend to do research is likely to lead to better research results.
Abstract:
Recent laboratory studies have suggested that heart rate variability (HRV) may be an appropriate criterion for training load (TL) quantification. The aim of this study was to validate a novel HRV index that may be used to assess TL in field conditions. Eleven well-trained long-distance male runners performed four exercises of different duration and intensity. TL was evaluated using the Foster and Banister methods. In addition, HRV measurements were performed 5 minutes before exercise and 5 and 30 minutes after exercise. We calculated an HRV index (TLHRV) based on the ratio between the HRV decrease during exercise and the HRV increase during recovery. The HRV decrease during exercise was strongly correlated with exercise intensity (R = -0.70; p < 0.01) but not with exercise duration or training volume. The TLHRV index was correlated with the Foster (R = 0.61; p = 0.01) and Banister (R = 0.57; p = 0.01) methods. This study confirms that HRV changes during exercise and the recovery phase are affected by both the intensity and the physiological impact of the exercise. Since the TLHRV formula takes into account the disturbance and the return to homeostatic balance induced by exercise, this new method provides an objective and rational TL index. However, some simplification of the measurement protocol could be envisaged for field use.
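The abstract defines TLHRV only as "the ratio between the HRV decrease during exercise and the HRV increase during recovery." A minimal sketch of an index with that shape is given below; the function name, the choice of RMSSD-like inputs, and the error handling are all assumptions for illustration, not the authors' published formula.

```python
def tl_hrv(hrv_pre, hrv_exercise, hrv_recovery):
    """Toy TLHRV-style index (hypothetical implementation).

    hrv_pre      -- HRV before exercise (e.g. RMSSD in ms)
    hrv_exercise -- HRV during/immediately after exercise
    hrv_recovery -- HRV measured later in recovery
    """
    decrease = hrv_pre - hrv_exercise       # HRV suppression during exercise
    increase = hrv_recovery - hrv_exercise  # HRV rebound during recovery
    if increase <= 0:
        raise ValueError("no HRV recovery observed; index undefined")
    # A larger suppression relative to a given rebound yields a higher load.
    return decrease / increase
```

Under this sketch, a harder session (deeper HRV suppression for the same rebound) yields a higher index value, which is the qualitative behavior the abstract describes.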
Abstract:
The EAUN Guidelines Working Group for indwelling catheters have prepared this guideline document to help nurses assess the evidence-based management of catheter care and to incorporate the guidelines’ recommendations into their clinical practice. These guidelines are not meant to be proscriptive, nor will adherence to these guidelines guarantee a successful outcome in all cases. Ultimately, decisions regarding care must be made on a case-by-case basis by healthcare professionals after consultation with their patients using their clinical judgement, knowledge and expertise.
Abstract:
BACKGROUND: Predicting the outcome of breast cancer (BC) patients based on sentinel lymph node (SLN) status without axillary lymph node dissection (ALND) is an area of uncertainty, and it influences decision-making for regional nodal irradiation (RNI). The aim of the NORA (NOdal RAdiotherapy) survey was to examine the patterns of RNI. METHODS: A web questionnaire, including several clinical scenarios, was distributed to 88 EORTC-affiliated centers. Responses were received between July 2013 and January 2014. RESULTS: A total of 84 responses were analyzed. While three-dimensional (3D) radiotherapy (RT) planning is carried out in 81 (96%) centers, nodal areas are delineated in only 51 (61%) centers. Only 14 (17%) centers routinely link internal mammary chain (IMC) and supraclavicular node (SCN) RT indications. In patients undergoing total mastectomy (TM) with ALND, SCN-RT is recommended by 5 (6%), 53 (63%) and 51 (61%) centers for patients with pN0(i+), pN(mi) and pN1, respectively. Extra-capsular extension (ECE) is the main factor influencing decision-making for RNI after breast-conserving surgery (BCS) and TM. After primary systemic therapy (PST), 49 (58%) centers take nodal fibrotic changes into account for RNI indications in ypN0 patients. In ypN0 patients with inner/central tumors, 23 (27%) centers indicate SCN-RT and IMC-RT. In ypN1 patients, SCN-RT is delivered by fewer than half of the centers in patients with ypN(i+) and ypN(mi). Twenty-one (25%) of the centers recommend ALN-RT in patients with ypN(mi) or 1-2N+ after ALND. Seventy-five (90%) centers state that age is not considered a limiting factor for RNI. CONCLUSION: The NORA survey is unique in evaluating the impact of SLNB/ALND status on adjuvant RNI decision-making and volumes after BCS/TM with or without PST. ALN-RT is often indicated in pN1 patients, particularly in the case of ECE. Besides the ongoing NSABP-B51/RTOG and ALLIANCE trials, NORA could help to design future specific RNI trials in the SLNB-without-ALND era, in patients receiving PST or not.
Abstract:
Purpose: To analyze the therapeutic indications for off-label use of rituximab, the available evidence for its use, the outcomes, and the cost. Methods: This was a retrospective analysis of patients treated with rituximab for off-label indications from January 2007 to December 2009 in two tertiary hospitals. Information on patient characteristics, medical conditions, and therapeutic responses was collected from medical records. Available evidence for the efficacy of rituximab in each condition was reviewed, and the cost of treatment was calculated. Results: A total of 101 cases of off-label rituximab use were analyzed. The median age of the patients involved was 53 [interquartile range (IQR) 37.5-68.0] years; 55.4% were women. The indications for prescribing rituximab were primarily hematological diseases (46%), systemic connective tissue disorders (27%), and kidney diseases (20%). Available evidence supporting rituximab treatment for these indications mainly came from individual cohort studies (53.5% of cases) and case series (25.7%). The short-term outcome (median 3 months, IQR 2-4 months) was a complete response in 38% of cases and a partial response in 32.6%. The highest short-term responses were observed for systemic lupus erythematosus and membranous glomerulonephritis, and the lowest were for neuromyelitis optica, idiopathic thrombocytopenic purpura, and miscellaneous indications. Some response was maintained in long-term follow-up (median 23 months, IQR 12-30 months) in 69.2% of patients showing a short-term response. Median cost per patient was 5,187.5 (IQR 5,187.5-7,781.3). Conclusions: In our study, off-label rituximab was mainly used for the treatment of hematological, kidney, and systemic connective tissue disorders, and the response among our patient cohort varied depending on the specific disease. The level of evidence supporting the use of rituximab for these indications was low and the cost was very high.
We conclude that more clinical trials on the off-label use of rituximab are needed, although these may be difficult to conduct in some rare diseases. Data from observational studies may provide useful information to assist prescribing in clinical practice.
Abstract:
Knee pain is a frequent complaint in ambulatory practice. Because of its complexity, the knee is prone to trauma, arthritis and the effects of aging. Septic arthritis is an emergency and must be suspected when severe knee pain is accompanied by fever, deterioration of the general condition, or a particular social context. In most cases the clinical examination can identify the type of pathology. Conservative treatment is beneficial in most cases, and physiotherapy is a major determinant of the prognosis.
Abstract:
BACKGROUND & AIMS: Trace elements (TE) are involved in the immune and antioxidant defences, which are of particular importance during critical illness. Determining plasma TE levels is costly. The present quality control study aimed at assessing the economic impact of computer-reminded blood sampling versus risk-guided, on-demand monitoring of plasma concentrations of selenium, copper, and zinc. METHODS: Retrospective analysis of 2 cohorts of patients admitted during 6-month periods in 2006 and 2009 to the ICU of a University hospital. Inclusion criteria: receipt of intravenous micronutrient supplements and/or a TE sampling during the ICU stay. The TE samplings were triggered by a computerized reminder in 2006 versus guided by nutritionists in 2009. RESULTS: During the 2 periods, 636 patients met the inclusion criteria out of 2406 consecutive admissions, representing 29.7% and 24.9% of the respective periods' admissions. The 2009 patients had higher SAPS2 scores (p = 0.02) and lower BMI (p = 0.007) compared to 2006. The number of laboratory determinations was drastically reduced in 2009, particularly during the first week, despite the higher severity of the cohort, resulting in a 55% cost reduction. CONCLUSIONS: The monitoring of TE concentrations guided by a nutritionist reduced the sampling frequency and targeted the sickest, high-risk patients requiring adaptation of the nutritional prescription. This approach led to a cost reduction compared to an automated sampling prescription.
Abstract:
Anthropogenic disturbance of wildlife is of growing conservation concern, but we lack comprehensive approaches to its multiple negative effects. We investigated several effects of disturbance by winter outdoor sports on free-ranging alpine Black Grouse by simultaneously measuring their physiological and behavioral responses. We experimentally flushed radio-tagged Black Grouse from their snow burrows, once a day, during several successive days, and quantified their stress hormone levels (corticosterone metabolites in feces [FCM] collected from individual snow burrows). We also measured feeding time allocation (activity budgets reconstructed from radio-emitted signals) in response to anthropogenic disturbance. Finally, we estimated the related extra energy expenditure that may be incurred: based on activity budgets, energy expenditure was modeled from measures of metabolism obtained from captive birds subjected to different ambient temperatures. The pattern of FCM excretion indicated the existence of a funneling effect, as predicted by the allostatic theory of stress: initial stress hormone concentrations showed wide inter-individual variation, which decreased during experimental flushing. Individuals with low initial pre-flushing FCM values increased their concentrations, while individuals with high initial FCM values lowered them. Experimental disturbance resulted in an extension of feeding duration during the following evening foraging bout, confirming the prediction that Black Grouse must compensate for the extra energy expenditure elicited by human disturbance. Birds with low initial baseline FCM concentrations were those that spent more time foraging. These FCM excretion and foraging patterns suggest that birds with high initial FCM concentrations might have been experiencing a situation of allostatic overload. The energetic model provides quantitative estimates of extra energy expenditure.
A longer exposure to ambient temperatures outside the shelter of snow burrows, following disturbance, could increase the daily energy expenditure by >10%, depending principally on ambient temperature and duration of exposure. This study confirms the predictions of allostatic theory and, to the best of our knowledge, constitutes the first demonstration of a funneling effect. It further establishes that winter recreation activities incur costly allostatic behavioral and energetic adjustments, which call for the creation of winter refuge areas together with the implementation of visitor-steering measures for sensitive wildlife.
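The ">10% increase in daily energy expenditure" claim rests on simple arithmetic: hours spent exposed instead of sheltered, multiplied by the difference in metabolic rate between the two conditions. The sketch below illustrates that arithmetic only; the function name and all numeric values are invented for illustration and are not taken from the study's energetic model.

```python
def extra_dee_percent(baseline_dee_kj, exposure_h,
                      burrow_rate_kj_h, open_rate_kj_h):
    """Percent increase in daily energy expenditure (hypothetical sketch).

    baseline_dee_kj  -- undisturbed daily energy expenditure (kJ/day)
    exposure_h       -- hours spent outside the burrow after disturbance
    burrow_rate_kj_h -- metabolic rate inside the snow burrow (kJ/h)
    open_rate_kj_h   -- metabolic rate exposed to ambient cold (kJ/h)
    """
    # Extra cost = exposure time * (exposed rate - sheltered rate)
    extra_kj = exposure_h * (open_rate_kj_h - burrow_rate_kj_h)
    return 100.0 * extra_kj / baseline_dee_kj
```

With invented but plausible-shaped inputs (e.g. a 1000 kJ/day baseline and a 60 kJ/h cost differential), two hours of exposure yields a 12% increase, consistent in magnitude with the ">10%" figure the abstract reports.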
Abstract:
Trabecular bone score (TBS) is a recently-developed analytical tool that performs novel grey-level texture measurements on lumbar spine dual X-ray absorptiometry (DXA) images, and thereby captures information relating to trabecular microarchitecture. In order for TBS to usefully add to bone mineral density (BMD) and clinical risk factors in osteoporosis risk stratification, it must be independently associated with fracture risk, readily obtainable, and ideally, present a risk which is amenable to osteoporosis treatment. This paper summarizes a review of the scientific literature performed by a Working Group of the European Society for Clinical and Economic Aspects of Osteoporosis and Osteoarthritis. Low TBS is consistently associated with an increase in both prevalent and incident fractures that is partly independent of both clinical risk factors and areal BMD (aBMD) at the lumbar spine and proximal femur. More recently, TBS has been shown to have predictive value for fracture independent of fracture probabilities using the FRAX® algorithm. Although TBS changes with osteoporosis treatment, the magnitude is less than that of aBMD of the spine, and it is not clear how change in TBS relates to fracture risk reduction. TBS may also have a role in the assessment of fracture risk in some causes of secondary osteoporosis (e.g., diabetes, hyperparathyroidism and glucocorticoid-induced osteoporosis). In conclusion, there is a role for TBS in fracture risk assessment in combination with both aBMD and FRAX.
Abstract:
There are conflicting data on the prevalence of coronary events and the quality of the management of modifiable cardiovascular risk factors (CVRF) in HIV-infected patients. Methods. We performed a retrospective descriptive study to determine the prevalence of coronary events and to evaluate the management of CVRF in a Mediterranean cohort of 3760 HIV-1-infected patients from April 1983 through June 2011. Results. We identified 81 patients with a history of a coronary event (prevalence 2.15%); 83% of them suffered an acute myocardial infarction. At the time of the coronary event, CVRF were highly prevalent (60.5% hypertension, 48% dyslipidemia, and 16% diabetes mellitus). Other CVRF, such as smoking, hypertension, lack of exercise, and body mass index, were not routinely assessed. After the coronary event, a significant decrease in total cholesterol (p = 0.025) and LDL-cholesterol (p = 0.004) was observed. However, the percentage of patients who maintained LDL-cholesterol > 100 mg/dL remained stable (from 46% to 41%, p = 0.103). The proportion of patients using protease inhibitors associated with a favorable lipid profile increased over time (p = 0.028). Conclusions. The prevalence of coronary events in our cohort is low. CVRF prevalence is high and their management is far from optimal. More aggressive interventions should be implemented to diminish cardiovascular risk in HIV-infected patients.
Abstract:
Data are urgently needed to better understand processes of care in Swiss primary care (PC). A total of 2027 PC physicians, stratified by canton, were invited to participate in the Swiss Primary care Active Monitoring network, of whom 200 agreed to join. There were no significant differences between participants and a random sample drawn from the same physician databases in terms of sex, year of medical school graduation, or location. The Swiss Primary care Active Monitoring network represents the first large-scale, nationally representative practice-based research network in Switzerland and will provide a unique opportunity to better understand the functioning of Swiss PC.