14 results for control practices
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Stenotrophomonas maltophilia has emerged as an important opportunistic pathogen in the debilitated host. S. maltophilia is not an inherently virulent pathogen, but its ability to colonise respiratory-tract epithelial cells and the surfaces of medical devices makes it a ready coloniser of hospitalised patients. S. maltophilia can cause bloodstream infections and pneumonia with considerable morbidity in immunosuppressed patients. Management of infection is hampered by high-level intrinsic resistance to many antibiotic classes and the increasing occurrence of acquired resistance to the first-line drug co-trimoxazole. Prevention of acquisition and infection depends upon the application of modern infection-control practices, with emphasis on the control of antibiotic use and environmental reservoirs.
Abstract:
Deregulation strategies and their regulating effects: the case of the termination of social assistance for rejected asylum seekers in Switzerland. In Switzerland, rejected asylum seekers no longer have any residence rights. In 2003 the Swiss state decided to terminate the social assistance previously granted to people with a non-entry decision on their asylum request. In 2008 the termination of social assistance was expanded to all rejected asylum seekers. Nevertheless, facing the impossibility of deporting them, the Swiss state entitled this group of people to emergency assistance, a basic right enshrined in the Swiss Federal Constitution. In this context, new structures were established specifically for rejected asylum seekers. These structures had to be set up, financed, controlled, managed and legitimized. For example, collective centres were set up exclusively for rejected asylum seekers. In this talk, I want to analyze the political and bureaucratic process of terminating social assistance for rejected asylum seekers. The exclusion of rejected asylum seekers from social aid was embedded in a wider austerity program of the Federal State. The Federal Migration Office had been requested to save money. The main official goals were to reduce the support of these illegalized people, to cut back any structures that would prolong their stay on Swiss ground, and to set incentives so that they would leave the country on their own. But during the implementation, new regulating effects emerged. Drawing on ethnographic material, I will highlight these “messy procedures” (Sciortino 2004). First, I will analyze the means and goals developed by the Federal authorities while conceptualising the termination of social assistance. Second, I will focus on the newly built structures and elaborate on the practices and legitimating strategies of the authorities.
In conclusion, I will analyze the ambivalences of these processes which, in the end, established specific structures for the “unwanted”.
Abstract:
Characterization of third-generation-cephalosporin-resistant Klebsiella pneumoniae isolates originating mainly from one human hospital (n = 22) and one companion animal hospital (n = 25) in Bern (Switzerland) revealed the absence of epidemiological links between human and animal isolates. Human infections were not associated with the spread of any specific clone, while the majority of animal infections were due to K. pneumoniae sequence type 11 isolates producing plasmidic DHA AmpC. This clonal dissemination within the veterinary hospital emphasizes the need for effective infection control practices.
Abstract:
Soil conservation technologies that fit the local scale well and are acceptable to land users are increasingly needed. To achieve this at the smallholder farm level, an understanding is needed of specific erosion processes and indicators, of the land users' knowledge, and of their willingness, ability and possibilities to respond to the respective problems when deciding on control options. This study was carried out to assess local erosion and the performance of earlier introduced conservation terraces from both technological and land users' points of view. The study was conducted during July to August 2008 at the Angereb watershed on 58 farm plots from three selected case-study catchments. Participatory erosion assessment and evaluation were implemented along with direct field measurement procedures. Our focus was to involve the land users in the action research to explore with them the effectiveness of existing conservation measures against the erosion hazard. Terrace characteristics were measured and evaluated against the terrace implementation guideline of Hurni (1986). The long-term consequences of seasonal erosion indicators had often been neither known nor noticed by farmers. The cause-and-effect relationships between the erosion indicators and conservation measures revealed the limitations and gaps to be addressed towards sustainable erosion control strategies. Erosion control was observed to be less effective, and participants believed these gaps to be the result of a lack of land users' genuine participation. The results of both the local erosion observations and the assessment of conservation efficacy from different aspects show the need to promote approaches in which erosion evaluation and planning of interventions are done by the farmers themselves. This paper describes the importance of involving the human factor in empirical erosion assessment methods towards sustainable soil conservation.
Abstract:
BACKGROUND Prophylactic measures are key components of dairy herd mastitis control programs, but some are only relevant in specific housing systems. To assess the association between management practices and mastitis incidence, data collected in 2011 by a survey among 979 randomly selected Swiss dairy farms, together with information from the regular test day recordings of 680 of these farms, were analyzed. RESULTS The median incidence of farmer-reported clinical mastitis (ICM) was 11.6 (mean 14.7) cases per 100 cows per year. The median annual proportion of milk samples with a composite somatic cell count above 200,000 cells/ml (PSCC) was 16.1% (mean 17.3%). A multivariable negative binomial regression model was fitted for each of the mastitis indicators, separately for farms with tie-stall and free-stall housing systems, to study the effect of management practices other than the housing system on ICM and PSCC events (above 200,000 cells/ml). The results differed substantially by housing system and outcome. In tie-stall systems, clinical mastitis incidence was mainly affected by region (mountainous production zone; incidence rate ratio (IRR) = 0.73), the dairy herd replacement system (IRR = 1.27) and farmers' age (IRR = 0.81). The proportion of high SCC was mainly associated with dry cow udder controls (IRR = 0.67), clean bedding material at calving (IRR = 1.72), using total merit values to select bulls (IRR = 1.57) and body condition scoring (IRR = 0.74). In free-stall systems, clinical mastitis was mainly associated with stall climate/temperature (IRR = 1.65), comfort mats as resting surface (IRR = 0.75) and no feed analysis being carried out (IRR = 1.18). The proportion of high SCC was only associated with hand and arm cleaning after calving (IRR = 0.81) and using beef producing value to select bulls (IRR = 0.66). CONCLUSIONS There were substantial differences in the risk factors identified in the four models.
Some of the factors were in agreement with the reported literature while others were not. This highlights the multifactorial nature of the disease and the differences in the risks for both mastitis manifestations. Attempting to understand these multifactorial associations for mastitis within larger management groups continues to play an important role in mastitis control programs.
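The incidence rate ratios (IRRs) reported above compare case rates between groups of farms. As a minimal illustration with purely hypothetical numbers (not taken from the study), an IRR can be computed from case counts and animal-time at risk:

```python
def incidence_rate_ratio(cases_exposed, time_exposed,
                         cases_unexposed, time_unexposed):
    """Ratio of two incidence rates (cases per unit of animal-time)."""
    rate_exposed = cases_exposed / time_exposed
    rate_unexposed = cases_unexposed / time_unexposed
    return rate_exposed / rate_unexposed

# Hypothetical example: 20 clinical mastitis cases over 150 cow-years in
# one management group vs. 35 cases over 190 cow-years in the other.
irr = incidence_rate_ratio(20, 150, 35, 190)
print(round(irr, 2))  # → 0.72; an IRR below 1 means a lower rate in the first group
```

In the multivariable models of the abstract, such ratios are adjusted for the other covariates; this sketch shows only the unadjusted arithmetic behind the IRR.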
Abstract:
Due to the widespread development of anthelmintic resistance in equine parasites, recommendations for their control are currently undergoing marked changes, with a shift of emphasis toward more coprological surveillance and reduced treatment intensity. Denmark was the first nation to introduce prescription-only restrictions on anthelmintic drugs in 1999, but other European countries have implemented similar legislation over recent years. A questionnaire survey was performed in 2008 among Danish horse owners to provide a current status of practices and perceptions in relation to parasite control. The questions aimed to describe the current use of coprological surveillance and the resulting anthelmintic treatment intensities, to evaluate knowledge and perceptions about the importance of various attributes of parasite control, and to assess respondents' willingness to pay for advice and parasite surveillance services from their veterinarians. A total of 1060 respondents completed the questionnaire. A large majority of respondents (71.9%) were familiar with the concept of selective therapy. Results illustrated that the respondents' self-evaluation of their knowledge about parasites and their control was significantly associated with their level of interest in the topic and their type of education (P<0.0001). The large majority of respondents either dewormed their horses twice a year and/or performed two fecal egg counts per horse per year. This approach was almost equally pronounced in foals, horses aged 1-3 years, and adult horses. The respondents rated prevention of parasitic disease and prevention of drug resistance as the most important attributes, while cost and frequent fecal testing were rated least important. Respondents' actual spending on parasite control per horse in the previous year correlated significantly with the amount they declared themselves willing to spend (P<0.0001). However, 44.4% declared themselves willing to pay more than what they were currently spending.
Altogether, results indicate that respondents were generally familiar with equine parasites and the concept of selective therapy, although there was some confusion over the terms small and large strongyles. They used a large degree of fecal surveillance in all age groups, with a majority of respondents sampling and/or treating around twice a year. Finally, respondents appeared willing to spend money on parasite control for their horses. It is of concern that the survey suggested that foals and young horses are treated in a manner very similar to adult horses, which is against current recommendations. Thus, the survey illustrates the importance of clear communication of guidelines for equine parasite control.
Abstract:
Objective. To identify current outpatient parenteral antibiotic therapy practice patterns and complications. Methods. We administered an 11-question survey to adult infectious disease physicians participating in the Emerging Infections Network (EIN), a Centers for Disease Control and Prevention-sponsored sentinel event surveillance network in North America. The survey was distributed electronically or via facsimile in November and December 2012. Respondent demographic characteristics were obtained from EIN enrollment data. Results. Overall, 555 (44.6%) of EIN members responded to the survey, with 450 (81%) indicating that they treated 1 or more patients with outpatient parenteral antimicrobial therapy (OPAT) during an average month. Infectious diseases consultation was reported to be required for a patient to be discharged with OPAT by 99 respondents (22%). Inpatient (282 [63%] of 449) and outpatient (232 [52%] of 449) infectious diseases physicians were frequently identified as being responsible for monitoring laboratory results. Only 26% (118 of 448) had dedicated OPAT teams at their clinical site. Few infectious diseases physicians have systems to track errors, adverse events, or "near misses" associated with OPAT (97 [22%] of 449). OPAT-associated complications were perceived to be rare. Among respondents, 80% reported line occlusion or clotting as the most common complication (occurring in 6% of patients or more), followed by nephrotoxicity and rash (each reported by 61%). Weekly laboratory monitoring of patients who received vancomycin was reported by 77% of respondents (343 of 445), whereas 19% of respondents (84 of 445) reported twice weekly laboratory monitoring for these patients. Conclusions. Although use of OPAT is common, there is significant variation in practice patterns. More uniform OPAT practices may enhance patient safety.
Abstract:
BACKGROUND The nine equivalents of nursing manpower use score (NEMS) is used to evaluate critical care nursing workload and occasionally to define hospital reimbursements. Little is known about the caregivers' accuracy in scoring, about factors affecting this accuracy and how validity of scoring is assured. METHODS Accuracy in NEMS scoring of Swiss critical care nurses was assessed using case vignettes. An online survey was performed to assess training and quality control of NEMS scoring and to collect structural and organizational data of participating intensive care units (ICUs). Aggregated structural and procedural data of the Swiss ICU Minimal Data Set were used for matching. RESULTS Nursing staff from 64 (82%) of the 78 certified adult ICUs participated in this survey. Training and quality control of scoring shows large variability between ICUs. A total of 1378 nurses scored one out of 20 case vignettes: accuracy ranged from 63.7% (intravenous medications) to 99.1% (basic monitoring). Erroneous scoring (8.7% of all items) was more frequent than omitted scoring (3.2%). Mean NEMS per case was 28.0 ± 11.8 points (reference score: 25.7 ± 14.2 points). Mean bias was 2.8 points (95% confidence interval: 1.0-4.7); scores below 37.1 points were generally overestimated. Data from units with a greater nursing management staff showed a higher bias. CONCLUSION Overall, nurses assess the NEMS score within a clinically acceptable range. Lower scores are generally overestimated. Inaccurate assessment was associated with a greater size of the nursing management staff. Swiss head nurses consider themselves motivated to assure appropriate scoring and its validation.
Abstract:
BACKGROUND: Many studies have been conducted to define risk factors for the transmission of bovine paratuberculosis, mostly in countries with large herds. Little is known about the epidemiology in infected Swiss herds and the risk factors important for transmission in smaller herds. Therefore, the presence of known factors which might favor the spread of paratuberculosis and could be related to the animal-level prevalence of fecal shedding of Mycobacterium avium subsp. paratuberculosis was assessed in 17 infected herds (10 dairy, 7 beef). Additionally, the level of knowledge of herd managers about the disease was assessed. In a case-control study with 4 matched negative control herds per infected herd, the association of potential risk factors with the infection status of the herd was investigated. RESULTS: Exposure of the young stock to feces of older animals was frequently observed in infected and in control herds. The farmers' knowledge about paratuberculosis was very limited, even in infected herds. An overall animal-level prevalence of fecal shedding of Mycobacterium avium subsp. paratuberculosis of 6.1% was found in infected herds, whereby shedders younger than 2 years of age were found in 46.2% of the herds where the young stock was available for testing. Several factors related to contamination of the heifer area with cows' feces and the management of the calving area were found to be significantly associated with the within-herd prevalence. Animal purchase was associated with a positive herd infection status (OR = 7.25, p = 0.004). CONCLUSIONS: Numerous risk factors favoring the spread of Mycobacterium avium subsp. paratuberculosis from adult animals to the young stock were observed in infected Swiss dairy and beef herds, which may be amenable to improvement in order to control the disease. Important factors were contamination of the heifer and the calving area, which were associated with higher within-herd prevalence of fecal shedding.
Farmers' awareness of paratuberculosis was very low, even in infected herds. Animal purchase was significantly associated with the probability of a herd being infected and is thus the most important factor for controlling the spread of the disease between farms.
Abstract:
The dual-effects model of social control proposes that social control leads to increased psychological distress but also to better health practices. However, findings are inconsistent, and recent research suggests that the most effective control is unnoticed by the receiver (i.e., invisible). Yet, investigations of the influence of invisible control on daily negative affect and smoking have been limited. Using daily diaries, we investigated how invisible social control was associated with negative affect and smoking. Overall, 100 smokers (72.0% men, age M = 40.48, SD = 9.82) and their nonsmoking partners completed electronic diaries from a self-set quit date for 22 consecutive days, reporting received and provided social control, negative affect, and daily smoking. In multilevel analyses of the within-person process, we found that on days with higher-than-average invisible control, smokers reported more negative affect and fewer cigarettes smoked. Findings are in line with the assumptions of the dual-effects model of social control: invisible social control increased daily negative affect and simultaneously reduced smoking at the within-person level.
Abstract:
The dual-effects model of social control assumes that social control not only leads to better health practices but also arouses psychological distress. However, findings are inconsistent. The present study advances the current literature by examining social control from a dyadic perspective in the context of smoking. In addition, the study examines whether control, continuous smoking abstinence, and affect are differentially related for men and women. Before and three weeks after a self-set quit attempt, we examined 106 smokers (77 men; mean age: 40.67; average number of cigarettes smoked per day: 16.59 [SD=8.52, range=1-40] at baseline and 5.27 [SD=6.97, range=0-40] at follow-up) and their nonsmoking heterosexual partners, assessing received and provided control, continuous abstinence, and affect. With regard to smokers' affective reactions, partners' provided control was related to an increase in positive and a decrease in negative affect, but only for female smokers. Moreover, the greater the discrepancy between the smoker's received and the partner's provided control, the more positive affect increased and the more negative affect decreased, but again only for female smokers. These findings demonstrate that female smokers' well-being rose over time if they were not aware of the control attempts of their nonsmoking partners, indicating positive effects of invisible social control. The study's results emphasize the importance of applying a dyadic perspective and taking gender differences into account in the dual-effects model of social control.
Abstract:
Objectives: The dual-effects model of social control proposes that social control leads to better health practices, but also arouses psychological distress. However, findings are inconsistent in relation to health behavior and psychological distress. Recent research suggests that the most effective control is unnoticed by the receiver (i.e., invisible). There is some evidence that invisible social control is beneficial for positive and negative affective reactions. Yet, investigations of the influence of invisible social control on daily smoking and distress have been limited. Using daily diaries, we investigated how invisible social control is associated with the number of cigarettes smoked and negative affect on a daily basis. Methods: Overall, 99 smokers (72.0% men, age M = 40.48, SD = 9.82) and their non-smoking partners completed electronic diaries from a self-set quit date for 22 consecutive days within the hour before going to bed, reporting received and provided social control, daily number of cigarettes smoked, and negative affect. Results: Multilevel analyses indicated that between-person levels of invisible social control were associated with lower negative affect, whereas they were unrelated to the number of cigarettes smoked. On days with higher-than-average invisible social control, smokers reported fewer cigarettes smoked and more negative affect. Conclusions: The between-person findings indicate that invisible social control can be beneficial for negative affect. However, the within-person findings are in line with the assumptions of the dual-effects model of social control: invisible social control reduced daily smoking and simultaneously increased daily negative affect within persons.
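The within- versus between-person distinction in these multilevel analyses typically rests on person-mean centering of the daily predictor: each person's average level is separated from their day-to-day deviations. A minimal sketch with hypothetical diary values (not data from the study):

```python
def person_mean_center(daily_values):
    """Split one person's daily scores into a between-person component
    (the person's mean) and within-person deviations (day minus mean)."""
    mean = sum(daily_values) / len(daily_values)
    deviations = [v - mean for v in daily_values]
    return mean, deviations

# Hypothetical invisible-control ratings for one smoker over five diary days.
days = [2, 4, 3, 5, 1]
between, within = person_mean_center(days)
print(between)  # → 3.0 (between-person component)
print(within)   # → [-1.0, 1.0, 0.0, 2.0, -2.0]; days above 0 are
                #   "higher-than-average" invisible-control days
```

The between-person component enters the model as a level-2 predictor, while the deviations capture the within-person effect reported in the results.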
Abstract:
BACKGROUND Patients requiring anticoagulation often suffer from comorbidities such as hypertension. On the occasion of INR monitoring, general practitioners (GPs) have the opportunity to check blood pressure (BP). We aimed to evaluate the impact of vitamin K antagonist (VKA) monitoring by GPs on BP control in patients with hypertension. METHODS We cross-sectionally analyzed the database of the Swiss Family Medicine ICPC Research using Electronic Medical Records (FIRE) project, covering 60 general practices in a primary care setting in Switzerland. This database includes 113,335 patients who visited their GP between 2009 and 2013. We identified patients with hypertension based on antihypertensive medication prescribed for ≥6 months. We compared patients on VKA treatment for ≥3 months with patients without such treatment regarding BP control, adjusting for age, sex, observation period, number of consultations and comorbidity. RESULTS We identified 4,412 patients with hypertension and blood pressure recordings in the FIRE database. Among these, 569 (12.9%) were on phenprocoumon (a VKA) and 3,843 (87.1%) had no anticoagulation. Mean systolic and diastolic BP was significantly lower in the VKA group (130.6 ± 14.9 vs 139.8 ± 15.8 and 76.6 ± 7.9 vs 81.3 ± 9.3 mm Hg; p < 0.001 for both). The difference remained after adjusting for possible confounders: systolic and diastolic BP were significantly lower in the VKA group, with mean differences of -8.4 mm Hg (95% CI -9.8 to -7.0 mm Hg) and -1.5 mm Hg (95% CI -2.3 to -0.7 mm Hg), respectively (p < 0.001 for both). CONCLUSIONS In a large sample of hypertensive patients in Switzerland, VKA treatment was independently associated with better systolic and diastolic BP control. The observed effect could be due to better compliance with antihypertensive medication in patients treated with VKA. GPs should therefore be aware of this possible benefit, especially in patients with lower expected compliance and with multimorbidity.
Abstract:
OBJECTIVE The first description of the simplified acute physiology score (SAPS) II dates back to 1993, but little is known about its accuracy in daily practice. Our purpose was to evaluate the accuracy of scoring and the factors that affect it in a nationwide survey. METHODS Twenty clinical scenarios, covering a broad range of illness severities, were randomly assigned to a convenience sample of physicians or nurses in Swiss adult intensive care units (ICUs), who were asked to assess the SAPS II score for a single scenario. These data were compared to a reference that was defined by five experienced researchers. The results were cross-matched with demographic characteristics and with data on the training and quality control for scoring and the structural and organisational properties of each participating ICU. RESULTS A total of 345 caregivers from 53 adult ICU providers completed the SAPS II evaluation of one clinical scenario. The mean SAPS II score was 42.6 ± 23.4, with a bias of +5.74 (95% CI 2.0-9.5) compared to the reference score. There was no evidence of bias variation according to case severity, ICU size, linguistic area, profession (physician vs. nurse), experience, initial SAPS II training, or the presence of a quality control system. CONCLUSION This nationwide survey revealed substantial variability in SAPS II scoring. On average, SAPS II scores were overestimated by more than 13%, irrespective of the profession or experience of the scorer or of the structural characteristics of the ICUs.
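The scoring bias reported here (and in the NEMS study above) is the mean of paired differences between the caregivers' scores and the expert reference. A minimal sketch with hypothetical paired scores (not the study data):

```python
def mean_bias(observed, reference):
    """Mean of paired differences (observed minus reference).
    Positive values indicate systematic overestimation."""
    assert len(observed) == len(reference)
    diffs = [o - r for o, r in zip(observed, reference)]
    return sum(diffs) / len(diffs)

# Hypothetical paired SAPS II assessments vs. the expert reference score.
obs = [45, 38, 52, 30]
ref = [40, 35, 48, 29]
print(mean_bias(obs, ref))  # → 3.25, i.e. raters score higher than the reference
```

The confidence interval quoted in the abstract would then be derived from the spread of these paired differences across raters.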