14 results for Green ITC management factors
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Antimicrobial resistance is an emerging public health concern, and food-producing animals are known to be a potential source for the transmission of resistant bacteria to humans. As European Union legislation requires a ban on conventional cages for the housing of laying hens on the one hand, and a high food safety standard for eggs on the other, further investigation of the occurrence of antimicrobial resistance in alternative housing types is required. In this study, we determined antimicrobial resistance in indicator bacteria from 396 cloacal swabs from 99 Swiss laying hen farms across four alternative housing types during a cross-sectional study. On each farm, four hens were sampled and exposure to potential risk factors was identified with a questionnaire. The minimal inhibitory concentration was determined using broth microdilution in Escherichia coli (n=371) for 18 antimicrobials and in Enterococcus faecalis (n=138) and Enterococcus faecium (n=153) for 16 antimicrobials. All antimicrobial classes recommended by the European Food Safety Authority for E. coli and enterococci were included in the resistance profile. Sixty per cent of the E. coli isolates were susceptible to all of the considered antimicrobials and 30% were resistant to at least two antimicrobials. In E. faecalis, 33% of the strains were susceptible to all tested antimicrobials and 40% were resistant to two or more antimicrobials, whereas in E. faecium these figures were 14% and 39%, respectively. Risk factor analyses were carried out for bacterial species and antimicrobials with a prevalence of resistance between 15% and 85%. In these analyses, none of the considered housing and management factors showed a consistent association with the prevalence of resistance for more than two combinations of bacterium and antimicrobial. We therefore conclude that the impact of the considered housing and management practices on egg-producing farms on resistance in laying hens is low.
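The 15-85% prevalence window used above to screen bacterium/antimicrobial combinations for risk-factor analysis can be sketched as a simple filter over isolate records. This is a minimal illustration; the record layout and the antimicrobial names are hypothetical, not the study's dataset:

```python
# Sketch: restrict risk-factor analysis to antimicrobials whose prevalence of
# resistance lies between 15% and 85%, as described in the abstract.
# Isolate records are hypothetical dicts mapping antimicrobial -> 0/1 resistant.

def resistance_prevalence(isolates, antimicrobial):
    """Fraction of isolates resistant to the given antimicrobial."""
    flags = [iso[antimicrobial] for iso in isolates]
    return sum(flags) / len(flags)

def eligible_for_risk_analysis(isolates, antimicrobials, lo=0.15, hi=0.85):
    """Antimicrobials whose resistance prevalence falls within [lo, hi]."""
    return [am for am in antimicrobials
            if lo <= resistance_prevalence(isolates, am) <= hi]
```

Combinations outside the window carry too little variation (almost all susceptible or almost all resistant) for a risk-factor model to detect associations.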
Abstract:
Background: The aim of the present study was to evaluate the feasibility of using a telephone survey to gain an understanding of the possible herd and management factors influencing the performance (i.e. safety and efficacy) of a vaccine against porcine circovirus type 2 (PCV2) in a large number of herds, and to estimate customers' satisfaction. Results: Datasets from 227 pig herds that currently applied or had applied a PCV2 vaccine were analysed. Since 1-, 2- and 3-site production systems were surveyed, the herds were allocated to one of two subsets, in which only the applicable variables out of 180 were analysed. Group 1 comprised herds with sows, suckling pigs and nursery pigs, whereas herds in Group 2 in all cases kept fattening pigs. Overall, 14 variables evaluating the subjective satisfaction with one particular PCV2 vaccine were combined into an abstract dependent variable for the further models, characterized by a binary outcome from a cluster analysis: good/excellent satisfaction (green cluster) and moderate satisfaction (red cluster). The other 166 variables, comprising information about diagnostics, vaccination, housing and management, were considered as independent variables. In Group 1, herds using the vaccine due to recognised PCV2-related health problems (wasting, mortality or porcine dermatitis and nephropathy syndrome) had a 2.4-fold increased chance (1/OR) of belonging to the green cluster. In the final model for Group 1, the diagnosis of diseases other than PCV2, a reason for vaccine administration other than PCV2-associated diseases, and the use of a single injection of iron had a significant influence on allocation to the green cluster (P < 0.05).
In Group 2, only an unchanged or delayed time of vaccination influenced satisfaction (P < 0.05). Conclusion: The methodology and statistical approach used in this study were feasible for scientifically assessing 'satisfaction' and for determining the factors influencing farmers' and vets' opinions about the safety and efficacy of a new vaccine.
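The binary green/red outcome described above comes from clustering per-herd satisfaction scores into two groups. As an illustration only, a minimal 1-D two-means split can produce such a binary outcome; the scores and this simple procedure are assumptions, not the authors' actual method:

```python
# Minimal 1-D two-means clustering: split satisfaction scores into a low
# ("red", moderate) and a high ("green", good/excellent) cluster.
# Illustrative sketch only; assumes the scores are not all identical.

def two_means(scores, iters=100):
    """Partition 1-D scores into two clusters; returns (low_cluster, high_cluster)."""
    lo, hi = min(scores), max(scores)          # initial cluster centers
    for _ in range(iters):
        low = [s for s in scores if abs(s - lo) <= abs(s - hi)]
        high = [s for s in scores if abs(s - lo) > abs(s - hi)]
        new_lo = sum(low) / len(low)           # update centers as cluster means
        new_hi = sum(high) / len(high)
        if (new_lo, new_hi) == (lo, hi):       # converged
            break
        lo, hi = new_lo, new_hi
    return low, high
```

Each herd's cluster membership can then serve as the binary dependent variable in a subsequent regression on the independent variables.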
Abstract:
Fog is a potential source of water that could be exploited using the innovative technology of fog collection. Naturally, the potential of fog has proven its significance in cloud forests, which thrive on fog interception. Historically, the remains of artificial structures in different countries show that fog has been collected as an alternative and/or supplementary water source. At the beginning of the 19th century, fog collection was investigated as a potential natural resource. After the mid-1980s, following success in Chile, fog-water collection commenced in a number of developing countries. Most of these countries are located in arid and semi-arid regions with topographic and climatic conditions that favour fog-water collection. This paper reviews the technology of fog collection, with initial background information on natural fog collection and its historical development. It reviews the climatic and topographic features that dictate fog formation (mainly advection and orographic) and the innovative technology to collect it, focusing on the amount collected, the quality of fog water, and the impact of the technology on the livelihoods of beneficiary communities. By and large, the technology described is simple, cost-effective, and energy-free. However, fog-water collection has the disadvantages of being seasonal and localised, and the technology needs continual maintenance. Based on the experience in several countries, the sustainability of the technology could be guaranteed if technical, economic, social, and management factors are addressed during its planning and implementation.
Abstract:
A model is developed to describe transport and loss of methyl bromide (MeBr) in soil following application as a soil fumigant. The model is used to investigate the effect of soil and management factors on MeBr volatilization. Factors studied include depth of injection, soil water content, presence or absence of tarp, depth to downward barrier, and irrigation after injection. Of these factors, the most important was irrigation after injection followed by covering with the tarp, which increased the diffusive resistance of the soil and prevented early loss of MeBr. The model offers an explanation for the apparently contradictory observations of earlier field studies of MeBr volatilization from soils under different conditions. The model was also used to calculate the concentration-time index for various management alternatives, showing that the irrigation application did not make the surface soil more difficult to fumigate, except at very early times. Therefore, irrigation shows promise for reducing fumigant loss while at the same time permitting control of target organisms during fumigation.
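The qualitative mechanism described above (a tarp adds diffusive resistance at the soil surface and so reduces early volatilization losses) can be illustrated with a toy one-dimensional diffusion model. All parameters here are invented for illustration; this is not the paper's calibrated model:

```python
# Toy 1-D finite-difference model of fumigant diffusion in soil with a
# surface barrier. A larger surface resistance (the tarp case) slows
# volatilization. All numbers are illustrative assumptions.

def volatilized_fraction(surface_resistance, n_cells=20, dx=0.05,
                         diff=1e-6, dt=100.0, steps=5000, inject_cell=5):
    """Fraction of the injected fumigant lost through the soil surface."""
    c = [0.0] * n_cells          # concentration profile with depth
    c[inject_cell] = 1.0         # unit mass injected at one depth
    lost = 0.0
    for _ in range(steps):
        flux = c[0] / surface_resistance      # surface loss, limited by barrier
        new = c[:]
        new[0] += dt * (diff * (c[1] - c[0]) / dx**2 - flux / dx)
        for i in range(1, n_cells - 1):       # interior diffusion
            new[i] += dt * diff * (c[i - 1] - 2 * c[i] + c[i + 1]) / dx**2
        new[-1] += dt * diff * (c[-2] - c[-1]) / dx**2   # no-flux bottom
        lost += flux * dt / dx
        c = new
    return lost
```

Running this with a tenfold larger surface resistance for the tarp case yields a smaller volatilized fraction over the same period, reproducing the qualitative effect the model in the paper explains.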
Abstract:
The southernmost European natural and planted pine forests are among the areas most vulnerable to warming-induced drought decline. Both drought stress and management factors (e.g., stand origin or reduced thinning) may induce decline by reducing the water available to trees, but their relative importance has not been properly assessed. The role of stand origin - densely planted vs. naturally regenerated stands - as a decline driver can be assessed by comparing the growth and vigor responses to drought of similar natural vs. planted stands. Here, we compare these responses in natural and planted Black pine (Pinus nigra) stands located in southern Spain. We analyze how environmental factors - climatic (temperature and precipitation anomalies) and site conditions - and biotic factors - stand structure (age, tree size, density) and defoliation by the pine processionary moth - drive radial growth and crown condition at stand and tree levels. We also assess the climatic trends in the study area over the last 60 years. We use dendrochronology, linear mixed-effects models of basal area increment, and structural equation models to determine how natural and planted stands respond to drought and current competition intensity. We observed that a temperature rise and a decrease in precipitation during the growing period led to increasing drought stress during the late 20th century. Trees from planted stands experienced stronger growth reductions and displayed more severe crown defoliation after severe droughts than those from natural stands. High stand density negatively affected growth and enhanced crown dieback, particularly in planted stands. Pine processionary moth defoliation also reduced growth more severely in natural than in planted stands, but affected tree crown condition similarly in both stand types.
In response to drought, sharp growth reduction and widespread defoliation of planted Mediterranean pine stands indicate that they are more vulnerable and less resilient to drought stress than natural stands. To mitigate forest decline of planted stands in xeric areas such as the Mediterranean Basin, less dense and more diverse stands should be created through selective thinning or by selecting species or provenances that are more drought tolerant. (C) 2013 Elsevier B.V. All rights reserved.
Abstract:
BACKGROUND It is often assumed that horses with mild respiratory clinical signs, such as mucous nasal discharge and occasional coughing, have an increased risk of developing recurrent airway obstruction (RAO). HYPOTHESIS Compared to horses without any clinical signs of respiratory disease, those with occasional coughing, mucous nasal discharge, or both have an increased risk of developing signs of RAO (frequent coughing, increased breathing effort, exercise intolerance, or a combination of these) as characterized by the Horse Owner Assessed Respiratory Signs Index (HOARSI 1-4). ANIMALS Two half-sibling families descending from 2 RAO-affected stallions (n = 65 and n = 47) and an independent replication population of unrelated horses (n = 88). METHODS In a retrospective cohort study, standardized information on occurrence and frequency of coughing, mucous nasal discharge, poor performance, and abnormal breathing effort-and these factors combined in the HOARSI-as well as management factors were collected at intervals of 1.3-5 years. RESULTS Compared to horses without clinical signs of respiratory disease (half-siblings 7%; unrelated horses 3%), those with mild respiratory signs developed clinical signs of RAO more frequently: half-siblings with mucous nasal discharge 35% (P < .001, OR: 7.0, sensitivity: 62%, specificity: 81%), with mucous nasal discharge and occasional coughing 43% (P < .001, OR: 9.9, sensitivity: 55%, specificity: 89%); unrelated horses with occasional coughing: 25% (P = .006, OR = 9.7, sensitivity: 75%, specificity: 76%). CONCLUSIONS AND CLINICAL IMPORTANCE Occasional coughing and mucous nasal discharge might represent an increased risk of developing RAO.
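The association measures reported above (odds ratio, sensitivity, specificity) all derive from a 2x2 exposure-by-outcome table. A minimal sketch follows; the counts used in the example are hypothetical, not the study's data:

```python
# Association measures from a 2x2 table:
#   a = exposed, outcome      b = exposed, no outcome
#   c = unexposed, outcome    d = unexposed, no outcome

def odds_ratio(a, b, c, d):
    """Odds of the outcome in exposed vs. unexposed animals."""
    return (a * d) / (b * c)

def sensitivity(a, c):
    """Among animals that developed the outcome, fraction that were exposed."""
    return a / (a + c)

def specificity(b, d):
    """Among animals without the outcome, fraction that were unexposed."""
    return d / (b + d)
```

Here "exposure" is the mild respiratory sign (occasional coughing or mucous nasal discharge) and the "outcome" is later development of RAO signs per the HOARSI.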
Abstract:
Critical limb ischaemia (CLI) is a particularly severe manifestation of lower limb atherosclerosis posing a major threat to both limb and life of affected patients. Besides arterial revascularisation, risk-factor modification and administration of antiplatelet therapy is a major goal in the treatment of CLI patients. Key elements of cardiovascular risk management are smoking cessation and treatment of hyperlipidaemia with dietary modification or statins. Moreover, arterial hypertension and diabetes mellitus should be adequately treated. In CLI patients not suitable for arterial revascularisation, or subsequent to unsuccessful revascularisation, parenteral prostanoids may be considered. CLI patients undergoing surgical revascularisation should be treated with beta blockers. At present, neither gene nor stem-cell therapy can be recommended outside clinical trials. Of note, walking exercise is contraindicated in CLI patients due to the risk of worsening pre-existing ischaemic wounds or causing new ones. CLI patients are often medically frail and exhibit significant comorbidities. Co-existing coronary heart, carotid, and renal artery disease should be managed according to current guidelines. Considering the above-mentioned treatment goals, interdisciplinary treatment approaches for CLI patients are warranted. The aim of the present manuscript is to discuss the currently existing evidence for both the management of cardiovascular risk factors and the treatment of co-existing disease, and to deduce specific treatment recommendations.
Abstract:
The role of endovascular interventions in managing dural arteriovenous fistulas (DAVFs) is increasing. Furthermore, in patients with aggressive DAVFs, different surgical interventions are required for complete obliteration or disconnection. Our objective was to evaluate the management of patients with intracranial DAVFs treated in our institution to identify the parameters that may help guide the long-term management of these lesions.
Abstract:
Urolithiasis is one of the most common conditions seen in emergency departments (ED) worldwide, with an increasing frequency in geriatric patients (>65 years). Given the high costs of emergency medical urolithiasis treatment, the need to optimise management is obvious. We aimed to determine risk factors for hospitalisation and evaluate diagnostic and emergency treatment patterns by ED physicians in geriatric urolithiasis patients to assist in optimising treatment.
Abstract:
BACKGROUND: Comparisons between younger and older stroke patients including comorbidities are limited. METHODS: Prospective data of consecutive patients with first-ever acute ischemic stroke were compared between younger (≤ 45 years) and older (> 45 years) patients. RESULTS: Among 1004 patients, 137 (14 %) were ≤ 45 years. Younger patients were more commonly female (57 % versus 34 %; p < 0.0001), had a lower frequency of diabetes (1 % versus 15 %; p < 0.0001), hypercholesterolemia (26 % versus 56 %; p < 0.0001), hypertension (19 % versus 65 %; p < 0.0001), coronary heart disease (14 % versus 40 %; p < 0.0001), and a lower mean Charlson co-morbidity index (CCI) (0.18 versus 0.84; p < 0.0001). Tobacco use was more prevalent in the young (39 % versus 26 %; p < 0.0001). Large artery disease (2 % versus 21 %; p < 0.0001), small artery disease (3 % versus 12 %; p = 0.0019) and atrial fibrillation (1 % versus 17 %; p = 0.001) were less common in young patients, while other etiologies (31 % versus 9 %; p < 0.0001), patent foramen ovale or atrial septal defect (44 % versus 26 %; p < 0.0001), and cervical artery dissection (26 % versus 7 %; p < 0.0001) were more frequent. A favorable outcome (mRS 0 or 1) was more common (57.4 % versus 46.9 %; p = 0.023), and mortality (5.1 % versus 12 %; p = 0.009) was lower in the young. After regression analysis, there was no independent association between age and outcome (p = 0.206) or mortality (p = 0.073). Baseline NIHSS score (p < 0.0001), diabetes (p = 0.041), and CCI (p = 0.002) independently predicted an unfavorable outcome. CONCLUSIONS: Younger patients were more likely to be female, had different risk factors and etiologies, and had fewer co-morbidities. There was no independent association between age and clinical outcome or mortality.
Abstract:
A study was conducted in the highlands of Ethiopia to identify and analyse the factors determining the adoption of environmental management measures. In 1985, Ethiopia was classified into low- and high-potential areas based on the suitability of the natural environment for rain-fed agriculture. To address these objectives, case study areas were randomly selected from the low-potential and high-potential areas. Data were collected through face-to-face interviews, key informants, focus group discussions, and field observation. In the low-potential areas, the physical environment ‒ particularly the soil and forest environments ‒ has shown substantial recovery. Similarly, the water environment has improved. However, in the high-potential areas sampled, these resources are still being degraded. A clear understanding of the benefits of soil conservation structures by farmers, active involvement and technical support from the government, and the full and genuine participation of farmers in communal environmental resource management activities were found to be the main factors in the adoption of environmental management measures.
Abstract:
We present a case of laparoscopic surgical management of an iatrogenic lymphorrhea using indocyanine green (ICG). A case of a patient who developed recurrent symptomatic lymphorrhea after laparoscopic radical hysterectomy and bilateral pelvic lymphadenectomy for an early stage cervical cancer is presented. Intraoperative bipedal interdigital subcutaneous injection of ICG exactly localized the disrupted lymphatic duct on fluorescence imaging performed with a near-infrared laparoscopic fluorescent optic device, thus allowing a successful surgical repair.