16 results for behaviours of concern

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Publisher:

Abstract:

This study analyzes the trend of environmental concern in Switzerland using data from the International Social Survey Program (ISSP) 1993, 2000, and 2010. First, we compare the observed trend with indicators of the intensity of public debate regarding the environment. The results show that both the number of articles dealing with environmental issues in print newspapers and the debates in the Swiss parliament strongly increased during the observed period. The ecological awareness of the population, however, remained constant over this time. Second, we scrutinize the "social basis" of environmental concern, paying particular attention to individuals' time preferences. Third, we investigate the relationship between environmental concern and proenvironmental behavior, on the one hand, and the relation of concern and the acceptance of governmental regulations, on the other hand.

Relevance:

90.00%

Publisher:

Abstract:

The concept of warning behaviors offers an additional perspective in threat assessment. Warning behaviors are acts which constitute evidence of increasing or accelerating risk. They are acute, dynamic, and particularly toxic changes in patterns of behavior which may aid in structuring a professional's judgment that an individual of concern now poses a threat - whether the actual target has been identified or not. They require an operational response. A typology of eight warning behaviors for assessing the threat of intended violence is proposed: pathway, fixation, identification, novel aggression, energy burst, leakage, directly communicated threat, and last resort warning behaviors. Previous research on risk factors associated with such warning behaviors is reviewed, and examples of each warning behavior from various intended violence cases are presented, including public figure assassination, adolescent and adult mass murder, corporate celebrity stalking, and both domestic and foreign acts of terrorism. Practical applications and future research into warning behaviors are suggested. Copyright © 2011 John Wiley & Sons, Ltd.

Relevance:

90.00%

Publisher:

Abstract:

Ninety strains of a collection of well-identified clinical isolates of gram-negative nonfermentative rods collected over a period of 5 years were evaluated using the new colorimetric VITEK 2 card. The VITEK 2 colorimetric system identified 53 (59%) of the isolates to the species level and 9 (10%) to the genus level; 28 (31%) isolates were misidentified. An algorithm combining the colorimetric VITEK 2 card and 16S rRNA gene sequencing for adequate identification of gram-negative nonfermentative rods was developed. According to this algorithm, any identification by the colorimetric VITEK 2 card other than Achromobacter xylosoxidans, Acinetobacter sp., Burkholderia cepacia complex, Pseudomonas aeruginosa, and Stenotrophomonas maltophilia should be subjected to 16S rRNA gene sequencing when accurate identification of nonfermentative rods is of concern.
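The decision rule of this algorithm can be sketched as a short hypothetical helper; the list of reliably identified taxa comes from the abstract, while the function name and usage are invented for illustration:

```python
# Hypothetical helper sketching the identification algorithm above:
# accept the colorimetric VITEK 2 result only for the five taxa it
# identifies reliably; send everything else to 16S rRNA sequencing.

RELIABLE_VITEK2_IDS = {
    "Achromobacter xylosoxidans",
    "Acinetobacter sp.",
    "Burkholderia cepacia complex",
    "Pseudomonas aeruginosa",
    "Stenotrophomonas maltophilia",
}

def needs_16s_sequencing(vitek2_id: str) -> bool:
    """True when the VITEK 2 identification should be confirmed by 16S."""
    return vitek2_id not in RELIABLE_VITEK2_IDS

print(needs_16s_sequencing("Pseudomonas aeruginosa"))  # False
print(needs_16s_sequencing("Ralstonia pickettii"))     # True
```

Any identification outside the five listed taxa is flagged for confirmatory sequencing.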

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: Steam pops are a risk of irrigated radiofrequency catheter ablation (RFA) and may cause cardiac perforation. Data to guide radiofrequency (RF) energy titration to avoid steam pops are limited. OBJECTIVE: This study sought to assess the frequency and consequence of audible pops and to determine the feasibility of using the magnitude of impedance change to predict pops. METHODS: We reviewed consecutive endocardial open-irrigated RFA for ventricular tachycardia (VT) with continuously recorded ablation data in 142 patients with structural heart disease. Steam pops were defined as an audible pop associated with a sudden spike in impedance. Ablation lesions before or after pops served as controls. RESULTS: From a total of 4,107 ablation lesions, 62 (1.5%) steam pops occurred in 42 procedures in 38 patients. Perforation with tamponade occurred with 1 of 62 (2%) pops. Applications with pops had a greater impedance decrease (22 ± 7 Ω vs. 18 ± 8 Ω, P = .001) and a higher maximum power (45 ± 5 W vs. 43 ± 6 W, P = .011), but did not differ in maximum catheter tip temperature (40 ± 4 °C vs. 40 ± 4 °C, P = .180) from applications without pops. Eighty percent of pops occurred after impedance decreased by at least 18 Ω. CONCLUSION: During VT ablation with open irrigation, audible pops are infrequent and do not usually cause perforation. Limiting RF power to achieve an impedance decrease of <18 Ω is a feasible method of reducing the likelihood of a pop when perforation risk is of concern.
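The titration rule in the conclusion can be sketched as a minimal check; the 18 Ω threshold is taken from the study, but the function and the impedance values are hypothetical illustrations:

```python
# Minimal sketch (hypothetical function, illustrative values) of the
# impedance-based titration rule: 80% of observed steam pops occurred
# after the impedance had dropped by at least 18 Ohm from baseline.

def impedance_drop_warning(baseline_ohm: float, current_ohm: float,
                           threshold_ohm: float = 18.0) -> bool:
    """True once the impedance decrease reaches the pop-associated threshold."""
    return (baseline_ohm - current_ohm) >= threshold_ohm

print(impedance_drop_warning(120.0, 101.0))  # True  (19 Ohm drop)
print(impedance_drop_warning(120.0, 105.0))  # False (15 Ohm drop)
```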

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: Yellow fever vaccine (17DV) has been investigated incompletely in human immunodeficiency virus (HIV)-infected patients, and adequate immunogenicity and safety are of concern in this population. METHODS: In the Swiss HIV Cohort Study, we identified 102 patients who received 17DV while they were HIV infected. We analyzed neutralization titers (NTs) after 17DV administration using the plaque reduction neutralization test. NTs of 1:≥10 were defined as reactive, and those of 1:<10 were defined as nonreactive, which was considered to be nonprotective. The results were compared with data for HIV-uninfected individuals. Serious adverse events were defined as hospitalization or death within 6 weeks after receipt of 17DV. RESULTS: At the time of 17DV administration, the median CD4 cell count was 537 cells/mm³ (range, 11-1730 cells/mm³), and the HIV RNA level was undetectable in 41 of 102 HIV-infected patients. During the first year after vaccination, fewer HIV-infected patients (65 [83%] of 78; P = .01) than HIV-uninfected patients revealed reactive NTs, and their NTs were significantly lower (P < .001) than in HIV-uninfected individuals. Eleven patients with initially reactive NTs lost these reactive NTs during follow-up. CONCLUSIONS: HIV-infected patients often demonstrate nonprotective NTs and may experience a more rapid decline in NTs during follow-up. Vaccination with 17DV appears to be safe in HIV-infected individuals who have high CD4 cell counts, although a rate of serious adverse events of up to 3% cannot be excluded.

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND Avoidable hospitalizations (AH) are hospital admissions for diseases and conditions that could have been prevented by appropriate ambulatory care. We examine regional variation of AH in Switzerland and the factors that determine AH. METHODS We used hospital service areas and data from 2008-2010 hospital discharges in Switzerland to examine regional variation in AH. Age- and sex-standardized AH were the outcome variable, and year of admission, primary care physician density, medical specialist density, rurality, hospital bed density and type of hospital reimbursement system were explanatory variables in our multilevel Poisson regression. RESULTS Regional differences in AH were as high as 12-fold. Poisson regression showed a significant increase in all AH over time. There was a significantly lower rate of all AH in areas with more primary care physicians. Rates increased in areas with more specialists. Rates of all AH also increased with the proportion of residents in rural communities. Regional hospital capacity and type of hospital reimbursement did not have significant associations. Inconsistent patterns of significant determinants were found for disease-specific analyses. CONCLUSION The identification of regions with high and low AH rates is a starting point for future studies on unwarranted medical procedures, and may help to reduce their incidence. AH have complex multifactorial origins, and this study demonstrates that rurality and physician density are relevant determinants. The results are helpful to improve the performance of the outpatient sector with emphasis on local context. Rural and urban differences in health care delivery remain a cause of concern in Switzerland.
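The age/sex standardization mentioned in the methods can be illustrated with a minimal sketch of direct standardization; the strata, rates and weights below are invented for illustration, not the Swiss data:

```python
# Hedged sketch of direct age/sex standardization: stratum-specific AH
# rates are weighted by a standard population, so regions with different
# demographics become comparable. All numbers below are invented.

def standardized_rate(stratum_rates, standard_weights):
    """Directly standardized rate: sum of rate_i * weight_i (weights sum to 1)."""
    assert abs(sum(standard_weights) - 1.0) < 1e-9
    return sum(r * w for r, w in zip(stratum_rates, standard_weights))

# Two illustrative strata (e.g. <65 and >=65 years), AH per 1,000 persons:
print(round(standardized_rate([2.0, 12.0], [0.8, 0.2]), 6))  # 4.0
```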

Relevance:

90.00%

Publisher:

Abstract:

INTRODUCTION HIV-infected pregnant women are very likely to engage in HIV medical care to prevent transmission of HIV to their newborn. After delivery, however, childcare and competing commitments might lead to disengagement from HIV care. The aim of this study was to quantify loss to follow-up (LTFU) from HIV care after delivery and to identify risk factors for LTFU. METHODS We used data on 719 pregnancies within the Swiss HIV Cohort Study from 1996 to 2012 with information on follow-up visits available. Two LTFU events were defined: no clinical visit for >180 days and no visit for >360 days in the year after delivery. Logistic regression analysis was used to identify risk factors for an LTFU event after delivery. RESULTS Median maternal age at delivery was 32 years (IQR 28-36); 357 (49%) women were black, 280 (39%) white, 56 (8%) Asian and 4% of other ethnicities. One hundred and seven (15%) women reported a history of intravenous drug use (IDU). The majority (524, 73%) of women received their HIV diagnosis before pregnancy; most of those (413, 79%) had lived with diagnosed HIV for longer than three years, and two-thirds (342, 65%) were already on antiretroviral therapy (ART) at the time of conception. Of the 181 women diagnosed during pregnancy by a screening test, 80 (44%) were diagnosed in the first trimester, 67 (37%) in the second and 34 (19%) in the third trimester. Of 357 (69%) women who had been seen in HIV medical care during the three months before conception, 93% achieved an undetectable HIV viral load (VL) at delivery. Of 62 (12%) women whose last medical visit was more than six months before conception, only 72% achieved an undetectable VL (p=0.001). Overall, 247 (34%) women were LTFU over 180 days in the year after delivery and 86 (12%) women were LTFU over 360 days, with 43 (50%) of those women returning.
Being LTFU for 180 days was significantly associated with history of intravenous drug use (aOR 1.73, 95% CI 1.09-2.77, p=0.021) and not achieving an undetectable VL at delivery (aOR 1.79, 95% CI 1.03-3.11, p=0.040) after adjusting for maternal age, ethnicity, time of HIV diagnosis and being on ART at conception. CONCLUSIONS Women with a history of IDU and women with a detectable VL at delivery were more likely to be LTFU after delivery. This is of concern regarding their own health, as well as risk for sexual partners and subsequent pregnancies. Further strategies should be developed to enhance retention in medical care beyond pregnancy.
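One simplified reading of the first LTFU definition above (no clinical visit within 180 days after delivery) can be sketched as follows; the dates are illustrative, not cohort data, and the actual study definition may handle visit gaps more precisely:

```python
# Hedged sketch (illustrative dates) of a simplified reading of the
# first LTFU definition: a woman is counted as lost to follow-up when
# no clinical visit falls within 180 days after the delivery date.
from datetime import date

def ltfu_180(delivery: date, visits: list) -> bool:
    """True when no visit occurs within 180 days after delivery."""
    return not any(0 <= (v - delivery).days <= 180 for v in visits)

visits = [date(2010, 1, 10), date(2010, 12, 1)]
print(ltfu_180(date(2010, 1, 1), visits))  # False (visit on day 9)
```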

Relevance:

90.00%

Publisher:

Abstract:

OBJECTIVES In Europe and elsewhere, health inequalities among HIV-positive individuals are of concern. We investigated late HIV diagnosis and late initiation of combination antiretroviral therapy (cART) by educational level, a proxy of socioeconomic position. DESIGN AND METHODS We used data from nine HIV cohorts within COHERE in Austria, France, Greece, Italy, Spain and Switzerland, collecting data on level of education in categories of the UNESCO/International Standard Classification of Education standard classification: non-completed basic, basic, secondary and tertiary education. We included individuals diagnosed with HIV between 1996 and 2011, aged at least 16 years, with known educational level and at least one CD4 cell count within 6 months of HIV diagnosis. We examined trends by education level in presentation with advanced HIV disease (AHD) (CD4 <200 cells/μl or AIDS within 6 months) using logistic regression, and the distribution of CD4 cell count at cART initiation, overall and among presenters without AHD, using median regression. RESULTS Among 15,414 individuals, 52, 45, 37, and 31% with non-completed basic, basic, secondary and tertiary education, respectively, presented with AHD (P trend <0.001). Compared to patients with tertiary education, adjusted odds ratios of AHD were 1.72 (95% confidence interval 1.48-2.00) for non-completed basic, 1.39 (1.24-1.56) for basic and 1.20 (1.08-1.34) for secondary education (P < 0.001). In unadjusted and adjusted analyses, median CD4 cell count at cART initiation was lower with poorer educational level. CONCLUSIONS Socioeconomic inequalities in delayed HIV diagnosis and initiation of cART are present in European countries with universal healthcare systems, and individuals with lower educational level do not equally benefit from timely cART initiation.
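As a rough illustration of the odds-ratio comparisons reported above, an unadjusted OR can be computed from hypothetical 2×2 counts; the percentages echo the abstract, but the counts are invented, and the study's adjusted estimates also control for covariates, so these numbers are not expected to match:

```python
# Illustrative unadjusted odds ratio from hypothetical counts per 100
# individuals; the adjusted odds ratios in the abstract additionally
# control for covariates, so this sketch will not reproduce them.

def odds_ratio(a_cases, a_total, b_cases, b_total):
    """Unadjusted OR of group A versus reference group B."""
    odds_a = a_cases / (a_total - a_cases)
    odds_b = b_cases / (b_total - b_cases)
    return odds_a / odds_b

# 52% AHD with non-completed basic vs. 31% with tertiary education:
print(round(odds_ratio(52, 100, 31, 100), 2))  # 2.41
```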

Relevance:

90.00%

Publisher:

Abstract:

Introduction The global prevalence of pathologic myopia is 0.9-3.1%, and associated visual impairment is reported in 0.1-0.5% of individuals in European studies and 0.2-1.4% in Asian studies. Myopic choroidal neovascularization (mCNV) affects 5.2-11.3% of pathologic myopia patients and is a leading cause of vision impairment in the working-age population. Characteristic morphological changes and a decrease in visual acuity are diagnostic features. Vascular endothelial growth factor (VEGF) has been identified as a trigger for pathologic neovascularization in these highly myopic patients. Areas Covered We cover the epidemiology, pathology and diagnostic aspects of mCNV. The history of therapeutic interventions is described, followed by an overview of the current standard of care (SOC): blocking VEGF using bevacizumab (off-label), ranibizumab or aflibercept, which improves vision by up to 13.5-14.4 letters. Despite good efficacy, an unmet medical need remains. We summarize ongoing and future developments of new drugs to treat or potentially cure mCNV. Expert Opinion mCNV is a major global health concern. Early detection and treatment are key to a satisfying outcome. The current SOC, VEGF inhibitors, affords good therapeutic efficacy and reasonable disease stabilization with few intravitreal treatments per year. However, the long-term prognosis is still unsatisfactory, and side effects such as the development of chorioretinal atrophy are of concern. Therefore, efforts should be intensified to develop more effective therapies.

Relevance:

90.00%

Publisher:

Abstract:

While many myxozoan parasites produce asymptomatic infections in fish hosts, several species cause diseases whose patterns of prevalence and pathogenicity are highly dependent on host and environmental factors. This chapter reviews how these factors influence pathogenicity and disease prevalence. Influential host factors include age, size and nutritional state. There is also strong evidence for host strains that vary in resistance to infection and for a genetic basis of resistance. A lack of co-evolutionary processes appears to generally underlie the devastating impacts of diseases caused by myxozoans when introduced fish are exposed to novel parasites (e.g. PKD in rainbow trout in Europe) or when native fish are exposed to an introduced parasite (e.g. whirling disease in North America). Most available information on abiotic factors relates to water temperature, which has been shown to play a crucial role in several host-parasite systems (e.g. whirling disease, PKD) and is therefore of concern in view of global warming, fish health and food sustainability. Eutrophication may also influence disease development. Abiotic factors may also drive fish disease via their impact on parasite development in invertebrate hosts.

Relevance:

90.00%

Publisher:

Abstract:

Zoonoses, diseases affecting both humans and animals, can exert tremendous pressures on human and veterinary health systems, particularly in resource-limited countries. Anthrax is one such zoonosis of concern and is a disease requiring greater public health attention in Nigeria. Here we describe the genetic diversity of Bacillus anthracis in Nigeria and compare it to Chad, Cameroon and a broader global dataset based on the multiple locus variable number tandem repeat (MLVA-25) genetic typing system. Nigerian B. anthracis isolates had identical MLVA genotypes and could only be resolved by measuring highly mutable single nucleotide repeats (SNRs). The Nigerian MLVA genotype was identical or highly genetically similar to those in the neighboring countries, confirming the strains belong to this unique West African lineage. Interestingly, sequence data from a Nigerian isolate shares the anthrose-deficient genotypes previously described for strains in this region, which may be associated with vaccine evasion. Strains in this study were isolated over six decades, indicating a high level of temporal strain stability regionally. Ecological niche models were used to predict the geographic distribution of the pathogen for all three countries. We describe a west-east habitat corridor through northern Nigeria extending into Chad and Cameroon. Ecological niche models and genetic results show B. anthracis to be ecologically established in Nigeria. These findings expand our understanding of the global B. anthracis population structure and can guide regional anthrax surveillance and control planning.
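The MLVA-based comparison described above can be sketched as a simple profile match; the repeat counts below are invented, the only grounded detail being that MLVA-25 profiles consist of repeat counts at 25 loci:

```python
# Hedged sketch of MLVA genotype comparison: each isolate is reduced to
# a tuple of tandem-repeat counts (25 loci in MLVA-25), and two isolates
# share a genotype when every locus matches. Counts below are invented.

def same_mlva_genotype(profile_a: tuple, profile_b: tuple) -> bool:
    """True when two MLVA profiles are identical at every locus."""
    return len(profile_a) == len(profile_b) and profile_a == profile_b

isolate_1 = (3, 2, 5) + (1,) * 22   # illustrative 25-locus profile
isolate_2 = (3, 2, 5) + (1,) * 22
print(same_mlva_genotype(isolate_1, isolate_2))  # True
```

Isolates with identical profiles, as in Nigeria, then require a finer marker such as SNRs for further resolution.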

Relevance:

90.00%

Publisher:

Abstract:

Environmental quality monitoring of water resources is challenged with providing the basis for safeguarding the environment against adverse biological effects of anthropogenic chemical contamination from diffuse and point sources. While current regulatory efforts focus on monitoring and assessing a few legacy chemicals, many more anthropogenic chemicals can be detected simultaneously in our aquatic resources. However, exposure to chemical mixtures does not necessarily translate into adverse biological effects, nor does it clearly show whether mitigation measures are needed. Thus, the question of which mixtures are present and which have associated combined effects becomes central for defining adequate monitoring and assessment strategies. Here we describe the vision of the international, EU-funded project SOLUTIONS, where three routes are explored to link the occurrence of chemical mixtures at specific sites to the assessment of adverse biological combination effects. First of all, multi-residue target and non-target screening techniques covering a broader range of anticipated chemicals co-occurring in the environment are being developed. By improving sensitivity and detection limits for known bioactive compounds of concern, new analytical chemistry data for multiple components can be obtained and used to characterise priority mixtures. This information on chemical occurrence will be used to predict mixture toxicity and to derive combined effect estimates suitable for advancing environmental quality standards. Secondly, bioanalytical tools will be explored to provide aggregate bioactivity measures integrating all components that produce common (adverse) outcomes even for mixtures of varying compositions. 
The ambition is to provide comprehensive arrays of effect-based tools and trait-based field observations that link multiple chemical exposures to various environmental protection goals more directly and to provide improved in situ observations for impact assessment of mixtures. Thirdly, effect-directed analysis (EDA) will be applied to identify major drivers of mixture toxicity. Refinements of EDA include the use of statistical approaches with monitoring information for guidance of experimental EDA studies. These three approaches will be explored using case studies at the Danube and Rhine river basins as well as rivers of the Iberian Peninsula. The synthesis of findings will be organised to provide guidance for future solution-oriented environmental monitoring and explore more systematic ways to assess mixture exposures and combination effects in future water quality monitoring.

Relevance:

90.00%

Publisher:

Abstract:

INTRODUCTION Extended-spectrum beta-lactamases (ESBL) and AmpC beta-lactamases (AmpC) are of concern for veterinary and public health because of their ability to cause treatment failure due to antimicrobial resistance in Enterobacteriaceae. The main objective was to assess the relative contribution (RC) of different types of meat to the exposure of consumers to ESBL/AmpC and their potential importance for human infections in Denmark. MATERIAL AND METHODS The prevalence of each genotype of ESBL/AmpC-producing E. coli in imported and nationally produced broiler meat, pork and beef was weighted by the meat consumption patterns. Data originated from the Danish surveillance program for antibiotic use and antibiotic resistance (DANMAP) from 2009 to 2011. DANMAP also provided data about human ESBL/AmpC cases in 2011, which were used to assess a possible genotype overlap. Uncertainty about the occurrence of ESBL/AmpC-producing E. coli in meat was assessed by inspecting beta distributions given the available data of the genotypes in each type of meat. RESULTS AND DISCUSSION Broiler meat represented the largest part (83.8%) of the estimated ESBL/AmpC-contaminated pool of meat compared to pork (12.5%) and beef (3.7%). CMY-2 was the genotype with the highest RC to human exposure (58.3%). However, this genotype is rarely found in human infections in Denmark. CONCLUSION The overlap between ESBL/AmpC genotypes in meat and human E. coli infections was limited. This suggests that meat might constitute a less important source of ESBL/AmpC exposure to humans in Denmark than previously thought - maybe because the use of cephalosporins is restricted in cattle and banned in poultry and pigs. Nonetheless, more detailed surveillance data are required to determine the contribution of meat compared to other sources, such as travelling, pets, water resources, community and hospitals in the pursuit of a full source attribution model.
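The beta-distribution uncertainty assessment mentioned in the methods can be sketched as follows, assuming a uniform prior; the sample counts are illustrative, not DANMAP data:

```python
# Hedged sketch of the beta-distribution uncertainty assessment: with s
# positive samples out of n tested and a uniform prior, the plausible
# prevalence of a genotype in a meat type is described by the
# Beta(s + 1, n - s + 1) distribution. Counts below are illustrative.

def beta_posterior(s: int, n: int):
    """Return (alpha, beta, mean) of the Beta distribution for prevalence."""
    alpha, beta = s + 1, n - s + 1
    return alpha, beta, alpha / (alpha + beta)

a, b, mean = beta_posterior(12, 200)   # e.g. 12 positives in 200 samples
print(a, b, round(mean, 4))            # 13 189 0.0644
```

Inspecting the spread of this distribution shows how much the estimated occurrence could vary given the limited sampling.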