901 results for POTENTIAL HEALTH-RISK
Abstract:
Vegetables belonging to the Brassicaceae family are rich in biologically active molecules known for their numerous health-promoting properties. The effect of a Tuscan black cabbage sprout extract (TBCSE) was investigated, in chemopreventive terms, on hepatic xenobiotic-metabolizing and antioxidant enzymes in rats treated with TBCSE. The results showed a complex pattern of modulation of the cytochrome P450-dependent system, with a prevailing inhibition, together with significant inductions of phase II enzymes (glutathione transferase and glucuronosyltransferase) and antioxidant enzymes (catalase, NAD(P)H:quinone reductase, glutathione reductase and peroxidase). Subsequently, the effect of TBCSE was studied against the alterations caused by a high-fat diet in the rat. The treatment proved effective in counteracting the deleterious effects of dietary fat, such as hyperlipidemia, increased body and liver weight, and the weakening of antioxidant enzyme activities and of the detoxifying potential in the liver. Overall, TBCSE emerges as a promising nutraceutical product with potential chemopreventive effects, and as a dietary strategy to counteract the effects associated with a high-fat diet. However, the consumption of supra-dietary doses of molecules isolated from Brassicaceae, for example through dietary supplements, as a preventive dietary strategy could represent a health risk. The potential toxicity of sulforaphane, glucoraphanin, indole-3-carbinol, and 3,3'-diindolylmethane was evaluated in primary rat hepatocytes. The cytotoxicity and induction of oxidative stress observed at concentrations not far from those that could be reached in vivo, together with a strong modulation of gene expression, mainly concerning xenobiotic metabolism, responses to alterations of the redox state, DNA and protein repair events, induction of apoptosis, and (co)carcinogenic mechanisms, underline the potential of these molecules to pose a toxicological risk following prolonged, high-dose intake.
Abstract:
Cellulose nanofibers are an attractive component of a broad range of nanomaterials. Their intriguing mechanical properties and low cost, as well as the renewable nature of cellulose, make them an appealing alternative to carbon nanotubes (CNTs), which may pose a considerable health risk when inhaled. Little is known, however, about the potential toxicity of aerosolized cellulose nanofibers. Using a 3D in vitro triple cell coculture model of the human epithelial airway barrier, it was observed that cellulose nanofibers isolated from cotton (CCN) elicited a significantly (p < 0.05) lower cytotoxic and (pro-)inflammatory response than multiwalled CNTs (MWCNTs) and crocidolite asbestos fibers (CAFs). Electron tomography analysis also revealed that the intracellular localization of CCNs differs from that of both MWCNTs and CAFs, indicating fundamental differences between the nanofiber types in their interaction with the human lung cell coculture. Thus, the data presented here highlight that length and stiffness are not the only determinants of a nanofiber's potential detrimental (biological) effects; the material itself can also significantly affect nanofiber-cell interactions.
Abstract:
INTRODUCTION: Surgical site infections (SSI) are the most common hospital-acquired infections among surgical patients, with significant impact on patient morbidity and health care costs. The Basel SSI Cohort Study was performed to evaluate risk factors and validate current preventive measures for SSI. The objective of the present article was to review the main results of this study and its implications for clinical practice and future research. SUMMARY OF METHODS OF THE BASEL SSI COHORT STUDY: The prospective observational cohort study included 6,283 consecutive general surgery procedures closely monitored for evidence of SSI up to 1 year after surgery. The dataset was analysed for the influence of various potential SSI risk factors, including timing of surgical antimicrobial prophylaxis (SAP), glove perforation, anaemia, transfusion and tutorial assistance, using multiple logistic regression analyses. In addition, post hoc analyses were performed to assess the economic burden of SSI, the efficiency of the clinical SSI surveillance system, and the spectrum of SSI-causing pathogens. REVIEW OF MAIN RESULTS OF THE BASEL SSI COHORT STUDY: The overall SSI rate was 4.7% (293/6,283). While SAP was administered in most patients between 44 and 0 minutes before surgical incision, the lowest risk of SSI was recorded when the antibiotics were administered between 74 and 30 minutes before surgery. Glove perforation in the absence of SAP increased the risk of SSI (OR 2.0; CI 1.4-2.8; p <0.001). No significant association was found for anaemia, transfusion and tutorial assistance with the risk of SSI. The mean additional hospital cost in the event of SSI was CHF 19,638 (95% CI, 8,492-30,784). The surgical staff documented only 49% of in-hospital SSI; the infection control team registered the remaining 51%. Staphylococcus aureus was the most common SSI-causing pathogen (29% of all SSI with documented microbiology). No case of an antimicrobial-resistant pathogen was identified in this series. CONCLUSIONS: The Basel SSI Cohort Study suggested that SAP should be administered between 74 and 30 minutes before surgery. Due to the observational nature of these data, corroboration is planned in a randomized controlled trial, which is supported by the Swiss National Science Foundation. Routine change of gloves or double gloving is recommended in the absence of SAP. Anaemia, transfusion and tutorial assistance do not increase the risk of SSI. The substantial economic burden of in-hospital SSI has been confirmed. SSI surveillance by the surgical staff detected only half of all in-hospital SSI, which prompted the introduction of an electronic SSI surveillance system at the University Hospital of Basel and the Cantonal Hospital of Aarau. Due to the absence of multiresistant SSI-causing pathogens, the continuous use of single-shot single-drug SAP with cefuroxime (plus metronidazole in colorectal surgery) has been validated.
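The risk-factor analysis described above rests on multiple logistic regression reported as odds ratios with 95% CIs. As a rough illustration only, the sketch below fits such a model with statsmodels; the file name and column names are hypothetical, not the study's actual coding.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical patient-level dataset: one row per procedure, ssi in {0, 1}.
df = pd.read_csv("ssi_cohort.csv")

# Multiple logistic regression of SSI on candidate risk factors
# (variable names are illustrative, not the study's actual coding).
fit = smf.logit(
    "ssi ~ C(sap_timing_cat) + glove_perforation + anaemia"
    " + transfusion + tutorial_assistance",
    data=df,
).fit()

# Report effects as odds ratios with 95% CIs, the form used in the study,
# e.g. glove perforation without SAP: OR 2.0 (95% CI 1.4-2.8).
ors = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
ors.columns = ["OR", "2.5%", "97.5%"]
print(ors)
```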
Abstract:
Soil erosion models and soil erosion risk maps are often used as indicators to assess potential soil erosion and thereby assist policy decisions. This paper presents the scientific basis of the soil erosion risk map of Switzerland and its application in policy and practice. Linking a USLE/RUSLE-based model approach (AVErosion), founded on multiple flow algorithms and the unit contributing area concept, with an extremely precise, high-resolution digital terrain model (2 m × 2 m grid) in a GIS allows a realistic, plot-level assessment of potential soil erosion risk that is uniform and comprehensive across the agricultural area of Switzerland (862,579 ha in the valley area and the lower mountain regions). National, small-scale soil erosion prognosis has thus reached a level of detail heretofore possible only in smaller catchment areas or on single plots. Validation was carried out using soil loss data from field mappings of soil erosion damage collected during long-term monitoring in different test areas. Of the evaluated agricultural area of Switzerland, 45% was classified as low potential erosion risk, 12% as moderate, and 43% as high. However, many of the areas classified as high potential erosion risk are located at the transition from the valley to the mountain zone, where much of the land is used as permanent grassland, which drastically lowers the current erosion risk. The soil erosion risk map serves on the one hand to identify and prioritise high-risk areas, and on the other to promote awareness among farmers and authorities. It has been published on the internet and will be made available to the authorities in digital form. It is intended as a tool for simplifying and standardising enforcement of the legal framework for soil erosion prevention in Switzerland. The work thus provides a successful example of cooperation between science, policy and practice.
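Since the AVErosion approach is USLE/RUSLE-based, each grid cell's potential soil loss reduces at its core to the classic factor product A = R · K · LS · C · P. The sketch below illustrates that product and the map's three-class binning; every factor value and class threshold is a placeholder, not Swiss calibration data.

```python
# Minimal per-cell sketch of the USLE/RUSLE factor product underlying
# AVErosion-type models: A = R * K * LS * C * P (soil loss in t/ha/yr).
# All values below are illustrative placeholders; the real model derives
# LS from multiple-flow-direction contributing areas on the 2 m grid.
import numpy as np

R = np.full((2, 4), 95.0)                  # rainfall erosivity
K = np.full((2, 4), 0.30)                  # soil erodibility
LS = np.array([[0.5, 1.1, 2.4, 4.0],
               [0.7, 1.6, 2.9, 3.5]])      # slope length/steepness from DEM
C = np.full((2, 4), 0.15)                  # cover management (arable land)
P = np.ones((2, 4))                        # support practices (none)

A = R * K * LS * C * P                     # potential soil loss per cell

# Bin into the map's three classes; thresholds here are illustrative only.
risk_class = np.digitize(A, bins=[2.0, 8.0])   # 0=low, 1=moderate, 2=high
print(np.round(A, 2))
print(risk_class)
```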
Abstract:
OBJECTIVES: To validate the Probability of Repeated Admission (Pra) questionnaire, a widely used self-administered tool for predicting future healthcare use in older persons, in three European healthcare systems. DESIGN: Prospective study with 1-year follow-up. SETTING: Hamburg, Germany; London, United Kingdom; Canton of Solothurn, Switzerland. PARTICIPANTS: Nine thousand seven hundred thirteen independently living community-dwelling people aged 65 and older. MEASUREMENTS: Self-administered eight-item Pra questionnaire at baseline. Self-reported number of hospital admissions and physician visits during 1 year of follow-up. RESULTS: In the combined sample, areas under the receiver operating characteristic curves (AUCs) were 0.64 (95% confidence interval (CI)=0.62-0.66) for the prediction of one or more hospital admissions and 0.68 (95% CI=0.66-0.69) for the prediction of more than six physician visits during the following year. AUCs were similar between sites. In comparison, prediction models based on a person's age and sex alone exhibited poor predictive validity (AUC
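The validation metric throughout this study is the area under the ROC curve. As a minimal sketch of that computation, assuming a hypothetical follow-up table with a Pra score, age, sex (coded 0/1) and a binary admission outcome:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical follow-up table: pra_score, age, sex (0/1), admitted (0/1).
df = pd.read_csv("pra_followup.csv")

# Discrimination of the Pra score itself for >= 1 admission in follow-up.
auc_pra = roc_auc_score(df["admitted"], df["pra_score"])

# Comparison model using age and sex alone, as in the study.
base = LogisticRegression().fit(df[["age", "sex"]], df["admitted"])
auc_base = roc_auc_score(df["admitted"],
                         base.predict_proba(df[["age", "sex"]])[:, 1])

# The study reports Pra AUCs of 0.64 (admissions) and 0.68 (physician visits).
print(f"Pra AUC: {auc_pra:.2f}   age/sex AUC: {auc_base:.2f}")
```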
Abstract:
The potential health effects of inhaled engineered nanoparticles are largely unknown. To avoid and replace animal toxicity studies, a triple cell co-culture system composed of epithelial cells, macrophages and dendritic cells was established, which simulates the most important barrier functions of the epithelial airway. Using this model, the toxic potential of titanium dioxide was assessed by measuring the production of reactive oxygen species and the release of tumour necrosis factor alpha. The intracellular localisation of titanium dioxide nanoparticles was analysed by energy-filtering transmission electron microscopy. Titanium dioxide nanoparticles were detected as single particles without membranes and in membrane-bound agglomerates. Cells incubated with titanium dioxide particles showed an elevated production of reactive oxygen species but no increased release of tumour necrosis factor alpha. Our in vitro model of the epithelial airway barrier offers a valuable tool to study the interaction of particles with lung cells at a nanostructural level and to investigate the toxic potential of nanoparticles.
Abstract:
BACKGROUND: Several epidemiological studies show that inhalation of particulate matter may cause increased pulmonary morbidity and mortality. Of particular interest are ultrafine particles, which are especially toxic. In addition, more and more nanoparticles are being released into the environment; however, the potential health effects of these nanoparticles are as yet unknown. OBJECTIVES: To avoid particle toxicity studies with animals, many cell culture models have been developed over the past years. METHODS: This review focuses on the most commonly used in vitro epithelial airway and alveolar models for studying particle-cell interactions and particle toxicity, and highlights the advantages and disadvantages of the different models. RESULTS/CONCLUSION: There are many lung cell culture models, but none of them is perfect. Nevertheless, they can be a valuable tool for basic research and toxicity testing. The focus here is on 3D and co-culture models, which appear more realistic than monocultures.
Abstract:
Meat and meat products can be contaminated with different species of bacteria resistant to various antimicrobials. The human health risk carried by emerging antimicrobial resistance in a given type of meat or meat product depends on (i) the prevalence of contamination with resistant bacteria, (ii) the human health consequences of an infection with a specific bacterium resistant to a specific antimicrobial and (iii) the consumption volume of the specific product. The objective of this study was to compare the risk for consumers arising from their exposure to antibiotic-resistant bacteria from meat of four different types (chicken, pork, beef and veal), distributed across four product categories (fresh meat, frozen meat, dried raw meat products and heat-treated meat products). A semi-quantitative risk assessment model evaluating each food chain step was built to obtain an estimated score for the prevalence of Campylobacter spp., Enterococcus spp. and Escherichia coli in each product category. To assess human health impact, nine combinations of bacterial species and antimicrobial agents were considered, based on a published risk profile. Combining the prevalence at retail, the human health impact and the amount of meat or product consumed provided the relative proportion of total risk attributed to each product category, resulting in a high, medium or low human health risk. According to the results of the model, chicken (mostly fresh and frozen meat) contributed 6.7% of the overall risk in the highest category and pork (mostly fresh meat and dried raw meat products) contributed 4.0%. The contributions of beef and veal were 0.4% and 0.1%, respectively. The results were tested and discussed for single-parameter changes of the model. This risk assessment was a useful tool for targeting antimicrobial resistance monitoring towards those meat product categories where the expected risk for public health is greatest.
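The scoring logic described above, a prevalence score combined with a health-impact weight and consumption volume, then normalised to a relative share of total risk, can be illustrated with a toy model. All category names, scores and volumes below are invented for illustration and are not the study's inputs.

```python
# Toy sketch of a semi-quantitative risk share per product category:
# prevalence score x impact weight x consumption volume, normalised
# over all categories. All numbers are invented placeholders.

categories = {
    # (prevalence score, impact weight, consumption volume)
    "chicken fresh":  (0.8, 0.9, 40.0),
    "chicken frozen": (0.7, 0.9, 15.0),
    "pork fresh":     (0.4, 0.7, 60.0),
    "pork dried raw": (0.5, 0.7, 10.0),
    "beef fresh":     (0.2, 0.5, 45.0),
    "veal fresh":     (0.1, 0.5, 8.0),
}

raw = {name: p * w * v for name, (p, w, v) in categories.items()}
total = sum(raw.values())

for name, score in sorted(raw.items(), key=lambda kv: -kv[1]):
    print(f"{name:16s} {100 * score / total:5.1f}% of total risk")
```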
Abstract:
Calf losses (CL, mortality and unwanted early slaughter) in veal production are of great economic importance and an indicator of welfare. The objective of the present study was to evaluate CL and the causes of death on farms with a specific animal welfare standard (SAW) which exceeds the Swiss statutory regulations. Risk factors for CL were identified based on information about management, housing, feeding, and medication. In total, 74 production cohorts (2,783 calves) on 15 farms were investigated. CL was 3.6%; the main causes of death were digestive disorders (52%), followed by respiratory diseases (28%). Factors significantly associated with an increased risk of CL were a higher number of individual daily doses of antibiotics (DDA), insufficient wind deflection in winter, and male gender. When antibiotics were administered to all calves of a cohort, a DDA of 14-21 days was associated with a decreased risk of CL compared to a DDA of 7-13 days.
Abstract:
Background Few data exist on tuberculosis (TB) incidence according to time from HIV seroconversion in high-income countries, or on whether rates following initiation of combination antiretroviral treatment (cART) differ from those soon after seroconversion. Methods Data on individuals with well-estimated dates of HIV seroconversion were used to analyse post-seroconversion TB rates, ending at the earliest of 1 January 1997, death or last clinic visit. TB rates were also estimated following cART initiation, ending at the earliest of death or last clinic visit. Poisson models were used to examine the effect of current and past level of immunosuppression on TB risk after cART initiation. Results Of 19,815 individuals at risk during 1982-1996, TB incidence increased from 5.89/1000 person-years (PY) (95% CI 3.77 to 8.76) in the first year after seroconversion to 10.56 (4.83 to 20.04, p=0.01) at 10 years. Among 11,178 TB-free individuals initiating cART, the TB rate in the first year after cART initiation was 4.23/1000 PY (3.07 to 5.71) and dropped thereafter, remaining constant from year 2 onwards and averaging 1.64/1000 PY (1.29 to 2.05). Current CD4 count was inversely associated with TB rates, while nadir CD4 count was not associated with TB rates after adjustment for current CD4 count and HIV-RNA at cART initiation. Conclusions TB risk increases with duration of HIV infection in the absence of cART. Following cART initiation, TB incidence rates were lower than levels immediately following seroconversion. Implementation of current recommendations to prevent TB in early HIV infection could be beneficial.
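Rates in this abstract are reported as events per 1000 person-years with 95% CIs. Below is a minimal sketch of such a rate with an exact Poisson confidence interval; the event and person-time counts are illustrative, since the abstract does not give the underlying person-time per stratum.

```python
from scipy.stats import chi2

def rate_per_1000py(events: int, person_years: float, alpha: float = 0.05):
    """Incidence rate per 1000 PY with an exact Poisson 95% CI."""
    rate = 1000 * events / person_years
    lo = 1000 * chi2.ppf(alpha / 2, 2 * events) / (2 * person_years)
    hi = 1000 * chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / (2 * person_years)
    return rate, lo, hi

# Illustrative counts only; reproduces the form "4.23/1000 PY (3.07 to 5.71)".
print(rate_per_1000py(events=47, person_years=11100.0))
```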
Abstract:
Research on endocrine disruption in fish has been dominated by studies on estrogen-active compounds, which act as mimics of the natural estrogen 17β-estradiol (E2) and generally exert their biological actions by binding to and activating estrogen receptors (ERs). Estrogens play central roles in reproductive physiology and regulate (female) sexual differentiation. In line with this, most adverse effects reported for fish exposed to environmental estrogens relate to sexual differentiation and reproduction. E2, however, utilizes a variety of signaling mechanisms and has multifaceted functions and targets, and the toxicological and ecological effects of environmental estrogens in fish will therefore extend beyond those associated with reproduction. This review first describes the diversity of estrogen receptor signaling in fish, including both genomic and non-genomic mechanisms, and receptor crosstalk. It then considers the range of non-reproductive physiological processes in fish that are known to be responsive to estrogens, including sensory systems, the brain, the immune system, growth (specifically through the growth hormone/insulin-like growth factor system), and osmoregulation. The diversity in estrogen responses between fish species is then addressed, framed within evolutionary and ecological contexts, and we assess its relevance for toxicological sensitivity as well as ecological vulnerability. The diversity of estrogen actions raises the question of whether current risk assessment strategies, which focus on reproductive endpoints and on a few model fish species only, are protective against the wider potential health effects of estrogens. The available, although limited, evidence nevertheless suggests that quantitative environmental threshold concentrations derived from reproductive tests with model fish species are protective against non-reproductive effects as well. The diversity of estrogen actions across divergent physiological systems, however, may lead to an underestimation of impacts on fish populations, because effects are generally considered for one functional process at a time, which may underrepresent the collective impact on the different physiological processes.
Abstract:
OBJECT The risk of recurrence of cerebrovascular events within the first 72 hours of admission in patients hospitalized with symptomatic carotid artery (CA) stenoses and the risks and benefits of emergency CA intervention within the first hours after the onset of symptoms are not well known. Therefore, the authors aimed to assess (1) the ipsilateral recurrence rate within 72 hours of admission, in the period from 72 hours to 7 days, and after 7 days in patients presenting with nondisabling stroke, transient ischemic attack (TIA), or amaurosis fugax (AF), and with an ipsilateral symptomatic CA stenosis of 50% or more, and (2) the risk of stroke in CA interventions within 48 hours of admission versus the risk in interventions performed after 48 hours. METHODS Ninety-four patients were included in this study. These patients were admitted to hospital within 48 hours of a nondisabling stroke, TIA, or AF resulting from a symptomatic CA stenosis of 50% or more. The patients underwent carotid endarterectomy (85 patients) or CA stenting (9 patients). At baseline, the cardiovascular risk factors of the patients, the degree of symptomatic CA stenosis, and the type of secondary preventive treatment were assessed. The in-hospital recurrence rate of stroke, TIA, or AF ipsilateral to the symptomatic CA stenosis was determined for the first 72 hours after admission, from 72 hours to 7 days, and after 7 days. Procedure-related cerebrovascular events were also recorded. RESULTS The median time from symptom onset to CA intervention was 5 days (interquartile range 3.00-9.25 days). Twenty-one patients (22.3%) underwent CA intervention within 48 hours after being admitted. Overall, 15 recurrent cerebrovascular events were observed in 12 patients (12.8%) in the period between admission and CA intervention: 3 strokes (2 strokes in progress and 1 stroke) (3.2%), 5 TIAs (5.3%), and 1 AF (1.1%) occurred within the first 72 hours (total 9.6%) of admission; 1 TIA (1.1%) occurred between 72 hours and 7 days, and 5 TIAs (5.3%) occurred after more than 7 days. The corresponding actuarial cerebrovascular recurrence rates were 11.4% (within 72 hours of admission), 2.4% (between 72 hours and 7 days), and 7.9% (after 7 days). Among baseline characteristics, no predictive factors for cerebrovascular recurrence were identified. Procedure-related cerebrovascular events occurred at a rate of 4.3% (3 strokes and 1 TIA), and procedures performed within the first 48 hours and procedures performed after 48 hours had a similar frequency of these events (4.5% vs. 4.1%, respectively; p = 0.896). CONCLUSIONS The in-hospital recurrence of cerebrovascular events was quite low, but all recurrent strokes occurred within 72 hours. The risk of stroke associated with a CA intervention performed within the first 48 hours was not increased compared with that for later interventions. This raises the question of the optimal timing of CA intervention in symptomatic CA stenosis. To answer this question, more data are needed, preferably from large randomized trials.
Abstract:
Background: The prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. Methods: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models with repeated measurements. Results: Hypertension was diagnosed in 2,595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found a mean (95% confidence interval) decrease in systolic and diastolic blood pressure of −0.82 (−1.06 to −0.58) mm Hg/yr and −0.89 (−1.05 to −0.73) mm Hg/yr, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, an increase in systolic blood pressure [hazard ratio 1.18 (1.06 to 1.32) per 10 mm Hg increase], total cholesterol, smoking, age, and cumulative exposure to protease inhibitor-based and triple nucleoside regimens were associated with cardiovascular events. Conclusions: Insufficient control of hypertension was associated with an increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.
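The methods describe modelling blood pressure change with linear mixed models for repeated measurements. A minimal sketch with statsmodels MixedLM follows, assuming a hypothetical long-format table of visits per patient; file and column names are illustrative.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format table: one row per visit, with patient_id,
# years since hypertension diagnosis, systolic BP, and treated (0/1).
bp = pd.read_csv("shcs_bp.csv")

# Linear mixed model: fixed slope for time (by treatment status), random
# intercept and slope per patient. The abstract reports a mean decline of
# -0.82 mm Hg/yr systolic in treated patients.
m = smf.mixedlm(
    "sbp ~ years * treated",
    data=bp,
    groups=bp["patient_id"],
    re_formula="~years",
).fit()
print(m.summary())
```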
Abstract:
OBJECTIVE To explore the risk of ischemic stroke, hemorrhagic stroke, or TIA in patients with Alzheimer disease (AD) or vascular dementia (VD). METHODS We conducted a follow-up study with a nested case-control analysis using the UK-based General Practice Research Database. We included patients aged 65 years and older with an incident diagnosis of AD or VD between 1998 and 2008 and a comparison group of dementia-free patients. We estimated incidence rates of ischemic stroke, hemorrhagic stroke, or TIA in patients with AD, VD, or without dementia, and we calculated odds ratios with 95% confidence intervals (CIs) of developing such an outcome in patients with AD or VD, stratified by use of antipsychotic drugs. RESULTS We followed 6,443 cases with AD, 2,302 with VD, and 9,984 dementia-free patients over time and identified 281 cases with incident ischemic stroke, 139 with hemorrhagic stroke, and 379 with TIA. The incidence rates of ischemic stroke for patients with AD, VD, or no dementia were 4.7/1,000 person-years (PYs) (95% CI 3.8-5.9), 12.8/1,000 PYs (95% CI 9.8-16.8), and 5.1/1,000 PYs (95% CI 4.3-5.9), respectively. Compared with dementia-free patients, the odds ratio of developing a TIA for patients with AD treated with atypical antipsychotic drugs was 4.5 (95% CI 2.1-9.2). CONCLUSIONS Patients with VD, but not AD, have a markedly higher risk of developing an ischemic stroke than those without dementia. In patients with AD, but not VD, use of atypical antipsychotic drugs was associated with an increased risk of TIA.
Abstract:
Diarrheal disease is a leading cause of morbidity and mortality, especially among children in developing countries. Global mortality caused by diarrhea among children under five years of age has been estimated at 3.3 million deaths per year. Cryptosporidium parvum was first identified in 1907, but it was not until 1970 that this organism was recognized as a cause of diarrhea in calves, and not until 1976 that the first case of human cryptosporidiosis was reported. This study was conducted to ascertain the risk factors of first symptomatic infection with Cryptosporidium parvum in a cohort of infants in a rural area of Egypt. The cohort was followed from birth through the first year of life. Univariate and multivariate analyses of the data demonstrated that infants older than six months of age had a two-fold risk of infection compared with infants younger than six months (RR = 2.17; 95% C.I. = 1.01-4.82). When stratified by sex, male infants older than six months were four times more likely to become infected than male infants younger than six months. Among female infants, there was no difference in risk between the two age groups. Female infants younger than six months were twice as likely to become infected as male infants of the same age; the reverse held for infants older than six months, i.e., males had twice the risk of infection compared to females of the same age group. Further analysis of the data revealed an increased risk of cryptosporidiosis in infants who were attended at childbirth by traditional birth attendants compared to infants attended by modern birth attendants (nurses, trained midwives, physicians) (RR = 4.18; 95% C.I. = 1.05-36.06). The final significant risk factor was the number of people residing in the household: infants in households with more than seven persons had an almost two-fold risk of infection compared with infants in homes with fewer than seven persons. Other risk factors suggesting increased risk were lack of education among the mothers, absence of latrines and faucets in the homes, and mud used as building material for walls and floors.
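The cohort analysis above reports relative risks with 95% confidence intervals. A minimal sketch of that statistic from a 2×2 cohort table follows; the counts are invented, and only the standard log-RR formula itself is taken as given.

```python
import math

def relative_risk(a, b, c, d):
    """a/b: infected/uninfected among exposed; c/d: among unexposed."""
    rr = (a / (a + b)) / (c / (c + d))
    # Standard error of log(RR), used for a log-normal 95% CI.
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo, hi = (rr * math.exp(s * 1.96 * se) for s in (-1, 1))
    return rr, lo, hi

# Invented counts for illustration; the study reports e.g. RR = 2.17
# (95% CI 1.01-4.82) for infants older vs younger than six months.
print(relative_risk(a=26, b=94, c=12, d=118))
```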