899 results for Mortality.
Abstract:
Background and Study Aim: Intra- and paraventricular tumors are frequently associated with cerebrospinal fluid (CSF) pathway obstruction. The aim of an endoscopic approach is therefore to restore patency of the CSF pathways and to obtain a tumor biopsy. Because endoscopic tumor biopsy may increase tumor cell dissemination, this study sought to evaluate this risk. Patients, Materials, and Methods: Forty-four patients who underwent endoscopic biopsies for ventricular or paraventricular tumors between 1993 and 2011 were included in the study. Charts and images were reviewed retrospectively to evaluate rates of adverse events, mortality, and tumor cell dissemination. Results: Postoperative clinical condition improved in 63.0% of patients, remained stable in 30.4%, and worsened in 6.6%. One patient (2.2%) had a postoperative thalamic stroke leading to hemiparesis and hemineglect. No procedure-related deaths occurred. Postoperative tumor cell dissemination was observed in 14.3% of patients available for follow-up. Conclusions: For patients presenting with occlusive hydrocephalus due to tumors in or adjacent to the ventricular system, endoscopic CSF diversion is the procedure of first choice. Tumor biopsy in the current study did not affect safety or efficacy.
Abstract:
Subarachnoid hemorrhage (SAH) is a stroke subtype with particularly poor outcome. Recent findings suggest that constrictions of pial arterioles occurring early after hemorrhage may be responsible for cerebral ischemia and, subsequently, unfavorable outcome after SAH. Since we recently hypothesized that a lack of nitric oxide may cause post-hemorrhagic microvasospasms, our aim was to investigate whether inhaled nitric oxide, a treatment paradigm that selectively delivers nitric oxide to ischemic microvessels, is able to dilate post-hemorrhagic microvasospasms and thereby improve outcome after experimental SAH. C57BL/6 mice were subjected to experimental SAH; for induction of large artery spasms, mice received an intracisternal injection of autologous blood. Three hours after SAH, pial artery spasms were quantified by intravital microscopy, after which mice received inhaled nitric oxide or vehicle. Inhaled nitric oxide significantly reduced the number and severity of SAH-induced post-hemorrhagic microvasospasms while having only a limited effect on large artery spasms. This resulted in less brain edema formation, less hippocampal neuronal loss, no mortality, and significantly improved neurological outcome after SAH. These findings suggest that spasms of pial arterioles play a major role in outcome after SAH and that a lack of nitric oxide is an important mechanism of post-hemorrhagic microvascular dysfunction. Reversing microvascular dysfunction with inhaled nitric oxide might be a promising treatment strategy for SAH.
Abstract:
Survivors of childhood cancer have higher mortality than the general population. We describe cause-specific long-term mortality in a population-based cohort of childhood cancer survivors. We included all children diagnosed with cancer in Switzerland (1976-2007) at age 0-14 years who survived ≥5 years after diagnosis, and followed survivors until December 31, 2012. We obtained causes of death (COD) from the Swiss mortality statistics and used data from the Swiss general population to calculate age-, calendar year-, and sex-standardized mortality ratios (SMR) and absolute excess risks (AER) for different COD, using Poisson regression. We included 3,965 survivors and 49,704 person-years at risk. Of these, 246 (6.2%) died, which was 11 times higher than expected (SMR 11.0). Mortality was particularly high for diseases of the respiratory (SMR 14.8) and circulatory system (SMR 12.7), and for second cancers (SMR 11.6). The pattern of cause-specific mortality differed by primary cancer diagnosis and changed with time since diagnosis. In the first 10 years after 5-year survival, 78.9% of excess deaths were caused by recurrence of the original cancer (AER 46.1). Twenty-five years after diagnosis, only 36.5% (AER 9.1) were caused by recurrence, 21.3% by second cancers (AER 5.3), and 33.3% by circulatory diseases (AER 8.3). Our study confirms elevated mortality in survivors of childhood cancer for at least 30 years after diagnosis, with an increasing proportion of deaths caused by late toxicities of treatment. The results underline the importance of clinical follow-up continuing years after the end of treatment for childhood cancer.
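As a rough illustration of the standardized mortality ratio (SMR) and absolute excess risk (AER) described above: the study estimated these by Poisson regression, but both quantities reduce to an observed-versus-expected comparison. The sketch below uses invented strata, assumed reference rates, and an assumed per-10,000-person-years scale for the AER; none of these numbers come from the study.

```python
# Hypothetical SMR/AER sketch; all values are invented for illustration.
import pandas as pd

# Each row: one stratum of the survivor cohort (e.g., age band x calendar period x sex)
cohort = pd.DataFrame({
    "observed_deaths": [4, 10, 26],
    "person_years":    [5000.0, 9000.0, 6000.0],
    "reference_rate":  [0.0001, 0.0002, 0.0003],  # general-population deaths per person-year (assumed)
})

# Expected deaths if survivors had experienced general-population mortality
cohort["expected_deaths"] = cohort["person_years"] * cohort["reference_rate"]

observed = cohort["observed_deaths"].sum()
expected = cohort["expected_deaths"].sum()
person_years = cohort["person_years"].sum()

smr = observed / expected                            # standardized mortality ratio
aer = (observed - expected) / person_years * 10_000  # absolute excess risk (assumed per-10,000-person-years scale)

print(f"SMR = {smr:.1f}, AER = {aer:.1f} per 10,000 person-years")
```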
Abstract:
The objective of this survey was to determine herd-level risk factors for mortality, unwanted early slaughter, and metaphylactic application of antimicrobial group therapy in Swiss veal calves in 2013. A questionnaire regarding farm structure, farm management, mortality, and antimicrobial use was sent to all farmers registered in a Swiss label program setting requirements for improved animal welfare and sustainability. Risk factors were determined by multivariable logistic regression. A total of 619 veal producers returned a usable questionnaire (response rate = 28.5%), of which 40.9% fattened only their own calves (group O), 56.9% their own calves plus additional purchased calves (group O&P), and 2.3% only purchased calves (group P). A total of 19,077 calves entered the fattening units in 2013, of which 21.7%, 66.7%, and 11.6% belonged to groups O, O&P, and P, respectively. Mortality was 0% in 322 herds (52.0%), between 0% and 3% in 47 herds (7.6%), and ≥3% in 250 herds (40.4%). Significant risk factors for mortality were purchasing calves, herd size, a higher incidence of bovine respiratory disease (BRD), and access to an outside pen. Metaphylaxis was used on 13.4% of the farms (7.9% only upon arrival, 4.4% only later in the fattening period, 1.1% both upon arrival and later): in 3.2% of the herds in group O, 17.9% of those in group O&P, and 92.9% of those in group P. Application of metaphylaxis upon arrival was positively associated with purchase (OR=8.9) and herd size (OR=1.2 per 10 calves). Metaphylaxis later in the production cycle was positively associated with group size (OR=2.9) and risk of respiratory disease (OR=1.2 per 10% higher risk), and negatively with the use of individual antimicrobial treatment (OR=0.3). In many countries, purchase and large herd size are inherently connected to veal production. The Swiss situation, with large commercial herds alongside smaller herds that purchase few or no calves, made it possible to investigate the effect of these factors on mortality and antimicrobial drug use. The results of this study show that a system in which small farms raise calves from their own herds has substantial potential to improve animal health and reduce antimicrobial drug use.
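To make the modelling step concrete, the following is a minimal sketch of a herd-level multivariable logistic regression of the kind described above, run on simulated data. The variable names, the binary coding of the mortality outcome, and the simulated effect sizes are assumptions for illustration and do not reproduce the survey's dataset or results.

```python
# Illustrative herd-level logistic regression on simulated data (assumed variables).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
herds = pd.DataFrame({
    "purchases_calves": rng.integers(0, 2, n),
    "herd_size": rng.integers(10, 200, n),        # calves entering per year
    "brd_incidence": rng.uniform(0, 0.5, n),      # proportion of calves with BRD
    "outside_pen": rng.integers(0, 2, n),
})
# Simulated outcome: 1 = herd mortality >= 3% (coding is an assumption)
lin = -2.0 + 1.0 * herds["purchases_calves"] + 0.01 * herds["herd_size"] + 2.0 * herds["brd_incidence"]
herds["high_mortality"] = rng.binomial(1, 1 / (1 + np.exp(-lin)))

model = smf.logit(
    "high_mortality ~ purchases_calves + herd_size + brd_incidence + outside_pen",
    data=herds,
).fit(disp=False)

# Odds ratios with 95% confidence intervals, the form in which results are reported above
ors = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
ors.columns = ["OR", "CI 2.5%", "CI 97.5%"]
print(ors.round(2))
```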
Abstract:
Ninety-one Swiss veal farms producing under a label with improved welfare standards were visited between August and December 2014 to investigate risk factors related to antimicrobial drug use and mortality. All herds consisted of own and purchased calves, with a median of 77.4% purchased calves. The calves' mean age was 29±15 days at purchase, and the fattening period lasted on average 120±28 days. The mean carcass weight was 125±12 kg. A mean of 58±33 calves were fattened per farm and year, and purchased calves were bought from a mean of 20±17 farms of origin. Antimicrobial drug treatment incidence was calculated with the defined daily dose methodology. The mean treatment incidence (TIADD) was 21±15 daily doses per calf and year. The mean mortality risk was 4.1%, calves died at a mean age of 94±50 days, and the main causes of death were bovine respiratory disease (BRD, 50%) and gastrointestinal disease (33%). Two multivariable models were constructed, one for antimicrobial drug treatment incidence (53 farms) and one for mortality (91 farms). No quarantine, a shared air space for several groups of calves, and no clinical examination upon arrival at the farm were associated with increased antimicrobial treatment incidence. Maximum group size and weight differences >100 kg within a group were associated with increased mortality risk, while vaccination and beef breed were associated with decreased mortality risk. The majority of antimicrobial treatments (84.6%) were given as group treatments with oral powder fed through an automatic milk feeding system. Combination products containing chlortetracycline with tylosin and sulfadimidine, or with spiramycin, were used for 54.9% of the oral group treatments, and amoxicillin for 43.7%. The main indication for individual treatment was BRD (73%). The mean age at the time of treatment was 51 days, corresponding to an estimated weight of 80-100 kg. Individual treatments were mainly applied by injection (88.5%) and included fluoroquinolones in 38.3%, penicillins (amoxicillin or benzylpenicillin) in 25.6%, macrolides in 13.1%, tetracyclines in 12.0%, third- and fourth-generation cephalosporins in 4.7%, and florfenicol in 3.9% of cases. The present study identified risk factors for increased antimicrobial drug treatment and mortality, an important basis for future studies aiming to reduce treatment incidence and mortality on veal farms. Our results indicate that improvement is needed in the selection of drugs for the treatment of veal calves according to the principles of prudent antibiotic use.
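The treatment-incidence figure quoted above (daily doses per calf and year) comes from the defined daily dose methodology; the sketch below shows one common way such a figure is computed. The DDD value, standard weight, and example amounts are illustrative assumptions rather than the study's actual parameters.

```python
# Hedged sketch of a treatment-incidence calculation based on defined daily doses (DDD).
def treatment_incidence_add(total_mg_used: float,
                            ddd_mg_per_kg_per_day: float,
                            standard_weight_kg: float,
                            calves_fattened_per_year: float) -> float:
    """Defined daily doses administered per calf and year (TIADD-style measure)."""
    daily_doses = total_mg_used / (ddd_mg_per_kg_per_day * standard_weight_kg)
    return daily_doses / calves_fattened_per_year

# Example: 1.2 kg of active substance used on a farm fattening 58 calves per year,
# assuming a DDD of 10 mg/kg/day and a 100 kg standard weight (assumed values).
ti = treatment_incidence_add(1_200_000, 10.0, 100.0, 58)
print(f"Treatment incidence ≈ {ti:.1f} daily doses per calf and year")
```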
Abstract:
Systems for the identification and registration of cattle have gradually been receiving attention for use in syndromic surveillance, a relatively recent approach for the early detection of infectious disease outbreaks. Real-time or near real-time monitoring of deaths or stillbirths reported to these systems offers an opportunity to detect temporal or spatial clusters of increased mortality that could be caused by an infectious disease epidemic. In Switzerland, such data are recorded in the "Tierverkehrsdatenbank" (TVD). To investigate the potential of the Swiss TVD for syndromic surveillance, 3 years of data (2009-2011) were assessed in terms of data quality, including timeliness of reporting and completeness of geographic data. Two time series, consisting of reported on-farm deaths and stillbirths, were retrospectively analysed to define and quantify the temporal patterns that result from non-health-related factors. Geographic data were almost always present in the TVD data, often at different spatial scales. On-farm deaths were reported to the database by farmers in a timely fashion; stillbirth reporting was less timely. Timeliness and geographic coverage are two important features of disease surveillance systems, highlighting the suitability of the TVD for use in a syndromic surveillance system. The two time series exhibited different temporal patterns associated with non-health-related factors. To avoid false-positive signals, these patterns need to be removed from the data, or otherwise accounted for, before applying aberration-detection algorithms in real time. Evaluating mortality data reported to systems for the identification and registration of cattle is of value for comparing national data systems and as a first step towards a Europe-wide early detection system for emerging and re-emerging cattle diseases.
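One way to remove such non-health-related temporal patterns before running an aberration-detection algorithm is to model them explicitly and work with the residuals. The sketch below, on simulated daily death counts, uses a harmonic Poisson regression for yearly seasonality plus a day-of-week term and a simple percentile alarm threshold; these modelling choices are illustrative assumptions, not the approach used in the study.

```python
# Illustrative baseline modelling and residual-based alarms on simulated counts.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
dates = pd.date_range("2009-01-01", "2011-12-31", freq="D")
t = np.arange(len(dates))
baseline = 50 + 10 * np.sin(2 * np.pi * t / 365.25)              # yearly seasonality
weekend = (pd.Series(dates).dt.dayofweek >= 5).to_numpy()
counts = rng.poisson(np.clip(baseline - 15 * weekend, 1, None))   # weekend under-reporting

df = pd.DataFrame({
    "deaths": counts,
    "sin_year": np.sin(2 * np.pi * t / 365.25),
    "cos_year": np.cos(2 * np.pi * t / 365.25),
    "dow": pd.Series(dates).dt.dayofweek.astype(str),
})

# Poisson regression capturing the expected, non-health-related pattern
model = smf.glm("deaths ~ sin_year + cos_year + C(dow)", data=df,
                family=sm.families.Poisson()).fit()
df["expected"] = model.predict(df)
df["residual"] = df["deaths"] - df["expected"]

# Naive aberration flag: residuals above the 99th percentile of the study period
threshold = df["residual"].quantile(0.99)
print(f"{(df['residual'] > threshold).sum()} days flagged for review")
```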
Abstract:
BACKGROUND Strategies to improve risk prediction are of major importance in patients with heart failure (HF). Fibroblast growth factor 23 (FGF-23) is an endocrine regulator of phosphate and vitamin D homeostasis associated with increased cardiovascular risk. We aimed to assess the prognostic effect of FGF-23 on mortality in HF patients, with a particular focus on differences between patients with HF with preserved ejection fraction (HFpEF) and patients with HF with reduced ejection fraction (HFrEF). METHODS AND RESULTS FGF-23 levels were measured in 980 patients with HF enrolled in the Ludwigshafen Risk and Cardiovascular Health (LURIC) study, including 511 patients with HFrEF and 469 patients with HFpEF, with a median follow-up time of 8.6 years. FGF-23 was additionally measured in a second cohort comprising 320 patients with advanced HFrEF. FGF-23 was independently associated with mortality, with an adjusted hazard ratio per 1-SD increase of 1.30 (95% confidence interval, 1.14-1.48; P<0.001) in patients with HFrEF, whereas no such association was found in patients with HFpEF (P for interaction=0.043). External validation confirmed the association with mortality, with an adjusted hazard ratio per 1-SD increase of 1.23 (95% confidence interval, 1.02-1.60; P=0.027). FGF-23 improved discrimination for mortality beyond N-terminal pro-B-type natriuretic peptide (C-statistic: 0.59 versus 0.63) and yielded an improvement in the net reclassification index (39.6%; P<0.001). CONCLUSIONS FGF-23 is independently associated with an increased risk of mortality in patients with HFrEF but not in those with HFpEF, suggesting a different pathophysiologic role in the two entities.
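For readers unfamiliar with the reporting convention, a hazard ratio per 1-SD increase typically comes from a Cox proportional-hazards model fitted to a standardized biomarker. The sketch below, on simulated data and using the lifelines package, shows that pattern; the covariates, effect sizes, and follow-up scheme are assumptions and do not reproduce the LURIC analysis.

```python
# Illustrative Cox model reporting a hazard ratio per 1-SD biomarker increase.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "fgf23": rng.lognormal(mean=4.0, sigma=0.6, size=n),  # skewed biomarker (simulated)
    "age": rng.normal(65, 10, n),
    "ef_reduced": rng.integers(0, 2, n),                   # 1 = reduced ejection fraction (assumed coding)
})
# Standardize so the hazard ratio is expressed per 1-SD increase
df["fgf23_sd"] = (df["fgf23"] - df["fgf23"].mean()) / df["fgf23"].std()

# Simulated follow-up times and death indicator, censored at 8.6 years
hazard = 0.05 * np.exp(0.3 * df["fgf23_sd"] + 0.02 * (df["age"] - 65))
df["time_years"] = rng.exponential(1 / hazard)
df["died"] = (df["time_years"] < 8.6).astype(int)
df["time_years"] = df["time_years"].clip(upper=8.6)

cph = CoxPHFitter()
cph.fit(df[["time_years", "died", "fgf23_sd", "age", "ef_reduced"]],
        duration_col="time_years", event_col="died")
print(cph.hazard_ratios_)   # HR per 1-SD increase, adjusted for age and EF group
```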
Abstract:
Predicting the timing and amount of tree mortality after a forest fire is of paramount importance for post-fire management decisions such as salvage logging or reforestation. Such knowledge is particularly needed in mountainous regions, where forest stands often serve as protection against natural hazards (e.g., snow avalanches, rockfalls, landslides). In this paper, we focus on the drivers and timing of mortality in fire-injured beech trees (Fagus sylvatica L.) in mountain regions. We studied beech forests in the southwestern European Alps that burned between 1970 and 2012. The results show that beech trees, which lack fire-resistance traits, experience increased mortality within the first two decades post-fire, with timing and amount strongly related to burn severity. Beech mortality is fast and ubiquitous in high-severity sites, whereas small-diameter (DBH <12 cm) and intermediate-diameter (DBH 12–36 cm) trees face a higher risk of dying in moderate-severity sites. Large-diameter trees mostly survive, representing a crucial ecological legacy for beech regeneration. In low-severity sites, mortality remains low and at a level similar to that of unburnt beech forests. Tree diameter, the presence of fungal infestation, and elevation are the most significant drivers of mortality. The risk of dying increases toward higher elevation and is higher for small-diameter than for large-diameter trees. Trees with secondary fungal infestation generally face a higher risk of dying. Interestingly, the fungi that initiate post-fire tree mortality differ from the fungi occurring after mechanical injury. From a management point of view, the insights into the controls of post-fire mortality provided by this study should help in planning post-fire silvicultural measures in montane beech forests.
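As a sketch of how the timing of post-fire mortality can be summarized by burn severity, the code below builds Kaplan-Meier survival curves from simulated tree records. The severity classes, time scales, and use of the lifelines package are illustrative assumptions, not the models applied in the study.

```python
# Illustrative Kaplan-Meier summary of post-fire tree survival by burn severity.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(7)
records = []
for severity, median_years in [("low", 40.0), ("moderate", 15.0), ("high", 3.0)]:
    years_to_death = rng.exponential(median_years / np.log(2), size=200)  # simulated times
    observed = years_to_death <= 20          # 20-year post-fire observation window (assumed)
    records.append(pd.DataFrame({
        "severity": severity,
        "years": np.minimum(years_to_death, 20),
        "died": observed.astype(int),
    }))
trees = pd.concat(records, ignore_index=True)

kmf = KaplanMeierFitter()
for severity, group in trees.groupby("severity"):
    kmf.fit(group["years"], event_observed=group["died"], label=severity)
    print(severity, "survival at 20 years:",
          round(float(kmf.survival_function_.iloc[-1, 0]), 2))
```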
Abstract:
OBJECTIVE To illustrate an approach to compare CD4 cell count and HIV-RNA monitoring strategies in HIV-positive individuals on antiretroviral therapy (ART). DESIGN Prospective studies of HIV-positive individuals in Europe and the USA in the HIV-CAUSAL Collaboration and The Center for AIDS Research Network of Integrated Clinical Systems. METHODS Antiretroviral-naive individuals who initiated ART and became virologically suppressed within 12 months were followed from the date of suppression. We compared three CD4 cell count and HIV-RNA monitoring strategies: once every (1) 3 ± 1 months, (2) 6 ± 1 months, and (3) 9-12 ± 1 months. We used inverse-probability weighted models to compare these strategies with respect to clinical, immunologic, and virologic outcomes. RESULTS Among 39,029 eligible individuals, there were 265 deaths and 690 AIDS-defining illnesses or deaths. Compared with the 3-month strategy, the mortality hazard ratios (95% CIs) were 0.86 (0.42 to 1.78) for the 6-month strategy and 0.82 (0.46 to 1.47) for the 9-12 month strategy. The respective 18-month risk ratios (95% CIs) of virologic failure (RNA >200) were 0.74 (0.46 to 1.19) and 2.35 (1.56 to 3.54), and the 18-month mean CD4 differences (95% CIs) were -5.3 (-18.6 to 7.9) and -31.7 (-52.0 to -11.3). The estimates for the 2-year risk of AIDS-defining illness or death were similar across strategies. CONCLUSIONS Our findings suggest that the monitoring frequency of virologically suppressed individuals can be decreased from every 3 months to every 6, 9, or 12 months with respect to clinical outcomes. Because the effects of different monitoring strategies could take years to materialize, longer follow-up is needed to fully evaluate this question.
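To give a concrete sense of the inverse-probability weighting mentioned in the METHODS, the sketch below estimates a weighted risk ratio on simulated data, treating strategy assignment as a single baseline decision. A real per-protocol comparison of monitoring strategies would use time-varying weights for adherence and censoring; the covariates, model, and numbers here are illustrative assumptions only.

```python
# Simplified point-treatment sketch of inverse-probability weighting (simulated data).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({
    "age": rng.normal(40, 10, n),
    "baseline_cd4": rng.normal(350, 100, n),
})
# Probability of being monitored every 6 months (vs every 3) depends on covariates
p_strategy = 1 / (1 + np.exp(-(-0.5 + 0.01 * (df["age"] - 40) + 0.002 * (df["baseline_cd4"] - 350))))
df["six_monthly"] = rng.binomial(1, p_strategy)
# Simulated outcome (e.g., virologic failure by 18 months), unaffected by strategy here
p_fail = 1 / (1 + np.exp(-(-2.0 - 0.004 * (df["baseline_cd4"] - 350))))
df["failure"] = rng.binomial(1, p_fail)

# Estimate the probability of the received strategy and build stabilized weights
ps_model = LogisticRegression().fit(df[["age", "baseline_cd4"]], df["six_monthly"])
ps = ps_model.predict_proba(df[["age", "baseline_cd4"]])[:, 1]
p_received = np.where(df["six_monthly"] == 1, ps, 1 - ps)
marginal = np.where(df["six_monthly"] == 1, df["six_monthly"].mean(), 1 - df["six_monthly"].mean())
df["weight"] = marginal / p_received

# Weighted risk in each arm and the resulting risk ratio
risk = {k: np.average(g["failure"], weights=g["weight"]) for k, g in df.groupby("six_monthly")}
print("Weighted risk ratio (6-monthly vs 3-monthly):", round(risk[1] / risk[0], 2))
```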
Abstract:
The brown alga Ascophyllum nodosum is a dominant rocky intertidal organism throughout much of the North Atlantic Ocean, yet its inability to colonize exposed or denuded shores is well recognized. Our experimental data show that wave action is a major source of mortality for recently settled zygotes. Artificially recruited zygotes consistently exhibited a Type IV survivorship curve in the presence of moving water. As few as 10 waves, and often only a single relatively low-energy wave, removed 85 to 99% of recently settled zygotes. Increasing the setting time for attachment of zygotes (prior to disturbance from water movement) had a positive effect on survival. However, survival was significantly lower at high densities and decreased at long (24 h) setting times, probably as a result of bacteria on the surface of the zygotes. Spatial refuges provided significant protection from gentle water movement but relatively little protection from waves.
Abstract:
Worker populations are potentially exposed to multiple chemical substances simultaneously during the performance of routine tasks. The acute health effects of exposure to toxic concentrations of these substances are usually well described. However, very little is known about the long-term health effects of chronic low-dose exposure to all but a few chemical substances. A mortality study was performed on a population of workers employed at a butyl rubber manufacturing plant in Baton Rouge, Louisiana, for the period 1943-1978, with special emphasis on potential exposure to methyl chloride. The study population was enumerated using company records. The mortality experience of the population was evaluated by comparing the number of observed deaths (total and cause-specific) with the expected number of deaths, based on U.S. general-population age-, race-, and sex-specific rates. An internal comparison population was assembled to address the lack of comparability that arises when U.S. rates are used to calculate expected deaths in an employed population. There were 18% fewer total observed deaths than expected when U.S. death rates were used. Deaths from specific causes were also fewer than expected, except when the numbers of observed and expected deaths were small. Similar results were obtained when the population was characterized by intensity and duration of potential exposure to methyl chloride. When the internal comparison population was used to evaluate the overall mortality of the study population, the relative risk was about 1.2. The study results were discussed and conclusions drawn in light of certain limitations of the methodology and the study population size.
Abstract:
Severe liver injury (SLI) due to drugs is a frequent cause of catastrophic illness and hospitalization. Because of its significant morbidity, mortality, and excess medical care costs, it poses a challenging public health problem. The role of associated risk factors, such as alcohol consumption, in contributing to the high mortality remains to be studied. This study was conducted to assess the impact of alcohol use on mortality in patients with idiosyncratic drug-induced liver injury (IDILI), while adjusting for age, gender, race/ethnicity, and education level. The data from this study indicate only a small excess risk of death among IDILI patients using alcohol, and this difference was not statistically significant. The major contribution of this study to the field of public health is that it rules out a large effect of alcohol consumption on mortality among IDILI patients.
Abstract:
The aim of this study was to determine cancer mortality rates for the United Arab Emirates (UAE) and to create an atlas of cancer mortality for the UAE. This atlas is the first of its kind in the Gulf region and the Middle East. Death certificates were reviewed for the period from January 1, 1990, to December 31, 1999, and cancer deaths were identified. Cancer mortality cases were verified by comparison with medical records. Age-adjusted cancer mortality rates were calculated by gender, emirate/medical district, and nationality (UAE nationals and the overall UAE population). Individual rates for each emirate were compared with the overall rate of the corresponding population for the same cancer site and gender. Age-adjusted rates were mapped using MapInfo software. High rates for liver, lung, and stomach cancer were observed in Abu Dhabi, Dubai, and the northern emirates, respectively. Rates for UAE nationals were higher than those for the overall UAE population. Several factors were suggested that may account for the high rates of specific cancers observed in certain emirates. It is hoped that this atlas will provide leads that will guide further epidemiologic and public health activities aimed at preventing cancer.
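For context, the sketch below shows direct age standardization, the usual computation behind age-adjusted mortality rates such as those mapped in the atlas. The age bands, standard-population weights, and counts are invented for illustration and are not the atlas data.

```python
# Illustrative direct age standardization with invented numbers.
import pandas as pd

strata = pd.DataFrame({
    "age_band":        ["0-14", "15-44", "45-64", "65+"],
    "cancer_deaths":   [3, 25, 110, 180],                    # deaths in the population of interest
    "population":      [400_000, 900_000, 300_000, 60_000],
    "standard_weight": [0.30, 0.45, 0.18, 0.07],             # share of the standard population per band (assumed)
})

# Age-specific rates per 100,000, then weighted by the standard population
strata["rate_per_100k"] = strata["cancer_deaths"] / strata["population"] * 100_000
age_adjusted_rate = (strata["rate_per_100k"] * strata["standard_weight"]).sum()

print(f"Age-adjusted mortality rate: {age_adjusted_rate:.1f} per 100,000")
```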