193 results for Mortality.
Abstract:
QUESTIONS UNDER STUDY Many people travel all over the world; the elderly with pre-existing diseases also travel to places with less developed health systems. Reportedly, fewer than 0.5% of all travellers need repatriation. We aimed to analyse who is injured or falls ill while abroad, where they travelled to, and by what means they were repatriated. METHODS Retrospective cross-sectional study of adult patients repatriated to a single level 1 trauma centre in Switzerland (2000-2011). RESULTS A total of 372 patients were repatriated, with an increasing trend per year. Of these, 67% were male; the median age was 56 years. Forty-nine percent sustained an injury, and 13% had surgical and 38% medical pathologies. Patients with medical conditions were older than those with injuries or surgical emergencies (p <0.001). Seventy-three percent were repatriated from Europe. For repatriation from Africa, trauma was slightly more frequent (53%, n = 17) than illness, whereas for most other countries illnesses and trauma were equally distributed. Injured patients had a median Injury Severity Score of 8. The majority of illnesses involved the nervous system (38%), mainly stroke. Forty-five percent were repatriated by Swiss Air Ambulance, 26% by ground ambulance, 18% by scheduled flights with or without medical assistance, and two patients injured near the Swiss border were transported by helicopter. The 28-day mortality was 4%. CONCLUSIONS The number of travellers repatriated increased from 2000 to 2011. About half of repatriations were due to illness and half due to injury. The largest group were elderly Swiss nationals repatriated from European countries. As mortality is relatively high, special consideration for this group of patients is warranted.
Abstract:
BACKGROUND Management of tuberculosis in patients with HIV in eastern Europe is complicated by the high prevalence of multidrug-resistant tuberculosis, low rates of drug susceptibility testing, and poor access to antiretroviral therapy (ART). We report 1-year mortality estimates from a multiregional (eastern Europe, western Europe, and Latin America) prospective cohort study: the TB:HIV study. METHODS Consecutive HIV-positive patients aged 16 years or older with a diagnosis of tuberculosis between Jan 1, 2011, and Dec 31, 2013, were enrolled from 62 HIV and tuberculosis clinics in 19 countries in eastern Europe, western Europe, and Latin America. The primary endpoint was death within 12 months after starting tuberculosis treatment; all deaths were classified according to whether or not they were tuberculosis related. Follow-up continued until death, the final visit, or 12 months after baseline, whichever occurred first. Risk factors for all-cause and tuberculosis-related deaths were assessed using Kaplan-Meier estimates and Cox models. FINDINGS Of 1406 patients (834 in eastern Europe, 317 in western Europe, and 255 in Latin America), 264 (19%) died within 12 months. 188 (71%) of these deaths were tuberculosis related. The probability of all-cause death was 29% (95% CI 26-32) in eastern Europe, 4% (3-7) in western Europe, and 11% (8-16) in Latin America (p<0·0001) and the corresponding probabilities of tuberculosis-related death were 23% (20-26), 1% (0-3), and 4% (2-8), respectively (p<0·0001). Patients receiving care outside eastern Europe had a 77% decreased risk of death: adjusted hazard ratio (aHR) 0·23 (95% CI 0·16-0·31).
In eastern Europe, compared with patients who started a regimen with at least three active antituberculosis drugs, those who started fewer than three active antituberculosis drugs were at a higher risk of tuberculosis-related death (aHR 3·17; 95% CI 1·83-5·49), as were those who did not have baseline drug-susceptibility tests (2·24; 1·31-3·83). Other prognostic factors for increased tuberculosis-related mortality were disseminated tuberculosis and a low CD4 cell count. 18% of patients were receiving ART at tuberculosis diagnosis in eastern Europe compared with 44% in western Europe and 39% in Latin America (p<0·0001); 12 months later the proportions were 67% in eastern Europe, 92% in western Europe, and 85% in Latin America (p<0·0001). INTERPRETATION Patients with HIV and tuberculosis in eastern Europe have a risk of death nearly four times higher than that of patients from western Europe and Latin America. This increased mortality rate is associated with modifiable risk factors such as lack of drug susceptibility testing and suboptimal initial antituberculosis treatment in settings with a high prevalence of drug resistance. Urgent action is needed to improve tuberculosis care for patients living with HIV in eastern Europe. FUNDING EU Seventh Framework Programme.
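The mortality probabilities above come from Kaplan-Meier estimates, i.e., the standard product-limit estimator. A minimal sketch of that estimator, using invented follow-up data rather than the study's:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimator.

    times  -- follow-up time for each subject
    events -- 1 if death was observed, 0 if the subject was censored
    Returns a list of (time, survival probability) at each death time.
    """
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])  # sort by follow-up time
    at_risk = n
    surv = 1.0
    curve = []
    i = 0
    while i < n:
        t = times[order[i]]
        deaths = 0
        removed = 0
        # Handle ties: process every subject leaving the risk set at time t
        while i < n and times[order[i]] == t:
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= removed
    return curve

# Toy cohort: months to death (event=1) or censoring (event=0)
times = [2, 3, 3, 5, 8, 12, 12, 12]
events = [1, 1, 0, 1, 0, 0, 0, 0]
print(kaplan_meier(times, events))
```

Each factor `1 - deaths / at_risk` is the conditional probability of surviving past one observed death time; censored subjects leave the risk set without contributing a factor.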
Abstract:
BACKGROUND In an effort to reduce firearm mortality rates in the USA, US states have enacted a range of firearm laws to either strengthen or deregulate the existing main federal gun control law, the Brady Law. We set out to determine the independent association of different firearm laws with overall firearm mortality, homicide firearm mortality, and suicide firearm mortality across all US states. We also projected the potential reduction of firearm mortality if the three most strongly associated firearm laws were enacted at the federal level. METHODS We constructed a cross-sectional, state-level dataset from Nov 1, 2014, to May 15, 2015, using counts of firearm-related deaths in each US state for the years 2008-10 (stratified by intent [homicide and suicide]) from the US Centers for Disease Control and Prevention's Web-based Injury Statistics Query and Reporting System, data about 25 firearm state laws implemented in 2009, and state-specific characteristics such as firearm ownership for 2013, firearm export rates, and non-firearm homicide rates for 2009, and unemployment rates for 2010. Our primary outcome measure was overall firearm-related mortality per 100 000 people in the USA in 2010. We used Poisson regression with robust variances to derive incidence rate ratios (IRRs) and 95% CIs. FINDINGS 31 672 firearm-related deaths occurred in 2010 in the USA (10·1 per 100 000 people; mean state-specific count 631·5 [SD 629·1]). Of 25 firearm laws, nine were associated with reduced firearm mortality, nine were associated with increased firearm mortality, and seven had an inconclusive association. After adjustment for relevant covariates, the three state laws most strongly associated with reduced overall firearm mortality were universal background checks for firearm purchase (multivariable IRR 0·39 [95% CI 0·23-0·67]; p=0·001), ammunition background checks (0·18 [0·09-0·36]; p<0·0001), and identification requirement for firearms (0·16 [0·09-0·29]; p<0·0001). 
Projected federal-level implementation of universal background checks for firearm purchase could reduce national firearm mortality from 10·35 to 4·46 deaths per 100 000 people, background checks for ammunition purchase could reduce it to 1·99 per 100 000, and firearm identification to 1·81 per 100 000. INTERPRETATION Very few of the existing state-specific firearm laws are associated with reduced firearm mortality, and this evidence underscores the importance of focusing on relevant and effective firearms legislation. Implementation of universal background checks for the purchase of firearms or ammunition, and firearm identification nationally could substantially reduce firearm mortality in the USA. FUNDING None.
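The study's IRRs come from multivariable Poisson regression with robust variances; the crude (unadjusted) incidence rate ratio and its Wald confidence interval, however, reduce to simple arithmetic. A sketch with invented counts, not the study's data:

```python
import math

def irr_ci(cases_exposed, py_exposed, cases_ref, py_ref, z=1.96):
    """Crude incidence rate ratio with a Wald 95% CI on the log scale.

    cases_* -- event counts; py_* -- person-years at risk.
    The standard error of log(IRR) is sqrt(1/a + 1/b) for the two counts.
    """
    irr = (cases_exposed / py_exposed) / (cases_ref / py_ref)
    se = math.sqrt(1 / cases_exposed + 1 / cases_ref)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Hypothetical counts: 30 deaths over 2.0 million person-years in states
# with a given law vs. 90 deaths over 1.5 million person-years without it.
irr, lo, hi = irr_ci(30, 2_000_000, 90, 1_500_000)
print(f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The regression-based IRRs in the abstract additionally adjust for covariates such as ownership and unemployment rates; this sketch shows only the underlying rate-ratio arithmetic.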
Abstract:
Background and Study Aim Intra- and paraventricular tumors are frequently associated with cerebrospinal fluid (CSF) pathway obstruction. Thus the aim of an endoscopic approach is to restore patency of the CSF pathways and to obtain a tumor biopsy. Because endoscopic tumor biopsy may increase tumor cell dissemination, this study sought to evaluate this risk. Patients, Materials, and Methods Forty-four patients who underwent endoscopic biopsies for ventricular or paraventricular tumors between 1993 and 2011 were included in the study. Charts and images were reviewed retrospectively to evaluate rates of adverse events, mortality, and tumor cell dissemination. Results Postoperative clinical condition improved in 63.0% of patients, remained stable in 30.4%, and worsened in 6.6%. One patient (2.2%) had a postoperative thalamic stroke leading to hemiparesis and hemineglect. No procedure-related deaths occurred. Postoperative tumor cell dissemination was observed in 14.3% of patients available for follow-up. Conclusions For patients presenting with occlusive hydrocephalus due to tumors in or adjacent to the ventricular system, endoscopic CSF diversion is the procedure of first choice. Tumor biopsy in the current study did not affect safety or efficacy.
Abstract:
Subarachnoid hemorrhage is a stroke subtype with a particularly bad outcome. Recent findings suggest that constrictions of pial arterioles occurring early after hemorrhage may be responsible for cerebral ischemia and, subsequently, unfavorable outcome after subarachnoid hemorrhage. Since we recently hypothesized that a lack of nitric oxide may cause post-hemorrhagic microvasospasms, our aim was to investigate whether inhaled nitric oxide, a treatment paradigm that selectively delivers nitric oxide to ischemic microvessels, is able to dilate post-hemorrhagic microvasospasms, thereby improving outcome after experimental subarachnoid hemorrhage. C57BL/6 mice were subjected to experimental subarachnoid hemorrhage. Three hours after subarachnoid hemorrhage, pial artery spasms were quantified by intravital microscopy; mice then received inhaled nitric oxide or vehicle. For induction of large artery spasms, mice received an intracisternal injection of autologous blood. Inhaled nitric oxide significantly reduced the number and severity of subarachnoid hemorrhage-induced post-hemorrhage microvasospasms while having only a limited effect on large artery spasms. This resulted in less brain edema formation, less hippocampal neuronal loss, no mortality, and significantly improved neurological outcome after subarachnoid hemorrhage. This suggests that spasms of pial arterioles play a major role in the outcome after subarachnoid hemorrhage and that lack of nitric oxide is an important mechanism of post-hemorrhagic microvascular dysfunction. Reversing microvascular dysfunction with inhaled nitric oxide might be a promising treatment strategy for subarachnoid hemorrhage.
Abstract:
Survivors of childhood cancer have a higher mortality than the general population. We describe cause-specific long-term mortality in a population-based cohort of childhood cancer survivors. We included all children diagnosed with cancer in Switzerland (1976-2007) at age 0-14 years who survived ≥5 years after diagnosis, and followed survivors until December 31, 2012. We obtained causes of death (COD) from the Swiss mortality statistics and used data from the Swiss general population to calculate age-, calendar year- and sex-standardized mortality ratios (SMR) and absolute excess risks (AER) for different COD, by Poisson regression. We included 3,965 survivors and 49,704 person-years at risk. Of these, 246 (6.2%) died, which was 11 times higher than expected (SMR 11.0). Mortality was particularly high for diseases of the respiratory (SMR 14.8) and circulatory system (SMR 12.7), and for second cancers (SMR 11.6). The pattern of cause-specific mortality differed by primary cancer diagnosis and changed with time since diagnosis. In the first 10 years after 5-year survival, 78.9% of excess deaths were caused by recurrence of the original cancer (AER 46.1). Twenty-five years after diagnosis, only 36.5% (AER 9.1) were caused by recurrence, 21.3% by second cancers (AER 5.3) and 33.3% by circulatory diseases (AER 8.3). Our study confirms an elevated mortality in survivors of childhood cancer for at least 30 years after diagnosis, with an increased proportion of deaths caused by late toxicities of the treatment. The results underline the importance of clinical follow-up continuing years after the end of treatment for childhood cancer.
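The SMR and AER reported above follow directly from observed versus expected deaths. A sketch of the arithmetic, with the expected count back-calculated from the reported SMR for illustration only (the abstract does not give it directly):

```python
def smr_aer(observed, expected, person_years, per=1000):
    """Standardized mortality ratio and absolute excess risk.

    SMR = observed / expected deaths, where the expected count comes
    from applying general-population rates to the cohort.
    AER = excess deaths per `per` person-years of follow-up.
    """
    smr = observed / expected
    aer = (observed - expected) / person_years * per
    return smr, aer

# Illustrative inputs: 246 observed deaths, ~22.4 expected (implied by
# SMR 11.0), 49,704 person-years at risk, as reported in the abstract.
smr, aer = smr_aer(246, 22.4, 49_704)
print(round(smr, 1), round(aer, 1))  # → 11.0 4.5
```

The study additionally standardized by age, calendar year, and sex via Poisson regression; this shows only the crude ratio and excess-risk definitions.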
Abstract:
The objective of this survey was to determine herd-level risk factors for mortality, unwanted early slaughter, and metaphylactic application of antimicrobial group therapy in Swiss veal calves in 2013. A questionnaire regarding farm structure, farm management, mortality and antimicrobial use was sent to all farmers registered in a Swiss label program setting requirements for improved animal welfare and sustainability. Risk factors were determined by multivariable logistic regression. A total of 619 veal producers returned a useable questionnaire (response rate=28.5%), of which 40.9% only fattened their own calves (group O), 56.9% their own calves and additional purchased calves (group O&P), and 2.3% only purchased calves for fattening (group P). A total of 19,077 calves entered the fattening units in 2013, of which 21.7%, 66.7%, and 11.6% belonged to groups O, O&P, and P, respectively. Mortality was 0% in 322 herds (52.0%), between 0% and 3% in 47 herds (7.6%), and ≥3% in 250 herds (40.4%). Significant risk factors for mortality were purchasing calves, herd size, a higher incidence of bovine respiratory disease (BRD), and access to an outside pen. Metaphylaxis was used on 13.4% of the farms (7.9% only upon arrival, 4.4% only later in the fattening period, 1.1% upon arrival and later), in 3.2% of the herds of group O, 17.9% of those in group O&P, and 92.9% of those of group P. Application of metaphylaxis upon arrival was positively associated with purchase (OR=8.9) and herd size (OR=1.2 per 10 calves). Metaphylaxis later in the production cycle was positively associated with group size (OR=2.9) and risk of respiratory disease (OR=1.2 per 10% higher risk) and negatively with the use of individual antimicrobial treatment (OR=0.3). In many countries, purchase and a large herd size are inherently connected to veal production.
The Swiss situation with large commercial but also smaller herds with little or no purchase of calves made it possible to investigate the effect of these factors on mortality and antimicrobial drug use. The results of this study show that a system where small farms raise the calves from their own herds has a substantial potential to improve animal health and reduce antimicrobial drug use.
Abstract:
Ninety-one Swiss veal farms producing under a label with improved welfare standards were visited between August and December 2014 to investigate risk factors related to antimicrobial drug use and mortality. All herds consisted of own and purchased calves, with a median of 77.4% purchased calves. The calves' mean age was 29±15 days at purchase and the fattening period lasted on average 120±28 days. The mean carcass weight was 125±12 kg. A mean of 58±33 calves were fattened per farm and year, and purchased calves were bought from a mean of 20±17 farms of origin. Antimicrobial drug treatment incidence was calculated with the defined daily dose methodology. The mean treatment incidence (TIADD) was 21±15 daily doses per calf and year. The mean mortality risk was 4.1%, calves died at a mean age of 94±50 days, and the main causes of death were bovine respiratory disease (BRD, 50%) and gastrointestinal disease (33%). Two multivariable models were constructed, for antimicrobial drug treatment incidence (53 farms) and mortality (91 farms). No quarantine, shared air space for several groups of calves, and no clinical examination upon arrival at the farm were associated with increased antimicrobial treatment incidence. Maximum group size and weight differences >100 kg within a group were associated with increased mortality risk, while vaccination and beef breed were associated with decreased mortality risk. The majority of antimicrobial treatments (84.6%) were given as group treatments with oral powder fed through an automatic milk feeding system. Combination products containing chlortetracycline with tylosin and sulfadimidine or with spiramycin were used for 54.9%, and amoxicillin for 43.7%, of the oral group treatments. The main indication for individual treatment was BRD (73%). The mean age at the time of treatment was 51 days, corresponding to an estimated weight of 80-100 kg.
Individual treatments were mainly applied through injections (88.5%) and included administration of fluoroquinolones in 38.3%, penicillins (amoxicillin or benzylpenicillin) in 25.6%, macrolides in 13.1%, tetracyclines in 12.0%, 3rd- and 4th-generation cephalosporins in 4.7%, and florfenicol in 3.9% of the cases. The present study allowed identification of risk factors for increased antimicrobial drug treatment and mortality. This is an important basis for future studies aiming to reduce treatment incidence and mortality on veal farms. Our results indicate that improvement is needed in the selection of drugs for the treatment of veal calves according to the principles of prudent use of antibiotics.
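The defined daily dose methodology used for the TIADD converts the total amount of drug used into standard daily doses per animal. The abstract does not give the exact dosing conventions (standard weights, per-product doses) the study applied, so the following only sketches the general arithmetic with hypothetical inputs:

```python
def treatment_incidence_ddd(total_mg, ddd_mg_per_kg_day,
                            standard_weight_kg, n_animals,
                            days_at_risk, per_days=365):
    """Antimicrobial treatment incidence in defined daily doses.

    total_mg           -- total amount of active substance used (mg)
    ddd_mg_per_kg_day  -- defined daily dose for the product (mg/kg/day)
    standard_weight_kg -- assumed standard weight of a treated animal
    Returns standard daily doses per animal, scaled to `per_days`
    (365 -> doses per animal-year).  All inputs here are hypothetical.
    """
    doses = total_mg / (ddd_mg_per_kg_day * standard_weight_kg)
    return doses / (n_animals * days_at_risk) * per_days

# Example: 70 g of a drug dosed at 10 mg/kg/day, 100 kg standard weight,
# 10 calves at risk over a 120-day fattening period.
ti = treatment_incidence_ddd(70_000, 10, 100, 10, 120)
print(round(ti, 1))
```

Standardizing on doses rather than milligrams makes use of a potent drug (low mg/kg dose) comparable with use of a less potent one, which is the point of the DDD approach.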
Abstract:
Systems for the identification and registration of cattle have gradually been receiving attention for use in syndromic surveillance, a relatively recent approach for the early detection of infectious disease outbreaks. Real or near real-time monitoring of deaths or stillbirths reported to these systems offers an opportunity to detect temporal or spatial clusters of increased mortality that could be caused by an infectious disease epidemic. In Switzerland, such data are recorded in the "Tierverkehrsdatenbank" (TVD). To investigate the potential of the Swiss TVD for syndromic surveillance, 3 years of data (2009-2011) were assessed in terms of data quality, including timeliness of reporting and completeness of geographic data. Two time series consisting of reported on-farm deaths and stillbirths were retrospectively analysed to define and quantify the temporal patterns that result from non-health-related factors. Geographic data were almost always present in the TVD data, often at different spatial scales. On-farm deaths were reported to the database by farmers in a timely fashion; stillbirths were reported less promptly. Timeliness and geographic coverage are two important features of disease surveillance systems, highlighting the suitability of the TVD for use in a syndromic surveillance system. Both time series exhibited different temporal patterns that were associated with non-health-related factors. To avoid false-positive signals, these patterns need to be removed from the data or accounted for in some way before applying aberration-detection algorithms in real time. Evaluating mortality data reported to systems for the identification and registration of cattle is of value for comparing national data systems and as a first step towards a Europe-wide early detection system for emerging and re-emerging cattle diseases.
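One common aberration-detection algorithm for count series like these is a one-sided CUSUM, which accumulates standardized excess over a baseline and alarms when the sum crosses a threshold. A minimal sketch on synthetic weekly death counts (all numbers invented), assuming roughly Poisson variation around a known baseline:

```python
import math

def cusum_alarms(counts, baseline, k=0.5, h=4.0):
    """One-sided CUSUM for detecting upward shifts in a count series.

    counts   -- observed weekly counts (seasonality already removed)
    baseline -- expected count under normal conditions
    k        -- allowance, in baseline standard deviations
    h        -- decision threshold for raising an alarm
    Assumes counts are roughly Poisson, so sd ~ sqrt(baseline).
    Returns the indices at which the CUSUM statistic exceeds h.
    """
    sd = math.sqrt(baseline)
    s = 0.0
    alarms = []
    for i, c in enumerate(counts):
        # Accumulate standardized excess over baseline, minus allowance k;
        # the statistic is clipped at zero so it only tracks upward shifts.
        s = max(0.0, s + (c - baseline) / sd - k)
        if s > h:
            alarms.append(i)
    return alarms

# Stable weeks around a baseline of 16 deaths, then a sustained rise
weekly = [15, 17, 16, 14, 18, 16, 30, 32, 31, 33]
print(cusum_alarms(weekly, baseline=16))  # → [7, 8, 9]
```

As the abstract notes, the non-health-related temporal patterns (e.g., reporting artefacts, seasonality) must be removed or modelled first, otherwise such an algorithm generates false-positive signals.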
Abstract:
BACKGROUND Strategies to improve risk prediction are of major importance in patients with heart failure (HF). Fibroblast growth factor 23 (FGF-23) is an endocrine regulator of phosphate and vitamin D homeostasis associated with an increased cardiovascular risk. We aimed to assess the prognostic effect of FGF-23 on mortality in HF patients, with a particular focus on differences between patients with HF with preserved ejection fraction (HFpEF) and patients with HF with reduced ejection fraction (HFrEF). METHODS AND RESULTS FGF-23 levels were measured in 980 patients with HF enrolled in the Ludwigshafen Risk and Cardiovascular Health (LURIC) study, including 511 patients with HFrEF and 469 patients with HFpEF, with a median follow-up time of 8.6 years. FGF-23 was additionally measured in a second cohort comprising 320 patients with advanced HFrEF. FGF-23 was independently associated with mortality, with an adjusted hazard ratio per 1-SD increase of 1.30 (95% confidence interval, 1.14-1.48; P<0.001) in patients with HFrEF, whereas no such association was found in patients with HFpEF (for interaction, P=0.043). External validation confirmed the significant association with mortality, with an adjusted hazard ratio per 1 SD of 1.23 (95% confidence interval, 1.02-1.60; P=0.027). FGF-23 demonstrated an increased discriminatory power for mortality in addition to N-terminal pro-B-type natriuretic peptide (C-statistic: 0.59 versus 0.63) and an improvement in net reclassification index (39.6%; P<0.001). CONCLUSIONS FGF-23 is independently associated with an increased risk of mortality in patients with HFrEF but not in those with HFpEF, suggesting a different pathophysiologic role in the two entities.
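The net reclassification index cited above quantifies how often adding a marker moves predicted risks in the right direction: up for patients who die, down for those who do not. A sketch of the category-free (continuous) version on toy predicted risks, not the study's data:

```python
def net_reclassification_index(old_risk, new_risk, events):
    """Category-free (continuous) net reclassification index.

    old_risk, new_risk -- predicted risks from the two models compared
    events             -- 1 if the outcome occurred, 0 otherwise
    NRI = P(up|event) - P(down|event) + P(down|nonevent) - P(up|nonevent)
    """
    up_e = down_e = n_e = 0
    up_ne = down_ne = n_ne = 0
    for old, new, ev in zip(old_risk, new_risk, events):
        if ev:
            n_e += 1
            up_e += new > old      # risk correctly moved up for an event
            down_e += new < old
        else:
            n_ne += 1
            up_ne += new > old
            down_ne += new < old   # risk correctly moved down for a nonevent
    return (up_e - down_e) / n_e + (down_ne - up_ne) / n_ne

# Toy data: the added marker raises predicted risk for most events and
# lowers it for most non-events.
old = [0.2, 0.3, 0.4, 0.5, 0.2, 0.3, 0.4, 0.5]
new = [0.3, 0.4, 0.3, 0.6, 0.1, 0.2, 0.5, 0.4]
ev = [1, 1, 1, 1, 0, 0, 0, 0]
print(net_reclassification_index(old, new, ev))
```

Which NRI variant (continuous or category-based) the study used is not stated in the abstract; the formula above is the general continuous form.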
Abstract:
Predicting the timing and amount of tree mortality after a forest fire is of paramount importance for post-fire management decisions, such as salvage logging or reforestation. Such knowledge is particularly needed in mountainous regions where forest stands often serve as protection against natural hazards (e.g., snow avalanches, rockfalls, landslides). In this paper, we focus on the drivers and timing of mortality in fire-injured beech trees (Fagus sylvatica L.) in mountain regions. We studied beech forests in the southwestern European Alps that burned between 1970 and 2012. The results show that beech trees, which lack fire-resistance traits, experience increased mortality within the first two decades post-fire, with a timing and amount strongly related to the burn severity. Beech mortality is fast and ubiquitous in high-severity sites, whereas small- (DBH <12 cm) and intermediate-diameter (DBH 12–36 cm) trees face a higher risk of dying in moderate-severity sites. Large-diameter trees mostly survive, representing a crucial ecological legacy for beech regeneration. In low-burn-severity sites, mortality remains low, at a level similar to unburnt beech forests. Beech tree diameter, the presence of fungal infestation and elevation are the most significant drivers of mortality. The risk of beech dying increases toward higher elevations and is higher for small-diameter than for large-diameter trees. In the case of secondary fungal infestation, beech generally faces a higher risk of dying. Interestingly, the fungi that initiate post-fire tree mortality differ from the fungi occurring after mechanical injury. From a management point of view, the insights into the controls of post-fire mortality provided by this study should help in planning post-fire silvicultural measures in montane beech forests.
Abstract:
OBJECTIVE To illustrate an approach to compare CD4 cell count and HIV-RNA monitoring strategies in HIV-positive individuals on antiretroviral therapy (ART). DESIGN Prospective studies of HIV-positive individuals in Europe and the USA in the HIV-CAUSAL Collaboration and The Center for AIDS Research Network of Integrated Clinical Systems. METHODS Antiretroviral-naive individuals who initiated ART and became virologically suppressed within 12 months were followed from the date of suppression. We compared 3 CD4 cell count and HIV-RNA monitoring strategies: once every (1) 3 ± 1 months, (2) 6 ± 1 months, and (3) 9-12 ± 1 months. We used inverse-probability weighted models to compare these strategies with respect to clinical, immunologic, and virologic outcomes. RESULTS In 39,029 eligible individuals, there were 265 deaths and 690 AIDS-defining illnesses or deaths. Compared with the 3-month strategy, the mortality hazard ratios (95% CIs) were 0.86 (0.42 to 1.78) for the 6 months and 0.82 (0.46 to 1.47) for the 9-12 month strategy. The respective 18-month risk ratios (95% CIs) of virologic failure (RNA >200) were 0.74 (0.46 to 1.19) and 2.35 (1.56 to 3.54) and 18-month mean CD4 differences (95% CIs) were -5.3 (-18.6 to 7.9) and -31.7 (-52.0 to -11.3). The estimates for the 2-year risk of AIDS-defining illness or death were similar across strategies. CONCLUSIONS Our findings suggest that monitoring frequency of virologically suppressed individuals can be decreased from every 3 months to every 6, 9, or 12 months with respect to clinical outcomes. Because effects of different monitoring strategies could take years to materialize, longer follow-up is needed to fully evaluate this question.
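The inverse-probability weighted models above rest on weights like the stabilized form sketched below. The study's approach is considerably more involved (time-varying strategy adherence and censoring weights), so this only illustrates the basic weight construction and a weighted outcome mean, with invented propensities and outcomes:

```python
def stabilized_ipw(treated, propensity):
    """Stabilized inverse-probability-of-treatment weights.

    treated    -- 1 if the subject followed the strategy of interest
    propensity -- estimated P(treated = 1 | covariates) per subject
    Stabilizing by the marginal treatment probability keeps the mean
    weight near 1 and tames extreme values from small propensities.
    """
    p_marginal = sum(treated) / len(treated)
    weights = []
    for t, p in zip(treated, propensity):
        if t:
            weights.append(p_marginal / p)
        else:
            weights.append((1 - p_marginal) / (1 - p))
    return weights

def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Toy cohort: estimated propensities and binary outcomes (all invented)
treated = [1, 1, 1, 0, 0, 0]
propensity = [0.8, 0.6, 0.4, 0.5, 0.3, 0.2]
outcome = [1, 0, 1, 1, 0, 0]
w = stabilized_ipw(treated, propensity)
# Weighted outcome mean among the treated, in the IPW pseudo-population
treated_mean = weighted_mean(
    [y for y, t in zip(outcome, treated) if t],
    [wi for wi, t in zip(w, treated) if t],
)
print(round(treated_mean, 3))
```

In the pseudo-population created by these weights, treatment is independent of the measured covariates, which is what lets weighted outcome contrasts be read causally under the usual assumptions.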