Abstract:
BACKGROUND Cardiac and thoracic surgery are associated with an increased risk of venous thromboembolism (VTE). The safety and efficacy of primary thromboprophylaxis in patients undergoing these types of surgery are uncertain. OBJECTIVES To assess the effects of primary thromboprophylaxis on the incidence of symptomatic VTE and major bleeding in patients undergoing cardiac or thoracic surgery. SEARCH METHODS The Cochrane Peripheral Vascular Diseases Group Trials Search Co-ordinator searched the Specialised Register (last searched May 2014) and CENTRAL (2014, Issue 4). The authors searched the reference lists of relevant studies, conference proceedings, and clinical trial registries. SELECTION CRITERIA Randomised controlled trials (RCTs) and quasi-RCTs comparing any oral or parenteral anticoagulant or mechanical intervention to no intervention or placebo, or comparing two different anticoagulants. DATA COLLECTION AND ANALYSIS We extracted data on methodological quality, participant characteristics, interventions, and outcomes including symptomatic VTE and major bleeding as the primary effectiveness and safety outcomes, respectively. MAIN RESULTS We identified 12 RCTs and one quasi-RCT (6923 participants), six for cardiac surgery (3359 participants) and seven for thoracic surgery (3564 participants). No study evaluated fondaparinux, the new oral direct thrombin inhibitors, direct factor Xa inhibitors, or caval filters. All studies had major study design flaws and most lacked a placebo or no-treatment control group. We typically graded the quality of the overall body of evidence for the various outcomes and comparisons as low, due to imprecise estimates of effect and risk of bias. We could not pool data because of the different comparisons and the lack of data. In cardiac surgery, 71 symptomatic VTEs occurred in 3040 participants from four studies.
In a study of 2551 participants, representing 85% of the review population in cardiac surgery, the combination of unfractionated heparin with pneumatic compression stockings was associated with a 61% reduction in symptomatic VTE compared to unfractionated heparin alone (1.5% versus 4.0%; risk ratio (RR) 0.39; 95% confidence interval (CI) 0.23 to 0.64). Major bleeding was only reported in one study, which found a higher incidence with vitamin K antagonists compared to platelet inhibitors (11.3% versus 1.6%, RR 7.06; 95% CI 1.64 to 30.40). In thoracic surgery, 15 symptomatic VTEs occurred in 2890 participants from six studies. In the largest study, evaluating unfractionated heparin versus an inactive control, the rates of symptomatic VTE were 0.7% versus 0%, respectively, giving an RR of 6.71 (95% CI 0.40 to 112.65). There was insufficient evidence to determine whether there was a difference in the risk of major bleeding in two studies evaluating fixed-dose versus weight-adjusted low molecular weight heparin (2.7% versus 8.1%, RR 0.33; 95% CI 0.07 to 1.60) and unfractionated heparin versus low molecular weight heparin (6% versus 4%, RR 1.50; 95% CI 0.26 to 8.60). AUTHORS' CONCLUSIONS The evidence regarding the efficacy and safety of thromboprophylaxis in cardiac and thoracic surgery is limited. Data for important outcomes such as pulmonary embolism or major bleeding were often lacking. Given the uncertainties around the benefit-to-risk balance, no conclusions can be drawn, and a case-by-case risk evaluation of VTE and bleeding remains preferable.
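The risk ratios and confidence intervals quoted above follow the standard 2×2 computation: RR is the ratio of the two event proportions, with a 95% CI obtained from a normal approximation on the log scale. A minimal sketch, using illustrative event counts chosen only to approximate the 1.5% versus 4.0% comparison (not the trial's actual arm sizes):

```python
import math

def risk_ratio_ci(events_a, total_a, events_b, total_b, z=1.96):
    """Risk ratio of group A versus group B, with a 95% CI on the log scale."""
    risk_a = events_a / total_a
    risk_b = events_b / total_b
    rr = risk_a / risk_b
    # Standard error of ln(RR) via the usual delta-method formula
    se = math.sqrt(1/events_a - 1/total_a + 1/events_b - 1/total_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Illustrative counts approximating 1.5% vs 4.0% symptomatic VTE
rr, lo, hi = risk_ratio_ci(19, 1280, 51, 1271)
```

With these hypothetical counts the point estimate lands near the reported 0.39 and the interval excludes 1, matching the direction of the published result.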
Abstract:
Ataxia telangiectasia (A-T) is a rare, progressive, multisystem disease with a large number of complex and diverse manifestations that vary with age. Patients with A-T die prematurely, the leading causes of death being respiratory diseases and cancer. Respiratory manifestations include immune dysfunction leading to recurrent upper and lower respiratory infections; aspiration resulting from dysfunctional swallowing due to neurodegenerative deficits; inefficient cough; and interstitial lung disease/pulmonary fibrosis. Malnutrition is a significant comorbidity. The increased radiosensitivity and increased risk of cancer should be borne in mind when requesting radiological investigations. Aggressive proactive monitoring and treatment of these various aspects of lung disease, delivered with the multidisciplinary expertise accumulated in national multidisciplinary clinics internationally, form the basis of this statement on the management of lung disease in A-T. Neurological management is outwith the scope of this document.
Abstract:
The role of endothelial progenitor cells (EPCs) in peripheral artery disease (PAD) remains unclear. We hypothesized that EPC mobilization and function play a central role in the development of endothelial dysfunction and directly influence the degree of atherosclerotic burden in peripheral artery vessels. The number of circulating EPCs, defined as CD34(+)/KDR(+) cells, was assessed by flow cytometry in 91 subjects classified according to a predefined sample size of 31 non-diabetic PAD patients, 30 diabetic PAD patients, and 30 healthy volunteers. Both PAD groups had undergone endovascular treatment in the past. As a functional parameter, EPC colony-forming units were determined ex vivo. In addition to a broad laboratory analysis, a series of clinical measures, including the ankle-brachial index (ABI), flow-mediated dilatation (FMD) and carotid intima-media thickness (cIMT), was investigated. A significant reduction of EPC counts and proliferation indices was observed in both PAD groups compared to healthy subjects. Low EPC number and pathological findings in the clinical assessment were strongly correlated with group allocation. Multivariate statistical analysis revealed these findings to be independent predictors of disease appearance. Linear regression analysis showed the ABI to be a predictor of circulating EPC number (p=0.02). Moreover, the functionality of EPCs correlated with cIMT in linear regression (p=0.017). The influence of diabetes mellitus on EPCs in our study has to be considered marginal in already disease-affected patients. This study demonstrated that EPCs could predict the prevalence and severity of symptomatic PAD, with ABI as the determinant of the state of EPC populations in disease-affected groups.
Abstract:
Mycobacterium tuberculosis strains of the Beijing lineage are globally distributed and are associated with the massive spread of multidrug-resistant (MDR) tuberculosis in Eurasia. Here we reconstructed the biogeographical structure and evolutionary history of this lineage by genetic analysis of 4,987 isolates from 99 countries and whole-genome sequencing of 110 representative isolates. We show that this lineage initially originated in the Far East, from where it radiated worldwide in several waves. We detected successive increases in population size for this pathogen over the last 200 years, practically coinciding with the Industrial Revolution, the First World War and HIV epidemics. Two MDR clones of this lineage started to spread throughout central Asia and Russia concomitantly with the collapse of the public health system in the former Soviet Union. Mutations identified in genes putatively under positive selection and associated with virulence might have favored the expansion of the most successful branches of the lineage.
Abstract:
This work deals with parallel optimization of expensive objective functions which are modelled as sample realizations of Gaussian processes. The study is formalized as a Bayesian optimization problem, or continuous multi-armed bandit problem, where a batch of q > 0 arms is pulled in parallel at each iteration. Several algorithms have been developed for choosing batches by trading off exploitation and exploration. To date, the maximum Expected Improvement (EI) and Upper Confidence Bound (UCB) selection rules appear as the most prominent approaches for batch selection. Here, we build upon recent work on the multipoint Expected Improvement criterion, for which an analytic expansion relying on Tallis’ formula was recently established. Since the computational burden of this selection rule is still an issue in applications, we derive a closed-form expression for the gradient of the multipoint Expected Improvement, which aims at facilitating its maximization using gradient-based ascent algorithms. Substantial computational savings are demonstrated in applications. In addition, our algorithms are tested numerically and compared to state-of-the-art UCB-based batch-sequential algorithms. Combining starting designs based on UCB with gradient-based EI local optimization ultimately appears to be a sound option for batch design in distributed Gaussian process optimization.
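The multipoint EI criterion and its gradient discussed above generalize the classical one-point Expected Improvement, which has a simple closed form under a Gaussian posterior. As orientation for readers, here is a minimal sketch of that one-point criterion for minimization (the multipoint version and its Tallis-formula expansion are beyond this snippet; the function name and the sigma-zero convention are illustrative choices, not from the paper):

```python
import math

def expected_improvement(mu, sigma, f_min):
    """Closed-form one-point EI for minimization, given the GP posterior
    mean mu and standard deviation sigma at a point, and the best
    observed value f_min: EI = (f_min - mu) * Phi(z) + sigma * phi(z)."""
    if sigma <= 0:
        # Degenerate posterior: improvement is deterministic
        return max(f_min - mu, 0.0)
    z = (f_min - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # phi(z)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # Phi(z)
    return (f_min - mu) * cdf + sigma * pdf
```

EI rewards points whose predicted value is low (exploitation) or whose predictive uncertainty is high (exploration), which is exactly the trade-off the batch-selection rules in the abstract balance across q points at once.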
Abstract:
BACKGROUND National safety alert systems publish relevant information to improve patient safety in hospitals. However, this information has to be transformed into local action to have an effect on patient safety. We studied three research questions: How do Swiss healthcare quality and risk managers (qm/rm) see their own role in learning from safety alerts issued by the Swiss national voluntary reporting and analysis system? What are their attitudes towards and evaluations of the alerts? And which types of improvement actions were fostered by the safety alerts? METHODS A survey was developed and administered to Swiss healthcare risk and quality managers, with a response rate of 39% (n=116). Descriptive statistics are presented. RESULTS The qm/rm disseminate the alerts and communicate about them with a broad variety of professional groups. While most respondents felt that they should know the alerts and their contents, only some of them felt responsible for driving organizational change based on the recommendations. However, most respondents used safety alerts to back up their own patient safety goals. The alerts were evaluated positively on various dimensions such as usefulness and were considered standards of good practice by the majority of respondents. A range of organizational responses was applied, with disseminating information being the most common. An active role is related to using safety alerts to back up one's own patient safety goals. CONCLUSIONS To support an active role of qm/rm in their hospital's learning from safety alerts, appropriate organizational structures should be developed. Furthermore, qm/rm could be given special information or training to act as an information hub on the issues discussed in the alerts.
Abstract:
BACKGROUND The aim of newborn screening (NBS) for CF is to detect children with 'classic' CF, where early treatment is possible and improves prognosis. Children with an inconclusive CF diagnosis (CFSPID) should not be detected, as there is no evidence of improvement through early treatment. No algorithm in current NBS guidelines explains what to do when the sweat test (ST) fails. This study compares the performance of three different algorithms for further diagnostic evaluation when the first ST is unsuccessful, with regard to the number of children detected with CF and CFSPID and the time until a definite diagnosis. METHODS In Switzerland, CF-NBS was introduced in January 2011 using an IRT-DNA-IRT algorithm followed by a ST. In children in whom ST was not possible (no or insufficient sweat), three different protocols were applied between 2011 and 2014: in 2011, ST was repeated until it was successful (protocol A); in 2012, we proceeded directly to diagnostic DNA testing (protocol B); and in 2013-2014, fecal elastase (FE) was measured in the stool in order to detect pancreatic insufficiency needing immediate treatment (protocol C). RESULTS The ratio CF:CFSPID was 7:1 (27/4) with protocol A, 2:1 (22/10) with protocol B, and 14:1 (54/4) with protocol C. The mean time to definite diagnosis was significantly shorter with protocol C (33 days) compared to protocols A and B (42 and 40 days; p=0.014 compared to A, and p=0.036 compared to B). CONCLUSIONS The algorithm used for the diagnostic part of newborn screening in the CF centers is important and affects the performance of a CF-NBS program with regard to the ratio CF:CFSPID and the time until definite diagnosis. Our results suggest including FE after initial sweat test failure in the CF-NBS guidelines to keep the proportion of CFSPID low and the time until definite diagnosis short.
Abstract:
OBJECTIVE Glycerophospholipids and sphingolipids are structurally heterogeneous due to differences in their O- and N-linked fatty acids and head groups. Sphingolipids also show a heterogeneity in their sphingoid base composition which up to now has been little appreciated. The aim of this study was to investigate the association of certain glycerophospholipid and sphingolipid species with stable coronary artery disease (CAD) and acute myocardial infarction (AMI). METHODS The lipid profile in plasma from patients with stable CAD (n = 18) or AMI (n = 17) was compared to that of healthy subjects (n = 14). Sixty-five glycerophospholipid and sphingolipid species were quantified by LC-MS. The relative distribution of these lipids into lipoprotein fractions was analyzed. RESULTS In the CAD cohort, 45 glycerophospholipid and sphingolipid species were significantly lower compared to healthy controls. In the AMI group, 42 glycerophospholipid and sphingolipid species were reduced. Four PC plasmalogens (PC33:1, PC33:2, PC33:3 and PC35:3) showed the most significant difference. Of eleven analyzed sphingoid bases, four were lower in the CAD and six in the AMI group. Sphingosine-1-phosphate (S1P) levels were reduced in the AMI group, whereas an atypical C16:1 S1P was lower in both groups. Phosphatidylcholine and sphingomyelin species were exclusively present in lipoprotein particles, whereas lysophosphatidylcholines were mainly found in the lipoprotein-free fraction. The observed differences were not explained by the use of statins, as confirmed in a second, independent cohort. CONCLUSIONS Reduced levels of four PC plasmalogens (PC33:1, PC33:2, PC33:3 and PC35:3) were identified as a putative novel lipid signature for CAD and AMI.
Abstract:
BACKGROUND Diabetes mellitus and angiographic coronary artery disease complexity are intertwined and unfavorably affect prognosis after percutaneous coronary interventions, but their relative impact on long-term outcomes after percutaneous coronary intervention with drug-eluting stents remains controversial. This study determined drug-eluting stent outcomes in relation to diabetic status and coronary artery disease complexity as assessed by the Synergy Between PCI With Taxus and Cardiac Surgery (SYNTAX) score. METHODS AND RESULTS In a patient-level pooled analysis from 4 all-comers trials, 6081 patients were stratified according to diabetic status and according to the median SYNTAX score ≤11 or >11. The primary end point was major adverse cardiac events, a composite of cardiac death, myocardial infarction, and clinically indicated target lesion revascularization within 2 years. Diabetes mellitus was present in 1310 patients (22%), and new-generation drug-eluting stents were used in 4554 patients (75%). Major adverse cardiac events occurred in 173 diabetic (14.5%) and 436 nondiabetic patients (9.9%; P<0.001). In adjusted Cox regression analyses, SYNTAX score and diabetes mellitus were both associated with the primary end point (P<0.001 and P=0.028, respectively; P for interaction, 0.07). In multivariable analyses, diabetic versus nondiabetic patients had higher risks of major adverse cardiac events (hazard ratio, 1.25; 95% confidence interval, 1.03-1.53; P=0.026) and target lesion revascularization (hazard ratio, 1.54; 95% confidence interval, 1.18-2.01; P=0.002) but similar risks of cardiac death (hazard ratio, 1.41; 95% confidence interval, 0.96-2.07; P=0.08) and myocardial infarction (hazard ratio, 0.89; 95% confidence interval, 0.64-1.22; P=0.45), without significant interaction with SYNTAX score ≤11 or >11 for any of the end points.
CONCLUSIONS In this population treated with predominantly new-generation drug-eluting stents, diabetic patients were at increased risk for repeat target-lesion revascularization consistently across the spectrum of disease complexity. The SYNTAX score was an independent predictor of 2-year outcomes but did not modify the respective effect of diabetes mellitus. CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifiers: NCT00297661, NCT00389220, NCT00617084, and NCT01443104.
Abstract:
BACKGROUND Biomarkers of myocardial injury increase frequently during transcatheter aortic valve implantation (TAVI). The impact of postprocedural cardiac troponin (cTn) elevation on short-term outcomes remains controversial, and the association with long-term prognosis is unknown. METHODS AND RESULTS We evaluated 577 consecutive patients with severe aortic stenosis treated with TAVI between 2007 and 2012. Myocardial injury, defined according to the Valve Academic Research Consortium (VARC)-2 as post-TAVI cardiac troponin T (cTnT) >15× the upper limit of normal, occurred in 338 patients (58.1%). In multivariate analyses, myocardial injury was associated with higher risk of all-cause mortality at 30 days (adjusted hazard ratio [HR], 8.77; 95% CI, 2.07-37.12; P=0.003) and remained a significant predictor at 2 years (adjusted HR, 1.98; 95% CI, 1.36-2.88; P<0.001). Higher cTnT cutoffs did not add incremental predictive value compared with the VARC-2-defined cutoff. Whereas myocardial injury occurred more frequently in patients with versus without coronary artery disease (CAD), the relative impact of cTnT elevation on 2-year mortality did not differ between patients without CAD (adjusted HR, 2.59; 95% CI, 1.27-5.26; P=0.009) and those with CAD (adjusted HR, 1.71; 95% CI, 1.10-2.65; P=0.018; P for interaction=0.24). Mortality rates at 2 years were lowest in patients without CAD and no myocardial injury (11.6%) and highest in patients with complex CAD (SYNTAX score >22) and myocardial injury (41.1%). CONCLUSIONS VARC-2-defined cTnT elevation emerged as a strong, independent predictor of 30-day mortality and remained a modest, but significant, predictor throughout 2 years post-TAVI. The prognostic value of cTnT elevation was modified by the presence and complexity of underlying CAD with highest mortality risk observed in patients combining SYNTAX score >22 and evidence of myocardial injury.
Abstract:
BACKGROUND Little is known about the risk of cancer in HIV-positive children in sub-Saharan Africa. We examined the incidence and risk factors of AIDS-defining and other cancers in pediatric antiretroviral therapy (ART) programs in South Africa. METHODS We linked the records of five ART programs in Johannesburg and Cape Town to those of pediatric oncology units, based on name and surname, date of birth, and folder and civil identification numbers. We calculated incidence rates and obtained hazard ratios (HR) with 95% confidence intervals (CI) from Cox regression models including ART, sex, age, and degree of immunodeficiency. Missing CD4 counts and CD4% were multiply imputed. Immunodeficiency was defined according to World Health Organization 2005 criteria. RESULTS Data on 11,707 HIV-positive children were included in the analysis. During 29,348 person-years of follow-up, 24 cancers were diagnosed, for an incidence rate of 82 per 100,000 person-years (95% CI 55-122). The most frequent cancers were Kaposi sarcoma (34 per 100,000 person-years) and non-Hodgkin lymphoma (31 per 100,000 person-years). The incidence of non-AIDS-defining malignancies was 17 per 100,000 person-years. The risk of developing cancer was lower on ART (HR 0.29, 95% CI 0.09-0.86), and increased with age at enrolment (>10 versus <3 years: HR 7.3, 95% CI 2.2-24.6) and immunodeficiency at enrolment (advanced/severe versus no/mild: HR 3.5, 95% CI 1.1-12.0). The HR for the effect of ART from complete case analysis was similar but ceased to be statistically significant (p=0.078). CONCLUSIONS Early HIV diagnosis and linkage to care, with start of ART before advanced immunodeficiency develops, may substantially reduce the burden of cancer in HIV-positive children in South Africa and elsewhere.
Abstract:
BACKGROUND Tuberculosis (TB) is a poverty-related disease that is associated with poor living conditions. We studied TB mortality and living conditions in Bern between 1856 and 1950. METHODS We analysed cause-specific mortality based on autopsy-certified mortality registers and on public health reports of the city council of Bern from 1856 to 1950. RESULTS TB mortality was higher in the Black Quarter (550 per 100,000) and in the city centre (327 per 100,000) than on the outskirts (209 per 100,000 in 1911-1915). TB mortality correlated positively with the number of persons per room (r = 0.69, p = 0.026) and the percentage of rooms without sunlight (r = 0.72, p = 0.020), and negatively with the number of windows per apartment (r = -0.79, p = 0.007). TB mortality decreased 10-fold from 330 per 100,000 in 1856 to 33 per 100,000 in 1950, as housing conditions improved, indoor crowding decreased, and open-air schools, sanatoria, systematic tuberculin skin testing of school children and chest radiography screening were introduced. CONCLUSIONS Improved living conditions and public health measures may have contributed to the massive decline of the TB epidemic in the city of Bern even before effective antibiotic treatment finally became available in the 1950s.
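The housing-condition associations reported above are Pearson correlations computed across city districts. A minimal sketch of that computation, using hypothetical district-level figures (not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient for paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical districts: indoor crowding vs TB deaths per 100,000
persons_per_room = [0.8, 1.1, 1.4, 1.9, 2.3]
tb_mortality = [180, 220, 310, 420, 560]
r = pearson_r(persons_per_room, tb_mortality)  # positive, as in the study
```

A positive r here mirrors the study's finding that TB mortality rose with crowding; the negative correlation with windows per apartment would show up as r < 0 in the same computation.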