Abstract:
QUESTIONS UNDER STUDY: Hospitality workers are a population particularly at risk from the noxious effects of environmental tobacco smoke (ETS). The Canton of Vaud, Switzerland, banned smoking in public places in September 2009. This prospective study addresses the impact of the ban on the health of hospitality workers. METHODS: ETS exposure was evaluated using a passive sampling device that measures airborne nicotine; lung function was assessed by spirometry; health-related quality of life, ETS exposure symptoms and satisfaction were measured by questionnaire. RESULTS: 105 participants (smokers and non-smokers) were recruited initially and 66 were followed up after one year. ETS exposure was significantly lower after the ban. Hospitality workers had lower pre-ban forced expiratory volume in one second (FEV1) and forced vital capacity (FVC) values than expected. FEV1 remained stable after the ban, with a near-significant increase in the subgroup of asthmatics only. FVC increased at one-year follow-up from 90.42% to 93.05% (p = 0.02) in the entire cohort; women, non-smokers and older participants gained the greatest benefit. The health survey showed an increase in physical wellbeing after the ban, the greatest benefit being observed in non-smokers. ETS exposure symptoms were less frequent after the ban, especially red and irritated eyes and sneezing. The new law was judged useful and satisfactory by the vast majority of employees, including smokers. CONCLUSION: The recent cantonal ban on smoking in public places brought about an improvement in the lung function, physical well-being and ETS symptoms of hospitality workers, including smokers.
Abstract:
The clinical relevance of accurately diagnosing pleomorphic sarcomas has been shown, especially in cases of undifferentiated pleomorphic sarcomas with myogenic differentiation, which appear significantly more aggressive. To establish a new smooth muscle differentiation classification and to test its prognostic value, 412 sarcomas with complex genetics were examined by immunohistochemistry using four smooth muscle markers (calponin, h-caldesmon, transgelin and smooth muscle actin). Two tumor categories were first defined: tumors positive for all four markers and tumors with absent or incomplete phenotypes. Multivariate analysis demonstrated that this classification method exhibited the strongest prognostic value compared with other prognostic factors, including histological classification. Secondly, the group of tumors with an incomplete or absent smooth muscle phenotype was divided into subgroups by summing, for each tumor, the labeling intensities of the four markers. A subgroup of tumors with an incomplete but strong smooth muscle differentiation phenotype presenting an intermediate metastatic risk was thus identified. Collectively, our results show that the smooth muscle differentiation classification method may be a useful diagnostic tool as well as a relevant prognostic tool for undifferentiated pleomorphic sarcomas.
Abstract:
Peripheral T-cell lymphoma (PTCL) encompasses a heterogeneous group of neoplasms with generally poor clinical outcome. Currently, 50% of PTCL cases are not further classifiable: PTCL-not otherwise specified (PTCL-NOS). Gene-expression profiles of 372 PTCL cases were analyzed, and robust molecular classifiers and oncogenic pathways that reflect the pathobiology of tumor cells and their microenvironment were identified for the major PTCL entities, including 114 angioimmunoblastic T-cell lymphomas (AITL), 31 anaplastic lymphoma kinase (ALK)-positive and 48 ALK-negative anaplastic large cell lymphomas, 14 adult T-cell leukemia/lymphomas and 44 extranodal NK/T-cell lymphomas that were further separated into NK-cell and γδ T-cell lymphomas. Thirty-seven percent of morphologically diagnosed PTCL-NOS cases were reclassified into other specific subtypes by molecular signatures. Reexamination, immunohistochemistry, and IDH2 mutation analysis in reclassified cases supported the validity of the reclassification. Two major molecular subgroups can be identified in the remaining PTCL-NOS cases, characterized by high expression of either GATA3 (33%; 40/121) or TBX21 (49%; 59/121). The GATA3 subgroup was significantly associated with poor overall survival (P = .01). High expression of a cytotoxic gene signature within the TBX21 subgroup also indicated poor clinical outcome (P = .05). In AITL, high expression of several signatures associated with the tumor microenvironment was significantly associated with outcome. A combined prognostic score was predictive of survival in an independent cohort (P = .004).
Abstract:
Among the soils of Mato Grosso do Sul, the Spodosols stand out in the Pantanal biome. Although these soils occur over considerable extensions, few studies have aimed to characterize and classify them. The purpose of this study was to characterize and classify soils in three areas of two physiographic types in the Taquari river basin: bay and flooded fields. Two trenches were opened in the bay area (P1 and P2) and two in the flooded field (P3 and P4). The third area (saline), with high sodium levels, was sampled for further studies. In the soils of both areas the sand fraction was predominant, the texture ranged from sand to sandy loam, and quartz was the main constituent. In the bay area, the soil organic carbon (OC) content in the surface layer (P1) was > 80 g kg-1, diagnostic of a histic epipedon. In the other profiles the surface horizons had low OC levels which, together with other properties, classified them as ochric epipedons. In the soils of the bay area (P1 and P2), the pH ranged from 5.0 to 7.5, associated with a dominance of Ca2+ and Mg2+, and base saturation was above 50% in some horizons. In the flooded fields (P3 and P4) the soil pH ranged from 4.9 to 5.9, H+ contents were high in the surface horizons (0.8-10.5 cmolc kg-1), Ca2+ and Mg2+ contents ranged from 0.4 to 0.8 cmolc kg-1, and base saturation was < 50%. In the soils of the bay area (P1 and P2), iron (extracted by dithionite, Fed) and OC accumulated in the spodic horizon; in the P3 and P4 soils only Fed accumulated, in the subsurface layers. According to the criteria adopted by the Brazilian System of Soil Classification (SiBCS) at the subgroup level, the soils were classified as: P1, Organic Hydromorphic Ferrohumiluvic Spodosol; P2, Typical Orthic Ferrohumiluvic Spodosol; P3, Typical Hydromorphic Ferroluvic Spodosol; P4, Arenic Orthic Ferroluvic Spodosol.
Abstract:
BACKGROUND: Maternal pregestational diabetes is a well-known risk factor for congenital anomalies. This study analyses the spectrum of congenital anomalies associated with maternal diabetes using data from a large European database for the population-based surveillance of congenital anomalies. METHODS: Data from 18 population-based EUROCAT registries of congenital anomalies in 1990-2005 were analysed. All malformed cases occurring to mothers with pregestational diabetes (diabetes cases) were compared to all malformed cases in the same registry areas occurring to mothers without diabetes (non-diabetes cases). RESULTS: There were 669 diabetes cases and 92,976 non-diabetes cases. Odds ratios in diabetes pregnancies relative to non-diabetes pregnancies, comparing each EUROCAT subgroup to all other non-chromosomal anomalies combined, were significantly increased for neural tube defects (anencephaly and encephalocele, but not spina bifida) and several subgroups of congenital heart defects. Other subgroups with significantly increased odds ratios were anotia, omphalocele and bilateral renal agenesis. The frequency of hip dislocation was significantly lower among diabetes cases (odds ratio 0.15, 95% CI 0.05-0.39) than non-diabetes cases. Multiple congenital anomalies were present in 13.6% of diabetes cases and 6.1% of non-diabetes cases. The odds ratio for caudal regression sequence was very high (26.40, 95% CI 8.98-77.64), but only 17% of all caudal regression cases resulted from a pregnancy with pregestational diabetes. CONCLUSIONS: The increased risk of congenital anomalies in pregnancies with pregestational diabetes relates to specific non-chromosomal congenital anomalies and to multiple congenital anomalies, not to a generally increased risk.
Abstract:
Background: Lung transplant recipients are frequently exposed to respiratory viruses and are particularly at risk for severe complications. The aim of this study was to assess the association among the presence of a respiratory virus detected by molecular assays in bronchoalveolar lavage (BAL) fluid, respiratory symptoms, and acute rejection in adult lung transplant recipients. Methods: Upper (nasopharyngeal swab) and lower (BAL) respiratory tract specimens from 77 lung transplant recipients enrolled in a cohort study and undergoing bronchoscopy with BAL and transbronchial biopsies were screened using 17 different polymerase chain reaction-based assays. Results: BAL fluid and biopsy specimens from 343 bronchoscopic procedures performed in 77 patients were analyzed. We also compared paired nasopharyngeal and BAL fluid specimens collected in a subgroup of 283 cases. The overall viral positivity rate was 29.3% in the upper respiratory tract specimens and 17.2% in the BAL samples (P < .001). We observed a significant association between the presence of respiratory symptoms and positive viral detection in the lower respiratory tract (P = .012). Conversely, acute rejection was not associated with the presence of viral infection (odds ratio, 0.41; 95% confidence interval, 0.20-0.88). The recovery of lung function was significantly slower when acute rejection and viral infection were both present. Conclusions: A temporal relationship exists between acute respiratory symptoms and positive viral nucleic acid detection in BAL fluid from lung transplant recipients. We provide evidence suggesting that respiratory viruses are not associated with acute graft rejection during the acute phase of infection.
Abstract:
BACKGROUND: Refinements in stent design affecting strut thickness, surface polymer, and drug release have improved clinical outcomes of drug-eluting stents. We aimed to compare the safety and efficacy of a novel, ultrathin strut cobalt-chromium stent releasing sirolimus from a biodegradable polymer with a thin strut durable polymer everolimus-eluting stent. METHODS: We did a randomised, single-blind, non-inferiority trial with minimum exclusion criteria at nine hospitals in Switzerland. We randomly assigned (1:1) patients aged 18 years or older with chronic stable coronary artery disease or acute coronary syndromes undergoing percutaneous coronary intervention to treatment with biodegradable polymer sirolimus-eluting stents or durable polymer everolimus-eluting stents. Randomisation was via a central web-based system and stratified by centre and presence of ST segment elevation myocardial infarction. Patients and outcome assessors were masked to treatment allocation, but treating physicians were not. The primary endpoint, target lesion failure, was a composite of cardiac death, target vessel myocardial infarction, and clinically-indicated target lesion revascularisation at 12 months. A margin of 3·5% was defined for non-inferiority of the biodegradable polymer sirolimus-eluting stent compared with the durable polymer everolimus-eluting stent. Analysis was by intention to treat. The trial is registered with ClinicalTrials.gov, number NCT01443104. FINDINGS: Between Feb 24, 2012, and May 22, 2013, we randomly assigned 2119 patients with 3139 lesions to treatment with sirolimus-eluting stents (1063 patients, 1594 lesions) or everolimus-eluting stents (1056 patients, 1545 lesions). 407 (19%) patients presented with ST-segment elevation myocardial infarction. Target lesion failure with biodegradable polymer sirolimus-eluting stents (69 cases; 6·5%) was non-inferior to durable polymer everolimus-eluting stents (70 cases; 6·6%) at 12 months (absolute risk difference -0·14%, upper limit of one-sided 95% CI 1·97%, p for non-inferiority <0·0004). No significant differences were noted in rates of definite stent thrombosis (9 [0·9%] vs 4 [0·4%], rate ratio [RR] 2·26, 95% CI 0·70-7·33, p=0·16). In pre-specified stratified analyses of the primary endpoint, biodegradable polymer sirolimus-eluting stents were associated with improved outcome compared with durable polymer everolimus-eluting stents in the subgroup of patients with ST-segment elevation myocardial infarction (7 [3·3%] vs 17 [8·7%], RR 0·38, 95% CI 0·16-0·91, p=0·024, p for interaction=0·014). INTERPRETATION: In a patient population with minimum exclusion criteria and high adherence to dual antiplatelet therapy, biodegradable polymer sirolimus-eluting stents were non-inferior to durable polymer everolimus-eluting stents for the combined safety and efficacy outcome target lesion failure at 12 months. The noted benefit in the subgroup of patients with ST-segment elevation myocardial infarction needs further study. FUNDING: Clinical Trials Unit, University of Bern, and Biotronik, Bülach, Switzerland.
Abstract:
Rationale: Clinical and electrophysiological prognostic markers of brain anoxia have mostly been evaluated in comatose survivors of out-of-hospital cardiac arrest (OHCA) after standard resuscitation, but their predictive value in patients treated with mild induced hypothermia (IH) is unknown. The objective of this study was to identify a predictive score of independent clinical and electrophysiological variables in comatose OHCA survivors treated with IH, aiming at a maximal positive predictive value (PPV) and a high negative predictive value (NPV) for mortality. Methods: We prospectively studied consecutive adult comatose OHCA survivors from April 2006 to May 2009, treated with mild IH to 33-34°C for 24 h at the intensive care unit of the Lausanne University Hospital, Switzerland. IH was applied using an external cooling method. As soon as subjects passively rewarmed (body temperature >35°C), they underwent EEG and SSEP recordings (off sedation) and were examined by experienced neurologists at least twice. Patients with status epilepticus were treated with antiepileptic drugs (AED) for at least 24 h. A multivariable logistic regression was performed to identify independent predictors of mortality at hospital discharge. These were used to formulate a predictive score. Results: 100 patients were studied; 61 died. Age, gender and OHCA etiology (cardiac vs. non-cardiac) did not differ between survivors and non-survivors. Cardiac arrest type (non-ventricular fibrillation vs. ventricular fibrillation), time to return of spontaneous circulation (ROSC) >25 min, failure to recover all brainstem reflexes, extensor or no motor response to pain, myoclonus, presence of epileptiform discharges on EEG, EEG background unreactive to pain, and bilaterally absent N20 on SSEP were all significantly associated with mortality. Absent N20 was the only variable showing no false positive results. Multivariable logistic regression identified four independent predictors (Table). These were used to construct the score, and its predictive values were calculated using a cut-off of 0-1 vs. 2-4 predictors. We found a PPV of 1.00 (95% CI: 0.93-1.00), an NPV of 0.81 (95% CI: 0.67-0.91) and an accuracy of 0.93 for mortality. Among the 9 patients who were predicted to survive by the score but eventually died, only 1 had absent N20. Conclusions: Pending validation in a larger cohort, this simple score represents a promising tool to identify patients who will survive, and most subjects who will not, after OHCA and IH. Furthermore, while SSEP are 100% predictive of poor outcome but not available in most hospitals, this study identifies EEG background reactivity as an important predictor after OHCA. The score appears robust even without SSEP, suggesting that SSEP and other investigations (e.g., mismatch negativity, serum NSE) might principally be needed to enhance prognostication in the small subgroup of patients failing to improve despite a favorable score.
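The decision rule behind such a score is simple enough to sketch in code. Below is a minimal illustration in Python; since the four independent predictors are listed only in the table referenced above, generic boolean flags stand in for them here, while the 0-1 vs. 2-4 cut-off comes from the abstract itself.

```python
# Minimal sketch of the proposed prognostic score (illustrative only).
# The four independent predictors are identified in the paper's table,
# so hypothetical boolean flags stand in for them here.

def predict_mortality(predictors: list) -> bool:
    """Predict death when 2-4 of the four predictors are present,
    survival when 0-1 are present (the cut-off stated in the abstract)."""
    if len(predictors) != 4:
        raise ValueError("the score is built from exactly four predictors")
    return sum(bool(p) for p in predictors) >= 2

# Example: a patient with two of the four predictors present
print(predict_mortality([True, True, False, False]))  # True -> predicted non-survivor
```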
Abstract:
Preservation of mangroves, a highly significant ecosystem from a social, economic, and environmental viewpoint, requires knowledge of soil composition, genesis, morphology, and classification. These aspects are of paramount importance for understanding the dynamics of sustainability and preservation of this natural resource. In this study, mangrove soils in the Subaé river basin were described and classified, and inorganic waste concentrations were evaluated. Seven pedons of mangrove soil were chosen, five under fluvial influence and two under marine influence, and analyzed for morphology. Samples of horizons and layers were collected for physical and chemical analyses, including heavy metals (Pb, Cd, Mn, Zn, and Fe). The moist soils were suboxidic, with Eh values below 350 mV. The pH of the pedons under fluvial influence ranged from moderately acid to alkaline, while the pH in pedons under marine influence was around 7.0 throughout the profile. The concentration of cations in the exchange complex for all pedons, independent of fluvial or marine influence, indicated the following order: Na+>Mg2+>Ca2+>K+. Mangrove soils of the Subaé river basin under fluvial and marine influence thus had different morphological, physical, and chemical characteristics. The highest Pb and Cd concentrations were found in the pedons under fluvial influence, possibly due to their proximity to the Plumbum mining company, while the concentrations in pedon P7 were lowest, owing to its greater distance from the factory. Because they contained at least one metal above the reference levels established by the National Oceanic and Atmospheric Administration (United States Environmental Protection Agency), the pedons were classified as potentially toxic. The soils were classified as Gleissolos Tiomórficos Órticos (sálicos) sódico neofluvissólico according to the Brazilian Soil Classification System, indicating potential toxicity and very poor drainage, except for pedon P7, which was classified in the same subgroup as the others but differed in that its metal concentrations met acceptable standards.
Abstract:
Quantitative information from magnetic resonance imaging (MRI) may substantiate clinical findings and provide additional insight into the mechanism of clinical interventions in therapeutic stroke trials. The PERFORM study is exploring the efficacy of terutroban versus aspirin for secondary prevention in patients with a history of ischemic stroke. We report on the design of an exploratory longitudinal MRI follow-up study that was performed in a subgroup of the PERFORM trial. An international multi-centre longitudinal follow-up MRI study was designed for different MR systems, employing the following safety and efficacy readouts: new T2 lesions, new DWI lesions, whole brain volume change, hippocampal volume change, changes in tissue microstructure as depicted by mean diffusivity and fractional anisotropy, vessel patency on MR angiography, and the presence and development of new microbleeds. A total of 1,056 patients (men and women ≥ 55 years) were included. The data analysis included 3D reformation, image registration of different contrasts, tissue segmentation, and automated lesion detection. This large international multi-centre study demonstrates how new MRI readouts can be used to provide key information on the evolution of lesions within cerebral tissue and the macrovasculature after atherothrombotic stroke in a large sample of patients.
Abstract:
A common way to model multiclass classification problems is by means of Error-Correcting Output Codes (ECOC). Given a multiclass problem, the ECOC technique designs a code word for each class, where each position of the code identifies the membership of the class in a given binary problem. A classification decision is obtained by assigning the label of the class with the closest code word. One of the main requirements of the ECOC design is that the base classifier be capable of splitting the subgroups of classes defined by each binary problem. However, we cannot guarantee that a linear classifier can model convex regions, and nonlinear classifiers also fail to handle some types of decision surfaces. In this paper, we present a novel strategy to model multiclass classification problems using subclass information in the ECOC framework. Complex problems are solved by splitting the original set of classes into subclasses and embedding the binary problems in a problem-dependent ECOC design. Experimental results show that the proposed splitting procedure yields better performance when the class overlap or the distribution of the training objects conceals the decision boundaries for the base classifier. The results are even more significant when one has a sufficiently large training set.
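To make the basic decision rule concrete, here is a minimal Python sketch of plain ECOC decoding as described above (the code matrix and classifier outputs are made up for illustration; the paper's subclass-splitting design is not shown):

```python
import numpy as np

# Plain ECOC decoding: each class has a binary code word, each code position
# corresponds to one binary classifier, and a sample is assigned to the class
# whose code word is closest (in Hamming distance) to the classifiers' outputs.

code_words = np.array([
    [1, 1, 0],  # class 0
    [0, 1, 1],  # class 1
    [1, 0, 1],  # class 2
])

def ecoc_decode(binary_outputs: np.ndarray) -> int:
    """Return the class whose code word has minimal Hamming distance."""
    distances = np.abs(code_words - binary_outputs).sum(axis=1)
    return int(np.argmin(distances))

# Example: the three binary classifiers output (1, 0, 1) for some sample
print(ecoc_decode(np.array([1, 0, 1])))  # -> 2
```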
Abstract:
Chronic obstructive pulmonary disease (COPD) is the primary indication for lung transplantation (LTx), but the survival benefit is still under debate. We analysed the survival impact of LTx in COPD with a new approach, using the BODE (body mass index, airway obstruction, dyspnoea, exercise capacity) index. We retrospectively reviewed 54 consecutive lung transplants performed for COPD. The pre-transplant BODE score was calculated for each patient and a predicted survival was derived from the survival functions of the original BODE index validation cohort. Predicted and observed post-transplant survival were then compared. In the subgroups with a BODE score ≥7 and <7, a majority of patients (66% and 69%, respectively) lived longer after LTx than predicted by their individual BODE index. The median survival was significantly improved in the entire cohort and in the subgroup with a BODE score ≥7. Four years after LTx, a survival benefit was apparent only in patients with a pre-transplant BODE score ≥7. In conclusion, while a majority of COPD patients had an individual survival benefit from LTx regardless of their pre-transplant BODE score, a global survival benefit was seen only in patients with more severe disease. This supports the use of the BODE index as a selection criterion for LTx candidates.
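For readers unfamiliar with the index, the sketch below shows how a BODE score is typically computed. It is a minimal Python illustration; the point cut-offs are assumptions recalled from the original validation study by Celli et al., not taken from this abstract:

```python
# Minimal sketch of the BODE index (0-10). The cut-offs below are assumed
# from the original validation study (Celli et al.), not from this abstract.

def bode_index(bmi: float, fev1_pct: float, mmrc: int, walk_m: float) -> int:
    """bmi: body mass index; fev1_pct: FEV1 as % of predicted;
    mmrc: modified MRC dyspnoea grade (0-4); walk_m: 6-min walk distance (m)."""
    fev1_pts = 0 if fev1_pct >= 65 else 1 if fev1_pct >= 50 else 2 if fev1_pct >= 36 else 3
    walk_pts = 0 if walk_m >= 350 else 1 if walk_m >= 250 else 2 if walk_m >= 150 else 3
    mmrc_pts = 0 if mmrc <= 1 else mmrc - 1
    bmi_pts = 0 if bmi > 21 else 1
    return fev1_pts + walk_pts + mmrc_pts + bmi_pts

# Example: a severely affected candidate falls in the high-risk (>=7) subgroup
print(bode_index(bmi=20, fev1_pct=30, mmrc=4, walk_m=140))  # -> 10
```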
Abstract:
BACKGROUND: Hospitals in countries with public health systems have recently adopted organizational changes to improve efficiency and resource allocation, and reducing inappropriate hospitalizations has been established as an important goal. AIMS: Our goal was to describe the functioning of a Quick Diagnosis Unit in a Spanish public university hospital after evaluating 1,000 consecutive patients. We also aimed to ascertain the degree of satisfaction among Quick Diagnosis Unit patients and the costs of the model compared to conventional hospitalization. DESIGN: Observational, descriptive study. METHODS: Our sample comprised 1,000 patients evaluated between November 2008 and January 2010 in the Quick Diagnosis Unit of a tertiary public university hospital in Barcelona. Included patients were those who had potentially severe diseases and would normally require hospital admission for diagnosis but whose general condition allowed outpatient treatment. We analyzed several variables, including time to diagnosis, final diagnoses and hospitalizations avoided, and we also investigated the mean cost (as compared to conventional hospitalization) and the patients' satisfaction. RESULTS: In 88% of cases, the reasons for consultation were anemia, anorexia-cachexia syndrome, febrile syndrome, adenopathies, abdominal pain, chronic diarrhea and lung abnormalities. The most frequent diagnoses were cancer (18.8%; mainly colon cancer and lymphoma) and iron-deficiency anemia (18%). The mean time to diagnosis was 9.2 days (range 1 to 19 days). An estimated 12.5 admissions/day were avoided in the internal medicine department over a one-year period. In a subgroup analysis, the mean cost per process (admission to discharge) was 3,416.13 Euros for a conventional hospitalization versus 735.65 Euros in the Quick Diagnosis Unit. Patients expressed a high degree of satisfaction with Quick Diagnosis Unit care. CONCLUSIONS: Quick Diagnosis Units represent a useful and cost-saving model for the diagnostic study of patients with potentially severe diseases. Future randomized study designs comparing control and intervention groups would help elucidate the usefulness of Quick Diagnosis Units as an alternative to conventional hospitalization.
Abstract:
Background/Aims: The epidemiology of Chagas disease, until recently confined to areas of continental Latin America, has undergone considerable changes in recent decades due to migration to other parts of the world, including Spain. We studied the prevalence of Chagas disease in Latin American patients treated at a health center in Barcelona, evaluated its clinical phase, and make some recommendations for screening for the disease. Methodology/Principal Findings: We performed an observational, cross-sectional prevalence study by means of an immunochromatographic screening test of all continental Latin American patients over the age of 14 years visiting the health centre from October 2007 to October 2009. The diagnosis was confirmed by serological methods: a conventional in-house ELISA (cELISA), a commercial kit (rELISA) and an ELISA using T. cruzi lysate (Ortho-Clinical Diagnostics) (oELISA). Of 766 patients studied, 22 were diagnosed with T. cruzi infection, showing a prevalence of 2.87% (95% CI, 1.6-4.12%). Of the infected patients (45.45% men and 54.55% women), 21 were from Bolivia, showing a prevalence in the Bolivian subgroup (n = 127) of 16.53% (95% CI, 9.6-23.39%). All the infected patients were in a chronic phase of Chagas disease: 81% with the indeterminate form, 9.5% with the cardiac form and 9.5% with the cardiodigestive form. All patients infected with T. cruzi had heard of Chagas disease in their country of origin, 82% knew someone affected, and 77% had a significant history of living in adobe houses in rural areas. Conclusions: We found a high prevalence of T. cruzi infection in immigrants from Bolivia. Detection of T. cruzi-infected persons by screening programs in non-endemic countries would control non-vectorial transmission and would benefit the persons affected, public health and national health systems.
Abstract:
Free-living energy expenditure (EE) was assessed in 37 young pregnant Gambian women at the 12th (n = 11, 53.5 +/- 1.7 kg), 24th (n = 14, 54.7 +/- 2.1 kg), and 36th (n = 12, 65.0 +/- 2.6 kg) wk of pregnancy and was compared with that of nonpregnant nonlactating (NPNL) control women (n = 12, 50.3 +/- 1.6 kg). Two methods were used to assess EE: 1) the heart rate (HR) method, using individual regression lines (HR vs. EE) established at different activity levels in a respiration chamber, and 2) the doubly labeled water (2H2(18)O) method, in a subgroup of 25 pregnant and 7 control women. With the HR method, EE during the agricultural rainy season was 2,408 +/- 87, 2,293 +/- 122, and 2,782 +/- 130 kcal/day at 12, 24, and 36 wk of gestation, values not significantly different from the control group (2,502 +/- 133 kcal/day). These findings were confirmed by the 2H2(18)O measurements, which failed to show any effect of pregnancy on EE. Expressed per unit body weight, free-living EE was lower (P < 0.01 with the 2H2(18)O method) at 36 wk of gestation than in the NPNL group. It is concluded that, in these Gambian women, energy-sparing mechanisms that help meet the additional energy stress of gestation operate during pregnancy (e.g., diminished spontaneous physical activity).