958 results for STANDARD-RISK
Abstract:
BACKGROUND & AIMS Development of strictures is a major concern for patients with eosinophilic esophagitis (EoE). At diagnosis, EoE can present with an inflammatory phenotype (characterized by whitish exudates, furrows, and edema), a stricturing phenotype (characterized by rings and stenosis), or a combination of these. Little is known about progression of stricture formation; we evaluated stricture development over time in the absence of treatment and investigated risk factors for stricture formation. METHODS We performed a retrospective study using the Swiss EoE Database, collecting data on 200 patients with symptomatic EoE (153 men; mean age at diagnosis, 39 ± 15 years). Stricture severity was graded based on the degree of difficulty associated with passing the standard adult endoscope. RESULTS The median delay in diagnosis of EoE was 6 years (interquartile range, 2-12 years). With increasing duration of delay in diagnosis, the prevalence of fibrotic features of EoE, based on endoscopy, increased from 46.5% (diagnostic delay, 0-2 years) to 87.5% (diagnostic delay, >20 years; P = .020). Similarly, the prevalence of esophageal strictures increased with duration of diagnostic delay, from 17.2% (diagnostic delay, 0-2 years) to 70.8% (diagnostic delay, >20 years; P < .001). Diagnostic delay was the only risk factor for strictures at the time of EoE diagnosis (odds ratio = 1.08; 95% confidence interval: 1.040-1.122; P < .001). CONCLUSIONS The prevalence of esophageal strictures correlates with the duration of untreated disease. These findings indicate the need to minimize delay in diagnosis of EoE.
Abstract:
OBJECTIVES: To assess health care utilisation for patients co-infected with TB and HIV (TB-HIV), and to develop a weighted health care index (HCI) score based on commonly used interventions and compare it with patient outcome. METHODS: A total of 1061 HIV patients diagnosed with TB in four regions, Central/Northern, Southern and Eastern Europe and Argentina, between January 2004 and December 2006 were enrolled in the TB-HIV study. A weighted HCI score (range 0–5) was constructed from independent prognostic factors identified in multivariable Cox models; the final score included performance of TB drug susceptibility testing (DST), an initial TB regimen containing a rifamycin, isoniazid and pyrazinamide, and start of combination antiretroviral treatment (cART). RESULTS: The mean HCI score was highest in Central/Northern Europe (3.2, 95%CI 3.1–3.3) and lowest in Eastern Europe (1.6, 95%CI 1.5–1.7). The cumulative probability of death 1 year after TB diagnosis decreased from 39% (95%CI 31–48) among patients with an HCI score of 0, to 9% (95%CI 6–13) among those with a score of ≥4. In an adjusted Cox model, a 1-unit increase in the HCI score was associated with 27% reduced mortality (relative hazard 0.73, 95%CI 0.64–0.84). CONCLUSIONS: Our results suggest that DST, standard anti-tuberculosis treatment and early cART may improve outcome for TB-HIV patients. The proposed HCI score provides a tool for future research and monitoring of the management of TB-HIV patients. The highest HCI score may serve as a benchmark to assess TB-HIV management, encouraging continuous health care improvement.
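The index construction described above can be sketched as follows. The component weights below are hypothetical placeholders (the published weights were derived from the multivariable Cox models), so this illustrates only the scoring mechanics, not the validated score.

```python
# Hypothetical sketch of a weighted health care index (HCI) score.
# The three components mirror those named in the abstract; the weights
# are illustrative only and chosen so the maximum score is 5.

def hci_score(dst_performed: bool, standard_regimen: bool, early_cart: bool) -> int:
    """Sum weighted care components into a 0-5 index (assumed weights)."""
    weights = {
        "dst_performed": 1,     # TB drug susceptibility testing done
        "standard_regimen": 2,  # rifamycin + isoniazid + pyrazinamide
        "early_cart": 2,        # combination antiretroviral treatment started
    }
    return (weights["dst_performed"] * dst_performed
            + weights["standard_regimen"] * standard_regimen
            + weights["early_cart"] * early_cart)

print(hci_score(True, True, True))   # full marks under these assumed weights
```

In the study, higher scores tracked lower 1-year mortality (39% at a score of 0 versus 9% at ≥4), which is what makes such an index usable as a management benchmark.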
Abstract:
OBJECTIVE To examine the degree to which use of β blockers, statins, and diuretics in patients with impaired glucose tolerance and other cardiovascular risk factors is associated with new onset diabetes. DESIGN Reanalysis of data from the Nateglinide and Valsartan in Impaired Glucose Tolerance Outcomes Research (NAVIGATOR) trial. SETTING NAVIGATOR trial. PARTICIPANTS Patients who at baseline (enrolment) were treatment naïve to β blockers (n=5640), diuretics (n=6346), statins (n=6146), and calcium channel blockers (n=6294). Calcium channel blockers served as a metabolically neutral control. MAIN OUTCOME MEASURES Development of new onset diabetes diagnosed by standard plasma glucose level in all participants and confirmed with glucose tolerance testing within 12 weeks after the increased glucose value was recorded. The relation between each treatment and new onset diabetes was evaluated using marginal structural models for causal inference, to account for time dependent confounding in treatment assignment. RESULTS During a median five years of follow-up, β blockers were started in 915 (16.2%) patients, diuretics in 1316 (20.7%), statins in 1353 (22.0%), and calcium channel blockers in 1171 (18.6%). After adjusting for baseline characteristics and time varying confounders, diuretics and statins were both associated with an increased risk of new onset diabetes (hazard ratio 1.23, 95% confidence interval 1.06 to 1.44, and 1.32, 1.14 to 1.48, respectively), whereas β blockers and calcium channel blockers were not associated with new onset diabetes (1.10, 0.92 to 1.31, and 0.95, 0.79 to 1.13, respectively). CONCLUSIONS Among people with impaired glucose tolerance and other cardiovascular risk factors and with serial glucose measurements, diuretics and statins were associated with an increased risk of new onset diabetes, whereas the effect of β blockers was non-significant.
Abstract:
Mood disorders are the most common form of mental illness and one of the leading causes of morbidity worldwide. Major depressive disorder and bipolar disorder have a lifetime prevalence of 16.2% and 4.4%, respectively. Women comprise a substantial proportion of this population, and an estimated 500,000 pregnancies each year involve women with a psychiatric condition. Management with psychotropic medications is considered standard of care for most patients with mood disorders. However, many of these medications are known human teratogens. Because pregnant women with mood disorders face a high risk of relapse if unmanaged, the obstetrician faces a unique challenge in providing the best care to both mother and baby. It has been suggested that many obstetricians overestimate the teratogenic risks associated with psychotropic medications, while concurrently underestimating the risks associated with unmanaged mood disorders. This may be due to a knowledge gap regarding the most current teratogen information and a lack of official management guidelines. Therefore, the purpose of this study is to determine the current knowledge base of obstetricians regarding the teratogenic effects of common psychotropic medications, as well as to capture current management practices for pregnant women with mood disorders. A total of 117 Texas obstetricians responded to a survey regarding teratogen knowledge and management practice. It was common for respondents to encounter women who disclose both having a mood disorder and taking a psychotropic medication during pregnancy. Many respondents did not utilize up-to-date drug counseling resources, and were unaware of or over-estimated the teratogenic risks of common medications used to treat mood disorders. Finally, many respondents reported wanting to refer pregnant patients with mood disorders to psychiatrists for co-management, but are reportedly restricted in doing so due to accessibility or insurance issues.
This study demonstrates that there is a knowledge gap among obstetricians regarding the teratogenicity of common psychotropic medications utilized to manage a patient population they frequently encounter. Further, obstetricians have vastly different risk perceptions of these medications, resulting in various management approaches and recommendations. Future research should focus on establishing standard practice guidelines, as well as better accessibility to psychiatric services for pregnant women.
Abstract:
The increased use of vancomycin in hospitals has resulted in a standard practice of monitoring serum vancomycin levels because of possible nephrotoxicity. However, the routine monitoring of vancomycin serum concentration is under criticism, and its cost effectiveness is in question, because frequent monitoring neither increases efficacy nor decreases nephrotoxicity. The purpose of the present study is to determine factors that may place patients at increased risk of developing vancomycin-induced nephrotoxicity and for whom monitoring may be most beneficial. From September to December 1992, 752 consecutive inpatients at The University of Texas M. D. Anderson Cancer Center, Houston, were prospectively evaluated for nephrotoxicity in order to describe predictive risk factors for developing vancomycin-related nephrotoxicity. Ninety-five patients (13 percent) developed nephrotoxicity. A total of 299 patients (40 percent) were considered monitored (vancomycin serum levels determined during the course of therapy), and 346 patients (46 percent) were receiving concurrent moderate to highly nephrotoxic drugs. Factors found to be significantly associated with nephrotoxicity in univariate analysis were: gender, baseline serum creatinine greater than 1.5 mg/dl, monitoring, leukemia, concurrent moderate to highly nephrotoxic drugs, and APACHE III scores of 40 or more.
Significant factors in the univariate analysis were then entered into a stepwise logistic regression analysis to determine independent predictive risk factors for vancomycin-induced nephrotoxicity. Factors, with their corresponding odds ratios and 95% confidence limits, selected by stepwise logistic regression analysis as predictive of vancomycin-induced nephrotoxicity were: concurrent therapy with moderate to highly nephrotoxic drugs (2.89; 1.76-4.74), APACHE III scores of 40 or more (1.98; 1.16-3.38), and male gender (1.98; 1.04-2.71). Subgroup (monitored and non-monitored) analysis showed that male gender (OR = 1.87; 95% CI = 1.01, 3.45) and moderate to highly nephrotoxic drugs (OR = 4.58; 95% CI = 2.11, 9.94) were significant for nephrotoxicity in monitored patients. However, only APACHE III score (OR = 2.67; 95% CI = 1.13, 6.29) was significant for nephrotoxicity in non-monitored patients. The conclusion drawn from this study is that not every patient receiving vancomycin therapy needs frequent monitoring of vancomycin serum levels. Such routine monitoring may be appropriate in patients with one or more of the identified risk factors, while low-risk patients need not be subjected to the discomfort and added cost of multiple blood sampling. Such prudent selection of patients to monitor may decrease costs to patients and the hospital.
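Odds ratios and confidence limits like those reported above come from exponentiating logistic-regression coefficients. A minimal sketch follows; the coefficient and standard error are back-calculated for illustration (not taken from the study) and approximately reproduce the nephrotoxic-drug estimate of 2.89 (1.76-4.74).

```python
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Turn a logistic-regression coefficient and its standard error
    into an odds ratio with a 95% Wald confidence interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Illustrative values: beta close to ln(2.89), SE about 0.25
or_, lo, hi = odds_ratio_ci(1.06, 0.25)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

An interval that excludes 1.0, as all three reported intervals do, is what marks each factor as a statistically significant independent predictor.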
Abstract:
The association of measures of physical activity with coronary heart disease (CHD) risk factors in children, especially those for atherosclerosis, is unknown. The purpose of this study was to determine the association of physical activity and cardiovascular fitness with blood lipids and lipoproteins in pre-adolescent and adolescent girls. The study population comprised 131 girls aged 9 to 16 years who participated in the Children's Nutrition Research Center's Adolescent Study. The dependent variables, blood lipids and lipoproteins, were measured by standard techniques. The independent variables were physical activity, measured as the difference between total energy expenditure (TEE) and basal metabolic rate (BMR), and cardiovascular fitness, VO2max (ml/min/kg). TEE was measured by the doubly-labeled water (DLW) method, and BMR by whole-room calorimetry. Cardiovascular fitness, VO2max (ml/min/kg), was measured on a motorized treadmill. The potential confounding variables were sexual maturation (Tanner breast stage), ethnic group, body fat percent, and dietary variables. A systematic strategy for data analysis was used to isolate the effects of physical activity and cardiovascular fitness on blood lipids, beginning with assessment of confounding and interaction. Next, from regression models predicting each blood lipid and controlling for covariables, hypotheses were evaluated by the direction and value of the coefficients for physical activity and cardiovascular fitness. The main result was that cardiovascular fitness appeared to be more strongly associated with blood lipids than physical activity.
An interaction between cardiovascular fitness and sexual maturation indicated that the effect of cardiovascular fitness on most blood lipids was dependent on the stage of sexual maturation. A difference of 760 kcal/d in physical activity (which represents the difference between the 25th and 75th percentile of physical activity) was associated with negligible differences in blood lipids. In contrast, a difference of 10 ml/min/kg in VO2max or cardiovascular fitness (which represents the difference between the 25th and 75th percentile in cardiovascular fitness) in the early stages of sexual maturation was associated with an average positive difference of 15 mg/100 ml ApoA-1 and 10 mg/100 ml HDL-C.
Abstract:
BACKGROUND The early repolarization (ER) pattern is associated with an increased risk of arrhythmogenic sudden death. However, strategies for risk stratification of patients with the ER pattern are not fully defined. OBJECTIVES This study sought to determine the role of electrophysiology studies (EPS) in risk stratification of patients with ER syndrome. METHODS In a multicenter study, 81 patients with ER syndrome (age 36 ± 13 years, 60 males) and aborted sudden death due to ventricular fibrillation (VF) were included. EPS were performed following the index VF episode using a standard protocol. Inducibility was defined by the provocation of sustained VF. Patients were followed up by serial implantable cardioverter-defibrillator interrogations. RESULTS Despite a recent history of aborted sudden death, VF was inducible in only 18 of 81 (22%) patients. During follow-up of 7.0 ± 4.9 years, 6 of 18 (33%) patients with inducible VF during EPS experienced VF recurrences, whereas 21 of 63 (33%) patients who were noninducible experienced recurrent VF (p = 0.93). VF storm occurred in 3 patients from the inducible VF group and in 4 patients in the noninducible group. VF inducibility was not associated with maximum J-wave amplitude (VF inducible vs. VF noninducible; 0.23 ± 0.11 mV vs. 0.21 ± 0.11 mV; p = 0.42) or J-wave distribution (inferior, odds ratio [OR]: 0.96 [95% confidence interval (CI): 0.33 to 2.81]; p = 0.95; lateral, OR: 1.57 [95% CI: 0.35 to 7.04]; p = 0.56; inferior and lateral, OR: 0.83 [95% CI: 0.27 to 2.55]; p = 0.74), which have previously been demonstrated to predict outcome in patients with an ER pattern. CONCLUSIONS Our findings indicate that current programmed stimulation protocols do not enhance risk stratification in ER syndrome.
Abstract:
BACKGROUND Potentially avoidable risk factors continue to cause unnecessary disability and premature death in older people. Health risk assessment (HRA), a method successfully used in working-age populations, is a promising method for cost-effective health promotion and preventive care in older individuals, but the long-term effects of this approach are unknown. The objective of this study was to evaluate the effects of an innovative approach to HRA and counselling in older individuals for health behaviours, preventive care, and long-term survival. METHODS AND FINDINGS This study was a pragmatic, single-centre randomised controlled clinical trial in community-dwelling individuals aged 65 y or older registered with one of 19 primary care physician (PCP) practices in a mixed rural and urban area in Switzerland. From November 2000 to January 2002, 874 participants were randomly allocated to the intervention and 1,410 to usual care. The intervention consisted of HRA based on self-administered questionnaires and individualised computer-generated feedback reports, combined with nurse and PCP counselling over a 2-y period. Primary outcomes were health behaviours and preventive care use at 2 y and all-cause mortality at 8 y. At baseline, participants in the intervention group had a mean ± standard deviation of 6.9 ± 3.7 risk factors (including unfavourable health behaviours, health and functional impairments, and social risk factors) and 4.3 ± 1.8 deficits in recommended preventive care. At 2 y, favourable health behaviours and use of preventive care were more frequent in the intervention than in the control group (based on z-statistics from generalised estimating equation models). For example, 70% compared to 62% were physically active (odds ratio 1.43, 95% CI 1.16-1.77, p = 0.001), and 66% compared to 59% had influenza vaccinations in the past year (odds ratio 1.35, 95% CI 1.09-1.66, p = 0.005). 
At 8 y, based on an intention-to-treat analysis, the estimated proportion alive was 77.9% in the intervention and 72.8% in the control group, for an absolute mortality difference of 4.9% (95% CI 1.3%-8.5%, p = 0.009; based on z-test for risk difference). The hazard ratio of death comparing intervention with control was 0.79 (95% CI 0.66-0.94, p = 0.009; based on Wald test from Cox regression model), and the number needed to receive the intervention to prevent one death was 21 (95% CI 12-79). The main limitations of the study include the single-site study design, the use of a brief self-administered questionnaire for 2-y outcome data collection, the unavailability of other long-term outcome data (e.g., functional status, nursing home admissions), and the availability of long-term follow-up data on mortality for analysis only in 2014. CONCLUSIONS This is the first trial to our knowledge demonstrating that a collaborative care model of HRA in community-dwelling older people not only results in better health behaviours and increased use of recommended preventive care interventions, but also improves survival. The intervention tested in our study may serve as a model of how to implement a relatively low-cost but effective programme of disease prevention and health promotion in older individuals. TRIAL REGISTRATION International Standard Randomized Controlled Trial Number: ISRCTN 28458424.
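The number needed to treat quoted above is the reciprocal of the absolute mortality difference, rounded up to a whole person. A quick check using the reported 4.9% difference:

```python
import math

# Reported absolute mortality difference at 8 years (4.9%).
# Note: the rounded survival proportions (77.9% vs 72.8%) give 5.1%,
# so the published figure presumably uses unrounded estimates.
arr = 0.049

# Number needed to treat: patients who must receive the intervention
# to prevent one death, rounded up to a whole person.
nnt = math.ceil(1 / arr)
print(nnt)  # prints 21
```

This matches the reported NNT of 21 (95% CI 12-79); the wide interval reflects the uncertainty in the underlying risk difference.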
Abstract:
Field: Surgery. Introduction: Carotid endarterectomy (CEA) and coronary artery bypass grafting (CABG) can be approached in a combined or a staged fashion. Several key studies have shown no significant difference in peri-operative stroke and death rates between combined and staged CEA/CABG. At present, conventional extracorporeal circulation (CECC) is regarded as the gold standard for performing on-pump coronary artery bypass grafting. By contrast, the use of minimized extracorporeal circulation (MECC) for CABG diminishes hemodilution, blood-air contact, foreign-surface contact, and the inflammatory response. At the same time, general anaesthesia (GA) is a potential risk factor for a higher perioperative stroke rate after isolated CEA, not only for the ipsilateral but also for the contralateral side, especially in cases of contralateral high-grade stenosis or occlusion. The aim of the study was to analyze whether synchronous CEA/CABG using MECC (CEA/CABG group) reduces the perioperative stroke risk to the level of isolated CEA performed under GA (CEA-GA group). – Methods: A retrospective analysis of all patients who underwent CEA at our institution between January 2005 and December 2012 was performed. We compared outcomes between all patients undergoing CEA/CABG and all patients undergoing isolated CEA-GA during the same time period. The CEA/CABG group was additionally compared to a reference group consisting of patients undergoing isolated CEA under local anaesthesia. The primary outcome was in-hospital stroke. – Results: A total of 367 CEAs were performed, from which 46 patients were excluded for having either off-pump CABG or cardiac surgery procedures other than CABG combined with CEA. Of the 321 remaining patients, 74 were in the CEA/CABG and 64 in the CEA-GA group. There was a significantly higher rate of symptomatic stenoses among patients in the CEA-GA group (p<0.002).
Three strokes (4.1%) were registered in the CEA/CABG group, two ipsilateral (2.7%) and one contralateral (1.4%) to the operated side. In the CEA-GA group, 2 ipsilateral strokes (3.1%) occurred. No difference was noticed between the groups (p=1.000). One patient with stroke in each group had a symptomatic stenosis preoperatively. – Conclusions: Outcome with regard to mortality and neurologic injury is very good both in patients undergoing CEA alone and in patients undergoing synchronous CEA and CABG using the MECC system. Although the CEA/CABG group showed a slightly increased risk of stroke, combined treatment can be considered in particular clinical situations.
Abstract:
AIMS The GLOBAL LEADERS trial is a superiority study in patients undergoing percutaneous coronary intervention, with a uniform use of Biolimus A9-eluting stents (BES) and bivalirudin. GLOBAL LEADERS was designed to assess whether a 24-month antithrombotic regimen with ticagrelor and one month of acetylsalicylic acid (ASA), compared to conventional dual antiplatelet therapy (DAPT), improves outcomes. METHODS AND RESULTS Patients (n >16,000) are randomised (1:1 ratio) to ticagrelor 90 mg twice daily for 24 months plus ASA ≤100 mg for one month versus DAPT with either ticagrelor (acute coronary syndrome) or clopidogrel (stable coronary artery disease) for 12 months plus ASA ≤100 mg for 24 months. The primary outcome is a composite of all-cause mortality or non-fatal, new Q-wave myocardial infarction at 24 months. The key safety endpoint is investigator-reported type 3 or 5 bleeding according to the Bleeding Academic Research Consortium (BARC) definitions. Sensitivity analyses will be carried out to explore potential differences in outcome across geographic regions and according to specific angiographic and clinical risk estimates. CONCLUSIONS The GLOBAL LEADERS trial aims to assess the role of ticagrelor as a single antiplatelet agent after a short course of DAPT for the long-term prevention of cardiac adverse events, across a wide spectrum of patients, following BES implantation.
Abstract:
PURPOSE To determine the predictive value of the vertebral trabecular bone score (TBS) alone or in addition to bone mineral density (BMD) with regard to fracture risk. METHODS Retrospective analysis of the relative contribution of BMD [measured at the femoral neck (FN), total hip (TH), and lumbar spine (LS)] and TBS with regard to the risk of incident clinical fractures in a representative cohort of elderly post-menopausal women previously participating in the Swiss Evaluation of the Methods of Measurement of Osteoporotic Fracture Risk study. RESULTS Complete datasets were available for 556 of 701 women (79%). Mean age 76.1 years, LS BMD 0.863 g/cm², and TBS 1.195. LS BMD and LS TBS were moderately correlated (r² = 0.25). After a mean of 2.7 ± 0.8 years of follow-up, the incidence of fragility fractures was 9.4%. Age- and BMI-adjusted hazard ratios per standard deviation decrease (95% confidence intervals) were 1.58 (1.16-2.16), 1.77 (1.31-2.39), and 1.59 (1.21-2.09) for LS, FN, and TH BMD, respectively, and 2.01 (1.54-2.63) for TBS. Whereas 58% and 60% of fragility fractures occurred in women with a BMD T-score ≤ -2.5 and a TBS <1.150, respectively, combining these two thresholds identified 77% of all women with an osteoporotic fracture. CONCLUSIONS Lumbar spine TBS alone or in combination with BMD predicted incident clinical fracture risk in a representative population-based sample of elderly post-menopausal women.
Abstract:
We read with great interest the large-scale network meta-analysis by Kowalewski et al. comparing clinical outcomes of patients undergoing coronary artery bypass grafting (CABG) operated on using minimal invasive extracorporeal circulation (MiECC) or off-pump (OPCAB) with those undergoing surgery on conventional cardiopulmonary bypass (CPB) [1]. The authors integrated two recently published meta-analyses, comparing MiECC and OPCAB with conventional CPB, respectively [2, 3], into a single study. According to the results of this study, MiECC and OPCAB are both strongly associated with improved perioperative outcomes following CABG when compared with CABG performed on conventional CPB. The authors conclude that MiECC may represent an attractive compromise between OPCAB and conventional CPB. After carefully reading the whole manuscript, it becomes evident that the role of MiECC is clearly undervalued. Detailed statistical analysis using the surface under the cumulative ranking probabilities indicated that MiECC represented the safer and more effective intervention regarding all-cause mortality and protection from myocardial infarction, cerebral stroke, postoperative atrial fibrillation and renal dysfunction when compared with OPCAB. Even though no significant statistical differences were demonstrated between MiECC and OPCAB, the superiority of MiECC is obvious from the hierarchy of treatments in the probability analysis, which ranked MiECC as the first treatment, followed by OPCAB and conventional CPB. Thus, MiECC does not represent a compromise between OPCAB and conventional CPB, but an attractive dominant technique in CABG surgery. These results are consistent with the largest published meta-analysis by Anastasiadis et al. comparing MiECC versus conventional CPB, including a total of 2770 patients.
A significant decrease in mortality was observed when MiECC was used, which was also associated with reduced risk of postoperative myocardial infarction and neurological events [4]. Similarly, another recent meta-analysis by Benedetto et al. compared MiECC versus OPCAB and found comparable outcomes between these two surgical techniques [5]. As stated in the text, the superiority of MiECC over OPCAB observed in the current network meta-analysis could be attributed to the fact that MiECC offers the potential for complete revascularization, whereas OPCAB poses a challenge for inexperienced surgeons, especially when distal marginal branches on the lateral and/or posterior wall of the heart need revascularization. This is reflected by a significantly lower number of distal anastomoses performed in OPCAB when compared with conventional CPB. Therefore, taking into consideration the literature published to date, including the results of the current article, we advocate that MiECC should be integrated into clinical practice guidelines as a state-of-the-art technique and become a standard practice for perfusion in coronary revascularization surgery.
Abstract:
Sentinel lymph node (SLN) detection techniques have the potential to change the standard of surgical care for patients with prostate cancer. We performed a lymphatic mapping study and determined the value of fluorescence SLN detection with indocyanine green (ICG) for the detection of lymph node metastases in intermediate- and high-risk patients undergoing radical prostatectomy and extended pelvic lymph node dissection. A total of 42 patients received systematic or specific ICG injections into the prostate base, the midportion, the apex, the left lobe, or the right lobe. We found (1) that external and internal iliac regions encompass the majority of SLNs, (2) that common iliac regions contain up to 22% of all SLNs, (3) that a prostatic lobe can drain into the contralateral group of pelvic lymph nodes, and (4) that the fossa of Marcille also receives significant drainage. Among the 12 patients who received systematic ICG injections, 5 (42%) had a total of 29 lymph node metastases. Of these, 16 nodes were ICG positive, yielding 55% sensitivity. The complex drainage pattern of the prostate and the low sensitivity of ICG for the detection of lymph node metastases reported in our study highlight the difficulties related to the implementation of SLN techniques in prostate cancer. PATIENT SUMMARY There is controversy about how extensive lymph node dissection (LND) should be during prostatectomy. We investigated the lymphatic drainage of the prostate and whether sentinel node fluorescence techniques would be useful to detect node metastases. We found that the drainage pattern is complex and that the sentinel node technique is not able to replace extended pelvic LND.
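The 55% sensitivity figure above is simply the proportion of metastatic nodes that were ICG positive; as a quick check:

```python
def sensitivity(detected: int, total_positive: int) -> float:
    """Proportion of truly metastatic nodes detected by the tracer."""
    return detected / total_positive

# 16 of the 29 metastatic lymph nodes were ICG positive
s = sensitivity(16, 29)
print(f"{s:.0%}")  # prints 55%
```

Put differently, a node-level sensitivity this low means nearly half of metastatic nodes would be missed if dissection were limited to fluorescent nodes, which is why the authors conclude the technique cannot replace extended dissection.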
Abstract:
Immigrants from high tuberculosis (TB) incidence regions are a risk group for TB in low-incidence countries such as Switzerland. In a previous analysis of a nationwide collection of 520 Mycobacterium tuberculosis isolates from 2000-2008, we identified 35 clusters comprising 90 patients based on standard genotyping (24-loci MIRU-VNTR and spoligotyping). Here, we used whole genome sequencing (WGS) to revisit these transmission clusters. Genome-based transmission clusters were defined as isolate pairs separated by ≤12 single nucleotide polymorphisms (SNPs). WGS confirmed 17/35 (49%) MIRU-VNTR clusters; the other 18 clusters contained pairs separated by >12 SNPs. Most transmission clusters (3/4) of Swiss-born patients were confirmed by WGS, as opposed to 25% (4/16) of clusters involving only foreign-born patients. The overall clustering proportion using standard genotyping was 17% (90 patients, 95% confidence interval [CI]: 14-21%), but only 8% (43 patients, 95% CI: 6-11%) using WGS. The clustering proportion was 17% (67/401, 95% CI: 13-21%) using standard genotyping and 7% (26/401, 95% CI: 4-9%) using WGS among foreign-born patients, and 19% (23/119, 95% CI: 13-28%) and 14% (17/119, 95% CI: 9-22%), respectively, among Swiss-born patients. Using weighted logistic regression, we found weak evidence for an association between birth origin and transmission (aOR 2.2, 95% CI: 0.9-5.5, comparing Swiss-born patients to others). In conclusion, standard genotyping overestimated recent TB transmission in Switzerland when compared to WGS, particularly among immigrants from high TB incidence regions, where genetically closely related strains often predominate. We recommend the use of WGS to identify transmission clusters in low TB incidence settings.
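The clustering proportions above are simple ratios with binomial confidence intervals. The abstract does not state which interval was used; a standard Wilson score interval (assumed here) reproduces the reported 17% (95% CI: 14-21%) for standard genotyping:

```python
import math

def wilson_ci(k: int, n: int, z: float = 1.96):
    """Wilson score 95% confidence interval for a proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Standard genotyping: 90 of 520 patients fell into transmission clusters
lo, hi = wilson_ci(90, 520)
print(f"{90/520:.0%} (95% CI {lo:.0%}-{hi:.0%})")
```

The same calculation with 43 of 520 gives the roughly halved WGS-based estimate, which is the quantitative core of the conclusion that standard genotyping overestimates recent transmission.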
Abstract:
Background. Laparoscopic cholecystectomy is the gold standard for patients who are diagnosed with biliary colic (NIH, 1993). It has been demonstrated that individuals who wait longer between diagnosis and treatment are at increased risk of complications (Rutledge et al., 2000; Contini et al., 2004; Eldar et al., 1999). County hospitals, such as Ben Taub General Hospital (BTGH), have a particularly high population of uninsured patients and consequently long surgical wait periods due to limited resources. This study evaluates the risk factors involved in patients' progression to complications from gallstones in a county hospital environment. Methods. A case-control study using medical records was performed on all patients who underwent a cholecystectomy for gallstone disease at BTGH during 2005 (n=414). The risk factors included in the study were obesity, gender, age, race, diabetes, and the amount of time from diagnosis to surgery. Multivariate analysis and logistic regression were used to assess factors that potentially lead to the development of complications. Results. There were a total of 414 patients at BTGH who underwent a cholecystectomy for gallstone disease during 2005. The majority of patients were female (84.3%, n=349) and Hispanic (79.7%, n=330). The median wait time from diagnosis to surgery was 1.43 weeks (range: 0-184.71). The majority of patients presented with complications (72.5%, n=112). The two factors that impacted development of complications in our study population were Hispanic race (OR=1.81; CI 1.02, 3.23; p=0.04) and time from diagnosis to surgery (OR=0.98; CI 0.97, 0.99; p<0.01). Obesity, gender, age, and diabetes were not predictive of the development of complications. Conclusions. An individual's socioeconomic status potentially influences all aspects of their health and subsequent health care.
The patient population of BTGH is largely uninsured and therefore less likely to seek care at an early stage in their disease process. In order to decrease the rate of complications, there needs to be a system that increases patient access to primary care clinics. Until the problem of access to care is solved, those who are uninsured will likely suffer more severe complications and society will bear the cost.