Abstract:
Mycobacterium tuberculosis infects more people worldwide each year than any other single organism. The Antigen 85 Complex, a family of fibronectin-binding proteins (Fbps) found in several species of mycobacteria and possibly involved in host interaction, is considered among the putative virulence factors of M. tuberculosis. These proteins are implicated in the production of trehalose dimycolate (TDM) and arabinogalactan-mycolate (AG-M), two prominent components of the mycobacterial cell wall and potent modulators of the immune system during infection. For these reasons, the principal members of the complex, FbpA and FbpB, were the focus of these studies. The genes encoding these proteins, fbpA and fbpB, were each disrupted by insertion of a kanamycin resistance cassette in a pathogenic strain of M. tuberculosis, H37Rv. Neither mutation affected growth in routine broth culture. Thin-layer chromatography analysis of TDM and AG-M showed no difference in content between the parent strain H37Rv and the FbpA- and FbpB-deficient mutants grown under two different culture conditions. However, metabolic radiolabeling of the strains showed that production of TDM (but not its precursor TMM) was delayed in the FbpA- and FbpB-deficient mutants compared to the parent H37Rv. During this same labeling period, the FbpA-deficient mutant LAa1 failed to produce AG-M, and in the FbpB-deficient mutant LAb1 production was decreased. In a macrophage tissue culture assay, LAa1 failed to multiply when bacteria in early log phase were used to infect monolayers, while LAb1 grew like the parent strain. The growth deficiency of LAa1, as well as its deficiencies in TDM and AG-M production, was restored by complementing LAa1 with a functional fbpA gene. These results suggest that the FbpA and FbpB proteins are involved in the synthesis of TDM (but not its precursor TMM) as well as AG-M. Over time, other members of the complex appear to compensate for defects in synthesis caused by mutation of single genes in the complex. Mutation of the fbpA gene causes a greater in vivo effect than mutation of the fbpB gene, despite very similar deficiencies in the rate of production of mycolate-containing molecules on the cell surface.
Abstract:
Hereditary nonpolyposis colorectal cancer (HNPCC) is an autosomal dominant disease caused by germline mutations in DNA mismatch repair (MMR) genes. The nucleotide excision repair (NER) pathway plays an important role in cancer development. We systematically studied interactions between NER and MMR genes to identify NER gene single nucleotide polymorphism (SNP) risk factors that modify the effect of MMR mutations on cancer risk in HNPCC. We analyzed data on polymorphisms in 10 NER genes that had been genotyped in HNPCC patients who carry MSH2 and MLH1 gene mutations. The influence of the NER gene SNPs on time to onset of colorectal cancer (CRC) was assessed using survival analysis and a semiparametric proportional hazards model. We found that the median age of onset of CRC among MMR mutation carriers with the ERCC1 variant was 3.9 years earlier than that of patients with wildtype ERCC1 (median 47.7 vs. 51.6; log-rank test p=0.035). The influence of the Rad23B A249V SNP on age of onset of HNPCC is age dependent (likelihood ratio test p=0.0056). Interestingly, using the likelihood ratio test, we also found evidence of genetic interactions between the MMR gene mutations and SNPs in the ERCC1 gene (C8092A) and the XPG/ERCC5 gene (D1104H), with p-values of 0.004 and 0.042, respectively. An assessment using tree-structured survival analysis (TSSA) showed distinct gene interactions in MLH1 mutation carriers and MSH2 mutation carriers. ERCC1 SNP genotypes greatly modified the age of onset of HNPCC in MSH2 mutation carriers, while no effect was detected in MLH1 mutation carriers. Given that the NER genes in this study play different roles in the NER pathway, they may have distinct influences on the development of HNPCC. The findings of this study are important for elucidating the molecular mechanism of colon cancer development and for understanding why some MSH2 and MLH1 mutation carriers develop CRC early while others never develop CRC. Overall, the findings also have important implications for the development of early detection and prevention strategies, as well as for understanding the mechanism of colorectal carcinogenesis in HNPCC.
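As a sketch of the kind of interaction analysis described above, a nested-model likelihood ratio test can be run with off-the-shelf survival tooling. This is illustrative only, not the dissertation's code; the column names (time, event, mmr_mut, ercc1_snp) are hypothetical:

    # Hypothetical sketch: likelihood ratio test for an MMR x ERCC1
    # interaction term in a Cox proportional hazards model.
    import pandas as pd
    from lifelines import CoxPHFitter
    from scipy.stats import chi2

    def interaction_lrt(df: pd.DataFrame) -> float:
        df = df.copy()
        df["interaction"] = df["mmr_mut"] * df["ercc1_snp"]
        cols = ["time", "event", "mmr_mut", "ercc1_snp"]
        full = CoxPHFitter().fit(df[cols + ["interaction"]],
                                 duration_col="time", event_col="event")
        reduced = CoxPHFitter().fit(df[cols],
                                    duration_col="time", event_col="event")
        # Twice the gain in maximized partial log-likelihood, referred to a
        # chi-square with 1 df for the single extra parameter.
        stat = 2 * (full.log_likelihood_ - reduced.log_likelihood_)
        return chi2.sf(stat, df=1)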
Abstract:
Purpose. To examine the association between living in proximity to Toxics Release Inventory (TRI) facilities and the incidence of childhood cancer in the State of Texas.

Design. This is a secondary data analysis utilizing the publicly available TRI, maintained by the U.S. Environmental Protection Agency, which lists the facilities that release any of the 650 TRI chemicals. Total childhood cancer cases and childhood cancer rates (ages 0-14 years) by county for the years 1995-2003 were obtained from the Texas Cancer Registry, available on the Texas Department of State Health Services website. Setting. This study was limited to the child population of the State of Texas.

Method. Analysis was done using Stata version 9 and SPSS version 15.0. SaTScan was used for geographical spatial clustering of childhood cancer cases based on county centroids, using the Poisson clustering algorithm, which adjusts for population density. Pictorial maps were created using MapInfo Professional version 8.0.

Results. One hundred twenty-five counties had no TRI facilities in their region, while 129 counties had at least one TRI facility. An increasing trend in number of facilities and total disposal was observed across cancer rate quartiles, except for the highest category. Linear regression using log-transformed number of facilities and total disposal to predict cancer rates was computed; however, neither variable was found to be a significant predictor. Seven significant geographical spatial clusters of counties with high childhood cancer rates (p<0.05) were identified. Binomial logistic regression, categorizing the cancer rate into two groups (<=150 and >150), indicated an odds ratio of 1.58 (CI 1.127, 2.222) for the natural log of the number of facilities.

Conclusion. We have used a unique methodology, combining GIS and spatial clustering techniques with existing statistical approaches, to examine the association between living in proximity to TRI facilities and the incidence of childhood cancer in the State of Texas. Although a concrete association was not indicated, further studies examining specific TRI chemicals are required. Use of this information can enable researchers and the public to identify potential concerns, gain a better understanding of potential risks, and work with industry and government to reduce toxic chemical use, disposal, or other releases and the risks associated with them. TRI data, in conjunction with other information, can be used as a starting point in evaluating exposures and risks.
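For reference, the Poisson-based spatial scan implemented in SaTScan is Kulldorff's scan statistic. As general background (a standard formulation, not taken from this abstract): for a candidate zone Z of counties built from county centroids, with observed case count c_Z, population-expected count \mu_Z, and statewide total C, the likelihood ratio is

\[
\Lambda(Z) =
\begin{cases}
\left(\dfrac{c_Z}{\mu_Z}\right)^{c_Z}\left(\dfrac{C-c_Z}{C-\mu_Z}\right)^{C-c_Z}, & c_Z > \mu_Z,\\[1ex]
1, & \text{otherwise},
\end{cases}
\]

and the zone maximizing \Lambda(Z) is reported as the most likely cluster, with significance assessed by Monte Carlo replication.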
Abstract:
Monte Carlo simulation was conducted to investigate parameter estimation and hypothesis testing in some well-known adaptive randomization procedures. The four urn models studied are the Randomized Play-the-Winner (RPW), Randomized Pólya Urn (RPU), Birth and Death Urn with Immigration (BDUI), and Drop-the-Loser Urn (DL). Two sequential estimation methods, sequential maximum likelihood estimation (SMLE) and the doubly adaptive biased coin design (DABC), are simulated at three optimal allocation targets that minimize the expected number of failures under the assumption of constant variance of the simple difference (RSIHR), the relative risk (ORR), and the odds ratio (OOR), respectively. The log likelihood ratio test and three Wald-type tests (simple difference, log of relative risk, log of odds ratio) are compared across the adaptive procedures.

Simulation results indicate that although RPW is slightly better at assigning more patients to the superior treatment, the DL method is considerably less variable and its test statistics have better normality. Compared with SMLE, DABC has a slightly higher overall response rate with lower variance, but larger bias and variance in parameter estimation. Additionally, the test statistics in SMLE have better normality and a lower type I error rate, and the power of hypothesis testing is more comparable with equal randomization. Usually, RSIHR has the highest power among the three optimal allocation ratios. However, the ORR allocation has better power and a lower type I error rate when the log of relative risk is the test statistic. The expected number of failures in ORR is smaller than in RSIHR. It is also shown that the simple difference of response rates has the worst normality among all four test statistics, and power is always inflated when the simple difference is used. On the other hand, the normality of the log likelihood ratio test statistic is robust against the change of adaptive randomization procedure.
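To make the first of these urn schemes concrete, here is a minimal simulation of the standard Randomized Play-the-Winner rule RPW(alpha, beta). This is an illustrative sketch, not code from the dissertation:

    import random

    def simulate_rpw(p_a, p_b, n_patients, alpha=1, beta=1, seed=None):
        """RPW(alpha, beta): start with `alpha` balls per arm; a success adds
        `beta` balls of the same arm, a failure adds `beta` balls of the
        opposite arm."""
        rng = random.Random(seed)
        urn = {"A": alpha, "B": alpha}
        assigned = {"A": 0, "B": 0}
        successes = {"A": 0, "B": 0}
        for _ in range(n_patients):
            # Assign by drawing a ball in proportion to urn composition.
            arm = "A" if rng.random() < urn["A"] / (urn["A"] + urn["B"]) else "B"
            assigned[arm] += 1
            # Bernoulli response with the arm's true success probability.
            if rng.random() < (p_a if arm == "A" else p_b):
                successes[arm] += 1
                urn[arm] += beta      # success rewards the same arm
            else:
                urn["B" if arm == "A" else "A"] += beta  # failure favors the other arm
        return assigned, successes

    # Example: true response rates 0.7 vs. 0.4; allocation should skew toward A.
    print(simulate_rpw(0.7, 0.4, n_patients=200, seed=1))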
Abstract:
Objectives. Previous studies have shown a survival advantage in ovarian cancer patients with Ashkenazi Jewish (AJ) BRCA founder mutations compared to sporadic ovarian cancer patients. The purpose of this study was to determine whether this association exists in ovarian cancer patients with non-Ashkenazi Jewish BRCA mutations. In addition, we sought to account for possible "survival bias" by minimizing any lead time that may exist between diagnosis and genetic testing.

Methods. Patients with stage III/IV ovarian, fallopian tube, or primary peritoneal cancer and a non-Ashkenazi Jewish BRCA1 or BRCA2 mutation, seen for genetic testing from January 1996 to July 2007, were identified from genetics and institutional databases. Medical records were reviewed for clinical factors, including response to initial chemotherapy. Patients with sporadic (non-hereditary) ovarian, fallopian tube, or primary peritoneal cancer, without a family history of breast or ovarian cancer, were compared to similar cases, matched by age, stage, year of diagnosis, and vital status at time interval to BRCA testing. When possible, two sporadic patients were matched to each BRCA patient. An additional group of unmatched sporadic ovarian, fallopian tube, and primary peritoneal cancer patients was included for a separate analysis. Progression-free survival (PFS) and overall survival (OS) were calculated by the Kaplan-Meier method. Multivariate Cox proportional hazards models were calculated for variables of interest. Matched pairs were treated as clusters. A stratified log-rank test was used to calculate survival data for matched pairs using paired event times. Fisher's exact test, chi-square, and univariate logistic regression were also used for analysis.

Results. Forty-five advanced-stage ovarian, fallopian tube, and primary peritoneal cancer patients with non-Ashkenazi Jewish (non-AJ) BRCA mutations, 86 sporadic-matched patients, and 414 sporadic-unmatched patients were analyzed. Compared to the sporadic-matched and sporadic-unmatched ovarian cancer patients, non-AJ BRCA mutation carriers had longer PFS (17.9 and 13.8 months vs. 32.0 months; HR 1.76 [95% CI 1.13-2.75] and 2.61 [95% CI 1.70-4.00]). In relation to the sporadic-unmatched patients, non-AJ BRCA patients had greater odds of complete response to initial chemotherapy (OR 2.25 [95% CI 1.17-5.41]) and improved OS (37.6 months vs. 101.4 months; HR 2.64 [95% CI 1.49-4.67]).

Conclusions. This study demonstrates a significant survival advantage in advanced-stage ovarian cancer patients with non-AJ BRCA mutations, confirming the previous studies in the Jewish population. Our efforts to account for "survival bias" by matching will continue with collaborative studies.
Abstract:
Introduction and objective. A number of prognostic factors have been reported for predicting survival in patients with renal cell carcinoma, yet few studies have analyzed the effects of those factors at different stages of the disease process. In this study, different stages of disease progression were evaluated: from nephrectomy to metastasis, from metastasis to death, and from evaluation to death.

Methods. In this retrospective follow-up study, records of 97 deceased renal cell carcinoma (RCC) patients were reviewed between September 2006 and October 2006. Patients with TNM stage IV disease before nephrectomy or with cancer diagnoses other than RCC were excluded, leaving 64 records for analysis. Patient TNM staging, Fuhrman grade, age, tumor size, tumor volume, histology, and patient gender were analyzed in relation to time to metastasis. Time from nephrectomy to metastasis, TNM staging, Fuhrman grade, age, tumor size, tumor volume, histology, and patient gender were tested for significance in relation to time from metastasis to death. Finally, laboratory values at time of evaluation, Eastern Cooperative Oncology Group (ECOG) performance status, UCLA Integrated Staging System (UISS) risk, time from nephrectomy to metastasis, TNM staging, Fuhrman grade, age, tumor size, tumor volume, histology, and patient gender were tested for significance in relation to time from evaluation to death. Linear regression and Cox proportional hazards models (univariate and multivariate) were used to test significance. The Kaplan-Meier log-rank test was used to detect any significant differences between groups at various endpoints.

Results. Patients with a single positive lymph node at the time of nephrectomy had a significantly shorter time to metastasis than those with negative lymph nodes (p<0.0001). Compared to other histological types, clear cell histology was associated with significantly longer metastasis-free survival (p=0.003). Clear cell histology compared to other types (p=0.0002 univariate, p=0.038 multivariate) and log-transformed time to metastasis (p=0.028) significantly affected time from metastasis to death. Metastasis-free intervals of greater than one year and greater than two years, compared to metastasis before one and two years respectively, conferred a statistically significant survival benefit (p=0.004 and p=0.0318). Time from evaluation to death was affected by a greater than one-year metastasis-free interval (p=0.0459), alcohol consumption (p=0.044), LDH (p=0.006), ECOG performance status (p<0.001), and hemoglobin level (p=0.0092). The UISS stratified the patient population for survival in a statistically significant manner (p=0.001). No other factors were found to be significant.

Conclusion. Clear cell histology is predictive of both time to metastasis and time from metastasis to death. Nodal status at time of nephrectomy may predict risk of metastasis. The time interval to metastasis significantly predicts time from metastasis to death and time from evaluation to death. ECOG performance status and hemoglobin level predict survival outcome at evaluation. Finally, the UISS appropriately stratifies risk in our population.
Abstract:
Context. Despite tremendous strides in HIV treatment over the past decade, resistance remains a major problem. A growing number of patients develop resistance and require new therapies to suppress viral replication.

Objective. To assess the safety of multiple administrations of the anti-CD4 receptor monoclonal antibody ibalizumab, given as intravenous (IV) infusions in three dosage regimens, in subjects infected with human immunodeficiency virus type 1 (HIV-1).

Design. Phase 1, multi-center, open-label, randomized clinical trial comparing the safety, pharmacokinetics, and antiviral activity of three dosages of ibalizumab.

Setting. Six clinical trial sites in the United States.

Participants. A total of twenty-two HIV-positive patients on no antiretroviral therapy or a stable failing regimen.

Intervention. Patients were randomized to one of two treatment groups in Arms A and B, followed by non-randomized enrollment in Arm C. Patients randomized to Arm A received 10 mg/kg of ibalizumab every 7 days for a total of 10 doses. Patients randomized to Arm B received a total of six doses of ibalizumab: a single loading dose of 10 mg/kg on Day 1 followed by five maintenance doses of 6 mg/kg every 14 days, starting at Week 1. Patients assigned to Arm C received 25 mg/kg of ibalizumab every 14 days for a total of 5 doses. All patients were followed for safety for an additional 7 to 8 weeks.

Main Outcome Measures. Clinical and laboratory assessments of safety and tolerability of multiple administrations of ibalizumab in HIV-infected patients. Secondary measures of efficacy included HIV-1 RNA (viral load) measurements.

Results. Twenty-one patients were treatment-experienced and 1 was naïve to HIV therapy. Six patients were failing despite therapy and 15 were on no current HIV treatment. Mean baseline viral load (4.78 log10; range 3.7-5.9) and CD4+ cell counts (332/μL; range 89-494) were similar across cohorts. Mean peak decreases in viral load from baseline of 0.99 log10, 1.11 log10, and 0.96 log10 occurred by Week 2 in Cohorts A, B, and C, respectively. Viral loads decreased by >1.0 log10 in 64% of patients; in 4 patients, viral loads were suppressed to <400 copies/mL. Viral loads returned towards baseline by Week 9, with reduced susceptibility to ibalizumab. CD4+ cell counts rose transiently and returned toward baseline. Maximum median elevations above baseline in CD4+ cell counts for Cohorts A, B, and C were +257, +198, and +103 cells/μL, respectively, and occurred within 3 weeks in 16 of 22 subjects. The half-life of ibalizumab was 3-3.5 days, and elimination was characteristic of capacity-limited kinetics. Administration of ibalizumab was well tolerated. Four serious adverse events were reported during the study; none were related to study drug. Headache, nausea, and cough were the most frequently reported treatment-emergent adverse events, and there were no laboratory abnormalities related to study drug.

Conclusions. Ibalizumab administered either weekly or biweekly was safe and well tolerated, and demonstrated antiviral activity. Further studies of ibalizumab in combination with standard antiretroviral treatments are warranted.
Abstract:
Dialysis patients are at high risk for hepatitis B infection, which is a serious but preventable disease. Prevention strategies include administration of the hepatitis B vaccine. Dialysis patients have been noted to have a poor immune response to the vaccine and to lose immunity more rapidly. The long-term immunogenicity of the hepatitis B vaccine has not been well defined in pediatric dialysis patients, especially when administered during infancy as a routine childhood immunization.

Purpose. The aim of this study was to determine the median duration of hepatitis B immunity and to study the effect of vaccination timing and other cofactors on the duration of hepatitis B immunity in pediatric dialysis patients.

Methods. Duration of hepatitis B immunity was determined by Kaplan-Meier survival analysis. Comparison of stratified survival curves was performed using log-rank analysis. Multivariate analysis by Cox regression was used to estimate hazard ratios for the effect of timing of vaccine administration and other covariates on the duration of hepatitis B immunity.

Results. 193 patients (163 incident patients) had complete data available for analysis. Mean age was 11.2±5.8 years and mean ESRD duration was 59.3±97.8 months. Kaplan-Meier analysis showed that the total median overall duration of immunity (since the time of the primary vaccine series) was 112.7 months (95% CI: 96.6, 124.4), whereas the median overall duration of immunity for incident patients was 106.3 months (95% CI: 93.93, 124.44). Incident patients had a median duration of immunity on dialysis of 37.1 months (95% CI: 24.16, 72.26). Multivariate adjusted analysis showed a significant difference between patients based on the timing of hepatitis B vaccine administration (p<0.001). Patients immunized after the start of dialysis had a hazard ratio of 6.13 (2.87, 13.08) for loss of hepatitis B immunity compared to patients immunized as infants (p<0.001).

Conclusion. This study confirms that patients immunized after dialysis onset have a shorter overall duration of hepatitis B immunity, as measured by hepatitis B antibody titers, and that after the start of dialysis, protective antibody titer levels in pediatric dialysis patients wane rapidly compared to those of healthy children.
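The survival workflow described here (Kaplan-Meier estimates, a log-rank comparison across vaccination-timing strata, and Cox hazard ratios) can be sketched with standard tooling. This is illustrative only, not the study's code; the file and column names (months_immune, lost_immunity, vaccinated_as_infant) are hypothetical:

    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter
    from lifelines.statistics import logrank_test

    df = pd.read_csv("dialysis_cohort.csv")  # hypothetical data extract

    # Kaplan-Meier estimate of the duration of protective immunity.
    kmf = KaplanMeierFitter()
    kmf.fit(df["months_immune"], event_observed=df["lost_immunity"])
    print("Median duration of immunity:", kmf.median_survival_time_)

    # Log-rank comparison of vaccination-timing strata.
    infant = df[df["vaccinated_as_infant"] == 1]
    later = df[df["vaccinated_as_infant"] == 0]
    res = logrank_test(infant["months_immune"], later["months_immune"],
                       event_observed_A=infant["lost_immunity"],
                       event_observed_B=later["lost_immunity"])
    print("Log-rank p-value:", res.p_value)

    # Cox regression for the hazard of losing immunity.
    cph = CoxPHFitter()
    cph.fit(df[["months_immune", "lost_immunity", "vaccinated_as_infant"]],
            duration_col="months_immune", event_col="lost_immunity")
    cph.print_summary()  # hazard ratios appear as exp(coef)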
Abstract:
A multivariate frailty hazard model is developed for joint modeling of three correlated time-to-event outcomes: (1) local recurrence, (2) distant recurrence, and (3) overall survival. The term frailty is introduced to model population heterogeneity. The dependence is modeled by conditioning on a shared frailty that is included in the three hazard functions. Independent variables can be included in the model as covariates. Markov chain Monte Carlo methods are used to estimate the posterior distributions of the model parameters. The algorithm used in the present application is the hybrid Metropolis-Hastings algorithm, which simultaneously updates all parameters using evaluations of the gradient of the log posterior density. The performance of this approach is examined in simulation studies using exponential and Weibull distributions. We apply the proposed methods to a study of patients with soft tissue sarcoma, which motivated this research. Our results indicate that patients who received chemotherapy had better overall survival, with a hazard ratio of 0.242 (95% CI: 0.094-0.564), and a lower risk of distant recurrence, with a hazard ratio of 0.636 (95% CI: 0.487-0.860), but not a significantly lower risk of local recurrence, with a hazard ratio of 0.799 (95% CI: 0.575-1.054). The advantages and limitations of the proposed models and future research directions are discussed.
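A common way to write such a shared-frailty specification (the abstract does not give its exact form, and the gamma choice below is only the conventional one) is, for subject i and outcome k in {1, 2, 3}:

\[
h_{ik}(t \mid \omega_i) = \omega_i \, h_{0k}(t) \, \exp\!\left(\mathbf{x}_i^{\top} \boldsymbol{\beta}_k\right),
\qquad \omega_i \sim \mathrm{Gamma}\!\left(\tfrac{1}{\theta}, \tfrac{1}{\theta}\right),
\]

so that E[\omega_i] = 1 and Var(\omega_i) = \theta measures population heterogeneity. Conditional on \omega_i, the three event times are independent; integrating over the shared \omega_i is what induces their correlation.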
Abstract:
Background. Polyomavirus reactivation is common in solid-organ transplant recipients, who are given immunosuppressive medications as standard of care. Previous studies have shown that polyomavirus infection can lead to allograft failure in as many as 45% of affected patients.

Hypothesis. Ubiquitous polyomaviruses, when reactivated by post-transplant immunosuppressive medications, may lead to impaired renal function and possibly lower survival prospects.

Study Overview. Secondary analysis of data was conducted on a prospective longitudinal study of subjects who were at least 18 years of age and were recipients of liver and/or kidney transplants at Mayo Clinic Scottsdale, Arizona.

Methods. DNA extractions of blinded urine and blood specimens of transplant patients, collected at Mayo Clinic during routine transplant patient visits, were performed at Baylor College of Medicine using Qiagen kits. Virologic assays included testing DNA samples for specific polyomavirus sequences using QPCR technology. De-identified demographic and clinical patient data were merged with laboratory data, and statistical analysis was performed using Stata 10.

Results. 76 patients enrolled in the study were followed for 3.9 years post-transplantation. The prevalence of BK virus and JC virus urinary excretion was 30% and 28%, respectively. A significant association was observed between JC virus excretion and kidney as the transplanted organ (P = 0.039, Pearson chi-square test). The median urinary JCV viral loads were two logs higher than those of BKV. Patients who excreted both BKV and JCV appeared to have the worst renal function, with a mean creatinine clearance of 71.6 milliliters per minute. A survival disadvantage was observed for dual shedders of BKV and JCV (log-rank test, p = 0.09); 2 of 5 dual shedders died during the study period. Liver transplant and male sex were determined to be potential risk factors for JC virus activation in renal and liver transplant recipients. All patients tested negative for SV40, and no association was observed between polyomavirus excretion and type of immunosuppressive medication (tacrolimus, mycophenolate mofetil, cyclosporine, and sirolimus).

Conclusions. Polyomavirus reactivation was common after solid-organ transplantation and may be associated with impaired renal function. Male sex and JCV infection may be potential risk factors for viral reactivation; these findings should be confirmed in larger studies.
Abstract:
Bladder cancer is the fourth most common cancer in men in the United States. There is compelling evidence that genetic variations contribute to the risk and outcomes of bladder cancer. The PI3K-AKT-mTOR pathway is a major cellular pathway involved in proliferation, invasion, inflammation, tumorigenesis, and drug response. Somatic aberrations of the PI3K-AKT-mTOR pathway are frequent events in several cancers, including bladder cancer; however, no studies have investigated the role of germline genetic variations in this pathway in bladder cancer. In this project, we used a large case-control study to evaluate the associations of a comprehensive catalogue of SNPs in this pathway with bladder cancer risk and outcomes. Three SNPs in RAPTOR were significantly associated with susceptibility: rs11653499 (OR: 1.79, 95% CI: 1.24-2.60), rs7211818 (OR: 2.13, 95% CI: 1.35-3.36), and rs7212142 (OR: 1.57, 95% CI: 1.19-2.07). Two haplotypes constructed from these 3 SNPs were also associated with bladder cancer risk. In combined analysis, a significant trend was observed for increased risk with an increasing number of unfavorable genotypes (P for trend<0.001). Classification and regression tree analysis identified potential gene-environment interactions between RPS6KA5 rs11653499 and smoking. In superficial bladder cancer, we found that PTEN rs1234219 and rs11202600, TSC1 rs7040593, RAPTOR rs901065, and PIK3R1 rs251404 were significantly associated with recurrence in patients receiving BCG. In muscle-invasive and metastatic bladder cancer, AKT2 rs3730050, PIK3R1 rs10515074, and RAPTOR rs9906827 were associated with survival. Survival tree analysis revealed potential gene-gene interactions: patients carrying the unfavorable genotypes of PTEN rs1234219 and TSC1 rs704059 exhibited a 5.24-fold (95% CI: 2.44-11.24) increased risk of recurrence. In combined analysis, with an increasing number of unfavorable genotypes there was a significant trend of higher risk of recurrence and death (P for trend<0.001) in Cox proportional hazards regression, and shorter event-free (recurrence and death) survival in Kaplan-Meier estimates (log-rank P<0.001). This study strongly suggests that genetic variations in the PI3K-AKT-mTOR pathway play an important role in bladder cancer development. The identified SNPs, if validated in further studies, may become valuable biomarkers for assessing an individual's cancer risk, predicting prognosis and treatment response, and helping physicians make individualized treatment decisions.
Abstract:
Research studies on the association between exposures to air contaminants and disease frequently use worn dosimeters to measure the concentration of the contaminant of interest. But investigation of exposure determinants requires additional knowledge beyond concentration, i.e., knowledge about personal activity, such as whether the exposure occurred in a building or outdoors. Current studies frequently depend upon manual activity logging to record location. This study's purpose was to evaluate the use of a worn data logger recording three environmental parameters (temperature, humidity, and light intensity) as well as time of day to determine indoor or outdoor location, with the ultimate aim of eliminating the need to manually log location, or at least of providing a method to verify such logs. For this study, data collection was limited to a single geographical area (the Houston, Texas metropolitan area) during a single season (winter), using a HOBO H8 four-channel data logger. Data for development of a Location Model were collected by using the logger for deliberate sampling of programmed activities in outdoor, building, and vehicle locations at various times of day. The Model was developed by analyzing the distributions of the environmental parameters by location and time to establish a prioritized set of cut points for assessing location. The final Model consisted of four "processors" that varied these priorities and cut points. Data to evaluate the Model were collected by wearing the logger during "typical days" while maintaining a location log. The Model was tested by feeding the typical-day data into each processor and generating assessed locations for each record. These assessed locations were then compared with the true locations recorded in the manual log to determine accurate versus erroneous assessments. The utility of each processor was evaluated by calculating overall error rates across all times of day and individual error rates by time of day. Unfortunately, the error rates were large, such that there would be no benefit in using the Model. Another analysis, in which assessed locations were classified as either indoor (including both building and vehicle) or outdoor, yielded slightly lower error rates that still precluded any benefit from the Model's use.
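To illustrate what a prioritized cut-point "processor" of this kind might look like in code, here is a minimal sketch; every threshold below is invented for illustration and is not a value from the study:

    # Hypothetical sketch of one cut-point "processor": classify a single
    # logger record as outdoor, vehicle, or building. All cut points invented.
    from datetime import time

    def assess_location(temp_f: float, rh_pct: float, lux: float, t: time) -> str:
        daytime = time(7) <= t <= time(18)
        # Priority 1: very high light in winter implies direct sun, i.e. outdoors.
        if daytime and lux > 10_000:
            return "outdoor"
        # Priority 2: winter outdoor air is cold; heated indoor air is warm.
        if temp_f < 55:
            return "outdoor"
        # Priority 3: heated indoor air is dry, so high humidity at mild
        # temperature also points outdoors.
        if rh_pct > 70:
            return "outdoor"
        # Priority 4 (for the sake of the sketch): at night, dim light
        # suggests a vehicle cabin; bright room lighting suggests a building.
        if not daytime and lux < 100:
            return "vehicle"
        return "building"

    print(assess_location(temp_f=72, rh_pct=35, lux=60, t=time(21, 30)))  # vehicle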
Abstract:
Unlike infections occurring during periods of chemotherapy-induced neutropenia, postoperative infections in patients with solid malignancies remain largely understudied. The purpose of this population-based study was to evaluate the clinical and economic burden, as well as the relationship between hospital surgical volume and outcomes, associated with serious postoperative infection (SPI), i.e., bacteremia/sepsis, pneumonia, and wound infection, following resection of common solid tumors.

From the Texas Discharge Data Research File, we identified all Texas residents who underwent resection of cancer of the lung, esophagus, stomach, pancreas, colon, or rectum between 2002 and 2006. From their billing records, we identified ICD-9 codes indicating SPI, as well as subsequent SPI-related readmissions occurring within 30 days of surgery. Random-effects logistic regression was used to estimate the impact of SPI on mortality and the association between surgical volume and SPI, adjusting for case mix, hospital characteristics, and clustering of multiple surgical admissions within the same patient and of patients within the same hospital. Excess bed-days and costs were calculated by subtracting values for patients without infection from those for patients with infection, computed using a multilevel mixed-effects generalized linear model fitting a gamma distribution to the data with a log link.

Serious postoperative infection occurred following 9.4% of the 37,582 eligible tumor resections and was independently associated with an 11-fold increase in the odds of in-hospital mortality (95% confidence interval [CI], 6.7-18.5; P < 0.001). Patients with SPI required 6.3 additional hospital days (95% CI, 6.1-6.5) at an incremental cost of $16,396 (95% CI, $15,927-$16,875). There was a significant trend toward lower overall rates of SPI with higher surgical volume (P=0.037).

Given the substantial morbidity, mortality, and excess costs associated with SPI following solid tumor resections, and given that, under current reimbursement practices, most of this heavy burden is borne by acute care providers, it is imperative for hospitals to identify more effective prophylactic measures so that these potentially preventable infections and their associated expenditures can be averted. Additional volume-outcomes research is also needed to identify infection prevention processes that can be transferred from higher- to lower-volume providers.
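The excess-cost estimation described above can be sketched with a gamma-family GLM and log link. This is illustrative only, not the study's code: the file and column names (cost, spi, age, comorbidity) are hypothetical, and the actual analysis used a multilevel mixed-effects model, which this plain single-level GLM does not capture:

    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.read_csv("tx_discharges.csv")  # hypothetical data extract

    # Gamma family with log link suits right-skewed, strictly positive costs.
    model = smf.glm(
        "cost ~ spi + age + comorbidity",
        data=df,
        family=sm.families.Gamma(link=sm.families.links.Log()),
    ).fit()

    # With a log link, predictions back-transform to the dollar scale; the
    # incremental cost is the mean difference between predictions with SPI
    # toggled on and off for the same patients.
    with_spi = model.predict(df.assign(spi=1))
    without_spi = model.predict(df.assign(spi=0))
    print("Incremental cost per resection: $",
          round((with_spi - without_spi).mean(), 2))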
Abstract:
Previous research has shown dietary intake self-monitoring and culturally tailored weight loss interventions to be effective tools for weight loss. Technology can be used to tailor weight loss interventions to better suit adolescents, yet there is a lack of research to date on the use of personal digital assistants (PDAs) to self-monitor dietary intake among adolescents. The objective of this study was to determine the difference in dietary intake self-monitoring frequency between using a PDA and using paper logs as a diet diary in obese adolescent females, and to describe differences in diet adherence as well as changes in body size and self-efficacy to resist eating. We hypothesized that dietary intake self-monitoring frequency would be greater during PDA use than during paper log use. This study was a randomized crossover trial. Participants recorded their diet for 4 weeks: 2 weeks on a PDA and 2 weeks on paper logs. Thirty-four obese females ages 12-20 were recruited for participation; thirty were included in analyses. Participants recorded more entries per day while using the paper logs (4.10 ± 0.63 entries/day) than while using the PDA (3.01 ± 0.75 entries/day) (p<0.001). Significantly more meals and snacks were skipped during paper log use (0.81 ± 0.65 per day) than during PDA use (0.23 ± 0.22 per day) (p=0.011). Changes in body size (BMI, weight, and waist circumference) and self-efficacy to resist eating did not differ significantly between PDA and paper log use. When compared to paper logs, participants felt the PDA was more convenient (p=0.020), looked forward to using the PDA more (p=0.008), and would rather continue using the PDA than the paper logs (p=0.020). The findings of this study indicate that use of a PDA as a dietary intake self-monitoring tool among adolescents would not result in increased dietary intake self-monitoring to aid in weight loss. Use of paper logs would return more data to clinicians, though use of PDAs would likely make adolescents more enthusiastic about adhering to recommendations to record their diet. Future research should look at updated communication devices, such as cell phones and PDAs with additional features, and the role they can play in increasing dietary intake self-monitoring among adolescents.
Abstract:
This cross-sectional study was performed to quantify the prevalence of symptomatology in residents of mobile homes as a function of indoor formaldehyde concentration. Formaldehyde concentrations were monitored for a seven-hour period with an automated wet-chemical colorimetric analyzer. The health status of family members was ascertained by administration of questionnaires and physical exams. This is the first investigation to perform clinical assessments on residents undergoing concurrent exposure assessment in the home.

Only 22.8% of households eligible for participation chose to cooperate. Monitoring data and health evaluations were obtained from 155 households in four Texas counties. A total of 428 residents (86.1%) were available for examination during the sampling hours. The study population included 45 infants, 126 children, and 257 adults.

Formaldehyde concentration was not found to be significantly associated with increased risks of symptoms and signs of ocular irritation, dermal anomalies, or malaise. Three associations were identified that warrant further investigation. The relative odds associated with a doubling of formaldehyde concentration were significantly elevated for parenchymal rales in adults and children; however, this risk was modified by the log of the respirable suspended particulate concentration. Due to the presence of modification by a continuous variable, prevalence odds ratios (POR) and 95% confidence intervals (95% CI) for these associations are presented in tables. A doubling of formaldehyde concentration was also associated with an increased risk of perceived tightness in the chest in adults; prevalence odds ratios are presented in a table due to effect modification by the average number of hours spent indoors on weekdays. Furthermore, a doubling of formaldehyde concentration was associated with an increased risk of drowsiness in children (POR = 2.60; 95% CI 1.04-6.51) and adults (POR = 1.94; 95% CI 1.20-3.14).