12 results for Burns and scalds -- Patients -- Rehabilitation. Burns and scalds in children.
in DigitalCommons@The Texas Medical Center
Abstract:
Second-generation antipsychotics (SGAs) are increasingly prescribed to treat psychiatric symptoms in pediatric patients infected with HIV. We examined the relationship between prescribed SGAs and physical growth in a cohort of youth with perinatally acquired HIV-1 infection. Pediatric AIDS Clinical Trials Group (PACTG) Protocol 219C (P219C), a multicenter, longitudinal observational study of children and adolescents perinatally exposed to HIV, was conducted from September 2000 until May 2007. The analysis included P219C participants who were perinatally HIV-infected, 3-18 years old, prescribed a first SGA for at least 1 month, and had available baseline data prior to starting the first SGA. Each participant prescribed an SGA was matched (based on gender, age, Tanner stage, and baseline body mass index [BMI] z score) with 1-3 controls without antipsychotic prescriptions. The main outcomes were short-term (approximately 6 months) and long-term (approximately 2 years) changes in BMI z scores from baseline. There were 236 participants in the short-term and 198 in the long-term analysis. In linear regression models, youth with SGA prescriptions had increased BMI z scores relative to youth without antipsychotic prescriptions, for all SGAs (short-term increase = 0.192, p = 0.003; long-term increase = 0.350, p < 0.001) and for risperidone alone (short-term = 0.239, p = 0.002; long-term = 0.360, p = 0.001). Participants receiving both protease inhibitors (PIs) and SGAs showed especially large increases. These findings suggest that growth should be carefully monitored in youth with perinatally acquired HIV who are prescribed SGAs. Future research should investigate the interaction between PIs and SGAs in children and adolescents with perinatally acquired HIV infection.
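As a rough illustration of the analysis described above, the sketch below fits a linear regression of BMI z-score change on SGA exposure while adjusting for the matching covariates. It is a minimal sketch, assuming a pandas DataFrame with hypothetical column names (`bmi_z_change`, `sga`, `tanner_stage`, and so on); neither the file nor the columns come from P219C.

```python
# Minimal sketch of a matched-cohort linear regression (hypothetical data).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("p219c_matched_cohort.csv")  # hypothetical file name

# Short-term change in BMI z score regressed on SGA exposure,
# adjusting for the covariates used to form the matches.
model = smf.ols(
    "bmi_z_change ~ sga + age + C(gender) + C(tanner_stage) + baseline_bmi_z",
    data=df,
).fit()

# The coefficient on `sga` plays the role of the reported increase
# (e.g., 0.192 short-term), with its p-value alongside.
print(model.params["sga"], model.pvalues["sga"])
```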
Abstract:
Hypertension is a known risk factor for cardiovascular disease in adults. Essential hypertension in children and adolescents is increasing in prevalence in the United States, and hypertension in children may track into adulthood. This increasing prevalence is attributed to rising rates of overweight and obesity among children and adolescents. Family history and being of African-American/black descent may predispose youth to elevated blood pressure. Interventions targeted to reduce and treat hypertension in youth include non-pharmaceutical approaches such as weight reduction, increased physical activity, and dietary changes, as well as pharmaceutical treatment when indicated. The effectiveness of non-pharmaceutical interventions is well documented in adults, but there are limited studies in children and adolescents, specifically in the arena of dietary interventions. Lifestyle modifications such as dietary interventions are the mainstay of recommended treatment for children and adolescents with prehypertension or stage 1 hypertension. Given the association between overweight and hypertension, the efficacy of dietary interventions is of interest because of reduced cost, easy implementation, and the potential for multiple beneficial outcomes such as reduced weight and reduction of other metabolic or cardiovascular derangements. Barriers to dietary interventions often include socioeconomic status, ethnicity, and personal and external factors. The goal of this systematic review of the literature is to identify interventions targeted to children and adolescents that focus on recommended dietary changes related to blood pressure. The dietary interventions found for this review mostly focused on a particular nutrient or food group, with the one notable exception that focused on the DASH pattern of eating. The effects of the interventions on blood pressure varied, but overall, dietary modifications can be achieved in youth and can serve a role in producing positive outcomes on blood pressure. Increasing potassium and following a DASH diet seemed to provide the most clinically significant results. Further studies are still needed to evaluate long-term effectiveness and to contribute more supporting evidence for particular modifications in these age cohorts.
Abstract:
Multiple sclerosis (MS) is the most common demyelinating disease affecting the central nervous system. There is no cure for MS, and current therapies have limited efficacy. While the majority of individuals with MS develop significant clinical disability, a subset experiences a disease course with minimal impairment even in the presence of significant apparent tissue damage on magnetic resonance imaging (MRI). The current studies combined functional MRI and diffusion tensor imaging (DTI) to elucidate brain mechanisms associated with lack of clinical disability in patients with MS. Recent evidence has implicated cortical reorganization as a mechanism to limit the clinical manifestation of the disease. Functional MRI was used to test the hypothesis that non-disabled MS patients (Expanded Disability Status Scale ≤ 1.5) show increased recruitment of cognitive control regions (dorsolateral prefrontal and anterior cingulate cortex) while performing sensory, motor, and cognitive tasks. Compared to matched healthy controls, patients showed increased activation of cognitive control brain regions when performing non-dominant hand movements and the 2-back working memory task. Using dynamic causal modeling, we tested whether increased cognitive control recruitment is associated with alterations in connectivity in the working memory functional network. Patients exhibited network connectivity similar to that of control subjects when performing working memory tasks. We subsequently investigated the integrity of major white matter tracts to assess structural connectivity and its relation to activation and functional integration of the cognitive control system. Patients showed substantial alterations in callosal, inferior, and posterior white matter tracts and less pronounced involvement of the corticospinal tracts and superior longitudinal fasciculi (SLF). Decreased structural integrity within the right SLF in patients was associated with decreased performance, and with decreased activation and connectivity of the cognitive control system, when performing working memory tasks. These studies suggest that patients with MS without clinical disability increase cognitive control system recruitment across functional domains and rely on preserved functional and structural connectivity of the brain regions associated with this network. Moreover, the current studies show the usefulness of combining brain activation data from functional MRI and structural connectivity data from DTI to improve our understanding of brain adaptation mechanisms in neurological disease.
Abstract:
The plasma membrane xc- cystine/glutamate transporter mediates cellular uptake of cystine in exchange for intracellular glutamate and is highly expressed by pancreatic cancer cells. The xCT gene, encoding the cystine-specific xCT protein subunit of xc-, is important in regulating intracellular glutathione (GSH) levels, which are critical for cancer cell protection against oxidative stress, tumor growth, and resistance to chemotherapeutic agents including platinum. We examined 4 single nucleotide polymorphisms (SNPs) of the xCT gene in 269 advanced pancreatic cancer patients who received first-line gemcitabine with or without cisplatin or oxaliplatin. Genotyping was performed using TaqMan real-time PCR assays. A statistically significant correlation was noted between the 3' untranslated region (UTR) xCT SNP rs7674870 and overall survival (OS): median survival time (MST) was 10.9 and 13.6 months, respectively, for the TT and TC/CC genotypes (p = 0.027). Stratified analysis showed the genotype effect was significant in patients receiving gemcitabine in combination with platinum therapy (n = 145): MST was 10.5 versus 14.1 months for the TT and TC/CC genotypes, respectively (p = 0.013). The 3' UTR xCT SNP rs7674870 may correlate with OS in pancreatic cancer patients receiving gemcitabine and platinum combination therapy. Paraffin-embedded core and surgical biopsy tumor specimens from 98 patients with metastatic pancreatic adenocarcinoma were analyzed by immunohistochemistry (IHC) using an xCT-specific antibody. xCT protein IHC expression scores were analyzed in relation to overall survival in 86 patients and to genotype in 12 patients; no statistically significant association was found between the level of xCT IHC expression score and overall survival (p = 0.514). When xCT expression was analyzed in terms of treatment response, no statistically significant associations could be determined (p = 0.908). These data suggest that polymorphic variants of xCT may have predictive value and that the xc- transporter may represent an important target for therapy in pancreatic cancer.
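The genotype comparison above amounts to Kaplan-Meier estimation with a log-rank test between TT and TC/CC carriers. The following is a minimal sketch of that comparison in Python with the lifelines library; the file and column names (`os_months`, `death`, `genotype`) are hypothetical, not from the study.

```python
# Minimal sketch: median OS by rs7674870 genotype plus a log-rank test
# (hypothetical data).
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("xct_genotypes.csv")  # hypothetical file name
tt = df[df["genotype"] == "TT"]
tc_cc = df[df["genotype"] != "TT"]     # TC and CC pooled, as in the abstract

kmf = KaplanMeierFitter()
kmf.fit(tt["os_months"], event_observed=tt["death"], label="TT")
print(kmf.median_survival_time_)       # median survival time for TT carriers

result = logrank_test(tt["os_months"], tc_cc["os_months"],
                      event_observed_A=tt["death"],
                      event_observed_B=tc_cc["death"])
print(result.p_value)
```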
Abstract:
The fourth component of human complement (C4) exists in blood as two major forms, or isotypes, C4A and C4B, which differ in their biochemical and functional properties. Because C4A preferentially transacylates onto amino groups, it has been postulated that this isotype is more important in the clearance of immune complexes. Patients with systemic lupus erythematosus (SLE), an autoimmune disease, have an increased incidence of C4A null genes and presumably decreased levels of C4A. Currently accepted methods for the detection of C4, however, cannot accurately quantitate C4A and C4B. Thus, their role in disease susceptibility and activity has not been studied. A novel immunoassay, which utilized heat-aggregated IgG to activate and capture C4, was developed for accurate quantitation of total C4, C4A, and C4B by monoclonal antibody conjugates. Higher mean total C4 values were found in a healthy Black control population when compared to White controls. This appeared to be due to an increase in C4B. In SLE patients, mean total C4 levels were significantly lower than in controls regardless of disease activity. Serial patient studies showed that the ratio of C4A:C4B remained relatively constant. When the patient group was compared to controls based on C4 null gene status, the mean levels of C4A were identical while C4B was decreased in the patients. This suggests that the common HLA-B8, DR3 C4A*Q0 gene deletion found in SLE patients may also adversely affect genetic control of the C4B genes. Furthermore, low levels of C4A cannot fully account for disease development in SLE patients having C4A null genes.
Abstract:
It is estimated that 50% of all lung cancer patients continue to smoke after diagnosis. Many of these patients who are current smokers experience tremendous guilt and responsibility for their disease and feel it might be too late for them to quit smoking. In addition, many oncologists may be heard to say that it is 'too late', 'it doesn't matter', 'it is too difficult', or 'it is too stressful' for their patients to stop smoking, or they never identify the smoking status of the patient. Many oncologists feel unprepared to address smoking cessation as part of their clinical practice. In reality, physicians can have tremendous effects on motivating patients, particularly when patients are initially being diagnosed with cancer. More information is needed to convince patients to quit smoking and to encourage clinicians to assist patients with their smoking cessation. In the current study, smoking status at the time of lung cancer diagnosis was assessed to examine its impact on complications and survival, after exploring the reliability of self-reported smoking data. Logistic regression was used to determine the risks of smoking prior to lung resection. In addition, survival analysis was performed to examine the impact of smoking on survival. The reliability of how patients report their smoking status was high, but there was some discordance between current smokers and recent quitters. In addition, we found that cigarette pack-year history and duration of smoking cessation were directly related to the rate of pulmonary complications. With regard to survival, we found that current smoking at the time of lung cancer diagnosis was an independent predictor of survival in early-stage lung cancer. This evidence supports the idea that it is "never too late" for patients to quit smoking, and health care providers should incorporate smoking status regularly into their clinical practice.
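As a sketch of the pre-resection risk analysis described above, the code below fits a logistic regression of pulmonary complications on pack-year history and duration of smoking cessation. The data file and column names (`complication`, `pack_years`, `months_quit`) are assumptions for illustration, not the study's variables.

```python
# Minimal sketch: logistic regression of pulmonary complications on
# smoking history (hypothetical data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("resection_cohort.csv")  # hypothetical file name
model = smf.logit("complication ~ pack_years + months_quit", data=df).fit()

print(np.exp(model.params))      # odds ratios per unit increase
print(np.exp(model.conf_int()))  # 95% confidence intervals on the OR scale
```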
Abstract:
Critically ill and injured patients require pain relief and sedation to reduce the body's stress response and to facilitate painful diagnostic and therapeutic procedures. Presently, the level of sedation and analgesia is guided by the use of clinical scores, which can be unreliable. There is, therefore, a need for an objective measure of sedation and analgesia. The Bispectral Index (BIS) and Patient State Index (PSI) were recently introduced into clinical practice as objective measures of the depth of analgesia and sedation. Aim. To compare the different measures of sedation and analgesia (BIS and PSI) to the standard and commonly used modified Ramsay Score (MRS) and determine whether the monitors can be used interchangeably. Methods. MRS, BIS, and PSI values were obtained in 50 postoperative cardiac surgery patients requiring analgesia and sedation from June to December 2004. The MRS, BIS, and PSI values were assessed hourly for up to 6 h by a single observer. The relationship between BIS and PSI values was explored using scatter plots, and the correlation between MRS, BIS, and PSI was determined using Spearman's correlation coefficient. Intra-class correlation (ICC) was used to determine the inter-rater reliability of MRS, BIS, and PSI. The kappa statistic was used to further evaluate the agreement between BIS and PSI at light, moderate, and deep levels of sedation. Results. There was a positive correlation between BIS and PSI values (Rho = 0.731, p < 0.001). Intra-class correlation between BIS and PSI was 0.58, between MRS and BIS 0.43, and between MRS and PSI 0.27. Using kappa statistics, agreement between MRS and BIS was 0.35 (95% CI: 0.27–0.43) and between MRS and PSI was 0.21 (95% CI: 0.15–0.28). The kappa statistic for BIS and PSI was 0.45 (95% CI: 0.37–0.52). Receiver operating characteristic (ROC) curves constructed to detect undersedation indicated an area under the curve (AUC) of 0.91 (95% CI = 0.87 to 0.94) for the BIS and 0.84 (95% CI = 0.79 to 0.88) for the PSI. For detection of oversedation, the AUC was 0.89 (95% CI = 0.84 to 0.92) for the BIS and 0.80 (95% CI = 0.75 to 0.85) for the PSI. Conclusions. There is a statistically significant positive correlation between the BIS and PSI but poor correlation and poor test agreement between the MRS and BIS as well as the MRS and PSI. Both the BIS and PSI demonstrated a high level of prediction for undersedation and oversedation; however, the BIS and PSI cannot be considered interchangeable monitors of sedation.
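A minimal sketch of the agreement statistics used above: Spearman correlation between paired BIS and PSI readings, Cohen's kappa after binning both monitors into light/moderate/deep sedation, and ROC AUC for detecting undersedation against an MRS-based reference. All values and the bin cut points are invented for illustration.

```python
# Minimal sketch of the reported agreement statistics (hypothetical data).
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score, roc_auc_score

bis = np.array([72, 55, 38, 80, 61, 45])  # hypothetical paired readings
psi = np.array([68, 50, 35, 77, 58, 49])

rho, p = spearmanr(bis, psi)  # Spearman correlation between the monitors

def sedation_level(values):
    # Assumed cut points: <40 deep, 40-59 moderate, >=60 light.
    return np.digitize(values, bins=[40, 60])

kappa = cohen_kappa_score(sedation_level(bis), sedation_level(psi))

undersedated = np.array([0, 0, 0, 1, 0, 0])  # hypothetical MRS-based labels
auc = roc_auc_score(undersedated, bis)       # BIS as continuous predictor
print(rho, kappa, auc)
```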
Abstract:
Background. Clostridium difficile infection is one of the major causes of antibiotic-associated diarrhea and colitis in the United States. Currently, there is a dearth of literature on the differences in risk factors and outcomes between patients infected with the hypervirulent strain and those infected with non-hypervirulent strains. The objective of this study was to determine the relationship between C. difficile toxin type and clinical features, severity, and outcome in patients with C. difficile diarrhea. Methods. The case group included 37 patients who had infections due to the hypervirulent strain (tcdC deletion), and the control group included 55 patients with other toxin types (toxin A, B, binary toxin). A univariate analysis was performed, followed by a multivariable logistic regression analysis, to assess the differences between cases and controls. Results. In the multivariate analyses, we found that male sex was a protective factor against infection with the hypervirulent strain [OR 0.33; 95% CI 0.12-0.90]. Also, the hypervirulent group had worse clinical and economic outcomes, although the differences were small and nonsignificant. Conclusions. There is likely no predictive risk factor for acquiring infection due to the hypervirulent strain, and acquisition may be more closely linked to the infection control practices of individual hospitals or the location of patients. Hence, better infection control practices may prove helpful in decreasing the overall disease burden and thus improve patient outcomes.
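The protective odds ratio reported above can be reproduced, in form, from a 2×2 table of sex by strain. The counts below are made up for illustration; only the computation mirrors the study.

```python
# Minimal sketch: case-control odds ratio with 95% CI from a 2x2 table
# (hypothetical counts).
import numpy as np
from statsmodels.stats.contingency_tables import Table2x2

#                 hypervirulent  other toxin types
table = np.array([[12, 35],   # male
                  [25, 20]])  # female
t22 = Table2x2(table)
print(t22.oddsratio)             # point estimate for male sex
print(t22.oddsratio_confint())   # 95% confidence interval
```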
Abstract:
Objectives. Previous studies have shown a survival advantage in ovarian cancer patients with Ashkenazi-Jewish (AJ) BRCA founder mutations, compared to sporadic ovarian cancer patients. The purpose of this study was to determine if this association exists in ovarian cancer patients with non-Ashkenazi Jewish BRCA mutations. In addition, we sought to account for possible "survival bias" by minimizing any lead time that may exist between diagnosis and genetic testing. Methods. Patients with stage III/IV ovarian, fallopian tube, or primary peritoneal cancer and a non-Ashkenazi Jewish BRCA1 or BRCA2 mutation, seen for genetic testing January 1996-July 2007, were identified from genetics and institutional databases. Medical records were reviewed for clinical factors, including response to initial chemotherapy. Patients with sporadic (non-hereditary) ovarian, fallopian tube, or primary peritoneal cancer, without a family history of breast or ovarian cancer, were compared to similar cases, matched by age, stage, year of diagnosis, and vital status at the time interval to BRCA testing. When possible, 2 sporadic patients were matched to each BRCA patient. An additional group of unmatched sporadic ovarian, fallopian tube, and primary peritoneal cancer patients was included for a separate analysis. Progression-free survival (PFS) and overall survival (OS) were calculated by the Kaplan-Meier method. Multivariate Cox proportional hazards models were calculated for variables of interest. Matched pairs were treated as clusters. A stratified log-rank test was used to calculate survival data for matched pairs using paired event times. Fisher's exact test, chi-square, and univariate logistic regression were also used for analysis. Results. Forty-five advanced-stage ovarian, fallopian tube, and primary peritoneal cancer patients with non-Ashkenazi Jewish (non-AJ) BRCA mutations, 86 sporadic-matched patients, and 414 sporadic-unmatched patients were analyzed. Compared to the sporadic-matched and sporadic-unmatched ovarian cancer patients, non-AJ BRCA mutation carriers had longer PFS (17.9 and 13.8 months vs. 32.0 months; HR 1.76 [95% CI 1.13–2.75] and 2.61 [95% CI 1.70–4.00]). In relation to the sporadic-unmatched patients, non-AJ BRCA patients had greater odds of complete response to initial chemotherapy (OR 2.25 [95% CI 1.17–5.41]) and improved OS (37.6 months vs. 101.4 months; HR 2.64 [95% CI 1.49–4.67]). Conclusions. This study demonstrates a significant survival advantage in advanced-stage ovarian cancer patients with non-AJ BRCA mutations, confirming the previous studies in the Jewish population. Our efforts to account for "survival bias" by matching will continue with collaborative studies.
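Treating matched pairs as clusters in a Cox model, as described above, corresponds to fitting with robust (sandwich) standard errors grouped on a matched-set identifier. A minimal sketch with lifelines, assuming hypothetical columns (`pfs_months`, `progressed`, `brca`, `pair_id`):

```python
# Minimal sketch: Cox PH model for PFS with matched sets as clusters
# (hypothetical data).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("brca_matched.csv")  # hypothetical file name
cph = CoxPHFitter()
cph.fit(df[["pfs_months", "progressed", "brca", "pair_id"]],
        duration_col="pfs_months", event_col="progressed",
        cluster_col="pair_id")  # robust SEs over matched sets
cph.print_summary()             # hazard ratio for BRCA mutation status
```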
Abstract:
Introduction and objective. A number of prognostic factors have been reported for predicting survival in patients with renal cell carcinoma. Yet few studies have analyzed the effects of those factors at different stages of the disease process. In this study, different stages of disease progression were evaluated: from nephrectomy to metastasis, from metastasis to death, and from evaluation to death. Methods. In this retrospective follow-up study, records of 97 deceased renal cell carcinoma (RCC) patients were reviewed between September 2006 and October 2006. Patients with TNM stage IV disease before nephrectomy or with cancer diagnoses other than RCC were excluded, leaving 64 records for analysis. TNM staging, Fuhrman grade, age, tumor size, tumor volume, histology, and patient gender were analyzed in relation to time to metastasis. Time from nephrectomy to metastasis, TNM staging, Fuhrman grade, age, tumor size, tumor volume, histology, and patient gender were tested for significance in relation to time from metastasis to death. Finally, laboratory values at the time of evaluation, Eastern Cooperative Oncology Group (ECOG) performance status, the UCLA Integrated Staging System (UISS), time from nephrectomy to metastasis, TNM staging, Fuhrman grade, age, tumor size, tumor volume, histology, and patient gender were tested for significance in relation to time from evaluation to death. Linear regression and Cox proportional hazards models (univariate and multivariate) were used to test significance. The Kaplan-Meier log-rank test was used to detect significant differences between groups at various endpoints. Results. Patients with a single positive lymph node at the time of nephrectomy had a significantly shorter time to metastasis than those with negative lymph nodes (p<0.0001). Compared to other histological types, clear cell histology was associated with significantly longer metastasis-free survival (p=0.003). Clear cell histology compared to other types (p=0.0002 univariate, p=0.038 multivariate) and time to metastasis with log conversion (p=0.028) significantly affected time from metastasis to death. Metastasis-free intervals of greater than one year and greater than two years conferred a statistically significant survival benefit compared to metastasis before one and two years (p=0.004 and p=0.0318). Time from evaluation to death was affected by a greater than one year metastasis-free interval (p=0.0459), alcohol consumption (p=0.044), LDH (p=0.006), ECOG performance status (p<0.001), and hemoglobin level (p=0.0092). The UISS risk-stratified the patient population in a statistically significant manner for survival (p=0.001). No other factors were found to be significant. Conclusion. Clear cell histology is predictive of both time to metastasis and time from metastasis to death. Nodal status at the time of nephrectomy may predict risk of metastasis. The time interval to metastasis significantly predicts time from metastasis to death and time from evaluation to death. ECOG performance status and hemoglobin level predict survival outcome at evaluation. Finally, the UISS appropriately stratifies risk in our population.
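As a sketch of the multivariate portion of the analysis above, the code below fits a Cox proportional hazards model for time from metastasis to death with clear cell histology and log-transformed time to metastasis as covariates. The file and column names are hypothetical.

```python
# Minimal sketch: multivariable Cox model for time from metastasis to
# death (hypothetical data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("rcc_cohort.csv")  # hypothetical file name
df["log_time_to_met"] = np.log(df["months_to_metastasis"])  # log conversion

cph = CoxPHFitter()
cph.fit(df[["months_met_to_death", "died", "clear_cell", "log_time_to_met"]],
        duration_col="months_met_to_death", event_col="died")
cph.print_summary()  # hazard ratios and p-values for each covariate
```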
Abstract:
Purpose. A descriptive analysis of glioma patients by race was carried out in order to better elucidate potential differences between races in demographics, treatment, characteristics, prognosis, and survival. Patients and Methods. The cohort comprised 1,967 patients ≥ 18 years old diagnosed with glioma and seen between July 2000 and September 2006 at The University of Texas M.D. Anderson Cancer Center (UTMDACC). Data were collated from the UTMDACC Patient History Database (PHDB) and the UTMDACC Tumor Registry Database (TRDB). Chi-square analysis, uni-/multivariate Cox proportional hazards modeling, and survival analysis were used to analyze differences by race. Results. Demographic, treatment, and histologic differences exist between races. Though risk differences were seen between races, race was not found to be a significant predictor in multivariate regression analysis after accounting for age, surgery, chemotherapy, radiation, and tumor type, stratified by WHO tumor grade. Age was the most consistent predictor of risk for death. Overall survival by race was significantly different (p=0.0049) only in low-grade gliomas after adjustment for age, although survival differences were very slight. Conclusion. Among this cohort of glioma patients, age was the strongest predictor of survival. It is likely that survival is more influenced by age, time to treatment, tumor grade, and surgical expertise than by racial differences. However, age at diagnosis, gender ratios, histology, and history of cancer differed significantly between races, and genetic differences to this effect cannot be excluded.
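The chi-square comparisons above test whether a categorical characteristic (for example, histology) is distributed independently of race. A minimal sketch with invented contingency counts:

```python
# Minimal sketch: chi-square test of independence between race and
# histologic subtype (hypothetical counts).
import numpy as np
from scipy.stats import chi2_contingency

# Rows: racial groups; columns: histologic subtypes.
counts = np.array([[320, 150, 90],
                   [ 60,  40, 25],
                   [ 95,  55, 30]])
chi2, p, dof, expected = chi2_contingency(counts)
print(chi2, p, dof)
```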
Abstract:
Background. Cancer cachexia is a common syndrome complex in cancer, occurring in nearly 80% of patients with advanced cancer and responsible for at least 20% of all cancer deaths. Cachexia is due to increased resting energy expenditure, increased production of inflammatory mediators, and changes in lipid and protein metabolism. Non-steroidal anti-inflammatory drugs (NSAIDs), by virtue of their anti-inflammatory properties, are possibly protective against cancer-related cachexia. Since cachexia is also associated with increased hospitalizations, this outcome may also show improvement with NSAID exposure. Design. In this retrospective study, computerized records from 700 patients with non-small cell lung cancer (NSCLC) were reviewed, and 487 (69.57%) were included in the final analyses. Exclusion criteria were severe chronic obstructive pulmonary disease, significant peripheral edema, class III or IV congestive heart failure, liver failure, other reasons for weight loss, or use of research or anabolic medications. Information on medication history, body weight, and hospitalizations was collected from one year pre-diagnosis until three years post-diagnosis. Exposure to NSAIDs was defined as a history of treatment with NSAIDs for at least 50% of any given year in the observation period. We used t-tests and chi-square tests for statistical analyses. Results. Neither the proportion of patients with cachexia (p=0.27) nor the number of hospitalizations (p=0.74) differed between those with a history of NSAID use (n=92) and those without (n=395). Conclusions. In this study, NSAID exposure was not significantly associated with weight loss or hospital admissions in patients with NSCLC. Further studies may be needed to confirm these observations.
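The two comparisons above reduce to a chi-square test on the proportion with cachexia and a t-test on hospitalization counts between the exposure groups. A minimal sketch with invented data, not the study's:

```python
# Minimal sketch of the reported comparisons (hypothetical data).
import numpy as np
from scipy.stats import chi2_contingency, ttest_ind

#                 cachexia  no cachexia
table = np.array([[ 40,  52],   # NSAID-exposed
                  [190, 205]])  # unexposed
chi2, p_proportion, dof, _ = chi2_contingency(table)

hosp_exposed = np.array([1, 0, 2, 3, 1])    # hypothetical admission counts
hosp_unexposed = np.array([2, 1, 0, 4, 2])
t, p_hospital = ttest_ind(hosp_exposed, hosp_unexposed)
print(p_proportion, p_hospital)
```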