Abstract:
Preimplantation genetic diagnosis (PGD) following in vitro fertilization (IVF) offers couples at risk for transmitting genetic disorders the opportunity to identify affected embryos prior to replacement. In particular, embryo gender determination permits screening for X-linked diseases of unknown etiology. Analysis of embryos can be performed by polymerase chain reaction (PCR) amplification of material obtained by micromanipulation. This approach provides an alternative to the termination of an established pregnancy following chorionic villus sampling or amniocentesis.

Lately, the focus of preimplantation diagnosis and intervention has been shifting toward an attempt to correct cytoplasmic deficiencies. Accordingly, the aim of this investigation was to develop methods that permit the examination of single cells, or components thereof, for clinical evaluation. To lay the groundwork for precise therapeutic intervention for age-related aneuploidy, transcripts encoding proteins believed to be involved in the proper segregation of chromosomes during human oocyte maturation were examined and quantified. Following fluorescent rapid cycle RT-PCR analysis, it was determined that the concentration of cell cycle checkpoint gene transcripts decreases significantly as maternal age increases. Given the well-established link between increasing maternal age and the incidence of aneuploidy, these results suggest that the degradation of these messages in aging oocytes may be involved in inappropriate chromosome separation during meiosis.

To investigate the cause of the embryonic rescue observed following clinical cytoplasmic transfer procedures, and with the objective of developing a diagnostic tool, mtDNA concentrations in polar bodies and subcellular components were evaluated. First, the typical concentration of mtDNA in human and mouse oocytes was determined by fluorescent rapid cycle PCR. Some disparity was noted between the copy numbers of individual cytoplasmic samples, which may limit the use of the current methodology for the clinical assessment of the corresponding oocyte.
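The copy-number and transcript measurements described above rely on fluorescent rapid cycle PCR. As a rough, hypothetical illustration of how such quantification is commonly derived from a standard curve of threshold cycles (the authors' actual calibration is not given in the abstract), a short Python sketch:

import numpy as np

# Hypothetical standard curve: serial dilutions of a known template
# (copies per reaction) and the threshold cycles (Ct) they produced.
std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
std_ct = np.array([30.1, 26.8, 23.4, 20.0, 16.7])

# Fit Ct = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1  # ~1.0 means ~100% amplification efficiency

def copies_from_ct(ct):
    """Interpolate the copy number of an unknown sample from its Ct."""
    return 10 ** ((ct - intercept) / slope)

print(f"amplification efficiency ~ {efficiency:.2f}")
print(f"sample at Ct 24.9 ~ {copies_from_ct(24.9):.0f} copies")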
Abstract:
Tumor necrosis factor (TNF)-induced apoptosis is important in immunologic cytotoxicity, autoimmunity, sepsis, normal embryonic development, and wound healing. TNF exerts cytotoxicity on many types of tumor cells but not on normal cells. The molecular events leading to cell death triggered by TNF are still poorly understood. We found that enforced expression of an activated H-ras oncogene converted the non-tumorigenic, TNF-resistant C3H 10T1/2 fibroblasts into tumorigenic cells (10TEJ) that also became very sensitive to TNF-induced apoptosis. This finding suggested that the oncogenic form of H-Ras, in which the p21 is locked in the GTP-bound form, could play a role in TNF-induced apoptosis of these cells. To investigate whether Ras activation is an obligatory step in TNF-induced apoptosis, we introduced two different molecular antagonists of Ras, namely the Rap1A tumor suppressor gene or the dominant-negative rasN17 gene, into H-ras-transformed 10TEJ cells. Expression of either Rap1A or RasN17 in 10TEJ cells resulted in abrogation of TNF-induced apoptosis. Similar results were obtained by expression of either Ras antagonist in L929 cells, a fibroblast cell line that is sensitive to TNF-induced apoptosis but does not carry a ras mutation. The effects of Rap1A and RasN17 appear to be specific to TNF, since cytotoxicity induced by doxorubicin and thapsigargin is unaffected. Additionally, constitutive apoptosis sensitivity in isolated nuclei, as measured by activation of a Ca²⁺-dependent endogenous endonuclease, is not affected by Rap1A or RasN17. Moreover, TNF treatment of L929 cells increased Ras-bound GTP, indicating that Ras activation is triggered by TNF. Thus, Ras activation is required for TNF-induced apoptosis in mouse cells.
Abstract:
Colorectal cancer is a leading cause of cancer mortality, and early detection can significantly improve the clinical outcome. Most colorectal cancers arise from benign neoplastic lesions recognized as adenomas. Only a small percentage of all adenomas will become malignant; thus, there is a need to identify specific markers of malignant potential. Studies at the molecular level have demonstrated an accumulation of genetic alterations, some hereditary but most occurring in somatic cells. The most common are the activation of ras, an oncogene involved in signal transduction, and the inactivation of p53, a tumor suppressor gene implicated in cell cycle regulation. In this study, 38 carcinomas, 95 adenomas and 20 benign polyps were analyzed by immunohistochemistry for abnormal expression of the p53 and ras proteins. An index of cellular proliferation was also measured by labeling with PCNA. General overexpression of p53 was immunodetected in 66% of the carcinomas, while 26% of adenomas displayed scattered individual positive cells or a focal high concentration of positive cells; the latter pattern was more often associated with severe dysplasia. Ras protein was detected in 37% of carcinomas and 32% of adenomas, mostly throughout the tissue. p53 immunodetection was more frequent in adenomas originating in colons with synchronous carcinomas, particularly in patients with familial adenomatous polyposis, and it may be a useful marker in these cases. Differences in the frequency of p53 and ras alterations were related to the location of the neoplasm. Immunodetection of p53 protein correlated with the presence of a mutation in the p53 gene at exons 7 and 5 in 4/6 carcinomas studied and in 2 villous adenomas. Thus, we characterized in adenomas the abnormal expression of two proteins encoded by the most commonly altered genes in colorectal cancer. p53 alteration appears to be more specifically associated with the transition to malignancy than ras. By using immunohistochemistry, a technique that keeps the architecture of the tissue intact, it was possible to correlate these alterations with histopathological characteristics associated with higher risk of transformation: villous content, dysplasia and size of the adenoma.
Abstract:
Objective: The primary objective of our study was to evaluate the effect of metformin in patients with metastatic renal cell cancer (mRCC) and diabetes who were on treatment with frontline tyrosine kinase inhibitor therapy. The effect of therapy was described in terms of overall survival and progression-free survival. Comparisons were made between the group of patients receiving metformin and the group receiving insulin among diabetic patients with metastatic renal cell cancer on frontline therapy. Exploratory analyses were also performed comparing non-diabetic patients with metastatic renal cell cancer receiving frontline therapy to diabetic patients with metastatic renal cell cancer receiving metformin.

Methods: The study design is a retrospective case series characterizing the response rate of frontline therapy in combination with metformin in mRCC patients with type 2 diabetes mellitus. The cohort was selected from a database generated to assess tyrosine kinase inhibitor therapy-associated hypertension in metastatic renal cell cancer at MD Anderson Cancer Center. Patients of all ethnic and racial backgrounds who had been started on frontline therapy for metastatic renal cell carcinoma were selected for the study. Patients who took frontline therapy for less than 3 months or were lost to follow-up were excluded. The exposure variable was treatment with metformin, defined as taking metformin for type 2 diabetes at any time from the diagnosis of metastatic renal cell carcinoma. The outcomes assessed were date of death or last available follow-up for overall survival, and date of disease progression from radiological reports for time to progression. Response rates were compared across covariates known to be strongly associated with renal cell cancer.

Results: For the primary analyses comparing the insulin and metformin groups, there were 82 patients, of whom 50 took insulin and 32 took metformin for type 2 diabetes. For the exploratory analysis, we compared the 32 diabetic patients on metformin to 146 non-diabetic patients not on metformin. Baseline characteristics were compared among the population. The time from the start of treatment until the date of progression of renal cell cancer, and the date of death or last follow-up, were used for the survival analyses.

In the primary analyses there was a significant difference in time to progression between patients receiving metformin and those receiving insulin, and this difference was also seen in the exploratory analyses. The median time to progression in the primary analyses was 1259 days (95% CI: 659-1832 days) in patients on metformin compared to 540 days (95% CI: 350-894 days) in patients receiving insulin (p=0.024). The median time to progression in the exploratory analyses was 1259 days (95% CI: 659-1832 days) in patients on metformin compared to 279 days (95% CI: 202-372 days) in the non-diabetic group (p<0.0001).

The median overall survival was 1004 days in the metformin group (95% CI: 761-1212 days) compared to 816 days (95% CI: 558-1405 days) in the insulin group (p=0.91). In the exploratory analyses, the median overall survival was 1004 days in the metformin group (95% CI: 761-1212 days) compared to 766 days (95% CI: 649-965 days) in the non-diabetic group (p=0.78). Metformin was observed to increase progression-free survival in both the primary and exploratory analyses (HR=0.52 for metformin vs insulin and HR=0.36 for metformin vs non-diabetic, respectively).

Conclusion: In laboratory studies and a few clinical studies, metformin has been shown to have dual benefits in patients with cancer and type 2 diabetes, via its action on the mammalian target of rapamycin (mTOR) pathway and its effect in decreasing blood sugar by increasing the sensitivity of insulin receptors to insulin. Several studies in breast cancer patients have documented a beneficial effect (quantified by pathological remission of cancer) of metformin use in patients undergoing breast cancer treatment. Combining metformin with frontline therapy for renal cell cancer may provide a significant benefit in prolonging overall survival in patients with metastatic renal cell cancer and diabetes.
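The medians, confidence intervals, and p-values reported above are the usual outputs of Kaplan-Meier estimation with a between-group test. A minimal Python sketch of that kind of comparison using the lifelines package, on made-up illustrative data rather than the study's records:

import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Illustrative data only: days to progression and an event indicator
# (1 = progressed, 0 = censored) for two hypothetical treatment groups.
metformin = pd.DataFrame({"days": [1400, 900, 1259, 700, 2000],
                          "event": [1, 1, 1, 0, 0]})
insulin = pd.DataFrame({"days": [300, 540, 620, 410, 880],
                        "event": [1, 1, 1, 1, 0]})

kmf = KaplanMeierFitter()
for name, grp in [("metformin", metformin), ("insulin", insulin)]:
    kmf.fit(grp["days"], event_observed=grp["event"], label=name)
    print(name, "median time to progression:", kmf.median_survival_time_)

# Log-rank test for a difference between the two progression curves.
result = logrank_test(metformin["days"], insulin["days"],
                      event_observed_A=metformin["event"],
                      event_observed_B=insulin["event"])
print("log-rank p-value:", result.p_value)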
Abstract:
Enteroaggregative Escherichia coli (EAEC) is considered an important emerging enteric and food-borne pathogen. The groups most notably affected by EAEC include international travelers, children in the developing world, and patients with HIV infection. However, EAEC does not cause diarrheal illness in all hosts.

The reasons for the observed clinical variation in EAEC infection are multifactorial and depend on the pathogen, the inoculum ingested and the host susceptibility. A major obstacle in identifying the mechanism of pathogenesis for EAEC is the heterogeneity in virulence of strains. No EAEC virulence gene is consistently present in all diarrheagenic strains. However, a recent report suggests that a package of plasmid-borne and chromosomal virulence factors is under the control of the previously described transcriptional activator aggR. Although the exact inoculum required for EAEC diarrheal illness is not known, a volunteer study has shown that oral ingestion of 10^10 cfu of virulent EAEC elicited diarrhea. Ongoing studies are being conducted to better define the exact infectious dose. There are also host factors associated with increased susceptibility of persons to diarrheal illness with EAEC.

The three manuscripts that follow (1) review EAEC as an emerging enteric pathogen, (2) identify EAEC as a cause of acute diarrhea among different subpopulations worldwide, and (3) identify virulence characteristics and the molecular epidemiology of EAEC isolates among travelers with diarrheal illness and describe the pathogenesis of EAEC infection.
Abstract:
Three approaches were used to examine the role of Ca²⁺- and/or calmodulin (CaM)-regulated processes in the mammalian heat stress response. The first approach focused on the major Ca²⁺-binding protein, CaM, and involved the use of CaM antagonists that perturbed CaM-regulated processes during heat stress. The second approach involved the use of a cell line and its BPV-1 transformants that express increased basal levels of CaM, or of parvalbumin, a Ca²⁺-binding protein not normally found in these cells. The last approach used Ca²⁺ chelators to buffer Ca²⁺ transients.

The principal conclusions resulting from these three experimental approaches are: (1) CaM antagonists cause a temperature-dependent potentiation of heat killing, but do not inhibit the triggering and development of thermotolerance, suggesting that some targets for heat killing are different from those that lead to thermotolerance; (2) members of major HSP families (especially HSP70) can bind to CaM in a Ca²⁺-dependent manner in vitro, and HSPs have been associated with events leading to thermotolerance; but, because thermotolerance is not affected by CaM antagonists, and the antagonists should interfere with HSP binding to CaM, the events leading to triggering or developing thermotolerance were not strongly dependent on HSP binding to CaM; (3) CaM antagonists can also bind to HSP70 (and possibly other HSPs), suggesting that an alternative mechanism for the action of these agents in heat killing may involve direct binding to, and inhibition of, other proteins such as HSP70 whose function is important for survival following heating; and (4) the signal governing the rate of synthesis of another major HSP group, the HSP26 family, can be largely abrogated by elevated Ca²⁺-binding proteins or Ca²⁺ chelators without significantly reducing survival or thermotolerance, suggesting that if the HSP26 family is involved in either end point, it may function in intracellular Ca²⁺ ((Ca²⁺)ᵢ) homeostasis.
Abstract:
Diarrheal disease is a leading cause of morbidity and mortality, especially among children in developing countries. Global mortality caused by diarrhea among children under five years of age has been estimated at 3.3 million deaths per year. Cryptosporidium parvum was first identified in 1907, but it was not until 1970 that this organism was recognized as a cause of diarrhea in calves, and not until 1976 that the first case of human cryptosporidiosis was reported. This study was conducted to ascertain the risk factors for first symptomatic infection with Cryptosporidium parvum in a cohort of infants in a rural area of Egypt. The cohort was followed from birth through the first year of life. Univariate and multivariate analyses of the data demonstrated that infants greater than six months of age had a two-fold risk of infection compared with infants less than six months of age (RR = 2.17; 95% C.I. = 1.01-4.82). When stratified by sex, male infants greater than six months of age were four times more likely to become infected than male infants less than six months of age. Among female infants, there was no difference in risk between those greater than six months of age and those less than six months of age. Female infants less than six months of age were twice as likely to become infected as male infants less than six months of age. The reverse occurred for infants greater than six months of age, i.e., male infants greater than six months of age had twice the risk of infection compared to females of the same age group. Further analysis of the data revealed an increased risk of cryptosporidiosis in infants who were attended in childbirth by traditional childbirth attendants compared to infants attended by modern childbirth attendants (nurses, trained midwives, physicians) (RR = 4.18; 95% C.I. = 1.05-36.06). The final risk factor of significance was the number of people residing in the household: infants in households housing more than seven persons had an almost two-fold risk of infection compared with infants in homes with fewer than seven persons. Other risk factors suggesting increased risk were lack of education among the mothers, absence of latrines and faucets in the homes, and mud used as building material for walls and floors of the homes.
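The risk estimates quoted above (e.g., RR = 2.17; 95% C.I. = 1.01-4.82) are cohort relative risks with confidence intervals computed on the log scale. A small Python sketch of that calculation on a hypothetical 2x2 table (the abstract reports the ratios, not the underlying cell counts):

import math

# Hypothetical counts: infected / not infected by exposure group.
a, b = 26, 94    # exposed (infants > 6 months): infected, not infected
c, d = 12, 118   # unexposed (infants <= 6 months): infected, not infected

risk_exposed = a / (a + b)
risk_unexposed = c / (c + d)
rr = risk_exposed / risk_unexposed

# 95% CI on the log scale (Katz method).
se_log_rr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
lower = math.exp(math.log(rr) - 1.96 * se_log_rr)
upper = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f} (95% CI {lower:.2f}-{upper:.2f})")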
Abstract:
In order to identify optimal therapy for children with bacterial pneumonia, Pakistan's ARI Program, in collaboration with the National Institute of Health (NIH), Islamabad, undertook a national surveillance of antimicrobial resistance in S. pneumoniae and H. influenzae. The project was carried out at selected urban and peripheral sites in 6 different regions of Pakistan in 1991-92. Nasopharyngeal (NP) specimens and blood cultures were obtained from children with pneumonia diagnosed in the outpatient clinics of participating facilities. Organisms were isolated by local hospital laboratories and sent to NIH for confirmation, serotyping and antimicrobial susceptibility testing. The aims of the study were: (i) to determine the antimicrobial resistance patterns of S. pneumoniae and H. influenzae in children aged 2-59 months; (ii) to determine the ability of selected laboratories to identify and effectively transport isolates of S. pneumoniae and H. influenzae cultured from nasopharyngeal and blood specimens; (iii) to validate the comparability of resistance patterns for nasopharyngeal and blood isolates of S. pneumoniae and H. influenzae from children with pneumonia; and (iv) to examine the effect of drug resistance and laboratory error on the cost of effectively treating children with ARI.

A total of 1293 children with ARI were included in the study: 969 (75%) from urban areas and 324 (25%) from rural parts of the country. Of the 1293, 786 (61%) were male and 507 (39%) were female. The resistance rates of S. pneumoniae to various antibiotics among the urban children with ARI were: TMP/SMX (62%); chloramphenicol (23%); penicillin (5%); tetracycline (16%); and ampicillin/amoxicillin (0%). The resistance rates of H. influenzae were higher than those of S. pneumoniae: TMP/SMX (85%); chloramphenicol (62%); penicillin (59%); ampicillin/amoxicillin (46%); and tetracycline (100%). Rates of resistance to each antimicrobial agent were similar among isolates from the rural children.

Of a total of 614 specimens tested for antimicrobial susceptibility, 432 (70.4%) were resistant to TMP/SMX and 93 (15.2%) were resistant to antimicrobial agents other than TMP/SMX, viz. ampicillin/amoxicillin, chloramphenicol, penicillin, and tetracycline.

The sensitivity and positive predictive value of the peripheral laboratories for H. influenzae were 99% and 65%, respectively. Similarly, the sensitivity and positive predictive value of peripheral laboratory tests compared to the gold standard (the NIH laboratory) for S. pneumoniae were 99% and 54%, respectively.

The sensitivity and positive predictive value of nasopharyngeal specimens compared to blood cultures (gold standard), as isolated by the peripheral laboratories, were 88% and 11% for H. influenzae, and 92% and 39% for S. pneumoniae, respectively. (Abstract shortened by UMI.)
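The sensitivity and positive predictive value figures above follow directly from 2x2 comparisons of the peripheral laboratory (or nasopharyngeal specimen) result against the gold standard. A short Python sketch of that arithmetic, with illustrative counts rather than the study's actual tallies:

def sensitivity_ppv(tp, fp, fn, tn):
    """Sensitivity and positive predictive value from a 2x2 table
    (test result vs. gold standard)."""
    sensitivity = tp / (tp + fn)   # detected among the truly positive
    ppv = tp / (tp + fp)           # truly positive among those detected
    return sensitivity, ppv

# Illustrative counts only (the abstract reports percentages, not cells),
# e.g. peripheral laboratory vs. NIH confirmation for S. pneumoniae.
sens, ppv = sensitivity_ppv(tp=99, fp=84, fn=1, tn=200)
print(f"sensitivity = {sens:.0%}, PPV = {ppv:.0%}")   # ~99% and ~54%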
Abstract:
Background. Clostridium difficile is the leading cause of hospital-associated infectious diarrhea and colitis. About 3 million cases of Clostridium difficile diarrhea occur each year, with an annual cost of $1 billion. About 20% of patients acquire C. difficile during hospitalization, and infection with Clostridium difficile can result in serious complications, posing a threat to the patient's life.

Purpose. The aim of this research was to demonstrate the uniqueness in the characteristics of C. difficile-positive nosocomial diarrhea cases compared with C. difficile-negative nosocomial diarrhea controls admitted to a local hospital.

Methods. One hundred and ninety patients with a positive test and one hundred and ninety with a negative test for Clostridium difficile nosocomial diarrhea, selected from patients tested between January 1, 2002 and December 31, 2003, comprised the study population. Demographic and clinical data were collected from medical records. Logistic regression analyses were conducted to estimate the odds of Clostridium difficile nosocomial diarrhea associated with selected variables.

Results. Among the antibiotic classes, cephalosporins (OR, 1.87; 95% CI, 1.23 to 2.85), penicillins (OR, 1.57; 95% CI, 1.04 to 2.37), fluoroquinolones (OR, 1.65; 95% CI, 1.09 to 2.48) and antifungals (OR, 2.17; 95% CI, 1.20 to 3.94) were significantly associated with Clostridium difficile nosocomial diarrhea. Ceftazidime (OR, 1.95; 95% CI, 1.25 to 3.03; p=0.003), gatifloxacin (OR, 1.97; 95% CI, 1.31 to 2.97; p=0.001), clindamycin (OR, 3.13; 95% CI, 1.99 to 4.93; p<0.001) and vancomycin (OR, 1.77; 95% CI, 1.18 to 2.66; p=0.006) were also significantly associated with the disease, although vancomycin was not statistically significant when analyzed in a multivariable model. Other significantly associated drugs were antacids, laxatives, narcotics and ranitidine. Prolonged use of antibiotics and an increased number of comorbid conditions were also associated with C. difficile nosocomial diarrhea.

Conclusion. The etiology of C. difficile diarrhea is multifactorial. Exposure to antibiotics and other drugs, prolonged antibiotic usage, the presence and severity of comorbid conditions, and prolonged hospital stay were shown to contribute to the development of the disease. It is imperative that any attempt to prevent the disease, or contain its spread, be made on several fronts.
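The odds ratios above come from logistic regression of case-control status on drug exposures. A minimal Python sketch of such a model using statsmodels, with hypothetical 0/1 exposure indicators rather than the hospital's data:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data frame: one row per patient, 0/1 exposure indicators
# and the outcome (C. difficile-positive nosocomial diarrhea).
rng = np.random.default_rng(0)
n = 380
df = pd.DataFrame({
    "cephalosporin": rng.integers(0, 2, n),
    "fluoroquinolone": rng.integers(0, 2, n),
    "clindamycin": rng.integers(0, 2, n),
    "cdiff": rng.integers(0, 2, n),
})

model = smf.logit("cdiff ~ cephalosporin + fluoroquinolone + clindamycin",
                  data=df).fit(disp=False)
odds_ratios = np.exp(model.params)   # adjusted odds ratios
ci = np.exp(model.conf_int())        # columns 0 and 1 are the 95% CI bounds
print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))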
Abstract:
Purpose. A descriptive analysis of glioma patients by race was carried out in order to better elucidate potential differences between races in demographics, treatment, characteristics, prognosis and survival.

Patients and Methods. The analysis included 1,967 patients ≥ 18 years of age diagnosed with glioma and seen between July 2000 and September 2006 at The University of Texas M.D. Anderson Cancer Center (UTMDACC). Data were collated from the UTMDACC Patient History Database (PHDB) and the UTMDACC Tumor Registry Database (TRDB). Chi-square analysis, uni- and multivariate Cox proportional hazards modeling and survival analysis were used to analyze differences by race.

Results. Demographic, treatment and histologic differences exist between races. Though risk differences were seen between races, race was not a significant predictor in multivariate regression analysis after accounting for age, surgery, chemotherapy, radiation, and tumor type stratified by WHO tumor grade. Age was the most consistent predictor of risk of death. Overall survival by race was significantly different (p=0.0049) only in low-grade gliomas after adjustment for age, although the survival differences were very slight.

Conclusion. Among this cohort of glioma patients, age was the strongest predictor of survival. It is likely that survival is influenced more by age, time to treatment, tumor grade and surgical expertise than by racial differences. However, age at diagnosis, gender ratios, histology and history of cancer differed significantly between races, and genetic differences to this effect cannot be excluded.
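The multivariate model described above adjusts for clinical covariates while handling tumor type through WHO grade. One way such a model is often set up is a Cox fit stratified by grade; a hedged Python sketch with lifelines, using placeholder column names and simulated rows rather than the UTMDACC variables:

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "months": rng.exponential(24, n).round(1) + 0.1,   # follow-up time
    "died": rng.integers(0, 2, n),                     # death indicator
    "age": rng.integers(18, 85, n),
    "surgery": rng.integers(0, 2, n),
    "who_grade": rng.integers(2, 5, n),                # WHO grades 2-4
})

cph = CoxPHFitter()
# strata=["who_grade"] gives each grade its own baseline hazard,
# so the age and surgery effects are estimated within grade.
cph.fit(df, duration_col="months", event_col="died", strata=["who_grade"])
cph.print_summary()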
Abstract:
Can early identification of the Staphylococcus species responsible for infection, using real-time PCR technology, influence the approach to the treatment of these infections?

This was a retrospective cohort study in which two groups of patients were compared. The first group, 'Physician Aware', consisted of patients whose physicians were informed of the specific staphylococcal species and antibiotic sensitivity (determined by real-time PCR) at the time of notification of the Gram stain. The second group, 'Physician Unaware', consisted of patients whose treating physicians received the same information 24-72 hours later, as a result of blood culture and antibiotic sensitivity determination.

The approach to treatment was compared between the 'Physician Aware' and 'Physician Unaware' groups for three different microbiological diagnoses, namely MRSA, MSSA and no-SA (coagulase-negative Staphylococcus).

For a diagnosis of MRSA, the mean time to initiation of vancomycin therapy was 1.08 hours in the 'Physician Aware' group compared with 5.84 hours in the 'Physician Unaware' group (p=0.34).

For a diagnosis of MSSA, the mean time to initiation of specific anti-MSSA therapy with nafcillin was 5.18 hours in the 'Physician Aware' group compared with 49.8 hours in the 'Physician Unaware' group (p=0.007). For the same diagnosis, the mean duration of empiric therapy was 19.68 hours in the 'Physician Aware' group compared with 80.75 hours in the 'Physician Unaware' group (p=0.003).

For a diagnosis of no-SA (coagulase-negative Staphylococcus), the mean duration of empiric therapy was 35.65 hours in the 'Physician Aware' group compared with 44.38 hours in the 'Physician Unaware' group (p=0.07). However, when treatment was considered as a categorical variable, and after exclusion of all cases where anti-MRS therapy was used for unrelated conditions, only 20 of 72 cases in the 'Physician Aware' group received treatment compared with 48 of 106 cases in the 'Physician Unaware' group.

Conclusions. Earlier diagnosis of MRSA may not alter final treatment outcomes; however, earlier identification may lead to earlier institution of measures to limit the spread of infection. Early diagnosis of MSSA infection does lead to specific antibiotic therapy at an earlier stage of treatment, and the duration of empiric therapy is greatly reduced by early diagnosis. Early diagnosis of coagulase-negative staphylococcal infection leads to a lower rate of unnecessary treatment for these infections, as they are commonly considered contaminants.
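The final comparison above (20 of 72 treated in the 'Physician Aware' group versus 48 of 106 in the 'Physician Unaware' group) is a two-proportion contrast. A small Python sketch of how such a 2x2 table might be tested (a chi-square test is one reasonable choice; the test actually used is not stated in the abstract):

from scipy.stats import chi2_contingency

# Treated vs. not treated in each group (counts taken from the abstract).
table = [[20, 72 - 20],    # Physician Aware
         [48, 106 - 48]]   # Physician Unaware

chi2, p, dof, expected = chi2_contingency(table)
print(f"treated: {20/72:.0%} vs {48/106:.0%}, chi-square p = {p:.3f}")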
Abstract:
The use of exercise electrocardiography (ECG) to detect latent coronary heart disease (CHD) is discouraged in apparently healthy populations because of its low sensitivity. These recommendations, however, are based on the efficacy of evaluation of ischemia (ST-segment changes), with little regard for other measures of cardiac function that are available during exertion. The purpose of this investigation was to determine the association of maximal exercise hemodynamic responses with risk of mortality due to all causes, cardiovascular disease (CVD), and coronary heart disease (CHD) in apparently healthy individuals. Study participants were 20,387 men (mean age 42.2 years) and 6,234 women (mean age 41.9 years), patients of a preventive medicine center in Dallas, TX, examined between 1971 and 1989. During an average of 8.1 years of follow-up, there were 348 deaths in men and 66 deaths in women. In men, age-adjusted all-cause death rates (per 10,000 person-years) across quartiles of maximal systolic blood pressure (SBP) (low to high) were 18.2, 16.2, 23.8, and 24.6 (p for trend <0.001). Corresponding rates for maximal heart rate were 28.9, 15.9, 18.4, and 15.1 (p for trend <0.001). After adjustment for confounding variables including age, resting systolic pressure, serum cholesterol and glucose, body mass index, smoking status, physical fitness and family history of CVD, risks (with 95% confidence intervals, CI) of all-cause mortality for quartiles of maximal SBP, relative to the lowest quartile, were 0.96 (0.70-1.33), 1.36 (1.01-1.85), and 1.37 (0.98-1.92) for quartiles 2-4, respectively. Corresponding risks for maximal heart rate were 0.61 (0.44-0.85), 0.69 (0.51-0.93), and 0.60 (0.41-0.87). No associations were noted between maximal exercise rate-pressure product and mortality. Similar results were seen for risk of CVD and CHD death. In women, similar trends in age-adjusted all-cause and CVD death rates across maximal SBP and heart rate categories were observed. Sensitivity of the exercise test in predicting mortality was enhanced when ECG results were evaluated together with maximal exercise SBP or heart rate, with a concomitant decrease in specificity; positive predictive values were not improved. The efficacy of the exercise test in predicting mortality in apparently healthy men and women was therefore not enhanced by using maximal exercise hemodynamic responses. These results suggest that an exaggerated systolic blood pressure response or an attenuated heart rate response to maximal exercise is a risk factor for mortality in apparently healthy individuals.
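The death rates above are expressed per 10,000 person-years of follow-up. A brief Python sketch of how a crude rate of that form is computed (illustrative numbers; the age adjustment applied to the reported rates is omitted here):

# Crude death rate per 10,000 person-years for one quartile of maximal SBP.
deaths = 87                   # illustrative number of deaths in the quartile
person_years = 5097 * 8.1     # illustrative: subjects x mean years of follow-up

rate_per_10000_py = deaths / person_years * 10_000
print(f"{rate_per_10000_py:.1f} deaths per 10,000 person-years")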
Abstract:
The natural history of placebo-treated travelers' diarrhea and the prognostic factors for recovery from diarrhea were evaluated using 9 groups of placebo-treated subjects from 9 clinical trials conducted since 1975, for use as historical controls in future clinical trials of antidiarrheal agents. All of these studies were done by the same group of investigators at one site (Guadalajara, Mexico). The studies are similar in terms of population, measured parameters, microbiologic identification of enteropathogens and definitions of parameters. The studies had two different durations of follow-up: in some studies subjects were followed for two days, and in others for five days.

Using definitions established by the Infectious Diseases Society of America and the Food and Drug Administration, the following efficacy parameters were evaluated: time to last unformed stool (TLUS), number of unformed stools passed after initiation of placebo treatment over five consecutive days of follow-up, microbiologic cure, and improvement of diarrhea. Among the groups that were followed for five days, the mean TLUS ranged from 59.1 to 83.5 hours. Fifty percent to 78% had diarrhea lasting more than 48 hours, and 25% had diarrhea for more than five days. The mean number of unformed stools passed on the first day after initiation of therapy ranged from 3.6 to 5.8, and on the fifth day from 0.5 to 1.5. By the end of follow-up, diarrhea had improved in 82.6% to 90% of the subjects. Subjects with enterotoxigenic E. coli had microbiologic cure rates of 21.6% to 90.0%, and subjects with Shigella species had microbiologic cure rates of 14.3% to 60.0%.

To evaluate the prognostic factors for recovery from diarrhea (the primary efficacy parameter in evaluating antidiarrheal agents against travelers' diarrhea), the subjects from five studies were pooled and the Cox proportional hazards model was used to evaluate predictors of prolonged diarrhea. After adjusting for the design characteristics of each trial, fever (rate ratio (RR) 0.40), presence of invasive pathogens (RR 0.41), presence of severe abdominal pain and cramps (RR 0.50), more than five watery stools (RR 0.60), and presence of non-invasive pathogens (RR 0.84) predicted a longer duration of diarrhea. Severe vomiting (RR 2.53) predicted a shorter duration of diarrhea. The number of soft stools, presence of fecal leukocytes, presence of nausea, and duration of diarrhea before enrollment were not associated with duration of diarrhea.
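The rate ratios above come from a Cox proportional hazards model of time to recovery (last unformed stool), adjusted for trial. A hedged Python sketch of such a fit with lifelines, using made-up column names and simulated data rather than the pooled trial records:

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 400
df = pd.DataFrame({
    "hours_to_recovery": rng.exponential(60, n).round(1) + 1.0,
    "recovered": rng.integers(0, 2, n),          # 0 = censored
    "fever": rng.integers(0, 2, n),
    "invasive_pathogen": rng.integers(0, 2, n),
    "severe_cramps": rng.integers(0, 2, n),
    "trial": rng.integers(1, 6, n),              # which of the 5 pooled trials
})

cph = CoxPHFitter()
cph.fit(df, duration_col="hours_to_recovery", event_col="recovered",
        strata=["trial"])                        # trial-specific baseline hazards
# Rate (hazard) ratios below 1 mean slower recovery, i.e. longer diarrhea.
print(cph.hazard_ratios_)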
Abstract:
Background. Necrotizing pneumonia is generally considered a rare complication of pneumococcal pneumonia in adults. We systematically studied the incidence of necrotizing changes in adult patients with pneumococcal pneumonia, and examined the severity of infection, the role of the causative serotype and the association with bacteremia.

Methods. We used a database of all pneumococcal infections identified at our medical center between 2000 and 2010. Original readings of chest X-rays (CXR) and computed tomography (CT) scans were noted. All images were then reread independently by 2 radiologists. The severity of disease was assessed using the SMART-COP scoring system.

Results. There were 351 cases of pneumococcal pneumonia. Necrosis was reported in none of the original CXR readings and in 6 of 136 (4.4%) CTs. On rereading, 8 of 351 (2.3%) CXRs and 15 of 136 (11.0%) CTs had necrotizing changes. Overall, these changes were found in 23 of 351 (6.6%, 95% CI 4.0-9.1) patients. The incidence of bacteremia and the admitting SMART-COP scores were similar in patients with and without necrosis (P=1.00 and P=0.32, respectively). Type 3 pneumococcus was more commonly isolated from patients with than from patients without necrotizing pneumonia (P=0.05), but a total of 10 serotypes were identified among the 16 cases in which the organism was available for typing.

Conclusions. Necrotizing changes in the lungs were seen in 6.6% (95% CI 4.0-9.1) of a large series of adults with pneumococcal pneumonia. Patients with necrosis were not more likely to have bacteremia or more severe disease. Type 3 pneumococcus was commonly implicated, but 9 other serotypes were also identified.
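The headline figure of 6.6% (95% CI 4.0-9.1) is consistent with a simple binomial proportion and a normal-approximation (Wald) interval. The arithmetic, in Python, using the counts given in the abstract (23 necrotizing cases among 351 patients):

import math

necrosis, n = 23, 351
p = necrosis / n
se = math.sqrt(p * (1 - p) / n)
lower, upper = p - 1.96 * se, p + 1.96 * se
print(f"{p:.1%} (95% CI {lower:.1%} - {upper:.1%})")   # ~6.6% (4.0% - 9.1%)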
Abstract:
Pneumonia is a well-documented and common respiratory infection in patients with acute traumatic spinal cord injury, and it may recur during the course of acute care. Using data from the North American Clinical Trials Network (NACTN) for Spinal Cord Injury, the incidence, timing, and recurrence of pneumonia were analyzed. The two main objectives were (1) to investigate the time to, and potential risk factors for, the first occurrence of pneumonia using the Cox proportional hazards model, and (2) to investigate pneumonia recurrence and its risk factors using a counting-process model that is a generalization of the Cox proportional hazards model. The results of the survival analysis suggested that surgery, intubation, American Spinal Injury Association (ASIA) grade, direct admission to a NACTN site and age (older than 65 or not) were significant risk factors for both the first occurrence of pneumonia and multiple occurrences of pneumonia. The significance of this research is its potential to identify, at the time of admission, patients who are at high risk for the incidence and recurrence of pneumonia. Knowledge of when pneumonia tends to occur is important for the development of prevention strategies and may also provide insight into the selection of emerging therapies that compromise the immune system.
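The counting-process generalization mentioned above (an Andersen-Gill-style model) represents each patient as one or more (start, stop] intervals, with a new interval opened after each pneumonia event, so recurrences contribute directly to the risk sets. A hedged Python sketch of that data layout and fit, using lifelines' time-varying Cox fitter on simulated rows rather than NACTN data:

import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(3)
rows = []
for pid in range(1, 61):                        # 60 hypothetical patients
    intubated = rng.integers(0, 2)
    age_over_65 = rng.integers(0, 2)
    start, end_of_followup = 0, rng.integers(30, 90)
    while start < end_of_followup:
        gap = int(rng.integers(5, 40))          # days until the next pneumonia
        stop = min(start + gap, end_of_followup)
        event = int(stop < end_of_followup)     # 1 = pneumonia, 0 = censored
        rows.append((pid, start, stop, event, intubated, age_over_65))
        start = stop

df = pd.DataFrame(rows, columns=["id", "start", "stop", "event",
                                 "intubated", "age_over_65"])

# Each (start, stop] interval is a row, so repeated pneumonias within a
# patient are modeled directly rather than only the first event.
ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", start_col="start", stop_col="stop", event_col="event")
ctv.print_summary()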