331 results for diagnostic and prognostic algorithms developmen
Abstract:
Recovery of group B streptococci (GBS) was assessed in 1,204 vaginorectal swabs stored in Amies transport medium at 4 or 21°C for 1 to 4 days, either by direct inoculation onto Granada agar (GA) or by culture on blood agar (BA) and GA after a selective broth enrichment (SBE) step. Following storage at 4°C, GBS detection on GA was not affected after 72 h by either direct inoculation or SBE; however, GBS were not detected after SBE in the BA subculture in some samples after 48 h of storage, and in GA after 96 h. After storage at 21°C, the loss of GBS-positive results was significant after 48 h by direct inoculation onto GA and after 96 h by SBE and BA subculture; some GBS-positive samples were not detected after 24 h of storage followed by SBE and BA subculture, or after 48 h of storage followed by SBE and GA subculture. Storage of swabs in transport medium, even at 4°C, produced an underestimation of the intensity of GBS colonization in most specimens after 24 h. These data indicate that the viability of GBS is not fully preserved by storage of vaginorectal swabs in Amies transport medium, especially if they are not stored under refrigeration.
Abstract:
INTRODUCTION Functional imaging studies of addiction following protracted abstinence have not systematically examined the associations between severity of use of different drugs and brain dysfunction. Findings from such studies may be relevant for implementing specific interventions for treatment. The aim of this study was to examine the association between resting-state regional brain metabolism (measured with 18F-fluorodeoxyglucose positron emission tomography (FDG-PET)) and the severity of use of cocaine, heroin, alcohol, MDMA and cannabis in a sample of polysubstance users with prolonged abstinence from all drugs used. METHODS Our sample consisted of 49 polysubstance users enrolled in residential treatment. We conducted correlation analyses between estimates of use of cocaine, heroin, alcohol, MDMA and cannabis and brain metabolism (BM), using Statistical Parametric Mapping voxel-based (VB) whole-brain analyses. In all correlation analyses conducted for each of the drugs we controlled for the co-abuse of the other drugs used. RESULTS The analysis showed significant negative correlations between severity of heroin, alcohol, MDMA and cannabis use and BM in the dorsolateral prefrontal cortex (DLPFC) and temporal cortex. Alcohol use was further associated with lower metabolism in the frontal premotor cortex and putamen, and stimulant use with lower metabolism in the parietal cortex. CONCLUSIONS Duration of use of the different drugs correlated negatively with overlapping regions in the DLPFC, whereas severity of cocaine, heroin and alcohol use selectively impacted parietal, temporal, and frontal-premotor/basal ganglia regions, respectively. Knowledge of these associations could be useful in clinical practice, since different brain alterations have been associated with different performance patterns that may affect the rehabilitation of these patients.
Abstract:
Clonally complex infections by Mycobacterium tuberculosis are increasingly accepted. Studies of their extent in epidemiological scenarios where the infective pressure is not high are scarce. Our study systematically searched for clonally complex infections (mixed infections by more than one strain and the simultaneous presence of clonal variants) by applying mycobacterial interspersed repetitive-unit (MIRU)-variable-number tandem-repeat (VNTR) analysis to M. tuberculosis isolates from two population-based samples of respiratory (703 cases) and respiratory-extrapulmonary (R+E) tuberculosis (TB) cases (71 cases) in a context of moderate TB incidence. Clonally complex infections were found in 11 (1.6%) of the respiratory TB cases and in 10 (14.1%) of those with R+E TB. Among the 21 cases with clonally complex TB, 9 were infected by 2 independent strains and the remaining 12 showed the simultaneous presence of 2 to 3 clonal variants. For 9 of the 10 R+E TB cases with clonally complex infections, compartmentalization (different compositions of strains/clonal variants in independent infected sites) was found. All the strains/clonal variants were also genotyped by IS6110-based restriction fragment length polymorphism analysis, which split two MIRU-defined clonal variants, although in general it showed a lower discriminatory power to identify the clonal heterogeneity revealed by MIRU-VNTR analysis. The comparative analysis of IS6110 insertion sites between coinfecting clonal variants showed differences in the genes coding for a cutinase, a PPE family protein, and two conserved hypothetical proteins. Diagnostic delay, existence of previous TB, risk of overexposure, and clustered/orphan status of the involved strains were analyzed to propose possible explanations for the cases with clonally complex infections. Our study characterizes in detail all the clonally complex infections by M. tuberculosis found in a systematic survey and shows that these phenomena can be found more often than expected, even in an unselected population-based sample lacking high infective pressure.
Abstract:
The use of molecular tools for genotyping Mycobacterium tuberculosis isolates in epidemiological surveys in order to identify clustered and orphan strains requires faster response times than those offered by the reference method, IS6110 restriction fragment length polymorphism (RFLP) genotyping. A method based on PCR, the mycobacterial interspersed repetitive-unit-variable-number tandem-repeat (MIRU-VNTR) genotyping technique, is an option for fast fingerprinting of M. tuberculosis, although precise evaluations of the correlation between MIRU-VNTR and RFLP findings in population-based studies in different contexts are required before switching methods. In this study, we evaluated MIRU-VNTR genotyping (with a set of 15 loci [MIRU-15]) in parallel to RFLP genotyping in a 39-month universal population-based study in a challenging setting with a high proportion of immigrants. For 81.9% (281/343) of the M. tuberculosis isolates, both RFLP and MIRU-VNTR types were obtained. The percentages of clustered cases were 39.9% (112/281) and 43.1% (121/281) for RFLP and MIRU-15 analyses, and the numbers of clusters identified were 42 and 45, respectively. For 85.4% of the cases, the RFLP and MIRU-15 results were concordant, identifying the same cases as clustered and orphan (kappa, 0.7). However, for the remaining 14.6% of the cases, discrepancies were observed: 16 of the cases clustered by RFLP analysis were identified as orphan by MIRU-15 analysis, and 25 cases identified as orphan by RFLP analysis were clustered by MIRU-15 analysis. When discrepant cases showing subtle genotypic differences were tolerated, the discrepancies fell from 14.6% to 8.6%. Epidemiological links were found for 83.8% of the cases clustered by both RFLP and MIRU-15 analyses, whereas for the cases clustered by RFLP or MIRU-VNTR analysis alone, links were identified for only 30.8% or 38.9% of the cases, respectively. The latter group of cases mainly comprised isolates that could also have been clustered, if subtle genotypic differences had been tolerated. MIRU-15 genotyping seems to be a good alternative to RFLP genotyping for real-time interventional schemes. The correlation between MIRU-15 and IS6110 RFLP findings was reasonable, although some uncertainties as to the assignment of clusters by MIRU-15 analysis were identified.
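The agreement figures reported above (85.4% concordance, kappa of 0.7) can be reproduced from the 2×2 cluster/orphan concordance table implied by the abstract. The following is a minimal illustrative sketch, not the authors' analysis code; the cell counts are reconstructed from the numbers given (112 clustered by RFLP, 121 by MIRU-15, and 16 and 25 discordant cases out of 281).

```python
# Cohen's kappa for RFLP vs. MIRU-15 clustered/orphan assignment,
# using cell counts reconstructed from the figures in the abstract.

n = 281
both_clustered = 112 - 16   # clustered by RFLP and also by MIRU-15
rflp_only = 16              # clustered by RFLP, orphan by MIRU-15
miru_only = 25              # orphan by RFLP, clustered by MIRU-15
both_orphan = n - both_clustered - rflp_only - miru_only

observed = (both_clustered + both_orphan) / n             # observed agreement
p_rflp = (both_clustered + rflp_only) / n                 # marginal: clustered by RFLP
p_miru = (both_clustered + miru_only) / n                 # marginal: clustered by MIRU-15
expected = p_rflp * p_miru + (1 - p_rflp) * (1 - p_miru)  # agreement expected by chance
kappa = (observed - expected) / (1 - expected)

print(f"observed agreement = {observed:.1%}")  # ~85.4%
print(f"Cohen's kappa = {kappa:.2f}")          # ~0.70
```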
Abstract:
INTRODUCTION No definitive data are available regarding the value of switching to an alternative TNF antagonist in rheumatoid arthritis patients who fail to respond to the first one. The aim of this study was to evaluate treatment response in a clinical setting, based on HAQ improvement and EULAR response criteria, in RA patients who were switched to a second or a third TNF antagonist because of failure of the first one. METHODS This was an observational, prospective study of a cohort of 417 RA patients treated with TNF antagonists in three university hospitals in Spain between January 1999 and December 2005. A database was created at the participating centres, with well-defined operational instructions. The main outcome variables were analyzed using parametric or non-parametric tests depending on the level of measurement and distribution of each variable. RESULTS Mean (± SD) DAS-28 on starting the first, second and third TNF antagonist was 5.9 (± 2.0), 5.1 (± 1.5) and 6.1 (± 1.1). At the end of follow-up, it decreased to 3.3 (± 1.6; Δ = -2.6; p < 0.0001), 4.2 (± 1.5; Δ = -1.1; p = 0.0001) and 5.4 (± 1.7; Δ = -0.7; p = 0.06). For the first TNF antagonist, the DAS-28-based EULAR response level was good in 42% and moderate in 33% of patients. The second TNF antagonist yielded a good response in 20% and no response in 53% of patients, while the third one yielded a good response in 28% and no response in 72%. Mean baseline HAQ on starting the first, second and third TNF antagonist was 1.61, 1.52 and 1.87, respectively. At the end of follow-up, it decreased to 1.12 (Δ = -0.49; p < 0.0001), 1.31 (Δ = -0.21; p = 0.004) and 1.75 (Δ = -0.12; p = 0.1), respectively. Sixty-four percent of patients had a clinically important improvement in HAQ (defined as an improvement of at least 0.22) with the first TNF antagonist and 46% with the second. CONCLUSION A clinically significant effect size was seen in less than half of RA patients cycling to a second TNF antagonist.
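The EULAR response levels referred to above are derived from the DAS-28 reached at follow-up and its improvement from baseline. Below is a minimal sketch of the classification using the commonly cited thresholds (good: improvement > 1.2 with a final DAS-28 ≤ 3.2; none: improvement ≤ 0.6, or ≤ 1.2 with a final DAS-28 > 5.1; moderate otherwise); it illustrates the criteria and is not taken from the study itself.

```python
def eular_response(das28_baseline: float, das28_final: float) -> str:
    """Classify EULAR response from DAS-28 values (commonly cited thresholds)."""
    improvement = das28_baseline - das28_final
    if improvement > 1.2 and das28_final <= 3.2:
        return "good"
    if improvement <= 0.6 or (improvement <= 1.2 and das28_final > 5.1):
        return "none"
    return "moderate"

# Mean values reported for the first TNF antagonist: 5.9 at baseline, 3.3 at follow-up.
# The large mean improvement (2.6) with a final DAS-28 just above 3.2 classifies as "moderate".
print(eular_response(5.9, 3.3))
```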
Abstract:
The impact of the immune microenvironment on the prognosis of solid tumors has been extensively studied in recent years. Specifically in colorectal carcinoma, increased knowledge of the immune events around these tumors and their relation to clinical outcomes has led to the immune microenvironment being considered one of the most important prognostic factors in this disease. In this review we summarize and update the current knowledge on this intriguing and complex new hallmark of cancer, paying special attention to infiltration by tumor-infiltrating T lymphocytes and their subtypes in colorectal cancer, as well as its possible clinical translation in terms of long-term prognosis. Finally, we suggest some possible investigational approaches based on combinatorial strategies to trigger and boost the immune reaction against tumor cells.
Abstract:
End-stage renal disease (ESRD) is becoming more frequent in HIV-infected patients. In Europe there is little information about HIV-infected patients on dialysis. A cross-sectional multicenter survey of 328 Spanish dialysis units was conducted in 2006. Information was obtained from 14,876 patients on dialysis (81.6% of the Spanish dialysis population). Eighty-one were HIV-infected (0.54%; 95% CI, 0.43-0.67); 60 were on hemodialysis and 21 were on peritoneal dialysis. The mean (range) age was 45 (28-73) years. Seventy-two percent were men and 33% were former drug users. The mean (range) time of HIV infection was 11 (1-27) years and time on dialysis was 4.6 (0.4-25) years. ESRD was due to glomerulonephritis (36%) and diabetes (15%). HIV-associated nephropathy was not reported. Eighty-five percent were on HAART, 76.5% had a CD4 T cell count above 200 cells/μl, and 73% had an undetectable viral load. Thirty-nine percent of patients met criteria for inclusion on the renal transplant (RT) waiting list but only 12% were included. Sixty-one percent had HCV coinfection. HCV-coinfected patients had a longer history of HIV, more previous AIDS events, parenteral transmission as the most common risk factor for acquiring HIV infection, and less access to the RT waiting list (p < 0.05). The prevalence of HIV infection in Spanish dialysis units in 2006 was 0.54%. HCV coinfection was very frequent (61%) and the percentage of patients included on the Spanish RT waiting list was low (12%).
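The reported prevalence and its 95% confidence interval (0.54%; 0.43-0.67) follow from a standard binomial proportion calculation on the 81 HIV-infected patients among the 14,876 surveyed. A minimal sketch (normal approximation; an exact method would give a marginally different upper bound):

```python
from math import sqrt

hiv_positive = 81
surveyed = 14_876

p = hiv_positive / surveyed           # point prevalence
se = sqrt(p * (1 - p) / surveyed)     # standard error (normal approximation)
lower, upper = p - 1.96 * se, p + 1.96 * se

print(f"prevalence = {p:.2%}")                 # ~0.54%
print(f"95% CI = {lower:.2%} to {upper:.2%}")  # ~0.43% to 0.66%
```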
Abstract:
BACKGROUND Spain shows the highest bladder cancer incidence rates in men among European countries. The most important risk factors are tobacco smoking and occupational exposure to a range of different chemical substances, such as aromatic amines. METHODS This paper describes the municipal distribution of bladder cancer mortality and attempts to "adjust" this spatial pattern for the prevalence of smokers, using the autoregressive spatial model proposed by Besag, York and Mollié, with relative risk of lung cancer mortality as a surrogate. RESULTS It has been possible to compile and ascertain the posterior distribution of relative risk for bladder cancer adjusted for lung cancer mortality, on the basis of a single Bayesian spatial model covering all of Spain's 8077 towns. Maps were plotted depicting smoothed relative risk (RR) estimates, and the distribution of the posterior probability of RR>1 by sex. Towns that registered the highest relative risks for both sexes were mostly located in the Provinces of Cadiz, Seville, Huelva, Barcelona and Almería. The highest-risk area in Barcelona Province corresponded to very specific municipal areas in the Bages district, e.g., Suría, Sallent, Balsareny, Manresa and Cardona. CONCLUSION Mining/industrial pollution and the risk entailed in certain occupational exposures could in part be dictating the pattern of municipal bladder cancer mortality in Spain. Population exposure to arsenic is a matter that calls for attention. It would be of great interest if the relationship between the chemical quality of drinking water and the frequency of bladder cancer could be studied.
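The Besag, York and Mollié model mentioned above is conventionally written as a Poisson log-linear model with one spatially structured and one unstructured random effect per town. A schematic formulation (standard textbook form, not necessarily the exact parameterization used by the authors) is:

$$O_i \sim \mathrm{Poisson}(E_i \, \theta_i), \qquad \log \theta_i = \alpha + \beta x_i + u_i + v_i,$$

where $O_i$ and $E_i$ are the observed and expected bladder cancer deaths in town $i$, $\theta_i$ is the town's relative risk, $x_i$ is the smoking surrogate (the lung cancer relative risk), $u_i$ is a spatially structured (intrinsic CAR) effect shared with neighbouring towns, and $v_i$ is unstructured town-level heterogeneity. The smoothed RR maps correspond to the posterior means of $\theta_i$, and the probability maps to the posterior probability that $\theta_i > 1$.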
Abstract:
Glucose control is the cornerstone of diabetes mellitus (DM) treatment. Although self-regulation using capillary glycemia (SRCG) remains the best procedure in clinical practice, continuous glucose monitoring (CGM) systems offer the possibility of continuous and dynamic assessment of interstitial glucose concentration. CGM systems have the potential to improve glycemic control while decreasing the incidence of hypoglycemia, but their efficacy compared with SRCG is still debated. CGM systems have the greatest potential value in patients with hypoglycemia unawareness and in controlling daily fluctuations in blood glucose. The implementation of continuous monitoring in the standard clinical setting has not yet been established, but a new generation of open- and closed-loop subcutaneous insulin infusion devices is emerging, making insulin treatment and glycemic control more reliable.
Abstract:
OBJECTIVES To assess the relationship of lifestyles and eating habits with the prevalence of overweight and obesity in a Spanish adult population. METHODS A population-based, cross-sectional study conducted on 2640 subjects older than 15 years in Cádiz (Spain). Surveys were conducted in subjects' homes to obtain lifestyle, eating habit, and anthropometric data. Logistic regression was used to study the association between lifestyle variables and overweight and obesity. RESULTS The prevalence of overweight and obesity in Cádiz is 37% and 17%, respectively; it is higher in males and increases with age. BMI has an inverse relationship with educational level (PR = 2.3, 1.57-2.38). The highest levels of obesity are associated with daily alcohol consumption (PR = 1.39, 1.29-1.50), greater television viewing, and sedentary pursuits (PR = 1.5, 1.07-1.24). A lower prevalence of obesity is observed among those who are physically active (10.9% vs 21.6%), with differences between sexes. Following a slimming diet is more frequent among the obese and among women, although women dedicate more hours than men to passive activities. Men show greater consumption of alcohol, high-energy foods and snacks. Overweight and obesity are associated with male sex (OR = 3.35, 2.75-4.07), high alcohol consumption (OR = 1.38, 1.03-1.86), watching television (OR = 1.52, 1.11-2.07), and foods such as bread and cereals (OR = 1.47, 1.13-1.91). Exercise is a protective factor (OR = 0.76, 0.63-0.98). CONCLUSIONS Lifestyle factors associated with overweight and obesity show different patterns in men and women, and it is necessary to understand them in order to identify areas for behavioural intervention in overweight and obese patients.
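The odds ratios quoted above are the exponentiated coefficients of a logistic regression of overweight/obesity on the lifestyle variables. A minimal sketch of how such a model yields ORs with 95% confidence intervals, assuming a hypothetical data frame loaded from `cadiz_survey.csv` with illustrative variable names (not those of the study):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-respondent table; variable names are illustrative only.
df = pd.read_csv("cadiz_survey.csv")

model = smf.logit(
    "overweight_or_obese ~ male + daily_alcohol + tv_hours + bread_cereals + exercise",
    data=df,
).fit()

# Exponentiated coefficients are odds ratios; exponentiated CI bounds are their 95% CIs.
ci = model.conf_int()
odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_low": np.exp(ci[0]),
    "CI_high": np.exp(ci[1]),
})
print(odds_ratios)
```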
Abstract:
We analyze the changes in nutritional parameters and adipocytokines after structured intermittent interruption of highly active antiretroviral treatment in patients with chronic HIV infection. Twenty-seven patients with chronic HIV infection (median CD4+ T cell count/μl: nadir, 394; at the beginning of the structured interruptions, 1041; HIV viral load: nadir, 41,521 copies/ml; at the beginning of the structured interruptions, <50 copies/ml; median time on previous treatment, 60 months) were evaluated during three cycles of intermittent interruption of therapy (8 weeks on/4 weeks off). CD4+ T cell count, HIV viral load, anthropometric measures, and serum concentrations of triglycerides, cholesterol, leptin, and tumor necrosis factor and its soluble receptors I and II were determined. After the three cycles of intermittent interruption of therapy, no significant differences in CD4+ T cell count/μl, viral load, or serum concentrations of cholesterol or triglycerides were found with respect to baseline values. A near-significant increase in fat mass (skinfold thickness: 121 mm at the end vs. 100 mm at the beginning, p = 0.100), combined with a significant increase in leptin concentration (1.5 vs. 4.7 ng/ml, p = 0.044) and a decrease in serum concentrations of soluble tumor necrosis factor receptors (TNFRI, 104 vs. 73 pg/ml, p = 0.022; TNFRII, 253 vs. 195 pg/ml, p = 0.098), was detected. Structured intermittent interruption of highly active antiretroviral treatment in patients with chronic HIV infection induces a valuable positive change in markers of lipid turnover and adipose tissue mass.
Abstract:
As a response to metabolic stress, obese critically-ill patients have the same risk of nutritional deficiency as the non-obese and can develop protein-energy malnutrition with accelerated loss of muscle mass. The primary aim of nutritional support in these patients should be to minimize loss of lean mass and accurately evaluate energy expenditure. However, routinely used formulae can overestimate calorie requirements if the patient's actual weight is used. Consequently, the use of adjusted or ideal weight is recommended with these formulae, although indirect calorimetry is the method of choice. Controversy surrounds the question of whether a strict nutritional support criterion, adjusted to the patient's requirements, should be applied or whether a certain degree of hyponutrition should be allowed. Current evidence suggests that hypocaloric nutrition can improve outcomes, partly due to a lower rate of infectious complications and better control of hyperglycemia. Therefore, hypocaloric and hyperproteic nutrition, whether enteral or parenteral, should be standard practice in the nutritional support of critically-ill obese patients when not contraindicated. Widely accepted recommendations consist of no more than 60-70% of requirements or administration of 11-14 kcal/kg current body weight/day or 22-25 kcal/kg ideal weight/day, with 2-2.5 g/kg ideal weight/day of protein. In a broad sense, hypocaloric-hyperprotein regimens can be considered specific to obese critically-ill patients, although the complications related to comorbidities in these patients may require other therapeutic possibilities to be considered, with specific nutrients for hyperglycemia, acute respiratory distress syndrome (ARDS) and sepsis. However, there are no prospective randomized trials with this type of nutrition in this specific population subgroup and the available data are drawn from the general population of critically-ill patients. Consequently, caution should be exercised when interpreting these data.
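The dosing ranges quoted above translate into simple arithmetic per patient. A minimal worked sketch for a hypothetical 120 kg patient with an ideal weight of 70 kg (the weights are illustrative; how ideal weight is estimated is not specified in the text):

```python
def hypocaloric_hyperproteic_targets(actual_weight_kg: float, ideal_weight_kg: float) -> dict:
    """Target ranges from the text: 11-14 kcal/kg actual weight/day or
    22-25 kcal/kg ideal weight/day, with 2-2.5 g protein/kg ideal weight/day."""
    return {
        "kcal_per_day_by_actual_weight": (11 * actual_weight_kg, 14 * actual_weight_kg),
        "kcal_per_day_by_ideal_weight": (22 * ideal_weight_kg, 25 * ideal_weight_kg),
        "protein_g_per_day": (2.0 * ideal_weight_kg, 2.5 * ideal_weight_kg),
    }

# Hypothetical example: 120 kg actual weight, 70 kg ideal weight
print(hypocaloric_hyperproteic_targets(120, 70))
# -> kcal 1320-1680 (actual weight) or 1540-1750 (ideal weight); protein 140-175 g/day
```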
Abstract:
BACKGROUND AND OBJECTIVES The prevalence of hyponutrition in hospitalized patients is very high, and it has been shown to be an important prognostic factor. Most admitted patients depend on hospital food to cover their nutritional demands, so it is important to assess the factors influencing their intake, which may be modified in order to improve it and prevent the consequences of inadequate feeding. Previous work has shown that temperature is one of the worst-rated characteristics of hospital dishes. The aim of this study was to assess the influence of temperature on patients' satisfaction and the amount eaten, depending on whether or not the food was served in isothermal trolleys keeping proper food temperature. MATERIAL AND METHODS We carried out satisfaction surveys of hospitalized patients on regular diets, served with or without isothermal trolleys. The following data were gathered: age, gender, weight, number of visits, mobility, autonomy, amount of orally taken medication, intake of out-of-hospital foods, rating of food temperature, presentation and smokiness, amount of food eaten, and reasons for not eating all the content of the tray. RESULTS Of the 363 surveys, 134 (37.96%) were of patients with isothermal trays and 229 (62.04%) of patients without them. Sixty percent of the patients reported having eaten less than the normal amount within the last week, the most frequent reason being decreased appetite. During lunch and dinner, 69.3% and 67.7%, respectively, ate half or less of the tray content, the main reasons being lack of appetite (42% at lunch and 40% at dinner), not liking the food (24.3% and 26.2%) or its taste (15.3% and 16.8%). Other less common reasons were the odor, the amount of food, nausea or vomiting, fatigue, and lack of autonomy. There were no significant differences in the amount eaten by gender, weight, number of visits, amount of medication, or level of physical activity. The food temperature was rated as adequate by 62% of the patients, the presentation by 95%, and smokiness by 85%. When comparing the patients served with or without isothermal trays, there were no differences in the baseline characteristics analyzed that might have had an influence on the amount eaten. Ninety percent of the patients with isothermal trolleys rated the food temperature as good, compared with 57.2% of the patients with conventional trolleys, the difference being statistically significant (P < 0.001). In addition, there were differences in the amount of food eaten between patients with and without isothermal trolleys: 41% and 27.7%, respectively, ate all the tray content, the difference being statistically significant (P = 0.007). There were no differences in smokiness or presentation ratings. CONCLUSIONS Most of the patients (60%) had decreased appetite during hospital admission. The percentage of hospitalized patients rating the food temperature as good is higher among patients served with isothermal trolleys. The amount of food eaten by the patients served with isothermal trolleys is significantly higher than in those without them.
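The comparison of the proportion of patients eating all the tray content (41% of 134 with isothermal trolleys vs. 27.7% of 229 without; P = 0.007) is a two-by-two comparison of proportions. The sketch below reconstructs approximate counts from those percentages and applies a chi-square test; the exact counts and test used by the authors are not stated, so this is only an illustration.

```python
from scipy.stats import chi2_contingency

# Approximate counts reconstructed from the reported percentages (illustrative only).
iso_all, iso_total = round(0.41 * 134), 134       # ~55 of 134 with isothermal trolleys
conv_all, conv_total = round(0.277 * 229), 229    # ~63 of 229 with conventional trolleys

table = [
    [iso_all, iso_total - iso_all],
    [conv_all, conv_total - conv_all],
]
chi2, p_value, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")  # roughly reproduces the reported P = 0.007
```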
Abstract:
CONTEXT: Cirrhosis after viral hepatitis has been identified as a risk factor for osteoporosis in men. However, in postmenopausal women, most studies have evaluated the effect of primary biliary cirrhosis, but little is known about the effect of viral cirrhosis on bone mass [bone mineral density (BMD)] and bone metabolism. OBJECTIVE: Our objective was to assess the effect of viral cirrhosis on BMD and bone metabolism in postmenopausal women. DESIGN: We conducted a cross-sectional descriptive study. SETTING AND PATIENTS: We studied 84 postmenopausal female outpatients with viral cirrhosis and 96 healthy postmenopausal women from the general community. BMD was measured by dual-energy x-ray absorptiometry at lumbar spine (LS) and femoral neck (FN). RESULTS: The percentage with osteoporosis did not significantly differ between patients (LS, 43.1%; FN, 32.2%) and controls (LS, 41.2%; FN, 29.4%), and there was no difference in BMD (z-score) between groups. Serum concentrations of soluble TNF receptors, estradiol, and osteoprotegerin (OPG) were significantly higher in patients vs. controls (P < 0.001, P < 0.05, and P < 0.05, respectively). No significant difference was observed in urinary deoxypyridinoline. Serum OPG levels were positively correlated with soluble TNF receptors (r = 0.35; P < 0.02) and deoxypyridinoline (r = 0.37; P < 0.05). CONCLUSIONS: This study shows that bone mass and bone resorption rates do not differ between postmenopausal women with viral cirrhosis and healthy postmenopausal controls and suggests that viral cirrhosis does not appear to increase the risk of osteoporosis in these women. High serum estradiol and OPG concentrations may contribute to preventing the bone loss associated with viral cirrhosis in postmenopausal women.
Abstract:
BACKGROUND. Higher levels of initial DNA damage and lower levels of radiation-induced apoptosis in peripheral blood lymphocytes have each been associated with an increased risk of developing late radiation-induced toxicity. It has recently been reported that these two predictive tests are inversely related. The aim of the present study was to investigate the combined role of both tests in relation to clinical radiation-induced toxicity in a set of breast cancer patients treated with high-dose hyperfractionated radical radiotherapy. METHODS. Peripheral blood lymphocytes were taken from 26 consecutive patients with locally advanced breast carcinoma treated with high-dose hyperfractionated radical radiotherapy. Acute and late cutaneous and subcutaneous toxicity was evaluated using the Radiation Therapy Oncology Group morbidity scoring schema. The mean follow-up of survivors (n = 13) was 197.23 months. Radiosensitivity of lymphocytes was quantified as the initial number of DNA double-strand breaks (DSB) induced per Gy and per DNA unit (200 Mbp). Radiation-induced apoptosis (RIA) at 1, 2 and 8 Gy was measured by flow cytometry using annexin V/propidium iodide. RESULTS. The mean DSB/Gy/DNA unit obtained was 1.70 ± 0.83 (range 0.63-4.08; median, 1.46). Radiation-induced apoptosis increased with radiation dose (median 12.36, 17.79 and 24.83 for 1, 2, and 8 Gy respectively). We observed that the "expected resistant patients" (DSB values lower than 1.78 DSB/Gy per 200 Mbp and RIA values over 9.58, 14.40 or 24.83 for 1, 2 and 8 Gy, respectively) were at low risk of suffering severe subcutaneous late toxicity (HR 0.223, 95% CI 0.073-0.678, P = 0.008; HR 0.206, 95% CI 0.063-0.677, P = 0.009; HR 0.239, 95% CI 0.062-0.929, P = 0.039, for RIA at 1, 2 and 8 Gy, respectively) in multivariate analysis. CONCLUSIONS. A radiation-resistant profile is proposed, in which patients presenting lower levels of initial DNA damage and higher levels of radiation-induced apoptosis were at low risk of suffering severe subcutaneous late toxicity after clinical treatment at high radiation doses in our series. However, given the small sample size, prospective studies with larger numbers of patients are needed to validate these results.
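The hazard ratios quoted above are of the kind produced by a Cox proportional hazards model relating the dichotomized predictive tests to time to severe late subcutaneous toxicity. A minimal sketch with the lifelines package, assuming a hypothetical per-patient table with illustrative column names (follow-up time, toxicity event indicator, and the binarized DSB/RIA "resistant profile"); the study's actual multivariate model likely included further covariates.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical per-patient table; column names are illustrative, not the study's.
#   months_followup   : time to severe late subcutaneous toxicity or censoring
#   severe_toxicity   : 1 if severe (RTOG grade >= 3) late subcutaneous toxicity, else 0
#   resistant_profile : 1 if DSB/Gy/200 Mbp < 1.78 and RIA above the cut-off, else 0
df = pd.read_csv("radiosensitivity_cohort.csv")

cph = CoxPHFitter()
cph.fit(
    df[["months_followup", "severe_toxicity", "resistant_profile"]],
    duration_col="months_followup",
    event_col="severe_toxicity",
)
cph.print_summary()  # exp(coef) for resistant_profile is the hazard ratio (reported HRs ~0.2)
```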