1000 results for "Pruebas cutáneas" (skin tests)
Abstract:
Aim: The aim of the study was to investigate the influence of dietary intake of commercial hydrolyzed collagen (Gelatine Royal®) on bone remodeling in pre-pubertal children. Methods: A randomized double-blind study was carried out in 60 children (9.42 ± 1.31 years) divided into three groups according to the amount of partially hydrolyzed collagen taken daily for 4 months: placebo (G-I, n = 18), collagen (G-II, n = 20) and collagen + calcium (G-III, n = 22) groups. Analyses of the following biochemical markers were carried out: total and bone alkaline phosphatase (tALP and bALP), osteocalcin, tartrate-resistant acid phosphatase (TRAP), type I collagen carboxy-terminal telopeptide, lipids, calcium, 25-hydroxyvitamin D, insulin-like growth factor 1 (IGF-1), thyroid-stimulating hormone, free thyroxin and intact parathormone. Results: There was a significantly greater increase in serum IGF-1 in G-III than in G-II (p < 0.01) or G-I (p < 0.05) during the study period, and a significantly greater increase in plasma tALP in G-III than in G-I (p < 0.05). Serum bALP behavior significantly (p < 0.05) differed between G-II (increase) and G-I (decrease). Plasma TRAP behavior significantly differed between G-II and G-I (p < 0.01) and between G-III and G-II (p < 0.05). Conclusion: Daily dietary intake of hydrolyzed collagen seems to have a potential role in enhancing bone remodeling at key stages of growth and development.
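The group contrasts reported above compare 4-month changes in each marker across the three arms. The abstract does not name the specific test used; below is a minimal sketch of one reasonable approach (an independent-samples comparison of change scores), with entirely hypothetical IGF-1 values.

```python
# Sketch: compare 4-month IGF-1 change scores between two hypothetical groups.
# Group sizes follow the abstract; all values are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

g1_pre  = rng.normal(250, 40, 18)   # placebo (G-I), baseline IGF-1 (ng/mL)
g1_post = rng.normal(255, 40, 18)   # placebo, after 4 months
g3_pre  = rng.normal(250, 40, 22)   # collagen + calcium (G-III), baseline
g3_post = rng.normal(280, 40, 22)   # collagen + calcium, after 4 months

delta_g1 = g1_post - g1_pre
delta_g3 = g3_post - g3_pre

# One reasonable option: independent-samples t-test on the change scores
t, p = stats.ttest_ind(delta_g3, delta_g1)
print(f"t = {t:.2f}, p = {p:.3f}")
```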
Abstract:
Ceruloplasmin, also known as ferroxidase, belongs to the family of inflammation-sensitive proteins, and its main function is to transport copper in the blood. Beyond this transport function, numerous studies have attempted to use its serum concentration as a predictive indicator of cardiovascular risk in patients who are overweight or obese. The results of this study confirm the existence of a significant correlation between serum ceruloplasmin and the nutritional status of the subjects, which means that, for the population of students assessed, serum levels of this protein are an important predictor of the risk of cardiovascular disease.
Abstract:
INTRODUCTION Human host immune response following infection with the new variant of A/H1N1 pandemic influenza virus (nvH1N1) is poorly understood. Here we used systemic cytokine and antibody levels to evaluate differences in the early immune response of mild and severe patients infected with nvH1N1. METHODS We profiled 29 cytokines and chemokines and evaluated haemagglutination inhibition activity as quantitative and qualitative measurements of host immune responses in serum obtained during the first five days after symptom onset, in two cohorts of nvH1N1-infected patients. Severe patients required hospitalization (n = 20) due to respiratory insufficiency (10 of them were admitted to the intensive care unit), while mild patients had exclusively flu-like symptoms (n = 15). A group of healthy donors was included as a control (n = 15). Differences in levels of mediators between groups were assessed using the non-parametric Mann-Whitney U test. Association between variables was determined by calculating the Spearman correlation coefficient. Viral load was measured in serum using real-time PCR targeting the neuraminidase gene. RESULTS Increased levels of innate-immunity mediators (IP-10, MCP-1, MIP-1beta), and the absence of anti-nvH1N1 antibodies, characterized the early response to nvH1N1 infection in both hospitalized and mild patients. High systemic levels of type-II interferon (IFN-gamma) and also of a group of mediators involved in the development of T-helper 17 (IL-8, IL-9, IL-17, IL-6) and T-helper 1 (TNF-alpha, IL-15, IL-12p70) responses were found exclusively in hospitalized patients. IL-15, IL-12p70 and IL-6 constituted a hallmark of critical illness in our study. A significant inverse association was found between IL-6, IL-8 and PaO2 in critical patients. CONCLUSIONS While infection with nvH1N1 induces a typical innate response in both mild and severe patients, severe disease with respiratory involvement is characterized by early secretion of Th17 and Th1 cytokines usually associated with cell-mediated immunity but also commonly linked to the pathogenesis of autoimmune/inflammatory diseases. The exact role of Th1 and Th17 mediators in the evolution of mild and severe nvH1N1 disease merits further investigation as to the detrimental or beneficial role these cytokines play in severe illness.
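The between-group comparisons (Mann-Whitney U) and the mediator associations (Spearman correlation) described in the Methods map directly onto scipy routines. This is a minimal sketch with simulated cytokine and PaO2 values, not the study's data or code.

```python
# Sketch: Mann-Whitney U test between severe and mild patients, and Spearman
# correlation between IL-6 and PaO2. All values are simulated, not study data.
import numpy as np
from scipy.stats import mannwhitneyu, spearmanr

rng = np.random.default_rng(1)
il6_severe = rng.lognormal(mean=3.0, sigma=0.5, size=20)   # hospitalized (n = 20)
il6_mild   = rng.lognormal(mean=2.0, sigma=0.5, size=15)   # flu-like only (n = 15)

u, p = mannwhitneyu(il6_severe, il6_mild, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p:.4f}")

pao2_severe = rng.normal(70, 15, size=20)                  # hypothetical PaO2 (mmHg)
rho, p_rho = spearmanr(il6_severe, pao2_severe)
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.4f}")
```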
Abstract:
BACKGROUND Clinical predictors of fatal pulmonary embolism (PE) in patients with venous thromboembolism have never been studied. METHODS AND RESULTS Using data from the international prospective Registro Informatizado de la Enfermedad TromboEmbolica venosa (RIETE) registry of patients with objectively confirmed symptomatic acute venous thromboembolism, we determined independent predictive factors for fatal PE. Between March 2001 and July 2006, 15,520 consecutive patients (mean age ± SD, 66.3 ± 16.9 years; 49.7% men) with acute venous thromboembolism were included. Symptomatic deep-vein thrombosis without symptomatic PE was observed in 58.0% (n=9008) of patients, symptomatic nonmassive PE in 40.4% (n=6264), and symptomatic massive PE in 1.6% (n=248). At 3 months, the cumulative rates of overall mortality and fatal PE were 8.65% and 1.68%, respectively. On multivariable analysis, patients with symptomatic nonmassive PE at presentation exhibited a 5.42-fold higher risk of fatal PE compared with patients with deep-vein thrombosis without symptomatic PE (P<0.001). The risk of fatal PE was multiplied by 17.5 in patients presenting with symptomatic massive PE. Other clinical factors independently associated with an increased risk of fatal PE were immobilization for neurological disease, age >75 years, and cancer. CONCLUSIONS PE remains a potentially fatal disease. The clinical predictors identified in the present study should be included in any clinical risk stratification scheme to optimally adapt the treatment of PE to the risk of a fatal outcome.
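The subgroup percentages quoted in the Results follow directly from the reported patient counts; a quick arithmetic check:

```python
# Check the presentation subgroup proportions against the reported counts.
n_total = 15520
groups = {
    "DVT without symptomatic PE": 9008,
    "Symptomatic nonmassive PE": 6264,
    "Symptomatic massive PE": 248,
}
for name, n in groups.items():
    print(f"{name}: {n}/{n_total} = {100 * n / n_total:.1f}%")
# Prints 58.0%, 40.4% and 1.6%, matching the abstract.
```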
Abstract:
BACKGROUND This study was made possible by the collaboration of children and adolescents who had undergone resection of cerebellar tumors. The medulloblastoma group (CE+, n = 7) received radiation and chemotherapy in addition to surgery. The astrocytoma group (CE, n = 13) did not receive additional treatments. The executive functioning of each clinical group was compared with that of a matched control group (n = 12). The performances of the clinical groups relative to controls were compared considering the tumor's localization (vermis or hemisphere) and the involvement (or not) of the dentate nucleus. Executive variables were correlated with age at surgery, the time between surgery and evaluation, and the resected volume. METHODS Executive functioning was assessed by means of the WCST, the Rey Complex Figure, the Controlled Oral Word Association Test (letter and animal categories), Digit span (WISC-R verbal scale) and the Stroop test. These tests are very sensitive to dorsolateral PFC and/or medial frontal cortex functions. Scores for the non-verbal Raven IQ were also obtained. Raw scores were corrected for age and transformed into standard scores using normative data. The neuropsychological evaluation was made at 3.25 (SD = 2.74) years from surgery in the CE group and at 6.47 (SD = 2.77) years in the CE+ group. RESULTS The medulloblastoma group showed severe executive deficits (≥ 1.5 SD below the normal mean) on all assessed tests, the most severe occurring in vermal patients. The astrocytoma group also showed executive deficits in digit span and semantic fluency (animal category), and moderate to slight deficits on the Stroop (word and colour) tests. In the astrocytoma group, the tumor's localization and dentate involvement were associated with different profiles and levels of impairment: moderate for vermal and slight for hemispheric patients, respectively. The resected volume, age at surgery and the time between surgery and evaluation correlated with some neuropsychological executive variables. CONCLUSION Results suggest a differential prefrontal-like deficit due to cerebellar lesions and/or cerebello-frontal diaschisis, as indicated by the results in the astrocytoma group (without additional treatments), which can also be generated and/or increased by the treatments given to the medulloblastoma group. The need for differential rehabilitation strategies for specific clinical groups is highlighted. The results are also discussed in the context of the Cerebellar Cognitive Affective Syndrome.
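The Methods describe transforming age-corrected raw scores into standard scores against normative data and flagging deficits at 1.5 SD below the normative mean. A minimal sketch of that transformation, with invented normative values and a hypothetical raw score:

```python
# Sketch: convert a raw test score to a z-score against normative data and flag
# a severe executive deficit at 1.5 SD or more below the normative mean.
# The normative mean/SD and the raw score are invented for illustration.
def z_score(raw: float, norm_mean: float, norm_sd: float) -> float:
    return (raw - norm_mean) / norm_sd

raw_digit_span = 4.0            # hypothetical age-corrected raw score
norm_mean, norm_sd = 6.0, 1.2   # hypothetical age-matched norms

z = z_score(raw_digit_span, norm_mean, norm_sd)
severe_deficit = z <= -1.5
print(f"z = {z:.2f}, severe deficit: {severe_deficit}")
```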
Abstract:
Drug addiction is associated with impaired judgment in unstructured situations, in which success depends on self-regulation of behavior according to internal goals (adaptive decision-making). However, most executive measures are aimed at assessing decision-making in structured scenarios, in which success is determined by external criteria inherent to the situation (veridical decision-making). The aim of this study was to examine the performance of Substance Abusers (SA, n = 97) and Healthy Comparison participants (HC, n = 81) in two behavioral tasks that mimic the uncertainty inherent in real-life decision-making: the Cognitive Bias Task (CB) and the Iowa Gambling Task (IGT) (administered only to SA). A related goal was to study the interdependence between performances on both tasks. We conducted univariate analyses of variance (ANOVAs) to contrast the decision-making performance of both groups, and used correlation analyses to study the relationship between the two tasks. SA showed a marked context-independent decision-making strategy on the CB's adaptive condition, but no differences were found on the veridical conditions in a subsample of SA (n = 34) and HC (n = 22). A high percentage of SA (75%) also showed impaired performance on the IGT. The two tasks were correlated only when non-impaired participants were selected. Results indicate that SA show abnormal decision-making performance in unstructured situations, but not in veridical situations.
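The univariate ANOVAs and the task-interdependence correlation described above correspond to standard scipy calls; the sketch below uses simulated CB and IGT scores, not the study data.

```python
# Sketch: one-way ANOVA contrasting CB adaptive-condition scores of substance
# abusers (SA) vs healthy comparison (HC) participants, plus a correlation
# between CB and IGT scores. All data are simulated, not study data.
import numpy as np
from scipy.stats import f_oneway, pearsonr

rng = np.random.default_rng(2)
cb_sa = rng.normal(55, 10, 97)    # SA, n = 97
cb_hc = rng.normal(45, 10, 81)    # HC, n = 81

F, p = f_oneway(cb_sa, cb_hc)
print(f"ANOVA: F = {F:.2f}, p = {p:.4f}")

igt_sa = rng.normal(-5, 15, 97)   # hypothetical IGT net scores for SA
r, p_r = pearsonr(cb_sa, igt_sa)
print(f"Pearson r = {r:.2f}, p = {p_r:.4f}")
```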
Abstract:
BACKGROUND Extreme weight conditions (EWC) groups along a continuum may share some biological risk factors and intermediate neurocognitive phenotypes. A core cognitive trait in EWC appears to be executive dysfunction, with a focus on decision making, response inhibition and cognitive flexibility. Differences between individuals in these areas are likely to contribute to differences in vulnerability to EWC. The aim of the study was to investigate whether there is a common pattern of executive dysfunction in EWC by comparing anorexia nervosa patients (AN), obese subjects (OB) and healthy eating/weight controls (HC). METHODS Thirty-five AN patients, fifty-two OB and one hundred thirty-seven HC were compared using the Wisconsin Card Sorting Test (WCST), the Stroop Color and Word Test (SCWT), and the Iowa Gambling Task (IGT). All participants were female, aged between 18 and 60 years. RESULTS There was a significant difference in IGT score (F(1.79); p<.001), with the AN and OB groups showing the poorest performance compared to HC. On the WCST, AN and OB made significantly more errors than controls (F(25.73); p<.001) and had significantly fewer correct responses (F(2.71); p<.001). Post hoc analysis revealed that the two clinical groups were not significantly different from each other. Finally, OB showed significantly reduced performance in response inhibition, as measured with the Stroop test (F(5.11); p<.001), compared with both AN and HC. CONCLUSIONS These findings suggest that EWC subjects (namely AN and OB) have a similar dysfunctional executive profile that may play a role in the development and maintenance of such disorders.
Abstract:
BACKGROUND Preanalytical mistakes (PAMs) in samples usually lead to rejection upon arrival at the clinical laboratory. However, PAMs are not always detected and may result in clinical problems. Thus, PAMs should be minimized. We detected PAMs in samples from Primary Health Care Centres (PHCC) served by our central laboratory. The goal of this study was to describe the number and types of PAMs, and to suggest some strategies for improvement. METHODS The presence of PAMs, as sample rejection criteria, in samples submitted from PHCC to our laboratory during October and November 2007 was retrospectively analysed. RESULTS Overall, 3885 PAMs (7.4%) were detected among 52,669 samples submitted for blood analyses. These included missing samples (n=1763; 45.4% of all PAMs, 3.3% of all samples), haemolysed samples (n=1408; 36.2% and 2.7%, respectively), coagulated samples (n=391; 10% and 0.7%, respectively), incorrect sample volume (n=110; 2.8% and 0.2%, respectively), and others (n=213; 5.5% and 0.4%, respectively). For urine samples (n=18,852), 1567 of the samples were missing (8.3%). CONCLUSIONS We found the proportion of PAMs in blood and urine samples to be 3-fold higher than that reported in the literature. Therefore, strategies for improvement directed towards the staff involved, as well as an exhaustive audit of the preanalytical process, are needed. To attain this goal, we first implemented a continuing education programme, financed by our Regional Health Service and focused on Primary Care Nurses.
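The percentages in the Results are plain proportions of the PAM and sample counts given in the abstract; the arithmetic can be reproduced directly:

```python
# Reproduce the preanalytical-mistake (PAM) proportions from the counts
# given in the abstract (blood samples, October-November 2007).
n_blood = 52669
pams = {
    "missing sample": 1763,
    "haemolysed sample": 1408,
    "coagulated sample": 391,
    "incorrect sample volume": 110,
    "other": 213,
}
n_pams = sum(pams.values())   # 3885
print(f"Total PAMs: {n_pams} ({100 * n_pams / n_blood:.1f}% of blood samples)")
for kind, n in pams.items():
    print(f"  {kind}: {100 * n / n_pams:.1f}% of PAMs, {100 * n / n_blood:.1f}% of samples")

n_urine, n_urine_missing = 18852, 1567
print(f"Missing urine samples: {100 * n_urine_missing / n_urine:.1f}%")
```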
Abstract:
A 54-year-old woman presented with a peri-areolar nodule located in the skin of the right breast. Clinical examination showed a 6 x 5 cm exophytic, lobed, ulcerated, and bleeding nodule. The patient reported that the tumor had grown gradually over a period of 3 months. The patient had been diagnosed 8 years prior to presentation with infiltrating ductal carcinoma of the right breast (pT2N0). This tumor was treated with partial mastectomy (conservative surgery) and lymph node dissection, and the patient subsequently received 30 tangent-field radiotherapy sessions to the breast for a total dose of 45 Gy. The rest of her cutaneous examination was normal. There was no family history of any similar tumor.
Abstract:
The evaluation of sepsis severity is complicated by the highly variable and nonspecific nature of clinical signs and symptoms. We studied routinely used biomarkers together with clinical parameters to compare their prognostic value for severe sepsis and evaluate their usefulness.
Abstract:
Animal studies point to an implication of the endocannabinoid system in executive functions. In humans, several studies have suggested an association between acute or chronic use of exogenous cannabinoids (Δ9-tetrahydrocannabinol) and executive impairments. However, to date, no published reports establish the relationship between endocannabinoids, as biomarkers of the cannabinoid neurotransmission system, and executive functioning in humans. The aim of the present study was to explore the association between circulating plasma levels of the endocannabinoids N-arachidonoylethanolamine (AEA) and 2-arachidonoylglycerol (2-AG) and executive functions (decision making, response inhibition and cognitive flexibility) in healthy subjects. One hundred and fifty-seven subjects were included and assessed with the Wisconsin Card Sorting Test, the Stroop Color and Word Test, and the Iowa Gambling Task. All participants were female, aged between 18 and 60 years, and spoke Spanish as their first language. Results showed a negative correlation between 2-AG and cognitive flexibility performance (r = -.37; p<.05). A positive correlation was found between AEA concentrations and both cognitive flexibility (r = .59; p<.05) and decision-making performance (r = .23; p<.05). There was no significant correlation between either 2-AG (r = -.17) or AEA (r = -.08) concentrations and response inhibition. These results show, in humans, a relevant modulatory role of the endocannabinoid system in prefrontal-dependent cognitive functioning. The present study might have significant implications for the underlying executive alterations described in some psychiatric disorders currently associated with endocannabinoid dysregulation (namely drug abuse/dependence, depression, obesity and eating disorders). Understanding the neurobiology of their dysexecutive profile might contribute to the development of new treatments and pharmacological approaches.
Abstract:
An association between severe iodine deficiency and poor mental development has been found in many studies. We examined the relationship between moderate or mild iodine deficiency and intellectual capacity in order to determine whether problems common to severe iodine deficiency (including mental retardation) also emerge in a more subtle form. We also wished to know whether the classic methodology (comparing iodine-deficient zones with nondeficient zones) is the most adequate, and we propose combining this grouping by zones with the urinary iodine levels presented by individuals in each zone. We measured IQ, manipulative and verbal capacity, attention, visual motor ability and disruptive behaviour, variables that have barely been studied in this kind of investigation. The sample comprised 760 schoolchildren from the province of Jaén (southern Spain). Our results show that children with low levels of iodine intake and a urinary iodine concentration lower than 100 microg/litre had a lower IQ and displayed more disruptive behaviour than children with higher values on these criteria. The other variables were not associated with iodine deficiency.
Abstract:
Objectives. To study the utility of the Mini-Cog test for detection of patients with cognitive impairment (CI) in primary care (PC). Methods. We pooled data from two phase III studies conducted in Spain. Patients with complaints or suspicion of CI were consecutively recruited by PC physicians. The cognitive diagnosis was performed by an expert neurologist, after formal neuropsychological evaluation. The Mini-Cog score was calculated post hoc, and its diagnostic utility was evaluated and compared with the utility of the Mini-Mental State (MMS), the Clock Drawing Test (CDT), and the sum of the MMS and the CDT (MMS + CDT) using the area under the receiver operating characteristic curve (AUC). The best cut points were obtained on the basis of diagnostic accuracy (DA) and kappa index. Results. A total sample of 307 subjects (176 CI) was analyzed. The Mini-Cog displayed an AUC (±SE) of 0.78 ± 0.02, which was significantly inferior to the AUC of the CDT (0.84 ± 0.02), the MMS (0.84 ± 0.02), and the MMS + CDT (0.86 ± 0.02). The best cut point of the Mini-Cog was 1/2 (sensitivity 0.60, specificity 0.90, DA 0.73, and kappa index 0.48 ± 0.05). Conclusions. The utility of the Mini-Cog for detection of CI in PC was very modest, clearly inferior to the MMS or the CDT. These results do not permit recommendation of the Mini-Cog in PC.
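The diagnostic-utility metrics reported above (AUC, and sensitivity, specificity, diagnostic accuracy and kappa at a chosen cut point) can be computed with scikit-learn. The sketch below uses simulated Mini-Cog scores and diagnoses, not the pooled study data.

```python
# Sketch: AUC of a brief cognitive test plus sensitivity, specificity, accuracy
# and kappa at a chosen cut point. Scores and diagnoses are simulated here.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix, cohen_kappa_score

rng = np.random.default_rng(3)
n_ci, n_no_ci = 176, 131                     # group sizes as in the abstract
y_true = np.r_[np.ones(n_ci, dtype=int), np.zeros(n_no_ci, dtype=int)]
# Hypothetical Mini-Cog scores (0-5); lower scores suggest impairment.
scores = np.r_[rng.integers(0, 4, n_ci), rng.integers(2, 6, n_no_ci)]

auc = roc_auc_score(y_true, -scores)         # negate: low score = test positive
print(f"AUC = {auc:.2f}")

y_pred = (scores <= 1).astype(int)           # "1/2" cut point: 0-1 flagged as CI
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sens, spec = tp / (tp + fn), tn / (tn + fp)
accuracy = (tp + tn) / len(y_true)
kappa = cohen_kappa_score(y_true, y_pred)
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}, "
      f"accuracy {accuracy:.2f}, kappa {kappa:.2f}")
```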
Abstract:
There are scarce data about the importance of phylogroups and virulence factors (VF) in bloodstream infections (BSI) caused by extended-spectrum β-lactamase-producing Escherichia coli (ESBLEC). A prospective multicenter Spanish cohort including 191 cases of BSI due to ESBLEC was studied. Phylogroups and 25 VF genes were investigated by PCR. ESBLEC were classified into clusters according to their virulence profiles. The associations of phylogroups, VF, and clusters with epidemiological features were studied using multivariate analysis. Overall, 57.6%, 26.7%, and 15.7% of isolates belonged to the A/B1, D and B2 phylogroups, respectively. By multivariate analysis (adjusted OR [95% CI]), virulence cluster C2 was independently associated with a urinary tract source (5.05 [0.96-25.48]); cluster C4 with sources other than the urinary or biliary tract (2.89 [1.05-7.93]); and cluster C5 with BSI in non-predisposed patients (2.80 [0.99-7.93]). Isolates producing CTX-M-9 group ESBLs and belonging to phylogroup D predominated among cluster C2 and C5 isolates, while isolates producing CTX-M-1 group ESBLs and belonging to phylogroup B2 predominated among C4 isolates. These results suggest that host factors and previous antimicrobial use were more important than phylogroup or specific VF in the occurrence of BSI due to ESBLEC. However, some associations between virulence clusters and specific epidemiological features were found.
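The adjusted odds ratios with 95% CIs come from a multivariate model; the abstract does not specify it, but logistic regression is the usual choice. Below is a minimal statsmodels sketch with a hypothetical data frame and covariates, not the study's actual model.

```python
# Sketch: adjusted odds ratios with 95% CIs from a logistic regression, in the
# spirit of the multivariate analysis above. Data and covariates are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 191
df = pd.DataFrame({
    "urinary_source": rng.integers(0, 2, n),   # outcome: urinary tract source
    "cluster_C2": rng.integers(0, 2, n),       # virulence cluster membership
    "phylogroup_D": rng.integers(0, 2, n),
    "prior_antibiotics": rng.integers(0, 2, n),
})

model = smf.logit(
    "urinary_source ~ cluster_C2 + phylogroup_D + prior_antibiotics", data=df
).fit(disp=False)

odds_ratios = np.exp(model.params).rename("OR")
ci = np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([odds_ratios, ci], axis=1))
```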
Abstract:
Web application for the automatic grading of tests.