992 results for Iron Age Iberia
Abstract:
Age is the main clinical determinant of large artery stiffness. Central arteries stiffen progressively with age, whereas peripheral muscular arteries change little with age. A number of clinical studies have analyzed the effects of age on aortic stiffness. The increase in central artery stiffness with age is responsible for earlier wave reflections and changes in pressure wave contours. The stiffening of the aorta and other central arteries is a potential risk factor for increased cardiovascular morbidity and mortality. Arterial stiffening with aging is accompanied by an elevation in systolic blood pressure (BP) and pulse pressure (PP). Although arterial stiffening with age is common, it has now been confirmed that older subjects with increased arterial stiffness and elevated PP have higher cardiovascular morbidity and mortality. The increase in aortic stiffness with age occurs gradually and continuously, similarly in men and women. Cross-sectional studies have shown that aortic and carotid stiffness (evaluated by pulse wave velocity) increase with age by approximately 10% to 15% over a period of 10 years. Women consistently have 5% to 10% lower stiffness than men of the same age. Although large artery stiffness increases with age independently of the presence of cardiovascular risk factors or other associated conditions, the extent of this increase may depend on several environmental or genetic factors. Hypertension may increase arterial stiffness, especially in older subjects. Among other cardiovascular risk factors, type 1 and type 2 diabetes accelerate arterial stiffening, whereas the roles of dyslipidemia and tobacco smoking are unclear. Arterial stiffness is also present in several cardiovascular and renal diseases. Patients with heart failure, end-stage renal disease, and those with atherosclerotic lesions often develop central artery stiffness. Decreased carotid distensibility, increased arterial thickness, and the presence of calcifications and plaques often coexist in the same subject. However, the relationships between these three alterations of the arterial wall remain to be explored.
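Pulse wave velocity, the stiffness measure referred to above, is simply the pressure-wave path length divided by the pulse transit time; a minimal sketch with hypothetical carotid-femoral measurements:

```python
def pulse_wave_velocity(path_length_m, transit_time_s):
    """Pulse wave velocity (m/s): path length divided by pulse transit time."""
    return path_length_m / transit_time_s

# Hypothetical measurement: 0.60 m carotid-femoral path, 75 ms transit time -> 8 m/s.
# Stiffer central arteries transmit the pulse faster and therefore give higher values.
print(f"{pulse_wave_velocity(0.60, 0.075):.1f} m/s")
```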
Abstract:
BACKGROUND: Iron deficiency is a common and undertreated problem in inflammatory bowel disease (IBD). AIM: To develop an online tool to support treatment choice at the patient-specific level. METHODS: Using the RAND/UCLA Appropriateness Method (RUAM), a European expert panel assessed the appropriateness of treatment regimens for a variety of clinical scenarios in patients with non-anaemic iron deficiency (NAID) and iron deficiency anaemia (IDA). Treatment options included adjustment of IBD medication only, oral iron supplementation, high-/low-dose intravenous (IV) regimens, IV iron plus erythropoietin-stimulating agent (ESA), and blood transfusion. The panel process consisted of two individual rating rounds (1148 treatment indications; 9-point scale) and three plenary discussion meetings. RESULTS: The panel reached agreement on 71% of treatment indications. 'No treatment' was never considered appropriate, and repeat treatment after previous failure was generally discouraged. For 98% of scenarios, at least one treatment was appropriate. Adjustment of IBD medication was deemed appropriate in all patients with active disease. Use of oral iron was mainly considered an option in NAID and mildly anaemic patients without disease activity. IV regimens were often judged appropriate, with high-dose IV iron being the preferred option in 77% of IDA scenarios. Blood transfusion and IV+ESA were indicated in exceptional cases only. CONCLUSIONS: The RUAM revealed high agreement amongst experts on the management of iron deficiency in patients with IBD. High-dose IV iron was more often considered appropriate than other options. To facilitate dissemination of the recommendations, panel outcomes were embedded in an online tool, accessible via http://ferroscope.com/.
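The abstract does not detail how the panel ratings were converted into appropriateness statements. The sketch below illustrates a RAND/UCLA-style classification using the commonly applied median and disagreement rules; the thresholds and the example ratings are assumptions, not values taken from the study:

```python
from statistics import median

def classify_appropriateness(ratings):
    """Classify one treatment indication from panel ratings on a 1-9 scale.

    Uses the conventional RAND/UCLA rules (assumed here): median 7-9 is
    appropriate, 4-6 uncertain, 1-3 inappropriate, with 'disagreement'
    when at least a third of panellists rate 1-3 and a third rate 7-9.
    """
    n = len(ratings)
    low = sum(1 for r in ratings if r <= 3)
    high = sum(1 for r in ratings if r >= 7)
    if low >= n / 3 and high >= n / 3:
        return "uncertain (disagreement)"
    med = median(ratings)
    if med >= 7:
        return "appropriate"
    return "uncertain" if med >= 4 else "inappropriate"

# Hypothetical nine-expert panel rating high-dose IV iron for one IDA scenario.
print(classify_appropriateness([7, 8, 9, 7, 8, 6, 9, 7, 8]))  # -> appropriate
```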
Abstract:
Bacteria often possess multiple siderophore-based iron uptake systems for scavenging this vital resource from their environment. However, some siderophores seem redundant, because they have limited iron-binding efficiency and are seldom expressed under iron limitation. Here, we investigate the conundrum of why selection does not eliminate this apparent redundancy. We focus on Pseudomonas aeruginosa, a bacterium that can produce two siderophores: the highly efficient but metabolically expensive pyoverdine, and the inefficient but metabolically cheap pyochelin. We found that the bacteria possess molecular mechanisms to phenotypically switch from mainly producing pyoverdine under severe iron limitation to mainly producing pyochelin when iron is only moderately limited. We further show that strains exclusively producing pyochelin grew significantly better than strains exclusively producing pyoverdine under moderate iron limitation, whereas the inverse was seen under severe iron limitation. This suggests that pyochelin is not redundant, but that switching between siderophore strategies might be beneficial to trade off the efficiencies and costs of the two siderophores. Indeed, simulations parameterized from our data confirmed that strains retaining the capacity to switch between siderophores significantly outcompeted strains defective for one or the other siderophore under fluctuating iron availabilities. Finally, we discuss how siderophore switching can be viewed as a form of collective decision-making, whereby a coordinated shift in behaviour at the group level emerges as a result of positive and negative feedback loops operating among individuals at the local scale.
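The competition simulations are only summarized above; as a toy illustration of why a switching strategy can outcompete either fixed strategy under fluctuating iron availability, the sketch below uses made-up growth increments rather than the authors' parameters:

```python
import random

# Assumed relative growth increments per round (illustrative only):
# pyoverdine pays off under severe iron limitation, pyochelin under moderate limitation.
GROWTH = {
    ("pyoverdine", "severe"): 0.9, ("pyoverdine", "moderate"): 0.7,
    ("pyochelin", "severe"): 0.4,  ("pyochelin", "moderate"): 1.0,
}

def final_size(strategy, environments):
    """Relative population size after a sequence of iron regimes."""
    size = 1.0
    for env in environments:
        if strategy == "switcher":
            siderophore = "pyoverdine" if env == "severe" else "pyochelin"
        else:
            siderophore = strategy  # fixed strategy: always the same siderophore
        size *= 1.0 + GROWTH[(siderophore, env)]
    return size

random.seed(1)
regimes = [random.choice(["severe", "moderate"]) for _ in range(20)]  # fluctuating iron
for s in ("pyoverdine", "pyochelin", "switcher"):
    print(f"{s}: {final_size(s, regimes):.1f}")
```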
Abstract:
Aging is ubiquitous to the human condition. The MRI correlates of healthy aging have been extensively investigated using a range of modalities, including volumetric MRI, quantitative MRI (qMRI), and diffusion tensor imaging. Despite this, reported brainstem-related changes remain sparse, in part because of the technical and methodological limitations in quantitatively assessing and statistically analyzing this region. By utilizing a new method of brainstem segmentation, a large cohort of 100 healthy adults was assessed in this study for the effects of aging within the human brainstem in vivo. Using qMRI, tensor-based morphometry (TBM), and voxel-based quantification (VBQ), the volumetric and quantitative changes across healthy adults between 19 and 75 years of age were characterized. In addition to the increased R2* in the substantia nigra, corresponding to increasing iron deposition with age, several novel findings were reported in the current study. These include selective volumetric loss of the brachium conjunctivum, with a corresponding decrease in magnetization transfer and increase in proton density (PD), accounting for the previously described "midbrain shrinkage." Additionally, we found increases in R1 and PD in several pontine and medullary structures. We consider these changes in the context of well-characterized, functional age-related changes, and propose potential biophysical mechanisms. This study provides a detailed quantitative analysis of the internal architecture of the brainstem and provides a baseline for further studies of neurodegenerative diseases that are characterized by early, pre-clinical involvement of the brainstem, such as Parkinson's and Alzheimer's diseases.
Abstract:
Vitamin B12 and iron deficiencies are common problems in general internal medicine consultations. They cause various symptoms that can be non-specific. From a clinical frame of reference, this article addresses the following questions: At what vitamin B12 value should we consider a "deficiency", and what is the role of methylmalonate? What is the role of oral vitamin B12 supplements? How should we interpret ferritin values? How should iron deficiency be investigated? What is the place of intravenous iron administration?
Abstract:
One of the most relevant concerns in long-term survivors of paediatric acute lymphoblastic leukaemia (ALL) is the development of neuropsychological sequelae. The majority of published studies report on patients treated with chemotherapy and prophylactic central nervous system (CNS) irradiation; little is known about the outcome of patients treated with chemotherapy-only regimens. Using the standardised clinical and neuropsychological instruments of the SPOG Late Effects Study, the intellectual performance of 132 paediatric ALL patients treated with chemotherapy only was compared with that of 100 control patients surviving diverse non-CNS solid tumours. As a group, ALL and solid tumour survivors showed normal and comparable intellectual performance (mean global IQ 104.6 in both groups). The percentage of patients in the borderline range (global IQ between 70 and 85) was comparable between groups and no higher than expected (10% of cases and 13% of controls, versus 16% expected). Only 2 (2%) of the former ALL patients and 1 (1%) of the solid tumour patients were in the range of mental retardation (global IQ < 70). Risk factors previously described in children treated with prophylactic CNS irradiation, such as younger age at ALL diagnosis and female gender, remained valid in chemotherapy-only treated patients. The abandonment of prophylactic CNS irradiation and its replacement by more intensive systemic and intrathecal chemotherapy led to a reduction, but not the disappearance, of late neuropsychological sequelae.
Abstract:
We study the lysis timing of a bacteriophage population by means of a continuously infection-age-structured population dynamics model. The model accounts for the infection process of bacteria, the natural death process, and the lysis process, that is, the replication of bacteriophage virions inside infected bacteria and the subsequent destruction of the host cells. We consider that the length of the lysis timing (or latent period) is distributed according to a general probability distribution function. We carried out an optimization procedure and found the latent period corresponding to the maximal fitness (i.e. maximal growth rate) of the bacteriophage population.
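The abstract does not give the model equations or the fitness functional; as a minimal sketch of the kind of optimization described, one can assume that burst size grows linearly with the latent period after an eclipse phase and maximize the resulting long-run growth rate numerically (all parameter values below are hypothetical):

```python
import math

# Assumed toy fitness model: with an eclipse period E before virion assembly and a
# maturation rate k, a latent period T gives burst size b(T) = k * (T - E); if each
# infection cycle lasts T time units, the long-run growth rate is ln(b(T)) / T.
E, k = 15.0, 5.0  # hypothetical eclipse period (min) and virions matured per min

def growth_rate(T):
    burst = k * (T - E)
    return -math.inf if burst <= 1.0 else math.log(burst) / T

# Grid search for the latent period maximizing fitness (growth rate).
candidates = [E + 0.1 * i for i in range(1, 1000)]
best_T = max(candidates, key=growth_rate)
print(f"optimal latent period ~ {best_T:.1f} min, growth rate {growth_rate(best_T):.3f} per min")
```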
Abstract:
In recent decades the percentage of energy derived from dietary fat has increased. The aim of this study was to explore the relationship between food taste preferences, BMI, age, gender and smoking habits. A computerized questionnaire using a hedonic scale (range 0 to 8) to quantify liking for sweet and savoury, lean and fat foods was completed by 233 adults: 171 normal-weight subjects (131 women, 40 men) and 62 overweight subjects (BMI > 25 kg/m2; 42 women, 20 men). The majority of subjects had a general preference for savoury lean food irrespective of their BMI or gender. Similarly, preference for sweet lean food was not influenced by the magnitude of the BMI. In contrast, overweight subjects had a preference for sweet fat food (p = 0.05) as well as for savoury fat food (p < 0.05). At any age or BMI, men preferred sweet fat food (p < 0.01); this was not the case for women. Overweight men over forty preferred savoury fat food, in contrast to overweight women of the same age (p < 0.01). The same difference existed between normal-weight smokers and non-smokers. This study demonstrates that preference for fat foods plays a potential role in the development of obesity.
Abstract:
Both late menarcheal age and low calcium intake (Ca intake) during growth are risk factors for osteoporosis, probably by impairing peak bone mass. We investigated whether lasting gain in areal bone mineral density (aBMD) in response to increased Ca intake varies according to menarcheal age and, conversely, whether Ca intake could influence menarcheal age. In an initial study, 144 prepubertal girls were randomized in a double-blind controlled trial to receive either a Ca supplement (Ca-suppl.) of 850 mg/d or placebo from age 7.9-8.9 yr. Mean aBMD gain determined by dual energy x-ray absorptiometry at six sites (radius metaphysis, radius diaphysis, femoral neck, trochanter, femoral diaphysis, and L2-L4) was significantly (P = 0.004) greater in the Ca-suppl. than in the placebo group (27 vs. 21 mg/cm(2)). In 122 girls followed up, menarcheal age was recorded, and aBMD was determined at 16.4 yr of age. Menarcheal age was lower in the Ca-suppl. than in the placebo group (P = 0.048). Menarcheal age and Ca intake were negatively correlated (r = -0.35; P < 0.001), as were aBMD gains from age 7.9-16.4 yr and menarcheal age at all skeletal sites (range: r = -0.41 to r = -0.22; P < 0.001 to P = 0.016). The positive effect of Ca-suppl. on the mean aBMD gain from baseline remained significantly greater in girls below, but not in those above, the median of menarcheal age (13.0 yr). Early menarcheal age (12.1 +/- 0.5 yr): placebo, 286 +/- 36 mg/cm(2); Ca-suppl., 317 +/- 46 (P = 0.009); late menarcheal age (13.9 +/- 0.5 yr): placebo, 284 +/- 58; Ca-suppl., 276 +/- 50 (P > 0.05). The level of Ca intake during prepuberty may influence the timing of menarche, which, in turn, could influence long-term bone mass gain in response to Ca supplementation. Thus, both determinants of early menarcheal age and high Ca intake may positively interact on bone mineral mass accrual.
Abstract:
Severity of urinary tract morbidity increases with the intensity and duration of Schistosoma haematobium infection. We assessed the ability of yearly drug therapy to control infection intensity and reduce S. haematobium-associated disease in children 5-21 years old in an endemic area of Kenya. In year 1, therapy resulted in reduced prevalence (66% to 22%, P < 0.001) and intensity of S. haematobium infection (20 to 2 eggs/10 mL urine), with corresponding reductions in the prevalence of hematuria (52% to 19%, P < 0.001). There was not, however, a significant first-year effect on the prevalence of urinary tract abnormalities detected by ultrasound. Repeat therapy in years 2 and 3 resulted in significant regression of hydronephrosis and bladder abnormalities (41% to 6% prevalence, P < 0.001), and further reductions in proteinuria. Repeat age-targeted therapy was associated with decreased prevalence of infection among young children (<5 yr) entering the target age group. Two years after discontinuation of therapy, the intensity of S. haematobium infection and ultrasound abnormalities remained suppressed, but hematuria prevalence began to increase (to 33% in 1989). Reinstitution of annual therapy in 1989 and 1990 reversed this trend. We conclude that annual oral therapy provides an effective strategy for control of morbidity due to S. haematobium on a population basis, both through regression of disease in treated individuals and through prevention of infection in untreated subjects.
Abstract:
Background and Aims: IL28B polymorphisms, interferon (IFN)-gamma inducible protein-10 (IP-10) levels and the homeostasis model assessment of insulin resistance (HOMA-IR) score have been reported to predict rapid (RVR) and sustained (SVR) virological response in chronic hepatitis C (CHC), but it is not known whether these factors represent independent, clinically useful predictors. The aim of the study was to assess factors (including IL28B polymorphisms, IP-10 levels and HOMA-IR score) independently predicting response to therapy in CHC under real-life conditions. Methods: Multivariate analysis of factors predicting RVR and SVR in 280 consecutive, treatment-naive CHC patients treated with pegylated IFN alpha and ribavirin in a prospective multicenter study. Results: Independent predictors of RVR were HCV RNA < 400,000 IU/ml (OR 11.37; 95% CI 3.03-42.6), rs12980275 AA (vs. AG/GG) (OR 7.09; 1.97-25.56) and IP-10 (OR 0.04; 0.003-0.56) in HCV genotype 1 patients, and lower baseline γ-glutamyl-transferase levels (OR 0.02; 0.0009-0.31) in HCV genotype 3 patients. Independent predictors of SVR were rs12980275 AA (OR 9.68; 3.44-27.18), age < 40 yrs (OR 4.79; 1.50-15.34) and HCV RNA < 400,000 IU/ml (OR 2.74; 1.03-7.27) in HCV genotype 1 patients, and rs12980275 AA (OR 6.26; 1.98-19.74) and age < 40 yrs (OR 5.37; 1.54-18.75) in the 88 HCV genotype 1 patients without an RVR. RVR was by itself predictive of SVR in HCV genotype 1 patients (32 of 33, 97%; OR 33.0; 4.06-268.32) and was the only independent predictor of SVR in HCV genotype 2 (OR 9.0; 1.72-46.99; p = 0.009) and genotype 3 patients (OR 7.8; 1.43-42.67; p = 0.01). Conclusions: In HCV genotype 1 patients, IL28B polymorphisms, HCV RNA load and IP-10 independently predict RVR. The combination of IL28B polymorphisms, HCV RNA level and age may yield more accurate pretreatment prediction of SVR. The HOMA-IR score is not associated with viral response.
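As an illustration of how such multivariate odds ratios translate into a pretreatment estimate, the sketch below combines the reported genotype 1 SVR odds ratios multiplicatively on the odds scale; the baseline probability is an assumption for illustration, since the abstract does not report the model intercept:

```python
# Reported adjusted odds ratios for SVR in HCV genotype 1 patients (from the abstract).
ODDS_RATIOS = {"rs12980275_AA": 9.68, "age_under_40": 4.79, "hcv_rna_under_400k": 2.74}
BASELINE_SVR_PROB = 0.20  # hypothetical SVR probability with none of the favourable factors

def predicted_svr(**factors):
    """Combine the odds ratios of the factors present with the assumed baseline odds."""
    odds = BASELINE_SVR_PROB / (1 - BASELINE_SVR_PROB)
    for name, present in factors.items():
        if present:
            odds *= ODDS_RATIOS[name]
    return odds / (1 + odds)

# Example: IL28B-favourable genotype, age < 40, but high baseline HCV RNA.
print(f"{predicted_svr(rs12980275_AA=True, age_under_40=True, hcv_rna_under_400k=False):.2f}")
```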
Abstract:
This paper is written in the context of our changing perception of the immunological system as a system with possible biological roles exceeding the prevailing view of a system concerned principally with defense against external pathogens. The view discussed here relates the immunological system inextricably to the metabolism of iron, the circulation of the blood and the resolution of the evolutionary paradox created by oxygen and iron. Indirect evidence for this inextricable relationship between the two systems can be derived from the discrepancy between the theoretical quasi-impossibility of the existence of an iron deficiency state in the adult and the reality of the WHO figures on the number of people in the world with iron deficiency anemia. With mounting evidence that TNF, IL-1, and T lymphocyte cytokines affect hemopoiesis and iron metabolism, it is possible that the reported discrepancy is a reflection of that inextricable interdependence between the two systems in the face of infection. Further direct evidence for a relationship between T cell subset numbers and iron metabolism is presented from the results of a study of T cell populations in patients with hereditary hemochromatosis. The recent finding of a correlation between low CD8+ lymphocyte numbers, liver damage associated with HCV positivity, and severity of iron overload in β-thalassemia major patients (unpublished data of R. W. Grandy, P. Giardina, M. Hilgartner) concludes this review.
Abstract:
This study investigated the direction of effects of temporal and downward social comparisons on self-rated health in very old age. Self-rated health can either reinforce or hinder comparison processes. In the framework of the Swiss Interdisciplinary Longitudinal Study on the Oldest Old, individuals aged 80 to 84 at baseline were interviewed and followed longitudinally for 5 years. Multilevel analyses were used to test the relative importance of temporal and social comparisons for self-rated health evaluations synchronically and diachronically (with a time lag of 12 to 18 months), as well as the direction of these relative influences. Results indicate that (a) at the synchronic level, continuity temporal comparisons had more impact on self-rated health than downward social comparisons; (b) both types of comparison had an independent and positive effect on self-rated health at the diachronic level; and (c) self-rated health had an independent synchronic effect on both types of comparison and an independent diachronic effect on temporal comparison.
Abstract:
A cohort of 100 eggs of Triatoma mazzottii Usinger was studied to obtain information on its life cycle. Egg incubation took 24 days; the mean duration of the 1st, 2nd, 3rd, 4th, and 5th instar nymphs was 27, 36, 39, 46 and 64 days respectively; the mean time from egg to adult was 236 days. The total duration of the nymphal stages was 212 days. Total nymphal mortality in the cohort was 16.3% and embryonic egg mortality was 14.0%. The greatest mortality occurred in the 2nd instar. The average number of eggs per female per week was 9.8 during 15 weeks of observation. Of the total eggs laid (2,514), only 58.7% hatched. Of the 72 insects that reached the adult stage, 38 were females (52.8%) and 34 were males (47.2%). The influence of age and feeding on the first mating of T. mazzottii was also studied. The first mating depended on the male's age and occurred on average 30 days after the final imaginal molt. Females could mate from the second day of imaginal life. Nutritional status did not play an important role in the insects' capacity for a first mating.
Abstract:
QUESTION UNDER STUDY: To assess which characteristics of high-risk acute coronary syndrome (ACS) patients played a role in prioritising access to the intensive care unit (ICU), and whether introducing clinical practice guidelines (CPG) explicitly stating ICU admission criteria altered this practice. PATIENTS AND METHODS: All consecutive patients with ACS admitted to our medical emergency centre over the 3 months before and after CPG implementation were prospectively assessed. The impact of demographic and clinical characteristics (age, gender, cardiovascular risk factors, and clinical parameters upon admission) on ICU hospitalisation of high-risk patients (defined as retrosternal pain of prolonged duration with ECG changes and/or a positive troponin blood level) was studied by logistic regression. RESULTS: Before and after CPG implementation, 328 and 364 patients, respectively, were assessed for suspected ACS. Before CPG implementation, 36 of the 81 high-risk patients (44.4%) were admitted to the ICU. After CPG implementation, 35 of the 90 high-risk patients (38.9%) were admitted to the ICU. Male patients were more frequently admitted to the ICU before CPG implementation (OR=7.45, 95% CI 2.10-26.44), but not after (OR=0.73, 95% CI 0.20-2.66). Age played a significant role in both periods (OR=1.57, 95% CI 1.24-1.99), with both young and advanced ages significantly reducing ICU admission, although to a lesser extent after CPG implementation. CONCLUSION: Prioritisation of access to the ICU for high-risk ACS patients was age-dependent, but focused on the cardiovascular risk factor profile. Implementation of CPG explicitly stating ICU admission criteria decreased discrimination against women, but other factors are likely to play a role in bed allocation.