830 results for ANEMIA POR DEFICIT DE HIERRO


Relevance:

20.00%

Publisher:

Abstract:

Objective: Diagnosing attention deficit hyperactivity disorder (ADHD) in adults is difficult when diagnosticians cannot establish an onset before the DSM-IV criterion of age 7 or if the number of symptoms recalled does not reach DSM-IV's diagnostic threshold. Method: The authors addressed the validity of DSM-IV's age-at-onset and symptom threshold criteria by comparing four groups of adults: 127 subjects with full ADHD who met all DSM-IV criteria for childhood-onset ADHD, 79 subjects with late-onset ADHD who met all criteria except the age-at-onset criterion, 41 subjects with subthreshold ADHD who did not meet full symptom criteria for ADHD, and 123 subjects without ADHD who did not meet any criteria. The authors hypothesized that subjects with late-onset and subthreshold ADHD would show patterns of psychiatric comorbidity, functional impairment, and familial transmission similar to those seen in subjects with full ADHD. Results: Subjects with late-onset and full ADHD had similar patterns of psychiatric comorbidity, functional impairment, and familial transmission. Most subjects with late-onset ADHD (83%) had an onset before age 12. Subthreshold ADHD was milder and showed a different pattern of familial transmission than the other forms of ADHD. Conclusions: The clinical features of probands and the pattern of transmission of ADHD among relatives provided little evidence for the validity of subthreshold ADHD among subjects who reported a lifetime history of some symptoms that never met DSM-IV's threshold for diagnosis. In contrast, the results suggested that late-onset adult ADHD is valid and that DSM-IV's age-at-onset criterion is too stringent.

Relevance:

20.00%

Publisher:

Abstract:

Background: Diagnosing attention-deficit/hyperactivity disorder (ADHD) in adults is difficult when the diagnostician cannot establish an onset prior to the DSM-IV criterion of age 7 or if the number of symptoms recalled does not reach the DSM-IV threshold for diagnosis. Because neuropsychological deficits are associated with ADHD, we addressed the validity of the DSM-IV age-at-onset and symptom threshold criteria by using neuropsychological test scores as external validators. Methods: We compared four groups of adults: 1) full ADHD subjects met all DSM-IV criteria for childhood-onset ADHD; 2) late-onset ADHD subjects met all criteria except the age-at-onset criterion; 3) subthreshold ADHD subjects did not meet full symptom criteria; and 4) non-ADHD subjects did not meet any of the above criteria. Results: Late-onset and full ADHD subjects had similar patterns of neuropsychological dysfunction. By comparison, subthreshold ADHD subjects showed few neuropsychological differences from non-ADHD subjects. Conclusions: Our results showing similar neuropsychological underpinnings in subjects with full and late-onset ADHD suggest that the DSM-IV age-at-onset criterion may be too stringent. Our data also suggest that subjects whose symptoms never met the DSM-IV threshold for diagnosis have a milder form of the disorder.

Relevance:

20.00%

Publisher:

Abstract:

Physiological and genetic studies of leaf growth often focus on short-term responses, leaving a gap to whole-plant models that predict biomass accumulation, transpiration, and yield at crop scale. To bridge this gap, we developed a model that combines an existing model of the expansion of leaf 6 in response to short-term environmental variations with a model coordinating the development of all leaves of a plant. The latter was based on (1) rates of leaf initiation, appearance, and end of elongation measured in field experiments, and (2) the hypothesis that leaves grow independently of one another. The resulting whole-plant leaf model was integrated into the generic crop model APSIM, which provided dynamic feedback of environmental conditions to the leaf model and allowed simulation of crop growth at canopy level. The model was tested in 12 field situations with contrasting temperature, evaporative demand, and soil water status. In both observed and simulated data, high evaporative demand reduced leaf area at the whole-plant level, and short water deficits affected only leaves developing during the stress, whether visible or still hidden in the whorl. The model adequately simulated whole-plant profiles of leaf area with a single set of parameters that applied to the same hybrid in all experiments. It was also suitable for predicting biomass accumulation and yield of a similar hybrid grown in different conditions. This model extends existing knowledge of the environmental controls of leaf elongation to field conditions and can be used to simulate how their genetic controls flow through to yield.
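
The coordination model treats each leaf as developing independently on its own staggered schedule. As a rough illustration only (the class, parameters, and coupling below are hypothetical and do not correspond to actual APSIM interfaces), a sketch of how per-leaf expansion could be driven from a daily crop-model loop:

from dataclasses import dataclass

@dataclass
class Leaf:
    init_day: int        # day this leaf rank is initiated (staggered per rank)
    duration: int = 25   # days from initiation to end of elongation (assumed)
    area: float = 0.0    # accumulated leaf area, cm^2

    def grow(self, day: int, temp: float, water_stress: float) -> None:
        # Daily expansion driven by thermal time, scaled by a 0-1
        # water-stress factor supplied by the host crop model.
        if self.init_day <= day < self.init_day + self.duration:
            degree_days = max(0.0, temp - 8.0)   # base temperature 8 C (assumed)
            self.area += 0.5 * degree_days * water_stress

# Independence hypothesis: each leaf grows on its own schedule, and
# whole-plant leaf area is simply the sum over leaves.
plastochron = 3  # days between successive leaf initiations (assumed)
leaves = [Leaf(init_day=rank * plastochron) for rank in range(16)]
for day in range(120):
    temp, stress = 24.0, 1.0  # would be fed back daily by the crop model
    for leaf in leaves:
        leaf.grow(day, temp, stress)
print(f"whole-plant leaf area: {sum(l.area for l in leaves):.0f} cm^2")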

Relevance:

20.00%

Publisher:

Abstract:

Objective: Attention deficit hyperactivity disorder (ADHD) is a lifelong condition, but because of its historical status as a self-remitting disorder of childhood, empirically validated and reliable methods for the assessment of adults are scarce. In this study, the validity and reliability of the Wender Utah Rating Scale (WURS) and the Adult Problem Questionnaire (APQ), which survey childhood and current symptoms of ADHD, respectively, were studied in a Finnish sample. Methods: The self-rating scales were administered to adults with an ADHD diagnosis (n = 38), healthy control participants (n = 41), and adults diagnosed with dyslexia (n = 37). Items of the self-rating scales were subjected to factor analyses, after which the reliability and discriminatory power of the subscales derived from the factors were examined. The effects of group and gender on the subscales of both rating scales were studied, as was the effect of age on the subscales of the WURS. Finally, the diagnostic accuracy of the total scores was studied. Results: On the basis of the factor analyses, a four-factor structure for the WURS and a five-factor structure for the APQ had the best fit to the data. All of the subscales of the APQ and three of the WURS achieved sufficient reliability. The ADHD group had the highest scores on all of the subscales of the APQ, whereas two of the subscales of the WURS did not differ statistically between the ADHD and dyslexia groups. None of the subscales of the WURS or the APQ was associated with the participant's gender, but one subscale of the WURS describing dysthymia was positively correlated with the participant's age. With the WURS, the probability of a correct positive classification was .59 in the current sample and .21 when the relatively low prevalence of adult ADHD was taken into account. The probabilities of correct positive classifications with the APQ were .71 and .23, respectively. Conclusions: The WURS and the APQ can provide accurate and reliable information about childhood and adult ADHD symptoms, given some important constraints. Classifications made on the basis of the total scores are reliable predictors of an ADHD diagnosis only in populations with a high proportion of ADHD and a low proportion of other similar disorders. The subscale scores can provide detailed information about an individual's symptoms if the characteristics and limitations of each domain are taken into account. Improvements are suggested for two subscales of the WURS.
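
The drop in the probability of a correct positive classification once prevalence is considered follows directly from Bayes' rule: positive predictive value (PPV) falls with the base rate even at fixed sensitivity and specificity. A minimal sketch (the sensitivity, specificity, and prevalence figures below are illustrative assumptions, not values reported in the study):

def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Illustrative values only: a screen with 80% sensitivity and 85% specificity.
print(ppv(0.80, 0.85, 0.33))  # enriched sample (one third ADHD): PPV ~ 0.72
print(ppv(0.80, 0.85, 0.04))  # ~4% population prevalence: PPV ~ 0.18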

Relevance:

20.00%

Publisher:

Abstract:

Water regulations have decreased irrigation water supplies in Nebraska and other areas of the U.S. Great Plains. When the available water is not enough to meet crop water requirements over the entire growing cycle, it becomes critical to know the irrigation timing that maximizes yields and profits. This study evaluated the effect of the timing of a deficit-irrigation allocation (150 mm) on crop evapotranspiration (ETc), yield, water use efficiency (WUE = yield/ETc), irrigation water use efficiency (IWUE = yield/irrigation), and dry mass (DM) of corn (Zea mays L.) irrigated with subsurface drip irrigation in the semiarid climate of North Platte, NE. During 2005 and 2006, a total of sixteen irrigation treatments (eight each year) were evaluated, each receiving different percentages of the water allocation during July, August, and September. In both years, all treatments resulted in no crop stress during the vegetative period and stress during the reproductive stages, which affected ETc, DM, yield, WUE, and IWUE. Among treatments, ETc varied by 7.2 and 18.8%; yield by 17 and 33%; WUE by 12 and 22%; and IWUE by 18 and 33% in 2005 and 2006, respectively. Yield and WUE both increased linearly with ETc and with ETc/ETp (ETp = seasonal ETc with no water stress), and WUE increased linearly with yield. The yield response factor (ky) averaged 1.50 over the two seasons. Irrigation timing affected the DM of the plant, grain, and cob, but not that of the stover. It also affected the percentage of DM partitioned to the grain (harvest index), which increased linearly with ETc and averaged 56.2% over the two seasons, but it did not affect the percentage allocated to the cob or stover. Irrigation applied in July had the highest positive coefficient of determination (R2) with yield; this correlation decreased considerably for irrigation applied in August and became negative for irrigation applied in September. The best positive correlation between the soil water deficit factor (Ks) and yield occurred during weeks 12-14 after crop emergence, during the "milk" and "dough" growth stages. Yield was poorly correlated with stress during weeks 15 and 16, and the correlation became negative after week 17. Dividing the 150 mm allocation about evenly among July, August, and September was a good strategy that produced the highest yields in 2005, but not in 2006. Applying a larger proportion of the allocation in July was a good strategy in both years, whereas applying a large proportion of the allocation in September was not. The different results between years indicate that flexible irrigation scheduling techniques should be adopted rather than fixed timing strategies.
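
The yield response factor ky reported above conventionally links relative yield loss to relative evapotranspiration deficit (the FAO-33 formulation, 1 - Ya/Yp = ky * (1 - ETa/ETp)); a short sketch with illustrative inputs rather than the paper's data:

def relative_yield(eta_mm: float, etp_mm: float, ky: float = 1.50) -> float:
    """FAO-33 yield response: Ya/Yp = 1 - ky * (1 - ETa/ETp)."""
    return 1.0 - ky * (1.0 - eta_mm / etp_mm)

def wue(yield_kg_ha: float, etc_mm: float) -> float:
    return yield_kg_ha / etc_mm   # WUE = yield / ETc

def iwue(yield_kg_ha: float, irrigation_mm: float) -> float:
    return yield_kg_ha / irrigation_mm  # IWUE = yield / irrigation

# With ky = 1.50 (the two-season average above), a 10% ET deficit
# implies a ~15% yield reduction:
print(relative_yield(eta_mm=540.0, etp_mm=600.0))  # -> 0.85 of potential yield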

Relevance:

20.00%

Publisher:

Abstract:

Genotype-environment interactions (GEI) limit genetic gain for complex traits such as tolerance to drought. Characterization of the crop environment is an important step in understanding GEI. A modelling approach is proposed here to characterize drought-related environmental stresses both broadly (large geographic area, long-term period) and locally (field experiment), which enables breeders to analyse their experimental trials with regard to the broad population of environments that they target. Water-deficit patterns experienced by wheat crops were determined for drought-prone north-eastern Australia, using the APSIM crop model to account for the interactions of crops with their environment (e.g., the feedback of plant growth on water depletion). Simulations based on more than 100 years of historical climate data were conducted for representative locations, soils, and management systems, for a check cultivar, Hartog. The three main environment types identified differed in their patterns of simulated water stress around flowering and during grain filling. Over the entire region, the terminal drought-stress pattern was the most common (50% of production environments), followed by a flowering stress (24%), although the frequencies of occurrence of the three types varied greatly across regions, years, and management. This environment classification was applied to 16 trials relevant to late-stage testing in a breeding programme. Incorporating the independently determined environment types in a statistical analysis assisted interpretation of the GEI for yield among the 18 representative genotypes by reducing the relative effect of GEI compared with genotypic variance, and helped to identify opportunities to improve breeding and germplasm-testing strategies for this region.
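
The abstract does not spell out how the three environment types were derived from the simulations, so the following is only a generic sketch of the idea: cluster simulated water-stress trajectories (synthetic here) and report the frequency of each resulting type.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic stand-in for a simulated water-stress index (1 = no stress,
# lower = more stress) sampled at 10 stages from flowering to maturity,
# for 300 site-year-management combinations.
terminal = np.linspace(1.0, 0.3, 10) + rng.normal(0, 0.05, (100, 10))
flowering = np.concatenate([np.linspace(1.0, 0.5, 5),
                            np.linspace(0.6, 0.9, 5)]) + rng.normal(0, 0.05, (100, 10))
unstressed = np.full(10, 0.95) + rng.normal(0, 0.05, (100, 10))
trajectories = np.vstack([terminal, flowering, unstressed])

# Cluster trajectories into three environment types, then report how
# often each type occurs -- the "frequency of occurrence" in the text.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(trajectories)
for k in range(3):
    print(f"environment type {k}: {np.mean(labels == k):.0%} of simulated environments")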

Relevance:

20.00%

Publisher:

Abstract:

Context: Identifying susceptibility genes for schizophrenia may be complicated by phenotypic heterogeneity, with some evidence suggesting that phenotypic heterogeneity reflects genetic heterogeneity. Objective: To evaluate the heritability of, and conduct genetic linkage analyses on, empirically derived, clinically homogeneous schizophrenia subtypes. Design: Latent class and linkage analysis. Setting: Taiwanese field research centers. Participants: The latent class analysis included 1236 Han Chinese individuals with DSM-IV schizophrenia. These individuals were members of a large affected-sibling-pair sample of schizophrenia (606 ascertained families), original linkage analyses of which detected a maximum logarithm of odds (LOD) score of 1.8 (z = 2.88) on chromosome 10q22.3. Main Outcome Measures: Multipoint exponential LOD scores by latent class assignment and parametric heterogeneity LOD scores. Results: Latent class analyses identified 4 classes, 2 of which demonstrated familial aggregation. The first (LC2) described a group with severe negative symptoms, disorganization, and pronounced functional impairment, resembling “deficit schizophrenia.” The second (LC3) described a group with minimal functional impairment, mild or absent negative symptoms, and low disorganization. Using the negative/deficit subtype, we detected genome-wide significant linkage to 1q23-25 (LOD = 3.78, empirical genome-wide P = .01). This region was not detected using the DSM-IV schizophrenia diagnosis, but it has been strongly implicated in schizophrenia pathogenesis by previous linkage and association studies. Variants in the 1q region may specifically increase risk for a negative/deficit schizophrenia subtype. Alternatively, these results may reflect increased familiality/heritability of the negative class, the presence of multiple 1q schizophrenia risk genes, or a pleiotropic 1q risk locus or loci with a stronger genotype-phenotype correlation with negative/deficit symptoms. Using the second familial latent class, we identified nominally significant linkage to the original 10q peak region. Conclusion: Genetic analyses of heritable, homogeneous phenotypes may improve the power of linkage and association studies of schizophrenia and thus have relevance to the design and analysis of genome-wide association studies.
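
For reference, the LOD score used throughout is the standard base-10 log-likelihood ratio comparing linkage at recombination fraction \theta against free recombination:

\[ \mathrm{LOD}(\theta) = \log_{10}\frac{L(\theta)}{L(\theta = 1/2)} \]

so the reported peak of 3.78 means the marker data are about 10^{3.78} (roughly 6,000) times more likely under linkage than under independent assortment.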

Relevance:

20.00%

Publisher:

Abstract:

Tank irrigation systems in the semiarid regions of India are discussed in this paper. To optimize the grain yield of rice, it is essential to start agricultural operations in the second week of July so that favorable climatic conditions prevail during the flowering and yield-formation stages. Because of low inflow during the initial weeks of the crop season, farmers are often forced to delay planting until sufficient sowing rain and inflow have occurred, or to adopt deficit irrigation during this period. A delayed start reduces grain yield but leads to improved irrigation efficiency. However, a delayed start of agricultural operations with increased irrigation efficiency makes energy resources critical during the peak-requirement week, particularly female labor and animal power. This necessitates augmenting these resources during the weeks of their peak use, either by reorganizing traditional methods of cultivation or by importing them from outside the system.

Relevance:

20.00%

Publisher:

Abstract:

Germline mutations in many of the genes involved in homologous recombination (HR)-mediated DNA double-strand break repair (DSBR) are associated with various human genetic disorders and cancer. RAD51 and the RAD51 paralogs are important for HR and for the maintenance of genome stability. Despite the identification of five RAD51 paralogs over a decade ago, the molecular mechanism(s) by which RAD51 paralogs regulate HR and genome maintenance remain obscure. In addition to its known roles in the early and late stages of HR, RAD51C also contributes to activation of the checkpoint kinase CHK2. One recent study identifies biallelic mutations in RAD51C that lead to a Fanconi anemia-like disorder, whereas a second study reports monoallelic mutations in RAD51C associated with an increased risk of breast and ovarian cancer. Together, these reports establish RAD51C as a cancer susceptibility gene. In this review, we focus on the functions of RAD51C in HR and DNA damage signaling and on its role as a tumor suppressor, with an emphasis on the new roles of RAD51C unveiled by these reports.

Relevance:

20.00%

Publisher:

Abstract:

The current standard of care for hepatitis C virus (HCV) infection - combination therapy with pegylated interferon and ribavirin - elicits sustained responses in only ~50% of patients treated. No alternatives exist for patients who do not respond to combination therapy. The addition of ribavirin substantially improves response rates to interferon and lowers relapse rates following the cessation of therapy, suggesting that increasing ribavirin exposure may further improve treatment response. A key limitation, however, is the toxic side effect of ribavirin, hemolytic anemia, which often necessitates a reduction of ribavirin dosage and compromises treatment response. Maximizing treatment response thus requires striking a balance between the antiviral and hemolytic activities of ribavirin. Current models of viral kinetics describe the enhancement of treatment response due to ribavirin, but ribavirin-induced anemia remains poorly understood and precludes rational optimization of combination therapy. Here, we develop a new mathematical model of the population dynamics of erythrocytes that quantitatively describes ribavirin-induced anemia in HCV patients. Based on the assumption that ribavirin accumulation decreases erythrocyte lifespan in a dose-dependent manner, model predictions capture several independent experimental observations of the accumulation of ribavirin in erythrocytes and the resulting decline of hemoglobin in HCV patients undergoing combination therapy, estimate the reduced erythrocyte lifespan during therapy, and describe inter-patient variations in the severity of ribavirin-induced anemia. Further, model predictions estimate the threshold ribavirin exposure beyond which anemia becomes intolerable and suggest guidelines for the usage of hormones, such as erythropoietin, that stimulate erythrocyte production and avert the reduction of ribavirin dosage, thereby improving treatment response. Our model thus facilitates, in conjunction with models of viral kinetics, the rational identification of treatment protocols that maximize treatment response while curtailing side effects.
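
A minimal sketch of the modeling idea described above: erythrocytes are produced at a constant rate and cleared with a lifespan that shrinks as intracellular ribavirin accumulates, so steady-state red cell mass (and hence hemoglobin) falls with dose. The equations are simplified and every parameter value is invented for illustration; these are not the authors' equations or estimates.

def simulate_anemia(dose_mg: float, days: int = 120, dt: float = 0.1) -> float:
    """Toy erythrocyte population model under ribavirin.

    dE/dt = P - E / tau(C), with lifespan tau shrinking as the
    intracellular ribavirin concentration C rises toward the dose level.
    All parameters are illustrative assumptions.
    """
    P, tau0 = 1.0 / 120.0, 120.0      # production rate, baseline lifespan (days)
    k_acc, k_tox = 0.05, 0.002        # accumulation rate, toxicity coefficient
    E, C = 1.0, 0.0                   # normalized RBC mass, drug concentration
    for _ in range(int(days / dt)):
        C += dt * k_acc * (dose_mg - C)   # first-order accumulation toward dose
        tau = tau0 / (1.0 + k_tox * C)    # dose-dependent lifespan reduction
        E += dt * (P - E / tau)           # production minus clearance
    return E  # ~1.0 means no anemia; lower values mean hemoglobin decline

for dose in (0, 400, 800, 1200):
    print(f"dose {dose:>4} mg: relative RBC mass {simulate_anemia(dose):.2f}")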

Relevance:

20.00%

Publisher:

Abstract:

RAD51C, a RAD51 paralog, has been implicated in homologous recombination (HR), and germline mutations in RAD51C are known to cause a Fanconi anemia (FA)-like disorder and breast and ovarian cancers. The role of RAD51C in the FA pathway of DNA interstrand cross-link (ICL) repair and as a tumor suppressor is obscure. Here, we report that RAD51C deficiency leads to ICL sensitivity, chromatid-type errors, and G2/M accumulation, which are hallmarks of the FA phenotype. We find that RAD51C is dispensable for ICL unhooking and FANCD2 monoubiquitination but is essential for HR, confirming the downstream role of RAD51C in ICL repair. Furthermore, we demonstrate that RAD51C plays a vital role in the HR-mediated repair of DNA lesions associated with replication. Finally, we show that RAD51C participates in ICL- and double-strand break-induced DNA damage signaling and controls the intra-S-phase checkpoint through CHK2 activation. Our analyses with pathological mutants of RAD51C that were identified in FA and breast and ovarian cancers reveal that RAD51C regulates HR and DNA damage signaling distinctly. Together, these results unravel the critical role of RAD51C in the FA pathway of ICL repair and as a tumor suppressor.

Relevance:

20.00%

Publisher:

Abstract:

The iron spot-coffee pathosystem ("mancha de hierro") was studied at different altitudes under field conditions in order to describe its epidemics, determine the critical period, assess its effect on defoliation and coffee cherry yield, and evaluate the influence of climate on its development. Plots were selected on farms at 440 and 650 m a.s.l. in the Pacific region and at 850, 1050, and 1200 m a.s.l. in the north. In each plot, 150 branches distributed across three strata were numbered and tagged, and for 40 weeks data were collected on the number of nodes, nodes with fruit, and leaves, as well as incidence (%), severity (%), spore counts, temperature, relative humidity, and precipitation. Information was also collected on the technological level, agronomic management, and physical characteristics of each plot. The epidemics were then described and compared, both at the stratum level and at the ecosystem level, relating them in the latter case to the climatic variables, inoculum, defoliation, and yield. The critical period was defined by determining in which phase of the epidemics the highest r values (apparent infection rates) occurred. An epidemic importance index was also calculated to extend the comparisons. Iron spot is more aggressive in the north than in the Pacific region and develops faster when it starts late. It develops best in the upper stratum. Its epidemic cycle runs from May-June to March-April, and its critical period covers the first 2-4 weeks for incidence and the first 4-7 months for severity. During this period, the weekly accumulation of disease was 1-3% for incidence and 0.2-0.5% for severity. No categorical results could be obtained regarding the relationship between disease development and the climatic variables. The heaviest attacks on fruit occurred where there was less shade, and not necessarily where the highest leaf infection rate occurred. Any iron spot management system should be based on adequate fertilization and weed control in the coffee plantation, with chemical control applied according to the critical period and to the stratum where the disease develops most. Severity and incidence described the epidemics of the pathosystem equally well, but incidence is the more convenient measure for management decision-making.
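
The apparent infection rate r used above to define the critical period is conventionally Vanderplank's logit-based rate for polycyclic epidemics, computed between two disease assessments; a minimal sketch with illustrative values:

import math

def apparent_infection_rate(x1: float, x2: float, t1: float, t2: float) -> float:
    """Vanderplank's apparent infection rate:
    r = [logit(x2) - logit(x1)] / (t2 - t1), with x the disease proportion."""
    logit = lambda x: math.log(x / (1.0 - x))
    return (logit(x2) - logit(x1)) / (t2 - t1)

# Example: incidence rising from 2% to 5% over one week (proportions, days).
print(apparent_infection_rate(0.02, 0.05, 0, 7))  # ~0.135 per day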