871 results for "Deficit hídrico" (water deficit)
Abstract:
Background Although there are many structural neuroimaging studies of attention-deficit/hyperactivity disorder (ADHD) in children, there are inconsistencies across studies and no consensus regarding which brain regions show the most robust area or volumetric reductions relative to control subjects. Our goal was to statistically analyze structural imaging data via a meta-analysis to help resolve these issues. Methods We searched the MEDLINE and PsycINFO databases through January 2005. Studies had to be written in English, use magnetic resonance imaging, and present the means and standard deviations of the regions assessed. Data were extracted by one of the authors and verified independently by another author. Results Analyses were performed using STATA with the metan, metabias, and metainf programs. A meta-analysis including all regions across all studies indicated global reductions for ADHD subjects compared with control subjects (standardized mean difference = .408, p < .001). Regions most frequently assessed and showing the largest differences included cerebellar regions, the splenium of the corpus callosum, total and right cerebral volume, and the right caudate. Several frontal regions assessed in only two studies also showed large significant differences. Conclusions This meta-analysis provides a quantitative analysis of neuroanatomical abnormalities in ADHD and information that can be used to guide future studies.
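The pooled effect reported above is a standardized mean difference combined across studies. As a rough illustration of the underlying arithmetic only (not the authors' STATA/metan workflow, and with invented study summaries rather than data from the meta-analysis), a fixed-effect, inverse-variance pooling could look like this:

```python
import math

# Hypothetical per-study summaries for one region:
# (n_adhd, mean_adhd, sd_adhd, n_control, mean_control, sd_control)
studies = [
    (30, 95.0, 10.0, 30, 100.0, 11.0),
    (25, 480.0, 40.0, 28, 500.0, 38.0),
]

def smd(n1, m1, s1, n2, m2, s2):
    """Cohen's d with pooled SD and its approximate sampling variance."""
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m2 - m1) / sp  # control minus ADHD, so a positive d means a reduction in ADHD
    var = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    return d, var

effects = [smd(*s) for s in studies]
weights = [1.0 / var for _, var in effects]
pooled = sum(w * d for (d, _), w in zip(effects, weights)) / sum(weights)
print(f"pooled SMD = {pooled:.3f}")
```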
Abstract:
Objective Diagnosing attention deficit hyperactivity disorder (ADHD) in adults is difficult when diagnosticians cannot establish an onset before the DSM-IV criterion of age 7 or if the number of symptoms recalled does not achieve DSM’s diagnosis threshold. Method The authors addressed the validity of DSM-IV’s age-at-onset and symptom threshold criteria by comparing four groups of adults: 127 subjects with full ADHD who met all DSM-IV criteria for childhood-onset ADHD, 79 subjects with late-onset ADHD who met all criteria except the age-at-onset criterion, 41 subjects with subthreshold ADHD who did not meet full symptom criteria for ADHD, and 123 subjects without ADHD who did not meet any criteria. The authors hypothesized that subjects with late-onset and subthreshold ADHD would show patterns of psychiatric comorbidity, functional impairment, and familial transmission similar to those seen in subjects with full ADHD. Results Subjects with late-onset and full ADHD had similar patterns of psychiatric comorbidity, functional impairment, and familial transmission. Most subjects with late-onset ADHD (83%) had an onset before age 12. Subthreshold ADHD was milder and showed a different pattern of familial transmission than the other forms of ADHD. Conclusions The data on the clinical features of probands and the pattern of transmission of ADHD among relatives provided little evidence for the validity of subthreshold ADHD among such subjects, who reported a lifetime history of some symptoms that never met DSM-IV’s threshold for diagnosis. In contrast, the results suggested that late-onset adult ADHD is valid and that DSM-IV’s age-at-onset criterion is too stringent.
Abstract:
Background Diagnosing attention-deficit/hyperactivity disorder (ADHD) in adults is difficult when the diagnostician cannot establish an onset prior to the DSM-IV criterion of age 7 or if the number of symptoms recalled does not achieve the DSM-IV threshold for diagnosis. Because neuropsychological deficits are associated with ADHD, we addressed the validity of the DSM-IV age at onset and symptom threshold criteria by using neuropsychological test scores as external validators. Methods We compared four groups of adults: 1) full ADHD subjects met all DSM-IV criteria for childhood-onset ADHD; 2) late-onset ADHD subjects met all criteria except the age at onset criterion; 3) subthreshold ADHD subjects did not meet full symptom criteria; and 4) non-ADHD subjects did not meet any of the above criteria. Results Late-onset and full ADHD subjects had similar patterns of neuropsychological dysfunction. By comparison, subthreshold ADHD subjects showed few neuropsychological differences from non-ADHD subjects. Conclusions Our results, showing similar neuropsychological underpinnings in subjects with late-onset and full ADHD, suggest that the DSM-IV age at onset criterion may be too stringent. Our data also suggest that ADHD subjects who never met the DSM-IV threshold for diagnosis have a milder form of the disorder.
Abstract:
Physiological and genetic studies of leaf growth often focus on short-term responses, leaving a gap to whole-plant models that predict biomass accumulation, transpiration and yield at crop scale. To bridge this gap, we developed a model that combines an existing model of leaf 6 expansion in response to short-term environmental variations with a model coordinating the development of all leaves of a plant. The latter was based on: (1) rates of leaf initiation, appearance and end of elongation measured in field experiments; and (2) the hypothesis that growth is independent between leaves. The resulting whole-plant leaf model was integrated into the generic crop model APSIM, which provided dynamic feedback of environmental conditions to the leaf model and allowed simulation of crop growth at canopy level. The model was tested in 12 field situations with contrasting temperature, evaporative demand and soil water status. In observed and simulated data, high evaporative demand reduced leaf area at the whole-plant level, and short water deficits affected only leaves developing during the stress, whether visible or still hidden in the whorl. The model adequately simulated whole-plant profiles of leaf area with a single set of parameters that applied to the same hybrid in all experiments. It was also suitable for predicting biomass accumulation and yield of a similar hybrid grown in different conditions. This model extends existing knowledge of the environmental controls of leaf elongation to field conditions, and can be used to simulate how their genetic controls flow through to yield.
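The coordination scheme described here, in which each leaf expands over its own developmental window and leaves are assumed to grow independently, can be sketched as a loop over leaf ranks driven by thermal time. Everything below (linear expansion, parameter values, the bell-shaped area profile) is an assumption for illustration, not the calibrated model or APSIM's implementation:

```python
# Illustrative whole-plant leaf area built from independently growing leaves.

def leaf_area(tt, start_tt, end_tt, max_area, water_deficit_factor=1.0):
    """Area (cm2) of one leaf at thermal time tt, assuming linear expansion."""
    if tt <= start_tt:
        return 0.0
    frac = min(1.0, (tt - start_tt) / (end_tt - start_tt))
    # A short water deficit only reduces leaves still expanding during the stress.
    return max_area * frac * water_deficit_factor

def plant_leaf_area(tt, n_leaves=16, phyllochron=50.0, duration=300.0):
    """Whole-plant green leaf area as the sum over leaves (growth independent between leaves)."""
    total = 0.0
    for rank in range(1, n_leaves + 1):
        start = rank * phyllochron  # successive leaves start at a constant thermal-time interval
        max_area = max(400.0 * (1 - abs(rank - 10) / 10.0), 50.0)  # bell-shaped profile along the stem
        total += leaf_area(tt, start, start + duration, max_area)
    return total

print(plant_leaf_area(600.0))  # whole-plant leaf area at 600 degree-days (illustrative units)
```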
Abstract:
Objective: Attention deficit hyperactivity disorder (ADHD) is a life-long condition, but because of its historical status as a self-remitting disorder of childhood, empirically validated and reliable methods for the assessment of adults are scarce. In this study, the validity and reliability of the Wender Utah Rating Scale (WURS) and the Adult Problem Questionnaire (APQ), which survey childhood and current symptoms of ADHD, respectively, were studied in a Finnish sample. Methods: The self-rating scales were administered to adults with an ADHD diagnosis (n = 38), healthy control participants (n = 41), and adults diagnosed with dyslexia (n = 37). Items of the self-rating scales were subjected to factor analyses, after which the reliability and discriminatory power of the subscales, derived from the factors, were examined. The effects of group and gender on the subscales of both rating scales were studied. Additionally, the effect of age on the subscales of the WURS was investigated. Finally, the diagnostic accuracy of the total scores was studied. Results: On the basis of the factor analyses, a four-factor structure for the WURS and a five-factor structure for the APQ had the best fit to the data. All of the subscales of the APQ and three of the WURS achieved sufficient reliability. The ADHD group had the highest scores on all of the subscales of the APQ, whereas two of the subscales of the WURS did not statistically differ between the ADHD and the dyslexia group. None of the subscales of the WURS or the APQ was associated with the participant's gender. However, one subscale of the WURS describing dysthymia was positively correlated with the participant's age. With the WURS, the probability of a correct positive classification was .59 in the current sample and .21 when the relatively low prevalence of adult ADHD was taken into account. The probabilities of correct positive classifications with the APQ were .71 and .23, respectively. Conclusions: The WURS and the APQ can provide accurate and reliable information on childhood and adult ADHD symptoms, given some important constraints. Classifications made on the basis of the total scores are reliable predictors of ADHD diagnosis only in populations with a high proportion of ADHD and a low proportion of other similar disorders. The subscale scores can provide detailed information on an individual's symptoms if the characteristics and limitations of each domain are taken into account. Improvements are suggested for two subscales of the WURS.
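The drop from .59 and .71 in the study sample to .21 and .23 in the population is the usual effect of prevalence on positive predictive value. A minimal sketch of that calculation follows; the sensitivity, specificity, and population prevalence are assumed values, since the abstract reports only the resulting probabilities:

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value (probability of a correct positive classification) via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

sens, spec = 0.80, 0.85            # assumed, for illustration only
print(ppv(sens, spec, 38 / 116))   # prevalence in the study sample (38 ADHD of 116 participants)
print(ppv(sens, spec, 0.04))       # with an assumed 4% population prevalence of adult ADHD
```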
Abstract:
Water regulations have decreased irrigation water supplies in Nebraska and some other areas of the US Great Plains. When available water is not enough to meet crop water requirements during the entire growing cycle, it becomes critical to know the proper irrigation timing that would maximize yields and profits. This study evaluated the effect of the timing of a deficit-irrigation allocation (150 mm) on crop evapotranspiration (ETc), yield, water use efficiency (WUE = yield/ETc), irrigation water use efficiency (IWUE = yield/irrigation), and dry mass (DM) of corn (Zea mays L.) irrigated with subsurface drip irrigation in the semiarid climate of North Platte, NE. During 2005 and 2006, a total of sixteen irrigation treatments (eight each year) were evaluated, which received different percentages of the water allocation during July, August, and September. During both years, all treatments resulted in no crop stress during the vegetative period and stress during the reproductive stages, which affected ETc, DM, yield, WUE, and IWUE. Among treatments, ETc varied by 7.2 and 18.8%; yield by 17 and 33%; WUE by 12 and 22%; and IWUE by 18 and 33% in 2005 and 2006, respectively. Yield and WUE both increased linearly with ETc and with ETc/ETp (ETp = seasonal ETc with no water stress), and WUE increased linearly with yield. The yield response factor (ky) averaged 1.50 over the two seasons. Irrigation timing affected the DM of the plant, grain, and cob, but not that of the stover. It also affected the percentage of DM partitioned to the grain (harvest index), which increased linearly with ETc and averaged 56.2% over the two seasons, but did not affect the percentage allocated to the cob or stover. Irrigation applied in July had the highest positive coefficient of determination (R2) with yield. This high positive correlation decreased considerably for irrigation applied in August, and became negative for irrigation applied in September. The best positive correlation between the soil water deficit factor (Ks) and yield occurred during weeks 12-14 after crop emergence, during the "milk" and "dough" growth stages. Yield was poorly correlated with stress during weeks 15 and 16, and the correlation became negative after week 17. Dividing the 150 mm allocation about evenly among July, August, and September was a good strategy, resulting in the highest yields in 2005, but not in 2006. Applying a larger proportion of the allocation in July was a good strategy during both years, whereas applying a large proportion of the allocation in September had the opposite effect. The different results obtained between years indicate that flexible irrigation scheduling techniques should be adopted rather than fixed timing strategies.
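The efficiency indices above are simple ratios, and the yield response factor relates relative yield loss to relative evapotranspiration deficit. A small illustrative calculation with invented numbers (not the study's data) shows how they are computed:

```python
# WUE = yield/ETc and IWUE = yield/irrigation, as defined in the abstract.
def wue(grain_yield_kg_ha, etc_mm):
    return grain_yield_kg_ha / etc_mm          # kg ha-1 per mm of crop evapotranspiration

def iwue(grain_yield_kg_ha, irrigation_mm):
    return grain_yield_kg_ha / irrigation_mm   # kg ha-1 per mm of irrigation applied

def ky(actual_yield, max_yield, actual_et, max_et):
    """Yield response factor: relative yield loss per unit of relative ET deficit."""
    return (1 - actual_yield / max_yield) / (1 - actual_et / max_et)

# Hypothetical season with the 150 mm allocation:
print(wue(10500, 620), iwue(10500, 150), ky(10500, 12500, 620, 700))
```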
Abstract:
Genotype-environment interactions (GEI) limit genetic gain for complex traits such as tolerance to drought. Characterization of the crop environment is an important step in understanding GEI. A modelling approach is proposed here to characterize drought-related environmental stresses both broadly (large geographic area, long-term period) and locally (field experiment), which enables breeders to analyse their experimental trials with regard to the broad population of environments that they target. Water-deficit patterns experienced by wheat crops were determined for drought-prone north-eastern Australia, using the APSIM crop model to account for the interactions of crops with their environment (e.g. feedback of plant growth on water depletion). Simulations based on more than 100 years of historical climate data were conducted for representative locations, soils, and management systems, for a check cultivar, Hartog. The three main environment types identified differed in their patterns of simulated water stress around flowering and during grain-filling. Over the entire region, the terminal drought-stress pattern was most common (50% of production environments), followed by a flowering stress (24%), although the frequencies of occurrence of the three types varied greatly across regions, years, and management. This environment classification was applied to 16 trials relevant to late-stage testing of a breeding programme. The incorporation of the independently determined environment types into a statistical analysis assisted interpretation of the GEI for yield among the 18 representative genotypes by reducing the relative effect of GEI compared with genotypic variance, and helped to identify opportunities to improve breeding and germplasm-testing strategies for this region.
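The environment types come from classifying simulated seasonal water-stress patterns around flowering and during grain filling. A toy, rule-based version of that idea is sketched below; the stress index, window sizes, and thresholds are assumptions for illustration, not the classification procedure used in the paper:

```python
# Assign an environment type from a simulated daily water-stress index
# (1.0 = no stress, 0.0 = complete stress).

def environment_type(stress, flowering_day, maturity_day, window=15):
    flowering = stress[flowering_day - window: flowering_day + window]
    grain_fill = stress[flowering_day + window: maturity_day]
    mean = lambda values: sum(values) / len(values)
    if mean(grain_fill) < 0.6:
        return "terminal drought"
    if mean(flowering) < 0.6:
        return "flowering stress"
    return "low stress"

# One illustrative simulated season in which stress develops during grain filling:
season = [1.0] * 80 + [0.9] * 20 + [0.5] * 40
print(environment_type(season, flowering_day=90, maturity_day=140))  # -> terminal drought
```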
Abstract:
Context: Identifying susceptibility genes for schizophrenia may be complicated by phenotypic heterogeneity, with some evidence suggesting that phenotypic heterogeneity reflects genetic heterogeneity. Objective: To evaluate the heritability and conduct genetic linkage analyses of empirically derived, clinically homogeneous schizophrenia subtypes. Design: Latent class and linkage analysis. Setting: Taiwanese field research centers. Participants: The latent class analysis included 1236 Han Chinese individuals with DSM-IV schizophrenia. These individuals were members of a large affected-sibling-pair sample of schizophrenia (606 ascertained families), original linkage analyses of which detected a maximum logarithm of odds (LOD) of 1.8 (z = 2.88) on chromosome 10q22.3. Main Outcome Measures: Multipoint exponential LOD scores by latent class assignment and parametric heterogeneity LOD scores. Results: Latent class analyses identified 4 classes, with 2 demonstrating familial aggregation. The first (LC2) described a group with severe negative symptoms, disorganization, and pronounced functional impairment, resembling “deficit schizophrenia.” The second (LC3) described a group with minimal functional impairment, mild or absent negative symptoms, and low disorganization. Using the negative/deficit subtype, we detected genome-wide significant linkage to 1q23-25 (LOD = 3.78, empiric genome-wide P = .01). This region was not detected using the DSM-IV schizophrenia diagnosis, but has been strongly implicated in schizophrenia pathogenesis by previous linkage and association studies. Variants in the 1q region may specifically increase risk for a negative/deficit schizophrenia subtype. Alternatively, these results may reflect increased familiality/heritability of the negative class, the presence of multiple 1q schizophrenia risk genes, or a pleiotropic 1q risk locus or loci, with stronger genotype-phenotype correlation with negative/deficit symptoms. Using the second familial latent class, we identified nominally significant linkage to the original 10q peak region. Conclusion: Genetic analyses of heritable, homogeneous phenotypes may improve the power of linkage and association studies of schizophrenia and thus have relevance to the design and analysis of genome-wide association studies.
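For readers unfamiliar with the statistic, a LOD score is the base-10 logarithm of a likelihood ratio comparing linkage at a given recombination fraction against free recombination. The sketch below is the textbook two-point, phase-known case, much simpler than the multipoint exponential and heterogeneity LOD scores used in the study:

```python
import math

def lod(recombinants, meioses, theta):
    """log10 likelihood ratio of recombination fraction theta versus theta = 0.5 (no linkage)."""
    non_recombinants = meioses - recombinants
    likelihood_theta = theta**recombinants * (1 - theta)**non_recombinants
    likelihood_null = 0.5**meioses
    return math.log10(likelihood_theta / likelihood_null)

# Illustrative: 2 recombinants among 20 informative meioses, evaluated at theta = 0.1.
print(round(lod(2, 20, 0.1), 2))
```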
Abstract:
Tank irrigation systems in the semiarid regions of India are discussed in this paper. To optimize the grain yield of rice, it is essential to start the agricultural operations in the second week of July so that favorable climatic conditions will prevail during the flowering and yield-formation stages. Because of low inflow during the initial few weeks of the crop season, farmers are often forced to delay planting until sufficient sowing rain and inflow have occurred, or to adopt deficit irrigation during this period. The delayed start reduces grain yield but leads to improved irrigation efficiency. A delayed start of agricultural operations with increased irrigation efficiency makes energy resources, particularly female labor and animal power, critical during the peak-requirement week. This necessitates augmenting these resources during the weeks of their peak use, either by reorganizing the traditional methods of cultivation or by importing them from outside the system.
Abstract:
Regulators and market participants have become increasingly concerned about the Spanish electricity tariff deficit because of its size and the difficulty of controlling its growth. The deficit can be traced to inefficiencies in market organization, and solutions should be designed to mitigate those inefficiencies. Tariff deficits have allowed part of the present costs of electricity services to be transferred to future consumers, but this situation has reached its limit, and a deep revision of regulation in this market cannot be postponed. In general, solutions that interfere with market prices and signals are not appropriate.
Abstract:
Changes in the frequency and amplitude of tidal flooding are regulating factors of mangrove forest dynamics. Such changes may be related to the rise in relative mean sea level (RMSL), which has been attributed to global climate change acting on coastal ecosystems. During the 1990s, the Núcleo de Estudos em Manguezais of the Universidade do Estado do Rio de Janeiro observed, in Guaratiba (RJ), a process of colonization of a hypersaline flat by mangrove species and began to monitor it. After six years, the data indicated consolidation of the colonization and, together with other studies developed in this system, the rise in RMSL was identified as the main cause of this process. The data generated by this monitoring, from 1998 to 2011, constitute the database of this dissertation, whose objective was to analyze the development of the colonization process and to examine its relationships with local meteorological factors. In 1998, six adjacent plots were laid out along the forest-to-hypersaline-flat direction, up to the limit where mangrove plants were observed. All individuals were identified with numbered tags and had their heights measured throughout the monitoring period, as did new individuals (recruits). When recruits were found in areas farther from the forest, new plots were added to the monitoring. By 2011, colonization had advanced 75 m from the forest, and the thirteen monitored plots indicated different successional stages: those closest to the forest show consolidation of the colonization process, with reduced density and greater structural development; intermediate plots, with high densities, are at a less advanced stage of colonization; and the outermost plots mark the beginning of colonization of the area by a few individuals. It was also noted that colonization proceeds through the establishment of cohorts, which show variability in structural development patterns over the years (density, distribution, and growth rate) related to distinct environmental conditions (climatological patterns). Regarding water availability (between 1985 and 2011), Guaratiba showed a predominance of water deficit, with a trend towards attenuation at the end of the period. Periods of greater water availability in the system coincided with the demarcation of new plots, as well as with recruitment peaks and high cohort growth rates. Thus, it was possible to observe that cohort development was related to the availability of space and light, in addition to responding to meteorological processes related to water availability in the region.
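The predominance of water deficit reported for Guaratiba rests on a climatological water-balance analysis. A minimal sketch of how such a deficit can be tracked month by month follows; the soil storage capacity and the monthly rainfall and evapotranspiration values are invented, and the dissertation's balance method may differ:

```python
# A month runs a deficit when potential evapotranspiration exceeds rainfall
# plus what can still be withdrawn from soil water storage.

def monthly_deficits(rain_mm, pet_mm, capacity_mm=100.0):
    storage, deficits = capacity_mm, []
    for rain, pet in zip(rain_mm, pet_mm):
        available = rain + storage
        if available >= pet:
            storage = min(capacity_mm, available - pet)
            deficits.append(0.0)
        else:
            deficits.append(pet - available)
            storage = 0.0
    return deficits

print(monthly_deficits([120, 80, 40, 10], [100, 110, 120, 130]))  # -> [0.0, 0.0, 10, 120]
```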