35 results for Phenotypic Maturation
Abstract:
Barley can be classified into three major agronomic types, based on its seasonal growth habit (SGH): spring, winter and alternative. Winter varieties require exposure to vernalization to promote subsequent flowering and are autumn-sown. Spring varieties proceed to flowering in the absence of vernalization and are sown in the spring. The ‘alternative’ (also known as ‘facultative’) SGH is only loosely defined and can be sown in autumn or spring. Here, we investigate the molecular genetic basis of alternative barley. Analysis of the major barley vernalization (VRN-H1, VRN-H2) and photoperiod (PPD-H1, PPD-H2) response genes in a collection of 386 varieties found alternative SGH to be characterized by specific allelic combinations. Spring varieties possessed spring alleles at one or both of the vernalization response loci, combined with long-day non-responsive ppd-H1 alleles and wild-type alleles at the short-day photoperiod response locus, PPD-H2. Winter varieties possessed winter alleles at both vernalization loci, in combination with the mutant ppd-H2 allele conferring delayed flowering under short-day photoperiods. In contrast, all alternative varieties investigated possessed a single spring allele (either at VRN-H1 or at VRN-H2) combined with mutant ppd-H2 alleles. This allelic combination is found only in alternative types and is diagnostic for alternative SGH in the collection studied. Analysis of flowering time under controlled-environment conditions found that alternative varieties flowered later than spring control lines, with the difference most pronounced under short-day photoperiods. This work provides genetic characterization of the alternative SGH phenotype, allowing precise manipulation of SGH and flowering time within breeding programmes, and provides the molecular tools for classification of all three SGH categories within national variety registration processes.
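The diagnostic allelic combinations described above can be expressed as a simple decision rule. The following sketch is illustrative only: the `classify_sgh` helper and its allele-state encoding are hypothetical (the ppd-H1 locus is omitted for brevity), and it encodes only the combinations reported for this 386-variety collection.

```python
# Hypothetical sketch of the SGH classification rules reported in the
# abstract; function name and allele encodings are illustrative.

def classify_sgh(vrn_h1, vrn_h2, ppd_h2):
    """Return 'spring', 'winter' or 'alternative' from allele states.

    vrn_h1, vrn_h2 -- 'spring' or 'winter' allele at each vernalization locus
    ppd_h2         -- 'wild-type' or 'mutant' at the short-day photoperiod locus
    """
    n_spring = [vrn_h1, vrn_h2].count("spring")
    if ppd_h2 == "wild-type" and n_spring >= 1:
        return "spring"        # spring allele(s) + wild-type PPD-H2
    if ppd_h2 == "mutant" and n_spring == 0:
        return "winter"        # winter alleles at both VRN loci + mutant ppd-H2
    if ppd_h2 == "mutant" and n_spring == 1:
        return "alternative"   # single spring allele + mutant ppd-H2
    return "unclassified"      # combination not diagnostic in this collection
```

Varieties carrying combinations outside the three reported patterns fall through to "unclassified" rather than being forced into a category.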
Abstract:
Poor wheat seed quality in temperate regions is often ascribed to wet production environments. We investigated the possible effect of simulated rain during seed development and maturation on seed longevity in wheat (Triticum aestivum L.) cv. Tybalt grown in the field (2008, 2009) or a polythene tunnel house (2010). To mimic rain, the seed crops were wetted from above with the equivalent of 30 mm (2008, 2009) or 25 mm of rainfall (2010) at different stages of seed development and maturation (17 to 58 DAA, days after 50% anthesis), samples harvested serially, and subsequent air-dry seed longevity estimated. No pre-harvest sprouting occurred. Seed longevity (p50, 50% survival period in experimental hermetic storage at 40 °C with c. 15% moisture content) in field-grown controls increased during seed development and maturation, attaining maxima at 37 (2008) or 44 DAA (2009); it declined thereafter. Immediate effects of simulated rain at 17-58 DAA in field studies (2008, 2009) on subsequent seed longevity were negative but small, e.g. a 1-4 d delay in seed quality improvement for treatments early in development but with no damage detected at final harvests. In rainfall-protected conditions (2010), simulated rain close to harvest maturity (55-56 DAA) reduced longevity immediately and substantially, with greater damage from two sequential days of wetting than one; again, later harvests provided evidence of recovery in subsequent longevity. In the absence of pre-harvest sprouting, the potentially deleterious effects of rainfall to wheat seed crops on subsequent seed longevity may be reversible in full or in part.
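The longevity measure p50 above is the storage period at which seed viability falls to 50%. Formal analyses typically fit a probit viability curve to the survival data; as a minimal sketch under that simplifying assumption, p50 can be approximated by linear interpolation between successive survival assays. The function name and the survival series below are hypothetical, not data from the study.

```python
def estimate_p50(days, survival):
    """Estimate p50 (days of hermetic storage until 50% survival) by
    linear interpolation between successive assays.

    days     -- storage periods (d), ascending
    survival -- percentage of seeds viable at each period
    """
    pairs = list(zip(days, survival))
    for (d0, s0), (d1, s1) in zip(pairs, pairs[1:]):
        if s0 >= 50 >= s1:  # survival crosses 50% within this interval
            return d0 + (s0 - 50) * (d1 - d0) / (s0 - s1)
    return None  # viability never fell to 50% over the assayed periods

# Hypothetical survival series for seed stored at 40 °C, c. 15% moisture:
p50 = estimate_p50([0, 10, 20, 30], [98, 80, 40, 10])  # -> 17.5 d
```

A probit (seed viability equation) fit would weight all assay points rather than just the bracketing pair, but the interpolation conveys what p50 measures.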
Abstract:
Rates of phenotypic evolution vary widely in nature and these rates may often reflect the intensity of natural selection. Here we outline an approach for detecting exceptional shifts in the rate of phenotypic evolution across phylogenies. We introduce a simple new branch-specific metric ∆V/∆B that divides observed phenotypic change along a branch into two components: (1) that attributable to the background rate (∆B), and (2) that attributable to departures from the background rate (∆V). Where the amount of expected change derived from variation in the rate of morphological evolution exceeds twice that explained by the background rate (∆V/∆B > 2), we identify this as positive phenotypic selection. We apply our approach to six datasets, finding multiple instances of positive selection in each. Our results support the growing appreciation that the traditional gradual view of phenotypic evolution is rarely upheld, with a more episodic view taking its place. This moves focus away from viewing phenotypic evolution as a simple homogeneous process and facilitates reconciliation with macroevolutionary interpretations from a genetic perspective, paving the way to novel insights into the link between genotype and phenotype. The ability to detect positive selection when genetic data are unavailable or unobtainable represents an attractive prospect for extant species, and when applied to fossil data the approach can reveal patterns of natural selection in deep time that would otherwise be inaccessible.
Abstract:
The effects of simulated additional rain (ear wetting, 25 mm) or of rain shelter imposed at different periods after anthesis on grain quality at maturity and the dynamics of grain filling and desiccation were investigated in UK field-grown crops of wheat (Triticum aestivum L., cvar Tybalt) in 2011 and in 2012, when June–August rainfall was 255.0 and 214.6 mm, respectively, and above the decadal mean (157.4 mm). Grain filling and desiccation were quantified well by broken-stick regressions and Gompertz curves, respectively. Rain shelter for 56 (2011) or 70 d (2012) after anthesis, and to a lesser extent during late maturation only, resulted in more rapid desiccation and hence progress to harvest maturity, whereas ear wetting had negligible effects, even when applied four times. Grain-filling duration was affected similarly in 2011, but with no significant effect in 2012. In both years, there were strong positive associations between final grain dry weight and duration of filling. The treatments affected all grain quality traits in 2011: nitrogen (N) and sulphur (S) concentrations, N:S ratio, sodium dodecyl sulphate (SDS) sedimentation volume, Hagberg Falling Number (HFN), and the incidence of blackpoint. Only N concentration and blackpoint were affected significantly by treatments in 2012. Rain shelter throughout grain filling reduced N concentration; rain shelter also reduced the incidence of blackpoint, whereas ear wetting increased it. In 2011, rain shelter throughout reduced S concentration, increased N:S ratio and reduced SDS. Treatment effects on HFN were not consistent within or between years. Nevertheless, a comparison between the extreme treatment means in 2012 indicated that damage from late rain combined with ear wetting reduced HFN by c. 0.7 s per mm of August rainfall, whilst that between samples taken immediately after ear wetting at harvest maturity or 7 d later suggested recovery from damage to HFN upon re-drying in planta.
Hence, the incidence of blackpoint was the only grain quality trait affected consistently by the diverse treatments. The remaining aspects of grain quality were comparatively resilient to rain incident upon developing and maturing ears of cvar Tybalt. No consistent temporal patterns of sensitivity to shelter or ear wetting were detected for any aspect of grain quality.
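The abstract above models grain filling with broken-stick regressions and desiccation with Gompertz curves. These functional forms are standard; the minimal sketch below evaluates each with illustrative parameter values that are hypothetical, not fitted values from the study.

```python
import math

def broken_stick(t, rate, t_end):
    """Broken-stick model of grain filling: dry weight increases linearly
    at `rate` (mg/grain/d) until filling ends at t_end (DAA), then is
    constant thereafter."""
    return rate * min(t, t_end)

def gompertz(t, a, b, c):
    """Gompertz curve, here standing in for the progress of grain
    desiccation toward asymptote a: y = a * exp(-b * exp(-c * t))."""
    return a * math.exp(-b * math.exp(-c * t))

# Hypothetical parameters: filling at 2 mg/grain/d ending at 40 DAA,
# desiccation approaching 100% of its final extent.
w_mid = broken_stick(10, 2.0, 40)    # still filling
w_max = broken_stick(50, 2.0, 40)    # on the plateau
dry_late = gompertz(80, 100.0, 5.0, 0.1)
```

Faster desiccation under rain shelter corresponds to a larger rate constant c, pulling the Gompertz curve toward its asymptote earlier.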
Abstract:
The hereditary spastic paraplegias are a heterogeneous group of degenerative disorders that are clinically classified as either pure with predominant lower limb spasticity, or complex where spastic paraplegia is complicated with additional neurological features, and are inherited in autosomal dominant, autosomal recessive or X-linked patterns. Genetic defects have been identified in over 40 different genes, with more than 70 loci in total. Complex recessive spastic paraplegias have in the past been frequently associated with mutations in SPG11 (spatacsin), ZFYVE26/SPG15, SPG7 (paraplegin) and a handful of other rare genes, but many cases remain genetically undefined. The overlap with other neurodegenerative disorders has been implied in a small number of reports, but not in larger disease series. This deficiency has been largely due to the lack of suitable high throughput techniques to investigate the genetic basis of disease, but the recent availability of next generation sequencing can facilitate the identification of disease-causing mutations even in extremely heterogeneous disorders. We investigated a series of 97 index cases with complex spastic paraplegia referred to a tertiary referral neurology centre in London for diagnosis or management. The mean age of onset was 16 years (range 3 to 39). The SPG11 gene was first analysed, revealing homozygous or compound heterozygous mutations in 30/97 (30.9%) of probands, the largest SPG11 series reported to date, and by far the most common cause of complex spastic paraplegia in the UK, with severe and progressive clinical features and other neurological manifestations, linked with magnetic resonance imaging defects. Given the high frequency of SPG11 mutations, we studied the autophagic response to starvation in eight affected SPG11 cases and control fibroblast cell lines, but in our restricted study we did not observe correlations between disease status and autophagic or lysosomal markers.
In the remaining cases, next generation sequencing was carried out, revealing variants in a number of other known complex spastic paraplegia genes, including five in SPG7 (5/97), four in FA2H (also known as SPG35) (4/97) and two in ZFYVE26/SPG15. Variants were identified in genes usually associated with pure spastic paraplegia and also in the Parkinson’s disease-associated gene ATP13A2, the neuronal ceroid lipofuscinosis gene TPP1 and the hereditary motor and sensory neuropathy gene DNMT1, highlighting the genetic heterogeneity of spastic paraplegia. No plausible genetic cause was identified in 51% of probands, likely indicating the existence of as yet unidentified genes.