374 results for Patanelli, Matthew
Abstract:
Common variants at only two loci, FTO and MC4R, have been reproducibly associated with body mass index (BMI) in humans. To identify additional loci, we conducted meta-analysis of 15 genome-wide association studies for BMI (n > 32,000) and followed up top signals in 14 additional cohorts (n > 59,000). We strongly confirm FTO and MC4R and identify six additional loci (P < 5 × 10⁻⁸): TMEM18, KCTD15, GNPDA2, SH2B1, MTCH2 and NEGR1 (where a 45-kb deletion polymorphism is a candidate causal variant). Several of the likely causal genes are highly expressed or known to act in the central nervous system (CNS), emphasizing, as in rare monogenic forms of obesity, the role of the CNS in predisposition to obesity.
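The combination of cohorts described above is, in outline, a fixed-effect, inverse-variance-weighted meta-analysis of per-study effect estimates, with the genome-wide threshold P < 5 × 10⁻⁸ applied to the pooled statistic. The following is a minimal sketch of that combination for a single SNP, assuming per-study betas and standard errors are already harmonised to the same effect allele; the function name and the example numbers are illustrative, not taken from the study.

import numpy as np
from scipy.stats import norm

def fixed_effect_meta(betas, ses):
    """Inverse-variance-weighted fixed-effect meta-analysis for one SNP.

    betas, ses: per-study effect estimates and standard errors,
    assumed harmonised to the same effect allele.
    """
    betas = np.asarray(betas, dtype=float)
    weights = 1.0 / np.asarray(ses, dtype=float) ** 2     # inverse-variance weights
    beta_pooled = np.sum(weights * betas) / np.sum(weights)
    se_pooled = np.sqrt(1.0 / np.sum(weights))
    z = beta_pooled / se_pooled
    p = 2 * norm.sf(abs(z))                               # two-sided p-value
    return beta_pooled, se_pooled, p

# Example: three hypothetical cohorts contributing estimates for one SNP
beta, se, p = fixed_effect_meta([0.04, 0.06, 0.05], [0.01, 0.015, 0.012])
print(beta, se, p, p < 5e-8)    # last value: genome-wide significance check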
Abstract:
A cardinal property of neural stem cells (NSCs) is their ability to adopt multiple fates upon differentiation. The epigenome is widely seen as a read-out of cellular potential and a manifestation of this can be seen in embryonic stem cells (ESCs), where promoters of many lineage-specific regulators are marked by a bivalent epigenetic signature comprising trimethylation of both lysine 4 and lysine 27 of histone H3 (H3K4me3 and H3K27me3, respectively). Bivalency has subsequently emerged as a powerful epigenetic indicator of stem cell potential. Here, we have interrogated the epigenome during differentiation of ESC-derived NSCs to immature GABAergic interneurons. We show that developmental transitions are accompanied by loss of bivalency at many promoters in line with their increasing developmental restriction from pluripotent ESC through multipotent NSC to committed GABAergic interneuron. At the NSC stage, the promoters of genes encoding many transcriptional regulators required for differentiation of multiple neuronal subtypes and neural crest appear to be bivalent, consistent with the broad developmental potential of NSCs. Upon differentiation to GABAergic neurons, all non-GABAergic promoters resolve to H3K27me3 monovalency, whereas GABAergic promoters resolve to H3K4me3 monovalency or retain bivalency. Importantly, many of these epigenetic changes occur before any corresponding changes in gene expression. Intriguingly, another group of gene promoters gain bivalency as NSCs differentiate toward neurons, the majority of which are associated with functions connected with maturation and establishment and maintenance of connectivity. These data show that bivalency provides a dynamic epigenetic signature of developmental potential in both NSCs and in early neurons. Stem Cells 2013;31:1868-1880.
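Bivalency as used above is an operational call made from ChIP-seq data: a promoter is classed as bivalent when it carries both H3K4me3 and H3K27me3, and as monovalent when only one mark remains after differentiation. A minimal sketch of that classification follows, assuming boolean peak calls per promoter are already available; thresholds, peak calling and the paper's actual pipeline are not reproduced here.

def classify_promoter(has_h3k4me3: bool, has_h3k27me3: bool) -> str:
    """Classify a promoter from ChIP-seq peak calls for the two histone marks."""
    if has_h3k4me3 and has_h3k27me3:
        return "bivalent"
    if has_h3k4me3:
        return "H3K4me3 monovalent"    # typically associated with active or poised genes
    if has_h3k27me3:
        return "H3K27me3 monovalent"   # typically associated with repression
    return "unmarked"

# Example: a promoter retaining both marks at the NSC stage
print(classify_promoter(True, True))   # -> "bivalent"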
Abstract:
Many in vitro systems used to examine multipotential neural progenitor cells (NPCs) rely on mitogens including fibroblast growth factor 2 (FGF2) for their continued expansion. However, FGF2 has also been shown to alter the expression of transcription factors (TFs) that determine cell fate. Here, we report that NPCs from the embryonic telencephalon grown without FGF2 retain many of their in vivo characteristics, making them a good model for investigating molecular mechanisms involved in cell fate specification and differentiation. However, exposure of cortical NPCs to FGF2 results in a profound change in the types of neurons generated, switching them from a glutamatergic to a GABAergic phenotype. This change closely correlates with the dramatic upregulation of TFs more characteristic of ventral telencephalic NPCs. In addition, exposure of cortical NPCs to FGF2 maintains their neurogenic potential in vitro, and NPCs spontaneously undergo differentiation following FGF2 withdrawal. These results highlight the importance of TFs in determining the types of neurons generated by NPCs in vitro. In addition, they show that FGF2, as well as acting as a mitogen, changes the developmental capabilities of NPCs. These findings have implications for the cell fate specification of in vitro-expanded NPCs and their ability to generate specific cell types for therapeutic applications.
Abstract:
The emerging discipline of urban ecology is shifting focus from ecological processes embedded within cities to integrative studies of large urban areas as biophysical-social complexes. Yet this discipline lacks a theory. Results from the Baltimore Ecosystem Study, part of the Long Term Ecological Research Network, expose new assumptions and test existing assumptions about urban ecosystems. The findings suggest a broader range of structural and functional relationships than is often assumed for urban ecological systems. We address the relationships between social status and awareness of environmental problems, and between race and environmental hazard. We present patterns of species diversity, riparian function, and stream nitrate loading. In addition, we probe the suitability of land-use models, the diversity of soils, and the potential for urban carbon sequestration. Finally, we illustrate lags between social patterns and vegetation, the biogeochemistry of lawns, ecosystem nutrient retention, and social-biophysical feedbacks. These results suggest a framework for a theory of urban ecosystems.
Abstract:
This latest issue of the series of Typography papers opens with a beautifully illustrated article by the type designer Gerard Unger on ‘Romanesque’ letters. A further installment of Eric Kindel’s pathbreaking history of stencil letters is published in contributions by him, Fred Smeijers, and James Mosley. Maurice Göldner writes the first history of an early twentieth-century German typefounder, Brüder Butter. William Berkson and Peter Enneson recover the notion of ‘readability’ through a history of the collaboration between Matthew Luckiesh and the Linotype Company. Paul Luna discusses the role of pictures in dictionaries. Titus Nemeth describes a new form of Arabic type for metal composition. The whole gathering shows the remarkable variety and vitality of typography now.
Abstract:
Bayesian analysis is given of an instrumental variable model that allows for heteroscedasticity in both the structural equation and the instrument equation. Specifically, the approach for dealing with heteroscedastic errors in Geweke (1993) is extended to the Bayesian instrumental variable estimator outlined in Rossi et al. (2005). Heteroscedasticity is treated by modelling the variance for each error using a hierarchical prior that is Gamma distributed. The computation is carried out by using a Markov chain Monte Carlo sampling algorithm with an augmented draw for the heteroscedastic case. An example using real data illustrates the approach and shows that ignoring heteroscedasticity in the instrument equation when it exists may lead to biased estimates.
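In the Geweke (1993) treatment extended here, each error is scaled by an observation-specific variance factor with an inverse-gamma hierarchical prior (equivalently, a Gamma prior on the precision), and the Markov chain Monte Carlo sampler gains one extra conditional draw of those factors given the current residuals. The following is a minimal sketch of that augmented draw for a single equation, assuming the usual conditional λ_i | e_i ~ InvGamma((ν+1)/2, (ν + e_i²/σ²)/2); the variable names are illustrative and this is not the authors' code.

import numpy as np

rng = np.random.default_rng(0)

def draw_lambda(resid, sigma2, nu, rng):
    """One Gibbs update of observation-specific variance scales (Geweke 1993 style).

    Model sketch: e_i ~ N(0, sigma2 * lam_i), with nu/lam_i ~ chi^2_nu a priori,
    so lam_i | e_i ~ InvGamma((nu + 1)/2, (nu + e_i^2 / sigma2) / 2).
    """
    shape = (nu + 1.0) / 2.0
    scale = (nu + resid ** 2 / sigma2) / 2.0
    # Sample an inverse-gamma as the scale divided by a unit-scale gamma draw
    return scale / rng.gamma(shape, 1.0, size=resid.shape)

# Example with hypothetical structural-equation residuals
resid = rng.standard_normal(5)
print(draw_lambda(resid, sigma2=1.0, nu=5.0, rng=rng))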
Abstract:
Bank of England notes of £20 denomination have been studied using infrared spectroscopy in order to generate a method to identify forged notes. An aim of this work was to develop a non-destructive method so that a small, compact Fourier transform infrared (FT-IR) spectrometer could be used by bank workers, police departments or others such as shop assistants to identify forged notes in a non-lab setting. The ease of use of the instrument is the key to this method, as well as its relatively low cost. The presence of a peak at 1400 cm−1, arising from νasym () in the blank paper section of a forged note, proved to be a successful indicator that a note was a forgery for the notes that we studied. Moreover, differences between the spectra of forged and genuine £20 notes were observed in the ν(OH) (ca. 3500 cm−1), ν(CH) (ca. 2900 cm−1) and ν(CO) (ca. 1750 cm−1) regions of the IR spectrum recorded for the polymer film covering the holographic strip. In cases where these simple tests fail, we have shown how an infrared microscope can be used to further differentiate genuine and forged banknotes by producing infrared maps of selected areas of the note, contrasting inks with background paper.
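The screening test described for the blank paper area amounts to a presence/absence check for an absorption band near 1400 cm−1. Below is a minimal sketch of such a check, assuming the spectrum is available as arrays of wavenumber and absorbance; the window and threshold values are invented for illustration, and the paper's actual decision rule is more nuanced.

import numpy as np

def has_band(wavenumber, absorbance, centre=1400.0, window=25.0, min_height=0.05):
    """Return True if absorbance within +/- window cm^-1 of the band centre
    rises at least min_height above a crude baseline estimate."""
    wavenumber = np.asarray(wavenumber, dtype=float)
    absorbance = np.asarray(absorbance, dtype=float)
    in_band = np.abs(wavenumber - centre) <= window
    if not in_band.any():
        return False
    local_max = absorbance[in_band].max()
    baseline = np.median(absorbance)            # crude whole-spectrum baseline
    return (local_max - baseline) >= min_height

# Example with synthetic data: a Gaussian band centred at 1400 cm^-1
wn = np.linspace(600, 4000, 3401)
ab = 0.02 + 0.3 * np.exp(-((wn - 1400.0) ** 2) / (2 * 8.0 ** 2))
print(has_band(wn, ab))    # -> True, so this paper region would be flagged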
Abstract:
This article looks at the controversial music genre Oi! in relation to youth cultural identity in late 1970s’ and early 1980s’ Britain. As a form of British punk associated with skinheads, Oi! has often been dismissed as racist and bound up in the politics of the far right. It is argued here, however, that such a reading is too simplistic and ignores the more complex politics contained both within Oi! and the various youth cultural currents that revolved around the term ‘punk’ at this time. Taking as its starting point the Centre for Contemporary Cultural Studies’ conception of youth culture as a site of potential ‘resistance’, the article explores the substance and motifs of Oi!’s protest to locate its actual and perceived meaning within a far wider political and socio-economic context. More broadly, it seeks to demonstrate the value of historians examining youth culture as a formative and contested socio-cultural space within which young people discover, comprehend, and express their desires, opinions, and disaffections.
Abstract:
Final year research projects are an important part of undergraduate chemistry courses, allowing students to enhance transferable skills in teamworking, problem solving and presentations, at the same time as learning valuable practical skills. Several recent reports have highlighted the importance of research based studies as part of undergraduate courses. ‘We need to encourage universities to explore new models of curriculum. They should all incorporate research based study for undergraduates to cultivate awareness of research careers, to train students in research skills for employment, and to sustain the advantages of a research teaching connection,’ wrote Paul Ramsden from James Cook University, Australia, in a 2008 report for the UK’s Higher Education Academy.1 A 2010 report published by the Biopharma Skills Consortium – that promotes collaboration across the higher education sector in the area of biopharma – also stated that: ‘Companies seek recruits well placed to acclimatise quickly to the work environment. They are looking for recruits who can deploy a range of generic skills in the application of their knowledge.’2
Abstract:
Computational formalisms have been pushing the boundaries of the field of computing for the last 80 years and much debate has surrounded what computing entails; what it is, and what it is not. This paper seeks to explore the boundaries of the ideas of computation and provide a framework for enabling a constructive discussion of computational ideas. First, a review of computing is given, ranging from Turing Machines to interactive computing. Then, a variety of natural physical systems are considered for their computational qualities. From this exploration, a framework is presented under which all dynamical systems can be considered as instances of the class of abstract computational platforms. An abstract computational platform is defined by both its intrinsic dynamics and how it allows computation that is meaningful to an external agent through the configuration of constraints upon those dynamics. It is asserted that a platform’s computational expressiveness is directly related to the freedom with which constraints can be placed. Finally, the requirements for a formal constraint description language are considered and it is proposed that Abstract State Machines may provide a reasonable basis for such a language.
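One way to read "configuration of constraints upon those dynamics" concretely is a small object that pairs an intrinsic update rule with externally imposed constraints that clamp part of the state. The sketch below is an interpretation offered only for illustration, not a construct from the paper; the class and method names are invented.

from typing import Callable, Dict

class ComputationalPlatform:
    """A dynamical system plus externally configured constraints on its state."""

    def __init__(self, dynamics: Callable[[Dict[str, float]], Dict[str, float]]):
        self.dynamics = dynamics            # intrinsic dynamics: state -> next state
        self.constraints: Dict[str, float] = {}

    def constrain(self, variable: str, value: float) -> None:
        """An external agent fixes part of the state, encoding the 'input'."""
        self.constraints[variable] = value

    def step(self, state: Dict[str, float]) -> Dict[str, float]:
        state = {**state, **self.constraints}   # apply constraints before each update
        return self.dynamics(state)

# Example: a two-variable relaxation dynamics; clamping x makes y settle at 0.5 * x
platform = ComputationalPlatform(lambda s: {"x": s["x"], "y": 0.9 * s["y"] + 0.05 * s["x"]})
platform.constrain("x", 2.0)
state = {"x": 0.0, "y": 0.0}
for _ in range(200):
    state = platform.step(state)
print(round(state["y"], 3))    # approaches 1.0, i.e. half the clamped value of x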
Abstract:
This paper reports the results of a 2-year study of water quality in the River Enborne, a rural river in lowland England. Concentrations of nitrogen and phosphorus species and other chemical determinands were monitored both at high-frequency (hourly), using automated in situ instrumentation, and by manual weekly sampling and laboratory analysis. The catchment land use is largely agricultural, with a population density of 123 persons km−2. The river water is largely derived from calcareous groundwater, and there are high nitrogen and phosphorus concentrations. Agricultural fertiliser is the dominant source of annual loads of both nitrogen and phosphorus. However, the data show that sewage effluent discharges have a disproportionate effect on the river nitrogen and phosphorus dynamics. At least 38% of the catchment population use septic tank systems, but the effects are hard to quantify as only 6% are officially registered, and the characteristics of the others are unknown. Only 4% of the phosphorus input and 9% of the nitrogen input is exported from the catchment by the river, highlighting the importance of catchment process understanding in predicting nutrient concentrations. High-frequency monitoring will be a key to developing this vital process understanding.
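The export fractions quoted (for instance, 4% of the phosphorus input leaving via the river) rest on load calculations of the kind hourly monitoring supports: concentration times discharge, integrated over the year, then compared with the estimated catchment input. A minimal sketch of that arithmetic follows; all numbers in the example are invented and are not the study's data.

import numpy as np

def annual_load_kg(conc_mg_per_l, flow_m3_per_s, dt_s=3600.0):
    """Riverine load from paired concentration (mg/L) and discharge (m3/s) series.

    mg/L equals g/m3, so conc * flow gives g/s; integrate over each time step.
    """
    conc = np.asarray(conc_mg_per_l, dtype=float)
    flow = np.asarray(flow_m3_per_s, dtype=float)
    grams = np.sum(conc * flow * dt_s)
    return grams / 1000.0    # kilograms

# Illustrative hourly series for one year (values invented, not from the study)
hours = 24 * 365
conc = np.full(hours, 0.12)      # mg P / L
flow = np.full(hours, 0.8)       # m3 / s
load = annual_load_kg(conc, flow)
catchment_input = 75000.0        # kg P / yr, hypothetical input estimate
print(load, load / catchment_input)   # exported load and export fraction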
Abstract:
Considerable effort is presently being devoted to producing high-resolution sea surface temperature (SST) analyses with a goal of spatial grid resolutions as low as 1 km. Because grid resolution is not the same as feature resolution, a method is needed to objectively determine the resolution capability and accuracy of SST analysis products. Ocean model SST fields are used in this study as simulated “true” SST data and subsampled based on actual infrared and microwave satellite data coverage. The subsampled data are used to simulate sampling errors due to missing data. Two different SST analyses are considered and run using both the full and the subsampled model SST fields, with and without additional noise. The results are compared as a function of spatial scales of variability using wavenumber auto- and cross-spectral analysis. The spectral variance at high wavenumbers (smallest wavelengths) is shown to be attenuated relative to the true SST because of smoothing that is inherent to both analysis procedures. Comparisons of the two analyses (both having grid sizes of roughly ) show important differences. One analysis tends to reproduce small-scale features more accurately when the high-resolution data coverage is good but produces more spurious small-scale noise when the high-resolution data coverage is poor. Analysis procedures can thus generate small-scale features with and without data, but the small-scale features in an SST analysis may be just noise when high-resolution data are sparse. Users must therefore be skeptical of high-resolution SST products, especially in regions where high-resolution (~5 km) infrared satellite data are limited because of cloud cover.
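The resolution assessment hinges on wavenumber spectral analysis: transform the "true" and analysed SST fields, compare power as a function of wavenumber, and read attenuation at high wavenumbers as lost feature resolution. Below is a minimal one-dimensional sketch with synthetic fields; the study's actual two-dimensional auto- and cross-spectral machinery is more involved.

import numpy as np

def power_spectrum(field, dx_km):
    """One-sided power spectrum of a 1-D field sampled every dx_km kilometres."""
    field = np.asarray(field, dtype=float) - np.mean(field)
    spec = np.abs(np.fft.rfft(field)) ** 2
    k = np.fft.rfftfreq(field.size, d=dx_km)    # cycles per km
    return k, spec

# Synthetic "true" SST transect and a smoothed "analysis" of it
dx = 5.0                                        # km grid spacing
x = np.arange(0, 2000, dx)
rng = np.random.default_rng(1)
true_sst = np.cumsum(rng.standard_normal(x.size)) * 0.02        # red-noise-like field
analysis = np.convolve(true_sst, np.ones(9) / 9.0, mode="same")  # smoothing stands in for the analysis

k, p_true = power_spectrum(true_sst, dx)
_, p_anal = power_spectrum(analysis, dx)
ratio = p_anal[1:] / p_true[1:]
print(ratio[:3].mean(), ratio[-3:].mean())   # near 1 at low wavenumbers, much less than 1 at high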
Abstract:
Background: Despite the frequent isolation of Salmonella enterica subsp. enterica serovars Derby and Mbandaka from livestock in the UK and USA, little is known about the biological processes maintaining their prevalence. Statistics for Salmonella isolations from livestock production in the UK show that S. Derby is most commonly associated with pigs and turkeys, and S. Mbandaka with cattle and chickens. Here we compare the first sequenced genomes of S. Derby and S. Mbandaka as a basis for further analysis of the potential host adaptations that contribute to their distinct host species distributions. Results: Comparative functional genomics using the RAST annotation system showed that the features distinguishing S. Derby from S. Mbandaka predominantly relate to metabolite utilisation, in vivo and ex vivo persistence, and pathogenesis. Alignment of the genome nucleotide sequences of S. Derby D1 and D2 and S. Mbandaka M1 and M2 with Salmonella pathogenicity islands (SPIs) identified unique complements of genes associated with host adaptation. We also describe a new genomic island with a putative role in pathogenesis, SPI-23. SPI-23 is present in several S. enterica serovars, including S. Agona, S. Dublin and S. Gallinarum, but is absent in its entirety from S. Mbandaka. Conclusions: We discovered a new 37 kb genomic island, SPI-23, in the chromosome sequence of S. Derby, encoding 42 ORFs, ten of which are putative TTSS effector proteins. We infer from full-genome synonymous SNP analysis that these two serovars diverged between 182 and 625 kya, coinciding with the divergence of domestic pigs. The differences between the genomes of these serovars suggest they have been exposed to different stresses, including phage, transposons and prolonged externalisation. The two serovars possess distinct complements of metabolic genes, many of which cluster into pathways for catabolism of carbon sources.
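A back-of-the-envelope form of the molecular-clock reasoning behind a divergence estimate from synonymous SNPs is t ≈ d_S / (2μ), with d_S the pairwise synonymous divergence per site and μ the synonymous substitution rate per site per year. Whether this is exactly the authors' procedure is not stated in the abstract, and the numbers below are invented purely to show the shape of the calculation.

# Illustrative molecular-clock arithmetic (all values hypothetical, not from the paper)
d_s = 4e-4       # pairwise synonymous substitutions per synonymous site
mu = 5e-10       # synonymous substitution rate per site per year
t_years = d_s / (2 * mu)
print(t_years)   # -> 400,000 years, i.e. within the 182-625 kya range quoted above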