40 results for: Challenge posed by omics data to compositional analysis-paucity of independent samples (n)
Abstract:
Research on the cortical sources of nociceptive laser-evoked brain potentials (LEPs) began almost two decades ago (Tarkka and Treede, 1993). Whereas there is a large consensus on the sources of the late part of the LEP waveform (N2 and P2 waves), the relative contribution of the primary somatosensory cortex (S1) to the early part of the LEP waveform (N1 wave) is still debated. To address this issue we recorded LEPs elicited by the stimulation of four limbs in a large population (n=35). Early LEP generators were estimated both at single-subject and group level, using three different approaches: distributed source analysis, dipolar source modeling, and probabilistic independent component analysis (ICA). We show that the scalp distribution of the earliest LEP response to hand stimulation was maximal over the central-parietal electrodes contralateral to the stimulated side, while that of the earliest LEP response to foot stimulation was maximal over the central-parietal midline electrodes. Crucially, all three approaches indicated hand and foot S1 areas as generators of the earliest LEP response. Altogether, these findings indicate that the earliest part of the scalp response elicited by a selective nociceptive stimulus is largely explained by activity in the contralateral S1, with negligible contribution from the secondary somatosensory cortex (S2).
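The ICA step described above can be illustrated with a minimal sketch, assuming scikit-learn's FastICA as a stand-in for the probabilistic ICA used in the study, and simulated two-source "electrode" data rather than real LEP recordings:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)

# Two simulated "cortical" sources: a transient evoked-like burst and a
# square-wave component (both deliberately non-Gaussian, as ICA requires).
s1 = np.sin(40 * t) * np.exp(-((t - 0.2) ** 2) / 0.005)
s2 = np.sign(np.sin(7 * t))
S = np.column_stack([s1, s2])

# Mix into 8 "electrodes" with a random forward (mixing) matrix plus noise.
A = rng.normal(size=(8, 2))
X = S @ A.T + 0.02 * rng.normal(size=(2000, 8))

# Unmix with FastICA; components come back only up to sign and ordering.
ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)

# Match each true source to its best-correlated recovered component.
corr = np.abs(np.corrcoef(S.T, S_hat.T)[:2, 2:])
print(corr.max(axis=1))  # both values should be close to 1
```

Because ICA recovers components only up to sign and permutation, the absolute correlation against each true source is the natural figure of merit here.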
Abstract:
In vitro batch culture fermentations were conducted with grape seed polyphenols and human faecal microbiota, in order to monitor both changes in precursor flavan-3-ols and the formation of microbial-derived metabolites. By the application of UPLC-DAD-ESI-TQ MS, monomers, and dimeric and trimeric procyanidins were shown to be degraded during the first 10 h of fermentation, with notable inter-individual differences being observed between fermentations. This period (10 h) also coincided with the maximum formation of intermediate metabolites, such as 5-(3′,4′-dihydroxyphenyl)-γ-valerolactone and 4-hydroxy-5-(3′,4′-dihydroxyphenyl)-valeric acid, and of several phenolic acids, including 3-(3,4-dihydroxyphenyl)-propionic acid, 3,4-dihydroxyphenylacetic acid, 4-hydroxymandelic acid, and gallic acid (5–10 h maximum formation). Later phases of the incubations (10–48 h) were characterised by the appearance of mono- and non-hydroxylated forms of previous metabolites by dehydroxylation reactions. Of particular interest was the detection of γ-valerolactone, which was seen for the first time as a metabolite from the microbial catabolism of flavan-3-ols. Changes registered during fermentation were finally summarised by a principal component analysis (PCA). Results revealed that 5-(3′,4′-dihydroxyphenyl)-γ-valerolactone was a key metabolite in explaining inter-individual differences and delineating the rate and extent of the microbial catabolism of flavan-3-ols, which could finally affect absorption and bioactivity of these compounds.
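The final PCA step can be sketched as follows; the fermentation-by-metabolite matrix here is entirely hypothetical and merely stands in for the UPLC-DAD-ESI-TQ MS concentration data:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical matrix: rows = faecal fermentations (donors x time points),
# columns = metabolite concentrations (e.g. valerolactones, phenolic acids).
# A single dominant "inter-individual" axis drives all four columns.
rng = np.random.default_rng(1)
base = rng.normal(size=(12, 1))
X = np.hstack([base * w for w in (3.0, 2.5, 2.0, 1.0)]) \
    + 0.3 * rng.normal(size=(12, 4))

# Standardise each metabolite, then project onto principal components.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
pca = PCA(n_components=2).fit(Xs)
scores = pca.transform(Xs)

print(pca.explained_variance_ratio_)  # PC1 should capture most variance
```

In the study's setting, a metabolite that loads heavily on PC1 (as 5-(3′,4′-dihydroxyphenyl)-γ-valerolactone did) is the one that best separates individuals.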
Abstract:
This paper reports the findings from two large-scale national on-line surveys carried out in 2009 and 2010, which explored the state of history teaching in English secondary schools. Large variation in provision was identified within comprehensive schools in response to national policy decisions and initiatives. Using the data from the surveys and publicly available school-level data, this study examines situated factors, particularly the nature of the school intake, the number of pupils with special educational needs, and the socio-economic status of the area surrounding the school, and the impact these have on the provision of history education. The findings show that there is a growing divide between those students who have access to the ‘powerful knowledge’ provided by subjects like history, and those who do not.
Abstract:
This work presents a model study for the formation of a dimeric dioxomolybdenum(VI) complex [MoO2L]2, generated by simultaneous satisfaction of the acceptor and donor character existing in the corresponding monomeric Mo(VI) complex MoO2L. This mononuclear complex is specially designed to contain a coordinatively unsaturated Mo(VI) acceptor centre and a free donor group (e.g. an –NH2 group) strategically placed in the ligand skeleton [H2L = 2-hydroxyacetophenonehydrazone of 2-aminobenzoylhydrazine]. Apart from the dimer [MoO2L]2, complexes of the type MoO2L·B (where B = CH3OH, γ-picoline and imidazole) are also reported. All the complexes are characterized by elemental analysis, spectroscopic (UV–Vis, IR, 1H NMR) techniques and cyclic voltammetry. Single crystal X-ray structures of [MoO2L]2 (1), MoO2L·CH3OH (2), and MoO2L·(γ-pic) (3) have been determined and discussed. DFT calculations on these complexes corroborate the experimental data and provide a clue to the facile formation of this type of dimer, not reported previously. The process of dimer formation may also be viewed as an interaction between two molecules of a specially designed complex acting as a monodentate ligand. This work is expected to open up a new field of design and synthesis of dimeric complexes through the process of symbiotic donor–acceptor (acid–base) interaction between two molecules of a specially designed monomer.
Abstract:
Smart healthcare is a complex domain for systems integration, owing to the human and technical factors and heterogeneous data sources involved. As part of the smart city, it is an area in which clinical functions depend on intelligent collaboration among multiple systems for effective communication between departments, and radiology is one of the areas that relies most heavily on intelligent information integration and communication. It therefore faces many integration and interoperability challenges, such as information collision, heterogeneous data sources, policy obstacles, and procedure mismanagement. The purpose of this study is to analyse the data, semantic, and pragmatic interoperability of systems integration in a radiology department, and to develop a pragmatic interoperability framework for guiding the integration. We selected an on-going project at a local hospital for our case study. The project aims to achieve data sharing and interoperability among the Radiology Information System (RIS), Electronic Patient Record (EPR), and Picture Archiving and Communication System (PACS). Qualitative data collection and analysis methods were used. The data sources consisted of documentation (including publications and internal working papers), one year of non-participant observation, and 37 interviews with radiologists, clinicians, directors of IT services, referring clinicians, radiographers, receptionists, and secretaries. We identified four primary phases in the data analysis process for the case study: requirements and barriers identification, integration approach, interoperability measurements, and knowledge foundations. Each phase is discussed and supported by qualitative data. Through this analysis we also develop a pragmatic interoperability framework that summarises the empirical findings and proposes recommendations for guiding integration in the radiology context.
Abstract:
With a rapidly increasing fraction of electricity generation being sourced from wind, extreme wind power generation events such as prolonged periods of low (or high) generation and ramps in generation, are a growing concern for the efficient and secure operation of national power systems. As extreme events occur infrequently, long and reliable meteorological records are required to accurately estimate their characteristics. Recent publications have begun to investigate the use of global meteorological “reanalysis” data sets for power system applications, many of which focus on long-term average statistics such as monthly-mean generation. Here we demonstrate that reanalysis data can also be used to estimate the frequency of relatively short-lived extreme events (including ramping on sub-daily time scales). Verification against 328 surface observation stations across the United Kingdom suggests that near-surface wind variability over spatiotemporal scales greater than around 300 km and 6 h can be faithfully reproduced using reanalysis, with no need for costly dynamical downscaling. A case study is presented in which a state-of-the-art, 33 year reanalysis data set (MERRA, from NASA-GMAO), is used to construct an hourly time series of nationally-aggregated wind power generation in Great Britain (GB), assuming a fixed, modern distribution of wind farms. The resultant generation estimates are highly correlated with recorded data from National Grid in the recent period, both for instantaneous hourly values and for variability over time intervals greater than around 6 h. This 33 year time series is then used to quantify the frequency with which different extreme GB-wide wind power generation events occur, as well as their seasonal and inter-annual variability. 
Several novel insights into the nature of extreme wind power generation events are described, including (i) that the number of prolonged low or high generation events is well approximated by a Poisson-like random process, and (ii) whilst in general there is large seasonal variability, the magnitude of the most extreme ramps is similar in both summer and winter. An up-to-date version of the GB case study data as well as the underlying model are freely available for download from our website: http://www.met.reading.ac.uk/~energymet/data/Cannon2014/.
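The event-counting logic behind finding (i) can be sketched with a run-length scan; the capacity-factor series and the 24 h / 0.1 thresholds below are illustrative assumptions, not values from the study:

```python
import numpy as np

def prolonged_events(cf, threshold=0.1, min_hours=24):
    """Count runs of at least `min_hours` consecutive values below `threshold`."""
    below = np.concatenate(([0], (np.asarray(cf) < threshold).astype(int), [0]))
    edges = np.diff(below)                # +1 at run starts, -1 just after run ends
    starts = np.flatnonzero(edges == 1)
    ends = np.flatnonzero(edges == -1)
    durations = ends - starts             # run lengths in hours
    return int((durations >= min_hours).sum()), durations

# Toy hourly capacity-factor series: one 30 h lull and one 10 h lull.
cf = np.full(200, 0.5)
cf[50:80] = 0.05    # 30 consecutive low hours -> counts as an event
cf[120:130] = 0.05  # only 10 hours -> too short to count

n_events, durations = prolonged_events(cf)
print(n_events, durations.tolist())  # -> 1 [30, 10]
```

Applied to a multi-decade reanalysis-derived series, the annual counts from such a scan can then be compared against a Poisson distribution.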
Abstract:
The EU Water Framework Directive (WFD) requires that the ecological and chemical status of water bodies in Europe should be assessed, and action taken where possible to ensure that at least "good" quality is attained in each case by 2015. This paper is concerned with the accuracy and precision with which chemical status in rivers can be measured given certain sampling strategies, and how this can be improved. High-frequency (hourly) chemical data from four rivers in southern England were subsampled to simulate different sampling strategies for four parameters used for WFD classification: dissolved phosphorus, dissolved oxygen, pH and water temperature. These data sub-sets were then used to calculate the WFD classification for each site. Monthly sampling was less precise than weekly sampling, but the effect on WFD classification depended on the closeness of the range of concentrations to the class boundaries. In some cases, monthly sampling for a year could result in the same water body being assigned to three or four of the WFD classes with 95% confidence, due to random sampling effects, whereas with weekly sampling this was one or two classes for the same cases. In the most extreme case, the same water body could have been assigned to any of the five WFD quality classes. Weekly sampling considerably reduces the uncertainties compared to monthly sampling. The width of the weekly sampled confidence intervals was about 33% of that of the monthly intervals for P species and pH, about 50% for dissolved oxygen, and about 67% for water temperature. For water temperature, which is assessed as the 98th percentile in the UK, monthly sampling biases the estimate downwards by about 1 °C compared to the true value, due to problems of assessing high percentiles with limited data. Low-frequency measurements will generally be unsuitable for assessing standards expressed as high percentiles.
Confining sampling to the working week compared to all 7 days made little difference, but a modest improvement in precision could be obtained by sampling at the same time of day within a 3 h time window, and this is recommended. For parameters with a strong diel variation, such as dissolved oxygen, the value obtained, and thus possibly the WFD classification, can depend markedly on when in the cycle the sample was taken. Specifying this in the sampling regime would be a straightforward way to improve precision, but there needs to be agreement about how best to characterise risk in different types of river. These results suggest that in some cases it will be difficult to assign accurate WFD chemical classes or to detect likely trends using current sampling regimes, even for these largely groundwater-fed rivers. A more critical approach to sampling is needed to ensure that management actions are appropriate and supported by data.
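The subsampling experiment and the downward bias of high percentiles under sparse sampling can be sketched as follows, assuming a synthetic hourly water-temperature series (seasonal plus diel cycles) in place of the observed river data, and treating "monthly sampling" crudely as 12 random spot samples per year:

```python
import numpy as np

rng = np.random.default_rng(3)
hours = np.arange(24 * 365)

# Synthetic hourly water temperature: seasonal cycle + diel cycle + noise.
temp = (12.0
        + 6.0 * np.sin(2 * np.pi * hours / (24 * 365))   # seasonal
        + 1.5 * np.sin(2 * np.pi * hours / 24)           # diel
        + 0.5 * rng.normal(size=hours.size))

true_p98 = np.percentile(temp, 98)

# Repeatedly simulate a year of monthly spot samples (12 values) and
# estimate the 98th percentile from each small sample.
estimates = []
for _ in range(500):
    idx = rng.choice(hours.size, size=12, replace=False)
    estimates.append(np.percentile(temp[idx], 98))
estimates = np.array(estimates)

print(true_p98, estimates.mean())  # the sparse estimate is biased low on average
```

With only 12 values, the 98th-percentile estimate is essentially the sample maximum, whose expectation sits well below the true 98th percentile of the hourly series; this is the mechanism behind the roughly 1 °C downward bias reported above.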
Abstract:
Debate over the late Quaternary megafaunal extinctions has focussed on whether human colonisation or climatic changes were the more important drivers of extinction, with few extinctions being unambiguously attributable to either. Most analyses have been geographically or taxonomically restricted, and the few quantitative global analyses have been limited by coarse temporal resolution or overly simplified climate reconstructions or proxies. We present a global analysis of the causes of these extinctions which uses high-resolution climate reconstructions and explicitly investigates the sensitivity of our results to uncertainty in the palaeontological record. Our results show that human colonisation was the dominant driver of megafaunal extinction across the world, but that climatic factors were also important. We identify the geographic regions where future research is likely to have the most impact, with our models reliably predicting extinctions across most of the world, with the notable exception of mainland Asia, where we fail to explain the apparently low rate of extinction found in the fossil record. Our results are highly robust to uncertainties in the palaeontological record, and our main conclusions are unlikely to change qualitatively following minor improvements or changes in the dates of extinctions and human colonisation.
Abstract:
The debate associated with the qualifications of business school faculty has raged since the 1959 release of the Gordon–Howell and Pierson reports, which encouraged business schools in the USA to enhance their legitimacy by increasing their faculties’ doctoral qualifications and scholarly rigor. Today, the legitimacy of specific faculty qualifications remains one of the most discussed topics in management education, attracting the interest of administrators, faculty, and accreditation agencies. Based on new institutional theory and the institutional logics perspective, this paper examines convergence and innovation in business schools through an analysis of faculty hiring criteria. The qualifications examined are academic degree, scholarly publications, teaching experience, and professional experience. Three groups of schools are examined based on type of university, position within a media ranking system, and accreditation by the Association to Advance Collegiate Schools of Business. Data are gathered using a content analysis of 441 faculty postings from business schools based in the USA over two time periods. Contrary to claims of global convergence, we find most qualifications still vary by group, even in the mature US market. Moreover, innovative hiring is more likely to be found in non-elite schools.
Abstract:
This volume reports on the results of the Glastonbury Abbey Archaeological Archive Project, a collaboration between the University of Reading and the Trustees of Glastonbury Abbey, funded principally by the Arts and Humanities Research Council. The project has reassessed and reinterpreted all known archaeological records from the 1908–79 excavations and made the complete dataset available to the public through a digital archive hosted by the Archaeology Data Service (http://dx.doi.org/10.5284/1022585). The scope of the project has included the full analysis of the archaeological collections of Glastonbury Abbey by thirty-one leading specialists, including chemical and compositional analysis of glass and metal and petrological analysis of pottery and tile, and a comprehensive geophysical survey conducted by GSB Prospection Ltd. For the first time, it has been possible to achieve a framework of independent dating based on reassessment of the finds and radiocarbon dating of surviving organic material from the 1950s excavations. The principal aim of the Glastonbury Abbey Archaeological Project was to set aside previous assumptions based on the historical and legendary traditions and to provide a rigorous reassessment of the archive of antiquarian excavations. This research has revealed that some of the best known archaeological ‘facts’ about Glastonbury are themselves myths perpetuated by the abbey’s excavators.