932 results for "Challenge posed by omics data to compositional analysis - paucity of independent samples (n)"


Relevance: 100.00%

Abstract:

This paper describes a method that employs Earth Observation (EO) data to calculate spatiotemporal estimates of soil heat flux, G, using a physically based method (the Analytical Method). The method involves a harmonic analysis of land surface temperature (LST) data. It also requires an estimate of near-surface soil thermal inertia; this property depends on soil textural composition and varies as a function of soil moisture content. The EO data needed to drive the model equations, and the ground-based data required to verify the method, were obtained over the Fakara domain within the African Monsoon Multidisciplinary Analysis (AMMA) program. LST estimates (3 km × 3 km, one image every 15 min) were derived from MSG-SEVIRI data. Soil moisture estimates were obtained from ENVISAT-ASAR data, while estimates of leaf area index, LAI (to calculate the effect of the canopy on G, largely due to radiation extinction), were obtained from SPOT-HRV images. The variation of these variables over the Fakara domain, and the implications for values of G derived from them, are discussed. Results showed that this method provides reliable large-scale spatiotemporal estimates of G. Variations in G could largely be explained by the variability in the model input variables. Furthermore, it was shown that the method is relatively insensitive to model parameters related to the vegetation or soil texture. However, the strong sensitivity of thermal inertia to soil moisture content at low values of relative saturation (<0.2) means that in arid or semi-arid climates accurate estimates of surface soil moisture content are of utmost importance if reliable estimates of G are to be obtained. This method has the potential to improve large-scale evaporation estimates, to aid land surface model prediction and to advance research that aims to explain failure in energy balance closure of meteorological field studies.
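To make the harmonic-analysis step concrete, below is a minimal Python sketch of the standard harmonic formulation of the Analytical Method: each LST harmonic of amplitude A_k and phase phi_k contributes a soil-heat-flux harmonic of amplitude Γ·A_k·√(kω), phase-advanced by π/4, where Γ is the thermal inertia. The sampling interval, harmonic count and thermal-inertia value here are illustrative assumptions, not values from the study.

```python
import numpy as np

def soil_heat_flux(lst, n_harmonics=3, thermal_inertia=800.0):
    """Analytical-method estimate of G (W m-2) from one diurnal cycle of LST (K).

    `lst` must cover exactly 24 h at a uniform time step (e.g. 96 values at
    15-min spacing). `thermal_inertia` (J m-2 K-1 s-1/2) would in practice be
    derived from soil texture and remotely sensed soil moisture; the value
    here is a placeholder.
    """
    n = len(lst)
    omega = 2.0 * np.pi / (24 * 3600.0)      # diurnal angular frequency (rad/s)
    t = np.arange(n) * (24 * 3600.0 / n)     # seconds since series start
    coeffs = np.fft.rfft(lst) / n            # Fourier coefficients of the LST cycle
    g = np.zeros(n)
    for k in range(1, n_harmonics + 1):
        a_k = 2.0 * np.abs(coeffs[k])        # harmonic amplitude A_k
        phi_k = np.angle(coeffs[k])          # harmonic phase
        # each LST harmonic maps to a G harmonic scaled by sqrt(k*omega)
        # and advanced in phase by pi/4
        g += thermal_inertia * a_k * np.sqrt(k * omega) * np.cos(
            k * omega * t + phi_k + np.pi / 4.0)
    return g
```

Applied pixel by pixel to the 15-min SEVIRI LST fields, a routine of this shape would yield the spatiotemporal G estimates described above.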

Relevance: 100.00%

Abstract:

An experimental method is described which enables the inelastically scattered X-ray component to be removed from diffractometer data prior to radial density function (RDF) analysis. At each scattering angle an energy spectrum is generated from a Si(Li) detector combined with a multi-channel analyser, from which the coherently scattered component is separated. The data obtained from organic polymers have an improved signal/noise ratio at high values of scattering angle, and a commensurate enhancement of the resolution of the RDF at low r is demonstrated for the case of PMMA (ICI 'Perspex'). The method obviates the need for the complicated correction for multiple scattering.
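The separation works because the incoherent (Compton) line is shifted to lower energy by an amount that grows with scattering angle, so an energy-dispersive detector can window out everything except the elastic line. Below is a hedged Python sketch of that logic; the window width and the spectrum arrays are illustrative assumptions, not parameters from the paper, and only the Compton-shift formula itself is standard physics.

```python
import numpy as np

MEC2_KEV = 511.0  # electron rest energy in keV

def compton_energy(e0_kev, two_theta_rad):
    """Energy of the incoherently (Compton) scattered line at angle 2-theta."""
    return e0_kev / (1.0 + (e0_kev / MEC2_KEV) * (1.0 - np.cos(two_theta_rad)))

def coherent_counts(energies_kev, counts, e0_kev, window_kev=0.3):
    """Integrate detector counts within +/- window of the elastic line.

    A schematic stand-in for the Si(Li)/multi-channel-analyser separation:
    counts near the incident energy e0 are kept as coherent scatter, while
    the Compton component (shifted to lower energy) is excluded.
    """
    mask = np.abs(energies_kev - e0_kev) <= window_kev
    return counts[mask].sum()
```

For example, for Mo Kα radiation (17.44 keV) at 2θ = 120°, `compton_energy` gives roughly 16.6 keV, a shift of about 0.85 keV that is comfortably larger than typical Si(Li) resolution, which is why the separation is most effective at the high angles where it matters most.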

Relevance: 100.00%

Abstract:

Research on the cortical sources of nociceptive laser-evoked brain potentials (LEPs) began almost two decades ago (Tarkka and Treede, 1993). Whereas there is a large consensus on the sources of the late part of the LEP waveform (N2 and P2 waves), the relative contribution of the primary somatosensory cortex (S1) to the early part of the LEP waveform (N1 wave) is still debated. To address this issue we recorded LEPs elicited by the stimulation of four limbs in a large population (n=35). Early LEP generators were estimated both at single-subject and group level, using three different approaches: distributed source analysis, dipolar source modeling, and probabilistic independent component analysis (ICA). We show that the scalp distribution of the earliest LEP response to hand stimulation was maximal over the central-parietal electrodes contralateral to the stimulated side, while that of the earliest LEP response to foot stimulation was maximal over the central-parietal midline electrodes. Crucially, all three approaches indicated hand and foot S1 areas as generators of the earliest LEP response. Altogether, these findings indicate that the earliest part of the scalp response elicited by a selective nociceptive stimulus is largely explained by activity in the contralateral S1, with negligible contribution from the secondary somatosensory cortex (S2).
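As a rough illustration of the ICA step, the sketch below applies scikit-learn's FastICA, a stand-in for the probabilistic ICA actually used in the study, to a synthetic two-source, multi-electrode epoch. All array shapes, source waveforms and the mixing matrix are invented for the example.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Hypothetical averaged LEP epoch: 500 ms at 1 kHz over 32 electrodes.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.5, 500)
s1_like = np.sin(2 * np.pi * 8 * t) * np.exp(-t / 0.1)   # early, S1-like source
p2_like = np.sin(2 * np.pi * 3 * t + 1.0)                # late, vertex-like source
mixing = rng.normal(size=(2, 32))                        # sources -> electrodes
eeg = np.column_stack([s1_like, p2_like]) @ mixing \
      + 0.05 * rng.normal(size=(500, 32))                # add sensor noise

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(eeg)   # estimated component time courses (500 x 2)
topographies = ica.mixing_         # per-electrode weights per component (32 x 2)
```

In the real analysis, the columns of `mixing_` play the role of scalp topographies whose maxima (contralateral central-parietal for hand stimulation, midline for foot) point to the S1 generators of the earliest response.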

Relevance: 100.00%

Abstract:

In vitro batch culture fermentations were conducted with grape seed polyphenols and human faecal microbiota in order to monitor both changes in precursor flavan-3-ols and the formation of microbial-derived metabolites. Using UPLC-DAD-ESI-TQ MS, monomers and dimeric and trimeric procyanidins were shown to be degraded during the first 10 h of fermentation, with notable inter-individual differences observed between fermentations. This period (10 h) also coincided with the maximum formation of intermediate metabolites, such as 5-(3′,4′-dihydroxyphenyl)-γ-valerolactone and 4-hydroxy-5-(3′,4′-dihydroxyphenyl)-valeric acid, and of several phenolic acids, including 3-(3,4-dihydroxyphenyl)-propionic acid, 3,4-dihydroxyphenylacetic acid, 4-hydroxymandelic acid, and gallic acid (5–10 h maximum formation). Later phases of the incubations (10–48 h) were characterised by the appearance of mono- and non-hydroxylated forms of the earlier metabolites through dehydroxylation reactions. Of particular interest was the detection of γ-valerolactone, observed here for the first time as a metabolite of the microbial catabolism of flavan-3-ols. Changes registered during fermentation were finally summarised by a principal component analysis (PCA). The results revealed that 5-(3′,4′-dihydroxyphenyl)-γ-valerolactone was a key metabolite in explaining inter-individual differences and in delineating the rate and extent of the microbial catabolism of flavan-3-ols, which could ultimately affect the absorption and bioactivity of these compounds.
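The PCA step is conventional; as a minimal sketch, with an invented data matrix standing in for the quantified metabolite concentrations, it might look like this in Python:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical matrix: rows = fermentation samples (donor x time point),
# columns = metabolite concentrations quantified by UPLC-MS.
rng = np.random.default_rng(1)
X = rng.lognormal(size=(24, 12))

# Autoscale, then project onto the first two principal components.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
# Plotting PC1 vs PC2 separates donors/time points; a metabolite such as the
# dihydroxyphenyl-valerolactone loading strongly on PC1 would mark it as a key
# driver of inter-individual differences, as reported above.
```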

Relevance: 100.00%

Abstract:

This paper reports the findings from two large-scale national online surveys, carried out in 2009 and 2010, which explored the state of history teaching in English secondary schools. Large variation in provision was identified within comprehensive schools in response to national policy decisions and initiatives. Using the data from the surveys and publicly available school-level data, this study examines situated factors, particularly the nature of the school intake, the number of pupils with special educational needs and the socio-economic status of the area surrounding the school, and the impact these have on the provision of history education. The findings show that there is a growing divide between those students who have access to the ‘powerful knowledge’ provided by subjects like history, and those who do not.

Relevance: 100.00%

Abstract:

This work presents a model study of the formation of a dimeric dioxomolybdenum(VI) complex, [MoO2L]2, generated by the simultaneous satisfaction of the acceptor and donor character present in the corresponding monomeric Mo(VI) complex MoO2L. This mononuclear complex is specially designed to contain a coordinatively unsaturated Mo(VI) acceptor centre and a free donor group (e.g. an –NH2 group) strategically placed in the ligand skeleton [H2L = the 2-hydroxyacetophenone hydrazone of 2-aminobenzoylhydrazine]. Apart from the dimer [MoO2L]2, complexes of the type MoO2L·B (where B = CH3OH, γ-picoline or imidazole) are also reported. All the complexes are characterized by elemental analysis, spectroscopic techniques (UV–Vis, IR, 1H NMR) and cyclic voltammetry. Single-crystal X-ray structures of [MoO2L]2 (1), MoO2L·CH3OH (2) and MoO2L·(γ-pic) (3) have been determined and are discussed. DFT calculations on these complexes corroborate the experimental data and provide clues to the facile formation of this type of dimer, which has not been reported previously. The process of dimer formation may also be viewed as an interaction between two molecules of a specially designed complex acting as a monodentate ligand. This work is expected to open up a new field in the design and synthesis of dimeric complexes through symbiotic donor–acceptor (acid–base) interaction between two molecules of a specially designed monomer.

Relevance: 100.00%

Abstract:

Smart healthcare is a complex domain for systems integration, owing to the human and technical factors and the heterogeneous data sources involved. As part of the smart city, it is an area in which clinical functions require smart multi-system collaboration for effective communication among departments, and radiology is one of the areas that relies most heavily on intelligent information integration and communication. It therefore faces many challenges regarding integration and interoperability, such as information collisions, heterogeneous data sources, policy obstacles, and procedural mismanagement. The purpose of this study is to analyse the data, semantic, and pragmatic interoperability of systems integration in a radiology department, and to develop a pragmatic interoperability framework for guiding the integration. We selected an ongoing project at a local hospital for our case study. The project aims to achieve data sharing and interoperability among the Radiology Information Systems (RIS), Electronic Patient Record (EPR), and Picture Archiving and Communication Systems (PACS). Qualitative data collection and analysis methods were used. The data sources consisted of documentation, including publications and internal working papers, one year of non-participant observation, and 37 interviews with radiologists, clinicians, directors of IT services, referring clinicians, radiographers, receptionists and a secretary. We identified four primary phases of the data analysis process for the case study: requirements and barriers identification, integration approach, interoperability measurements, and knowledge foundations. Each phase is discussed and supported by qualitative data. Through the analysis we also develop a pragmatic interoperability framework that summarises the empirical findings and proposes recommendations for guiding integration in the radiology context.

Relevance: 100.00%

Abstract:

With a rapidly increasing fraction of electricity generation being sourced from wind, extreme wind power generation events, such as prolonged periods of low (or high) generation and ramps in generation, are a growing concern for the efficient and secure operation of national power systems. As extreme events occur infrequently, long and reliable meteorological records are required to estimate their characteristics accurately. Recent publications have begun to investigate the use of global meteorological “reanalysis” data sets for power system applications, many of which focus on long-term average statistics such as monthly-mean generation. Here we demonstrate that reanalysis data can also be used to estimate the frequency of relatively short-lived extreme events (including ramping on sub-daily time scales). Verification against 328 surface observation stations across the United Kingdom suggests that near-surface wind variability over spatiotemporal scales greater than around 300 km and 6 h can be faithfully reproduced using reanalysis, with no need for costly dynamical downscaling. A case study is presented in which a state-of-the-art, 33-year reanalysis data set (MERRA, from NASA-GMAO) is used to construct an hourly time series of nationally aggregated wind power generation in Great Britain (GB), assuming a fixed, modern distribution of wind farms. The resultant generation estimates are highly correlated with recorded data from National Grid in the recent period, both for instantaneous hourly values and for variability over time intervals greater than around 6 h. This 33-year time series is then used to quantify the frequency with which different extreme GB-wide wind power generation events occur, as well as their seasonal and inter-annual variability. Several novel insights into the nature of extreme wind power generation events are described, including (i) that the number of prolonged low or high generation events is well approximated by a Poisson-like random process, and (ii) that, whilst in general there is large seasonal variability, the magnitude of the most extreme ramps is similar in both summer and winter. An up-to-date version of the GB case study data, as well as the underlying model, is freely available for download from our website: http://www.met.reading.ac.uk/~energymet/data/Cannon2014/.
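As an illustration of how such events can be extracted from an hourly generation series, here is a small Python sketch; the capacity-factor threshold and minimum duration are arbitrary example values, not those used in the paper.

```python
import numpy as np

def prolonged_events(cf, threshold=0.1, min_hours=24):
    """Count prolonged low-generation events in an hourly capacity-factor series.

    Returns the number of distinct runs where cf < threshold for at least
    `min_hours` consecutive hours, found by run-length encoding the
    below-threshold mask. Threshold and duration are illustrative only.
    """
    below = np.concatenate(([0], (cf < threshold).astype(int), [0]))
    edges = np.diff(below)
    starts = np.where(edges == 1)[0]   # first hour of each run
    ends = np.where(edges == -1)[0]    # one past the last hour of each run
    durations = ends - starts
    return int(np.sum(durations >= min_hours))
```

If yearly counts of such events behave like a Poisson process, their inter-annual variance should be close to their mean, which is one simple way to check the Poisson-like behaviour reported above across the 33 simulated years.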

Relevance: 100.00%

Abstract:

The EU Water Framework Directive (WFD) requires that the ecological and chemical status of water bodies in Europe be assessed, and action taken where possible to ensure that at least "good" quality is attained in each case by 2015. This paper is concerned with the accuracy and precision with which chemical status in rivers can be measured given certain sampling strategies, and how this can be improved. High-frequency (hourly) chemical data from four rivers in southern England were subsampled to simulate different sampling strategies for four parameters used for WFD classification: dissolved phosphorus, dissolved oxygen, pH and water temperature. These data sub-sets were then used to calculate the WFD classification for each site. Monthly sampling was less precise than weekly sampling, but the effect on WFD classification depended on how close the range of concentrations was to the class boundaries. In some cases, monthly sampling for a year could result in the same water body being assigned to three or four of the WFD classes with 95% confidence, due to random sampling effects, whereas with weekly sampling this was one or two classes for the same cases. In the most extreme case, the same water body could have been assigned to any of the five WFD quality classes. Weekly sampling considerably reduces the uncertainties compared to monthly sampling: the weekly-sampled confidence intervals were about 33% as wide as the monthly ones for phosphorus species and pH, about 50% for dissolved oxygen, and about 67% for water temperature. For water temperature, which is assessed as the 98th percentile in the UK, monthly sampling biases the mean estimate downwards by about 1 °C compared to the true value, due to the difficulty of assessing high percentiles with limited data. Low-frequency measurements will generally be unsuitable for assessing standards expressed as high percentiles. Confining sampling to the working week, rather than all 7 days, made little difference, but a modest improvement in precision could be obtained by sampling at the same time of day within a 3 h window, and this is recommended. For parameters with strong diel variation, such as dissolved oxygen, the value obtained, and thus possibly the WFD classification, can depend markedly on when in the cycle the sample was taken. Specifying this in the sampling regime would be a straightforward way to improve precision, but there needs to be agreement about how best to characterise risk in different types of river. These results suggest that in some cases it will be difficult to assign accurate WFD chemical classes or to detect likely trends using current sampling regimes, even for these largely groundwater-fed rivers. A more critical approach to sampling is needed to ensure that management actions are appropriate and supported by data.
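The core of this analysis, subsampling a high-frequency record and asking how much the resulting statistic moves, is easy to reproduce. Below is a minimal Python sketch under stated assumptions (a one-year hourly array, a simple annual mean as the classification statistic, regular sampling with random start offsets); the paper's actual classification rules are more involved.

```python
import numpy as np

def sampling_spread(hourly, step_hours, n_offsets=500, seed=0):
    """Spread of a statistic under a regular subsampling strategy.

    `hourly` is one year of hourly values; `step_hours` is 168 for weekly
    or ~730 for monthly sampling. Each draw starts at a random offset and
    takes every step-th value; the 95% interval of the resulting annual
    means shows how much a class assignment could shift by chance alone.
    Swap np.mean for lambda x: np.percentile(x, 98) to mimic the UK
    temperature standard, where the bias with sparse data appears.
    """
    rng = np.random.default_rng(seed)
    offsets = rng.integers(0, step_hours, size=n_offsets)
    stats = [np.mean(hourly[o::step_hours]) for o in offsets]
    return np.percentile(stats, [2.5, 97.5])
```

Comparing the interval returned for `step_hours=168` against `step_hours=730` reproduces, in miniature, the weekly-versus-monthly precision contrast reported above.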

Relevance: 100.00%

Abstract:

Debate over the late Quaternary megafaunal extinctions has focussed on whether human colonisation or climatic change was the more important driver of extinction, with few extinctions being unambiguously attributable to either. Most analyses have been geographically or taxonomically restricted, and the few quantitative global analyses have been limited by coarse temporal resolution or overly simplified climate reconstructions or proxies. We present a global analysis of the causes of these extinctions which uses high-resolution climate reconstructions and explicitly investigates the sensitivity of our results to uncertainty in the palaeontological record. Our results show that human colonisation was the dominant driver of megafaunal extinction across the world, but that climatic factors were also important. We identify the geographic regions where future research is likely to have the most impact, with our models reliably predicting extinctions across most of the world, with the notable exception of mainland Asia, where we fail to explain the apparently low rate of extinction found in the fossil record. Our results are highly robust to uncertainties in the palaeontological record, and our main conclusions are unlikely to change qualitatively following minor improvements or changes in the dates of extinctions and human colonisation.

Relevance: 100.00%

Abstract:

The debate associated with the qualifications of business school faculty has raged since the 1959 release of the Gordon–Howell and Pierson reports, which encouraged business schools in the USA to enhance their legitimacy by increasing their faculties’ doctoral qualifications and scholarly rigor. Today, the legitimacy of specific faculty qualifications remains one of the most discussed topics in management education, attracting the interest of administrators, faculty, and accreditation agencies. Based on new institutional theory and the institutional logics perspective, this paper examines convergence and innovation in business schools through an analysis of faculty hiring criteria. The qualifications examined are academic degree, scholarly publications, teaching experience, and professional experience. Three groups of schools are examined based on type of university, position within a media ranking system, and accreditation by the Association to Advance Collegiate Schools of Business. Data are gathered using a content analysis of 441 faculty postings from business schools based in the USA over two time periods. Contrary to claims of global convergence, we find most qualifications still vary by group, even in the mature US market. Moreover, innovative hiring is more likely to be found in non-elite schools.

Relevance: 100.00%

Abstract:

This volume reports on the results of the Glastonbury Abbey Archaeological Archive Project, a collaboration between the University of Reading and the Trustees of Glastonbury Abbey, funded principally by the Arts and Humanities Research Council. The project has reassessed and reinterpreted all known archaeological records from the 1908–79 excavations and made the complete dataset available to the public through a digital archive hosted by the Archaeology Data Service (http://dx.doi.org/10.5284/1022585). The scope of the project has included the full analysis of the archaeological collections of Glastonbury Abbey by thirty-one leading specialists, including chemical and compositional analysis of glass and metal and petrological analysis of pottery and tile, and a comprehensive geophysical survey conducted by GSB Prospection Ltd. For the first time, it has been possible to achieve a framework of independent dating based on reassessment of the finds and radiocarbon dating of surviving organic material from the 1950s excavations. The principal aim of the Glastonbury Abbey Archaeological Project was to set aside previous assumptions based on the historical and legendary traditions and to provide a rigorous reassessment of the archive of antiquarian excavations. This research has revealed that some of the best known archaeological ‘facts’ about Glastonbury are themselves myths perpetuated by the abbey’s excavators.

Relevance: 100.00%

Abstract:

The aim of this study was to assess in vitro the influence of Er:YAG laser irradiation distance on the shear strength of the bond between an adhesive restorative system and primary dentin. A total of 60 crowns of primary molars were embedded in acrylic resin, mechanically ground to expose a flat dentin surface, and randomly assigned to six groups (n = 10). The control group was etched with 37% phosphoric acid. The remaining five groups were irradiated (80 mJ, 2 Hz) at different irradiation distances (11, 12, 16, 17 and 20 mm), followed by acid etching. An adhesive agent (Single Bond) was applied to the bonding sites, and resin cylinders (Filtek Z250) were prepared. The shear bond strength tests were performed in a universal testing machine (0.5 mm/min). Data were submitted to statistical analysis using one-way ANOVA and the Kruskal-Wallis test (p < 0.05). The mean shear bond strengths were 7.32 ± 3.83, 5.07 ± 2.62, 6.49 ± 1.64, 7.71 ± 0.66, 7.33 ± 0.02, and 9.65 ± 2.41 MPa in the control group and the groups irradiated at 11, 12, 16, 17, and 20 mm, respectively. The differences between the bond strengths in groups II and IV, and between those in groups II and VI, were statistically significant (p < 0.05). Increasing the laser irradiation distance increased the shear strength of the bond to primary dentin.
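As a sketch of the statistical comparison, with synthetic values generated from the reported means and standard deviations (the raw measurements are not given, and only three of the six groups are shown), the two tests reduce to standard SciPy calls:

```python
import numpy as np
from scipy import stats

# Synthetic stand-ins for the raw shear-bond-strength values (MPa).
rng = np.random.default_rng(42)
control = rng.normal(7.32, 3.83, 10)   # group I (acid etch only)
d11mm   = rng.normal(5.07, 2.62, 10)   # group II (11 mm)
d20mm   = rng.normal(9.65, 2.41, 10)   # group VI (20 mm)

f_stat, p_anova = stats.f_oneway(control, d11mm, d20mm)   # one-way ANOVA
h_stat, p_kw = stats.kruskal(control, d11mm, d20mm)       # Kruskal-Wallis
print(f"ANOVA p = {p_anova:.3f}, Kruskal-Wallis p = {p_kw:.3f}")
```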

Relevance: 100.00%

Abstract:

Vegetable oils can be extracted using ethanol as the solvent. The main goal of this work was to evaluate the performance of ethanol in the extraction of rice bran oil. The influence of the process variables solvent hydration and temperature was evaluated using response surface methodology, aiming to maximise the transfer of soluble substances and gamma-oryzanol while minimising the extraction of free fatty acids and the liquid content in the underflow solid. Oil solubility in ethanol was highly affected by the water content, and free fatty acid extraction was improved by increasing the moisture content of the solvent. Gamma-oryzanol extraction was affected by temperature when little water was added to the ethanol; conversely, the influence of temperature was minimised at high water levels in the ethanol.
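Response surface methodology typically fits a second-order polynomial in the design factors; a minimal Python sketch of such a fit for the two factors studied here (water content and temperature), with invented data standing in for the extraction runs, is shown below.

```python
import numpy as np

def fit_response_surface(water, temp, y):
    """Least-squares fit of y = b0 + b1*w + b2*T + b3*w*T + b4*w^2 + b5*T^2.

    `water` and `temp` are NumPy arrays of the coded or natural factor
    levels of the extraction runs, and `y` the measured response (e.g. oil
    yield or gamma-oryzanol transfer). Returns the six coefficients.
    """
    X = np.column_stack([np.ones_like(water), water, temp,
                         water * temp, water ** 2, temp ** 2])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# The fitted surface can then be searched for the factor combination that
# maximises gamma-oryzanol transfer while keeping free fatty acids low.
```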

Relevance: 100.00%

Abstract:

Many of the controversies around the concept of homology rest on the subjectivity inherent in primary homology propositions. Dynamic homology partially solves this problem, but it has so far seen scant application outside the molecular domain. This is probably because morphological and behavioural characters are rich in properties, connections and qualities, so that there is less room for conflicting character delimitations. Here we present a new method for the direct optimization of behavioural data, a method that relies on the richness of this database to delimit the characters, and on dynamic procedures to establish character state identity. We use between-species congruence in the data matrix and topological stability to choose the best cladogram. We test the methodology using sequences of predatory behaviour in a group of spiders that evolved the highly modified predatory technique of spitting glue onto prey. The cladogram recovered is fully compatible with previous analyses in the literature, and thus the method appears consistent. Besides the advantage of enhanced objectivity in character proposition, the new procedure allows the use of complex, context-dependent behavioural characters in an evolutionary framework, an important step towards the practical integration of the evolutionary and ecological perspectives on diversity.
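At its core, direct optimization treats character-state correspondences as alignments discovered during tree search rather than fixed beforehand. A toy Python sketch of the underlying dynamic-programming idea, here just an edit distance between two behavioural-act sequences (the act names and unit costs are invented), is given below; the actual method optimises such costs jointly over an entire cladogram.

```python
def align_cost(seq_a, seq_b, indel=1, subst=1):
    """Edit distance between two behavioural-act sequences via dynamic programming.

    State identities are not fixed in advance but emerge from the cheapest
    set of substitutions and insertions/deletions, analogous to molecular
    dynamic homology applied to sequences of behavioural acts.
    """
    m, n = len(seq_a), len(seq_b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i * indel                      # delete all of seq_a prefix
    for j in range(1, n + 1):
        d[0][j] = j * indel                      # insert all of seq_b prefix
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if seq_a[i - 1] == seq_b[j - 1] else subst
            d[i][j] = min(d[i - 1][j] + indel,   # deletion
                          d[i][j - 1] + indel,   # insertion
                          d[i - 1][j - 1] + cost)  # match/substitution
    return d[m][n]

# e.g. align_cost(["approach", "spit", "bite", "wrap"],
#                 ["approach", "bite", "wrap"])  -> 1 (one act lost)
```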