73 results for content-based

Relevance: 30.00%

Abstract:

Background: A whole-genome genotyping array has previously been developed for Malus using SNP data from 28 Malus genotypes. This array offers the prospect of high-throughput genotyping and linkage map development for any given Malus progeny. To test the applicability of the array for mapping in diverse Malus genotypes, we applied the array to the construction of a SNP-based linkage map of an apple rootstock progeny. Results: Of the 7,867 Malus SNP markers on the array, 1,823 (23.2%) were heterozygous in one of the two parents of the progeny, 1,007 (12.8%) were heterozygous in both parental genotypes, whilst just 2.8% of the 921 Pyrus SNPs were heterozygous. A linkage map spanning 1,282.2 cM was produced, comprising 2,272 SNP markers, 306 SSR markers and the S-locus. The length of the M432 linkage map was increased by 52.7 cM with the addition of the SNP markers, whilst marker density increased from 3.8 cM/marker to 0.5 cM/marker. Just three regions in excess of 10 cM remain where no markers were mapped. We compared the positions of the mapped SNP markers on the M432 map with their predicted positions on the ‘Golden Delicious’ genome sequence. A total of 311 markers (13.7% of all mapped markers) mapped to positions that conflicted with their predicted positions on the ‘Golden Delicious’ pseudo-chromosomes, indicating the presence of paralogous genomic regions or misassignments of genome sequence contigs during the assembly and anchoring of the genome sequence. Conclusions: We incorporated data for the 2,272 SNP markers onto the map of the M432 progeny and have presented the most complete and saturated map of the full 17 linkage groups of M. pumila to date. The data were generated rapidly in a high-throughput semi-automated pipeline, permitting significant savings in time and cost over linkage map construction using microsatellites. The application of the array will permit linkage maps to be developed for QTL analyses in a cost-effective manner, and the identification of SNPs that have been assigned erroneous positions on the ‘Golden Delicious’ reference sequence will assist in the continued improvement of the genome sequence assembly for that variety.
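As a quick sanity check of the figures quoted above, here is a minimal Python sketch reproducing the marker-density and heterozygosity arithmetic from the counts given in the abstract; it is purely illustrative and not part of the study's pipeline.

```python
# Reproduce the marker-density figures quoted in the abstract.
# All counts are taken directly from the text above.
snp_markers = 2272        # SNP markers placed on the M432 map
ssr_markers = 306         # SSR markers
s_locus = 1               # the S-locus
map_length_cm = 1282.2    # total map length in centiMorgans

total_markers = snp_markers + ssr_markers + s_locus
print(f"{total_markers} markers, {map_length_cm / total_markers:.2f} cM/marker")
# -> 2579 markers, 0.50 cM/marker, matching the reported density

# Heterozygosity percentages reported for the array
print(f"{1823 / 7867:.1%}, {1007 / 7867:.1%}")  # -> 23.2%, 12.8%
```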

Relevance: 30.00%

Abstract:

Background: Fruit- and vegetable-rich diets are associated with a reduced cardiovascular disease (CVD) risk. This protective effect may be a result of the phytochemicals present within fruits and vegetables (F&V). However, the phytochemical composition of whole F&V can vary considerably depending on growing location, cultivar, season and agricultural practices. Therefore, the present study investigated the effects of consuming fruits and vegetables as puree-based drinks (FVPD) daily on vasodilation, phytochemical bioavailability, antioxidant status and other CVD risk factors. FVPD was chosen to provide a standardised source of F&V material that could be delivered from the same batch to all subjects during each treatment arm of the study. Methods: Thirty-nine subjects completed the randomised, controlled, cross-over dietary intervention. Subjects were randomised to consume 200 mL of FVPD (or a fruit-flavoured control) daily for 6 weeks, with an 8-week washout period between treatments. Dietary intake was measured using two 5-day diet records during each cross-over arm of the study. Blood and urine samples were collected before and after each intervention, and vasodilation was assessed in 19 subjects using laser Doppler imaging with iontophoresis. Results: FVPD significantly increased dietary vitamin C and carotenoids (P < 0.001), and concomitantly increased plasma α- and β-carotene (P < 0.001), with a near-significant increase in endothelium-dependent vasodilation (P = 0.060). Conclusions: Overall, the findings obtained in the present study showed that FVPD were a useful vehicle to increase fruit and vegetable intake, significantly increasing dietary and plasma phytochemical concentrations with a trend towards increased endothelium-dependent vasodilation.

Relevance: 30.00%

Abstract:

Here we explore the physico-chemical properties of a peptide amphiphile obtained by chemical conjugation of the collagen-stimulating peptide KTTKS with 10,12-pentacosadiynoic acid, which photopolymerizes as a stable and extended polydiacetylene. We investigate the self-assembly of this new polymer and rationalize its peculiar behavior in terms of a thermal conformational transition. Surprisingly, this polymer shows a thermal transition associated with a non-cooperative increase in β-sheet content at high temperature.

Relevance: 30.00%

Abstract:

A detailed quantitative microstructural study coupled with cathodoluminescence and geochemical analyses on marbles from Naxos demonstrates that the analysis of microstructures is the most sensitive method to define the origin of marbles within, and between, different regions. Microstructure examination can only be used as an accurate provenance tool if a correction for the second-phase content is considered. If second phases are not considered, a large spread of different microstructures occurs within sample sites, making a separation between neighbouring outcrops difficult or impossible. Moreover, this study shows that the origin of a marble is defined more precisely if the microstructural observations are coupled with cathodoluminescence data.

Relevance: 30.00%

Abstract:

The butanol-HCl spectrophotometric assay is widely used for quantifying extractable and insoluble condensed tannins (CT, syn. proanthocyanidins) in foods, feeds, and foliage of herbaceous and woody plants, but the method underestimates total CT content when applied directly to plant material. To improve CT quantitation, we tested various cosolvents with butanol-HCl and found that acetone increased anthocyanidin yields from two forage Lotus species having contrasting procyanidin and prodelphinidin compositions. A butanol-HCl-iron assay run with 50% (v/v) acetone gave linear responses with Lotus CT standards and increased estimates of total CT in Lotus herbage and leaves by up to 3.2-fold over the conventional method run without acetone. The use of thiolysis to determine the purity of CT standards further improved quantitation. Gel-state ¹³C and ¹H–¹³C HSQC NMR spectra of insoluble residues collected after butanol-HCl assays revealed that acetone increased anthocyanidin yields by facilitating complete solubilization of CT from tissue.
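Because the assay's quantitation rests on a linear standard curve, the following minimal Python sketch shows the kind of calculation involved; the standard concentrations, absorbance values and the ct_from_absorbance helper are hypothetical illustrations, not data or code from the study.

```python
# Minimal sketch of standard-curve quantitation for a butanol-HCl assay.
# The standard concentrations and absorbances below are hypothetical.
import numpy as np

# Purified CT standard (mg/mL) vs. anthocyanidin absorbance (illustrative)
std_conc = np.array([0.0, 0.2, 0.4, 0.6, 0.8])
std_abs = np.array([0.02, 0.21, 0.43, 0.62, 0.84])

slope, intercept = np.polyfit(std_conc, std_abs, 1)  # linear response

def ct_from_absorbance(absorbance):
    """Interpolate a sample's CT concentration from the standard curve."""
    return (absorbance - intercept) / slope

print(f"{ct_from_absorbance(0.50):.2f} mg/mL CT equivalents")
```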

Relevance: 30.00%

Abstract:

Clinical pathways have been adopted for various diseases in clinical departments for quality improvement, as a result of the standardization of medical activities in the treatment process. Knowledge-based decision support on the basis of clinical pathways is a promising strategy for improving medical quality effectively. However, clinical pathway knowledge has not been fully integrated into the treatment process and thus cannot provide comprehensive support for actual work practice. This paper therefore proposes a knowledge-based clinical pathway management method that uses clinical knowledge to support and optimize medical practice. We have developed a knowledge-based clinical pathway management system to demonstrate how clinical pathway knowledge can comprehensively support the treatment process. Experience from the use of this system shows that treatment quality can be effectively improved by the extracted and classified clinical pathway knowledge, the seamless integration of patient-specific clinical pathway recommendations with medical tasks, and the evaluation of pathway deviations for optimization.
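As a minimal sketch of what checking practice against pathway knowledge can look like, the snippet below represents a pathway as a list of expected task codes and flags deviations; the task codes and the pathway_deviations helper are hypothetical and far simpler than the system described.

```python
# Minimal sketch of pathway-deviation checking. A clinical pathway is
# represented here as a list of expected task codes (hypothetical names);
# ordering and timing constraints are ignored for simplicity.
pathway = ["admission_assessment", "baseline_bloods", "imaging",
           "specialist_review", "discharge_plan"]

def pathway_deviations(executed_tasks):
    """Return tasks missing from, or unplanned relative to, the pathway."""
    expected, done = set(pathway), set(executed_tasks)
    return {"missing": expected - done, "unplanned": done - expected}

print(pathway_deviations(["admission_assessment", "imaging", "antibiotics"]))
```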

Relevance: 30.00%

Abstract:

In order to make best use of the opportunities provided by space missions such as the Radiation Belt Storm Probes, we determine the response of complementary subionospheric radiowave propagation measurements (VLF), riometer absorption measurements (CNA), and GPS-produced vertical total electron content (vTEC) to different energetic electron precipitation (EEP). We model the relative sensitivity and responses of these instruments to idealised monoenergetic beams of precipitating electrons, and to more realistic EEP spectra chosen to represent radiation belt and substorm precipitation. In the monoenergetic beam case, we find that riometers are more sensitive to the same EEP event occurring during the day than during the night, while subionospheric VLF shows the opposite relationship, and the change in vTEC is independent of local time. In general, the subionospheric VLF measurements are much more sensitive than the other two techniques for EEP over 200 keV, responding to flux magnitudes two to three orders of magnitude smaller than those detectable by a riometer. Detectable TEC changes only occur for extreme monoenergetic fluxes. For the radiation belt EEP case, clearly detectable subionospheric VLF responses are produced by daytime fluxes that are ~10 times lower than those required for riometers, while nighttime fluxes can be 10,000 times lower. Riometers are likely to respond only to radiation belt fluxes during the largest EEP events, and vTEC is unlikely to be significantly disturbed by radiation belt EEP. For the substorm EEP case, both the riometer absorption and the subionospheric VLF technique respond significantly, as does the change in vTEC, which is likely to be detectable at ~3-4 TECu.

Relevance: 30.00%

Abstract:

Introduction. Feature usage is a prerequisite to realising the benefits of investments in feature-rich systems. We propose that conceptualising the dependent variable 'system use' as 'level of use' and specifying it as a formative construct has greater value for measuring the post-adoption use of feature-rich systems. We then validate the content of the construct as a first step in developing a research instrument to measure it. The context of our study is the post-adoption use of electronic medical records (EMR) by primary care physicians. Method. Initially, a literature review of the empirical context defines the scope based on prior studies. Core features identified from the literature are then refined with the help of experts in a consensus-seeking process that follows the Delphi technique. Results. The methodology was successfully applied to EMRs, which were selected as an example of feature-rich systems. A review of EMR usage and regulatory standards provided the feature input for the first round of the Delphi process. A panel of experts then reached consensus after four rounds, identifying ten task-based features that would be indicators of level of use. Conclusions. To study why some users deploy more advanced features than others, theories of post-adoption require a rich formative dependent variable that measures level of use. We have demonstrated that a context-sensitive literature review followed by refinement through a consensus-seeking process is a suitable methodology for validating the content of this dependent variable. This is the first step of instrument development, prior to statistical confirmation with a larger sample.
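A minimal sketch of one consensus check in the spirit of the Delphi rounds described above; the feature names, vote counts, panel size and 80% threshold are all hypothetical, not the panel's actual data.

```python
# Minimal sketch of a Delphi-style consensus check: a feature is retained
# when the share of experts endorsing it meets a threshold. All values
# below are hypothetical illustrations.
votes = {"e_prescribing": 9, "problem_list": 8, "free_text_notes": 4}
panel_size = 10
threshold = 0.8  # hypothetical consensus level

retained = [f for f, v in votes.items() if v / panel_size >= threshold]
print(retained)  # features carried forward as candidate level-of-use indicators
```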

Relevance: 30.00%

Abstract:

Prism is a modular classification rule generation method based on the ‘separate and conquer’ approach, an alternative to the rule induction approach using decision trees, also known as ‘divide and conquer’. Prism often achieves a similar level of classification accuracy to decision trees, but tends to produce a more compact, noise-tolerant set of classification rules. As with other classification rule generation methods, a principal problem arising with Prism is overfitting due to over-specialised rules. In addition, over-specialised rules increase the associated computational complexity. These problems can be addressed by pruning methods. For the Prism method, two pruning algorithms have recently been introduced for reducing the overfitting of classification rules: J-pruning and Jmax-pruning. Both algorithms are based on the J-measure, an information-theoretic means of quantifying the theoretical information content of a rule. Jmax-pruning attempts to exploit the J-measure to its full potential, because J-pruning does not actually achieve this and may even lead to underfitting. A series of experiments has shown that Jmax-pruning may outperform J-pruning in reducing overfitting. However, Jmax-pruning is computationally relatively expensive and may also lead to underfitting. This paper reviews the Prism method and the two existing pruning algorithms above. It also proposes a novel pruning algorithm called Jmid-pruning. The latter is based on the J-measure and reduces overfitting to a similar level as the other two algorithms, but is better at avoiding underfitting and unnecessary computational effort. The authors conduct an experimental study of the performance of the Jmid-pruning algorithm in terms of classification accuracy and computational efficiency. The algorithm is also evaluated comparatively with the J-pruning and Jmax-pruning algorithms.
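All three pruning algorithms rest on the J-measure; below is a minimal Python sketch of that measure, following Smyth and Goodman's information-theoretic definition, with illustrative probabilities rather than values estimated from any particular training set.

```python
# Minimal sketch of the J-measure for a rule "IF y THEN x":
# J(X; Y=y) = p(y) * [ p(x|y) log2(p(x|y)/p(x))
#                    + (1-p(x|y)) log2((1-p(x|y))/(1-p(x))) ]
import math

def j_measure(p_y, p_x, p_x_given_y):
    """p_y: rule coverage; p_x: class prior; p_x_given_y: rule accuracy."""
    def term(posterior, prior):
        return posterior * math.log2(posterior / prior) if posterior > 0 else 0.0
    return p_y * (term(p_x_given_y, p_x) + term(1 - p_x_given_y, 1 - p_x))

# Illustrative rule covering 30% of instances, raising the class
# probability from a 0.5 prior to 0.9 where it fires:
print(f"{j_measure(p_y=0.3, p_x=0.5, p_x_given_y=0.9):.3f} bits")  # ~0.159
```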

Relevance: 30.00%

Abstract:

Social tagging has become very popular on the Internet as well as in research. The main idea behind tagging is to allow users to attach metadata to web content from their own perspective, to facilitate categorization and retrieval. Many factors influence users' tag choice, and many studies have been conducted to reveal these factors by analysing tagging data. This paper uses two theories to identify these factors, namely semiotics and activity theory. The former treats tags as signs and the latter treats tagging as an activity. The paper uses both theories to analyse tagging behaviour by explaining all aspects of a tagging system, including tags, tagging system components and the tagging activity. The theoretical analysis produced a framework that was used to identify a number of factors. These factors can be considered as categories that can be consulted to guide user tag choice in order to support particular tagging behaviour, such as cross-lingual tagging.

Relevance: 30.00%

Abstract:

There are well-known difficulties in measuring the moisture content of baked goods (such as bread, buns, biscuits, crackers and cake) during baking or at the oven exit; in this paper several sensing methods are discussed, but none of them is able to provide direct measurement with sufficient precision. An alternative is to use indirect inferential methods. Some of these involve dynamic modelling, incorporating thermal properties and using techniques familiar from computational fluid dynamics (CFD); a method of this class that has been used to model heat and mass transfer in one direction during baking is summarized, and it may be extended to model the transport of moisture within the product and within the surrounding atmosphere. The concept of injecting heat during baking in proportion to the calculated heat load on the oven has been implemented in a control scheme based on a zone-by-zone heat balance through a continuous baking oven, taking advantage of the high latent heat of evaporation of water. Tests on biscuit production ovens are reported, with results that support the claim that the scheme gives a more reproducible water distribution in the final product than conventional closed-loop control of zone ambient temperatures, thus enabling water content to be held more closely within tolerance.
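A minimal sketch of the feedforward idea: compute a zone heat load dominated by the latent heat of the water to be evaporated and inject heat in proportion. The throughput, moisture and sensible-heat figures are hypothetical, not values from the reported oven tests.

```python
# Minimal sketch of a zone heat-load calculation for a continuous oven.
# All process figures below are hypothetical illustrations.
LATENT_HEAT_WATER = 2.26e6  # J/kg, latent heat of evaporation (approx.)

def zone_heat_load(mass_flow_kg_s, moisture_in, moisture_out,
                   sensible_j_per_kg=1.5e5):
    """Heat load (W) for one zone: evaporation plus a sensible-heat term."""
    water_removed = mass_flow_kg_s * (moisture_in - moisture_out)  # kg/s
    return water_removed * LATENT_HEAT_WATER + mass_flow_kg_s * sensible_j_per_kg

# Dough at 0.5 kg/s dried from 30% to 25% moisture in this zone:
print(f"{zone_heat_load(0.5, 0.30, 0.25):.0f} W")  # heat to inject, ~131.5 kW
```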

Relevance: 30.00%

Abstract:

Ancestral human populations had diets containing more indigestible plant material than present-day diets in industrialized countries. One hypothesis for the rise in the prevalence of obesity is that physiological mechanisms for controlling appetite evolved to match a diet with a plant fiber content higher than that of present-day diets. We investigated how diet affects gut microbiota and colon cells by comparing human microbial communities with those from a primate that has an extreme plant-based diet, namely, the gelada baboon, which is a grazer. The effects of potato (high starch) versus grass (high lignin and cellulose) diets on human-derived versus gelada-derived fecal communities were compared in vitro. We especially focused on the production of short-chain fatty acids, which are hypothesized to be key metabolites influencing appetite regulation pathways. The results confirmed that diet has a major effect on bacterial numbers, short-chain fatty acid production, and the release of hormones involved in appetite suppression. The potato diet yielded greater production of short-chain fatty acids and greater hormone release than the grass diet, even in the gelada cultures, which we had expected to be better adapted to the grass diet. The strong effects of diet on hormone release could not be explained, however, solely by short-chain fatty acid concentrations. Nuclear magnetic resonance spectroscopy found changes in additional metabolites, including betaine and isoleucine, that might play key roles in inhibiting and stimulating appetite suppression pathways. Our results indicate that a broader array of metabolites might be involved in triggering gut hormone release in humans than previously thought. IMPORTANCE: One theory for rising levels of obesity in Western populations is that the body's mechanisms for controlling appetite evolved to match ancestral diets with more low-energy plant foods. We investigated this idea by comparing the effects of diet on appetite suppression pathways via gut bacterial communities from humans and gelada baboons, which are modern-day primates with an extreme diet of low-energy plant food, namely, grass. We found that diet does play a major role in affecting gut bacteria and the production of a hormone that suppresses appetite, but not in the direction predicted by the ancestral diet hypothesis. Also, the bacterial products correlated with hormone release were different from those normally thought to play this role. By comparing microbiota and diets outside the natural range for modern humans, we found a relationship between diet and appetite pathways that was more complex than previously hypothesized on the basis of more controlled studies of the effects of single compounds.

Relevance: 30.00%

Abstract:

To overcome divergent estimates produced from the same data, the proposed digital costing process adopts an integrated information-system design in which process knowledge and the costing system are designed together. By employing and extending a widely used international standard, Industry Foundation Classes (IFC), the system provides an integrated process that can harvest information and knowledge from current quantity surveying practice, covering both costing methods and data. Knowledge of quantification is encoded from the literature, a motivating case and standards, and can reduce the time consumed by current manual practice. Further development will represent the pricing process using a Bayesian-network-based knowledge representation approach. These hybrid types of knowledge representation can produce reliable estimates for construction projects. In practical terms, knowledge management in quantity surveying can improve construction estimation. The theoretical significance of this study lies in the fact that its content and conclusions make it possible to develop an automatic estimation system based on a hybrid knowledge representation approach.
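To make the Bayesian-network idea concrete, here is a minimal Python sketch of pricing knowledge as a tiny two-node network (market condition → unit rate); the structure, states, probabilities and rates are hypothetical illustrations of the representation, not the proposed system.

```python
# Minimal sketch: expected unit rate from a two-node Bayesian network.
# All states, probabilities and rates below are hypothetical.
p_market = {"stable": 0.7, "volatile": 0.3}        # P(market condition)
p_rate_given_market = {                            # P(unit rate | market)
    "stable":   {"low": 0.6, "high": 0.4},
    "volatile": {"low": 0.2, "high": 0.8},
}
rate_value = {"low": 90.0, "high": 120.0}          # cost per unit (illustrative)

# Expected unit rate, marginalising over the market condition
expected = sum(
    p_m * p_r * rate_value[r]
    for m, p_m in p_market.items()
    for r, p_r in p_rate_given_market[m].items()
)
print(f"expected unit rate: {expected:.2f}")  # -> 105.60
```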

Relevance: 30.00%

Abstract:

The preparation of nonaqueous microemulsions using food-acceptable components is reported. The effect of oil on the formation of microemulsions stabilized by lecithin (Epikuron 200) and containing propylene glycol as immiscible solvent was investigated. When the triglycerides were used as oil, three types of phase behavior were noted, namely, a two-phase cloudy region (occurring at low lecithin concentrations), a liquid crystalline (LC) phase (occurring at high surfactant and low oil concentrations), and a clear monophasic microemulsion region. The extent of this clear one-phase region was found to be dependent upon the molecular volume of the oil being solubilized. Large molecular volume oils, such as soybean and sunflower oils, produced a small microemulsion region, whereas the smallest molecular volume triglyceride, tributyrin, produced a large, clear monophasic region. Use of the ethyl ester, ethyl oleate, as oil produced a clear, monophasic region of a size comparable to that seen with tributyrin. Substitution of some of the propylene glycol with water greatly reduced the extent of the clear one-phase region and increased the extent of the liquid crystalline region. In contrast, ethanol enhanced the clear, monophasic region by decreasing the LC phase. Replacement of some of the lecithin with the micelle-forming nonionic surfactant Tween 80 to produce mixed lecithin/Tween 80 mixtures of weight ratios (Km) 1:2 and 1:3 did not significantly alter the phase behavior, although there was a marginal increase in the area of the two-phase, cloudy region of the phase diagram. The use of the lower phosphatidylcholine content lecithin, Epikuron 170, in place of Epikuron 200 resulted in a reduction in the LC region for all of the systems investigated. In conclusion, these studies show that it is possible to prepare one-phase, clear lecithin-based microemulsions over a wide range of compositions using components that are food-acceptable.

Relevance: 30.00%

Abstract:

We present a novel method for retrieving high-resolution, three-dimensional (3-D) nonprecipitating cloud fields in both overcast and broken-cloud situations. The method uses scanning cloud radar and multiwavelength zenith radiances to obtain gridded 3-D liquid water content (LWC) and effective radius (re) and 2-D column mean droplet number concentration (Nd). By using an adaption of the ensemble Kalman filter, radiances are used to constrain the optical properties of the clouds using a forward model that employs full 3-D radiative transfer while also providing full error statistics given the uncertainty in the observations. To evaluate the new method, we first perform retrievals using synthetic measurements from a challenging cumulus cloud field produced by a large-eddy simulation snapshot. Uncertainty due to measurement error in overhead clouds is estimated at 20% in LWC and 6% in re, but the true error can be greater due to uncertainties in the assumed droplet size distribution and radiative transfer. Over the entire domain, LWC and re are retrieved with average error 0.05–0.08 g m⁻³ and ~2 μm, respectively, depending on the number of radiance channels used. The method is then evaluated using real data from the Atmospheric Radiation Measurement program Mobile Facility at the Azores. Two case studies are considered, one stratocumulus and one cumulus. Where available, the liquid water path retrieved directly above the observation site was found to be in good agreement with independent values obtained from microwave radiometer measurements, with an error of 20 g m⁻².
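The retrieval adapts the ensemble Kalman filter; below is a minimal, generic Python sketch of a stochastic EnKF analysis step, with a random linear operator standing in for the paper's 3-D radiative transfer forward model, so every number here is illustrative.

```python
# Minimal sketch of a stochastic ensemble Kalman filter analysis step.
# A random linear operator H stands in for the radiative transfer model;
# the state would be, e.g., gridded LWC, the observations zenith radiances.
import numpy as np

rng = np.random.default_rng(0)
n_state, n_obs, n_ens = 50, 5, 100

H = rng.normal(size=(n_obs, n_state))        # stand-in forward operator
R = 0.1 * np.eye(n_obs)                      # observation-error covariance
ensemble = rng.normal(size=(n_state, n_ens)) # prior ensemble of states
y_obs = rng.normal(size=n_obs)               # "observed" radiances (synthetic)

# Ensemble anomalies in state and observation space
X = ensemble - ensemble.mean(axis=1, keepdims=True)
Y = H @ ensemble
Yp = Y - Y.mean(axis=1, keepdims=True)

# Kalman gain from ensemble sample covariances: K = Pxy (Pyy + R)^-1
K = (X @ Yp.T) @ np.linalg.inv(Yp @ Yp.T + (n_ens - 1) * R)

# Update each member against perturbed observations (stochastic EnKF)
perturbed = y_obs[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
analysis = ensemble + K @ (perturbed - Y)
print(analysis.shape)  # posterior ensemble: mean = retrieval, spread = error
```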