Abstract:
A detailed quantitative microstructural study, coupled with cathodoluminescence and geochemical analyses, of marbles from Naxos demonstrates that the analysis of microstructures is the most sensitive method for defining the origin of marbles within, and between, different regions. Microstructure examination can only be used as an accurate provenance tool, however, if a correction for the second-phase content is applied. If second phases are not considered, a large spread of different microstructures occurs within sample sites, making it difficult or impossible to separate neighbouring outcrops. Moreover, this study shows that the origin of a marble is defined more precisely if the microstructural observations are coupled with cathodoluminescence data.
Abstract:
The butanol-HCl spectrophotometric assay is widely used for quantifying extractable and insoluble condensed tannins (CT, syn. proanthocyanidins) in foods, feeds, and foliage of herbaceous and woody plants, but the method underestimates total CT content when applied directly to plant material. To improve CT quantitation, we tested various cosolvents with butanol-HCl and found that acetone increased anthocyanidin yields from two forage Lotus species having contrasting procyanidin and prodelphinidin compositions. A butanol-HCl-iron assay run with 50% (v/v) acetone gave linear responses with Lotus CT standards and increased estimates of total CT in Lotus herbage and leaves by up to 3.2-fold over the conventional method run without acetone. The use of thiolysis to determine the purity of CT standards further improved quantitation. Gel-state 13C and 1H–13C HSQC NMR spectra of insoluble residues collected after butanol-HCl assays revealed that acetone increased anthocyanidin yields by facilitating complete solubilization of CT from tissue.
Abstract:
Clinical pathways have been adopted for various diseases in clinical departments to improve quality through the standardization of medical activities in the treatment process. Knowledge-based decision support built on clinical pathways is a promising strategy for improving medical quality effectively. However, clinical pathway knowledge has not been fully integrated into the treatment process and thus cannot provide comprehensive support to actual work practice. This paper therefore proposes a knowledge-based clinical pathway management method that makes use of clinical knowledge to support and optimize medical practice. We have developed a knowledge-based clinical pathway management system to demonstrate how clinical pathway knowledge can comprehensively support the treatment process. Experience from the use of this system shows that treatment quality can be effectively improved by extracted and classified clinical pathway knowledge, by the seamless integration of patient-specific clinical pathway recommendations with medical tasks, and by the evaluation of pathway deviations for optimization.
Abstract:
In order to make the best use of the opportunities provided by space missions such as the Radiation Belt Storm Probes, we determine the response of complementary subionospheric radiowave propagation measurements (VLF), riometer absorption measurements (CNA), and GPS-produced total electron content (vTEC) to different types of energetic electron precipitation (EEP). We model the relative sensitivity and responses of these instruments to idealised monoenergetic beams of precipitating electrons, and to more realistic EEP spectra chosen to represent radiation belt and substorm precipitation. In the monoenergetic beam case, we find that riometers are more sensitive to the same EEP event occurring during the day than during the night, while subionospheric VLF shows the opposite relationship, and the change in vTEC is independent of local time. In general, the subionospheric VLF measurements are much more sensitive than the other two techniques for EEP over 200 keV, responding to flux magnitudes two to three orders of magnitude smaller than those detectable by a riometer. Detectable TEC changes occur only for extreme monoenergetic fluxes. For the radiation belt EEP case, clearly detectable subionospheric VLF responses are produced by daytime fluxes that are ~10 times lower than those required for riometers, while nighttime fluxes can be 10,000 times lower. Riometers are likely to respond only to radiation belt fluxes during the largest EEP events, and vTEC is unlikely to be significantly disturbed by radiation belt EEP. For the substorm EEP case, both the riometer absorption and the subionospheric VLF technique respond significantly, as does the change in vTEC, which is likely to be detectable at ~3-4 TECu.
Abstract:
Introduction. Feature usage is a prerequisite to realising the benefits of investments in feature-rich systems. We propose that conceptualising the dependent variable 'system use' as 'level of use' and specifying it as a formative construct has greater value for measuring the post-adoption use of feature-rich systems. We then validate the content of the construct as a first step in developing a research instrument to measure it. The context of our study is the post-adoption use of electronic medical records (EMR) by primary care physicians. Method. Initially, a literature review of the empirical context defines the scope based on prior studies. The core features identified from the literature are then further refined with the help of experts in a consensus-seeking process that follows the Delphi technique. Results. The methodology was successfully applied to EMRs, which were selected as an example of feature-rich systems. A review of EMR usage and regulatory standards provided the feature input for the first round of the Delphi process. A panel of experts then reached consensus after four rounds, identifying ten task-based features that would serve as indicators of level of use. Conclusions. To study why some users deploy more advanced features than others, theories of post-adoption require a rich formative dependent variable that measures level of use. We have demonstrated that a context-sensitive literature review followed by refinement through a consensus-seeking process is a suitable methodology for validating the content of this dependent variable. This is the first step of instrument development, prior to statistical confirmation with a larger sample.
Abstract:
Prism is a modular classification rule generation method based on the 'separate and conquer' approach, an alternative to the rule induction approach using decision trees, also known as 'divide and conquer'. Prism often achieves a level of classification accuracy similar to that of decision trees, but tends to produce a more compact, noise-tolerant set of classification rules. As with other classification rule generation methods, a principal problem arising with Prism is overfitting due to over-specialised rules. In addition, over-specialised rules increase the associated computational complexity. These problems can be addressed by pruning methods. For the Prism method, two pruning algorithms have recently been introduced for reducing the overfitting of classification rules: J-pruning and Jmax-pruning. Both algorithms are based on the J-measure, an information-theoretic means of quantifying the theoretical information content of a rule. Jmax-pruning attempts to exploit the J-measure to its full potential, since J-pruning does not actually achieve this and may even lead to underfitting. A series of experiments has shown that Jmax-pruning may outperform J-pruning in reducing overfitting. However, Jmax-pruning is computationally relatively expensive and may also lead to underfitting. This paper reviews the Prism method and the two existing pruning algorithms above. It also proposes a novel pruning algorithm called Jmid-pruning. The latter is based on the J-measure and reduces overfitting to a level similar to the other two algorithms, but is better at avoiding underfitting and unnecessary computational effort. The authors conduct an experimental study of the performance of the Jmid-pruning algorithm in terms of classification accuracy and computational efficiency. The algorithm is also evaluated comparatively with the J-pruning and Jmax-pruning algorithms.
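For reference, the J-measure of a rule 'if y then x' is conventionally defined, following Smyth and Goodman, as the probability of the rule's antecedent multiplied by the cross-entropy between the posterior and prior distributions of the consequent. A minimal sketch in Python, assuming that standard definition; the function name and example probabilities are illustrative and not taken from the paper:

```python
import math

def j_measure(p_y, p_x, p_x_given_y):
    """J-measure of a rule 'if y then x' (Smyth & Goodman).

    p_y         -- P(y), probability of the rule's antecedent
    p_x         -- P(x), prior probability of the consequent
    p_x_given_y -- P(x|y), probability of the consequent given the antecedent
    """
    def term(p, q):
        # One cross-entropy term; taken as zero when p == 0 by convention.
        return 0.0 if p == 0 else p * math.log2(p / q)

    # j(x; y): cross-entropy between the posterior and prior of the consequent.
    j = term(p_x_given_y, p_x) + term(1 - p_x_given_y, 1 - p_x)
    return p_y * j

# Example: the antecedent covers 30% of instances; the consequent holds in
# 20% of all instances but in 90% of the covered ones.
print(j_measure(0.3, 0.2, 0.9))  # ~0.50 bits
```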
Abstract:
We present five new cloud detection algorithms over land based on dynamic-threshold or Bayesian techniques, applicable to the Advanced Along Track Scanning Radiometer (AATSR) instrument, and compare these with the standard threshold-based SADIST cloud detection scheme. We use a manually classified dataset as a reference to assess algorithm performance and quantify the impact of each cloud detection scheme on land surface temperature (LST) retrieval. The use of probabilistic Bayesian cloud detection methods improves algorithm true skill scores by 8-9 % over SADIST (maximum score of 77.93 % compared to 69.27 %). We present an assessment of the impact of imperfect cloud masking, in relation to the reference cloud mask, on the retrieved AATSR LST, imposing a 2 K tolerance over a 3 × 3 pixel domain. We find an increase of 5-7 % in the observations falling within this tolerance when using Bayesian methods (maximum of 92.02 % compared to 85.69 %). We also demonstrate that the use of dynamic thresholds in the tests employed by SADIST can significantly improve performance, applicable to the cloud-test data to be provided by the Sea and Land Surface Temperature Radiometer (SLSTR), due to be launched on the Sentinel-3 mission (estimated 2014).
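As a reminder of the metric quoted above: the true skill score for a binary cloud mask is commonly computed from a 2 × 2 contingency table against the reference classification as the hit rate minus the false alarm rate (the Hanssen-Kuipers definition). A minimal sketch under that assumption, since the paper's exact formulation is not spelled out here; the example counts are illustrative:

```python
def true_skill_score(hits, misses, false_alarms, correct_negatives):
    """Hanssen-Kuipers true skill score for a binary classifier.

    TSS = hit rate - false alarm rate; ranges from -1 to 1,
    where 1 is perfect discrimination and 0 is no skill.
    """
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate - false_alarm_rate

# Example with illustrative counts against a manually classified reference.
print(true_skill_score(hits=900, misses=100,
                       false_alarms=120, correct_negatives=880))  # 0.78
```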
Abstract:
Satellite data are increasingly used to provide observation-based estimates of the effects of aerosols on climate. The Aerosol-cci project, part of the European Space Agency's Climate Change Initiative (CCI), was designed to provide essential climate variables for aerosols from satellite data. Eight algorithms, developed for the retrieval of aerosol properties using data from AATSR (4), MERIS (3) and POLDER, were evaluated to determine their suitability for climate studies. The primary result from each of these algorithms is the aerosol optical depth (AOD) at several wavelengths, together with the Ångström exponent (AE), which describes the spectral variation of the AOD for a given wavelength pair. Other aerosol parameters that can be retrieved from satellite observations are not considered in this paper. The AOD and AE (AE only for Level 2) were evaluated against independent collocated observations from the ground-based AERONET sun photometer network and against "reference" satellite data provided by MODIS and MISR. Tools used for the evaluation were developed for daily products as produced by the retrieval with a spatial resolution of 10 × 10 km² (Level 2) and for daily or monthly aggregates (Level 3). These tools include statistics for L2 and L3 products compared with AERONET, as well as scoring based on spatial and temporal correlations. In this paper we describe their use in a round-robin (RR) evaluation of four months of data, one month for each season in 2008. The evaluation was restricted to four months because of the large effort required to improve the algorithms and to assess their improvement and current status before larger data sets are processed. Evaluation criteria are discussed. The results presented show the current status of the European aerosol algorithms in comparison to AERONET as well as MODIS and MISR data. The comparison leads to the preliminary conclusion that the scores are similar, including those for the references, but that the coverage of AATSR needs to be enhanced and further improvements are possible for most algorithms. None of the algorithms, including the references, outperforms all others everywhere. AATSR data can be used for the retrieval of AOD and AE over land and ocean. PARASOL and one of the MERIS algorithms have been evaluated over ocean only, and both provide good results.
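The Ångström exponent mentioned above follows from the AOD at a wavelength pair through the standard power-law relation AOD(λ) ∝ λ^(-AE). A minimal sketch; the wavelengths and AOD values in the example are illustrative:

```python
import math

def angstrom_exponent(aod_1, aod_2, wavelength_1, wavelength_2):
    """Angstrom exponent for a wavelength pair (wavelengths in the same units)."""
    return -math.log(aod_1 / aod_2) / math.log(wavelength_1 / wavelength_2)

# Example: AOD of 0.20 at 550 nm and 0.12 at 870 nm.
print(angstrom_exponent(0.20, 0.12, 550.0, 870.0))  # ~1.11
```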
Abstract:
There are well-known difficulties in measuring the moisture content of baked goods (such as bread, buns, biscuits, crackers and cake) during baking or at the oven exit; in this paper several sensing methods are discussed, but none of them is able to provide direct measurement with sufficient precision. An alternative is to use indirect inferential methods. Some of these methods involve dynamic modelling, incorporating thermal properties and using techniques familiar from computational fluid dynamics (CFD); a method of this class that has been used for modelling heat and mass transfer in one direction during baking is summarized, and may be extended to model the transport of moisture within the product and within the surrounding atmosphere. The concept of injecting heat during the baking process in proportion to the calculated heat load on the oven has been implemented in a control scheme based on a zone-by-zone heat balance through a continuous baking oven, taking advantage of the high latent heat of evaporation of water. Tests on biscuit production ovens are reported, with results that support the claim that the scheme gives a more reproducible water distribution in the final product than conventional closed-loop control of zone ambient temperatures, thus enabling water content to be held more closely within tolerance.
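The zone-by-zone heat balance can be illustrated with a rough per-zone load calculation: a latent load from the moisture evaporated in the zone plus a sensible load to heat the product passing through it. This is a minimal sketch only; the property values and the two-term split are assumptions for illustration, not the reported scheme:

```python
# Illustrative property values (assumed, not from the paper).
LATENT_HEAT_WATER = 2.26e6      # J/kg, latent heat of evaporation near 100 C
PRODUCT_SPECIFIC_HEAT = 2500.0  # J/(kg K), assumed for biscuit dough

def zone_heat_load(water_evaporated_kg_s, product_kg_s, temp_rise_k):
    """Approximate heat load on one oven zone, in watts."""
    latent = water_evaporated_kg_s * LATENT_HEAT_WATER               # evaporating moisture
    sensible = product_kg_s * PRODUCT_SPECIFIC_HEAT * temp_rise_k    # heating the product
    return latent + sensible

# Example: 0.01 kg/s of water evaporated while 0.5 kg/s of product is heated by 40 K.
print(zone_heat_load(0.01, 0.5, 40.0))  # ~72.6 kW
```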
Abstract:
Retrieving a subset of items can cause the forgetting of other items, a phenomenon referred to as retrieval-induced forgetting. According to some theorists, retrieval-induced forgetting is the consequence of an inhibitory mechanism that acts to reduce the accessibility of non-target items that interfere with the retrieval of target items. Other theorists argue that inhibition is unnecessary to account for retrieval-induced forgetting, contending instead that the phenomenon can be best explained by non-inhibitory mechanisms, such as strength-based competition or blocking. The current paper provides the first major meta-analysis of retrieval-induced forgetting, conducted with the primary purpose of quantitatively evaluating the multitude of findings that have been used to contrast these two theoretical viewpoints. The results largely supported inhibition accounts, but also provided some challenging evidence, with the nature of the results often varying as a function of how retrieval-induced forgetting was assessed. Implications for further research and theory development are discussed.
Abstract:
Ancestral human populations had diets containing more indigestible plant material than present-day diets in industrialized countries. One hypothesis for the rise in the prevalence of obesity is that physiological mechanisms for controlling appetite evolved to match a diet with a plant fiber content higher than that of present-day diets. We investigated how diet affects gut microbiota and colon cells by comparing human microbial communities with those from a primate that has an extreme plant-based diet, namely, the gelada baboon, which is a grazer. The effects of potato (high starch) versus grass (high lignin and cellulose) diets on human-derived versus gelada-derived fecal communities were compared in vitro. We focused especially on the production of short-chain fatty acids, which are hypothesized to be key metabolites influencing appetite regulation pathways. The results confirmed that diet has a major effect on bacterial numbers, short-chain fatty acid production, and the release of hormones involved in appetite suppression. The potato diet yielded greater production of short-chain fatty acids and greater hormone release than the grass diet, even in the gelada cultures, which we had expected to be better adapted to the grass diet. The strong effects of diet on hormone release could not, however, be explained solely by short-chain fatty acid concentrations. Nuclear magnetic resonance spectroscopy revealed changes in additional metabolites, including betaine and isoleucine, that might play key roles in inhibiting and stimulating appetite suppression pathways. Our study results indicate that a broader array of metabolites might be involved in triggering gut hormone release in humans than previously thought. IMPORTANCE: One theory for rising levels of obesity in western populations is that the body's mechanisms for controlling appetite evolved to match ancestral diets with more low-energy plant foods. We investigated this idea by comparing the effects of diet on appetite suppression pathways via the use of gut bacterial communities from humans and gelada baboons, which are modern-day primates with an extreme diet of low-energy plant food, namely, grass. We found that diet does play a major role in affecting gut bacteria and the production of a hormone that suppresses appetite, but not in the direction predicted by the ancestral diet hypothesis. Also, the bacterial products that correlated with hormone release were different from those normally thought to play this role. By comparing microbiota and diets outside the natural range for modern humans, we found a relationship between diet and appetite pathways that is more complex than previously hypothesized on the basis of more controlled studies of the effects of single compounds.
Abstract:
To overcome divergent estimates produced from the same data, the proposed digital costing process adopts an integrated information system design in which the process knowledge and the costing system are designed together. By employing and extending a widely used international standard, Industry Foundation Classes, the system can provide an integrated process that harvests the information and knowledge of current quantity surveying practice, covering both costing methods and data. Knowledge of quantification is encoded from the literature, a motivating case, and standards, and can reduce the time consumed by current manual practice. Further development will represent the pricing process using a Bayesian network based knowledge representation approach. These hybrid types of knowledge representation can produce reliable estimates for construction projects. In practical terms, the knowledge management of quantity surveying can improve construction estimation systems. The theoretical significance of this study lies in the fact that its content and conclusions make it possible to develop an automatic estimation system based on a hybrid knowledge representation approach.
Abstract:
The preparation of nonaqueous microemulsions using food-acceptable components is reported. The effect of oil on the formation of microemulsions stabilized by lecithin (Epikuron 200) and containing propylene glycol as immiscible solvent was investigated. When the triglycerides were used as oil, three types of phase behavior were noted, namely, a two-phase cloudy region (occurring at low lecithin concentrations), a liquid crystalline (LC) phase (occurring at high surfactant and low oil concentrations), and a clear monophasic microemulsion region. The extent of this clear one-phase region was found to be dependent upon the molecular volume of the oil being solubilized. Large molecular volume oils, such as soybean and sunflower oils, produced a small microemulsion region, whereas the smallest molecular volume triglyceride, tributyrin, produced a large, clear monophasic region. Use of the ethyl ester, ethyl oleate, as oil produced a clear, monophasic region of a size comparable to that seen with tributyrin. Substitution of some of the propylene glycol with water greatly reduced the extent of the clear one-phase region and increased the extent of the liquid crystalline region. In contrast, ethanol enhanced the clear, monophasic region by decreasing the LC phase. Replacement of some of the lecithin with the micelle-forming nonionic surfactant Tween 80 to produce mixed lecithin/Tween 80 mixtures of weight ratios (Km) 1:2 and 1:3 did not significantly alter the phase behavior, although there was a marginal increase in the area of the two-phase, cloudy region of the phase diagram. The use of the lower phosphatidylcholine content lecithin, Epikuron 170, in place of Epikuron 200 resulted in a reduction in the LC region for all of the systems investigated. In conclusion, these studies show that it is possible to prepare one-phase, clear lecithin-based microemulsions over a wide range of compositions using components that are food-acceptable.
Abstract:
We present a novel method for retrieving high-resolution, three-dimensional (3-D) nonprecipitating cloud fields in both overcast and broken-cloud situations. The method uses scanning cloud radar and multiwavelength zenith radiances to obtain gridded 3-D liquid water content (LWC) and effective radius (re) and 2-D column mean droplet number concentration (Nd). By using an adaptation of the ensemble Kalman filter, radiances are used to constrain the optical properties of the clouds with a forward model that employs full 3-D radiative transfer, while also providing full error statistics given the uncertainty in the observations. To evaluate the new method, we first perform retrievals using synthetic measurements from a challenging cumulus cloud field produced by a large-eddy simulation snapshot. Uncertainty due to measurement error in overhead clouds is estimated at 20% in LWC and 6% in re, but the true error can be greater due to uncertainties in the assumed droplet size distribution and radiative transfer. Over the entire domain, LWC and re are retrieved with average errors of 0.05–0.08 g m-3 and ~2 μm, respectively, depending on the number of radiance channels used. The method is then evaluated using real data from the Atmospheric Radiation Measurement program Mobile Facility at the Azores. Two case studies are considered, one stratocumulus and one cumulus. Where available, the liquid water path retrieved directly above the observation site was found to be in good agreement with independent values obtained from microwave radiometer measurements, with an error of 20 g m-2.
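The ensemble-based update underlying such a retrieval can be sketched generically. Below is a minimal stochastic ensemble Kalman filter analysis step in Python; the state vector, observations, and forward model are left abstract, and this is the textbook update rather than the authors' adapted implementation:

```python
import numpy as np

def enkf_update(ensemble, observations, obs_error_std, forward_model, rng):
    """Stochastic ensemble Kalman filter analysis step.

    ensemble      -- (n_state, n_members) array of state vectors (e.g. gridded LWC)
    observations  -- (n_obs,) measured values (e.g. zenith radiances)
    obs_error_std -- observation error standard deviation
    forward_model -- function mapping a state vector to predicted observations
    rng           -- numpy random generator, e.g. np.random.default_rng()
    """
    n_members = ensemble.shape[1]

    # Predicted observations per member (full 3-D radiative transfer in the paper).
    predicted = np.column_stack(
        [forward_model(ensemble[:, i]) for i in range(n_members)])

    # Anomalies (deviations from the ensemble means).
    state_anom = ensemble - ensemble.mean(axis=1, keepdims=True)
    pred_anom = predicted - predicted.mean(axis=1, keepdims=True)

    # Sample covariances and the Kalman gain.
    cov_sp = state_anom @ pred_anom.T / (n_members - 1)
    cov_pp = pred_anom @ pred_anom.T / (n_members - 1)
    obs_cov = obs_error_std**2 * np.eye(len(observations))
    gain = cov_sp @ np.linalg.inv(cov_pp + obs_cov)

    # Update each member against perturbed observations.
    perturbed = observations[:, None] + rng.normal(0.0, obs_error_std, predicted.shape)
    return ensemble + gain @ (perturbed - predicted)
```

The spread of the analysis ensemble returned by such an update is what supplies the per-pixel error statistics of the kind quoted in the abstract.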
Abstract:
Across five experiments, the temporal regularity and content of an irrelevant speech stream were varied and their effects on a serial recall task examined. Variations of the content, but not the rhythm, of the irrelevant speech stimuli reliably disrupted serial recall performance in all experiments. Bayesian analyses supported the null hypothesis over the hypothesis that irregular rhythms would disrupt memory to a greater extent than regular rhythms. Pooling the data in a combined analysis revealed that regular presentation of the irrelevant speech was significantly more disruptive to serial recall than irregular presentation. These results are consistent with the idea that auditory distraction is sensitive to both intra-item and inter-item relations and challenge an orienting-based account of auditory distraction.