100 results for Content-Based Retrieval
Abstract:
The high complexity of cloud parameterizations now held in models puts more pressure on observational studies to provide useful means to evaluate them. One approach to the problem put forth in the modelling community is to evaluate under what atmospheric conditions the parameterizations fail to simulate the cloud properties and under what conditions they do a good job. It is the ambition of this paper to characterize the variability of the statistical properties of tropical ice clouds in different tropical "regimes" recently identified in the literature, to aid the development of better process-oriented parameterizations in models. For this purpose, the statistical properties of non-precipitating tropical ice clouds over Darwin, Australia, are characterized using ground-based radar-lidar observations from the Atmospheric Radiation Measurement (ARM) Program. The ice cloud properties analysed are the frequency of ice cloud occurrence, the morphological properties (cloud top height and thickness), and the microphysical and radiative properties (ice water content, visible extinction, effective radius, and total concentration). The variability of these tropical ice cloud properties is then studied as a function of the large-scale cloud regimes derived from the International Satellite Cloud Climatology Project (ISCCP), the amplitude and phase of the Madden-Julian Oscillation (MJO), and the large-scale atmospheric regime as derived from a long-term record of radiosonde observations over Darwin. The vertical variability of ice cloud occurrence and microphysical properties is largest in all regimes (typically 1.5 orders of magnitude for ice water content and extinction, a factor of 3 in effective radius, and three orders of magnitude in concentration). 98 % of ice clouds in our dataset are characterized by either a small cloud fraction (smaller than 0.3) or a very large cloud fraction (larger than 0.9).
In the ice part of the troposphere, three distinct layers characterized by different statistically dominant microphysical processes are identified. The variability of the ice cloud properties as a function of the large-scale atmospheric regime, cloud regime, and MJO phase is large, producing mean differences of up to a factor of 8 in the frequency of ice cloud occurrence between large-scale atmospheric regimes, and mean differences of typically a factor of 2 in all microphysical properties. Finally, the diurnal cycle of the frequency of occurrence of ice clouds is also very different between regimes and MJO phases, with diurnal amplitudes of the vertically-integrated frequency of ice cloud occurrence ranging from as low as 0.2 (weak diurnal amplitude) to values in excess of 2.0 (very large diurnal amplitude). Modellers should now use these results to check whether their model cloud parameterizations are capable of translating a given atmospheric forcing into the correct statistical ice cloud properties.
Abstract:
The need for consistent assimilation of satellite measurements for numerical weather prediction led operational meteorological centers to assimilate satellite radiances directly using variational data assimilation systems. More recently there has been a renewed interest in assimilating satellite retrievals (e.g., to avoid the use of relatively complicated radiative transfer models as observation operators for data assimilation). The aim of this paper is to provide a rigorous and comprehensive discussion of the conditions for the equivalence between radiance and retrieval assimilation. It is shown that two requirements need to be satisfied for the equivalence: (i) the radiance observation operator needs to be approximately linear in a region of the state space centered at the retrieval and with a radius of the order of the retrieval error; and (ii) any prior information used to constrain the retrieval should not underrepresent the variability of the state, so as to retain the information content of the measurements. Both these requirements can be tested in practice. When these requirements are met, retrievals can be transformed so as to represent only the portion of the state that is well constrained by the original radiance measurements and can be assimilated in a consistent and optimal way, by means of an appropriate observation operator and a unit matrix as error covariance. Finally, specific cases when retrieval assimilation can be more advantageous (e.g., when the estimate sought by the operational assimilation system depends on the first guess) are discussed.
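The two requirements can be made concrete in standard linear retrieval notation (Rodgers' optimal estimation framework); the following is a hedged sketch using textbook formulas, not equations taken from the paper itself:

```latex
% Linearised retrieval \hat{x} about the prior x_a, with Jacobian K,
% gain matrix G, averaging kernel A, and measurement noise \epsilon:
\hat{x} \approx x_a + A\,(x - x_a) + G\,\epsilon, \qquad
A = G K, \qquad
G = \left(K^{\mathsf T} S_\epsilon^{-1} K + S_a^{-1}\right)^{-1} K^{\mathsf T} S_\epsilon^{-1}
% Requirement (i): the radiance observation operator H is approximately
% linear within a retrieval-error-sized neighbourhood of the retrieval:
H(x) \approx H(\hat{x}) + K\,(x - \hat{x})
\quad \text{for } \lVert x - \hat{x} \rVert = O(\text{retrieval error})
```

In this notation, requirement (ii) amounts to the prior covariance S_a not being tighter than the true variability of the state, so that the averaging kernel A does not suppress information the radiances actually contain.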
Abstract:
Developed in response to the new challenges of the social Web, this study investigates how involvement with brand-related user-generated content (UGC) affects consumers’ perceptions of brands. The authors develop a model that provides new insights into the links between drivers of UGC creation, involvement, and consumer-based brand equity. Expert opinions were sought on a hypothesized model, which was then tested using data from an online survey of 202 consumers. The results provide guidance for managerial initiatives involving UGC campaigns for brand building. The findings indicate that consumer perceptions of co-creation, community, and self-concept have a positive impact on UGC involvement that, in turn, positively affects consumer-based brand equity. These empirical results have significant implications for avoiding problems and building deeper relationships between consumers and brands in the age of social media.
Abstract:
Accurate observations of cloud microphysical properties are needed for evaluating and improving the representation of cloud processes in climate models and for better estimating the Earth's radiative budget. However, large differences are found in current cloud products retrieved from ground-based remote sensing measurements using various retrieval algorithms. Understanding the differences is an important step in addressing uncertainties in the cloud retrievals. In this study, an in-depth analysis of nine existing ground-based cloud retrievals using ARM remote sensing measurements is carried out. We place emphasis on boundary-layer overcast clouds and high-level ice clouds, which are the focus of many current retrieval development efforts due to their radiative importance and relatively simple structure. Large systematic discrepancies in cloud microphysical properties are found for these two types of clouds among the nine cloud retrieval products, particularly for the cloud liquid and ice particle effective radius. Notably, the differences among some retrieval products are even larger than the uncertainties prescribed by the retrieval algorithm developers. It is shown that most of these large differences have their roots in the theoretical bases of the retrievals, their assumptions, and their input and constraint parameters. This study suggests the need to further validate current retrieval theories and assumptions, and even to develop new retrieval algorithms, with more observations under different cloud regimes.
Abstract:
Background A whole-genome genotyping array has previously been developed for Malus using SNP data from 28 Malus genotypes. This array offers the prospect of high-throughput genotyping and linkage map development for any given Malus progeny. To test the applicability of the array for mapping in diverse Malus genotypes, we applied the array to the construction of an SNP-based linkage map of an apple rootstock progeny. Results Of the 7,867 Malus SNP markers on the array, 1,823 (23.2 %) were heterozygous in one of the two parents of the progeny, 1,007 (12.8 %) were heterozygous in both parental genotypes, whilst just 2.8 % of the 921 Pyrus SNPs were heterozygous. A linkage map spanning 1,282.2 cM was produced, comprising 2,272 SNP markers, 306 SSR markers and the S-locus. The length of the M432 linkage map was increased by 52.7 cM with the addition of the SNP markers, whilst marker density increased from 3.8 cM/marker to 0.5 cM/marker. Just three regions in excess of 10 cM remain where no markers were mapped. We compared the positions of the mapped SNP markers on the M432 map with their predicted positions on the ‘Golden Delicious’ genome sequence. A total of 311 markers (13.7 % of all mapped markers) mapped to positions that conflicted with their predicted positions on the ‘Golden Delicious’ pseudo-chromosomes, indicating the presence of paralogous genomic regions or misassignments of genome sequence contigs during the assembly and anchoring of the genome sequence. Conclusions We incorporated data for the 2,272 SNP markers onto the map of the M432 progeny and have presented the most complete and saturated map of the full 17 linkage groups of M. pumila to date. The data were generated rapidly in a high-throughput semi-automated pipeline, permitting significant savings in time and cost over linkage map construction using microsatellites.
The application of the array will permit linkage maps to be developed for QTL analyses in a cost-effective manner, and the identification of SNPs that have been assigned erroneous positions on the ‘Golden Delicious’ reference sequence will assist in the continued improvement of the genome sequence assembly for that variety.
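The quoted final marker density follows from the map totals above; a quick arithmetic check (assuming the S-locus counts as a single mapped marker, which is our reading of the abstract):

```python
# Totals reported for the M432 linkage map
map_length_cm = 1282.2       # total map length in centimorgans
n_markers = 2272 + 306 + 1   # SNP markers + SSR markers + the S-locus

density = map_length_cm / n_markers
print(round(density, 1))     # ~0.5 cM/marker, matching the reported density
```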
Abstract:
Background: Fruit and vegetable-rich diets are associated with a reduced cardiovascular disease (CVD) risk. This protective effect may be a result of the phytochemicals present within fruits and vegetables (F&V). However, there can be considerable variation in the phytochemical composition of whole F&V depending on growing location, cultivar, season, agricultural practices, etc. Therefore, the present study investigated the effects of consuming fruits and vegetables as puree-based drinks (FVPD) daily on vasodilation, phytochemical bioavailability, antioxidant status and other CVD risk factors. FVPD was chosen to provide a standardised source of F&V material that could be delivered from the same batch to all subjects during each treatment arm of the study. Methods: Thirty-nine subjects completed the randomised, controlled, cross-over dietary intervention. Subjects were randomised to consume 200 mL of FVPD (or fruit-flavoured control) daily for 6 weeks, with an 8-week washout period between treatments. Dietary intake was measured using two 5-day diet records during each cross-over arm of the study. Blood and urine samples were collected before and after each intervention, and vasodilation was assessed in 19 subjects using laser Doppler imaging with iontophoresis. Results: FVPD significantly increased dietary vitamin C and carotenoids (P < 0.001), and concomitantly increased plasma α- and β-carotene (P < 0.001), with a near-significant increase in endothelium-dependent vasodilation (P = 0.060). Conclusions: Overall, the findings obtained in the present study showed that FVPD were a useful vehicle to increase fruit and vegetable intake, significantly increasing dietary and plasma phytochemical concentrations with a trend towards increased endothelium-dependent vasodilation.
Abstract:
Here we explore the physico-chemical properties of a peptide amphiphile obtained by chemical conjugation of the collagen-stimulating peptide KTTKS with 10,12-pentacosadiynoic acid, which photopolymerizes as a stable and extended polydiacetylene. We investigate the self-assembly of this new polymer and rationalize its peculiar behavior in terms of a thermal conformational transition. Surprisingly, this polymer shows a thermal transition associated with a non-cooperative increase in β-sheet content at high temperature.
Abstract:
A detailed quantitative microstructural study coupled with cathodoluminescence and geochemical analyses on marbles from Naxos demonstrates that the analysis of microstructures is the most sensitive method to define the origin of marbles within, and between, different regions. Microstructure examination can only be used as an accurate provenance tool if a correction for the second-phase content is considered. If second phases are not considered, a large spread of different microstructures occurs within sample sites, making a separation between neighbouring outcrops difficult or impossible. Moreover, this study shows that the origin of a marble is defined more precisely if the microstructural observations are coupled with cathodoluminescence data.
Abstract:
The butanol-HCl spectrophotometric assay is widely used for quantifying extractable and insoluble condensed tannins (CT, syn. proanthocyanidins) in foods, feeds, and foliage of herbaceous and woody plants, but the method underestimates total CT content when applied directly to plant material. To improve CT quantitation, we tested various cosolvents with butanol-HCl and found that acetone increased anthocyanidin yields from two forage Lotus species having contrasting procyanidin and prodelphinidin compositions. A butanol-HCl-iron assay run with 50% (v/v) acetone gave linear responses with Lotus CT standards and increased estimates of total CT in Lotus herbage and leaves by up to 3.2-fold over the conventional method run without acetone. The use of thiolysis to determine the purity of CT standards further improved quantitation. Gel-state 13C and 1H–13C HSQC NMR spectra of insoluble residues collected after butanol-HCl assays revealed that acetone increased anthocyanidin yields by facilitating complete solubilization of CT from tissue.
Abstract:
Clinical pathways have been adopted for various diseases in clinical departments for quality improvement, as a result of the standardization of medical activities in the treatment process. Using knowledge-based decision support on the basis of clinical pathways is a promising strategy for improving medical quality effectively. However, clinical pathway knowledge has not been fully integrated into the treatment process and thus cannot provide comprehensive support to actual work practice. This paper therefore proposes a knowledge-based clinical pathway management method that makes use of clinical knowledge to support and optimize medical practice. We have developed a knowledge-based clinical pathway management system to demonstrate how clinical pathway knowledge can comprehensively support the treatment process. Experience from the use of this system shows that treatment quality can be effectively improved by the extracted and classified clinical pathway knowledge, the seamless integration of patient-specific clinical pathway recommendations with medical tasks, and the evaluation of pathway deviations for optimization.
Abstract:
In order to make best use of the opportunities provided by space missions such as the Radiation Belt Storm Probes, we determine the response of complementary subionospheric radiowave propagation measurements (VLF), riometer absorption measurements (CNA), and GPS-produced total electron content (vTEC) to different energetic electron precipitation (EEP). We model the relative sensitivity and responses of these instruments to idealised monoenergetic beams of precipitating electrons, and to more realistic EEP spectra chosen to represent radiation belt and substorm precipitation. In the monoenergetic beam case, we find that riometers are more sensitive to the same EEP event occurring during the day than during the night, while subionospheric VLF shows the opposite relationship, and the vTEC response is independent of the time of day. In general, the subionospheric VLF measurements are much more sensitive than the other two techniques for EEP over 200 keV, responding to flux magnitudes two to three orders of magnitude smaller than those detectable by a riometer. Detectable TEC changes only occur for extreme monoenergetic fluxes. For the radiation belt EEP case, clearly detectable subionospheric VLF responses are produced by daytime fluxes that are ~10 times lower than those required for riometers, while nighttime fluxes can be 10,000 times lower. Riometers are likely to respond only to radiation belt fluxes during the largest EEP events, and vTEC is unlikely to be significantly disturbed by radiation belt EEP. For the substorm EEP case, both the riometer absorption and the subionospheric VLF technique respond significantly, as does the change in vTEC, which is likely to be detectable at ~3-4 TECu.
Abstract:
Introduction. Feature usage is a prerequisite to realising the benefits of investments in feature-rich systems. We propose that conceptualising the dependent variable 'system use' as 'level of use' and specifying it as a formative construct has greater value for measuring the post-adoption use of feature-rich systems. We then validate the content of the construct as a first step in developing a research instrument to measure it. The context of our study is the post-adoption use of electronic medical records (EMR) by primary care physicians. Method. Initially, a literature review of the empirical context defines the scope based on prior studies. Core features identified from the literature are then further refined with the help of experts in a consensus-seeking process that follows the Delphi technique. Results. The methodology was successfully applied to EMRs, which were selected as an example of feature-rich systems. A review of EMR usage and regulatory standards provided the feature input for the first round of the Delphi process. A panel of experts then reached consensus after four rounds, identifying ten task-based features that would be indicators of level of use. Conclusions. To study why some users deploy more advanced features than others, theories of post-adoption require a rich formative dependent variable that measures level of use. We have demonstrated that a context-sensitive literature review followed by refinement through a consensus-seeking process is a suitable methodology to validate the content of this dependent variable. This is the first step of instrument development, prior to statistical confirmation with a larger sample.
Abstract:
Prism is a modular classification rule generation method based on the ‘separate and conquer’ approach, an alternative to the rule induction approach using decision trees, also known as ‘divide and conquer’. Prism often achieves a similar level of classification accuracy to decision trees, but tends to produce a more compact, noise-tolerant set of classification rules. As with other classification rule generation methods, a principal problem arising with Prism is that of overfitting due to over-specialised rules. In addition, over-specialised rules increase the associated computational complexity. These problems can be solved by pruning methods. For the Prism method, two pruning algorithms have recently been introduced for reducing overfitting of classification rules: J-pruning and Jmax-pruning. Both algorithms are based on the J-measure, an information-theoretic means of quantifying the theoretical information content of a rule. Jmax-pruning attempts to exploit the J-measure to its full potential, because J-pruning does not actually achieve this and may even lead to underfitting. A series of experiments has shown that Jmax-pruning may outperform J-pruning in reducing overfitting. However, Jmax-pruning is computationally relatively expensive and may also lead to underfitting. This paper reviews the Prism method and the two existing pruning algorithms above. It also proposes a novel pruning algorithm called Jmid-pruning. The latter is based on the J-measure and reduces overfitting to a similar level as the other two algorithms, but is better at avoiding underfitting and unnecessary computational effort. The authors conduct an experimental study of the performance of the Jmid-pruning algorithm in terms of classification accuracy and computational efficiency. The algorithm is also evaluated comparatively with the J-pruning and Jmax-pruning algorithms.
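For readers unfamiliar with the J-measure, here is a minimal sketch of the quantity all three pruning algorithms rely on, in Smyth and Goodman's formulation for a rule 'IF y THEN x' (the function name and example probabilities are ours, not from the paper):

```python
import math

def j_measure(p_y, p_x, p_x_given_y):
    """J-measure of a rule 'IF y THEN x'.

    p_y         -- probability that the rule's antecedent fires
    p_x         -- prior probability of the predicted class
    p_x_given_y -- probability of the class given the antecedent (rule accuracy)
    """
    def term(p, q):
        # p * log2(p / q), taken as 0 when p == 0
        return 0.0 if p == 0 else p * math.log2(p / q)
    # cross-entropy between the posterior and prior class distributions
    j_inner = term(p_x_given_y, p_x) + term(1 - p_x_given_y, 1 - p_x)
    return p_y * j_inner

# A rule that fires half the time and is always right about a 50/50 class
# carries 0.5 bits of information:
print(j_measure(0.5, 0.5, 1.0))   # 0.5
```

J-pruning and Jmax-pruning differ in how they use this quantity while a rule is being specialised: J-pruning stops adding terms when the J-measure drops, whereas Jmax-pruning also considers an upper bound on the J-measure achievable by further specialisation.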
Abstract:
We present five new cloud detection algorithms over land based on dynamic-threshold or Bayesian techniques, applicable to the Advanced Along-Track Scanning Radiometer (AATSR) instrument, and compare these with the standard threshold-based SADIST cloud detection scheme. We use a manually classified dataset as a reference to assess algorithm performance and quantify the impact of each cloud detection scheme on land surface temperature (LST) retrieval. The use of probabilistic Bayesian cloud detection methods improves algorithm true skill scores by 8-9 % over SADIST (maximum score of 77.93 % compared to 69.27 %). We present an assessment of the impact of imperfect cloud masking, relative to the reference cloud mask, on the retrieved AATSR LST, imposing a 2 K tolerance over a 3x3 pixel domain. We find an increase of 5-7 % in the observations falling within this tolerance when using Bayesian methods (maximum of 92.02 % compared to 85.69 %). We also demonstrate that the use of dynamic thresholds in the tests employed by SADIST can significantly improve performance, which is applicable to the cloud-test data to be provided by the Sea and Land Surface Temperature Radiometer (SLSTR) due to be launched on the Sentinel-3 mission (estimated 2014).
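The true skill score used to rank the detection schemes is a standard contingency-table metric (also known as the Hanssen-Kuipers discriminant); the sketch below uses hypothetical pixel counts, not the paper's own data:

```python
def true_skill_score(hits, misses, false_alarms, correct_negatives):
    """TSS = probability of detection - probability of false detection,
    computed from a 2x2 contingency table against a reference cloud mask."""
    pod = hits / (hits + misses)                              # cloudy pixels correctly flagged
    pofd = false_alarms / (false_alarms + correct_negatives)  # clear pixels wrongly flagged
    return pod - pofd

# Hypothetical example: detecting 90 % of cloudy pixels while falsely
# flagging 20 % of clear pixels gives a TSS of about 0.7; a perfect
# mask scores 1 and a no-skill (random) mask scores 0.
print(true_skill_score(90, 10, 20, 80))
```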