41 results for Inspection
Abstract:
From the concentrations of dissolved atmospheric noble gases in water, a so-called “noble gas temperature” (NGT) can be determined that corresponds to the temperature of the water when it was last in contact with the atmosphere. Here we demonstrate that the NGT concept is applicable to water inclusions in cave stalagmites, and yields NGTs that are in good agreement with the ambient air temperatures in the caves. We analysed samples from two Holocene and one undated stalagmite. The three stalagmites originate from three caves located in different climatic regions having modern mean annual air temperatures of 27 °C, 12 °C and 8 °C, respectively. In about half of the samples analysed, the Kr and Xe concentrations originated entirely from the two well-defined noble gas components, air-saturated water and atmospheric air, which allowed NGTs to be determined successfully from the Kr and Xe concentrations. One stalagmite seems to be particularly suitable for NGT determination, as almost all of its samples yielded the modern cave temperature. Notably, this stalagmite contains a high proportion of primary water inclusions, which seem to preserve the temperature-dependent signature well in their Kr and Xe concentrations. In future work on stalagmites, detailed microscopic inspection of the fluid inclusions prior to noble gas analysis is therefore likely to be crucial for increasing the number of successful NGT determinations.
Abstract:
Infrared polarization and intensity imagery provide complementary and discriminative information in image understanding and interpretation. In this paper, a novel fusion method is proposed that effectively merges the information with various combination rules. It makes use of both the low-frequency and high-frequency image components from the support value transform (SVT), and applies fuzzy logic in the combination process. The images to be fused (both infrared polarization and intensity images) are first decomposed into low-frequency component images and support value image sequences by the SVT. Then the low-frequency component images are combined using a fuzzy combination rule blending three sub-combination methods: (1) region feature maximum, (2) region feature weighted average, and (3) pixel value maximum; the support value image sequences are merged using a fuzzy combination rule fusing two sub-combination methods: (1) pixel energy maximum and (2) region feature weighting. With two newly defined features as variables, i.e. the low-frequency difference feature for the low-frequency component images and the support-value difference feature for the support value image sequences, trapezoidal membership functions are proposed to tune the fuzzy fusion process. Finally, the fused image is obtained by inverse SVT operations. Experimental results of both visual inspection and quantitative evaluation indicate the superiority of the proposed method over its counterparts in the fusion of infrared polarization and intensity images.
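The trapezoidal membership functions used to tune the fuzzy fusion follow a standard fuzzy-logic form; a minimal sketch, with breakpoint parameters (a, b, c, d) that are illustrative rather than taken from the paper:

```python
def trapezoidal_membership(x, a, b, c, d):
    """Standard trapezoidal membership function (a < b <= c < d):
    rises linearly from 0 at a to 1 at b, stays at 1 up to c,
    then falls linearly back to 0 at d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)  # rising edge
    return (d - x) / (d - c)      # falling edge
```

In a fuzzy combination rule, such a membership value (computed, e.g., from a difference feature between the two source images) would weight the blend between the sub-combination methods.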
Abstract:
Research has highlighted the usefulness of the Gilt–Equity Yield Ratio (GEYR) as a predictor of UK stock returns. This paper extends recent studies by endogenising the threshold at which the GEYR switches from being low to being high, or vice versa, thus avoiding the arbitrary threshold determination employed in the extant literature. It is observed that a decision rule for investing in equities or bonds, based on the forecasts from a regime switching model, yields higher average returns with lower variability than a static portfolio containing any combination of equities and bonds. A closer inspection of the results reveals that the model has power to forecast when investors should steer clear of equities, although the trading profits generated are insufficient to outweigh the associated transaction costs.
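The switching decision rule described above can be sketched as a simple threshold rule; note that in the paper the thresholds are estimated endogenously from a regime switching model, so the fixed low/high values here are placeholders, not the paper's estimates:

```python
def geyr_decision(geyr, low, high, current):
    """Threshold switching rule: when the GEYR is above the upper
    threshold equities look expensive relative to gilts, so hold bonds;
    below the lower threshold, hold equities; in between, keep the
    current position to avoid unnecessary transaction costs."""
    if geyr > high:
        return "bonds"
    if geyr < low:
        return "equities"
    return current
```

A backtest would apply this rule period by period and compare the resulting portfolio returns, net of transaction costs, against static equity/bond mixes.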
Abstract:
Whole-genome sequencing (WGS) could potentially provide a single platform for extracting all the information required to predict an organism’s phenotype. However, its ability to provide accurate predictions has not yet been demonstrated in large independent studies of specific organisms. In this study, we aimed to develop a genotypic prediction method for antimicrobial susceptibilities. The whole genomes of 501 unrelated Staphylococcus aureus isolates were sequenced, and the assembled genomes were interrogated using BLASTn for a panel of known resistance determinants (chromosomal mutations and genes carried on plasmids). Results were compared with phenotypic susceptibility testing for 12 commonly used antimicrobial agents (penicillin, methicillin, erythromycin, clindamycin, tetracycline, ciprofloxacin, vancomycin, trimethoprim, gentamicin, fusidic acid, rifampin, and mupirocin) performed by the routine clinical laboratory. We investigated discrepancies by repeat susceptibility testing and manual inspection of the sequences, and used this information to optimize the resistance determinant panel and BLASTn algorithm. We then tested the performance of the optimized tool in an independent validation set of 491 unrelated isolates, with phenotypic results obtained in duplicate by automated broth dilution (BD Phoenix) and disc diffusion. In the validation set, the overall sensitivity and specificity of the genomic prediction method were 0.97 (95% confidence interval [95% CI], 0.95 to 0.98) and 0.99 (95% CI, 0.99 to 1), respectively, compared to standard susceptibility testing methods. The very major error rate was 0.5%, and the major error rate was 0.7%. WGS was as sensitive and specific as routine antimicrobial susceptibility testing methods. WGS is a promising alternative to culture methods for resistance prediction in S. aureus and ultimately other major bacterial pathogens.
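The reported sensitivity, specificity and very major/major error rates follow standard antimicrobial susceptibility testing conventions (a very major error calls a resistant isolate susceptible; a major error calls a susceptible isolate resistant). A minimal sketch, assuming the error rates are taken over all prediction/phenotype pairs (denominator conventions vary between guidelines):

```python
def ast_performance(results):
    """results: list of (predicted_resistant, phenotype_resistant) booleans.
    Returns sensitivity and specificity for detecting resistance, plus
    very major (false-susceptible) and major (false-resistant) error rates."""
    tp = sum(p and t for p, t in results)          # correctly called resistant
    tn = sum(not p and not t for p, t in results)  # correctly called susceptible
    fp = sum(p and not t for p, t in results)      # major errors
    fn = sum(not p and t for p, t in results)      # very major errors
    n = len(results)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "very_major_error_rate": fn / n,
        "major_error_rate": fp / n,
    }
```

Feeding in one (prediction, phenotype) pair per isolate-drug combination reproduces the kind of summary statistics quoted in the abstract.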
Abstract:
We study the behavior and emotional arousal of the participants in an experimental auction that gives rise to an asymmetric social dilemma involving an auctioneer and two bidders. An antisocial transfer (bribe) which benefits the auctioneer (official) is paid, if promised, by the winner of the auction. Some pro-social behavior on both the auctioneers' and the bidders' sides is observed even in the absence of any punishment mechanism (Baseline, Treatment 0). However, pro-social behavior is adopted by the vast majority of subjects when the loser of the auction can inspect the transaction between the winner and the auctioneer (Inspection, Treatment 1). The inspection and punishment mechanism is such that, if a bribe is (not) revealed, both corrupt agents (the denouncing bidder) lose(s) the period's payoffs. This renders the inspection option unprofitable for the loser, and it is rarely used, especially towards the end of the session, when pro-social behavior becomes pervasive. Subjects' emotional arousal was measured through skin conductance responses. Generally speaking, our findings suggest that stronger emotions are associated with decisions deviating from pure monetary reward maximization, rather than with (un)ethical behavior per se. In fact, using response times as a measure of the subject's reflection during the decision-making process, we can associate emotional arousal with the conflict between primary or instinctive and secondary or contemplative motivations and, more specifically, with deviations from the subject's pure monetary interest.
Abstract:
Within the ESA Climate Change Initiative (CCI) project Aerosol_cci (2010–2013), algorithms for the production of long-term total column aerosol optical depth (AOD) datasets from European Earth Observation sensors are developed. Starting with eight existing precursor algorithms, three analysis steps are conducted to improve and qualify the algorithms: (1) a series of experiments applied to one month of global data to understand several major sensitivities to assumptions needed due to the ill-posed nature of the underlying inversion problem, (2) a round robin exercise of the "best" version of each algorithm (defined using the step 1 outcome) applied to four months of global data to identify mature algorithms, and (3) a comprehensive validation exercise applied to one complete year of global data produced by the algorithms selected as mature based on the round robin exercise. The algorithms tested included four using AATSR, three using MERIS and one using PARASOL. This paper summarizes the first step. Three experiments were conducted to assess the potential impact of major assumptions in the various aerosol retrieval algorithms. In the first experiment a common set of four aerosol components was used to provide all algorithms with the same assumptions. The second experiment introduced an aerosol property climatology, derived from a combination of model and sun photometer observations, as a priori information in the retrievals on the occurrence of the common aerosol components. The third experiment assessed the impact of using a common nadir cloud mask for the AATSR and MERIS algorithms, in order to characterize the sensitivity to remaining cloud contamination in the retrievals against the baseline dataset versions.
The impact of the algorithm changes was assessed for one month (September 2008) of data: qualitatively by inspection of monthly mean AOD maps, and quantitatively by comparing daily gridded satellite data against daily averaged AERONET sun photometer observations for the different versions of each algorithm, globally (land and coastal) and for three regions with different aerosol regimes. The analysis allowed for an assessment of the sensitivities of all algorithms, which helped define the best algorithm versions for the subsequent round robin exercise; all algorithms (except for MERIS) showed some, partly significant, improvement. In particular, using common aerosol components, and partly also the a priori aerosol-type climatology, is beneficial. On the other hand, the use of an AATSR-based common cloud mask brought a clear improvement (though with a significant reduction of coverage) for the MERIS standard product, but not for the algorithms using AATSR. It is noted that these observations are mostly consistent across all five analyses (global land, global coastal, three regional), which is readily understood, since the set of aerosol components defined in Sect. 3.1 was explicitly designed to cover different global aerosol regimes (with low and high absorption fine mode, sea salt and dust).
Abstract:
In this pilot study, water was extracted from samples of two Holocene stalagmites from Socotra Island, Yemen, and one Eemian stalagmite from southern continental Yemen. The amount of water extracted per unit mass of stalagmite rock, termed "water yield" hereafter, serves as a measure of its total water content. Based on direct correlation plots of water yields and δ18Ocalcite and on regime shift analyses, we demonstrate that for the studied stalagmites the water yield records vary systematically with the corresponding oxygen isotopic compositions of the calcite (δ18Ocalcite). Within each stalagmite, lower δ18Ocalcite values are accompanied by lower water yields and vice versa. The δ18Ocalcite records of the studied stalagmites have previously been interpreted to predominantly reflect the amount of rainfall in the area; thus, water yields can be linked to drip water supply. A higher, and therefore more continuous, drip water supply caused by higher rainfall rates supports homogeneous deposition of calcite with low porosity and therefore a small fraction of water-filled inclusions, resulting in low water yields of the respective samples. A reduction of drip water supply fosters irregular growth of calcite with higher porosity, leading to an increase of the fraction of water-filled inclusions and thus higher water yields. The results are consistent with the literature on stalagmite growth and supported by optical inspection of thin sections of our samples. We propose that for a stalagmite from a dry tropical or subtropical area, its water yield record represents a novel paleo-climate proxy recording changes in drip water supply, which can in turn be interpreted in terms of associated rainfall rates.
Abstract:
Contamination of the electroencephalogram (EEG) by artifacts greatly reduces the quality of the recorded signals, and there is a need for automated artifact removal methods. However, such methods are rarely evaluated against one another via rigorous criteria, with results often presented based upon visual inspection alone. This work presents a comparative study of automatic methods for removing blink, electrocardiographic, and electromyographic artifacts from the EEG. Three methods are considered: wavelet-, blind source separation (BSS)-, and multivariate singular spectrum analysis (MSSA)-based correction. These are applied to data sets containing mixtures of artifacts. Metrics are devised to measure the performance of each method. The BSS method is seen to be the best approach for artifacts of high signal-to-noise ratio (SNR). By contrast, MSSA performs well at low SNRs, but at the expense of a large number of false positive corrections.
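A common ingredient of such quantitative performance metrics is the signal-to-noise ratio of the corrected signal with respect to a clean reference; a generic sketch (not the paper's own metrics, which are not reproduced here):

```python
import math

def snr_db(reference, corrected):
    """SNR of an artifact-corrected signal against a clean reference,
    in decibels: 10*log10(reference power / residual error power).
    Assumes equal-length sequences and a nonzero residual."""
    n = len(reference)
    p_sig = sum(s * s for s in reference) / n
    p_err = sum((s - c) ** 2 for s, c in zip(reference, corrected)) / n
    return 10 * math.log10(p_sig / p_err)
```

With simulated data (clean EEG plus injected blink/ECG/EMG artifacts), such a metric lets correction methods be ranked objectively rather than by visual inspection.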
Abstract:
Empirical Mode Decomposition (EMD) is a data-driven technique for extracting oscillatory components from data. Although it was introduced over 15 years ago, its mathematical foundations are still missing, which also implies a lack of objective metrics for evaluating the decomposed set. The most common technique for assessing EMD results is visual inspection, which is highly subjective. This article provides objective measures for assessing EMD results based on the original definition of oscillatory components.
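One widely used objective check on an EMD result, distinct from the measures the article itself proposes, is the index of orthogonality of the extracted components; a minimal pure-Python sketch:

```python
def orthogonality_index(components):
    """Index of orthogonality for a set of decomposed components
    (each a list of samples of equal length): the sum of pairwise
    inner products, normalised by the total energy of the
    reconstructed signal. Values near zero indicate the components
    are close to mutually orthogonal."""
    signal = [sum(vals) for vals in zip(*components)]  # reconstruction
    energy = sum(x * x for x in signal)
    cross = 0.0
    for i in range(len(components)):
        for j in range(i + 1, len(components)):
            cross += sum(a * b for a, b in zip(components[i], components[j]))
    return abs(cross) / energy
```

Applied to the intrinsic mode functions returned by an EMD implementation, this gives a single scalar to compare decompositions instead of judging them by eye.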
Abstract:
National food control systems are vital tools in governing the safety and quality of food intended for human consumption. This study of the Omani system was conducted to evaluate the effectiveness of the current food controls in place for protecting, in particular, public health from emerging biological and chemical hazards. To this end, a survey was undertaken within the different food safety authorities in Oman to examine the different elements of the national food control system in terms of its existing food control management, food legislation, food inspection, food analysis laboratories, and information, education and communications. Officials from the different authorities were interviewed and the results were captured in prepared questionnaires. The challenges, strengths and weaknesses of the existing system are highlighted. The findings of the study indicate that significant progress is being made, and that the creation by the government of a national Centre for Food Safety and Quality is a significant positive step.
Abstract:
1. Bee populations and other pollinators face multiple, synergistically acting threats, which have led to population declines, loss of local species richness and pollination services, and extinctions. However, our understanding of the degree, distribution and causes of declines is patchy, in part due to inadequate monitoring systems, with the challenge of taxonomic identification posing a major logistical barrier. Pollinator conservation would benefit from a high-throughput identification pipeline. 2. We show that the metagenomic mining and resequencing of mitochondrial genomes (mitogenomics) can be applied successfully to bulk samples of wild bees. We assembled the mitogenomes of 48 UK bee species and then shotgun-sequenced total DNA extracted from 204 whole bees that had been collected in 10 pan-trap samples from farms in England and identified morphologically to 33 species. Each sample data set was mapped against the 48 reference mitogenomes. 3. The morphological and mitogenomic data sets were highly congruent. Out of 63 total species detections in the morphological data set, the mitogenomic data set made 59 correct detections (a 93.7% detection rate) and detected six more species (putative false positives). Direct inspection and an analysis with species-specific primers suggested that these putative false positives were most likely due to incorrect morphological IDs. Read frequency significantly predicted species biomass frequency (R² = 24.9%). Species lists, biomass frequencies, extrapolated species richness and community structure were recovered with less error than in a metabarcoding pipeline. 4. Mitogenomics automates the onerous task of taxonomic identification, even for cryptic species, allowing the tracking of changes in species richness and distributions.
A mitogenomic pipeline should thus be able to contain costs, maintain consistently high-quality data over long time series, incorporate retrospective taxonomic revisions and provide an auditable evidence trail. Mitogenomic data sets also provide estimates of species counts within samples and thus have potential for tracking population trajectories.