883 results for basis sets
Abstract:
The Advanced Along-Track Scanning Radiometer (AATSR) was launched on Envisat in March 2002. The AATSR instrument is designed to retrieve precise and accurate global sea surface temperature (SST) which, combined with the large data set collected from its predecessors, ATSR and ATSR-2, will provide a long-term SST record spanning more than 15 years. This record can be used for independent monitoring and detection of climate change. The AATSR validation programme has successfully completed its initial phase. The programme involves validation of the AATSR-derived SST values using in situ radiometers, in situ buoys and global SST fields from other data sets. The results of the initial programme presented here demonstrate that the AATSR instrument is currently close to meeting its scientific objective of determining global SST to an accuracy of 0.3 K (one sigma). For night-time data, the analysis gives a warm bias of between +0.04 K (0.28 K) for buoys and +0.06 K (0.20 K) for radiometers, with slightly higher errors observed for daytime data, showing warm biases of between +0.02 K (0.39 K) for buoys and +0.11 K (0.33 K) for radiometers. These results show that the ATSR series of instruments continues to be the world leader in delivering accurate space-based observations of SST, which is a key climate parameter.
Abstract:
High-density oligonucleotide (oligo) arrays are a powerful tool for transcript profiling. Arrays based on GeneChip® technology are amongst the most widely used, although GeneChip® arrays are currently available for only a small number of plant and animal species. We have therefore developed a method to improve the sensitivity of high-density oligonucleotide arrays when applied to heterologous species, and tested the method by analysing the transcriptome of Brassica oleracea L., a species for which no GeneChip® array is available, using a GeneChip® array designed for Arabidopsis thaliana (L.) Heynh. Genomic DNA from B. oleracea was labelled and hybridised to the ATH1-121501 GeneChip® array. Arabidopsis thaliana probe-pairs that hybridised to the B. oleracea genomic DNA, on the basis of the perfect-match (PM) probe signal, were then selected for subsequent B. oleracea transcriptome analysis using a .cel file parser script to generate probe mask files. The transcriptional response of B. oleracea to a mineral nutrient (phosphorus; P) stress was quantified using probe mask files generated for a wide range of gDNA hybridisation intensity thresholds. An example probe mask file generated with a gDNA hybridisation intensity threshold of 400 removed >68% of the available PM probes from the analysis but retained >96% of the available A. thaliana probe-sets. Ninety-nine genes were identified as significantly regulated under P stress in B. oleracea, including homologues of P stress responsive genes in A. thaliana. Increasing the gDNA hybridisation intensity threshold for probe selection up to 500 increased the sensitivity of the GeneChip® array to detect regulation of gene expression in B. oleracea under P stress by up to 13-fold. Our open-source software for creating probe mask files is freely available at http://affymetrix.arabidopsis.info/xspecies/ and may be used to facilitate transcriptomic analyses of a wide range of plant and animal species in the absence of custom arrays.
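The thresholding step described above can be sketched roughly as follows; this is an illustrative reading of the probe-selection idea, not the authors' .cel file parser, and the probe identifiers, data layout and threshold handling are assumptions:

```python
# Illustrative sketch (not the authors' parser): keep perfect-match (PM) probes whose
# genomic-DNA hybridisation signal exceeds a chosen threshold, grouped by probe-set.
# `pm_signals` is assumed to map (probe_set_id, probe_index) -> gDNA hybridisation intensity.

def build_probe_mask(pm_signals, threshold=400):
    """Return {probe_set_id: [probe_index, ...]} of PM probes passing the threshold."""
    mask = {}
    for (probe_set, probe_idx), intensity in pm_signals.items():
        if intensity >= threshold:
            mask.setdefault(probe_set, []).append(probe_idx)
    return mask

# Toy example with two hypothetical probe-sets
pm_signals = {
    ("At1g01010_at", 1): 620, ("At1g01010_at", 2): 150,
    ("At1g01010_at", 3): 480, ("At2g05070_at", 1): 90,
    ("At2g05070_at", 2): 310,
}
mask = build_probe_mask(pm_signals, threshold=400)
print(mask)  # {'At1g01010_at': [1, 3]} -- At2g05070_at drops out at this threshold
```

Raising the threshold removes more PM probes per probe-set, which is the trade-off between probe coverage and cross-species signal quality discussed above.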
Abstract:
Contrary to the widespread belief that people are positively motivated by reward incentives, some studies have shown that performance-based extrinsic reward can actually undermine a person's intrinsic motivation to engage in a task. This “undermining effect” has timely practical implications, given the burgeoning of performance-based incentive systems in contemporary society. It also presents a theoretical challenge for economic and reinforcement learning theories, which tend to assume that monetary incentives monotonically increase motivation. Despite the practical and theoretical importance of this provocative phenomenon, however, little is known about its neural basis. Herein we induced the behavioral undermining effect using a newly developed task, and we tracked its neural correlates using functional MRI. Our results show that performance-based monetary reward indeed undermines intrinsic motivation, as assessed by the number of voluntary engagements in the task. We found that activity in the anterior striatum and the prefrontal areas decreased along with this behavioral undermining effect. These findings suggest that the corticobasal ganglia valuation system underlies the undermining effect through the integration of extrinsic reward value and intrinsic task value.
Abstract:
The ripening processes of 24 apple cultivars were examined in the United Kingdom National Fruit Collection in 2010. Starch content was the primary measure, with ground colour, water-soluble solids content and flesh firmness also recorded during ripening. The degradation of starch was evaluated on a 0–10 scale. A starch degradation of 50% was taken to indicate the optimum harvest date, with harvest beginning at 40% and finishing at 60%. Depending on the cultivar, this represented a harvest window of 9 to 21 days. Later-ripening cultivars matured more slowly, leading to a longer harvesting period, with the exception of cv. Feuillemorte. Pronounced differences were observed among the cultivars in their starch degradation patterns, allowing them to be divided into four groups. Separate charts were developed for each group and are recommended for use in practice.
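The harvest-window rule above (40% start, 50% optimum, 60% end) can be illustrated with a small calculation; the sampling dates and degradation values below are invented for illustration and are not taken from the study:

```python
# Illustrative sketch: estimate the harvest window for a cultivar by linearly interpolating
# the dates at which starch degradation reaches 40%, 50% and 60%. Dates are day-of-year
# numbers; the degradation series is assumed to increase monotonically.
import numpy as np

days = np.array([250, 257, 264, 271, 278])               # hypothetical sampling dates
degradation = np.array([10.0, 25.0, 45.0, 65.0, 85.0])   # starch degradation, %

start, optimum, end = np.interp([40.0, 50.0, 60.0], degradation, days)
print(f"harvest window: day {start:.0f} to day {end:.0f}, optimum around day {optimum:.0f}")
print(f"window length: {end - start:.0f} days")
```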
Abstract:
This article presents the description of stencilling by Gilles Filleau des Billettes. The description sets out a method for stencilling letters, words, and texts, and specifies equipment for doing the work; it forms the basis for a reconstruction of the equipment and method, which is presented in a parallel article in this volume of Typography papers (see E. Kindel, 'A reconstruction of stencilling based on the description by Gilles Filleau des Billettes', Typography papers, 9, pp. 28–65). The original French text, approximately 10,000 words in length, is here transcribed and accompanied by a parallel English translation. Introductory notes on the preparation of both texts are provided; images of stencil letters found among the papers of Sébastien Truchet, Des Billettes’s colleague, are shown in an appendix.
Abstract:
This article is the guest editors' introduction to a special issue on using Social Network Research in the field of Human Resource Management. The goals of the special issue are: (1) to draw attention to the points of integration between the two fields, (2) to showcase research that applies social network perspectives and methodology to issues relevant to HRM and (3) to identify common challenges where future collaborative efforts could contribute to advancements in both fields.
Abstract:
This contribution proposes a novel probability density function (PDF) estimation based over-sampling (PDFOS) approach for two-class imbalanced classification problems. The classical Parzen-window kernel function is adopted to estimate the PDF of the positive class. Synthetic instances are then generated from the estimated PDF as additional training data. The essential concept is to re-balance the class distribution of the original imbalanced data set under the principle that the synthetic samples follow the same statistical properties as the positive class. Based on the over-sampled training data, a radial basis function (RBF) classifier is constructed by applying an orthogonal forward selection procedure, in which the classifier's structure and the parameters of the RBF kernels are determined using a particle swarm optimisation algorithm based on the criterion of minimising the leave-one-out misclassification rate. The effectiveness of the proposed PDFOS approach is demonstrated by an empirical study on several imbalanced data sets.
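A minimal sketch of the over-sampling stage is given below, using a Gaussian Parzen-window density; the use of scikit-learn's KernelDensity and the fixed bandwidth are assumptions made for illustration, and the paper's RBF classifier with orthogonal forward selection and particle swarm optimisation is not reproduced:

```python
# Minimal sketch of PDF-estimation-based over-sampling: fit a Gaussian Parzen-window density
# to the positive (minority) samples, then draw synthetic instances from the fitted density
# until the classes are balanced.
import numpy as np
from sklearn.neighbors import KernelDensity

def pdf_oversample(X_pos, n_needed, bandwidth=0.5, random_state=0):
    """Draw n_needed synthetic minority-class samples from a Parzen-window estimate."""
    kde = KernelDensity(kernel="gaussian", bandwidth=bandwidth).fit(X_pos)
    return kde.sample(n_samples=n_needed, random_state=random_state)

rng = np.random.default_rng(0)
X_pos = rng.normal(loc=[2.0, 2.0], scale=0.3, size=(20, 2))   # minority class
X_neg = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(200, 2))  # majority class

X_syn = pdf_oversample(X_pos, n_needed=len(X_neg) - len(X_pos))
X_balanced = np.vstack([X_pos, X_syn, X_neg])
y_balanced = np.hstack([np.ones(len(X_pos) + len(X_syn)), np.zeros(len(X_neg))])
print(X_balanced.shape, y_balanced.shape)  # (400, 2) (400,)
```

In practice the kernel bandwidth strongly influences how faithfully the synthetic samples reproduce the statistical properties of the positive class, which is why its selection matters in the PDFOS approach.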
Abstract:
This work investigates a type of wireless power system whose analysis yields the construction of a prototype modeled as a single technological artifact. Exploration of the artifact forms the intellectual basis not only for its prototypical forms but also for variant forms not yet discovered. Through this process, the role of the artifact, its most suitable application given the constraints of the delivery problem, and strategies for optimizing it are greatly clarified. In order to improve maturity and contribute to a body of knowledge, this document proposes research on efficient inductive transfer in the mid-field region, for the purpose of removing wired connections and electrical contacts. While this description states the purpose of the work, it does not convey the compromises involved in redrawing the lines of demarcation between the near and far field in the traditional treatment of broadcasting. Two striking scenarios are addressed in this thesis: first, the mathematical explanation of wireless power follows from J. C. Maxwell's original equations; second, the behavior of wireless power in the circuit follows from Joseph Larmor's fundamental work on the dynamics of the field concept. A model of propagation is presented which matches observations in experiments. A modified model of the dipole is presented to address the phenomena observed in the theory and experiments. Two distinct sets of experiments test the concepts of single and two coupled modes. In the more esoteric context of the zero- and first-order magnetic field, a third coupled mode is suggested. Through the remaking of wireless power in this context, it is the intention of the author to show the reader that ideas once lost to history, bound to a path of complete obscurity, are again innovative and useful.
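As background for the coupled-mode experiments mentioned above, the standard coupled-mode description of two magnetically coupled resonators, widely used in analyses of mid-field inductive power transfer, can be sketched as follows; the symbols are generic and are not taken from the thesis itself:

\[
\dot{a}_1(t) = \left(i\omega_1 - \Gamma_1\right) a_1(t) + i\kappa\, a_2(t) + F_s(t), \qquad
\dot{a}_2(t) = \left(i\omega_2 - \Gamma_2 - \Gamma_L\right) a_2(t) + i\kappa\, a_1(t),
\]

where \(a_{1,2}\) are the mode amplitudes (with \(|a_i|^2\) the stored energy), \(\omega_{1,2}\) the resonant frequencies, \(\Gamma_{1,2}\) the intrinsic loss rates, \(\Gamma_L\) the load coupling on the receiving resonator, \(\kappa\) the coupling rate, and \(F_s(t)\) the source drive. Efficient transfer is associated with the strong-coupling regime \(\kappa \gg \sqrt{\Gamma_1 \Gamma_2}\).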
Abstract:
Within the SPARC Data Initiative, the first comprehensive assessment of the quality of 13 water vapor products from 11 limb-viewing satellite instruments (LIMS, SAGE II, UARS-MLS, HALOE, POAM III, SMR, SAGE III, MIPAS, SCIAMACHY, ACE-FTS, and Aura-MLS) obtained within the time period 1978-2010 has been performed. Each instrument's water vapor profile measurements were compiled into monthly zonal mean time series on a common latitude-pressure grid. These time series serve as the basis for the "climatological" validation approach used within the project. The evaluations include comparisons of monthly or annual zonal mean cross sections and seasonal cycles in the tropical and extratropical upper troposphere and lower stratosphere averaged over one or more years, comparisons of interannual variability, and a study of the time evolution of physical features in water vapor such as the tropical tape recorder and polar vortex dehydration. Our knowledge of the atmospheric mean state in water vapor is best in the lower and middle stratosphere of the tropics and midlatitudes, with a relative uncertainty of ±2-6% (as quantified by the standard deviation of the instruments' multiannual means). The uncertainty increases toward the polar regions (±10-15%), the mesosphere (±15%), and the upper troposphere/lower stratosphere below 100 hPa (±30-50%), where sampling issues add uncertainty due to large gradients and high natural variability in water vapor. The minimum found in the multiannual (1998-2008) mean water vapor in the tropical lower stratosphere is 3.5 ppmv (±14%), with slightly larger uncertainties for monthly mean values. The frequently used HALOE water vapor data set shows consistently lower values than most other data sets throughout the atmosphere, with increasing deviations from the multi-instrument mean below 100 hPa in both the tropics and extratropics. The knowledge gained from these comparisons, and regarding the quality of the individual data sets in different regions of the atmosphere, will help to improve model-measurement comparisons (e.g., for diagnostics such as the tropical tape recorder or seasonal cycles), data merging activities, and studies of climate variability.
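The uncertainty measure quoted above (the standard deviation of the instruments' multiannual means, expressed relative to the multi-instrument mean) can be illustrated with a short calculation; the array layout and the toy values are assumptions, not SPARC Data Initiative code or data:

```python
# Illustrative sketch: compute the multi-instrument mean water vapour climatology and its
# relative uncertainty (standard deviation across instruments divided by the mean) on a
# common latitude-pressure grid. Shape: (instrument, latitude, pressure).
import numpy as np

clim = np.array([                      # multiannual zonal means [ppmv], 3 instruments, 2x3 grid
    [[3.4, 3.6, 4.0], [3.5, 3.7, 4.1]],
    [[3.6, 3.8, 4.2], [3.6, 3.9, 4.3]],
    [[3.3, 3.5, 3.9], [3.4, 3.6, 4.0]],
])

multi_mean = clim.mean(axis=0)                                    # multi-instrument mean state
rel_uncertainty = 100.0 * clim.std(axis=0, ddof=1) / multi_mean   # spread in percent
print(np.round(rel_uncertainty, 1))
```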
Abstract:
Concern that European forest biodiversity is depleted and declining has provoked widespread efforts to improve management practices. To gauge the success of these actions, appropriate monitoring of forest ecosystems is paramount. Multi-species indicators are frequently used to assess the state of biodiversity and its response to implemented management, but generally applicable and objective methodologies for species selection are lacking. Here we use a niche-based approach, underpinned by coarse quantification of species' resource use, to objectively select species for inclusion in a pan-European forest bird indicator. We identify both the minimum number of species required to deliver full resource coverage and the most sensitive combination of species, and explore the trade-off between two key characteristics, sensitivity and redundancy, associated with indicators comprising different numbers of species. We compare our indicator to an existing forest bird indicator selected on the basis of expert opinion and show that it is more representative of the wider community. We also present alternative indicators for regional and forest-type-specific monitoring and show that the choice of species can have a significant impact on the indicator and on consequent projections about the state of the biodiversity it represents. Furthermore, by comparing indicator sets drawn from currently monitored species and the full forest bird community, we identify gaps in the coverage of the current monitoring scheme. We believe that adopting this niche-based framework for species selection supports the objective development of multi-species indicators and that it has good potential to be extended to a range of habitats and taxa.
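One simple way to find a small species set delivering full resource coverage is a greedy set-cover heuristic, sketched below; this is an illustration of the coverage idea only, not the authors' selection procedure, and the species names and resource categories are hypothetical:

```python
# Illustrative greedy set-cover heuristic: repeatedly add the species that covers the most
# still-uncovered resource categories until every resource is represented (or none can be).
def greedy_species_selection(resource_use, all_resources):
    chosen, uncovered = [], set(all_resources)
    while uncovered:
        best = max(resource_use, key=lambda sp: len(resource_use[sp] & uncovered))
        gained = resource_use[best] & uncovered
        if not gained:                  # remaining resources cannot be covered
            break
        chosen.append(best)
        uncovered -= gained
    return chosen, uncovered

resource_use = {
    "great spotted woodpecker": {"deadwood", "canopy"},
    "willow tit":               {"deadwood", "shrub layer"},
    "wood warbler":             {"canopy", "open understorey"},
    "hawfinch":                 {"canopy"},
}
species, missing = greedy_species_selection(
    resource_use, {"deadwood", "canopy", "shrub layer", "open understorey"})
print(species, missing)
```

Redundancy can then be assessed by counting how many chosen species share each resource, while sensitivity depends on the population-trend properties of the selected species.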
Abstract:
This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. Firstly, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem involving spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for the classification of the very large data sets generated by current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required in a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity to establish complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets.
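The basic extreme learning machine recipe, random fixed hidden weights followed by a closed-form least-squares solution for the output weights, is sketched below with complex inputs; the tanh-of-modulus activation and the toy "spectra" are simplifying assumptions and do not reproduce the paper's complex-valued formulation or its Gaussian kernels:

```python
# Minimal complex-valued ELM sketch: random complex hidden weights are fixed at
# initialisation; output weights are solved with the Moore-Penrose pseudoinverse.
import numpy as np

rng = np.random.default_rng(42)

def elm_fit(X, y, n_hidden=50):
    W = rng.standard_normal((X.shape[1], n_hidden)) + 1j * rng.standard_normal((X.shape[1], n_hidden))
    H = np.tanh(np.abs(X @ W))            # real-valued hidden activations from complex projection
    beta = np.linalg.pinv(H) @ y          # closed-form output weights
    return W, beta

def elm_predict(X, W, beta):
    return np.tanh(np.abs(X @ W)) @ beta

# toy complex feature vectors standing in for amplitude/phase THz signatures
n = 100
X0 = np.exp(1j * rng.uniform(0.0, 0.5, (n, 16)))
X1 = np.exp(1j * rng.uniform(1.0, 1.5, (n, 16)))
X = np.vstack([X0, X1]); y = np.hstack([np.zeros(n), np.ones(n)])
W, beta = elm_fit(X, y)
acc = np.mean((elm_predict(X, W, beta) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

The closed-form training step is what gives ELM-type classifiers their speed advantage over iteratively trained support vector machines on very large data sets.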
Abstract:
We investigate variants of a dominating set problem in social networks. While randomised algorithms for the minimum weighted dominating set problem and the minimum alpha and alpha-rate domination problems on simple graphs already exist in the literature, we propose here a randomised algorithm for the minimum weighted alpha-rate dominating set problem which is, to the best of our knowledge, the first such algorithm. A theoretical approximation bound based on a simple randomised rounding technique is given. The algorithm is implemented in Python and applied to a UK Twitter mentions network using a measure of individuals' influence (klout) as weights. We argue that the weights of vertices can be interpreted as the costs of getting those individuals on board for a campaign or a behaviour change intervention. The minimum weighted alpha-rate dominating set problem can therefore be seen as finding a set that minimises the total cost while ensuring that each individual in the network has at least an alpha proportion of its neighbours in the chosen set. We also test our algorithm on generated graphs with several thousand vertices and edges. Our results on this real-life Twitter network and on generated graphs show that the implementation is reasonably efficient and can therefore be used in real-life applications when creating social-network-based interventions, designing social media campaigns and potentially improving users' social media experience.
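To make the problem statement concrete, the sketch below shows a simple randomised construct-and-repair heuristic for the weighted alpha-rate constraint; it is an illustration only, not the paper's algorithm or its approximation analysis, and the graph, weights and parameters are invented:

```python
# Illustrative randomised heuristic for a weighted alpha-rate dominating set: include each
# vertex with a probability biased against high weights, then greedily repair any vertex
# whose fraction of chosen neighbours is still below alpha, preferring cheap neighbours.
import math
import random

def alpha_rate_dominating_set(adj, weights, alpha=0.5, p=0.4, seed=1):
    rng = random.Random(seed)
    max_w = max(weights.values())
    chosen = {v for v in adj if rng.random() < p * (1.0 - weights[v] / (2.0 * max_w))}
    for v in adj:                                   # repair phase
        needed = math.ceil(alpha * len(adj[v]))
        covered = [u for u in adj[v] if u in chosen]
        missing = sorted((u for u in adj[v] if u not in chosen), key=lambda u: weights[u])
        while len(covered) < needed and missing:
            u = missing.pop(0)
            chosen.add(u)
            covered.append(u)
    return chosen

adj = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2}, 4: {2}}   # toy mentions graph
weights = {1: 10.0, 2: 30.0, 3: 20.0, 4: 5.0}        # hypothetical influence-based costs
S = alpha_rate_dominating_set(adj, weights, alpha=0.5)
print(S, sum(weights[v] for v in S))
```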
Abstract:
We consider a generic basic semi-algebraic subset S of the space of generalized functions, that is, a set given by (not necessarily countably many) polynomial constraints. We derive necessary and sufficient conditions for an infinite sequence of generalized functions to be realizable on S, namely to be the moment sequence of a finite measure concentrated on S. Our approach combines the classical results about the moment problem on nuclear spaces with the techniques recently developed to treat the moment problem on basic semi-algebraic sets of R^d. In this way, we determine realizability conditions that can be more easily verified than the well-known Haviland-type conditions. Our result completely characterizes the support of the realizing measure in terms of its moments. As concrete examples of semi-algebraic sets of generalized functions, we consider the set of all Radon measures and the set of all measures having bounded Radon–Nikodym density w.r.t. the Lebesgue measure.
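For orientation, the classical Haviland-type condition on R^d that results of this kind aim to improve upon can be stated as follows (this is the standard finite-dimensional theorem, not the paper's result): a multisequence \((m_\alpha)_{\alpha \in \mathbb{N}_0^d}\) is the moment sequence of some measure supported on a closed set \(K \subseteq \mathbb{R}^d\) if and only if the associated Riesz functional

\[
L_m : \mathbb{R}[x_1,\dots,x_d] \to \mathbb{R}, \qquad L_m\Big(\sum_\alpha c_\alpha x^\alpha\Big) = \sum_\alpha c_\alpha m_\alpha,
\]

satisfies \(L_m(p) \ge 0\) for every polynomial \(p\) that is non-negative on \(K\). The difficulty is that non-negativity on \(K\) is hard to check directly, which is why conditions phrased through the polynomial constraints defining a semi-algebraic set are more practical to verify.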