84 results for Search-based technique
Abstract:
In order to make best use of the opportunities provided by space missions such as the Radiation Belt Storm Probes, we determine the response of complementary subionospheric radiowave propagation measurements (VLF), riometer absorption measurements (CNA), and GPS-produced total electron content (vTEC) to different types of energetic electron precipitation (EEP). We model the relative sensitivity and responses of these instruments to idealised monoenergetic beams of precipitating electrons, and to more realistic EEP spectra chosen to represent radiation belt and substorm precipitation. In the monoenergetic beam case, we find riometers are more sensitive to the same EEP event occurring during the day than during the night, while subionospheric VLF shows the opposite relationship, and the change in vTEC is independent of local time. In general, the subionospheric VLF measurements are much more sensitive than the other two techniques for EEP with energies over 200 keV, responding to flux magnitudes two to three orders of magnitude smaller than those detectable by a riometer. Detectable TEC changes only occur for extreme monoenergetic fluxes. For the radiation belt EEP case, clearly detectable subionospheric VLF responses are produced by daytime fluxes that are ~10 times lower than those required for riometers, while nighttime fluxes can be 10,000 times lower. Riometers are likely to respond only to radiation belt fluxes during the largest EEP events, and vTEC is unlikely to be significantly disturbed by radiation belt EEP. For the substorm EEP case, both the riometer absorption and the subionospheric VLF technique respond significantly, as does the change in vTEC, which is likely to be detectable at ~3-4 TECu.
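As a point of reference for the TEC figures quoted above, vertical TEC is the height integral of electron density, with 1 TECu defined as 10^16 electrons per square metre of column. The sketch below is not the authors' model; the Gaussian D/E-region enhancement is invented, and it simply illustrates why precipitation-driven density changes at these altitudes translate into only small vTEC changes.

```python
import numpy as np

def vtec_from_profile(altitude_km, ne_per_m3):
    """Trapezoidal height integral of Ne(h) [m^-3], returned in TECu
    (1 TECu = 1e16 electrons per m^2 of vertical column)."""
    h_m = np.asarray(altitude_km, float) * 1e3
    ne = np.asarray(ne_per_m3, float)
    column = np.sum(0.5 * (ne[1:] + ne[:-1]) * np.diff(h_m))
    return column / 1e16

# Hypothetical precipitation-driven density enhancement peaking near 100 km
h = np.linspace(60, 150, 91)                 # altitude grid, km
ne = 1e9 * np.exp(-((h - 100) / 15) ** 2)    # Gaussian Ne bump, m^-3
print(f"delta vTEC ~ {vtec_from_profile(h, ne):.4f} TECu")
```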
Abstract:
Using a discrete wavelet transform with a Meyer wavelet basis, we present a new quantitative algorithm for determining the onset time of Pi1 and Pi2 ULF waves in the nightside ionosphere with ∼20- to 40-s resolution at substorm expansion phase onset. We validate the algorithm by comparing both the ULF wave onset time and location to the optical onset determined by the Imager for Magnetopause-to-Aurora Global Exploration (IMAGE)–Far Ultraviolet Imager (FUV) instrument. In each of the six events analyzed, five substorm onsets and one pseudobreakup, the ULF onset is observed prior to the global optical onset observed by IMAGE at a station closely conjugate to the optical onset. The observed ULF onset times expand both latitudinally and longitudinally away from an epicenter of ULF wave power in the ionosphere. We further discuss the utility of the algorithm for diagnosing pseudobreakups and the relationship of the ULF onset epicenter to the meridians of elements of the substorm current wedge. The importance of the technique for establishing the causal sequence of events at substorm onset, especially in support of the multisatellite Time History of Events and Macroscale Interactions During Substorms (THEMIS) mission, is also described.
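A minimal sketch of how such an onset detector can be built with the named wavelet basis, using PyWavelets and assuming 1-s magnetometer sampling; the decomposition level, quiet-time baseline, and threshold are illustrative choices, not the authors' exact algorithm.

```python
import numpy as np
import pywt  # PyWavelets; 'dmey' is the discrete Meyer wavelet

def ulf_onset(b_field, dt=1.0, n_sigma=5.0, quiet_samples=600):
    """Return the first time (s) at which Pi2-band wavelet power exceeds
    a quiet-time baseline, or None if it never does."""
    coeffs = pywt.wavedec(b_field, 'dmey', level=6)
    # coeffs = [cA6, cD6, cD5, ...]; with dt = 1 s, cD6 and cD5 span
    # periods of roughly 64-128 s and 32-64 s, covering the Pi2 band
    band_only = [np.zeros_like(c) for c in coeffs]
    band_only[1], band_only[2] = coeffs[1], coeffs[2]
    band = pywt.waverec(band_only, 'dmey')[:len(b_field)]
    power = band ** 2
    base = power[:quiet_samples]                  # assumed quiet interval
    above = np.nonzero(power > base.mean() + n_sigma * base.std())[0]
    return above[0] * dt if above.size else None
```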
Abstract:
We present an efficient graph-based algorithm for quantifying the similarity of household-level energy use profiles, using a notion of similarity that allows for small time-shifts when comparing profiles. Experimental results on a real smart meter data set demonstrate that in cases of practical interest our technique is far faster than the existing method for computing the same similarity measure. Having a fast algorithm for measuring profile similarity improves the efficiency of tasks such as clustering of customers and cross-validation of forecasting methods using historical data. Furthermore, we apply a generalisation of our algorithm to produce substantially better household-level energy use forecasts from historical smart meter data.
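The paper's algorithm is graph-based and faster; as a crude stand-in that captures the same notion of shift-tolerant similarity, one can take the smallest L1 distance over all alignments within a few time slots. The +/- 2-slot window and the half-hourly example data are illustrative.

```python
import numpy as np

def shift_tolerant_distance(p, q, max_shift=2):
    """Smallest total absolute difference between two profiles over all
    alignments shifted by at most max_shift slots (crude stand-in for the
    paper's graph-based similarity)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    best = np.inf
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            d = np.abs(p[s:] - q[:len(q) - s]).sum()
        else:
            d = np.abs(p[:s] - q[-s:]).sum()
        best = min(best, d)
    return best

day1 = np.random.rand(48)  # half-hourly smart meter readings (illustrative)
day2 = np.roll(day1, 1)    # same profile shifted by one slot
print(shift_tolerant_distance(day1, day2))
```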
Abstract:
Radiometric data in the visible domain acquired by satellite remote sensing have proven to be powerful for monitoring the states of the ocean, both physical and biological. With the help of these data it is possible to understand certain variations in biological responses of marine phytoplankton on ecological time scales. Here, we implement a sequential data-assimilation technique to estimate from a conventional nutrient–phytoplankton–zooplankton (NPZ) model the time variations of observed and unobserved variables. In addition, we estimate the time evolution of two biological parameters, namely, the specific growth rate and specific mortality of phytoplankton. Our study demonstrates that: (i) the series of time-varying estimates of specific growth rate obtained by sequential data assimilation improves the fitting of the NPZ model to the satellite-derived time series: the model trajectories are closer to the observations than those obtained by implementing static values of the parameter; (ii) the estimates of unobserved variables, i.e., nutrient and zooplankton, obtained from an NPZ model by implementation of a pre-defined parameter evolution can be different from those obtained on applying the sequences of parameters estimated by assimilation; and (iii) the maximum estimated specific growth rate of phytoplankton in the study area is more sensitive to the sea-surface temperature than would be predicted by temperature-dependent functions reported previously. The overall results of the study are potentially useful for enhancing our understanding of the biological response of phytoplankton in a changing environment.
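A minimal sketch of the sequential-assimilation idea, assuming an ensemble Kalman-style update: the state is augmented with the specific growth rate, which random-walks between analysis steps and is corrected through its ensemble covariance with the observed phytoplankton. The toy NPZ step, parameter values, and observations below are all illustrative, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def npz_step(N, P, Z, mu, dt=1.0, g=0.4, m=0.1):
    """One Euler step of a toy NPZ model with per-member growth rate mu."""
    uptake = mu * N / (N + 0.5) * P
    graze = g * P * Z
    return N + dt * (-uptake + m * Z), P + dt * (uptake - graze), Z + dt * (graze - m * Z)

n_ens, obs_err = 50, 0.05
N, P, Z = np.full(n_ens, 1.0), np.full(n_ens, 0.2), np.full(n_ens, 0.1)
mu = rng.normal(0.6, 0.1, n_ens)                 # growth-rate ensemble

for P_obs in [0.25, 0.30, 0.28, 0.35]:           # illustrative observations
    N, P, Z = npz_step(N, P, Z, mu)
    mu = mu + rng.normal(0, 0.02, n_ens)         # random walk on the parameter
    C = np.cov(mu, P)                            # ensemble (mu, P) covariance
    innov = P_obs + rng.normal(0, obs_err, n_ens) - P   # perturbed-obs innovation
    mu = mu + (C[0, 1] / (C[1, 1] + obs_err**2)) * innov
    P = P + (C[1, 1] / (C[1, 1] + obs_err**2)) * innov
    # (N and Z are left unadjusted here for brevity)
    print(f"analysis-step growth-rate estimate: {mu.mean():.3f}")
```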
Abstract:
Numerical Weather Prediction (NWP) fields are used to assist the detection of cloud in satellite imagery. Simulated observations based on NWP are used within a framework based on Bayes' theorem to calculate a physically-based probability of each pixel within an imaged scene being clear or cloudy. Different thresholds can be set on the probabilities to create application-specific cloud masks. Here, this is done over both land and ocean using night-time (infrared) imagery. We use a validation dataset of difficult cloud detection targets for the Spinning Enhanced Visible and Infrared Imager (SEVIRI), achieving true skill scores of 87% and 48% for ocean and land, respectively, using the Bayesian technique, compared to 74% and 39%, respectively, for the threshold-based techniques associated with the validation dataset.
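The per-pixel core of such a scheme is a direct application of Bayes' theorem; a minimal sketch follows, in which the prior value, the likelihoods, and the 0.5 mask threshold are all illustrative.

```python
import numpy as np

def clear_sky_posterior(like_clear, like_cloudy, p_clear=0.7):
    """P(clear | obs) per pixel: likelihoods times an NWP-informed prior,
    normalised over the two hypotheses."""
    num = like_clear * p_clear
    return num / (num + like_cloudy * (1.0 - p_clear))

# Hypothetical likelihood values for three pixels
post = clear_sky_posterior(np.array([0.9, 0.2, 0.5]),
                           np.array([0.1, 0.7, 0.5]))
mask_cloudy = post < 0.5   # application-specific threshold -> cloud mask
print(post, mask_cloudy)
```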
Abstract:
Numerical Weather Prediction (NWP) fields are used to assist the detection of cloud in satellite imagery. Simulated observations based on NWP are used within a framework based on Bayes' theorem to calculate a physically-based probability of each pixel within an imaged scene being clear or cloudy. Different thresholds can be set on the probabilities to create application-specific cloud masks. Here, the technique is shown to be suitable for daytime applications over land and sea, using visible and near-infrared imagery in addition to thermal infrared. We use a validation dataset of difficult cloud detection targets for the Spinning Enhanced Visible and Infrared Imager (SEVIRI), achieving true skill scores of 89% and 73% for ocean and land, respectively, using the Bayesian technique, compared to 90% and 70%, respectively, for the threshold-based techniques associated with the validation dataset.
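The skill figures quoted in these validations are true skill scores, i.e. hit rate minus false-alarm rate (the Hanssen-Kuipers discriminant); a short sketch of that metric for a binary cloud mask, with made-up example labels:

```python
import numpy as np

def true_skill_score(predicted_cloudy, actually_cloudy):
    """Hit rate minus false-alarm rate for a binary cloud mask."""
    pred = np.asarray(predicted_cloudy, bool)
    truth = np.asarray(actually_cloudy, bool)
    hits = (pred & truth).sum()
    misses = (~pred & truth).sum()
    false_alarms = (pred & ~truth).sum()
    correct_negatives = (~pred & ~truth).sum()
    hit_rate = hits / (hits + misses)
    far = false_alarms / (false_alarms + correct_negatives)
    return hit_rate - far

print(true_skill_score([1, 1, 0, 0, 1], [1, 1, 0, 1, 0]))
```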
Abstract:
We propose and demonstrate a fully probabilistic (Bayesian) approach to the detection of cloudy pixels in thermal infrared (TIR) imagery observed from satellite over oceans. Using this approach, we show how to exploit the prior information and the fast forward modelling capability that are typically available in the operational context to obtain improved cloud detection. The probability of clear sky for each pixel is estimated by applying Bayes' theorem, and we describe how to apply Bayes' theorem to this problem in general terms. Joint probability density functions (PDFs) of the observations in the TIR channels are needed; the PDFs for clear conditions are calculable from forward modelling and those for cloudy conditions have been obtained empirically. Using analysis fields from numerical weather prediction as prior information, we apply the approach to imagery representative of imagers on polar-orbiting platforms. In comparison with the established cloud-screening scheme, the new technique decreases both the rate of failure to detect cloud contamination and the false-alarm rate by one quarter. The rate of occurrence of cloud-screening-related errors of >1 K in area-averaged sea surface temperatures (SSTs) is reduced by 83%.
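A hedged sketch of how the two likelihoods entering Bayes' theorem might be tabulated, following the abstract's split: a clear-sky PDF from forward modelling, idealised here as a Gaussian in observed-minus-simulated brightness temperature, and an empirical cloudy-sky PDF, here a toy flat histogram. The error standard deviation and bin layout are assumptions.

```python
import numpy as np

def clear_likelihood(obs_bt, simulated_bt, total_sd=0.6):
    """Gaussian PDF of observed minus forward-modelled brightness
    temperature (K); total_sd is an assumed combined error."""
    z = (obs_bt - simulated_bt) / total_sd
    return np.exp(-0.5 * z * z) / (total_sd * np.sqrt(2 * np.pi))

# Empirical cloudy-sky PDF tabulated from labelled data (toy flat version)
bt_bins = np.arange(250.0, 310.0, 2.0)                  # 2 K bins, 250-308 K
cloudy_hist = np.full(len(bt_bins) - 1, 1.0 / 58.0)     # normalised density

def cloudy_likelihood(obs_bt):
    i = np.clip(np.searchsorted(bt_bins, obs_bt) - 1, 0, len(cloudy_hist) - 1)
    return cloudy_hist[i]

print(clear_likelihood(288.7, 289.0), cloudy_likelihood(288.7))
```

These two values would then be combined with the prior in the posterior calculation sketched earlier.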
Abstract:
Background: Affymetrix GeneChip arrays are widely used for transcriptomic studies in a diverse range of species. Each gene is represented on a GeneChip array by a probe-set, consisting of up to 16 probe-pairs. Signal intensities across probe-pairs within a probe-set vary in part due to different physical hybridisation characteristics of individual probes with their target labelled transcripts. We have previously developed a technique to study the transcriptomes of heterologous species based on hybridising genomic DNA (gDNA) to a GeneChip array designed for a different species, and subsequently using only those probes with good homology. Results: Here we have investigated the effects of hybridising homologous species gDNA to study the transcriptomes of species for which the arrays have been designed. Genomic DNA from Arabidopsis thaliana and rice (Oryza sativa) was hybridised to the Affymetrix Arabidopsis ATH1 and Rice Genome GeneChip arrays, respectively. Probe selection based on gDNA hybridisation intensity increased the number of genes identified as significantly differentially expressed in two published studies of Arabidopsis development, and optimised the analysis of technical replicates obtained from pooled samples of RNA from rice. Conclusion: This mixed physical and bioinformatics approach can be used to optimise estimates of gene expression when using GeneChip arrays.
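A minimal sketch of the probe-selection idea described above: retain only probes whose gDNA hybridisation intensity clears a threshold, then summarise each probe-set from the retained probes. The probe-set name, intensities, and threshold are invented for illustration.

```python
import numpy as np

def select_probes(gdna_intensity, threshold=300.0):
    """Per probe-set boolean mask of probes whose gDNA intensity
    exceeds the threshold."""
    return {ps: vals > threshold for ps, vals in gdna_intensity.items()}

# Hypothetical gDNA and RNA intensities for one five-probe probe-set
gdna = {"AT1G01010_at": np.array([850., 120., 640., 95., 710.])}
rna = {"AT1G01010_at": np.array([5.1, 1.0, 4.8, 0.9, 5.3])}

mask = select_probes(gdna)
for ps in rna:
    kept = rna[ps][mask[ps]]          # expression summarised from kept probes only
    print(ps, "expression estimate:", kept.mean())
```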
Abstract:
Research in social psychology has shown that public attitudes towards feminism are mostly based on stereotypical views linking feminism with leftist politics and lesbian orientation. It is claimed that such attitudes are due to the negative and sexualised media construction of feminism. Studies concerned with the media representation of feminism seem to confirm this tendency. While most of this research provides significant insights into the representation of feminism, the findings are often based on a small sample of texts. Also, most of the research was conducted in an Anglo-American setting. This study attempts to address some of the shortcomings of previous work by examining the discourse of feminism in a large corpus of German and British newspaper data. It does so by employing the tools of Corpus Linguistics. By investigating the collocation profiles of the search term feminism, we provide evidence of salient discourse patterns surrounding feminism in two different cultural contexts.
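A bare-bones sketch of a collocation profile of the kind described: count words co-occurring with the node word within a fixed window. Real corpus tools add association measures such as log-likelihood and handle lemmatisation; the window size and toy sentence here are illustrative.

```python
from collections import Counter

def collocates(tokens, node="feminism", window=5):
    """Frequency profile of words within +/- window tokens of the node."""
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok.lower() == node:
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            counts.update(t.lower() for t in tokens[lo:hi] if t.lower() != node)
    return counts

text = "critics of feminism linked feminism with radical politics".split()
print(collocates(text).most_common(5))
```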
Abstract:
Two previous reconstructions of palaeovegetation across the whole of China were performed using a simple classification of plant functional types (PFTs). Now a more explicit, global PFT classification scheme has been developed, and a substantial number of additional pollen records have become available. Here we apply the global scheme of PFTs to a comprehensive set of pollen records available from China to test the applicability of the global scheme of PFTs in China, and to obtain a well-founded reconstruction of changing palaeovegetation patterns. A total of 806 pollen surface samples, 188 mid-Holocene (MH, 6000 14C yr BP) and 50 last glacial maximum (LGM, 18,000 14C yr BP) pollen records were used to reconstruct vegetation patterns in China, based on a new global classification system of PFTs and a standard numerical technique for biome assignment (biomization). The biome reconstruction based on pollen surface samples showed convincing agreement with present potential natural vegetation. Coherent patterns of change in biome distribution between MH, LGM and present are observed. In the MH, cold and cool-temperate evergreen needleleaf forests and mixed forests, temperate deciduous broadleaf forest, and warm-temperate evergreen broadleaf and mixed forest in eastern China were shifted northward by 200–500 km. Cold-deciduous forest in northeastern China was replaced by cold evergreen needleleaf forest while in central northern China, cold-deciduous forest was present at some sites now occupied by temperate grassland and desert. The forest–grassland boundary was 200–300 km west of its present position. Temperate xerophytic shrubland, temperate grassland and desert covered a large area on the Tibetan Plateau, but the area of tundra was reduced. Treeline was 300–500 m higher than present in Tibet. These changes imply generally warmer winters, longer growing seasons and more precipitation during the MH. Westward shifts of the forest–shrubland–grassland and grassland–desert boundaries imply greater moisture availability in the MH, consistent with a stronger summer monsoon. During the LGM, in contrast, cold-deciduous forest, cool-temperate evergreen needleleaf forest, cool mixed forests, warm-temperate evergreen broadleaf and mixed forest in eastern China were displaced to the south by 300–1000 km, while temperate deciduous broadleaf forest, pure warm-temperate evergreen forest, tropical semi-evergreen and evergreen broadleaf forests were restricted or absent from the mainland of southern China, implying colder winters than present. Strong shifts of temperate xerophytic shrubland, temperate grassland and desert to the south and east in northern and western China and on the Tibetan Plateau imply drier conditions than present.
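A simplified sketch of the biomization step: each biome's affinity score sums square-rooted pollen percentages (above a small threshold) of the taxa linked to it via PFTs, and the sample is assigned to the best-scoring biome. The taxon-to-biome fragment and sample below are invented for illustration.

```python
import numpy as np

# Illustrative fragment of a taxon-to-biome assignment table (via PFTs)
BIOME_TAXA = {
    "temperate deciduous forest": ["Quercus", "Ulmus", "Tilia"],
    "temperate grassland": ["Poaceae", "Artemisia"],
}

def biomize(pollen_pct, threshold=0.5):
    """Assign a pollen sample (taxon -> percentage) to the biome with the
    highest affinity score."""
    scores = {}
    for biome, taxa in BIOME_TAXA.items():
        vals = [max(pollen_pct.get(t, 0.0) - threshold, 0.0) for t in taxa]
        scores[biome] = sum(np.sqrt(v) for v in vals)
    return max(scores, key=scores.get), scores

sample = {"Quercus": 35.0, "Ulmus": 10.0, "Tilia": 5.0,
          "Poaceae": 15.0, "Artemisia": 5.0}
print(biomize(sample))
```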
Abstract:
This paper examines the lead–lag relationship between the FTSE 100 index and index futures price employing a number of time series models. Using 10-min observations over June 1996–1997, it is found that lagged changes in the futures price can help to predict changes in the spot price. The best forecasting model is of the error correction type, allowing for the theoretical difference between spot and futures prices according to the cost-of-carry relationship. This predictive ability is in turn utilised to derive a trading strategy, which is tested under real-world conditions to search for systematic profitable trading opportunities. It is revealed that although the model forecasts produce significantly higher returns than a passive benchmark, the model was unable to outperform the benchmark after allowing for transaction costs.
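A minimal sketch of an error correction specification of the kind described, assuming the lagged spot-futures spread as the error-correction term; the exact lag structure and cost-of-carry adjustment in the paper differ, and the data below are synthetic.

```python
import numpy as np

def fit_ecm(spot, fut):
    """OLS fit of: d(spot)_t = const + b1*d(spot)_{t-1} + b2*d(fut)_{t-1}
    + b3*(spot - fut)_{t-1}, the last term being the error-correction term."""
    ds, df = np.diff(spot), np.diff(fut)
    spread = (spot - fut)[1:-1]                        # lagged spread
    X = np.column_stack([np.ones(len(ds) - 1), ds[:-1], df[:-1], spread])
    y = ds[1:]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta                                        # const, lag spot, lag fut, ECT

rng = np.random.default_rng(1)
fut = np.cumsum(rng.normal(0, 0.1, 500))
spot = fut + rng.normal(0, 0.05, 500)                  # cointegrated by construction
print(fit_ecm(spot, fut))
```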
Abstract:
14C-dated pollen and lake-level data from Europe are used to assess the spatial patterns of climate change between 6000 yr BP and present, as simulated by the NCAR CCM1 (National Center for Atmospheric Research, Community Climate Model, version 1) in response to the change in the Earth’s orbital parameters during this period. First, reconstructed 6000 yr BP values of bioclimate variables obtained from pollen and lake-level data with the constrained-analogue technique are compared with simulated values. Then a 6000 yr BP biome map obtained from pollen data with an objective biome reconstruction (biomization) technique is compared with BIOME model results derived from the same simulation. Data and simulations agree in some features: warmer-than-present growing seasons in N and C Europe allowed forests to extend further north and to higher elevations than today, and warmer winters in C and E Europe prevented boreal conifers from spreading west. More generally, however, the agreement is poor. Predominantly deciduous forest types in Fennoscandia imply warmer winters than the model allows. The model fails to simulate winters cold enough, or summers wet enough, to allow temperate deciduous forests their former extended distribution in S Europe, and it incorrectly simulates a much expanded area of steppe vegetation in SE Europe. Similar errors have also been noted in numerous 6000 yr BP simulations with prescribed modern sea surface temperatures. These errors are evidently not resolved by the inclusion of interactive sea-surface conditions in the CCM1. Accurate representation of mid-Holocene climates in Europe may require the inclusion of dynamical ocean–atmosphere and/or vegetation–atmosphere interactions that most palaeoclimate model simulations have so far disregarded.
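A sketch of the analogue idea underlying such reconstructions: match each fossil pollen sample to its closest modern samples by squared-chord distance and average their observed climates. The constraint that gives the "constrained-analogue" variant its name is omitted here, and all data are synthetic.

```python
import numpy as np

def analogue_climate(fossil, modern_spectra, modern_climate, k=5):
    """Average the climates of the k modern samples closest to the fossil
    sample under the squared-chord distance on pollen proportions."""
    d = ((np.sqrt(modern_spectra) - np.sqrt(fossil)) ** 2).sum(axis=1)
    nearest = np.argsort(d)[:k]
    return modern_climate[nearest].mean(axis=0)

rng = np.random.default_rng(2)
modern = rng.dirichlet(np.ones(10), size=200)      # 200 modern pollen spectra
climate = rng.normal(size=(200, 2))                # e.g. (T_growing, P_annual)
fossil = np.clip(modern[17] + rng.normal(0, 0.01, 10), 0, None)
print(analogue_climate(fossil, modern, climate))
```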
Abstract:
Matrix-assisted laser desorption/ionisation (MALDI) mass spectrometry (MS) is a highly versatile and sensitive analytical technique, which is known for its soft ionisation of biomolecules such as peptides and proteins. Generally, MALDI MS analysis requires little sample preparation, and in some cases, such as MS profiling, it can be automated through the use of robotic liquid-handling systems. For more than a decade now, MALDI MS has been extensively utilised in the search for biomarkers that could aid clinicians in diagnosis, prognosis, and treatment decision making. This review examines how the various MALDI-based MS techniques, such as MS imaging, MS profiling, and in-depth proteomic analysis in which MALDI MS follows fractionation and separation methods such as gel electrophoresis, have contributed to prostate cancer biomarker research.
Abstract:
Automatic generation of classification rules has been an increasingly popular technique in commercial applications such as Big Data analytics, rule-based expert systems and decision making systems. However, a principal problem that arises with most methods for the generation of classification rules is the overfitting of training data. When Big Data is dealt with, this may result in the generation of a large number of complex rules. This may not only increase computational cost but also lower the accuracy in predicting further unseen instances. This has led to the necessity of developing pruning methods for the simplification of rules. In addition, classification rules are used further to make predictions after the completion of their generation. Where efficiency is concerned, it is desirable to find the first rule that fires as quickly as possible when searching through a rule set. Thus a suitable structure is required to represent the rule set effectively. In this chapter, the authors introduce a unified framework for the construction of rule-based classification systems consisting of three operations on Big Data: rule generation, rule simplification and rule representation. The authors also review some existing methods and techniques used for each of the three operations and highlight their limitations. They introduce some novel methods and techniques they have recently developed. These methods and techniques are also discussed in comparison to existing ones with respect to efficient processing of Big Data.
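As a concrete illustration of the rule-representation concern raised above, here is one simple representation: an ordered decision list searched until the first rule fires, so prediction stops at the earliest match rather than scanning the whole rule set. The rules themselves are invented, and the chapter's own representations may differ.

```python
# Ordered decision list: (conditions, predicted class), plus a default class
RULES = [
    ({"outlook": "sunny", "humidity": "high"}, "no"),
    ({"outlook": "overcast"}, "yes"),
    ({"outlook": "rain"}, "yes"),
]
DEFAULT = "yes"

def predict(instance, rules=RULES):
    """Return the class of the first rule whose conditions all match."""
    for conditions, label in rules:
        if all(instance.get(attr) == value for attr, value in conditions.items()):
            return label          # first firing rule wins; search stops here
    return DEFAULT

print(predict({"outlook": "sunny", "humidity": "high", "wind": "weak"}))
```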
Abstract:
The current state of the art in the planning and coordination of autonomous vehicles is based upon the presence of speed lanes. In a traffic scenario where there is a large diversity between vehicles, the removal of speed lanes can generate a significantly higher traffic bandwidth. Vehicle navigation in such unorganized traffic is considered. An evolution-based trajectory planning technique has the advantage of making driving efficient and safe; however, it must also overcome the hurdle of computational cost. In this paper, we propose a real-time genetic algorithm with Bezier curves for trajectory planning. The main contribution is the integration of vehicle-following and overtaking behaviour for general traffic as heuristics for the coordination between vehicles. The resultant coordination strategy is fast and near-optimal. As the vehicles move, uncertainties may arise; these are constantly adapted to and may even lead to the cancellation of an overtaking procedure or the initiation of one. Higher-level planning is performed by Dijkstra's algorithm, which indicates the route to be followed by the vehicle in a road network. Re-planning is carried out when a road blockage or obstacle is detected. Experimental results confirm the success of the algorithm subject to optimal high- and low-level planning, re-planning and overtaking.
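A hedged sketch of the planner's core loop: a genetic algorithm evolves the two free control points of a cubic Bezier curve between fixed start and goal positions, scoring candidates by path length plus an obstacle-clearance penalty. Population size, genetic operators, and cost weights are illustrative, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(3)
START, GOAL = np.array([0.0, 0.0]), np.array([10.0, 0.0])
OBSTACLE, R = np.array([5.0, 0.0]), 1.5            # circular obstacle, radius

def bezier(p1, p2, n=50):
    """Cubic Bezier curve from START to GOAL with control points p1, p2."""
    t = np.linspace(0, 1, n)[:, None]
    return ((1 - t) ** 3 * START + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * GOAL)

def cost(genome):
    """Path length plus a penalty for entering the obstacle's radius."""
    pts = bezier(genome[:2], genome[2:])
    length = np.linalg.norm(np.diff(pts, axis=0), axis=1).sum()
    clearance = np.linalg.norm(pts - OBSTACLE, axis=1)
    return length + np.maximum(R - clearance, 0).sum() * 100.0

pop = rng.uniform(-5, 15, size=(40, 4))            # genomes: (x1, y1, x2, y2)
for _ in range(60):                                # generations
    order = np.argsort([cost(g) for g in pop])
    elite = pop[order[:10]]                        # elitist selection
    parents = elite[rng.integers(0, 10, size=(30, 2))]
    children = parents.mean(axis=1) + rng.normal(0, 0.3, (30, 4))  # crossover + mutation
    pop = np.vstack([elite, children])

best = pop[np.argmin([cost(g) for g in pop])]
print("best control points:", best.round(2), "cost:", round(cost(best), 2))
```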