939 results for "Frequency range selection"


Relevance: 30.00%

Abstract:

This letter presents an effective approach for selecting an appropriate terrain modeling method when forming a digital elevation model (DEM), balancing modeling accuracy against modeling speed. A terrain complexity index is defined to represent a terrain's complexity. A support vector machine (SVM) classifies terrain surfaces as either complex or moderate based on this index together with the terrain elevation range. The classification result recommends a terrain modeling method for a given data set in accordance with its required modeling accuracy. Sample terrain data from the lunar surface are used to construct an experimental data set. The results show that the terrain complexity index properly reflects terrain complexity, and that an SVM classifier derived from both the terrain complexity index and the terrain elevation range is more effective and generic than one designed from either feature alone. Statistically, the average classification accuracy of the SVMs is about 84.3% ± 0.9% across terrain types (complex or moderate). For various ratios of complex to moderate terrain in a selected data set, the DEM modeling speed increases by up to 19.5% for a given DEM accuracy.
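The classification step described above can be sketched with a generic two-feature SVM. The synthetic features, the labelling rule, and the RBF kernel below are illustrative assumptions, not the paper's actual definitions.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 400
complexity_index = rng.uniform(0.0, 1.0, n)    # hypothetical terrain complexity index
elevation_range = rng.uniform(0.0, 2000.0, n)  # elevation range in metres (invented)

# Assumed labelling rule: "complex" (1) when the combined, scaled features are high.
labels = (complexity_index + elevation_range / 2000.0 > 1.0).astype(int)

# Two-feature SVM, as in the abstract; the RBF kernel is an assumption.
X = np.column_stack([complexity_index, elevation_range / 2000.0])
scores = cross_val_score(SVC(kernel="rbf"), X, labels, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

On this synthetic data, dropping either feature column noticeably lowers the cross-validated accuracy, mirroring the abstract's finding that the two-feature classifier outperforms either single-feature version.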

Relevance: 30.00%

Abstract:

The Code for Sustainable Homes (the Code) will require new homes in the United Kingdom to be ‘zero carbon’ from 2016. Drawing upon an evolutionary innovation perspective, this paper addresses a gap in the literature by investigating which low and zero carbon technologies are actually being used by house builders, rather than the prevailing emphasis on the potential of these technologies. Using the results from a questionnaire, three empirical contributions are made. First, house builders are selecting a narrow range of technologies. Second, these choices are made to minimise disruption to their standard design and production templates (SDPTs). Finally, the coalescence around a small group of technologies is expected to intensify, with solar-based technologies predicted to become more important. This paper challenges the dominant technical rationality in the literature, which holds that technical efficiency and cost benefits are the primary drivers for technology selection. These drivers play an important role, but one mediated by the logic of maintaining the house builders' SDPTs. This emphasises the need for construction diffusion-of-innovation theory to be problematized and developed within the context of business and market regimes constrained and reproduced by resilient technological trajectories.

Relevance: 30.00%

Abstract:

By combining electrostatic measurements of lightning-induced electrostatic field changes with radio-frequency lightning location, some field changes from exceptionally distant lightning events become apparent that are inconsistent with the usual inverse-cube dependence on distance. Furthermore, by using two measurement sites, a transition zone can be identified beyond which the electric field response reverses polarity. For these severe lightning events, we infer a horizontally extensive charge sheet above a thunderstorm, consistent with a mesospheric halo of several hundred kilometers’ extent.

Relevance: 30.00%

Abstract:

Background: Affymetrix GeneChip arrays are widely used for transcriptomic studies in a diverse range of species. Each gene is represented on a GeneChip array by a probe-set consisting of up to 16 probe-pairs. Signal intensities across probe-pairs within a probe-set vary in part due to the different physical hybridisation characteristics of individual probes with their target labelled transcripts. We have previously developed a technique to study the transcriptomes of heterologous species, based on hybridising genomic DNA (gDNA) to a GeneChip array designed for a different species and subsequently using only those probes with good homology. Results: Here we have investigated the effects of hybridising homologous-species gDNA to study the transcriptomes of species for which the arrays have been designed. Genomic DNA from Arabidopsis thaliana and rice (Oryza sativa) was hybridised to the Affymetrix Arabidopsis ATH1 and Rice Genome GeneChip arrays, respectively. Probe selection based on gDNA hybridisation intensity increased the number of genes identified as significantly differentially expressed in two published studies of Arabidopsis development, and optimised the analysis of technical replicates obtained from pooled samples of RNA from rice. Conclusion: This mixed physical and bioinformatics approach can be used to optimise estimates of gene expression when using GeneChip arrays.

Relevance: 30.00%

Abstract:

High-density oligonucleotide (oligo) arrays are a powerful tool for transcript profiling. Arrays based on GeneChip® technology are amongst the most widely used, although GeneChip® arrays are currently available for only a small number of plant and animal species. Thus, we have developed a method to improve the sensitivity of high-density oligonucleotide arrays when applied to heterologous species, and tested the method by analysing the transcriptome of Brassica oleracea L., a species for which no GeneChip® array is available, using a GeneChip® array designed for Arabidopsis thaliana (L.) Heynh. Genomic DNA from B. oleracea was labelled and hybridised to the ATH1-121501 GeneChip® array. Arabidopsis thaliana probe-pairs that hybridised to the B. oleracea genomic DNA on the basis of the perfect-match (PM) probe signal were then selected for subsequent B. oleracea transcriptome analysis, using a .cel file parser script to generate probe mask files. The transcriptional response of B. oleracea to a mineral nutrient (phosphorus; P) stress was quantified using probe mask files generated for a wide range of gDNA hybridisation intensity thresholds. An example probe mask file generated with a gDNA hybridisation intensity threshold of 400 removed >68% of the available PM probes from the analysis but retained >96% of the available A. thaliana probe-sets. Ninety-nine of these genes were then identified as significantly regulated under P stress in B. oleracea, including homologues of P-stress-responsive genes in A. thaliana. Increasing the gDNA hybridisation intensity threshold up to 500 for probe selection increased the sensitivity of the GeneChip® array to detect regulation of gene expression in B. oleracea under P stress by up to 13-fold.
Our open-source software to create probe mask files is freely available at http://affymetrix.arabidopsis.info/xspecies/ and may be used to facilitate transcriptomic analyses of a wide range of plant and animal species in the absence of custom arrays.
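The probe-masking step described in these two abstracts can be sketched as a simple threshold filter: keep only PM probes whose gDNA hybridisation signal exceeds a threshold, grouped by probe-set. Only the threshold value of 400 comes from the text; the probe-set identifiers and intensities below are invented.

```python
# (probe-set ID, probe index) -> PM gDNA hybridisation signal (invented values)
gdna_intensity = {
    ("AT1G01010", 1): 850.0,
    ("AT1G01010", 2): 120.0,
    ("AT1G01010", 3): 410.0,
    ("AT1G01020", 1): 95.0,
    ("AT1G01020", 2): 300.0,
}

def build_probe_mask(intensities, threshold=400.0):
    """Return probes passing the gDNA intensity threshold, grouped by probe-set."""
    mask = {}
    for (probe_set, idx), signal in intensities.items():
        if signal > threshold:
            mask.setdefault(probe_set, []).append(idx)
    return mask

mask = build_probe_mask(gdna_intensity)
print(mask)  # only probe-sets retaining at least one probe appear
```

Raising the threshold discards more probes per probe-set, which is the trade-off between probe quality and probe-set retention the abstracts quantify.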

Relevance: 30.00%

Abstract:

Radar refractivity retrievals can capture near-surface humidity changes, but noisy phase changes of the ground clutter returns limit the accuracy for both klystron- and magnetron-based systems. Observations with a C-band (5.6 cm) magnetron weather radar indicate that the correction for phase changes introduced by local oscillator frequency changes leads to refractivity errors no larger than 0.25 N units: equivalent to a relative humidity change of only 0.25% at 20°C. Requested stable local oscillator (STALO) frequency changes were accurate to 0.002 ppm based on laboratory measurements. More serious are the random phase change errors introduced when targets are not at the range-gate center and there are changes in the transmitter frequency (ΔfTx) or the refractivity (ΔN). Observations at C band with a 2-μs pulse show an additional 66° of phase change noise for a ΔfTx of 190 kHz (34 ppm); this allows the effect due to ΔN to be predicted. Even at S band with klystron transmitters, significant phase change noise should occur when a large ΔN develops relative to the reference period [e.g., ~55° when ΔN = 60 for the Next Generation Weather Radar (NEXRAD) radars]. At shorter wavelengths (e.g., C and X band) and with magnetron transmitters in particular, refractivity retrievals relative to an earlier reference period are even more difficult, and operational retrievals may be restricted to changes over shorter (e.g., hourly) periods of time. Target location errors can be reduced by using a shorter pulse or identified by a new technique making alternate measurements at two closely spaced frequencies, which could even be achieved with a dual–pulse repetition frequency (PRF) operation of a magnetron transmitter.

Relevance: 30.00%

Abstract:

Seamless phase II/III clinical trials are conducted in two stages, with treatment selection at the first stage. In the first stage, patients are randomized to a control or one of k > 1 experimental treatments. At the end of this stage, interim data are analysed, and a decision is made concerning which experimental treatment should continue to the second stage. If the primary endpoint is observable only after some period of follow-up, then at the interim analysis data may be available on some early outcome for a larger number of patients than those for whom the primary endpoint is available. These early endpoint data can thus be used for treatment selection. For two previously proposed approaches, the power has been shown to be greater for one or the other method, depending on the true treatment effects and correlations. We propose a new approach that builds on the previously proposed approaches and uses data available at the interim analysis to estimate these parameters and then, on the basis of these estimates, chooses the treatment selection method with the highest probability of correctly selecting the most effective treatment. This method is shown to perform well compared with the two previously described methods for a wide range of true parameter values. In most cases, the performance of the new method is similar to, and in some cases better than, that of either of the two previously proposed methods.

Relevance: 30.00%

Abstract:

Concern that European forest biodiversity is depleted and declining has provoked widespread efforts to improve management practices. To gauge the success of these actions, appropriate monitoring of forest ecosystems is paramount. Multi-species indicators are frequently used to assess the state of biodiversity and its response to implemented management, but generally applicable and objective methodologies for species selection are lacking. Here we use a niche-based approach, underpinned by coarse quantification of species' resource use, to objectively select species for inclusion in a pan-European forest bird indicator. We identify both the minimum number of species required to deliver full resource coverage and the most sensitive combination of species, and explore the trade-off between two key characteristics, sensitivity and redundancy, associated with indicators comprising different numbers of species. We compare our indicator to an existing forest bird indicator selected on the basis of expert opinion and show that it is more representative of the wider community. We also present alternative indicators for regional and forest-type-specific monitoring, and show that the choice of species can have a significant impact on the indicator and on consequent projections about the state of the biodiversity it represents. Furthermore, by comparing indicator sets drawn from currently monitored species and from the full forest bird community, we identify gaps in the coverage of the current monitoring scheme. We believe that adopting this niche-based framework for species selection supports the objective development of multi-species indicators and has good potential to be extended to a range of habitats and taxa.
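The "minimum number of species required to deliver full resource coverage" step above is an instance of set cover, which can be sketched with the classic greedy heuristic. The species names and resource categories below are invented for illustration; the study's actual resource quantification is coarser-grained but analogous.

```python
# Invented species -> resource-use sets.
species_resources = {
    "great spotted woodpecker": {"deadwood", "canopy"},
    "wood warbler":             {"canopy", "understorey"},
    "eurasian treecreeper":     {"trunk", "deadwood"},
    "european robin":           {"understorey", "ground"},
    "eurasian wren":            {"ground"},
}

def greedy_cover(resource_use):
    """Greedily pick species until every resource category is covered."""
    uncovered = set().union(*resource_use.values())
    chosen = []
    while uncovered:
        # Pick the species covering the most still-uncovered resources.
        best = max(resource_use, key=lambda s: len(resource_use[s] & uncovered))
        chosen.append(best)
        uncovered -= resource_use[best]
    return chosen

indicator_set = greedy_cover(species_resources)
print(indicator_set)
```

Greedy set cover is not guaranteed minimal in general, but it gives a small covering set quickly, which is sufficient for exploring the sensitivity/redundancy trade-off the abstract describes.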

Relevance: 30.00%

Abstract:

Background: Dietary assessment methods are important tools for nutrition research. Online dietary assessment tools have the potential to become invaluable methods of assessing dietary intake because, compared with traditional methods, they have many advantages, including the automatic storage of input data and the immediate generation of nutritional outputs. Objective: The aim of this study was to develop an online food frequency questionnaire (FFQ) for dietary data collection in the “Food4Me” study and to compare this with the validated European Prospective Investigation of Cancer (EPIC) Norfolk printed FFQ. Methods: The Food4Me FFQ used in this analysis was developed to consist of 157 food items. Standardized color photographs were incorporated in the development of the Food4Me FFQ to facilitate accurate quantification of the portion size of each food item. Participants were recruited in two centers (Dublin, Ireland and Reading, United Kingdom) and each received the online Food4Me FFQ and the printed EPIC-Norfolk FFQ in random order. Participants completed the Food4Me FFQ online and, for most food items, were requested to choose their usual serving size among seven possibilities from a range of portion size pictures. The level of agreement between the two methods was evaluated for both nutrient and food group intakes using the Bland and Altman method and classification into quartiles of daily intake. Correlations were calculated for nutrient and food group intakes. Results: A total of 113 participants were recruited, with a mean age of 30 (SD 10) years (40.7% male, 46/113; 59.3% female, 67/113). Cross-classification into exact plus adjacent quartiles ranged from 77% to 97% at the nutrient level and 77% to 99% at the food group level. Agreement at the nutrient level was highest for alcohol (97%) and lowest for percent energy from polyunsaturated fatty acids (77%). Crude unadjusted correlations for nutrients ranged between .43 and .86.
Agreement at the food group level was highest for “other fruits” (eg, apples, pears, oranges) and lowest for “cakes, pastries, and buns”. For food groups, correlations ranged between .41 and .90. Conclusions: The results demonstrate that the online Food4Me FFQ has good agreement with the validated printed EPIC-Norfolk FFQ for assessing both nutrient and food group intakes, rendering it a useful tool for ranking individuals based on nutrient and food group intakes.
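The Bland and Altman agreement analysis used above can be sketched as follows: for paired intakes from the two questionnaires, the bias is the mean difference and the 95% limits of agreement are bias ± 1.96 SD of the differences. The paired intake values below are invented.

```python
import statistics

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement for paired measurements."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Invented daily intakes (g) of one nutrient from the two questionnaires.
online_ffq = [210.0, 185.0, 240.0, 199.0, 225.0]
printed_ffq = [205.0, 190.0, 230.0, 200.0, 220.0]

bias, (lo, hi) = bland_altman(online_ffq, printed_ffq)
print(f"bias {bias:.1f} g, 95% limits of agreement ({lo:.1f}, {hi:.1f}) g")
```

A bias near zero with narrow limits of agreement indicates the two methods can be used interchangeably for ranking individuals, which is the conclusion the study draws.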

Relevance: 30.00%

Abstract:

This article describes a case study involving information technology managers and their new programmer recruitment policy, but the primary interest is methodological. The processes of issue generation and selection and of model conceptualization are described. Early use of “magnetic hexagons” allowed the generation of a range of issues, most of which would not have emerged if system dynamics elicitation techniques had been employed. With the selection of a specific issue, flow diagramming was used to conceptualize a model, with computer implementation and scenario generation following naturally. Observations are made on the processes of system dynamics modeling, particularly on the need to employ general techniques of knowledge elicitation in the early stages of interventions. It is proposed that flexible approaches should be used to generate, select, and study the issues, since these reduce any biasing of the elicitation toward system dynamics problems and also allow the participants to take up the most appropriate problem-structuring approach.

Relevance: 30.00%

Abstract:

The equations of Milsom are evaluated, giving the ground range and group delay of radio waves propagated via the horizontally stratified model ionosphere proposed by Bradley and Dudeney. Expressions for the ground range which allow for the effects of the underlying E- and F1-regions are used to evaluate the basic maximum usable frequency or M-factors for single F-layer hops. An algorithm for the rapid calculation of the M-factor at a given range is developed, and shown to be accurate to within 5%. The results reveal that the M(3000)F2-factor scaled from vertical-incidence ionograms using the standard URSI procedure can be up to 7.5% in error. A simple addition to the algorithm effects a correction to ionogram values to make these accurate to 0.5%.

Relevance: 30.00%

Abstract:

Bacteria possess a range of mechanisms to move in different environments, and these mechanisms have important direct and correlated impacts on the virulence of opportunistic pathogens. Bacteria use two surface organelles to facilitate motility: a single polar flagellum and type IV pili, enabling swimming in aqueous habitats and twitching along hard surfaces, respectively. Here, we address whether there are trade-offs between these motility mechanisms, and hence whether different environments could select for altered motility. We experimentally evolved initially isogenic Pseudomonas aeruginosa under conditions that favored the different types of motility, and found evidence for a trade-off, mediated by antagonistic pleiotropy, between swimming and twitching. Moreover, changes in motility resulted in correlated changes in other behaviors, including biofilm formation and growth within an insect host. This suggests that the environmental origins of a particular motile opportunistic pathogen could predictably influence its motility and virulence.

Relevance: 30.00%

Abstract:

This article proposes a systematic approach to determining the most suitable analogue redesign method for forward-type converters under digital voltage-mode control. The focus of the method is to achieve the highest phase margin at the particular switching and crossover frequencies chosen by the designer. It is shown that at high crossover frequencies relative to the switching frequency, controllers designed using backward integration have the largest phase margin, whereas at low crossover frequencies relative to the switching frequency, controllers designed using bilinear integration with pre-warping have the largest phase margins. An algorithm has been developed to determine the frequency of the crossing point where the recommended discretisation method changes. An accurate model of the power stage is used for simulation, and experimental results from a buck converter are collected. The performance of the digital controllers is compared to that of the equivalent analogue controller in both simulation and experiment, with close agreement between the two. This work provides a concrete example to allow academics and engineers to systematically choose a discretisation method.
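The two discretisation methods compared above can be checked numerically. As a stand-in for the (unspecified) converter controller, the sketch below uses a first-order controller C(s) = 1/(1 + s/wp); the sampling frequency, pole, and crossover frequency are assumed values. Pre-warped bilinear (Tustin) integration reproduces the analogue response exactly at the pre-warp frequency, while backward Euler introduces phase lead there, consistent with the abstract's high-crossover finding.

```python
import cmath
import math

fs = 100e3                 # switching/sampling frequency, Hz (assumed)
T = 1 / fs
wp = 2 * math.pi * 2e3     # analogue controller pole, rad/s (assumed)
wc = 2 * math.pi * 20e3    # crossover frequency, deliberately high relative to fs

def C_analog(w):
    """First-order stand-in controller C(s) = 1 / (1 + s/wp) at s = jw."""
    return 1 / (1 + 1j * w / wp)

def C_discrete(w, s_of_z):
    """Evaluate the discretised controller at z = exp(jwT)."""
    z = cmath.exp(1j * w * T)
    return 1 / (1 + s_of_z(z) / wp)

# Two of the substitutions discussed above: s in terms of z.
backward = lambda z: (1 - 1 / z) / T
prewarped = lambda z: (wc / math.tan(wc * T / 2)) * (1 - 1 / z) / (1 + 1 / z)

target = cmath.phase(C_analog(wc))
errors = {}
for name, sub in [("backward Euler", backward), ("Tustin + pre-warp", prewarped)]:
    errors[name] = math.degrees(cmath.phase(C_discrete(wc, sub)) - target)
    print(f"{name}: phase error at crossover {errors[name]:+.2f} deg")
```

The pre-warped substitution maps z = exp(jwcT) exactly to s = jwc, so its phase error at wc is zero by construction; the backward-Euler phase lead is what raises the phase margin at high crossover frequencies.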

Relevance: 30.00%

Abstract:

Global controls on month-by-month fractional burnt area (2000–2005) were investigated by fitting a generalised linear model (GLM) to Global Fire Emissions Database (GFED) data, with 11 predictor variables representing vegetation, climate, land use and potential ignition sources. Burnt area is shown to increase with annual net primary production (NPP), number of dry days, maximum temperature, grazing-land area, grass/shrub cover and diurnal temperature range, and to decrease with soil moisture, cropland area and population density. Lightning showed an apparent (weak) negative influence, but this disappeared when pure seasonal-cycle effects were taken into account. The model predicts observed geographic and seasonal patterns, as well as the emergent relationships seen when burnt area is plotted against each variable separately. Unimodal relationships with mean annual temperature and precipitation, population density and gross domestic product (GDP) are reproduced too, and are thus shown to be secondary consequences of correlations between different controls (e.g. high NPP with high precipitation; low NPP with low population density and GDP). These findings have major implications for the design of global fire models, as several assumptions in current models – most notably, the widely assumed dependence of fire frequency on ignition rates – are evidently incorrect.

Relevance: 30.00%

Abstract:

Current UK intake of non-milk extrinsic sugars (NMES) is above recommendations. Reducing the sugar content of processed high-sugar foods through reformulation is one option for reducing consumption of NMES at a population level. However, reformulation can alter the sensory attributes of food products and influence consumer liking. This study evaluated consumer acceptance of a selection of products that are commercially available in the UK; these included regular and sugar-reduced baked beans, strawberry jam, milk chocolate, cola, and cranberry & raspberry juice. Sweeteners were present in the reformulated chocolate (maltitol), cola (aspartame and acesulfame-K) and juice (sucralose) samples. Healthy, non-smoking consumers (n = 116; 55 men, 61 women; age: 33 ± 9 years; BMI: 25.7 ± 4.6 kg/m2) rated the products for overall liking and on liking of appearance, flavor and texture using a nine-point hedonic scale. There were significant differences between standard and reduced-sugar products in consumers’ overall liking and in liking of each modality (appearance, flavor and texture; all P < 0.0001). For overall liking, only the regular beans and cola were significantly more liked than their reformulated counterparts (P < 0.0001). Cluster analysis identified three consumer clusters that were representative of different patterns of consumer liking. For the largest cluster (cluster 3: 45%), there was a significant difference in mean liking scores across all products except jam. Differences in liking were predominantly driven by sweet taste in two out of three clusters. The current research has demonstrated that a high proportion of consumers prefer conventional products over sugar-reduced products, whether across a wide range of product types (45%) or across selected products (27%), when tasted unbranded, and so there is room for further optimization of the commercial reduced-sugar products evaluated in the current study.
Future work should evaluate strategies to facilitate compliance with dietary recommendations on NMES and free sugars, such as the impact of exposure to sugar-reduced foods on their acceptance.
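The consumer-segmentation step above can be sketched with k-means clustering of hedonic scores. The scores, the product subset, and the use of k = 2 (rather than the three clusters the study reports) are simplifications for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
# Columns: 9-point liking of [regular, reduced] x [beans, cola] (toy subset).
prefers_regular = rng.normal([8, 4, 8, 4], 0.5, size=(40, 4))  # dislikes reduced-sugar
likes_either = rng.normal([7, 7, 7, 7], 0.5, size=(40, 4))     # indifferent to sugar level
scores = np.clip(np.vstack([prefers_regular, likes_either]), 1, 9)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(scores)
sizes = np.bincount(km.labels_)
print("cluster sizes:", sizes)
```

With well-separated liking patterns the clusters recover the two simulated consumer groups; on real panel data the cluster count would be chosen from the data, as in the study.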