18 results for Effective notch method

in CentAUR: Central Archive, University of Reading, UK


Relevance: 80.00%

Abstract:

A new state estimator algorithm is presented, based on a neurofuzzy network and the Kalman filter algorithm. The major contribution of the paper is the recognition of a bias problem in the parameter estimation of the state-space model and the introduction of a simple, effective prefiltering method that achieves unbiased parameter estimates in the state-space model, which are then applied for state estimation using the Kalman filtering algorithm. Fundamental to this method is a simple prefiltering procedure using a nonlinear principal component analysis method based on the neurofuzzy basis set. This prefiltering can be performed without prior knowledge of the system structure. Numerical examples demonstrate the effectiveness of the new approach.
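The state-estimation step the abstract refers to can be illustrated with a minimal scalar Kalman filter. This is only a hedged sketch on a hypothetical model x_k = a·x_{k-1} + w_k, y_k = x_k + v_k; the paper's neurofuzzy network and its bias-removing prefiltering stage are not reproduced here.

```python
# Minimal sketch (not the paper's implementation): a scalar Kalman filter
# on an assumed linear state-space model. The neurofuzzy prefiltering that
# the paper uses to de-bias the model parameters is omitted.
import random

def kalman_1d(ys, a=0.9, q=0.01, r=0.25, x0=0.0, p0=1.0):
    """Run a scalar Kalman filter over the measurement sequence ys."""
    x, p = x0, p0
    estimates = []
    for y in ys:
        # Predict step
        x_pred = a * x
        p_pred = a * p * a + q
        # Update step
        k = p_pred / (p_pred + r)        # Kalman gain
        x = x_pred + k * (y - x_pred)
        p = (1.0 - k) * p_pred
        estimates.append(x)
    return estimates

# Simulate a hypothetical AR(1) state observed in noise
random.seed(0)
truth, xs, ys = 0.0, [], []
for _ in range(200):
    truth = 0.9 * truth + random.gauss(0, 0.1)   # process noise, var 0.01
    xs.append(truth)
    ys.append(truth + random.gauss(0, 0.5))      # measurement noise, var 0.25
est = kalman_1d(ys)
```

With the model parameters known, the filtered estimates track the true state far more closely than the raw measurements do.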

Relevance: 80.00%

Abstract:

Background: Targeting Induced Local Lesions IN Genomes (TILLING) is increasingly being used to generate and identify mutations in target genes of crop genomes. TILLING populations of several thousand lines have been generated in a number of crop species including Brassica rapa. Genetic analysis of mutants identified by TILLING requires an efficient, high-throughput and cost-effective genotyping method to track the mutations through numerous generations. High resolution melt (HRM) analysis has been used in a number of systems to identify single nucleotide polymorphisms (SNPs) and insertions/deletions (InDels), enabling the genotyping of different types of samples. HRM is ideally suited to high-throughput genotyping of multiple TILLING mutants in complex crop genomes. To date it has been used to identify mutants and genotype single mutations. The aim of this study was to determine if HRM can facilitate downstream analysis of multiple mutant lines identified by TILLING in order to characterise allelic series of EMS-induced mutations in target genes across a number of generations in complex crop genomes. Results: We demonstrate that HRM can be used to genotype allelic series of mutations in two genes, BraA.CAX1.a and BraA.MET1.a, in Brassica rapa. We analysed 12 mutations in BraA.CAX1.a and five in BraA.MET1.a over two generations including a back-cross to the wild-type. Using a commercially available HRM kit and the Lightscanner™ system we were able to detect mutations in heterozygous and homozygous states for both genes. Conclusions: Using HRM genotyping on TILLING-derived mutants, it is possible to generate an allelic series of mutations within multiple target genes rapidly. Lines suitable for phenotypic analysis can be isolated approximately 8-9 months (3 generations) from receiving M3 seed of Brassica rapa from the RevGenUK TILLING service.

Relevance: 40.00%

Abstract:

The correlated k-distribution (CKD) method is widely used in the radiative transfer schemes of atmospheric models and involves dividing the spectrum into a number of bands and then reordering the gaseous absorption coefficients within each one. The fluxes and heating rates for each band may then be computed by discretizing the reordered spectrum into of order 10 quadrature points per major gas and performing a monochromatic radiation calculation for each point. In this presentation it is shown that for clear-sky longwave calculations, sufficient accuracy for most applications can be achieved without the need for bands: reordering may be performed on the entire longwave spectrum. The resulting full-spectrum correlated k (FSCK) method requires significantly fewer monochromatic calculations than standard CKD to achieve a given accuracy. The concept is first demonstrated by comparing with line-by-line calculations for an atmosphere containing only water vapor, in which it is shown that the accuracy of heating-rate calculations improves approximately in proportion to the square of the number of quadrature points. For more than around 20 points, the root-mean-squared error flattens out at around 0.015 K/day due to the imperfect rank correlation of absorption spectra at different pressures in the profile. The spectral overlap of m different gases is treated by considering an m-dimensional hypercube where each axis corresponds to the reordered spectrum of one of the gases. This hypercube is then divided up into a number of volumes, each approximated by a single quadrature point, such that the total number of quadrature points is slightly fewer than the sum of the number that would be required to treat each of the gases separately. 
The gaseous absorptions for each quadrature point are optimized such that they minimize a cost function expressing the deviation of the heating rates and fluxes calculated by the FSCK method from line-by-line calculations for a number of training profiles. This approach is validated for atmospheres containing water vapor, carbon dioxide, and ozone, in which it is found that in the troposphere and most of the stratosphere, heating-rate errors of less than 0.2 K/day can be achieved using a total of 23 quadrature points, decreasing to less than 0.1 K/day for 32 quadrature points. It would be relatively straightforward to extend the method to include other gases.
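The central trick of any correlated-k scheme, reordering a rapidly varying absorption spectrum so that a handful of quadrature points reproduces the band-mean transmittance, can be illustrated with a toy calculation. The spectrum below is synthetic and hypothetical; none of the FSCK optimization or gas-overlap treatment is reproduced.

```python
# Illustrative sketch of the k-distribution idea (not the FSCK code):
# the mean transmittance over a spectrum of absorption coefficients k
# equals the mean over the *sorted* (reordered) spectrum, and because the
# sorted k is smooth, a few quadrature points approximate it well.
import math
import random

random.seed(1)
# Hypothetical, highly variable absorption spectrum (arbitrary units),
# log-uniform over several orders of magnitude
k_spectrum = [math.exp(random.uniform(-6, 2)) for _ in range(10000)]
u = 1.0  # absorber amount (path)

# "Line-by-line" reference: exact mean transmittance over the spectrum
t_exact = sum(math.exp(-k * u) for k in k_spectrum) / len(k_spectrum)

# Reorder the spectrum and approximate with n quadrature intervals,
# each represented by a single point in cumulative-probability space
k_sorted = sorted(k_spectrum)
n = 10
t_quad = 0.0
for i in range(n):
    lo, hi = i * len(k_sorted) // n, (i + 1) * len(k_sorted) // n
    k_rep = k_sorted[(lo + hi) // 2]     # representative k for this interval
    t_quad += (hi - lo) / len(k_sorted) * math.exp(-k_rep * u)
```

Ten quadrature points stand in for ten thousand monochromatic calculations, which is why reducing the point count (as FSCK does relative to banded CKD) matters so much for model cost.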

Relevance: 40.00%

Abstract:

This letter presents an effective approach for selection of appropriate terrain modeling methods in forming a digital elevation model (DEM). This approach achieves a balance between modeling accuracy and modeling speed. A terrain complexity index is defined to represent a terrain's complexity. A support vector machine (SVM) classifies terrain surfaces into either complex or moderate based on this index associated with the terrain elevation range. The classification result recommends a terrain modeling method for a given data set in accordance with its required modeling accuracy. Sample terrain data from the lunar surface are used in constructing an experimental data set. The results have shown that the terrain complexity index properly reflects the terrain complexity, and that the SVM classifier derived from both the terrain complexity index and the terrain elevation range is more effective and generic than one designed from either the terrain complexity index or the terrain elevation range alone. The statistical results show that the average classification accuracy of the SVMs is about 84.3% ± 0.9% for the two terrain types (complex or moderate). For various ratios of complex and moderate terrain types in a selected data set, the DEM modeling speed increases by up to 19.5% for a given DEM accuracy.
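The two-feature classification setup the letter describes can be sketched as follows. The letter trains an SVM; as a deliberately simple stand-in, a perceptron plays the same role here on synthetic, hypothetical data, purely to show how a (complexity index, elevation range) pair maps to a complex/moderate label.

```python
# Stand-in sketch: a perceptron (not the letter's SVM) classifying
# synthetic terrain samples by two features, terrain complexity index
# and elevation range. Feature ranges below are hypothetical.
import random

random.seed(2)

def make_sample(is_complex):
    # Assumed: complex terrain has a higher complexity index and a
    # larger (normalized) elevation range than moderate terrain.
    if is_complex:
        return (random.uniform(0.6, 1.0), random.uniform(0.5, 1.0)), 1
    return (random.uniform(0.0, 0.4), random.uniform(0.0, 0.5)), -1

data = [make_sample(i % 2 == 0) for i in range(200)]

# Perceptron training on (complexity_index, elevation_range)
w, b = [0.0, 0.0], 0.0
for _ in range(20):                      # epochs
    for (x1, x2), label in data:
        if label * (w[0] * x1 + w[1] * x2 + b) <= 0:
            w[0] += label * x1
            w[1] += label * x2
            b += label

correct = sum(1 for (x1, x2), label in data
              if label * (w[0] * x1 + w[1] * x2 + b) > 0)
accuracy = correct / len(data)
```

On separable synthetic data the linear boundary is recovered exactly; the letter's SVM adds margin maximization and kernels on top of this same two-feature decision structure.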

Relevance: 30.00%

Abstract:

The difference between cirrus emissivities at 8 and 11 μm is sensitive to the mean effective ice crystal size of the cirrus cloud, De. By using single scattering properties of ice crystals shaped as planar polycrystals, diameters of up to about 70 μm can be retrieved, instead of up to 45 μm assuming spheres or hexagonal columns. The method described in this article is used for a global determination of mean effective ice crystal sizes of cirrus clouds from TOVS satellite observations. A sensitivity study of the De retrieval to uncertainties in hypotheses on ice crystal shape, size distributions, and temperature profiles, as well as in vertical and horizontal cloud heterogeneities, shows that uncertainties can be as large as 30%. However, the TOVS data set is one of the few data sets which provide global and long-term coverage. Analysis of the years 1987–1991 shows that the measured effective ice crystal diameters De are stable from year to year. For 1990 a global median De of 53.5 μm was determined. Averages distinguishing ocean/land, season, and latitude lie between 23 μm in winter over Northern Hemisphere midlatitude land and 64 μm in the tropics. In general, larger values of De are found in regions with higher atmospheric water vapor and for cirrus with a smaller effective emissivity.

Relevance: 30.00%

Abstract:

Details about the parameters of kinetic systems are crucial for progress in both medical and industrial research, including drug development, clinical diagnosis and biotechnology applications. Such details must be collected through a series of kinetic experiments and investigations. The correct design of the experiment is essential to collecting data suitable for analysis, modelling and deriving the correct information. We have developed a systematic and iterative Bayesian method and sets of rules for the design of enzyme kinetic experiments. Our method selects the optimum design to collect data suitable for accurate modelling and analysis and minimises the error in the estimated parameters. The rules select features of the design such as the substrate range and the number of measurements. We show here that this method can be directly applied to the study of other important kinetic systems, including drug transport, receptor binding, microbial culture and cell transport kinetics. It is possible to reduce the errors in the estimated parameters and, most importantly, to increase efficiency and cost-effectiveness by reducing the number of experiments and data points required. (C) 2003 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
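Why the choice of substrate range matters so much can be seen in a toy simulation. This does not reproduce the paper's Bayesian design machinery; it simply fits the standard Michaelis-Menten model to noisy synthetic data under two hypothetical designs and compares the resulting error in Km.

```python
# Toy illustration (not the paper's method): fitting v = Vmax*S/(Km+S)
# to noisy simulated rates recovers Km far more accurately when the
# substrate concentrations bracket the true Km than when every
# concentration is saturating. All numbers below are hypothetical.
import random

random.seed(3)
VMAX, KM = 1.0, 2.0                      # assumed "true" parameters

def fit_km(substrates):
    """Simulate one experiment and return a grid-search Km estimate."""
    data = [(s, VMAX * s / (KM + s) + random.gauss(0, 0.02))
            for s in substrates]
    # Crude least squares over Km only (Vmax held fixed for simplicity)
    best_km, best_sse = None, float("inf")
    for i in range(1, 1000):
        km = 0.01 * i
        sse = sum((v - VMAX * s / (km + s)) ** 2 for s, v in data)
        if sse < best_sse:
            best_km, best_sse = km, sse
    return best_km

reps = 10
good_design = [0.5, 1, 2, 4, 8, 16]      # brackets Km = 2
poor_design = [20, 30, 40, 50, 60, 70]   # all >> Km: rate is nearly flat
good_err = sum(abs(fit_km(good_design) - KM) for _ in range(reps)) / reps
poor_err = sum(abs(fit_km(poor_design) - KM) for _ in range(reps)) / reps
```

The saturating design carries almost no information about Km, which is exactly the kind of waste a principled design method is meant to avoid.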

Relevance: 30.00%

Abstract:

This paper deals with the key issues encountered in testing during the development of high-speed networking hardware systems by documenting a practical method for "real-life like" testing. The proposed method is empowered by modern and commonly available Field Programmable Gate Array (FPGA) technology. Innovative application of standard FPGA blocks, in combination with reconfigurability, forms the backbone of the method. A detailed elaboration of the method is given so as to serve as a general reference. The method is fully characterised and compared to alternatives through a case study, showing it to be the most efficient and effective option at a reasonable cost.

Relevance: 30.00%

Abstract:

Evolutionary synthesis methods, as originally described by Dobrowolski, have been shown in previous literature to be effective for obtaining anti-reflection coating designs. To make the approach even more effective, the combination of a good starting design, the best-suited thin-film materials, a realistic optimization target function and a non-gradient optimization method is used in an algorithm written for a PC. Several broadband anti-reflection designs obtained with this new design method are given as examples of its usefulness.

Relevance: 30.00%

Abstract:

Background. Meta-analyses show that cognitive behaviour therapy for psychosis (CBT-P) improves distressing positive symptoms. However, it is a complex intervention involving a range of techniques. No previous study has assessed the delivery of the different elements of treatment and their effect on outcome. Our aim was to assess the differential effect of the type of treatment delivered on the effectiveness of CBT-P, using novel statistical methodology. Method. The Psychological Prevention of Relapse in Psychosis (PRP) trial was a multi-centre randomized controlled trial (RCT) that compared CBT-P with treatment as usual (TAU). Therapy was manualized, and detailed evaluations of therapy delivery and client engagement were made. Follow-up assessments were made at 12 and 24 months. In a planned analysis, we applied principal stratification (involving structural equation modelling with finite mixtures) to estimate intention-to-treat (ITT) effects for subgroups of participants, defined by qualitative and quantitative differences in receipt of therapy, while maintaining the constraints of randomization. Results. Consistent delivery of full therapy, including specific cognitive and behavioural techniques, was associated with clinically and statistically significant increases in months in remission, and decreases in psychotic and affective symptoms. Delivery of partial therapy involving engagement and assessment was not effective. Conclusions. Our analyses suggest that CBT-P provides significant benefit across multiple outcomes for patients able to engage in the full range of therapy procedures. The novel statistical methods illustrated in this report have general application to the evaluation of heterogeneity in the effects of treatment.

Relevance: 30.00%

Abstract:

The assessment of building energy efficiency is one of the most effective measures for reducing building energy consumption. This paper proposes a holistic method (HMEEB) for assessing and certifying building energy efficiency based on the D-S (Dempster-Shafer) theory of evidence and the Evidential Reasoning (ER) approach. HMEEB has three main features: (i) it provides both a method to assess and certify building energy efficiency, and serves as an analytical tool to identify improvement opportunities; (ii) it combines a wealth of information on building energy efficiency assessment, including identification of indicators and a weighting mechanism; and (iii) it provides a method to identify and deal with inherent uncertainties within the assessment procedure. This paper demonstrates the robustness, flexibility and effectiveness of the proposed method, using two examples to assess the energy efficiency of two residential buildings, both located in the ‘Hot Summer and Cold Winter’ zone in China. The proposed certification method provides detailed recommendations for policymakers in the context of carbon emission reduction targets and promoting energy efficiency in the built environment. The method is transferable to other countries and regions, with the indicator weighting system adjusted to local climatic, economic and social factors.
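The evidence-combination core of the D-S theory the paper builds on is Dempster's rule, which fuses mass functions from independent pieces of evidence while renormalizing away conflict. The sketch below uses a hypothetical two-grade frame and made-up masses, not the paper's indicators or weights.

```python
# Minimal sketch of Dempster's rule of combination, the D-S building
# block behind the proposed assessment method. The frame {G, B}
# ("good"/"bad" efficiency) and all mass values are hypothetical.
from itertools import product

def combine(m1, m2):
    """Combine two mass functions keyed by frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # mass on contradictory pairs
    # Renormalize by the non-conflicting mass
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

G, B = frozenset("G"), frozenset("B")
theta = G | B                            # whole frame: ignorance

# Two hypothetical pieces of evidence about a building's efficiency
m1 = {G: 0.6, B: 0.1, theta: 0.3}
m2 = {G: 0.5, B: 0.2, theta: 0.3}
m12 = combine(m1, m2)
```

Combining two sources that both lean towards G concentrates mass on G (from 0.6 to about 0.76 here) while shrinking the mass left on ignorance, which is how the ER approach sharpens an assessment as more indicator evidence is added.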

Relevance: 30.00%

Abstract:

Liquid clouds play a profound role in the global radiation budget but it is difficult to remotely retrieve their vertical profile. Ordinary narrow field-of-view (FOV) lidars receive a strong return from such clouds but the information is limited to the first few optical depths. Wide-angle multiple-FOV lidars can isolate radiation scattered multiple times before returning to the instrument, often penetrating much deeper into the cloud than the singly-scattered signal. These returns potentially contain information on the vertical profile of the extinction coefficient, but are challenging to interpret due to the lack of a fast radiative transfer model for simulating them. This paper describes a variational algorithm that incorporates a fast forward model based on the time-dependent two-stream approximation, and its adjoint. Application of the algorithm to simulated data from a hypothetical airborne three-FOV lidar with a maximum footprint width of 600 m suggests that this approach should be able to retrieve the extinction structure down to an optical depth of around 6, and total optical depth up to at least 35, depending on the maximum lidar FOV. The convergence behavior of Gauss-Newton and quasi-Newton optimization schemes is compared. We then present results from an application of the algorithm to observations of stratocumulus by the 8-FOV airborne “THOR” lidar. It is demonstrated how the averaging kernel can be used to diagnose the effective vertical resolution of the retrieved profile, and therefore the depth to which information on the vertical structure can be recovered. This work enables exploitation of returns from spaceborne lidar and radar subject to multiple scattering more rigorously than previously possible.
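The Gauss-Newton scheme whose convergence the paper examines can be illustrated on a toy problem. None of the paper's two-stream forward model is reproduced; the sketch fits a hypothetical exponential decay (loosely analogous to a signal attenuating with optical depth) by iterating the normal equations.

```python
# Toy Gauss-Newton iteration (illustrative only): fit y = a*exp(-b*t)
# to noise-free synthetic data by repeatedly solving (J^T J) d = -J^T r.
import math

ts = [0.1 * i for i in range(30)]
A_TRUE, B_TRUE = 2.0, 1.5
ys = [A_TRUE * math.exp(-B_TRUE * t) for t in ts]  # noise-free for clarity

def gauss_newton(a, b, iters=10):
    for _ in range(iters):
        # Residuals r_i = a*exp(-b*t_i) - y_i and their Jacobian
        r = [a * math.exp(-b * t) - y for t, y in zip(ts, ys)]
        J = [(math.exp(-b * t), -a * t * math.exp(-b * t)) for t in ts]
        # Normal equations for the 2x2 case, solved by Cramer's rule
        jtj = [[sum(row[k] * row[m] for row in J) for m in range(2)]
               for k in range(2)]
        jtr = [sum(J[i][k] * r[i] for i in range(len(r))) for k in range(2)]
        det = jtj[0][0] * jtj[1][1] - jtj[0][1] * jtj[1][0]
        da = (-jtr[0] * jtj[1][1] + jtr[1] * jtj[0][1]) / det
        db = (-jtr[1] * jtj[0][0] + jtr[0] * jtj[1][0]) / det
        a, b = a + da, b + db
    return a, b

a_est, b_est = gauss_newton(1.8, 1.3)    # start near the truth
```

Near the solution the Gauss-Newton step approximates a Newton step, so convergence on a well-posed residual like this one is rapid; in the retrieval problem the residual is the mismatch between observed and forward-modelled lidar returns.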

Relevance: 30.00%

Abstract:

This paper proposes a method for describing the distribution of observed temperatures on any day of the year such that the distribution, and the summary statistics of interest derived from it, vary smoothly through the year. The method removes the noise inherent in calculating summary statistics directly from the data, thus easing comparisons of distributions and summary statistics between different periods. The method is demonstrated using daily effective temperatures (DET) derived from observations of temperature and wind speed at De Bilt, Holland. Distributions and summary statistics are obtained for 1985 to 2009 and compared to the period 1904–1984. A two-stage process first obtains parameters of a theoretical probability distribution, in this case the generalized extreme value (GEV) distribution, which describes the distribution of DET on any day of the year. Second, linear models describe seasonal variation in the parameters. Model predictions provide parameters of the GEV distribution, and therefore summary statistics, that vary smoothly through the year. There is evidence of an increasing mean temperature, a decrease in the variability in temperatures mainly in the winter, and more positive skew, i.e. more warm days, in the summer. In the winter, the 2% point, the value below which 2% of observations are expected to fall, has risen by 1.2 °C; in the summer the 98% point has risen by 0.8 °C. Medians have risen by 1.1 and 0.9 °C in winter and summer, respectively. The method can be used to describe distributions of future climate projections and other climate variables. Further extensions to the methodology are suggested.
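The second stage of the method, smoothly varying GEV parameters yielding smoothly varying quantiles, can be sketched with hypothetical numbers. A simple sinusoidal seasonal cycle stands in for the paper's linear models, and the parameter values are invented for illustration.

```python
# Sketch of the two-stage idea with hypothetical parameters: GEV
# parameters that vary smoothly over the year give 2% and 98% points
# that also vary smoothly. Not fitted to any real data.
import math

def gev_quantile(p, mu, sigma, xi):
    """Quantile x with F(x) = p for the GEV distribution (xi != 0)."""
    return mu + sigma / xi * ((-math.log(p)) ** (-xi) - 1.0)

def seasonal_params(day):
    # Assumed smooth seasonal cycle (day 0..364), coldest near day 0
    phase = 2.0 * math.pi * day / 365.0
    mu = 9.0 - 8.0 * math.cos(phase)     # location
    sigma = 3.0 + 1.0 * math.cos(phase)  # more spread in winter
    xi = -0.1                            # assumed bounded upper tail
    return mu, sigma, xi

# Smooth annual curves of the 2% and 98% points of daily temperature
points = [(day,
           gev_quantile(0.02, *seasonal_params(day)),
           gev_quantile(0.98, *seasonal_params(day)))
          for day in range(365)]
```

Because the quantiles are computed from smoothly varying parameters rather than directly from each day's data, day-to-day sampling noise never enters the summary statistics, which is the point of the method.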

Relevance: 30.00%

Abstract:

This letter has tested the canopy height profile (CHP) methodology as a means of retrieving the effective leaf area index (LAIe) and the vertical vegetation profile at a single-tree level. Waveform and discrete airborne LiDAR data from six swaths, as well as from the combined data of six swaths, were used to extract the LAIe of a single live Callitris glaucophylla tree. LAIe was extracted from the raw waveform as an intermediate step in the CHP methodology, with two different vegetation-ground reflectance ratios. Discrete-point LAIe estimates were derived from the gap probability using: 1) single ground returns and 2) all ground returns. LiDAR LAIe retrievals were subsequently compared to hemispherical photography estimates, yielding mean values within ±7% of the latter, depending on the method used. The CHP of a single dead Callitris glaucophylla tree, representing the distribution of vegetation material, was verified with a field profile manually reconstructed from convergent photographs taken with a fixed-focal-length camera. A binwise comparison of the two profiles showed very high correlation between the data, reaching an R2 of 0.86 for the CHP from combined swaths. Using a study-area-adjusted reflectance ratio improved the correlation between the profiles, but only marginally in comparison to using an arbitrary ratio of 0.5 for the laser wavelength of 1550 nm.

Relevance: 30.00%

Abstract:

Taxonomic free sorting (TFS) is a fast, reliable and new technique in sensory science. The method extends the typical free sorting task where stimuli are grouped according to similarities, by asking respondents to combine their groups two at a time to produce a hierarchy. Previously, TFS has been used for the visual assessment of packaging whereas this study extends the range of potential uses of the technique to incorporate full sensory analysis by the target consumer, which, when combined with hedonic liking scores, was used to generate a novel preference map. Furthermore, to fully evaluate the efficacy of using the sorting method, the technique was evaluated with a healthy older adult consumer group. Participants sorted eight products into groups and described their reason at each stage as they combined those groups, producing a consumer-specific vocabulary. This vocabulary was combined with hedonic data from a separate group of older adults, to give the external preference map. Taxonomic sorting is a simple, fast and effective method for use with older adults, and its combination with liking data can yield a preference map constructed entirely from target consumer data.

Relevance: 30.00%

Abstract:

A method and oligonucleotide compound for inhibiting replication of a nidovirus in virus-infected animal cells are disclosed. The compound (i) has a nuclease-resistant backbone, (ii) is capable of uptake by the infected cells, (iii) contains between 8 and 25 nucleotide bases, and (iv) has a sequence capable of disrupting base pairing between the transcriptional regulatory sequences in the 5′ leader region of the positive-strand viral genome and the negative-strand 3′ subgenomic region. In practicing the method, infected cells are exposed to the compound in an amount effective to inhibit viral replication.