878 results for Multi-resolution Method
Abstract:
Multi-rate multicarrier DS-CDMA is a potentially attractive multiple access method for future wireless networks that must support multimedia, and thus multi-rate, traffic. Considering that high-performance detection techniques such as coherent demodulation require explicit knowledge of the channel, this paper proposes a subspace-based blind adaptive algorithm for timing acquisition and channel estimation in asynchronous multi-rate multicarrier DS-CDMA systems, which is applicable to both multicode and variable spreading factor systems.
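The core subspace idea behind such blind estimators can be stated compactly (the notation here is generic, not taken from the paper): let $\mathbf{U}_n$ span the noise subspace of the sample covariance of the received signal, and let $\mathbf{C}(\tau)$ be the code matrix of the desired user at candidate delay $\tau$. The channel estimate is then

$$\hat{\mathbf{h}} = \arg\min_{\|\mathbf{h}\|=1} \; \mathbf{h}^{H}\,\mathbf{C}(\tau)^{H}\mathbf{U}_n\mathbf{U}_n^{H}\mathbf{C}(\tau)\,\mathbf{h},$$

i.e. the eigenvector of $\mathbf{C}(\tau)^{H}\mathbf{U}_n\mathbf{U}_n^{H}\mathbf{C}(\tau)$ associated with the smallest eigenvalue; timing acquisition corresponds to searching for the delay $\tau$ that minimises this smallest eigenvalue.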
Abstract:
In this paper, we propose a new on-line learning algorithm for non-linear system identification: the swarm intelligence aided multi-innovation recursive least squares (SI-MRLS) algorithm. The SI-MRLS algorithm applies particle swarm optimization (PSO) to construct a flexible radial basis function (RBF) model so that both the model structure and the output weights can be adapted. By replacing an insignificant RBF node with a new one at every iteration, based on an increment-of-error-variance criterion, the model remains of limited size. The multi-innovation RLS algorithm, which is known to achieve better accuracy than the classic RLS, is used to update the RBF output weights. The proposed method can produce a parsimonious model with good performance. Simulation results are also shown to verify the SI-MRLS algorithm.
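For readers unfamiliar with the multi-innovation idea, the sketch below shows a generic multi-innovation RLS weight update over a window of p innovations; the variable names, forgetting factor and window handling are illustrative assumptions, and the PSO-driven RBF structure adaptation of SI-MRLS is deliberately omitted:

```python
import numpy as np

def mi_rls_update(theta, P, Phi, y_window, lam=1.0):
    """One multi-innovation RLS step (generic sketch, not the authors' code).

    theta    : (n,)   current output-weight estimate (e.g. RBF weights)
    P        : (n,n)  inverse-correlation matrix
    Phi      : (n,p)  regressors for the last p samples, newest first
    y_window : (p,)   corresponding desired outputs
    lam      : forgetting factor (lam = 1.0 means no forgetting)
    """
    p = Phi.shape[1]
    E = y_window - Phi.T @ theta                  # p innovations instead of 1
    K = P @ Phi @ np.linalg.inv(lam * np.eye(p) + Phi.T @ P @ Phi)
    theta = theta + K @ E                         # correct with all p innovations
    P = (P - K @ Phi.T @ P) / lam
    return theta, P
```

With p = 1 this collapses to the classic RLS update, which is why the multi-innovation form is usually described as a windowed generalisation of RLS.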
Abstract:
The foundation construction process is a key determinant of success in construction engineering. Among the many deep-excavation methods used in Taiwan, the diaphragm wall method is applied more frequently than anywhere else in the world. Traditionally, the sequencing of diaphragm wall unit construction activities is established phase by phase using heuristics. However, this approach creates conflicts between the final phase of the engineering construction and unit construction, and adversely affects the planned construction time. To avoid this situation, this study applies management science to diaphragm wall unit construction, formulating the sequencing task as a multi-objective combinatorial optimization problem. Because the mathematical model of the problem is multi-objective and combinatorially explosive (it belongs to the class of NP-complete problems), a 2-type Self-Learning Neural Network (SLNN) is used to solve the construction-activity sequencing problem for N = 12, 24 and 36 diaphragm wall units. To assess the reliability of the results, this study compares the SLNN against a random search method. The tests show that the SLNN is superior to random search in both solution quality and solving efficiency.
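The random search baseline against which the SLNN is compared can be sketched as below; the cost function here is a hypothetical stand-in for the paper's multi-objective formulation (e.g. a weighted sum of construction-time and conflict objectives), not its actual model:

```python
import random

def random_search(cost, n_units=12, iters=10000, seed=0):
    """Random-search baseline for a unit-sequencing problem: sample
    permutations of the diaphragm-wall units and keep the best one
    under a user-supplied (here hypothetical) multi-objective cost."""
    rng = random.Random(seed)
    units = list(range(n_units))
    best_seq, best_cost = None, float("inf")
    for _ in range(iters):
        rng.shuffle(units)            # a random construction sequence
        c = cost(units)               # scalarised multi-objective cost
        if c < best_cost:
            best_seq, best_cost = list(units), c
    return best_seq, best_cost
```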
Abstract:
Changes to the Northern Hemisphere winter (December, January and February) extratropical storm tracks and cyclones in a warming climate are investigated. Two idealised climate change experiments with HiGEM1.1, a doubled-CO2 and a quadrupled-CO2 experiment, are compared against a present-day control run. An objective feature tracking method is used, with a focus on regional changes. The climatology of extratropical storm tracks from the control run is shown to be in good agreement with ERA-40, while the frequency distribution of cyclone intensity also compares well. In both simulations the mean climate changes are generally consistent with the simulations of the IPCC AR4 models, with strongly enhanced surface warming at the winter pole and reduced lower-tropospheric warming over the North Atlantic Ocean associated with the slowdown of the Meridional Overturning Circulation (MOC). The circulation changes in the North Atlantic differ between the two idealised simulations with different CO2 forcings. In the North Atlantic the storm tracks are influenced by the slowdown of the MOC, the enhanced surface polar warming, and the enhanced upper tropical-tropospheric warming, giving a north-eastward shift of the storm tracks in the 2XCO2 experiment but no shift in the 4XCO2 experiment. Over the Pacific, in the 2XCO2 experiment, changes in the mean climate are associated with local temperature changes, while in the 4XCO2 experiment the changes in the Pacific are impacted by the weakened tropical circulation. The storm track changes are consistent with the shifts in the zonal wind. Total cyclone numbers are found to decrease over the Northern Hemisphere with increasing CO2 forcing. Changes in cyclone intensity are assessed using 850 hPa vorticity, mean sea level pressure, and 850 hPa winds. The intensity of the Northern Hemisphere cyclones is found to decrease relative to the control.
Abstract:
The consistency of precipitation variability estimated from multiple satellite-based observing systems is assessed. There is generally good agreement between the TRMM TMI, SSM/I, GPCP and AMSRE datasets for the inter-annual variability of precipitation since 1997, but the HOAPS dataset appears to overestimate the magnitude of variability. Over the tropical ocean the TRMM 3B42 dataset produces unrealistic variability. Based upon deseasonalised GPCP data for the period 1998-2008, the sensitivity of global mean precipitation (P) to surface temperature (T) changes (dP/dT) is about 6%/K, although a smaller sensitivity of 3.6%/K is found using monthly GPCP data over the longer period 1989-2008. Over the tropical oceans dP/dT ranges from 10-30%/K depending upon time period and dataset, while over tropical land dP/dT is -8 to -11%/K for the 1998-2008 period. Analyzing the response of the tropical ocean precipitation intensity distribution to changes in T, we find that P over the wetter areas shows a strong positive response to T of around 20%/K. The response over the drier tropical regimes is less coherent and varies with dataset, but responses over tropical land show significant negative relationships on an interannual time-scale. The spatial and temporal resolutions of the datasets strongly influence the precipitation responses over the tropical oceans and help explain some of the discrepancy between different datasets. Consistency between datasets is found to increase on averaging from daily to 5-day time-scales and considering a 1° (or coarser) spatial resolution. Defining the wet and dry tropical ocean regimes by the 60th percentile of P intensity, the 5-day average, 1° TMI data exhibit a coherent drying of the dry regime at a rate of -20%/K, while the wet regime becomes wetter at a similar rate with warming.
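The dP/dT diagnostic reported above is, in essence, a regression slope of fractional precipitation anomalies against temperature anomalies. A minimal sketch (array names and the percent-anomaly convention are assumptions, not the paper's code):

```python
import numpy as np

def dP_dT_percent_per_K(P, T):
    """Estimate precipitation sensitivity dP/dT in %/K.

    P : (t,) deseasonalised precipitation time series (e.g. mm/day)
    T : (t,) surface temperature anomaly time series (K)
    Returns the least-squares slope of percent precipitation anomalies
    regressed on T, i.e. the fractional change in P per kelvin.
    """
    P_pct = 100.0 * (P - P.mean()) / P.mean()   # percent anomalies
    slope, _intercept = np.polyfit(T, P_pct, 1)
    return slope
```

Applied to deseasonalised GPCP global means for 1998-2008, a calculation of this form is what yields the quoted value of about 6%/K.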
Abstract:
An experimental method is described which enables the inelastically scattered X-ray component to be removed from diffractometer data prior to radial density function analysis. At each scattering angle an energy spectrum is generated from a Si(Li) detector combined with a multi-channel analyser, from which the coherently scattered component is separated. The data obtained from organic polymers have an improved signal-to-noise ratio at high values of scattering angle, and a commensurate enhancement of the resolution of the RDF at low r is demonstrated for the case of PMMA (ICI 'Perspex'). The method obviates the need for the complicated correction for multiple scattering.
Abstract:
Controllers for feedback substitution schemes demonstrate a trade-off between noise power gain and normalized response time. Using as an example the design of a controller for a radiometric transduction process subjected to arbitrary noise power gain and robustness constraints, a Pareto-front of optimal controller solutions fulfilling a range of time-domain design objectives can be derived. In this work, we consider designs using a loop shaping design procedure (LSDP). The approach uses linear matrix inequalities to specify a range of objectives and a genetic algorithm (GA) to perform a multi-objective optimization for the controller weights (MOGA). A clonal selection algorithm is used to further provide a directed search of the GA towards the Pareto front. We demonstrate that with the proposed methodology, it is possible to design higher order controllers with superior performance in terms of response time, noise power gain and robustness.
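At the heart of any MOGA-style selection step is a non-dominated (Pareto) filter over the candidate controllers' objective vectors. A minimal sketch, assuming all objectives have been expressed so that smaller is better (e.g. response time, noise power gain, negated robustness margin):

```python
import numpy as np

def pareto_front(objectives):
    """Return indices of non-dominated candidates (all objectives minimised).

    objectives : (n, m) array; row i holds the m objective values of
                 candidate controller i.
    """
    n = objectives.shape[0]
    front = []
    for i in range(n):
        dominated = any(
            np.all(objectives[j] <= objectives[i]) and
            np.any(objectives[j] < objectives[i])
            for j in range(n) if j != i
        )
        if not dominated:
            front.append(i)
    return front
```

In a MOGA this filter typically drives fitness assignment, so that successive generations are pushed towards the Pareto front the paper describes.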
Abstract:
The accurate prediction of the biochemical function of a protein is becoming increasingly important, given the unprecedented growth of both structural and sequence databanks. Consequently, computational methods are required to analyse such data in an automated manner to ensure genomes are annotated accurately. Protein structure prediction methods, for example, are capable of generating approximate structural models on a genome-wide scale. However, the detection of functionally important regions in such crude models, as well as structural genomics targets, remains an extremely important problem. The method described in the current study, MetSite, represents a fully automatic approach for the detection of metal-binding residue clusters applicable to protein models of moderate quality. The method involves using sequence profile information in combination with approximate structural data. Several neural network classifiers are shown to be able to distinguish metal sites from non-sites with a mean accuracy of 94.5%. The method was demonstrated to identify metal-binding sites correctly in LiveBench targets where no obvious metal-binding sequence motifs were detectable using InterPro. Accurate detection of metal sites was shown to be feasible for low-resolution predicted structures generated using mGenTHREADER where no side-chain information was available. High-scoring predictions were observed for a recently solved hypothetical protein from Haemophilus influenzae, indicating a putative metal-binding site.
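A hedged sketch of the classification set-up: a small feed-forward network separating metal sites from non-sites given per-site feature vectors combining sequence-profile and coarse structural information. Everything below (the feature dimensionality, network size and random placeholder data) is illustrative, not MetSite's actual configuration:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Placeholder data: one row per candidate residue cluster; in practice the
# columns would mix sequence-profile scores with backbone-level geometry.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 40))        # hypothetical feature vectors
y = rng.integers(0, 2, size=1000)      # 1 = metal site, 0 = non-site

clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```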
Abstract:
World-wide structural genomics initiatives are rapidly accumulating structures for which limited functional information is available. Additionally, state-of-the-art structural prediction programs are now capable of generating at least low-resolution structural models of target proteins. Accurate detection and classification of functional sites within both solved and modelled protein structures therefore represents an important challenge. We present a fully automatic site detection method, FuncSite, that uses neural network classifiers to predict the location and type of functionally important sites in protein structures. The method is designed primarily to require only backbone residue positions, without the need for specific side-chain atoms to be present. In order to highlight effective site detection in low-resolution structural models, FuncSite was used to screen model proteins generated using mGenTHREADER on a set of newly released structures. We found effective metal site detection even for moderate-quality protein models, illustrating the robustness of the method.
Abstract:
Motivation: In order to enhance genome annotation, the fully automatic fold recognition method GenTHREADER has been improved and benchmarked. The previous version of GenTHREADER consisted of a simple neural network which was trained to combine sequence alignment score, length information and energy potentials derived from threading into a single score representing the relationship between two proteins, as designated by CATH. The improved version incorporates PSI-BLAST searches, which have been jumpstarted with structural alignment profiles from FSSP, and now also makes use of PSIPRED predicted secondary structure and bi-directional scoring in order to calculate the final alignment score. Pairwise potentials and solvation potentials are calculated from the given sequence alignment which are then used as inputs to a multi-layer, feed-forward neural network, along with the alignment score, alignment length and sequence length. The neural network has also been expanded to accommodate the secondary structure element alignment (SSEA) score as an extra input and it is now trained to learn the FSSP Z-score as a measurement of similarity between two proteins. Results: The improvements made to GenTHREADER increase the number of remote homologues that can be detected with a low error rate, implying higher reliability of score, whilst also increasing the quality of the models produced. We find that up to five times as many true positives can be detected with low error rate per query. Total MaxSub score is doubled at low false positive rates using the improved method.
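The scoring stage can be pictured as a single feed-forward pass that combines the threading-derived inputs into one similarity value; the layer sizes and activation below are assumptions for illustration, not GenTHREADER's published architecture:

```python
import numpy as np

def threader_style_score(features, W1, b1, W2, b2):
    """Feed-forward scoring sketch: combine threading-derived inputs
    (alignment score, alignment length, sequence length, pairwise
    potential, solvation potential, SSEA score) into one similarity
    value, trained in the paper to approximate the FSSP Z-score.

    features : (6,) input vector
    W1, b1   : hidden-layer weights (h, 6) and biases (h,)
    W2, b2   : output weights (h,) and scalar bias
    """
    h = np.tanh(W1 @ features + b1)    # hidden layer (activation assumed)
    return float(W2 @ h + b2)          # scalar Z-score estimate
```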
Abstract:
This paper introduces new insights into the hydrochemical functioning of lowland river systems using field-based spectrophotometric and electrode technologies. The streamwater concentrations of nitrogen species and phosphorus fractions were measured at hourly intervals on a continuous basis at two contrasting sites on tributaries of the River Thames – one draining a rural catchment, the River Enborne, and one draining a more urban system, The Cut. The measurements complement those from an existing network of multi-parameter water quality sondes maintained across the Thames catchment and weekly monitoring based on grab samples. The results of the sub-daily monitoring show that streamwater phosphorus concentrations display highly complex dynamics under storm conditions dependent on the antecedent catchment wetness, and that diurnal phosphorus and nitrogen cycles occur under low flow conditions. The diurnal patterns highlight the dominance of sewage inputs in controlling the streamwater phosphorus and nitrogen concentrations at low flows, even at a distance of 7 km from the nearest sewage treatment works in the rural River Enborne. The time of sample collection is important when judging water quality against ecological thresholds or standards. An exhaustion of the supply of phosphorus from diffuse and multiple septic tank sources during storm events was evident and load estimation was not improved by sub-daily monitoring beyond that achieved by daily sampling because of the eventual reduction in the phosphorus mass entering the stream during events. The results highlight the utility of sub-daily water quality measurements and the discussion considers the practicalities and challenges of in situ, sub-daily monitoring.
Abstract:
Liquid clouds play a profound role in the global radiation budget but it is difficult to remotely retrieve their vertical profile. Ordinary narrow field-of-view (FOV) lidars receive a strong return from such clouds but the information is limited to the first few optical depths. Wide-angle multiple-FOV lidars can isolate radiation scattered multiple times before returning to the instrument, often penetrating much deeper into the cloud than the singly-scattered signal. These returns potentially contain information on the vertical profile of extinction coefficient, but are challenging to interpret due to the lack of a fast radiative transfer model for simulating them. This paper describes a variational algorithm that incorporates a fast forward model based on the time-dependent two-stream approximation, and its adjoint. Application of the algorithm to simulated data from a hypothetical airborne three-FOV lidar with a maximum footprint width of 600 m suggests that this approach should be able to retrieve the extinction structure down to an optical depth of around 6, and total optical depth up to at least 35, depending on the maximum lidar FOV. The convergence behaviours of Gauss-Newton and quasi-Newton optimization schemes are compared. We then present results from an application of the algorithm to observations of stratocumulus by the 8-FOV airborne “THOR” lidar. It is demonstrated how the averaging kernel can be used to diagnose the effective vertical resolution of the retrieved profile, and therefore the depth to which information on the vertical structure can be recovered. This work enables exploitation of returns from spaceborne lidar and radar subject to multiple scattering more rigorously than previously possible.
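In generic variational notation (not necessarily the paper's symbols), such a retrieval minimises a cost function of the form

$$ J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_a)^{T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_a) + \tfrac{1}{2}\left[\mathbf{y}-H(\mathbf{x})\right]^{T}\mathbf{R}^{-1}\left[\mathbf{y}-H(\mathbf{x})\right], $$

where $\mathbf{x}$ is the extinction profile, $\mathbf{y}$ the multiple-FOV lidar signals, $H$ the fast two-stream forward model, and $\mathbf{B}$, $\mathbf{R}$ the prior and observation error covariances. The Gauss-Newton scheme mentioned above then iterates

$$ \mathbf{x}_{k+1} = \mathbf{x}_k + \left(\mathbf{B}^{-1}+\mathbf{H}_k^{T}\mathbf{R}^{-1}\mathbf{H}_k\right)^{-1}\left[\mathbf{H}_k^{T}\mathbf{R}^{-1}\left(\mathbf{y}-H(\mathbf{x}_k)\right)-\mathbf{B}^{-1}(\mathbf{x}_k-\mathbf{x}_a)\right], $$

with the Jacobian $\mathbf{H}_k$ supplied by the adjoint; the averaging kernel used to diagnose effective vertical resolution is $\mathbf{A} = (\mathbf{B}^{-1}+\mathbf{H}^{T}\mathbf{R}^{-1}\mathbf{H})^{-1}\mathbf{H}^{T}\mathbf{R}^{-1}\mathbf{H}$.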
Abstract:
Flood extents caused by fluvial floods in urban and rural areas may be predicted by hydraulic models. Assimilation may be used to correct the model state and improve the estimates of the model parameters or external forcing. One common observation assimilated is the water level at various points along the modelled reach. Distributed water levels may be estimated indirectly along the flood extents in Synthetic Aperture Radar (SAR) images by intersecting the extents with the floodplain topography. It is necessary to select a subset of levels for assimilation because adjacent levels along the flood extent will be strongly correlated. A method for selecting such a subset automatically and in near real-time is described, which would allow the SAR water levels to be used in a forecasting model. The method first selects candidate waterline points in flooded rural areas having low slope. The waterline levels and positions are corrected for the effects of double reflections between the water surface and emergent vegetation at the flood edge. Waterline points are also selected in flooded urban areas away from radar shadow and layover caused by buildings, with levels similar to those in adjacent rural areas. The resulting points are thinned to reduce spatial autocorrelation using a top-down clustering approach. The method was developed using a TerraSAR-X image from a particular case study involving urban and rural flooding. The waterline points extracted proved to be spatially uncorrelated, with levels reasonably similar to those determined manually from aerial photographs, and in good agreement with those of nearby gauges.
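A top-down clustering thinning of the kind described can be sketched as follows; the bisection rule, separation threshold and choice of representative are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

def thin_waterline_points(xy, levels, min_sep=250.0):
    """Thin candidate waterline points to reduce spatial autocorrelation.

    xy     : (n, 2) point coordinates (metres)
    levels : (n,)   water levels at those points
    Recursively bisects the point set along its widest axis until each
    cluster fits within min_sep, then keeps one representative per
    cluster (the point with the median level)."""
    out = []

    def split(idx):
        pts = xy[idx]
        span = pts.max(axis=0) - pts.min(axis=0)
        if len(idx) == 1 or span.max() <= min_sep:
            # cluster is compact: keep the median-level point
            out.append(int(idx[np.argsort(levels[idx])[len(idx) // 2]]))
            return
        axis = int(np.argmax(span))                       # widest axis
        cut = 0.5 * (pts[:, axis].min() + pts[:, axis].max())
        split(idx[pts[:, axis] <= cut])                   # both halves non-empty
        split(idx[pts[:, axis] > cut])

    split(np.arange(len(xy)))
    return np.array(out)
```

The surviving points are then far enough apart that their level errors can plausibly be treated as uncorrelated by the assimilation scheme.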
Abstract:
Shiga toxin-producing Escherichia coli (STEC) strains are foodborne pathogens whose ability to produce Shiga toxin (Stx) is due to the integration of Stx-encoding lambdoid bacteriophages (Stx phages). Circulating, infective Stx phages are very difficult to isolate, purify and propagate, such that there is no information on their genetic composition and properties. Here we describe a novel approach that exploits the phages' ability to infect their host and form a lysogen, thus enabling purification of Stx phages by a series of sequential lysogen isolation and induction steps. A total of 15 Stx phages were rigorously purified from water samples in this way, classified by TEM and genotyped using a PCR-based multi-locus characterisation system. Each phage possessed only one variant of each target gene type, thus confirming its purity, with 9 of the 15 phages possessing a short tail-spike gene and identified by TEM as Podoviridae. The remaining 6 phages possessed long tails, four of which appeared to be contractile in nature (Myoviridae) and two of which were morphologically very similar to bacteriophage lambda (Siphoviridae).
Abstract:
We show that the four-dimensional variational data assimilation method (4DVar) can be interpreted as a form of Tikhonov regularization, a very familiar method for solving ill-posed inverse problems. It is known from image restoration problems that L1-norm penalty regularization recovers sharp edges in the image more accurately than Tikhonov, or L2-norm, penalty regularization. We apply this idea from stationary inverse problems to 4DVar, a dynamical inverse problem, and give examples for an L1-norm penalty approach and a mixed total variation (TV) L1–L2-norm penalty approach. For problems with model error where sharp fronts are present and the background and observation error covariances are known, the mixed TV L1–L2-norm penalty performs better than either the L1-norm method or the strong constraint 4DVar (L2-norm) method. A strength of the mixed TV L1–L2-norm regularization is that in the case where a simplified form of the background error covariance matrix is used it produces a much more accurate analysis than 4DVar. The method thus has the potential in numerical weather prediction to overcome operational problems with poorly tuned background error covariance matrices.
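In standard notation (generic, not necessarily the authors'), strong-constraint 4DVar minimises

$$ J(\mathbf{x}_0) = \tfrac{1}{2}\|\mathbf{x}_0-\mathbf{x}_b\|^{2}_{\mathbf{B}^{-1}} + \tfrac{1}{2}\sum_{k}\|\mathbf{y}_k-\mathcal{H}_k(\mathbf{x}_k)\|^{2}_{\mathbf{R}_k^{-1}}, $$

whose background term is precisely a Tikhonov (L2) penalty on the departure from $\mathbf{x}_b$. A pure L1 approach replaces that quadratic term with an L1-norm penalty, while the mixed TV L1–L2 approach augments it with an edge-preserving total-variation term,

$$ J_{\mathrm{TV}}(\mathbf{x}_0) = J(\mathbf{x}_0) + \lambda\,\|\mathbf{D}(\mathbf{x}_0-\mathbf{x}_b)\|_{1}, $$

where $\mathbf{D}$ is a discrete gradient operator and $\lambda$ sets the weight of the TV penalty; it is this extra term that allows sharp fronts to survive the analysis.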