945 results for filters
Abstract:
The passive beam delivery system in the superficially-placed tumor therapy terminal at the Heavy Ion Research Facility in Lanzhou (HIRFL), which includes two orthogonal dipole magnets as the scanning system, a motor-driven energy degrader as the range shifter, a series of ridge filters as range modulators, and a multileaf collimator, is introduced in detail. The capabilities of its key components and of the whole system have been verified experimentally. Tests of the ridge filter for spreading out the Bragg peak and of the range shifter for energy adjustment show that both work well. To examine the passive beam delivery system, a beam-shaping experiment was carried out, simulating three-dimensional (3D) conformal irradiation of a tumor. The encouraging experimental result confirms that 3D layer-stacking conformal irradiation can be performed with the passive system. The validation of the beam delivery system establishes a substantial basis for upcoming clinical trials of heavy-ion therapy for superficially-placed tumors at the HIRFL therapy terminal.
Abstract:
The use of biofilms as nanostructure-engineering materials is discussed and exemplified using ZnO nanorods. Three examples are presented for illustration: the immobilization of ZnO-nanorod arrays on the inner wall of a polystyrene centrifuge tube using S. thermophilus, the morphological organization of ZnO "filters" using S. thermophilus, and the design and implementation of a ZnO-decorated Ag framework using E. coli.
Abstract:
Large-insert bacterial artificial chromosome (BAC) libraries are necessary for advanced genetics and genomics research. To facilitate gene cloning and characterization, genome analysis, and physical mapping of scallop, two BAC libraries were constructed from nuclear DNA of the Zhikong scallop, Chlamys farreri Jones et Preston. The libraries were constructed in the BamHI and MboI sites of the vector pECBAC1, respectively. The BamHI library consists of 73,728 clones, and approximately 99% of the clones contain scallop nuclear DNA inserts with an average size of 110 kb, covering 8.0x haploid genome equivalents. Similarly, the MboI library consists of 7680 clones with an average insert of 145 kb and no insert-empty clones, thus providing a genome coverage of 1.1x. The combined libraries contain a total of 81,408 BAC clones arrayed in 212 microtiter plates of 384 wells each, representing 9.1x haploid genome equivalents and giving a probability of greater than 99% of recovering at least one positive clone for a single-copy sequence. High-density clone filters prepared from a subset of the two libraries were screened with nine pairs of overgo probes designed from the cDNA or DNA sequences of six genes involved in the innate immune system of mollusks. Positive clones were identified for every gene, with an average of 5.3 BAC clones per gene probe. These results suggest that the two scallop BAC libraries provide useful tools for gene cloning, genome physical mapping, and large-scale sequencing in this species.
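The ">99% probability" quoted above can be checked with the standard coverage estimate commonly attributed to Clarke and Carbon, P = 1 − e⁻ᶜ for c-fold genome coverage. A minimal sketch (the function name is ours):

```python
import math

def prob_positive_clone(coverage):
    """P(at least one positive clone for a single-copy sequence)
    = 1 - exp(-c), the standard estimate for c-fold genome coverage."""
    return 1.0 - math.exp(-coverage)

# 9.1x combined coverage, as reported for the two libraries together:
p = prob_positive_clone(9.1)      # well above 0.99, i.e. greater than 99%
```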
Abstract:
Two biological aerated filters (BAFs) were set up for ammonia removal from the circulation water of a marine aquaculture system. One of the BAFs was bioaugmented with a heterotrophic nitrifying bacterium, Lutimonas sp. H10, but ammonia removal was not improved, and the massive inoculation was even followed by a nitrification breakdown from day 9 to 18. Nitrification remained stable in the control BAF operated under the same conditions. Fluorescent in situ hybridization (FISH) with rRNA-targeted probes and a culture-based method revealed that Lutimonas sp. H10 almost disappeared from the bioaugmented BAF within 3 d, mainly due to infection by a specific phage, as shown by flask experiments, plaque assays, and transmission electron microscopy. Analyses of 16S rRNA gene libraries showed that the bacterial communities of the two reactors evolved differently, and an overgrowth of protozoa was observed in the bioaugmented BAF. Therefore, phage infection and the poor biofilm-forming ability of the inoculated strain are the main reasons for the bioaugmentation failure. In addition, grazing by protozoa on the bacteria may explain the nitrification breakdown in the bioaugmented BAF during days 9-18.
Abstract:
This paper introduces the composition of the navigation system of a class of manned submersibles, and then describes its information acquisition module based on industrial Ethernet. Because the dynamics model of a manned submersible contains unmodeled disturbances, and its various sensors have errors of differing magnitudes, data filtering methods such as the Kalman filter (KF) must be applied; the filtered data are then used for navigation studies of this class of manned submersibles. Results from a semi-physical simulation platform show that the navigation accuracy of the manned submersible is greatly improved.
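As a rough illustration of the filtering step described above, here is a minimal scalar Kalman filter; the noise variances q and r are assumed values for the sketch, not parameters of the submersible system:

```python
import random

def kalman_1d(measurements, q=1e-3, r=0.5):
    """Scalar Kalman filter tracking a (nearly) constant state in noise.
    q = process-noise variance, r = measurement-noise variance (assumed)."""
    x, p = measurements[0], 1.0      # initial estimate and its covariance
    estimates = []
    for z in measurements:
        p += q                       # predict: covariance grows by process noise
        k = p / (p + r)              # Kalman gain
        x += k * (z - x)             # update with the measurement innovation
        p *= 1.0 - k                 # posterior covariance
        estimates.append(x)
    return estimates

random.seed(0)
truth = 5.0
noisy = [truth + random.gauss(0.0, 0.7) for _ in range(200)]
smooth = kalman_1d(noisy)            # converges toward the true value
```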
Abstract:
The seismic survey is the most effective geophysical prospecting method in oil and gas exploration and development. Because the structure and lithology of geological bodies are increasingly complex, seismic sections must have high resolution if targets are to be described accurately, and a high signal-to-noise ratio is the precondition for high resolution. To improve the signal-to-noise ratio, we propose four methods for eliminating random noise, based on a detailed analysis of noise elimination by prediction filtering in the f-x-y domain; each method addresses a different shortcoming of that technique. For weak noise and long filters, the response of the noise to the filter is small, but for strong noise and short filters it is significant. The noise response makes the prediction operators inaccurate, and inaccurate operators yield incorrect results. We therefore propose prediction filtering by inversion in the f-x-y domain. This method assumes that the seismic signal comprises a predictable part and an unpredictable part, and introduces prior information about the prediction operator into the objective function. It removes the response of the noise to the filtering operator, ensures that the operators are accurate, and markedly improves the filtering results.
When the dips of the strata are complex, the data are generally divided into rectangular patches in order to obtain the prediction operators, and these patches usually need significant overlap to give a good result. The overlap means the data are used repeatedly, which effectively enlarges the data volume; computational cost grows with data size, so efficiency suffers. Moreover, operators obtained by conventional prediction filtering in the f-x-y domain cannot follow dip variations when the dips are complex, so the filtered results are aliased, and each patch is solved as an independent problem. To address these issues, we propose space-varying prediction filtering in the f-x-y domain, in which the prediction operators change with spatial position, eliminating false events in the result. Prior information about the prediction operator is again introduced into the objective function, so obtaining the operators of the patches becomes a coupled rather than an independent problem; this avoids repeated use of the data and improves computational efficiency. Conventional prediction filtering in the f-x-y domain removes Gaussian random noise but cannot effectively remove non-Gaussian noise; the prediction filtering method using the lp norm (especially p=1) can, and it is also described in this paper. Finally, when the dips of the strata can be obtained accurately, we propose dip-constrained prediction filtering in the f-x-y domain, which increases computational efficiency and improves the result. Tests on synthetic models and field data show that the four methods effectively overcome these shortcomings of the conventional method and are both practical and effective.
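To make the prediction-filtering idea concrete, the following is a minimal 2-D (f-x) sketch, not the authors' inversion-based or space-varying methods: each temporal-frequency slice is modeled as spatially predictable, a short complex prediction filter is fit across traces by least squares, and each trace is replaced by its predicted (signal) part. All names and parameter choices are ours:

```python
import numpy as np

def fx_predict(data, npred=3):
    """Minimal f-x prediction-filter denoising sketch for 2-D data (time x trace).
    For each temporal frequency, a one-sided complex prediction filter of
    length npred is fit across traces by least squares, and each trace is
    replaced by the predictable part of the slice."""
    nt, nx = data.shape
    spec = np.fft.rfft(data, axis=0)                 # t -> f on each trace
    out = spec.copy()                                # first npred traces kept as-is
    for f in range(spec.shape[0]):
        s = spec[f]                                  # complex series over traces
        # design matrix: predict s[k] from the npred preceding traces
        A = np.array([s[k - 1::-1][:npred] for k in range(npred, nx)])
        a, *_ = np.linalg.lstsq(A, s[npred:], rcond=None)
        out[f, npred:] = A @ a                       # keep only the predicted part
    return np.fft.irfft(out, n=nt, axis=0)           # f -> t
```

On a flat coherent event plus random noise, the predicted part retains the event while suppressing the spatially unpredictable noise.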
Abstract:
Control of machines that exhibit flexibility becomes important when designers attempt to push the state of the art with faster, lighter machines. Three steps are necessary for the control of a flexible plant. First, a good model of the plant must exist. Second, a good controller must be designed. Third, inputs to the controller must be constructed using knowledge of the system dynamic response. There is a great deal of literature pertaining to modeling and control but little dealing with the shaping of system inputs. Chapter 2 examines two input shaping techniques based on frequency domain analysis. The first uses the first derivative of a Gaussian exponential as a driving function template. The second, acausal filtering, removes energy from the driving functions at the resonant frequencies of the system. Chapter 3 presents a linear programming technique for generating vibration-reducing driving functions for systems. Chapter 4 extends the results of the previous chapter by developing a direct solution for the new class of driving functions. A detailed analysis of the new technique is presented from five different perspectives, and several extensions are presented. Chapter 5 verifies the theories of the previous two chapters with hardware experiments. Because the new technique resembles common signal filtering, Chapter 6 compares the new approach to eleven standard filters. The new technique is shown to produce less residual vibration, to be more robust to uncertainty in system parameters, and to require less computation than other currently used shaping techniques.
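One widely cited instance of vibration-reducing input shaping is the two-impulse zero-vibration (ZV) shaper. The sketch below is the textbook construction for a single second-order mode, not necessarily the exact driving functions developed in the thesis:

```python
import math

def zv_shaper(wn, zeta):
    """Two-impulse zero-vibration (ZV) input shaper for one second-order mode.
    wn = undamped natural frequency (rad/s), zeta = damping ratio."""
    wd = wn * math.sqrt(1.0 - zeta ** 2)        # damped natural frequency
    K = math.exp(-zeta * math.pi / math.sqrt(1.0 - zeta ** 2))
    amps = [1.0 / (1.0 + K), K / (1.0 + K)]     # impulse amplitudes, sum to 1
    times = [0.0, math.pi / wd]                 # second impulse at half the damped period
    return amps, times

# Convolving any command with these two impulses cancels the residual
# vibration of the modeled mode, at the cost of a half-period delay.
amps, times = zv_shaper(wn=10.0, zeta=0.05)
```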
Abstract:
The goal of this thesis is to apply the computational approach to motor learning, i.e., describe the constraints that enable performance improvement with experience and also the constraints that must be satisfied by a motor learning system, describe what is being computed in order to achieve learning, and why it is being computed. The particular tasks used to assess motor learning are loaded and unloaded free arm movement, and the thesis includes work on rigid body load estimation, arm model estimation, optimal filtering for model parameter estimation, and trajectory learning from practice. Learning algorithms have been developed and implemented in the context of robot arm control. The thesis demonstrates some of the roles of knowledge in learning. Powerful generalizations can be made on the basis of knowledge of system structure, as is demonstrated in the load and arm model estimation algorithms. Improving the performance of parameter estimation algorithms used in learning involves knowledge of the measurement noise characteristics, as is shown in the derivation of optimal filters. Using trajectory errors to correct commands requires knowledge of how command errors are transformed into performance errors, i.e., an accurate model of the dynamics of the controlled system, as is demonstrated in the trajectory learning work. The performance demonstrated by the algorithms developed in this thesis should be compared with algorithms that use less knowledge, such as table based schemes to learn arm dynamics, previous single trajectory learning algorithms, and much of traditional adaptive control.
Abstract:
This report describes the implementation of a theory of edge detection proposed by Marr and Hildreth (1979). According to this theory, the image is first processed independently through a set of different-size filters whose shape is the Laplacian of a Gaussian, ∇²G. Zero-crossings in the output of these filters mark the positions of intensity changes at different resolutions. Information about these zero-crossings is then used to derive a full symbolic description of the changes in intensity in the image, called the raw primal sketch. The theory is closely tied to early processing in the human visual system. In this report, we first examine the critical properties of the initial filters used in the edge detection process, from both a theoretical and a practical standpoint. The implementation is then used as a test bed for exploring aspects of the human visual system, in particular acuity and hyperacuity. Finally, we present some preliminary results concerning the relationship between zero-crossings detected at different resolutions, and some observations relevant to the process by which the human visual system integrates descriptions of intensity changes obtained at different resolutions.
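A minimal sketch of the Marr-Hildreth pipeline: sample a Laplacian-of-Gaussian kernel, filter the image (here with a circular FFT convolution, for brevity), and mark zero-crossings. The helper names and parameter choices are ours:

```python
import numpy as np

def log_kernel(sigma, size=None):
    """Sampled Laplacian-of-Gaussian kernel (up to a positive constant)."""
    if size is None:
        size = int(6 * sigma) | 1               # odd width spanning ~3 sigma
    r = np.arange(size) - size // 2
    x, y = np.meshgrid(r, r)
    s2 = sigma ** 2
    g = np.exp(-(x ** 2 + y ** 2) / (2 * s2))
    k = (x ** 2 + y ** 2 - 2 * s2) / s2 ** 2 * g
    return k - k.mean()                         # zero-sum: flat regions map to 0

def conv2_same(img, k):
    """FFT-based 'same' convolution with circular boundaries (fine for a demo)."""
    kh, kw = k.shape
    pad = np.zeros(img.shape, dtype=float)
    pad[:kh, :kw] = k
    pad = np.roll(pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)))

def zero_crossings(img):
    """Mark pixels where the filtered image changes sign between neighbors."""
    zc = np.zeros(img.shape, dtype=bool)
    zc[:, :-1] |= np.signbit(img[:, :-1]) != np.signbit(img[:, 1:])
    zc[:-1, :] |= np.signbit(img[:-1, :]) != np.signbit(img[1:, :])
    return zc
```

For a step edge, the filtered output is positive on one side and negative on the other, so the zero-crossings trace the edge location.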
Abstract:
R. Zwiggelaar, Q. Yang, E. Garcia-Pardo and C.R. Bull, 'Using spectral information and machine vision for bruise detection on peaches and apricots', Journal of Agricultural Engineering Research 63 (4), 323-332 (1996)
Abstract:
R. Zwiggelaar and C.R. Bull, 'Optical determination of fractal dimensions using Fourier transforms', Optical Engineering 34 (5), 1325-1332 (1995)
Abstract:
We present new, simple, efficient data structures for approximate reconciliation of set differences, a useful standalone primitive for peer-to-peer networks and a natural subroutine in methods for exact reconciliation. In the approximate reconciliation problem, peers A and B hold subsets SA and SB, respectively, of a large universe U. Peer A wishes to send a short message M to peer B with the goal that B should use M to determine as many elements of the set SB–SA as possible. To avoid the expense of round-trip communication times, we focus on the situation where a single message M is sent. We motivate the performance tradeoffs between message size, accuracy, and computation time for this problem with a straightforward approach using Bloom filters. We then introduce approximate reconciliation trees, a more computationally efficient solution that combines techniques from Patricia tries, Merkle trees, and Bloom filters. We present an analysis of approximate reconciliation trees and provide experimental results comparing the various methods proposed for approximate reconciliation.
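For reference, here is a minimal Bloom filter of the kind used as the baseline above; the hash scheme and parameters are our own choices for the sketch:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash positions in an m-slot bit array.
    Membership tests may give false positives but never false negatives."""
    def __init__(self, m=1024, k=4):
        self.m, self.k = m, k
        self.bits = bytearray(m)

    def _positions(self, item):
        # derive k independent positions by salting a cryptographic hash
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def __contains__(self, item):
        return all(self.bits[p] for p in self._positions(item))
```

In the reconciliation setting, peer A could send its filter as the message M; peer B then keeps the elements of SB that fail the membership test as likely members of SB–SA.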
Abstract:
We propose an economic mechanism to reduce the incidence of malware that delivers spam. Earlier research proposed attention markets as a solution for unwanted messages, and showed they could provide more net benefit than alternatives such as filtering and taxes. Because it uses a currency system, Attention Bonds faces a challenge. Zombies, botnets, and various forms of malware might steal valuable currency instead of stealing unused CPU cycles. We resolve this problem by taking advantage of the fact that the spam-bot problem has been reduced to financial fraud. As such, the large body of existing work in that realm can be brought to bear. By drawing an analogy between sending and spending, we show how a market mechanism can detect and prevent spam malware. We prove that by using a currency (i) each instance of spam increases the probability of detecting infections, and (ii) the value of eradicating infections can justify insuring users against fraud. This approach attacks spam at the source, a virtue missing from filters that attack spam at the destination. Additionally, the exchange of currency provides signals of interest that can improve the targeting of ads. ISPs benefit from data management services and consumers benefit from the higher average value of messages they receive. We explore these and other secondary effects of attention markets, and find them to offer, on the whole, attractive economic benefits for all – including consumers, advertisers, and the ISPs.
Abstract:
Multiple sound sources often contain harmonics that overlap and may be degraded by environmental noise. The auditory system is capable of teasing apart these sources into distinct mental objects, or streams. Such an "auditory scene analysis" enables the brain to solve the cocktail party problem. A neural network model of auditory scene analysis, called the AIRSTREAM model, is presented to propose how the brain accomplishes this feat. The model clarifies how the frequency components that correspond to a given acoustic source may be coherently grouped together into distinct streams based on pitch and spatial cues. The model also clarifies how multiple streams may be distinguished and separated by the brain. Streams are formed as spectral-pitch resonances that emerge through feedback interactions between the frequency-specific spectral representation of a sound source and its pitch. First, the model transforms a sound into a spatial pattern of frequency-specific activation across a spectral stream layer. The sound has multiple parallel representations at this layer. A sound's spectral representation activates a bottom-up filter that is sensitive to the harmonics of the sound's pitch. The filter activates a pitch category which, in turn, activates a top-down expectation that allows one voice or instrument to be tracked through a noisy multiple-source environment. Spectral components are suppressed if they do not match harmonics of the top-down expectation that is read out by the selected pitch, thereby allowing another stream to capture these components, as in the "old-plus-new" heuristic of Bregman. Multiple simultaneously occurring spectral-pitch resonances can thereby emerge. These resonance and matching mechanisms are specialized versions of Adaptive Resonance Theory, or ART, which clarifies how pitch representations can self-organize during learning of harmonic bottom-up filters and top-down expectations.
The model also clarifies how spatial location cues can help to disambiguate two sources with similar spectral cues. Data are simulated from psychophysical grouping experiments, such as how a tone sweeping upward in frequency creates a bounce percept by grouping with a downward-sweeping tone due to proximity in frequency, even if noise replaces the tones at their intersection point. Illusory auditory percepts are also simulated, such as the auditory continuity illusion of a tone continuing through a noise burst even when the tone is not present during the noise, and the scale illusion of Deutsch, whereby downward and upward scales presented alternately to the two ears are regrouped based on frequency proximity, leading to a bounce percept. Since related sorts of resonances have been used to quantitatively simulate psychophysical data about speech perception, the model strengthens the hypothesis that ART-like mechanisms operate at multiple levels of the auditory system. Proposals for developing the model to explain more complex streaming data are also provided.