9 results for Filters methods

in CentAUR: Central Archive University of Reading - UK


Relevance: 30.00%

Abstract:

For the tracking of extrema associated with weather systems to be applied to a broad range of fields, it is necessary to remove a background field that represents the slowly varying, large spatial scales. The sensitivity of the tracking analysis to the form of background field removed is explored for the Northern Hemisphere winter storm tracks for three contrasting fields from an integration of the U.K. Met Office's (UKMO) Hadley Centre Climate Model (HadAM3). Several methods are explored for the removal of a background field, from the simple subtraction of the climatology to the more sophisticated removal of the planetary scales. Two temporal filters are also considered, in the form of a 2-6-day Lanczos filter and a 20-day high-pass Fourier filter. The analysis indicates that the simple subtraction of the climatology tends to change the nature of the systems, to the extent that there is a redistribution of the systems relative to the climatological background, resulting in very similar statistical distributions for both positive and negative anomalies. The optimal planetary wave filter removes total wavenumbers less than or equal to a number in the range 5-7, resulting in distributions more easily related to particular types of weather system. For the temporal filters, the 2-6-day bandpass filter is found to have a detrimental impact on the individual weather systems, resulting in the storm tracks having a weak waveguide type of behavior. The 20-day high-pass temporal filter is less aggressive than the 2-6-day filter and produces results falling between those of the climatological and 2-6-day filters.
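A temporal band-pass of the kind mentioned above can be sketched with Lanczos weights. This is a generic illustration, not the paper's exact configuration: the 31-point window length and the application by simple convolution are assumptions.

```python
import numpy as np

def lanczos_bandpass_weights(n_wts: int, low_cut: float, high_cut: float) -> np.ndarray:
    """Symmetric Lanczos band-pass weights.

    n_wts    : half-width of the window (filter has 2*n_wts + 1 points)
    low_cut  : low-frequency cutoff in cycles per step (1/6 for a 6-day period)
    high_cut : high-frequency cutoff in cycles per step (1/2 for a 2-day period)
    """
    k = np.arange(-n_wts, n_wts + 1)
    w = np.zeros(k.size, dtype=float)
    nz = k != 0
    # Ideal band-pass response; the k = 0 term is handled separately
    w[nz] = (np.sin(2 * np.pi * high_cut * k[nz]) -
             np.sin(2 * np.pi * low_cut * k[nz])) / (np.pi * k[nz])
    w[~nz] = 2.0 * (high_cut - low_cut)
    # Lanczos sigma factor tapers the Gibbs oscillations of the truncation
    sigma = np.sinc(k / n_wts)
    return w * sigma

# 2-6-day band-pass for daily data, 31-point window (illustrative choice)
weights = lanczos_bandpass_weights(15, 1.0 / 6.0, 1.0 / 2.0)

def apply_filter(series: np.ndarray) -> np.ndarray:
    """Convolve a daily time series with the band-pass weights."""
    return np.convolve(series, weights, mode="valid")
```

Because the pass band excludes frequency zero, the weights sum to (nearly) zero, so a constant series is filtered to (nearly) zero.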

Relevance: 30.00%

Abstract:

The spectral design and fabrication of cooled (7K) mid-infrared dichroic beamsplitters and bandpass filter coatings for the MIRI spectrometer and imager are described. Design methods to achieve the spectral performance and coating materials are discussed.

Relevance: 30.00%

Abstract:

The VISIR instrument for the European Southern Observatory (ESO) Very Large Telescope (VLT) is a thermal-infrared imager and spectrometer currently being developed by a consortium of the French Service d'Astrophysique of CEA Saclay and the Dutch NFRA ASTRON Dwingeloo. This cryogenic instrument will employ precision infrared bandpass filters in the N-band (λ = 7.5-14 µm) and Q-band (λ = 16-28 µm) mid-IR atmospheric windows to study interstellar and circumstellar environments crucial for star and planetary formation theories. As the filters in these mid-IR wavelength ranges are of interest to many astronomical cryogenic instruments, a worldwide astronomical filter consortium was set up with participation from 12 different institutes, each requiring instrument-specific filter operating environments and optical metrology. This paper describes the design and fabrication methods used to manufacture these astronomical consortium filters, including the rationale for the selection of multilayer coating designs, the temperature-dependent optical properties of the filter materials, and FTIR spectral measurements showing the changes in passband and blocking performance on cooling to <50 K. We also describe the development of a 7-14 µm broadband antireflection coating deposited on Ge lenses and KRS-5 grisms for cryogenic operation at 40 K.

Relevance: 30.00%

Abstract:

The 3D reconstruction of a Golgi-stained dendritic tree from a serial stack of images captured with a transmitted light bright-field microscope is investigated. Modifications to the bootstrap filter are discussed such that the tree structure may be estimated recursively as a series of connected segments. The tracking performance of the bootstrap particle filter is compared against Differential Evolution, an evolutionary global optimisation method, both in terms of robustness and accuracy. It is found that the particle filtering approach is significantly more robust and accurate for the data considered.
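The predict-weight-resample cycle of a bootstrap (SIR) particle filter can be sketched as below. This is a generic one-dimensional tracking toy, not the paper's dendritic-tree model: the random-walk state model, noise levels, and particle count are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_filter(observations, n_particles=500,
                     process_std=0.5, obs_std=1.0):
    """One-dimensional bootstrap (SIR) particle filter sketch.

    State model: x_t = x_{t-1} + Gaussian process noise
    Obs model:   y_t = x_t + Gaussian observation noise
    Returns the posterior-mean state estimate at each step.
    """
    particles = rng.normal(0.0, 1.0, n_particles)   # initial particle cloud
    estimates = []
    for y in observations:
        # 1. Predict: propagate each particle through the state transition
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # 2. Weight: evaluate the observation likelihood for each particle
        w = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
        w /= w.sum()
        # 3. Resample proportionally to the weights (the "bootstrap" step)
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
        estimates.append(particles.mean())
    return np.array(estimates)

# Track a slowly drifting signal from noisy observations
truth = np.linspace(0.0, 5.0, 50)
obs = truth + rng.normal(0.0, 1.0, truth.size)
est = bootstrap_filter(obs)
```

The resampling step is what distinguishes the bootstrap filter from plain importance sampling: it discards low-weight particles before they degenerate, which is also what makes the approach robust for tracking connected segments.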

Relevance: 30.00%

Abstract:

The cooled infrared filters and dichroic beam splitters manufactured for the Mid-Infrared Instrument are key optical components for the selection and isolation of wavelengths in the study of astrophysical properties of stars, galaxies, and other planetary objects. We describe the spectral design and manufacture of the precision cooled filter coatings for the spectrometer (7 K) and imager (9 K). Details of the design methods used to achieve the spectral requirements, selection of thin film materials, deposition technique, and testing are presented together with the optical layout of the instrument. (C) 2008 Optical Society of America.

Relevance: 30.00%

Abstract:

This paper describes the spectral design and manufacture of the narrow bandpass filters and 6-18µm broadband antireflection coatings for the 21-channel NASA EOS-AURA High Resolution Dynamics Limb Sounder (HIRDLS). A method of combining the measured spectral characteristics of each filter and antireflection coating, together with the spectral response of the other optical elements in the instrument to obtain a predicted system throughput response is presented. The design methods used to define the filter and coating spectral requirements, choice of filter materials, multilayer designs and deposition techniques are discussed.

Relevance: 30.00%

Abstract:

The High Resolution Dynamics Limb Sounder is described, with particular reference to the atmospheric measurements to be made and the rationale behind the measurement strategy. The demands this strategy places on the filters to be used in the instrument, and the designs to which this leads, are described. A second set of filters at an intermediate image plane, used to reduce "Ghost Imaging", is discussed together with their required spectral properties. A method is described in which the spectral characteristics of the primary and secondary filters in each channel are combined with the spectral response of the detectors and other optical elements to obtain the system spectral response, weighted appropriately for the Planck function and atmospheric limb absorption. This method is used to demonstrate whether the out-of-band spectral blocking requirement for a channel is being met, and an example calculation shows how the blocking is built up for a representative channel. Finally, the techniques used to produce filters of the necessary sub-millimetre sizes are discussed, together with the testing methods and procedures used to assess the environmental durability and establish space-flight quality.
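The multiplicative combination of component responses, weighted by the Planck function, might look like the following sketch. The component transmission curves, the 250 K scene temperature, and the normalisation are invented for illustration; they are not the instrument's measured data.

```python
import numpy as np

def planck(wavelength_um, temp_k):
    """Planck spectral radiance (arbitrary scale) at wavelength [um], temperature [K]."""
    lam = wavelength_um * 1e-6                     # metres
    h, c, kB = 6.626e-34, 2.998e8, 1.381e-23
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * kB * temp_k))

def system_response(wavelength_um, *component_transmissions, temp_k=250.0):
    """Combine element transmissions multiplicatively, then weight by the
    Planck function (normalised to its peak on the grid).
    """
    total = np.ones_like(wavelength_um)
    for t in component_transmissions:
        total = total * t                          # elements in series multiply
    weight = planck(wavelength_um, temp_k)
    return total * weight / weight.max()

# Toy example: primary filter x secondary (blocking) filter x detector response
lam = np.linspace(6.0, 18.0, 200)
primary   = np.where((lam > 11.0) & (lam < 12.0), 0.8, 1e-4)   # band-pass
secondary = np.where((lam > 10.5) & (lam < 12.5), 0.9, 1e-3)   # blocking filter
detector  = np.full_like(lam, 0.6)
resp = system_response(lam, primary, secondary, detector)
```

Because out-of-band transmissions multiply, the combined blocking (here 1e-4 x 1e-3) is far deeper than either element alone, which is how a blocking requirement can be met by the product even when no single element meets it.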

Relevance: 30.00%

Abstract:

In this study, we compare two different cyclone-tracking algorithms used to detect North Atlantic polar lows, which are very intense mesoscale cyclones. Both approaches include spatial filtering, detection, tracking and constraints specific to polar lows. The first method uses digital bandpass-filtered mean sea level pressure (MSLP) fields in the spatial range of 200-600 km and is especially designed for polar lows. The second method also uses a bandpass filter but is based on the discrete cosine transform (DCT) and can be applied to MSLP and vorticity fields. The latter was originally designed for cyclones in general and has been adapted to polar lows for this study. Both algorithms are applied to the same regional climate model output fields from October 1993 to September 1995, produced by dynamical downscaling of the NCEP/NCAR reanalysis data. Comparisons between the two methods show that different filters lead to different numbers and locations of tracks. The DCT achieves a more precise scale separation than the digital filter, and the results of this study suggest that it is better suited for the bandpass filtering of MSLP fields. The detection and tracking steps also influence the number of tracks, although less critically. After a selection process that applies criteria to identify tracks of potential polar lows, differences between the two methods remain visible, though the major systems are identified by both.
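A DCT-based spatial band-pass of the kind described, retaining scales between 200 and 600 km, can be sketched as follows. The grid spacing, the mode-to-wavelength mapping via a radial normalised wavenumber, and the hard spectral mask are illustrative assumptions, not the study's exact implementation.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_bandpass(field, dx_km, min_scale_km, max_scale_km):
    """Band-pass a 2-D field via the discrete cosine transform, retaining
    only wavelengths between min_scale_km and max_scale_km (uniform grid).
    """
    ny, nx = field.shape
    coeffs = dctn(field, norm="ortho")
    # Normalised wavenumber of each DCT mode; mode k of length N has
    # wavelength 2 * N * dx / k, i.e. 2 * dx / (k / N)
    ky = np.arange(ny)[:, None] / ny
    kx = np.arange(nx)[None, :] / nx
    k = np.sqrt(ky**2 + kx**2)
    wavelength = 2.0 * dx_km / np.where(k == 0, np.inf, k)  # mean mode -> 0
    mask = (wavelength >= min_scale_km) & (wavelength <= max_scale_km)
    return idctn(coeffs * mask, norm="ortho")

# Toy check on a 12.5 km grid (128x128): a 400 km wave lies inside the
# 200-600 km band, a 3200 km wave does not
n, dx = 128, 12.5
i = np.arange(n)
wave_400km  = np.cos(np.pi * 8 * (2 * i + 1) / (2 * n))[None, :] * np.ones((n, 1))
wave_3200km = np.cos(np.pi * 1 * (2 * i + 1) / (2 * n))[None, :] * np.ones((n, 1))
kept    = dct_bandpass(wave_400km, dx, 200.0, 600.0)
removed = dct_bandpass(wave_3200km, dx, 200.0, 600.0)
```

The two test waves are exact DCT-II basis functions, so the in-band wave passes through unchanged while the large-scale wave is removed entirely; a real MSLP field spreads its energy across many modes and is only partially retained.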

Relevance: 30.00%

Abstract:

Bloom filters are a data structure for storing data in a compressed form. They offer excellent space and time efficiency at the cost of some loss of accuracy (so-called lossy compression). This work presents a yes-no Bloom filter, a data structure consisting of two parts: the yes-filter, which is a standard Bloom filter, and the no-filter, which is another Bloom filter whose purpose is to represent those objects that were recognised incorrectly by the yes-filter (that is, to recognise the false positives of the yes-filter). By querying the no-filter after an object has been recognised by the yes-filter, we get a chance to reject it, which improves the accuracy of data recognition in comparison with a standard Bloom filter of the same total length. A further increase in accuracy is possible if one chooses the objects to include in the no-filter so that it recognises as many false positives as possible but no true positives, thus producing the most accurate yes-no Bloom filter among all yes-no Bloom filters. This paper studies how optimization techniques can be used to maximize the number of false positives recognised by the no-filter, under the constraint that it recognise no true positives. To achieve this aim, an Integer Linear Program (ILP) is proposed for the optimal selection of false positives. In practice the problem size is normally large, making the optimal solution intractable. Exploiting the similarity of the ILP to the Multidimensional Knapsack Problem, an Approximate Dynamic Programming (ADP) model is developed that uses a reduced ILP for the value function approximation. Numerical results show that the ADP model performs best in comparison with a number of heuristics as well as the CPLEX built-in branch-and-bound (B&B) solver, and it is therefore recommended for use in yes-no Bloom filters.
In the wider context of the study of lossy compression algorithms, our research is an example of how the arsenal of optimization methods can be applied to improving the accuracy of compressed data.
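The yes-no structure itself can be sketched in a few lines. The bit-array sizes, the SHA-256-derived hash positions, and the naive policy of inserting any reported false positive are illustrative assumptions; the paper's contribution is precisely the ILP/ADP optimisation of which false positives to insert, which is not reproduced here.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter using SHA-256-derived hash positions."""
    def __init__(self, n_bits: int, n_hashes: int):
        self.n_bits, self.n_hashes = n_bits, n_hashes
        self.bits = bytearray(n_bits)   # one byte per bit, for clarity

    def _positions(self, item: str):
        for i in range(self.n_hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.n_bits

    def add(self, item: str) -> None:
        for p in self._positions(item):
            self.bits[p] = 1

    def __contains__(self, item: str) -> bool:
        return all(self.bits[p] for p in self._positions(item))

class YesNoBloomFilter:
    """Yes-no Bloom filter: the no-filter stores selected false positives of
    the yes-filter, so a second query can reject them."""
    def __init__(self, yes_bits=512, no_bits=128, n_hashes=4):
        self.yes = BloomFilter(yes_bits, n_hashes)
        self.no = BloomFilter(no_bits, n_hashes)

    def add(self, item: str) -> None:
        self.yes.add(item)

    def add_false_positive(self, item: str) -> None:
        # The paper selects these optimally (ILP/ADP); here we accept any.
        self.no.add(item)

    def __contains__(self, item: str) -> bool:
        # Accept only items the yes-filter recognises AND the no-filter does not
        return item in self.yes and item not in self.no

# Usage: store a small set of keys
f = YesNoBloomFilter()
for word in ["alpha", "beta", "gamma"]:
    f.add(word)
```

The trade-off the paper optimises is visible here: bits spent on the no-filter shrink the yes-filter for a fixed total length, so the no-filter only pays off if it is populated with the false positives most likely to be queried.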