905 results for Search space reduction
Abstract:
We introduce a novel way of measuring the entropy of a set of values undergoing changes. Such a measure becomes useful when analyzing the temporal development of an algorithm designed to numerically update a collection of values such as artificial neural network weights undergoing adjustments during learning. We measure the entropy as a function of the phase-space of the values, i.e. their magnitude and velocity of change, using a method based on the abstract measure of entropy introduced by the philosopher Rudolf Carnap. By constructing a time-dynamic two-dimensional Voronoi diagram using Voronoi cell generators with coordinates of value- and value-velocity (change of magnitude), the entropy becomes a function of the cell areas. We term this measure teleonomic entropy since it can be used to describe changes in any end-directed (teleonomic) system. The usefulness of the method is illustrated when comparing the different approaches of two search algorithms, a learning artificial neural network and a population of discovering agents. (C) 2004 Elsevier Inc. All rights reserved.
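The abstract above describes an entropy computed from the areas of Voronoi cells whose generators are (value, value-velocity) pairs. The sketch below is a minimal illustration of that idea, assuming a Shannon-style entropy over normalized cell areas and simply discarding unbounded cells; the paper's Carnap-based measure and its time-dynamic construction are not reproduced.

```python
# Minimal sketch: entropy from the areas of bounded Voronoi cells in a
# (value, velocity) phase space.  Unbounded cells are skipped, which is a
# simplification of the paper's construction.
import numpy as np
from scipy.spatial import Voronoi

def polygon_area(pts):
    """Shoelace formula; vertices sorted by angle around the centroid (cells are convex)."""
    c = pts.mean(axis=0)
    order = np.argsort(np.arctan2(pts[:, 1] - c[1], pts[:, 0] - c[0]))
    x, y = pts[order, 0], pts[order, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

def phase_space_entropy(values, velocities):
    """Shannon-style entropy over normalized areas of the bounded Voronoi cells."""
    vor = Voronoi(np.column_stack([values, velocities]))
    areas = []
    for region_idx in vor.point_region:
        region = vor.regions[region_idx]
        if -1 in region or len(region) < 3:      # skip unbounded or degenerate cells
            continue
        areas.append(polygon_area(vor.vertices[region]))
    p = np.asarray(areas) / np.sum(areas)        # cell areas as probabilities
    return -np.sum(p * np.log(p))

# Example: network weights at two training steps give values and finite-difference velocities.
rng = np.random.default_rng(0)
w_prev = rng.normal(size=200)
w_curr = w_prev + 0.01 * rng.normal(size=200)
print(phase_space_entropy(w_curr, w_curr - w_prev))
```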
Abstract:
This paper reports on a total electron content space weather study of the nighttime Weddell Sea Anomaly, overlooked by previously published TOPEX/Poseidon climate studies, and of the nighttime ionosphere during the 1996/1997 southern summer. To ascertain the morphology of spatial TEC distribution over the oceans in terms of hourly, geomagnetic, longitudinal and summer-winter variations, the TOPEX TEC, magnetic, and published neutral wind velocity data are utilized. To understand the underlying physical processes, the TEC results are combined with inclination and declination data plus global magnetic field-line maps. To investigate spatial and temporal TEC variations, geographic/magnetic latitudes and local times are computed. As the results show, the nighttime Weddell Sea Anomaly is a large (∼1,600 square degrees; ∼22 million km² estimated for a steady ionosphere) space weather feature. Extending between 200°E and 300°E (geographic), it is an ionization enhancement peaking at 50°S–60°S/250°E–270°E and continuing beyond 66°S. It develops where the spacing between the magnetic field lines is wide/medium, easterly declination is large-medium (20°–50°), and inclination is optimum (∼55°S). Its development and hourly variations are closely correlated with wind speed variations. There is a noticeable (∼43%) reduction in its average area during the high magnetic activity period investigated. Southern summer nighttime TECs follow closely the variations of declination and field-line configuration and therefore introduce a longitudinal division of four (Indian, western/eastern Pacific, Atlantic). Northern winter nighttime TECs measured over a limited area are rather uniform longitudinally because of the small declination variation. TOPEX maps depict the expected strong asymmetry in TEC distribution about the magnetic dip equator.
Abstract:
Whilst traditional optimisation approaches based on mathematical programming are in common use, they suffer from their inability to explore the complexity of decision problems addressed using agricultural system models. In these models, the full decision space is usually very large, while the solution space is characterized by many local optima. Methods to search such large decision spaces rely on effective sampling of the problem domain. Nevertheless, problem reduction based on insight into agronomic relations and farming practice is necessary to safeguard computational feasibility. Here, we present a global search approach based on an Evolutionary Algorithm (EA). We introduce a multi-objective evaluation technique within this EA framework, linking the optimisation procedure to the APSIM cropping systems model. The approach addresses the issue of system management when faced with a trade-off between economic and ecological consequences.
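As a rough illustration of the kind of multi-objective evolutionary search the abstract describes, the sketch below evolves two hypothetical management variables (a nitrogen rate and a sowing density) against two stand-in objectives, gross margin versus nitrate leaching. The evaluate() function is an invented surrogate, not APSIM, and the Pareto-ranking EA is a generic one rather than the authors' implementation.

```python
# Pareto-based evolutionary search over two hypothetical management variables.
# evaluate() is an invented surrogate for a cropping systems model run, NOT APSIM:
# objective 1 (maximize) gross margin, objective 2 (minimize) nitrate leaching.
import math
import random

def evaluate(n_rate, sow_density):
    margin = 1200.0 * (1.0 - math.exp(-n_rate / 80.0)) + 3.0 * sow_density
    leaching = 0.002 * n_rate ** 1.7 + 0.1 * sow_density
    return margin, leaching

def dominates(a, b):
    """a dominates b: margin no worse, leaching no worse, and not identical."""
    return a[0] >= b[0] and a[1] <= b[1] and a != b

def pareto_front(pop):
    scored = [(ind, evaluate(*ind)) for ind in pop]
    return [ind for ind, s in scored
            if not any(dominates(s2, s) for _, s2 in scored)]

random.seed(0)
POP_SIZE, GENERATIONS = 40, 50
pop = [(random.uniform(0, 250), random.uniform(50, 200)) for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    parents = pareto_front(pop)
    children = [(min(250.0, max(0.0, random.choice(parents)[0] + random.gauss(0, 10))),
                 min(200.0, max(50.0, random.choice(parents)[1] + random.gauss(0, 5))))
                for _ in range(POP_SIZE)]
    pop = (parents + children)[:POP_SIZE]    # keep the current front (elitism), refill with children

for margin, leaching in sorted(evaluate(*ind) for ind in pareto_front(pop)):
    print(f"gross margin {margin:7.1f}  nitrate leaching {leaching:6.2f}")
```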
Abstract:
We investigate the effect of transmitter and receiver array configurations on the stray-light and diffraction-caused crosstalk in free-space optical interconnects. The optical system simulation software (Code V) is used to simulate both the stray-light and the diffraction-caused crosstalk. Experimentally measured, spectrally resolved, near-field images of VCSEL higher-order modes were used as extended sources in our simulation model. Our results show that by changing the square lattice geometry to a hexagonal configuration, we obtain a reduction in stray-light crosstalk of up to 9 dB and an overall signal-to-noise ratio improvement of 3 dB.
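One geometric intuition behind the reported improvement, sketched below under the assumption of equal channel density: a hexagonal lattice places nearest neighbours farther apart than a square lattice, so less stray light from adjacent channels reaches each detector. This is only a back-of-the-envelope spacing comparison, not the paper's Code V simulation, and the 125 µm pitch budget is a hypothetical number.

```python
# Back-of-the-envelope spacing comparison at equal channel density (NOT a Code V model).
# Square lattice: area per channel = d^2.  Hexagonal lattice: area per channel = (sqrt(3)/2) d^2.
import math

area_per_channel = 125.0 ** 2    # hypothetical area budget per channel, in um^2

d_square = math.sqrt(area_per_channel)
d_hex = math.sqrt(2.0 * area_per_channel / math.sqrt(3.0))

print(f"square lattice    nearest-neighbour pitch: {d_square:6.1f} um (4 nearest neighbours)")
print(f"hexagonal lattice nearest-neighbour pitch: {d_hex:6.1f} um (6 nearest neighbours)")
print(f"extra spacing from the hexagonal layout: {100.0 * (d_hex / d_square - 1.0):.1f} %")
```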
Abstract:
This paper derives the performance union bound of space-time trellis codes in an orthogonal frequency division multiplexing system (STTC-OFDM) over quasi-static frequency-selective fading channels based on the distance spectrum technique. The distance spectrum is the enumeration of the codeword difference measures and their multiplicities, obtained by exhaustively searching through all possible error event paths. The exhaustive search approach can be used for low memory order STTCs with small frame sizes. However, with moderate memory order STTCs and moderate frame sizes, the computational cost of exhaustive search increases exponentially and may become impractical for high memory order STTCs. This calls for advanced computational techniques such as Genetic Algorithms (GAs). In this paper, a GA with the sharing function method is used to locate the multiple solutions of the distance spectrum for high memory order STTCs. Simulations evaluate the performance union bound and compare the complexity of the non-GA-aided and GA-aided distance spectrum techniques. They show that the union bound gives a close performance measure at high signal-to-noise ratio (SNR). They also show that the GA sharing function based distance spectrum technique requires much less computational time than the exhaustive search approach while retaining satisfactory accuracy.
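The sharing function method mentioned above is the classic fitness-sharing niching scheme, which divides an individual's raw fitness by a crowding count so that several optima can coexist in the population. The sketch below applies it to a generic multimodal stand-in objective; the actual STTC codeword-difference enumeration, the distance measure, and the sharing radius used by the authors are not reproduced.

```python
# Fitness sharing (niching) on a generic multimodal stand-in objective.
# Each individual's raw fitness is divided by a niche count so that several
# peaks survive selection simultaneously.
import math
import random

SIGMA_SHARE, ALPHA = 0.1, 1.0     # sharing radius and shape, chosen arbitrarily here

def fitness(x):
    return math.sin(5 * math.pi * x) ** 2      # five equal peaks on [0, 1]

def shared_fitness(pop, raw):
    shared = []
    for i, xi in enumerate(pop):
        niche = sum(1.0 - (abs(xi - xj) / SIGMA_SHARE) ** ALPHA
                    for xj in pop if abs(xi - xj) < SIGMA_SHARE)
        shared.append(raw[i] / niche)          # niche >= 1: every point shares with itself
    return shared

random.seed(0)
pop = [random.random() for _ in range(60)]
for _ in range(100):
    raw = [fitness(x) for x in pop]
    shr = shared_fitness(pop, raw)
    parents = random.choices(pop, weights=shr, k=len(pop))   # proportionate selection
    pop = [min(1.0, max(0.0, x + random.gauss(0, 0.02))) for x in parents]

print(sorted(round(x, 2) for x in pop))        # individuals spread across several peaks
```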
Abstract:
Molecular transport in phase space is crucial for chemical reactions because it defines how pre-reactive molecular configurations are found during the time evolution of the system. Using Molecular Dynamics (MD) simulated atomistic trajectories, we test the assumption of normal diffusion in the phase space for bulk water at ambient conditions by checking the equivalence of the transport to the random walk model. Contrary to common expectations, we have found that some statistical features of the transport in the phase space differ from those of the normal diffusion models. This implies a non-random character of the path search process by the reacting complexes in water solutions. Our further numerical experiments show that a significant, long-lived non-stationarity in the transition probabilities of the segments of molecular trajectories can account for the observed non-uniform filling of the phase space. Surprisingly, the characteristic periods of this non-stationarity are hundreds of nanoseconds, which is a much longer time scale than the typical lifetimes of known liquid water molecular structures (several picoseconds).
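A standard way to test for normal (random-walk) diffusion, of the kind the abstract alludes to, is to check that the mean squared displacement grows linearly with lag time. The sketch below runs this check on a synthetic random walk standing in for the MD phase-space trajectory; an exponent near 1 indicates normal diffusion, while deviations signal the kind of anomalous transport the paper reports.

```python
# MSD check on a synthetic 3-D random walk (a stand-in for the MD phase-space trajectory).
# Normal diffusion: MSD(lag) ~ lag**alpha with alpha close to 1.
import numpy as np

rng = np.random.default_rng(0)
traj = np.cumsum(rng.normal(size=(100_000, 3)), axis=0)      # random-walk trajectory

lags = np.unique(np.logspace(0, 4, 30).astype(int))          # lag times from 1 to 10^4 steps
msd = np.array([np.mean(np.sum((traj[lag:] - traj[:-lag]) ** 2, axis=1)) for lag in lags])

alpha = np.polyfit(np.log(lags), np.log(msd), 1)[0]          # slope in log-log space
print(f"diffusion exponent alpha = {alpha:.2f}   (1 = normal, <1 sub-, >1 superdiffusion)")
```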
Abstract:
Visual search impairment can occur following stroke. The utility of optimal spectral filters for visual search in stroke patients has not been considered to date. The present study measured the effect of optimal spectral filters on visual search response time and accuracy, using a task requiring serial processing. A stroke and a control cohort undertook the task three times: (i) using an optimally selected spectral filter; (ii) after the subjects were randomly assigned to two groups, with group 1 using an optimal filter for two weeks and group 2 using a grey filter for two weeks; and (iii) after the groups were crossed over, with group 1 using a grey filter for a further two weeks and group 2 given an optimal filter, before undertaking the task for the final time. Initial use of an optimal spectral filter improved visual search response time but not error scores in the stroke cohort. Prolonged use of neither an optimal nor a grey filter improved response time or reduced error scores. In fact, response times increased with the filter, regardless of its type, for stroke and control subjects; this outcome may be due to contrast reduction or a reflection of task design, given that significant practice effects were noted. © 2013 a Pion publication.
Abstract:
A free-space quantum key distribution system has been demonstrated. Consideration has been given to factors such as field of view and spectral width in order to cut down the deleterious effect of background light levels. Suitable optical sources such as lasers and RCLEDs have been investigated, as well as optimal wavelength choices, always with a view to building a compact and robust system. The implementation of background reduction measures resulted in a system capable of operating in daylight conditions. An autonomous system was left running and generated shared key material continuously for over 7 days. © 2009 Published by Elsevier B.V.
Abstract:
Principal component analysis (PCA) is well recognized for dimensionality reduction, and kernel PCA (KPCA) has also been proposed in statistical data analysis. However, KPCA fails to detect the nonlinear structure of data well when outliers exist. To mitigate this problem, this paper presents a novel algorithm, named iterative robust KPCA (IRKPCA). IRKPCA works well in dealing with outliers and can be carried out in an iterative manner, which makes it suitable for processing incremental input data. As in traditional robust PCA (RPCA), a binary field is employed for characterizing the outlier process, and the optimization problem is formulated as maximizing the marginal distribution of a Gibbs distribution. In this paper, this optimization problem is solved by stochastic gradient descent techniques. In IRKPCA, the outlier process lies in a high-dimensional feature space, and therefore the kernel trick is used. IRKPCA can be regarded as a kernelized version of RPCA and a robust form of the kernel Hebbian algorithm. Experimental results on synthetic data demonstrate the effectiveness of IRKPCA. © 2010 Taylor & Francis.
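For orientation, the sketch below implements plain kernel PCA with an RBF kernel and feature-space centering, i.e. the non-robust baseline that IRKPCA builds on; the binary outlier field, the Gibbs-distribution formulation, and the iterative stochastic-gradient updates described in the abstract are not reproduced.

```python
# Plain kernel PCA with an RBF kernel: the non-robust baseline behind IRKPCA.
import numpy as np

def rbf_kernel(X, gamma=1.0):
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one           # centre the data in feature space
    eigvals, eigvecs = np.linalg.eigh(Kc)                # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:n_components]       # take the largest components
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    return Kc @ alphas                                    # projections of the training points

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(size=(100, 2)) * 0.1 + c for c in ([0.0, 0.0], [1.0, 1.0])])
print(kernel_pca(X, n_components=2).shape)                # -> (200, 2)
```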
Abstract:
Nowadays there are many different methods for evaluating the ecological performance of green surfaces and parks. All these methods are extremely valuable in determining how well a green surface performs from an ecological aspect and to what extent the environment would be damaged if these sites were built on or developed in any other way that reduces green surfaces. The goal of the article is to clarify the differences between two evaluation methods (GSI – Green Space Intensity, BARC – Biological Activity Rate Calculation) suitable for urban green infrastructure analysis and to see whether any significant difference can be observed when evaluating the same site with these methods. Our research sites are in Budapest and their sizes vary between 2.5 and 8 acres. The most important aspects of the site analysis are the following: size and boundaries of the park, existence or lack of water features, the characteristics of their surfaces, and the complexity of vegetation. We summarize the data of the site analysis in tables, make a summarizing diagram for visual representation, and draw conclusions from the results. As a final step, we evaluate how these two evaluation systems relate to urban open space developments.
Abstract:
Helicopter-borne electromagnetic sea ice thickness measurements were performed over the Transpolar Drift in the late summers of 2001, 2004, and 2007, continuing ground-based measurements made since 1991. These show an ongoing reduction of modal and mean ice thicknesses in the region of the North Pole of up to 53 and 44%, respectively, since 2001. A buoy-derived ice age model showed that the thinning was mainly due to a regime shift from predominantly multi-year and second-year ice in earlier years to first-year ice in 2007, which had modal and mean summer thicknesses of 0.9 and 1.27 m. Measurements of second-year ice that still persisted at the North Pole in April 2007 indicate a reduction of late-summer second-year modal and mean ice thicknesses since 2001 of 20 and 25%, to 1.65 and 1.81 m, respectively. The regime shift to younger and thinner ice could soon result in an ice-free North Pole during summer.