Abstract:
Every winter, the high-latitude oceans are struck by severe storms that are considerably smaller than the weather-dominating synoptic depressions [1]. Accompanied by strong winds and heavy precipitation, these often explosively developing mesoscale cyclones—termed polar lows [1]—constitute a threat to offshore activities such as shipping or oil and gas exploitation. Yet owing to their small scale, polar lows are poorly represented in the observational and global reanalysis data [2] often used for climatological investigations of atmospheric features and cannot be assessed in coarse-resolution global simulations of possible future climates. Here we show that in a future anthropogenically warmed climate, the frequency of polar lows is projected to decline. We used a series of regional climate model simulations to downscale a set of global climate change scenarios [3] from the Intergovernmental Panel on Climate Change. In this process, we first simulated the formation of polar low systems in the North Atlantic and then counted the individual cases. A previous study [4] using NCEP/NCAR re-analysis data [5] revealed that polar low frequency from 1948 to 2005 did not systematically change. Now, in projections for the end of the twenty-first century, we found a significantly lower number of polar lows and a northward shift of their mean genesis region in response to elevated atmospheric greenhouse gas concentrations. This change can be related to changes in the North Atlantic sea surface temperature and mid-troposphere temperature; the latter is found to rise faster than the former, so that the resulting stability is increased, hindering the formation or intensification of polar lows. Our results provide a rare example of a climate change effect in which a type of extreme weather is likely to decrease, rather than increase.
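The stability mechanism described above can be sketched with a simple proxy. Polar-low-favourable conditions are often diagnosed from the difference between sea surface temperature and mid-troposphere (500 hPa) temperature; if the mid-troposphere warms faster than the ocean surface, that margin shrinks. The threshold and the temperatures below are illustrative assumptions, not values from the study:

```python
# Illustrative sketch (hypothetical numbers, not the study's model output):
# polar-low development is often diagnosed via the SST - T500 difference,
# with a commonly cited empirical threshold of roughly 43 K. If the mid-
# troposphere warms faster than the sea surface, the margin shrinks.

THRESHOLD_K = 43.0  # assumed rule-of-thumb threshold

def polar_low_favourable(sst_k, t500_k, threshold=THRESHOLD_K):
    """Return (SST - T500 difference, favourable?) for temperatures in kelvin."""
    diff = sst_k - t500_k
    return diff, diff >= threshold

# Present-day-like case: cold mid-troposphere over a relatively warm ocean.
present = polar_low_favourable(sst_k=276.0, t500_k=231.0)  # diff = 45 K
# Warmed climate: SST up 2 K, but the mid-troposphere up 5 K (faster aloft).
future = polar_low_favourable(sst_k=278.0, t500_k=236.0)   # diff = 42 K
```

With these assumed numbers, the same ocean region passes the threshold today but fails it in the warmed case, which is the qualitative effect the abstract describes.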
Abstract:
In this paper we are mainly concerned with the development of efficient computer models capable of accurately predicting the propagation of low-to-middle frequency sound in the sea, in axially symmetric (2D) and in fully 3D environments. The major physical features of the problem, i.e. variable bottom topography, elastic properties of the subbottom structure, volume attenuation and other range inhomogeneities, are efficiently treated. The computer models presented are based on normal mode solutions of the Helmholtz equation on the one hand, and on various types of numerical schemes for parabolic approximations of the Helmholtz equation on the other. A new coupled mode code is introduced to model sound propagation in range-dependent ocean environments with variable bottom topography, where the effects of an elastic bottom, volume attenuation, and surface and bottom roughness are taken into account. New computer models based on finite difference and finite element techniques for the numerical solution of parabolic approximations are also presented. They include efficient modeling of the bottom influence via impedance boundary conditions, and they cover wide-angle propagation, elastic bottom effects, variable bottom topography and reverberation effects. All the models are validated on several benchmark problems and against experimental data, and the results are compared with analogous results from standard codes in the literature.
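A minimal sketch of the normal-mode approach mentioned above, for the one idealized case where the modes are analytic: a range-independent isovelocity waveguide with a pressure-release surface and a rigid bottom. This is a textbook toy, not the paper's coupled-mode code, and the source/receiver geometry below is assumed:

```python
import cmath
import math

# Toy normal-mode field (not the paper's coupled-mode code): isovelocity
# waveguide of depth D, pressure-release surface, rigid bottom. Modes are
# analytic: psi_m(z) = sqrt(2/D) * sin(k_zm * z), k_zm = (m - 1/2) * pi / D.

def mode_sum(r, z, zs, freq=100.0, c=1500.0, depth=100.0):
    """Unnormalized far-field pressure from a point source at depth zs."""
    k = 2.0 * math.pi * freq / c
    p = 0.0 + 0.0j
    m = 1
    while True:
        kz = (m - 0.5) * math.pi / depth
        if kz >= k:                      # remaining modes are evanescent
            break
        kr = math.sqrt(k * k - kz * kz)  # horizontal wavenumber of mode m
        psi_s = math.sqrt(2.0 / depth) * math.sin(kz * zs)
        psi_r = math.sqrt(2.0 / depth) * math.sin(kz * z)
        # Far-field (asymptotic) form of the Hankel function H0^(1)(kr * r).
        p += psi_s * psi_r * cmath.exp(1j * kr * r) / math.sqrt(kr * r)
        m += 1
    return p

# Transmission loss (dB, up to an additive constant) at 5 km range:
tl = -20.0 * math.log10(abs(mode_sum(r=5000.0, z=50.0, zs=25.0)))
```

The pressure-release boundary condition is visible directly: the field vanishes identically at z = 0 because every mode does.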
Abstract:
The paper is concerned with the uniformization of a system of affine recurrence equations. This transformation is used in the design (or compilation) of highly parallel embedded systems (VLSI systolic arrays, signal processing filters, etc.). We present and implement an automatic system to achieve uniformization of systems of affine recurrence equations. We unify the results from many earlier papers, develop some theoretical extensions, and then propose effective uniformization algorithms. Our results can be used in any high-level synthesis tool based on polyhedral representation of nested loop computations.
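The idea of uniformization can be illustrated on a toy recurrence (this is a standard textbook example, not the paper's algorithm): an affine dependence in which every iteration reads index 0 (a broadcast) is replaced by a pipeline variable whose dependences are all the constant vector 1, as required for a systolic implementation:

```python
# Toy illustration of uniformization (not the paper's algorithm).
# Affine form: y[i] = y[i-1] + a[i] * x[0] reads x[0] at every i,
# i.e. the non-uniform affine dependence i -> 0 (a broadcast).

def affine_version(a, x0):
    y = 0
    for i in range(len(a)):
        y += a[i] * x0                       # every i depends on index 0
    return y

# Uniformized form: a pipeline variable X with X[i] = X[i-1] carries x0
# along the i axis, so every dependence is the constant (uniform) vector 1.

def uniformized_version(a, x0):
    n = len(a)
    X = [0] * n                              # pipeline variable
    y = [0] * (n + 1)
    for i in range(n):
        X[i] = x0 if i == 0 else X[i - 1]    # uniform dependence, distance 1
        y[i + 1] = y[i] + a[i] * X[i]        # uniform dependence, distance 1
    return y[n]
```

After the transformation, each iteration communicates only with its immediate neighbour, which is what makes the recurrence mappable onto a systolic array with local interconnect.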
Abstract:
Measuring pollinator performance has become increasingly important with emerging needs for risk assessment in conservation and sustainable agriculture that require multi-year and multi-site comparisons across studies. However, comparing pollinator performance across studies is difficult because of the diversity of concepts and disparate methods in use. Our review of the literature shows many unresolved ambiguities. Two different assessment concepts predominate: the first estimates stigmatic pollen deposition and the underlying pollinator behaviour parameters, while the second estimates the pollinator’s contribution to plant reproductive success, for example in terms of seed set. Both concepts include a number of parameters combined in diverse ways and named under a diversity of synonyms and homonyms. However, these concepts are overlapping because pollen deposition success is the most frequently used proxy for assessing the pollinator’s contribution to plant reproductive success. We analyse the diverse concepts and methods in the context of a new proposed conceptual framework with a modular approach based on pollen deposition, visit frequency, and contribution to seed set relative to the plant’s maximum female reproductive potential. A system of equations is proposed to optimize the balance between idealised theoretical concepts and practical operational methods. Our framework permits comparisons over a range of floral phenotypes, and spatial and temporal scales, because scaling up is based on the same fundamental unit of analysis, the single visit.
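The modular idea described above can be sketched as follows. The parameter names and the saturation rule are hypothetical illustrations, not the authors' actual system of equations; the point is only that scaling up starts from the single visit and is expressed relative to the plant's maximum female reproductive potential:

```python
# Hedged sketch of the modular framework (hypothetical names and form, not
# the paper's equations): a pollinator's contribution is scaled up from the
# single-visit contribution via visit frequency, and normalised by the
# plant's maximum female reproductive potential.

def pollinator_contribution(seeds_per_visit, visits_per_flower, max_seed_set):
    """Fraction of the maximum female reproductive potential realised."""
    realised = min(seeds_per_visit * visits_per_flower, max_seed_set)
    return realised / max_seed_set

# Example: 2.5 seeds attributable per visit, 4 visits per flower, out of a
# maximum possible seed set of 20 seeds per flower.
share = pollinator_contribution(seeds_per_visit=2.5,
                                visits_per_flower=4,
                                max_seed_set=20)
```

Because the single visit is the fundamental unit, the same function can be evaluated per pollinator taxon, per site or per season and the results compared directly, which is the comparability argument the abstract makes.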
Abstract:
How can a bridge be built between autonomic computing approaches and parallel computing systems? The work reported in this paper is motivated towards bridging this gap by proposing a swarm-array computing approach based on ‘Intelligent Agents’ to achieve autonomy for distributed parallel computing systems. In the proposed approach, a task to be executed on parallel computing cores is carried onto a computing core by carrier agents that can seamlessly transfer between processing cores in the event of a predicted failure. The cognitive capabilities of the carrier agents on a parallel processing core serve in achieving the self-ware objectives of autonomic computing, hence applying autonomic computing concepts for the benefit of parallel computing systems. The feasibility of the proposed approach is validated by simulation studies using a multi-agent simulator on an FPGA (Field-Programmable Gate Array) and experimental studies using MPI (Message Passing Interface) on a computer cluster. Preliminary results confirm that applying autonomic computing principles to parallel computing systems is beneficial.
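The carrier-agent mechanism can be sketched as a toy simulation (not the authors' FPGA simulator or MPI implementation; the class and its interface are invented for illustration): an agent carrying a task migrates to a healthy core when a failure is predicted on its current core, so the task still completes:

```python
# Toy sketch of the carrier-agent idea (hypothetical class, not the authors'
# systems): on a predicted core failure, the agent carrying the task
# transfers itself to a healthy core before executing.

class CarrierAgent:
    def __init__(self, task, core, healthy_cores):
        self.task = task                  # the work unit being carried
        self.core = core                  # core the agent currently sits on
        self.healthy = healthy_cores      # cores believed to be healthy

    def on_failure_predicted(self, failing_core):
        """Seamless transfer: move off a core whose failure is predicted."""
        if self.core == failing_core:
            self.core = next(c for c in self.healthy if c != failing_core)

    def execute(self):
        return self.task()

agent = CarrierAgent(task=lambda: sum(range(10)), core=0,
                     healthy_cores=[0, 1, 2])
agent.on_failure_predicted(failing_core=0)   # prediction arrives
result = agent.execute()                     # task completes on the new core
```

The self-ware behaviour lives entirely in the agent: the task itself is unaware that it was relocated.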
Abstract:
Recent research in multi-agent systems incorporates fault tolerance concepts. However, the research does not explore the extension and implementation of such ideas for large-scale parallel computing systems. The work reported in this paper investigates a swarm-array computing approach, namely ‘Intelligent Agents’. In the approach considered, a task to be executed on a parallel computing system is decomposed into sub-tasks and mapped onto agents that traverse an abstracted hardware layer. The agents intercommunicate across processors to share information in the event of a predicted core/processor failure and for successfully completing the task. The agents hence contribute towards fault tolerance and towards building reliable systems. The feasibility of the approach is validated by simulations on an FPGA using a multi-agent simulator and by implementation of a parallel reduction algorithm on a computer cluster using the Message Passing Interface.
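The parallel reduction used as the validation workload can be sketched as a binary-tree reduction over simulated "ranks" (a pure-Python stand-in, assuming the standard MPI reduction tree rather than the authors' exact agent mapping):

```python
# Simulated binary-tree parallel reduction (pure-Python stand-in for an MPI
# implementation, assuming the standard tree schedule): at each step, rank r
# combines the value received from rank r + stride, halving the number of
# active ranks until rank 0 holds the result.

def tree_reduce(values, op=lambda a, b: a + b):
    vals = list(values)          # vals[r]: partial result held by "rank" r
    n, stride = len(vals), 1
    while stride < n:
        for r in range(0, n - stride, 2 * stride):
            vals[r] = op(vals[r], vals[r + stride])  # message r+stride -> r
        stride *= 2
    return vals[0]               # rank 0 holds the reduced value
```

The schedule takes O(log n) communication steps, and under the agent scheme each rank's partial value would be carried by an agent that can migrate off a failing processor between steps.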
Abstract:
Infrared filters and coatings have been employed on many sensing radiometer instruments to measure the thermal emission profiles and concentrations of certain chemical constituents found in planetary atmospheres. The High Resolution Dynamics Limb Sounder (HIRDLS) is an example of the most recent developments in limb-viewing radiometry, employing a cooled focal plane detector array to provide simultaneous multi-channel monitoring of emission from gases and aerosols over an altitude range between 8 and 70 km. The use of spectrally selective cooled detectors in focal plane arrays has simplified the optical layout of radiometers, greatly reducing the number of components in the optical train. This has inevitably led to increased demands on the environmental durability of the focal plane filters because of the need to cut them to sub-millimetre sizes whilst maintaining an optimal spectral performance. Additionally, the remaining refractive optical elements require antireflection coatings which must cover the entire spectral range of the focal plane array channels, in this case 6 to 18 µm, with a minimum of reflection and absorption. This paper describes the optical layout and spectral design requirements for filtering in the HIRDLS instrument, and reports progress on the manufacture and testing of the sub-millimetre-sized cooled filters. We also report on the spectral and environmental performance of prototype wideband antireflection coatings which satisfy the requirements above.
Abstract:
The High Resolution Dynamics Limb Sounder is described, with particular reference to the atmospheric measurements to be made and the rationale behind the measurement strategy. The demands this strategy places on the filters to be used in the instrument, and the designs to which this leads, are described. A second set of filters at an intermediate image plane, included to reduce "Ghost Imaging", is discussed together with their required spectral properties. A method is described by which the spectral characteristics of the primary and secondary filters in each channel are combined with the spectral response of the detectors and other optical elements to obtain the system spectral response, weighted appropriately for the Planck function and atmospheric limb absorption. This method is used to demonstrate whether the out-of-band spectral blocking requirement for a channel is being met, and an example calculation shows how the blocking is built up for a representative channel. Finally, the techniques used to produce filters of the necessary sub-millimetre sizes are discussed, together with the testing methods and procedures used to assess environmental durability and establish space flight quality.
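The response-combination step lends itself to a short numerical sketch. The top-hat filter shapes, leakage levels and reference temperature below are invented placeholders, not HIRDLS design data; only the structure (product of responses, Planck weighting, in-band versus out-of-band integration) follows the method described:

```python
import math

# Sketch of the response-combination step (hypothetical top-hat responses,
# not HIRDLS design data): the system response is the product of primary
# filter, secondary filter and detector responses, weighted by the Planck
# function at a representative temperature; the out-of-band blocking figure
# is the weighted response integrated outside the channel passband.

def planck(wl_um, temp_k):
    """Planck spectral radiance (arbitrary units) at wavelength wl_um (um)."""
    wl = wl_um * 1e-6
    c1, c2 = 1.191e-16, 1.4388e-2        # first/second radiation constants
    return c1 / (wl ** 5 * (math.exp(c2 / (wl * temp_k)) - 1.0))

def out_of_band_fraction(primary, secondary, detector, band,
                         temp_k=250.0, wl_lo=6.0, wl_hi=18.0, n=2400):
    in_band = out_band = 0.0
    for i in range(n):                   # midpoint rule over 6-18 um
        wl = wl_lo + (wl_hi - wl_lo) * (i + 0.5) / n
        w = primary(wl) * secondary(wl) * detector(wl) * planck(wl, temp_k)
        if band[0] <= wl <= band[1]:
            in_band += w
        else:
            out_band += w
    return out_band / (in_band + out_band)

def tophat(lo, hi, leak):
    """Idealised filter: unit transmission in [lo, hi], 'leak' elsewhere."""
    return lambda wl: 1.0 if lo <= wl <= hi else leak

frac = out_of_band_fraction(primary=tophat(11.0, 12.0, 1e-4),
                            secondary=tophat(10.5, 12.5, 1e-3),
                            detector=lambda wl: 1.0,
                            band=(11.0, 12.0))
```

The example shows how the blocking "builds up": where the primary and secondary stopbands overlap, their leakages multiply, so the combined rejection is far deeper than either filter achieves alone.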
Abstract:
We consider scattering of a time harmonic incident plane wave by a convex polygon with piecewise constant impedance boundary conditions. Standard finite or boundary element methods require the number of degrees of freedom to grow at least linearly with respect to the frequency of the incident wave in order to maintain accuracy. Extending earlier work by Chandler-Wilde and Langdon for the sound soft problem, we propose a novel Galerkin boundary element method, with the approximation space consisting of the products of plane waves with piecewise polynomials supported on a graded mesh with smaller elements closer to the corners of the polygon. Theoretical analysis and numerical results suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy grows only logarithmically with respect to the frequency of the incident wave.
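The graded mesh at the heart of the method can be sketched in a few lines. Algebraic grading with points x_j = L(j/N)^q is a standard choice for such corner-refined meshes; the exponent value below is an assumption for illustration, not taken from the paper:

```python
# Sketch of a corner-graded mesh of the kind described above (assumed
# algebraic grading, a standard construction): on a side [0, L] with the
# corner at 0, the points x_j = L * (j / N)**q cluster elements towards the
# corner, where the solution has singular behaviour.

def graded_mesh(length, n_elements, q=3.0):
    """Mesh points on [0, length], graded towards 0 with exponent q."""
    return [length * (j / n_elements) ** q for j in range(n_elements + 1)]

mesh = graded_mesh(length=1.0, n_elements=8, q=3.0)
sizes = [b - a for a, b in zip(mesh, mesh[1:])]   # element sizes grow away
                                                  # from the corner
```

In the hybrid method each basis function is such a piecewise polynomial multiplied by a plane wave carrying the oscillation, which is why accuracy can be held with a number of elements that grows only logarithmically in frequency.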
Abstract:
We consider the scattering of a time-harmonic acoustic incident plane wave by a sound soft convex curvilinear polygon with Lipschitz boundary. For standard boundary or finite element methods, with a piecewise polynomial approximation space, the number of degrees of freedom required to achieve a prescribed level of accuracy grows at least linearly with respect to the frequency of the incident wave. Here we propose a novel Galerkin boundary element method with a hybrid approximation space, consisting of the products of plane wave basis functions with piecewise polynomials supported on several overlapping meshes; a uniform mesh on illuminated sides, and graded meshes refined towards the corners of the polygon on illuminated and shadow sides. Numerical experiments suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy need only grow logarithmically as the frequency of the incident wave increases.
Abstract:
The role of the academic in the built environment seems generally to be not well understood or articulated. While this problem is not unique to our field, there are plenty of examples in a wide range of academic disciplines where the academic role has been fully articulated. But built environment academics have tended not to look beyond their own literature and their own vocational context in trying to give meaning to their academic work. The purpose of this keynote presentation is to explore the context of academic work generally, and the connections between education, research and practice in the built environment specifically. By drawing on ideas from the sociology of the professions, the role of universities, and the fundamentals of social science research, a case is made that helps to explain the kinds of problems that routinely obstruct academic progress in our field. This discussion reveals that while there are likely to be great weaknesses in much of what is published and taught in the built environment, it is not too great a stretch to provide a more robust understanding and a good basis for developing our field in a way that would enable us collectively to make a major contribution to theory-building and theory-testing, and to make a good stab at tackling some of the problems facing society at large. There is no reason to disregard the fundamental academic disciplines that underpin our knowledge of the built environment. If we contextualise our work in these more fundamental disciplines, there is every reason to think that we can have a much greater impact than we have experienced to date.
Abstract:
A new approach is presented to identify the number of incoming signals in antenna array processing. The new method exploits the inherent properties of the noise eigenvalues of the covariance matrix of the array output. A single threshold is established that incorporates information about signal and noise strength, data length, and array size. When subspace-based algorithms are adopted, the computational cost of the signal-number detector is almost negligible. The performance of the threshold is robust against low SNR and short data length.
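The eigenvalue structure the method exploits can be demonstrated numerically. The scenario and the simple threshold below are illustrative assumptions (the paper derives its own, more principled single threshold); the point is that with d sources, the d largest eigenvalues of the array covariance matrix stand well above a floor of nearly equal noise eigenvalues:

```python
import numpy as np

# Illustrative eigenvalue-based source counting (threshold is an assumption,
# not the paper's statistic): two plane-wave sources on an 8-element uniform
# linear array with half-wavelength spacing, plus white noise.

rng = np.random.default_rng(0)
m, n_snap, d = 8, 2000, 2                      # sensors, snapshots, sources

angles = np.deg2rad([10.0, 40.0])              # source directions
A = np.exp(1j * np.pi * np.outer(np.arange(m), np.sin(angles)))  # steering
S = rng.standard_normal((d, n_snap)) + 1j * rng.standard_normal((d, n_snap))
N = 0.1 * (rng.standard_normal((m, n_snap))
           + 1j * rng.standard_normal((m, n_snap)))
X = A @ S + N                                  # array snapshots

R = X @ X.conj().T / n_snap                    # sample covariance matrix
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]  # descending, real (Hermitian)

# Crude illustrative threshold: a multiple of the smallest eigenvalue,
# taken as an estimate of the noise floor.
threshold = 3.0 * eigvals[-1]
n_detected = int(np.sum(eigvals > threshold))  # estimated number of signals
```

In this well-conditioned scenario the gap between signal and noise eigenvalues is orders of magnitude, so even the crude threshold recovers the true count; the contribution of the paper is a threshold that remains reliable at low SNR and short data lengths, where the gap narrows.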
Abstract:
Methods have recently been developed that make use of electromagnetic radiation at terahertz (THz) frequencies, the region of the spectrum between millimetre wavelengths and the infrared, for imaging purposes. Radiation at these wavelengths is non-ionizing and subject to far less Rayleigh scatter than visible or infrared wavelengths, making it suitable for medical applications. This paper introduces THz pulsed imaging and discusses its potential for in vivo medical applications in comparison with existing modalities.