7 results for Worst-case dimensioning
in CentAUR: Central Archive University of Reading - UK
Abstract:
This paper presents a semi-synchronous pipeline scheme, here referred to as the single-pulse pipeline, for mapping pipelined circuits to a Field Programmable Gate Array (FPGA). Area and timing considerations are given for the general case and later applied to a systolic circuit as an illustration. The single-pulse pipeline can manage asynchronous worst-case data completion, and it is evaluated against two chosen asynchronous pipelining schemes: a four-phase bundled-data pipeline and a doubly-latched asynchronous pipeline. The semi-synchronous pipeline proposal takes less FPGA area and operates faster than the two selected fully asynchronous schemes for the FPGA case considered.
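As a loose aside (not part of the abstract), the general trade-off behind "worst-case data completion" can be illustrated with a toy timing model in Python: a pipeline clocked for the worst case pays the slowest datum's delay on every cycle, whereas a handshaking pipeline pays each datum's actual completion time. All delay figures below are invented and do not reflect the paper's FPGA measurements.

```python
import random

# Toy timing model: worst-case clocking vs per-datum completion (illustrative only).
random.seed(0)
completion_times = [random.uniform(2.0, 5.0) for _ in range(10_000)]  # ns per datum (made up)

worst_case_period = max(completion_times)                        # fixed clock must cover the slowest datum
avg_completion = sum(completion_times) / len(completion_times)   # handshake pays only the actual delay

print(f"worst-case clock period : {worst_case_period:.2f} ns")
print(f"mean completion time    : {avg_completion:.2f} ns")
```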
Abstract:
Investigating agroforestry systems that incorporate poultry is warranted in Northern Europe, as they may offer benefits including improved welfare and use of range; reduced feed costs; price premia on products; reduced payback periods for forests; and greater returns on investment. Free-range egg production accounts for 27% of the United Kingdom egg market, and demand for outdoor broilers is increasing. No research has been conducted recently on the economic viability of agroforestry systems with poultry. An economic model was constructed to assess the economic viability of a broiler agroforestry system, to investigate the sensitivity of economic performance to key factors and interactions, and to identify those which warrant attention in research and management. The system modelled is a commercial trial established in Southern England in 2002, where deciduous trees were planted and broilers reared in six- or nine-week periods. The model uses Monte Carlo simulation, and financial performance analyses are run over a 120-year period. An Internal Rate of Return (IRR) of 15.5% is predicted for the six-week system, which remains viable under a 'worst case' scenario (IRR of 12.6%). The factors that affect financial performance most (in decreasing order of magnitude) are the prices achieved for broilers; the costs of brooding houses, chicks, arks and feed; and timber prices. The main anticipated effects of biological interactions on financial performance (of increased ranging on feed conversion, and of excess nutrient supply on tree health) were not supported by the analysis. Further research is particularly warranted on the welfare benefits offered by the tree component and their relation to price premia.
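A minimal Python sketch of the kind of Monte Carlo IRR calculation the abstract describes; the cash-flow figures, distributions and the bisection-based IRR routine below are illustrative assumptions and are not taken from the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

def npv(rate, cashflows):
    """Net present value of a cash-flow series (year 0 first)."""
    years = np.arange(len(cashflows))
    return np.sum(cashflows / (1.0 + rate) ** years)

def irr(cashflows, lo=-0.5, hi=1.0, tol=1e-6):
    """Internal rate of return by bisection; assumes the NPV changes sign on [lo, hi]."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

def simulate_irr(n_draws=1000, horizon=120):
    """Monte Carlo IRR for a stylised poultry-agroforestry enterprise (all figures invented)."""
    irrs = np.empty(n_draws)
    for i in range(n_draws):
        broiler_margin = rng.normal(8_000, 1_500)    # annual net margin, GBP (assumed)
        timber_revenue = rng.normal(40_000, 10_000)  # one-off timber sale at the horizon (assumed)
        capital_cost = 50_000                        # houses, arks, planting (assumed)
        cashflows = np.full(horizon + 1, broiler_margin)
        cashflows[0] = -capital_cost
        cashflows[-1] += timber_revenue
        irrs[i] = irr(cashflows)
    return irrs

irrs = simulate_irr()
print(f"median IRR: {np.median(irrs):.1%}, 5th percentile: {np.percentile(irrs, 5):.1%}")
```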
Abstract:
We discuss the feasibility of wireless terahertz communications links deployed in a metropolitan area and model the large-scale fading of such channels. The model takes into account reception through direct line of sight, ground and wall reflection, as well as diffraction around a corner. The movement of the receiver is modeled by an autonomous dynamic linear system in state space, whereas the geometric relations involved in the attenuation and multipath propagation of the electric field are described by a static nonlinear mapping. A subspace algorithm in conjunction with polynomial regression is used to identify a single-output Wiener model from time-domain measurements of the field intensity when the receiver motion is simulated using a constant angular speed and an exponentially decaying radius. The identification procedure is validated by using the model to perform q-step ahead predictions. The sensitivity of the algorithm to small-scale fading, detector noise, and atmospheric changes is discussed. The performance of the algorithm is tested in the diffraction zone assuming a range of emitter frequencies (2, 38, 60, 100, 140, and 400 GHz). Extensions of the simulation results to situations where a more complicated trajectory describes the motion of the receiver are also implemented, providing information on the performance of the algorithm under a worst-case scenario. Finally, a sensitivity analysis to model parameters for the identified Wiener system is proposed.
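A minimal Python sketch of the Wiener model structure described above (an autonomous linear state-space block followed by a static polynomial output nonlinearity) and of a q-step ahead prediction; the matrices and polynomial coefficients are invented for illustration, and the subspace identification and polynomial regression steps themselves are not reproduced.

```python
import numpy as np

# Illustrative Wiener model: autonomous discrete-time linear dynamics + static polynomial output map.
A = np.array([[0.99, 0.05],
              [-0.05, 0.99]])          # slowly rotating, lightly damped dynamics (assumed)
C = np.array([1.0, 0.0])               # linear output map (assumed)
poly = np.array([0.2, -0.5, 1.0, 0.0]) # static nonlinearity, highest power first (assumed)

def simulate(x0, n_steps):
    """Run the autonomous linear system and pass its output through the static nonlinearity."""
    x = np.array(x0, dtype=float)
    y = np.empty(n_steps)
    for k in range(n_steps):
        z = C @ x                   # intermediate (unmeasured) linear output
        y[k] = np.polyval(poly, z)  # measured field intensity
        x = A @ x
    return y

def q_step_prediction(x_hat, q):
    """q-step ahead prediction of the measured output from a current state estimate x_hat."""
    x = np.array(x_hat, dtype=float)
    for _ in range(q):
        x = A @ x
    return float(np.polyval(poly, C @ x))

y = simulate([1.0, 0.0], 200)
print("5-step-ahead prediction from the initial state:", q_step_prediction([1.0, 0.0], 5))
```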
Abstract:
Several non-orthogonal space-time block coding (NO-STBC) schemes have recently been proposed to achieve full-rate transmission. Some of these schemes, however, suffer from weak robustness: their channel matrices become ill-conditioned in the case of highly correlated channels (HCC). To address this issue, this paper derives a family of robust NO-STBC schemes for four Tx antennas based on the worst case of HCC. These codes turn out to be a superset of Jafarkhani's quasi-orthogonal STBC codes. A computationally affordable linear decoder is also proposed. Although these codes achieve performance similar to that of the non-robust schemes under normal channel conditions, they offer strong robustness against HCC (though possibly at the cost of some performance). Finally, computer simulations are presented to verify the algorithm design.
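For reference, a short Python sketch of Jafarkhani's quasi-orthogonal STBC for four Tx antennas, of which the codes above are stated to be a superset; the robust NO-STBC construction itself and its linear decoder are not reproduced here, and the example symbols are arbitrary QPSK values.

```python
import numpy as np

def alamouti(s1, s2):
    """2x2 Alamouti block (rows = time slots, columns = Tx antennas)."""
    return np.array([[s1, s2],
                     [-np.conj(s2), np.conj(s1)]])

def jafarkhani_qostbc(s):
    """4x4 quasi-orthogonal STBC built from two Alamouti sub-blocks."""
    s1, s2, s3, s4 = s
    A12 = alamouti(s1, s2)
    A34 = alamouti(s3, s4)
    return np.block([[A12, A34],
                     [-np.conj(A34), np.conj(A12)]])

symbols = np.array([1 + 1j, 1 + 1j, 1 - 1j, 1 + 1j]) / np.sqrt(2)  # arbitrary QPSK symbols
C = jafarkhani_qostbc(symbols)
G = C.conj().T @ C
# The non-zero off-diagonal terms coupling (s1, s4) and (s2, s3) are the "quasi" in quasi-orthogonal.
print(np.round(G, 3))
```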
Abstract:
In the earth sciences, data are commonly cast on complex grids in order to model irregular domains such as coastlines, or to evenly distribute grid points over the globe. It is common for a scientist to wish to re-cast such data onto a grid that is more amenable to manipulation, visualization, or comparison with other data sources. The complexity of the grids presents a significant technical difficulty to the regridding process. In particular, the regridding of complex grids may suffer from severe performance issues, in the worst case scaling with the product of the sizes of the source and destination grids. We present a mechanism for the fast regridding of such datasets, based upon the construction of a spatial index that allows fast searching of the source grid. We discover that the most efficient spatial index under test (in terms of memory usage and query time) is a simple look-up table. A kd-tree implementation was found to be faster to build and to give similar query performance at the expense of a larger memory footprint. Using our approach, we demonstrate that regridding of complex data may proceed at speeds sufficient to permit regridding on-the-fly in an interactive visualization application, or in a Web Map Service implementation. For large datasets with complex grids the new mechanism is shown to significantly outperform algorithms used in many scientific visualization packages.
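A minimal Python sketch of kd-tree-based nearest-neighbour regridding in the spirit of the approach described above, using scipy.spatial.cKDTree; it ignores spherical geometry, uses a random stand-in for a curvilinear source grid, and implements the kd-tree variant rather than the look-up table the paper found most efficient.

```python
import numpy as np
from scipy.spatial import cKDTree

def regrid_nearest(src_lons, src_lats, src_values, dst_lons, dst_lats):
    """
    Nearest-neighbour regridding of a complex source grid onto a destination grid
    using a kd-tree spatial index, so each destination point is answered in roughly
    O(log N) time rather than by a brute-force scan of the whole source grid.
    """
    src_points = np.column_stack([src_lons.ravel(), src_lats.ravel()])
    tree = cKDTree(src_points)                      # build the index once per source grid
    dst_points = np.column_stack([dst_lons.ravel(), dst_lats.ravel()])
    _, idx = tree.query(dst_points)                 # nearest source cell for each target point
    return src_values.ravel()[idx].reshape(dst_lons.shape)

# Illustrative use with a small random stand-in for a curvilinear source grid.
rng = np.random.default_rng(1)
src_lons = rng.uniform(-10, 10, size=(50, 60))
src_lats = rng.uniform(40, 60, size=(50, 60))
src_vals = np.hypot(src_lons, src_lats)
dst_lons, dst_lats = np.meshgrid(np.linspace(-10, 10, 200), np.linspace(40, 60, 100))
out = regrid_nearest(src_lons, src_lats, src_vals, dst_lons, dst_lats)
print(out.shape)  # (100, 200)
```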
Abstract:
By employing Moody’s corporate default and rating transition data spanning the last 90 years, we explore how much capital banks should hold against their corporate loan portfolios to withstand historical stress scenarios. Specifically, we focus on the worst-case scenario over the observation period, the Great Depression. We find that migration risk and the length of the investment horizon are critical factors when determining bank capital needs in a crisis. We show that capital may need to rise more than three times when the horizon is increased from 1 year, as required by current and future regulation, to 3 years. Increases are still important, but of a lower magnitude, when migration risk is introduced into the analysis. Further, we find that the new bank capital requirements under the so-called Basel 3 agreement would enable banks to absorb Great Depression-style losses. However, such losses would dent regulatory capital considerably, and far beyond the capital buffers that have been proposed to ensure that banks survive crisis periods without government support.
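A back-of-envelope Python sketch of why lengthening the horizon matters: cumulative default probabilities grow with powers of a rating transition matrix. The three-state matrix, loss given default and portfolio weights below are invented rather than Moody's data, and the sketch ignores the mark-to-market migration losses the paper also analyses.

```python
import numpy as np

# Illustrative one-year transition matrix over three states:
# investment grade, speculative grade, default (absorbing). All values invented.
P = np.array([[0.92, 0.07, 0.01],
              [0.05, 0.80, 0.15],
              [0.00, 0.00, 1.00]])

lgd = 0.45                            # loss given default (assumed)
weights = np.array([0.7, 0.3, 0.0])   # portfolio weights by initial rating (assumed)

def cumulative_default_prob(P, horizon):
    """Cumulative default probability per starting rating after `horizon` years."""
    return np.linalg.matrix_power(P, horizon)[:, -1]

for horizon in (1, 3):
    pd = cumulative_default_prob(P, horizon)
    expected_loss = lgd * weights @ pd
    print(f"{horizon}-year horizon: expected portfolio loss = {expected_loss:.2%}")
```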
Abstract:
A programmable data acquisition system to allow novel use of meteorological radiosondes for atmospheric science measurements is described. In its basic form it supports four analogue inputs at 16-bit resolution, and up to two further inputs at lower resolution that can instead be configured for digital instruments. It also provides multiple instrument power supplies (+8 V, +16 V, +5 V and -8 V) from the 9 V radiosonde battery. During a balloon flight encountering air temperatures from +17 °C to -66 °C, the worst-case voltage drift in the 5 V unipolar digitisation circuitry was 20 mV. The system opens up a new range of low-cost atmospheric research measurements by utilising radiosondes routinely launched internationally for weather forecasting purposes. No additional receiving equipment is required. Comparisons between the specially instrumented and standard meteorological radiosondes show a negligible effect of the additional instrumentation on the standard meteorological data.
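A quick back-of-envelope check (not from the paper) of what the reported 20 mV worst-case drift means for a 5 V, 16-bit unipolar digitisation stage, in Python:

```python
# Resolution of a 5 V, 16-bit unipolar stage versus a 20 mV worst-case drift.
full_scale_v = 5.0
bits = 16
lsb_v = full_scale_v / (2 ** bits)   # about 76 microvolts per count
drift_v = 0.020

print(f"1 LSB = {lsb_v * 1e6:.1f} uV")
print(f"20 mV drift = {drift_v / lsb_v:.0f} counts = {drift_v / full_scale_v:.2%} of full scale")
```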