17 results for Worst-case execution-time
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
Several non-orthogonal space-time block coding (NO-STBC) schemes have recently been proposed to achieve full-rate transmission. Some of these schemes, however, suffer from weak robustness: their channel matrices become ill-conditioned in the case of highly correlated channels (HCC). To address this issue, this paper derives a family of robust NO-STBC schemes for four Tx antennas based on the worst case of HCC. These codes turn out to be a superset of Jafarkhani's quasi-orthogonal STBC codes. A computationally affordable linear decoder is also proposed. Although these codes achieve performance similar to that of the non-robust schemes under normal channel conditions, they offer strong robustness against HCC, albeit at a possible cost in performance. Finally, computer simulations are presented to verify the algorithm design.
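For context, the quasi-orthogonal baseline that these robust codes generalise can be sketched as follows. The snippet builds Jafarkhani's 4x4 quasi-orthogonal code from two Alamouti blocks and inspects its Gram matrix; the paper's robust NO-STBC family itself is not reproduced here, and the symbol values are arbitrary:

```python
import numpy as np

def alamouti(sa, sb):
    """2x2 Alamouti block: rows are time slots, columns are antennas."""
    return np.array([[sa, sb],
                     [-np.conj(sb), np.conj(sa)]])

def jafarkhani(s1, s2, s3, s4):
    """Quasi-orthogonal 4x4 code C = [[A12, A34], [-conj(A34), conj(A12)]]."""
    a12, a34 = alamouti(s1, s2), alamouti(s3, s4)
    return np.block([[a12, a34],
                     [-np.conj(a34), np.conj(a12)]])

# Four QPSK-like symbols over four antennas and four time slots.
C = jafarkhani(1 + 1j, 1 - 1j, 1 + 1j, 1 + 1j)
# The Gram matrix is diagonal except for real terms in the (1,4) and
# (2,3) positions -- the residual coupling is the "quasi" in
# quasi-orthogonal, and is why decoding decouples into symbol pairs.
print(np.round(C.conj().T @ C, 3))
```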
Abstract:
We discuss the feasibility of wireless terahertz communications links deployed in a metropolitan area and model the large-scale fading of such channels. The model takes into account reception through direct line of sight, ground and wall reflection, as well as diffraction around a corner. The movement of the receiver is modeled by an autonomous dynamic linear system in state space, whereas the geometric relations involved in the attenuation and multipath propagation of the electric field are described by a static nonlinear mapping. A subspace algorithm, in conjunction with polynomial regression, is used to identify a single-output Wiener model from time-domain measurements of the field intensity when the receiver motion is simulated using a constant angular speed and an exponentially decaying radius. The identification procedure is validated by using the model to perform q-step-ahead predictions. The sensitivity of the algorithm to small-scale fading, detector noise, and atmospheric changes is discussed. The performance of the algorithm is tested in the diffraction zone, assuming a range of emitter frequencies (2, 38, 60, 100, 140, and 400 GHz). Extensions of the simulation results to situations where a more complicated trajectory describes the motion of the receiver are also implemented, providing information on the performance of the algorithm under a worst-case scenario. Finally, a sensitivity analysis to model parameters for the identified Wiener system is proposed.
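As a rough illustration of the model class being identified, the sketch below simulates a single-output Wiener system (a linear state-space block followed by a static polynomial nonlinearity) and forms a q-step-ahead prediction by iterating the linear part. All matrices and the polynomial are invented placeholders, not values identified from measurements:

```python
import numpy as np

# Illustrative Wiener model: linear state space + static polynomial output.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])               # state transition (assumed)
B = np.array([[1.0], [0.5]])             # input matrix (assumed)
C = np.array([[1.0, -0.4]])              # linear output map (assumed)
poly = np.array([0.02, -0.1, 1.0, 0.0])  # static cubic nonlinearity (assumed)

def simulate(u, x0=None):
    """Run the Wiener model over an input sequence u."""
    x, y = (np.zeros(2) if x0 is None else x0.copy()), []
    for uk in u:
        z = (C @ x).item()               # intermediate linear output
        y.append(np.polyval(poly, z))    # pass through the nonlinearity
        x = A @ x + (B * uk).ravel()
    return np.array(y)

def q_step_prediction(x, u_future):
    """q-step-ahead prediction: iterate the linear part q times, then
    apply the static nonlinearity to the final linear output."""
    for uk in u_future:                  # len(u_future) == q
        x = A @ x + (B * uk).ravel()
    return np.polyval(poly, (C @ x).item())

u = np.sin(0.1 * np.arange(200))         # toy excitation signal
print(simulate(u)[:5])
```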
Abstract:
This paper presents a semi-synchronous pipeline scheme, referred to here as the single-pulse pipeline, for mapping pipelined circuits to a Field Programmable Gate Array (FPGA). Area and timing considerations are given for the general case and then applied to a systolic circuit as an illustration. The single-pulse pipeline can manage asynchronous worst-case data completion, and it is evaluated against two asynchronous pipelining schemes: a four-phase bundled-data pipeline and a doubly-latched asynchronous pipeline. For the FPGA case studied, the semi-synchronous proposal takes less FPGA area and operates faster than the two fully-asynchronous schemes.
Abstract:
In the earth sciences, data are commonly cast on complex grids in order to model irregular domains such as coastlines, or to evenly distribute grid points over the globe. It is common for a scientist to wish to re-cast such data onto a grid that is more amenable to manipulation, visualization, or comparison with other data sources. The complexity of the grids presents a significant technical difficulty to the regridding process. In particular, the regridding of complex grids may suffer from severe performance issues, in the worst case scaling with the product of the sizes of the source and destination grids. We present a mechanism for the fast regridding of such datasets, based upon the construction of a spatial index that allows fast searching of the source grid. We discover that the most efficient spatial index under test (in terms of memory usage and query time) is a simple look-up table. A kd-tree implementation was found to be faster to build and to give similar query performance at the expense of a larger memory footprint. Using our approach, we demonstrate that regridding of complex data may proceed at speeds sufficient to permit regridding on-the-fly in an interactive visualization application, or in a Web Map Service implementation. For large datasets with complex grids the new mechanism is shown to significantly outperform algorithms used in many scientific visualization packages.
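The look-up-table index found to be most efficient can be sketched as follows: bucket the source-grid points into uniform cells, then answer each destination-point query by searching only nearby cells. Grid sizes, cell width, and the fixed 3x3 search neighbourhood are simplifying assumptions for illustration:

```python
import numpy as np
from collections import defaultdict

# Bucket an irregular source grid into uniform cells (the look-up table).
rng = np.random.default_rng(42)
src = rng.uniform(0, 100, size=(50_000, 2))   # scattered source points
cell = 1.0                                    # cell width (assumed)

table = defaultdict(list)
for i, (x, y) in enumerate(src):
    table[(int(x // cell), int(y // cell))].append(i)

def nearest_source_point(px, py):
    """Nearest source point to (px, py) via the look-up table."""
    cx, cy = int(px // cell), int(py // cell)
    best, best_d2 = -1, np.inf
    for dx in (-1, 0, 1):                     # 3x3 neighbourhood search;
        for dy in (-1, 0, 1):                 # a full implementation widens
            for i in table.get((cx + dx, cy + dy), ()):  # the ring if empty
                d2 = (src[i, 0] - px) ** 2 + (src[i, 1] - py) ** 2
                if d2 < best_d2:
                    best, best_d2 = i, d2
    return best

print(nearest_source_point(50.3, 61.7))
```

Because each query touches only a handful of cells, the cost of regridding scales with the destination grid size rather than with the product of the two grid sizes.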
Abstract:
Investigating agroforestry systems that incorporate poultry is warranted in Northern Europe, as they may offer benefits including improved welfare and use of range; reduced feed costs; price premia on products; reduced payback periods for forests; and greater returns on investment. Free-range egg production accounts for 27% of the United Kingdom egg market, and demand for outdoor broilers is increasing. No research has been conducted recently on the economic viability of agroforestry systems with poultry. An economic model was constructed to assess the economic viability of a broiler agroforestry system, to investigate the sensitivity of economic performance to key factors and interactions, and to identify those factors which warrant attention in research and management. The system modelled is a commercial trial established in Southern England in 2002, where deciduous trees were planted and broilers reared in six- or nine-week periods. The model uses Monte Carlo simulation, and financial performance analyses run for a 120-year period. An Internal Rate of Return (IRR) of 15.5% is predicted for the six-week system, which remains viable under a 'worst-case' scenario (IRR of 12.6%). The factors that affect financial performance most (in decreasing order of magnitude) are the prices achieved for broilers; the costs of brooding houses, chicks, arks, and feed; and timber prices. The main anticipated effects of biological interactions on financial performance (increased ranging on feed conversion, and excess nutrient supply on tree health) were not supported by the analysis. Further research is particularly warranted on the welfare benefits offered by the tree component and their relation to price premia.
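The flavour of this kind of Monte Carlo IRR analysis can be sketched as below: draw uncertain prices and costs, build a cash-flow series, and solve for the IRR by root-finding on the NPV. Every number here is a hypothetical placeholder, not a value from the paper's 120-year model:

```python
import numpy as np

rng = np.random.default_rng(1)

def irr(cashflows, lo=-0.9, hi=1.0, tol=1e-6):
    """IRR by bisection on the NPV (assumes the root lies in (lo, hi))."""
    def npv(r):
        return sum(cf / (1 + r) ** t for t, cf in enumerate(cashflows))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

irrs = []
for _ in range(1_000):
    broiler_price = rng.normal(3.0, 0.4)   # price per bird (assumed)
    feed_cost = rng.normal(1.2, 0.2)       # feed cost per bird (assumed)
    setup = 50_000                         # houses, arks, trees (assumed)
    # Annual margin on an assumed 20,000 birds/year, other costs fixed.
    margin = max(broiler_price - feed_cost - 0.8, 0.0) * 20_000
    cashflows = [-setup] + [margin] * 20   # 20-year horizon for brevity
    irrs.append(irr(cashflows))

print(f"median IRR: {np.median(irrs):.1%}")
```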
Abstract:
Pull pipelining, a pipeline technique in which data is pulled by successor stages from predecessor stages, is proposed. Control circuits using a synchronous, a semi-synchronous, and an asynchronous approach are given. Simulation examples for a DLX generic RISC datapath show that common control-pipeline circuit overhead is avoided using the proposal. Applications to linear systolic arrays, in cases where computation finishes at early stages in the array, are foreseen. This would allow run-time, data-driven digital frequency modulation of synchronous pipelined designs, with application to implementing algorithms that exhibit average-case processing time using a synchronous approach.
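The average-case advantage mentioned at the end can be made concrete with a toy timing model: a synchronous pipeline must clock every stage at the worst-case delay, whereas data-driven completion lets items advance as soon as a stage actually finishes. The delay distribution below is an illustrative assumption only:

```python
import numpy as np

rng = np.random.default_rng(7)
n_items, n_stages = 10_000, 5
# Per-item, per-stage delays: usually fast, occasionally slow.
fast = rng.uniform(1.0, 2.0, (n_items, n_stages))
slow = rng.uniform(8.0, 10.0, (n_items, n_stages))
delays = np.where(rng.random((n_items, n_stages)) < 0.9, fast, slow)

# Synchronous: one item completes per worst-case clock period.
synchronous_time = n_items * delays.max()
# Data-driven: long-run throughput tracks the mean stage delay
# (a simplification assuming enough buffering to hide stage blocking).
data_driven_time = n_items * delays.mean()

print(f"synchronous (worst-case clock): {synchronous_time:,.0f}")
print(f"data-driven  (average-case):    {data_driven_time:,.0f}")
```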
Abstract:
Driven by a range of modern applications that includes telecommunications, e-business and on-line social interaction, recent ideas in complex networks can be extended to the case of time-varying connectivity. Here we propose a general framework for modelling and simulating such dynamic networks, and we explain how the long-time behaviour may reveal important information about the mechanisms underlying the evolution.
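A minimal simulation in the spirit of such a framework: edges switch on and off stochastically at each step, and the long-time behaviour (here, the settling of the mean degree) reflects the underlying mechanism. The specific update rule is an invented example, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(3)
n, steps = 100, 2_000
adj = np.zeros((n, n), dtype=bool)
iu = np.triu_indices(n, k=1)         # work on the upper triangle only
mean_degree = []

for _ in range(steps):
    deg = adj.sum(axis=1)
    # Absent edges appear with probability growing with the endpoints'
    # degrees (a mild preferential flavour); edges die at a fixed rate.
    p_birth = 0.01 * (1 + deg[iu[0]] + deg[iu[1]]) / n
    edges = adj[iu]
    birth = ~edges & (rng.random(edges.size) < p_birth)
    death = edges & (rng.random(edges.size) < 0.02)
    edges = (edges | birth) & ~death
    adj = np.zeros((n, n), dtype=bool)
    adj[iu] = edges
    adj |= adj.T                     # keep the graph undirected
    mean_degree.append(adj.sum() / n)

# Long-time behaviour: mean degree settles where births balance deaths.
print(f"late-time mean degree: {np.mean(mean_degree[-200:]):.2f}")
```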
Abstract:
Clusters of computers can be used together to provide a powerful computing resource. Large Monte Carlo simulations, such as those used to model particle growth, are computationally intensive and take considerable time to execute on conventional workstations. By spreading the work of the simulation across a cluster of computers, the elapsed execution time can be greatly reduced. A user thus obtains, in effect, the performance of a supercomputer by using the spare cycles of other workstations.
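The work-spreading idea maps directly onto any pool of workers. The sketch below farms independent Monte Carlo batches out to worker processes, standing in for workstations in a cluster; the "particle growth" step is a random-walk placeholder, not the paper's simulation:

```python
import numpy as np
from multiprocessing import Pool

def run_trials(args):
    seed, n_trials = args
    rng = np.random.default_rng(seed)
    # Placeholder batch: cumulative random "growth" steps per trial.
    sizes = np.abs(rng.normal(0, 1, size=(n_trials, 500))).cumsum(axis=1)
    return sizes[:, -1].mean()        # mean final particle size

if __name__ == "__main__":
    n_workers, trials_each = 8, 2_000
    with Pool(n_workers) as pool:
        # Distinct seeds keep each worker's random stream independent.
        results = pool.map(run_trials,
                           [(s, trials_each) for s in range(n_workers)])
    print(f"pooled estimate: {np.mean(results):.3f}")
```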
Abstract:
By employing Moody's corporate default and rating-transition data spanning the last 90 years, we explore how much capital banks should hold against their corporate loan portfolios to withstand historical stress scenarios. Specifically, we focus on the worst-case scenario over the observation period, the Great Depression. We find that migration risk and the length of the investment horizon are critical factors when determining bank capital needs in a crisis. We show that capital may need to rise more than three times when the horizon is increased from 1 year, as required by current and future regulation, to 3 years. Increases are still substantial, but of a lower magnitude, when migration risk is introduced into the analysis. Further, we find that the new bank capital requirements under the so-called Basel III agreement would enable banks to absorb Great Depression-style losses. But such losses would dent regulatory capital considerably, and far beyond the capital buffers that have been proposed to ensure that banks survive crisis periods without government support.
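The role of migration risk and horizon length can be illustrated with a toy rating-transition (Markov) matrix: chaining one-year transitions shows how cumulative default probabilities grow faster than linearly with the horizon. The matrix below is an invented illustration, not Moody's data:

```python
import numpy as np

# States: Investment grade, Speculative grade, Default (absorbing).
P = np.array([[0.95, 0.04, 0.01],
              [0.05, 0.85, 0.10],
              [0.00, 0.00, 1.00]])   # hypothetical one-year transitions

for horizon in (1, 3):
    Ph = np.linalg.matrix_power(P, horizon)   # chain the annual matrix
    print(f"{horizon}-year PD: IG {Ph[0, 2]:.3f}, SG {Ph[1, 2]:.3f}")
```

With these toy numbers the investment-grade default probability roughly quadruples from the 1-year to the 3-year horizon, because downgraded names carry their higher default risk into later years — the migration effect the abstract describes.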
Abstract:
A programmable data-acquisition system that allows novel use of meteorological radiosondes for atmospheric science measurements is described. In its basic form it supports four analogue inputs at 16-bit resolution, and up to two further inputs at lower resolution, configurable instead for digital instruments. It also provides multiple instrument power supplies (+8 V, +16 V, +5 V and -8 V) from the 9 V radiosonde battery. During a balloon flight encountering air temperatures from +17°C to -66°C, the worst-case voltage drift in the 5 V unipolar digitisation circuitry was 20 mV. The system enables a new range of low-cost atmospheric research measurements by utilising radiosondes routinely launched internationally for weather-forecasting purposes. No additional receiving equipment is required. Comparisons between the specially instrumented and standard meteorological radiosondes show negligible effect of the additional instrumentation on the standard meteorological data.
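To put the quoted drift figure in context, here is a quick worked check (plain arithmetic from the numbers above) of what 20 mV represents for a 16-bit conversion over a 5 V unipolar range:

```python
# One code step of a 16-bit digitiser spanning 5 V, and the quoted
# worst-case drift expressed in counts and as a fraction of full scale.
full_scale_v = 5.0
bits = 16
lsb_v = full_scale_v / (2 ** bits)      # ~76.3 uV per count
drift_v = 0.020                         # worst-case drift over the flight

print(f"LSB = {lsb_v * 1e6:.1f} uV")
print(f"drift = {drift_v / lsb_v:.0f} counts "
      f"({100 * drift_v / full_scale_v:.2f}% of full scale)")
```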
Abstract:
This technique paper describes a novel method for quantitatively and routinely identifying auroral breakup following substorm onset using the Time History of Events and Macroscale Interactions During Substorms (THEMIS) all-sky imagers (ASIs). Substorm onset is characterised by a brightening of the aurora that is followed by auroral poleward expansion and auroral breakup. This breakup can be identified by a sharp increase in the auroral intensity i(t) and in its time derivative i'(t). Utilising both i(t) and i'(t), we have developed an algorithm for identifying the time interval and spatial location of auroral breakup during the substorm expansion phase within the field of view of ASI data, based solely on quantifiable characteristics of the optical auroral emissions. We compare the time interval determined by the algorithm to independently identified auroral onset times from three previously published studies. In each case the interval determined by the algorithm is within error of the onset identified independently by the prior studies. We further show the utility of the algorithm by comparing the breakup intervals it determines to an independent list of substorm onset times, and demonstrate that up to 50% of these intervals are within the uncertainty of the times in the independent list. The quantitative description and routine identification of an interval of auroral brightening during the substorm expansion phase provides a foundation for unbiased statistical analysis of the aurora, and a new scientific tool for identifying the processes leading to auroral substorm onset.
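The core detection rule can be sketched as follows: flag the breakup interval where both i(t) and its derivative i'(t) exceed thresholds derived from a quiet-time baseline. The synthetic signal and 3-sigma thresholds are illustrative assumptions, not the calibrated criteria of the paper:

```python
import numpy as np

# Synthetic intensity: noisy quiet background, then a sharp brightening.
t = np.arange(0, 600, 3.0)                    # seconds, ASI-like cadence
i_t = 100 + 5 * np.random.default_rng(0).normal(size=t.size)
i_t[120:] += 400 * (1 - np.exp(-(t[120:] - t[120]) / 60.0))

di_dt = np.gradient(i_t, t)                   # i'(t)
quiet_i, quiet_di = i_t[:100], di_dt[:100]    # pre-onset baseline window
mask = ((i_t > quiet_i.mean() + 3 * quiet_i.std()) &
        (di_dt > quiet_di.mean() + 3 * quiet_di.std()))

onset_idx = np.argmax(mask)                   # first flagged sample
print(f"breakup flagged at t = {t[onset_idx]:.0f} s")
```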
Abstract:
This paper presents a software-based study of a hardware-based non-sorting median-calculation method on a set of integer numbers. The method divides the binary representation of each integer element in the set into bit slices in order to find the element located in the middle position. The method exhibits a linear complexity order, and our analysis shows that the best execution-time performance is obtained when slices 4 bits in size are used for 8-bit and 16-bit integers, for almost any data-set size. Results suggest that a software implementation of the bit-slice method for median calculation outperforms sorting-based methods, with the improvement growing as the data set becomes larger. For data-set sizes of N > 5, our simulations show an improvement of at least 40%.
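The paper targets hardware, but the slicing idea can be sketched in software: select the middle-ranked element by scanning fixed-width bit slices from the most significant slice down, keeping only the bucket that contains the median's rank. This is a generic radix-style selection sketch consistent with the description, not the authors' implementation:

```python
def bitslice_median(data, slice_bits=4, word_bits=16):
    """Median of non-negative integers without sorting: narrow down the
    element of middle rank by scanning fixed-width bit slices from the
    most significant slice to the least significant."""
    k = len(data) // 2                      # 0-based rank of the median
    candidates = list(data)
    shift = word_bits
    while shift > 0:
        shift -= slice_bits
        buckets = [[] for _ in range(1 << slice_bits)]
        for x in candidates:                # one linear counting pass
            buckets[(x >> shift) & ((1 << slice_bits) - 1)].append(x)
        for b in buckets:                   # find the bucket holding rank k
            if k < len(b):
                candidates = b
                break
            k -= len(b)
    return candidates[0]

print(bitslice_median([5, 1, 9, 3, 7]))     # -> 5
```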
Abstract:
This paper describes a fast integer sorting algorithm, referred to herein as Bit-index sort, which is a non-comparison sorting algorithm for partial permutations with linear complexity order in execution time. Bit-index sort uses a bit array to classify input sequences of distinct integers, and exploits built-in bit functions in C compilers, supported by machine hardware, to retrieve the ordered output sequence. Results show that Bit-index sort outperforms quicksort and counting sort in execution time. A parallel approach for Bit-index sort using two simultaneous threads is included, achieving speedups of up to 1.6.
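A minimal software sketch of the bit-array idea, for sequences of distinct non-negative integers: mark one bit per key, then read the set bits back in ascending order. Python's arbitrary-precision integers stand in here for the bit array; in C the retrieval pass would use hardware-supported builtins such as __builtin_ctz. This illustrates the classification/retrieval idea, not the authors' code:

```python
def bit_index_sort(seq):
    bits = 0
    for x in seq:                 # classification pass: mark each key
        bits |= 1 << x
    out = []
    while bits:                   # retrieval pass: pop lowest set bit
        low = bits & -bits        # isolates the lowest set bit
        out.append(low.bit_length() - 1)
        bits ^= low
    return out

print(bit_index_sort([42, 7, 19, 3, 99]))   # -> [3, 7, 19, 42, 99]
```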
Abstract:
Motivated by a matched case-control study to investigate potential risk factors for meningococcal disease amongst adolescents, we consider the analysis of matched case-control studies where disease incidence, and possibly other risk factors, vary with time of year. For the cases, the time of infection may be recorded. For controls, however, the recorded time is simply the time of data collection, which is shortly after the time of infection for the matched case, and so depends on the latter. We show that the effects of risk factors and their interactions may be adjusted for the time-of-year effect in a standard conditional logistic regression analysis without introducing any bias. We also show that, if the time delay between data collection for cases and controls is constant, then provided this delay is not very short, estimates of the time-of-year effect are approximately unbiased. When the length of the delay varies over time, the estimate of the time-of-year effect is biased; we obtain an approximate expression for the degree of bias in this case. Copyright © 2004 John Wiley & Sons, Ltd.
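A sketch of the kind of adjusted analysis described, on entirely synthetic data, using statsmodels' conditional logistic regression with the time-of-year effect encoded as sine/cosine harmonics (an assumed encoding, not necessarily the paper's parameterisation):

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

# Synthetic 1:1 matched case-control data; no real risk factors implied.
rng = np.random.default_rng(0)
n_sets = 300
day = rng.integers(0, 365, size=2 * n_sets)   # recorded day of year
exposure = rng.binomial(1, 0.3, size=2 * n_sets).astype(float)
case = np.tile([1, 0], n_sets)                # case listed first per set
strata = np.repeat(np.arange(n_sets), 2)      # matched-set identifier

X = pd.DataFrame({
    "exposure": exposure,
    # Seasonal (time-of-year) terms vary within each matched set because
    # the control is collected after the case, so they are estimable.
    "sin_t": np.sin(2 * np.pi * day / 365.0),
    "cos_t": np.cos(2 * np.pi * day / 365.0),
})
res = ConditionalLogit(case, X, groups=strata).fit()
print(res.summary())
```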
The sequential analysis of repeated binary responses: a score test for the case of three time points
Abstract:
In this paper a robust method is developed for the analysis of data consisting of repeated binary observations taken at up to three fixed time points on each subject. The primary objective is to compare outcomes at the last time point, using earlier observations to predict this for subjects with incomplete records. A score test is derived. The method is developed for application to sequential clinical trials, as at interim analyses there will be many incomplete records occurring in non-informative patterns. Motivation for the methodology comes from experience with clinical trials in stroke and head injury, and data from one such trial is used to illustrate the approach. Extensions to more than three time points and to allow for stratification are discussed. Copyright © 2005 John Wiley & Sons, Ltd.