7 results for Safe Minimum Standard
in CentAUR: Central Archive University of Reading - UK
Abstract:
In this article, I study the impacts of a specific incentives-based approach to safety regulation, namely the control of quality through sampling and threatening penalties when quality fails to meet some minimum standard. The welfare-improving impacts of this type of scheme seem high and are cogently illustrated in a recent contribution by Segerson, which stimulated many of the ideas in this paper. For this reason, the reader is referred to Segerson for a background on some of the motivation, and throughout, I make an effort to indicate differences between the two approaches. There are three major differences. First, I dispense with the calculus as much as possible, seeking readily interpreted, closed-form solutions to illustrate the main ideas. Second, (strategically optimal, symmetric) Nash equilibria are the mainstay of each of the current models. Third, in the uncertain-quality-provision equilibria, each of the Nash suppliers chooses the level of the lower bound for quality as a control and offers a draw from its (private) distribution in a contribution to the (public) pool of quality.
Abstract:
A beamforming algorithm is introduced based on a general objective function that approximates the bit error rate for wireless systems with binary phase shift keying and quadrature phase shift keying modulation schemes. The proposed minimum approximate bit error rate (ABER) beamforming approach does not rely on the Gaussian assumption for the channel noise and therefore remains applicable when the channel noise is non-Gaussian. The simulation results show that the proposed minimum ABER solution improves on the standard minimum mean square error (MMSE) beamforming solution, in terms of a lower achievable system bit error rate.
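The minimum-BER family of beamformers that this abstract belongs to typically replaces the exact bit error rate with a smooth, kernel-based estimate that can be minimised numerically. The sketch below illustrates that general idea for BPSK only; the function name, array shapes, and the Gaussian kernel choice are illustrative assumptions, not the paper's notation or its specific ABER objective.

```python
import numpy as np
from math import erfc, sqrt

def approx_ber(w, X, b, rho):
    """Kernel-smoothed BER estimate for a BPSK beamformer (illustrative).

    w   : complex weight vector, length M
    X   : received snapshots, shape (K, M)
    b   : transmitted bits (+1 / -1), length K
    rho : kernel width (smoothing parameter)

    A decision is correct when b * Re(w^H x) > 0; smoothing the
    indicator of the error event with a Gaussian kernel gives a
    differentiable surrogate for the empirical BER.
    """
    y = np.real(X @ np.conj(w))   # beamformer soft output per snapshot
    signed = b * y                # positive => correct decision
    # Gaussian-kernel estimate of P(signed < 0)
    return np.mean([0.5 * erfc(s / (sqrt(2) * rho)) for s in signed])
```

A gradient-based search over `w` against this surrogate is what makes "minimum approximate BER" designs practical, since the true BER is a non-smooth function of the weights.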
Abstract:
Smooth trajectories are essential for safe interaction between a human and a haptic interface, and different methods and strategies have been introduced to create them. This paper studies the creation of human-like movements in haptic interfaces, based on the study of human arm motion. These motions are intended to retrain the upper-limb movements of patients who have lost manipulation functions following a stroke. We present a model that uses higher-degree polynomials to define a trajectory and control the robot arm to achieve minimum-jerk movements. It also studies different methods that can be derived from polynomials to create more realistic human-like movements for therapeutic purposes.
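The minimum-jerk movements mentioned above are conventionally generated with the classic fifth-order polynomial of Flash and Hogan, which has zero velocity and acceleration at both endpoints. The sketch below shows that standard profile as a point of reference; it is not the paper's specific higher-degree model, and the function name and sampling are assumptions.

```python
import numpy as np

def minimum_jerk(x0, xf, T, n=100):
    """Classic fifth-order minimum-jerk position profile.

    Moves from x0 to xf over duration T, sampled at n points.
    The blend s(tau) = 10 tau^3 - 15 tau^4 + 6 tau^5 satisfies
    s(0)=0, s(1)=1 with zero first and second derivatives at
    both ends, which is what makes the motion jerk-minimal.
    """
    t = np.linspace(0.0, T, n)
    tau = t / T                                # normalised time in [0, 1]
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return x0 + (xf - x0) * s

# Example: a 0.3 m reach over 2 seconds
x = minimum_jerk(0.0, 0.3, 2.0)
```

Higher-degree polynomials, as in the paper, add free coefficients that can shape via-points or asymmetric velocity profiles while keeping the same smooth boundary conditions.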
Abstract:
Adaptive filters used in code division multiple access (CDMA) receivers to counter interference have been formulated both with and without the assumption of training symbols being transmitted. They are known as training-based and blind detectors respectively. We show that the convergence behaviour of the blind minimum-output-energy (MOE) detector can be quite easily derived, unlike what was implied by the procedure outlined in a previous paper. The simplification results from the observation that the correlation matrix determining convergence performance can be made symmetric, after which many standard results from the literature on least mean square (LMS) filters apply immediately.
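The standard LMS result the abstract alludes to is that convergence is governed by the eigenvalues of the (symmetric) input correlation matrix R = E[x xᵀ], with stability requiring a step size μ < 2/λ_max(R). The toy system-identification loop below illustrates that textbook behaviour; the dimensions, step size, and the noise-free setup are illustrative assumptions, not the MOE detector itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy LMS adaptation. With white input, R = I and lambda_max = 1,
# so any mu well below 2 converges; the same eigenvalue analysis
# carries over once a detector's correlation matrix is symmetric.
N, steps, mu = 4, 5000, 0.01
w_true = rng.standard_normal(N)   # hypothetical system to identify
w = np.zeros(N)
for _ in range(steps):
    x = rng.standard_normal(N)    # white input snapshot
    d = w_true @ x                # desired response (noise-free for clarity)
    e = d - w @ x                 # a-priori error
    w += mu * e * x               # LMS weight update
```

The point of the abstract is precisely that, after symmetrising the MOE detector's correlation matrix, this kind of off-the-shelf LMS convergence analysis applies without the more involved derivation of the earlier paper.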
Abstract:
The ability to undertake repeat measurements of flow-mediated dilatation (FMD) within a short time of a previous measurement would be useful to improve accuracy or to repeat a failed initial procedure. Although standard methods state that a minimum of 10 min is required between measurements, there are no published data to support this. Thirty healthy volunteers had five FMD measurements performed within a 2-h period, separated by various time intervals (5, 15 and 30 min). In 19 volunteers, FMD was also performed as soon as the vessel had returned to its baseline diameter. There was no significant difference between any of the FMD measurements or parameters across the visits, indicating that repeat measurements may be taken after a minimum of 5 min, or as soon as the vessel has returned to its baseline diameter, which in some subjects may be less than 5 min.
Abstract:
Sea ice contains flaws including frictional contacts. We aim to describe quantitatively the mechanics of those contacts, providing local physics for geophysical models. With a focus on the internal friction of ice, we review standard micro-mechanical models of friction. The solid's deformation under normal load may be ductile or elastic. The shear failure of the contact may be by ductile flow, brittle fracture, or melting and hydrodynamic lubrication. Combinations of these give a total of six rheological models. When the material under study is ice, several of the rheological parameters in the standard models are not constant, but depend on the temperature of the bulk, on the normal stress under which samples are pressed together, or on the sliding velocity and acceleration. This has the effect of making the shear stress required for sliding dependent on sliding velocity, acceleration, and temperature. In some cases, it also perturbs the exponent in the normal-stress dependence of that shear stress away from the value that applies to most materials. We unify the models by a principle of maximum displacement for normal deformation, and of minimum stress for shear failure, reducing the controversy over the mechanism of internal friction in ice to the choice of values of four parameters in a single model. The four parameters represent, for a typical asperity contact, the sliding distance required to expel melt-water, the sliding distance required to break contact, the normal strain in the asperity, and the thickness of any ductile shear zone.
Abstract:
Observations of the Sun’s corona during the space era have led to a picture of relatively constant, but cyclically varying, solar output and structure. Longer-term, more indirect measurements, such as from ¹⁰Be, coupled with other, albeit less reliable, contemporaneous reports, however, suggest periods of significant departure from this standard. The Maunder Minimum was one such epoch, where: (1) sunspots effectively disappeared for long intervals during a 70 yr period; (2) eclipse observations suggested the distinct lack of a visible K-corona but the possible appearance of the F-corona; (3) reports of aurorae were notably reduced; and (4) cosmic ray intensities at Earth were inferred to be substantially higher. Using a global thermodynamic MHD model, we have constructed a range of possible coronal configurations for the Maunder Minimum period and compared their predictions with these limited observational constraints. We conclude that the most likely state of the corona during at least the later portion of the Maunder Minimum was not merely that of the 2008/2009 solar minimum, as has been suggested recently, but rather a state devoid of any large-scale structure, driven by a photospheric field composed of only ephemeral regions, and likely substantially reduced in strength. Moreover, we suggest that the Sun evolved from a 2008/2009-like configuration at the start of the Maunder Minimum toward an ephemeral-only configuration by the end of it, supporting a prediction that we may be on the cusp of a new grand solar minimum.