915 results for Roll damping


Relevance: 10.00%

Abstract:

Neuromuscular disorders affect millions of people worldwide. Upper limb tremor is a common symptom and, owing to its complex aetiology, it is difficult to compensate for except, in particular cases, by surgical intervention or drug therapy. Wearable devices that mechanically compensate for limb tremor could benefit a considerable number of patients, but the technology to assist sufferers in this way is under-developed. In this paper we propose an innovative orthosis that can dynamically suppress pathological tremor by applying viscous damping to the affected limb in a controlled manner. The orthosis utilises a new actuator design, based on magneto-rheological fluids, that efficiently delivers damping action in response to the instantaneous tremor frequency and amplitude.
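
The abstract does not give the control law, but the idea of modulating viscous damping with the estimated tremor amplitude can be sketched as follows. The mapping, thresholds and signal names are illustrative assumptions, not the orthosis design described in the paper.

```python
import numpy as np

def viscous_damping_torque(angular_velocity, c):
    """Viscous damping: torque opposes angular velocity, scaled by coefficient c (N*m*s/rad)."""
    return -c * angular_velocity

def update_damping_coefficient(tremor_amplitude, c_min=0.01, c_max=0.5, a_ref=0.1):
    """Map an estimated tremor amplitude (rad) onto a bounded damping coefficient:
    larger tremor -> stronger damping, saturating at c_max. Values are hypothetical."""
    return c_min + (c_max - c_min) * min(tremor_amplitude / a_ref, 1.0)

# Example: joint angular velocity with a 4 Hz, 0.05 rad tremor on top of slow voluntary motion
t = np.linspace(0.0, 2.0, 2000)
omega = 0.2 * np.cos(0.5 * 2 * np.pi * t) + 0.05 * 2 * np.pi * 4 * np.cos(2 * np.pi * 4 * t)
amp_estimate = 0.05                       # assumed output of a band-pass amplitude estimator
c = update_damping_coefficient(amp_estimate)
torque = viscous_damping_torque(omega, c)
print(f"c = {c:.3f} N*m*s/rad, peak damping torque = {np.abs(torque).max():.3f} N*m")
```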

Relevance: 10.00%

Abstract:

The conquest of Normandy by Philip Augustus of France effectively ended the ‘Anglo-Norman’ realm created in 1066, forcing cross-Channel landholders to choose between their English and their Norman estates. The best source for the resulting tenurial upheaval in England is the Rotulus de valore terrarum Normannorum, a list of seized properties and their former holders, and this article seeks to expand our understanding of the impact of the loss of Normandy through a detailed analysis of this document. First, it demonstrates that the compilation of the roll can be divided into two distinct stages, the first containing valuations taken before royal justices in June 1204 and enrolled before the end of July, and the second consisting of returns to orders for the valuation of particular properties issued during the summer and autumn, as part of the process by which these estates were committed to new holders. Second, study of the roll and other documentary sources permits a better understanding of the order for the seizure of the lands of those who had remained in Normandy, the text of which does not survive. This establishes that the royal order was issued in late May 1204 and, further, that it enjoined the temporary seizure rather than the permanent confiscation of these lands. Moreover, the seizure was not retrospective and covered a specific window of time in 1204. On the one hand, this means that the roll is far from a comprehensive record of terre Normannorum. On the other hand, it is possible to correlate the identities of those Anglo-Norman landholders whose English estates were seized with the military progress of the French king through the duchy in May and June, and thus to shed new light on the campaign of 1204. Third, the article considers the initial management of the seized estates and highlights the fact that, when making arrangements for these lands, John was primarily concerned to maintain his freedom of manoeuvre, since he was not prepared to accept that Normandy had been lost for good.

Relevance: 10.00%

Abstract:

A person with a moderate or severe motor disability will often use specialised or adapted tools to assist their interaction with a general environment. Such tools can assist with the movement of a person's arms so as to facilitate manipulation, can provide postural supports, or can interface with computers, wheelchairs or similar assistive technologies. Designing such devices with programmable stiffness and damping may offer a better means for the person to have effective control of their surroundings. This paper addresses the possibility of designing some assistive technologies using impedance elements that can adapt to the user and the circumstances. Two impedance elements are proposed. The first, based on magnetic particle brakes, allows control of the damping coefficient in a passive element. The second, based on detuning the PD controller in a servo-motor mechanism, allows control of both stiffness and damping. Such a mechanical impedance can be modulated to suit the conditions imposed by the task in hand. The limits of linear theory are explored and possible uses of programmable impedance elements are proposed.
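
As a rough illustration of the second element, a position-controlled servo behaves approximately as a programmable spring-damper whose stiffness and damping track the proportional and derivative gains; "detuning" the controller softens the virtual spring or reduces the felt damping. The function and numbers below are hypothetical, not the paper's hardware or gain values.

```python
def pd_impedance_force(x, v, x_d, kp, kd, v_d=0.0):
    """Under a PD position controller, the end effector approximately renders a
    spring-damper: stiffness ~ kp, damping ~ kd (valid only in the linear regime)."""
    return kp * (x_d - x) + kd * (v_d - v)

# Lowering kp softens the virtual spring; lowering kd reduces the virtual damping.
soft  = pd_impedance_force(x=0.02, v=0.1, x_d=0.0, kp=50.0,  kd=2.0)
stiff = pd_impedance_force(x=0.02, v=0.1, x_d=0.0, kp=500.0, kd=20.0)
print(f"restoring force, detuned: {soft:.2f} N, stiff: {stiff:.2f} N")
```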

Relevance: 10.00%

Abstract:

As healthcare costs rise and an aging population places increased demand on services, new techniques must be introduced to promote an individual's independence and provide these services. Robots can now be designed so that they can alter their dynamic properties, changing from stiff to flaccid, or from offering no resistance to movement to damping any large and sudden movements. This has strong implications for health care, in particular for rehabilitation, where a robot must work in conjunction with an individual and might guide or assist a person's arm movements, or might be commanded to perform some set of autonomous actions. This paper presents the state of the art in rehabilitation robots, with examples from prosthetics, aids for daily living and physiotherapy. In all these situations there is the potential for the interaction to be non-passive, with a resulting potential for the human/machine/environment combination to become unstable. To understand this instability we must develop better models of the human motor system and fit these models with realistic parameters. This paper concludes with a discussion of this problem and overviews some human models that can be used to facilitate the design of human/machine interfaces.
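
Human motor models of the kind referred to here are often reduced to a second-order mass-spring-damper limb segment. The sketch below simulates such a model with assumed parameters purely for illustration; it is not one of the models surveyed in the paper.

```python
import numpy as np

def simulate_limb(M=1.5, B=5.0, K=30.0, F=lambda t: 1.0, dt=0.001, T=2.0):
    """Minimal second-order limb model M*x'' + B*x' + K*x = F(t),
    integrated with semi-implicit Euler. Parameter values are illustrative only."""
    n = int(T / dt)
    x, v = 0.0, 0.0
    traj = np.empty(n)
    for i in range(n):
        a = (F(i * dt) - B * v - K * x) / M
        v += a * dt
        x += v * dt
        traj[i] = x
    return traj

traj = simulate_limb()
print(f"steady-state displacement ~ {traj[-1]:.3f} m (expected F/K = {1.0/30.0:.3f} m)")
```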

Relevance: 10.00%

Abstract:

This paper discusses a new method of impedance control that has been successfully implemented on the master robot of a teleoperation system. The method involves calibrating the robot to quantify the effect of adjustable controller parameters on the impedances along its different axes. The empirical equations relating end-effector impedance to the controller's feedback gains are obtained by performing system identification tests along individual axes of the robot. With these equations, online control of end-effector stiffness and damping is possible without having to monitor joint torques or run complex algorithms. Hard contact conditions and compliant interfaces have been effectively demonstrated on a telemanipulation test-bed using appropriate combinations of stiffness and damping settings obtained by this method.
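
A minimal sketch of the calibration idea, assuming the empirical gain-to-stiffness relation on one axis is roughly linear: fit the relation from identification data, then invert it to command a desired end-effector stiffness online. The data, fit and helper function are invented for illustration and are not the paper's identified equations.

```python
import numpy as np

# Hypothetical calibration data for one axis: proportional feedback gain vs.
# identified end-effector stiffness (N/m), e.g. from sinusoidal displacement tests.
gains     = np.array([10.0, 20.0, 40.0, 80.0])
stiffness = np.array([180.0, 390.0, 810.0, 1650.0])

# Fit an empirical linear relation: stiffness ~ a*gain + b
a, b = np.polyfit(gains, stiffness, 1)

def gain_for_stiffness(k_desired):
    """Invert the fitted relation to set the gain for a desired end-effector stiffness."""
    return (k_desired - b) / a

print(f"to render 1000 N/m, set the proportional gain to ~{gain_for_stiffness(1000.0):.1f}")
```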

Relevance: 10.00%

Abstract:

In a recent study, Williams introduced a simple modification to the widely used Robert–Asselin (RA) filter for numerical integration. The main purpose of the Robert–Asselin–Williams (RAW) filter is to avoid the undesired numerical damping of the RA filter and to increase the accuracy. In the present paper, the effects of the modification are comprehensively evaluated in the Simplified Parameterizations, Primitive Equation Dynamics (SPEEDY) atmospheric general circulation model. First, the authors search for significant changes in the monthly climatology due to the introduction of the new filter. After testing both at the local level and at the field level, no significant changes are found, which is advantageous in the sense that the new scheme does not require a retuning of the parameterized model physics. Second, the authors examine whether the new filter improves the skill of short- and medium-term forecasts. January 1982 data from the NCEP–NCAR reanalysis are used to evaluate the forecast skill. Improvements are found in all the model variables (except the relative humidity, which is hardly changed). The improvements increase with lead time and are especially evident in medium-range forecasts (96–144 h). For example, in tropical surface pressure predictions, 5-day forecasts made using the RAW filter have approximately the same skill as 4-day forecasts made using the RA filter. The results of this work are encouraging for the implementation of the RAW filter in other models currently using the RA filter.
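
For reference, the RA/RAW filtering step can be sketched for a scalar ODE as it is usually presented: the filter displacement is split between time levels n and n+1 with a parameter alpha, and alpha = 1 recovers the classical RA filter. This is an illustrative leapfrog integrator with assumed parameter values, not the SPEEDY implementation.

```python
import numpy as np

def leapfrog_raw(f, x0, dt, nsteps, nu=0.2, alpha=0.53):
    """Leapfrog integration of dx/dt = f(x) with the Robert-Asselin-Williams filter.
    alpha = 1 gives the classical RA filter; alpha ~ 0.53 is the value Williams
    recommended to reduce the filter's numerical damping of the physical mode."""
    x_prev_filt = x0                        # filtered state at level n-1
    x_curr = x0 + dt * f(x0)                # simple forward-Euler first step
    out = [x0, x_curr]
    for _ in range(nsteps - 1):
        x_next = x_prev_filt + 2.0 * dt * f(x_curr)           # leapfrog step
        d = 0.5 * nu * (x_prev_filt - 2.0 * x_curr + x_next)  # filter displacement
        x_prev_filt = x_curr + alpha * d                      # filter level n
        x_next = x_next + (alpha - 1.0) * d                   # adjust level n+1 (RAW only)
        x_curr = x_next
        out.append(x_curr)
    return np.array(out)

# Test problem: dz/dt = i*z, whose exact solution preserves |z| = 1,
# so any amplitude loss is attributable to the time filter.
traj = leapfrog_raw(lambda z: 1j * z, 1.0 + 0j, dt=0.1, nsteps=1000)
print(f"|z| after 1000 steps: {abs(traj[-1]):.4f}")
```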

Relevance: 10.00%

Abstract:

Research into the topic of liquidity has greatly benefited from the availability of data. Although bid-ask spreads were inaccessible to researchers, Roll (1984) provided a conceptual model that estimated the effective bid-ask spread from regular time series data recorded at a daily or longer interval. Later, data availability improved and researchers were able to address questions regarding the factors that influenced spreads and the relationship between spreads and risk, return and liquidity. More recently, transaction data have been used to measure the effective spread, and researchers have been able to refine the concepts of liquidity to include the impact of transactions on price movements (Clayton and McKinnon, 2000) on a trade-by-trade basis. This paper aims to use techniques that combine elements from all three approaches and, by studying US data over a relatively long time period, to throw light on earlier research as well as to reveal the changes in liquidity over the period, controlling for extraneous factors such as market, age and size of REIT. It also reveals some comparable results for the UK market over the same period.
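
The Roll (1984) estimator mentioned here infers the effective spread from the negative first-order serial covariance of price changes, spread = 2*sqrt(-cov(dp_t, dp_{t-1})). A minimal sketch on synthetic prices; the bid-ask bounce model and its parameters are invented for illustration.

```python
import numpy as np

def roll_spread(prices):
    """Roll (1984) estimator of the effective spread from a price series.
    Returns NaN when the serial covariance is non-negative (estimator undefined)."""
    dp = np.diff(np.asarray(prices, dtype=float))
    cov = np.cov(dp[1:], dp[:-1])[0, 1]
    return 2.0 * np.sqrt(-cov) if cov < 0 else np.nan

# Synthetic example: a random-walk "true" price observed with bid-ask bounce
rng = np.random.default_rng(0)
true_price = 100 + np.cumsum(rng.normal(0, 0.05, 1000))
half_spread = 0.10
observed = true_price + half_spread * rng.choice([-1, 1], size=1000)  # buy/sell bounce
print(f"estimated spread: {roll_spread(observed):.3f} (true: {2 * half_spread:.3f})")
```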

Relevance: 10.00%

Abstract:

Following the attack on the World Trade Center on 9/11, the volatility of daily returns in the US stock market rose sharply. This increase in volatility may reflect fundamental changes in the economic determinants of prices, such as expected earnings, interest rates, real growth and inflation. Alternatively, the increase in volatility may simply reflect the effects of increased uncertainty in the financial markets. This study therefore sets out to determine whether the attack on the World Trade Center on 9/11 had a fundamental or a purely financial impact on US real estate returns. In order to do this we compare pre- and post-9/11 returns for a number of US REIT indexes using an approach suggested by French and Roll (1986), as extended by Tuluca et al. (2003). In general we find no evidence that 9/11 had a fundamental effect on REIT returns. In other words, we find that the attack on the World Trade Center on 9/11 had only a financial effect on REIT returns and was therefore transitory.
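
As a loose illustration of comparing pre- and post-event volatility, the sketch below runs a simple F-test of equal variances on synthetic daily returns. This is only a first-pass variance comparison, not the French and Roll (1986) / Tuluca et al. (2003) decomposition applied in the paper.

```python
import numpy as np
from scipy import stats

def variance_ratio_test(returns_pre, returns_post):
    """Two-sided F-test of equal variances between pre- and post-event return samples."""
    v_pre, v_post = np.var(returns_pre, ddof=1), np.var(returns_post, ddof=1)
    f = v_post / v_pre
    df1, df2 = len(returns_post) - 1, len(returns_pre) - 1
    p = 2 * min(stats.f.cdf(f, df1, df2), 1 - stats.f.cdf(f, df1, df2))
    return f, p

rng = np.random.default_rng(1)
pre  = rng.normal(0, 0.010, 250)   # synthetic pre-event daily returns
post = rng.normal(0, 0.016, 250)   # synthetic post-event daily returns
f_stat, p_val = variance_ratio_test(pre, post)
print(f"variance ratio = {f_stat:.2f}, p = {p_val:.4f}")
```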

Relevance: 10.00%

Abstract:

Background: Medication errors in general practice are an important source of potentially preventable morbidity and mortality. Building on previous descriptive, qualitative and pilot work, we sought to investigate the effectiveness, cost-effectiveness and likely generalisability of a complex pharmacist-led IT-based intervention aiming to improve prescribing safety in general practice.

Objectives: We sought to:
• Test the hypothesis that a pharmacist-led IT-based complex intervention using educational outreach and practical support is more effective than simple feedback in reducing the proportion of patients at risk from errors in prescribing and medicines management in general practice.
• Conduct an economic evaluation of the cost per error avoided, from the perspective of the National Health Service (NHS).
• Analyse data recorded by pharmacists, summarising the proportions of patients judged to be at clinical risk, the actions recommended by pharmacists, and the actions completed in the practices.
• Explore the views and experiences of healthcare professionals and NHS managers concerning the intervention; investigate potential explanations for the observed effects; and inform decisions on the future roll-out of the pharmacist-led intervention.
• Examine secular trends in the outcome measures of interest, allowing for informal comparison between trial practices and practices that did not participate in the trial but contributed to the QRESEARCH database.

Methods: Two-arm cluster randomised controlled trial of 72 English general practices with embedded economic analysis and longitudinal descriptive and qualitative analysis. Informal comparison of the trial findings with a national descriptive study investigating secular trends, undertaken using data from practices contributing to the QRESEARCH database. The main outcomes of interest were prescribing errors and medication monitoring errors at six and 12 months following the intervention.

Results: Participants in the pharmacist intervention arm practices were significantly less likely to have been prescribed a non-selective NSAID without a proton pump inhibitor (PPI) if they had a history of peptic ulcer (OR 0.58, 95% CI 0.38–0.89), to have been prescribed a beta-blocker if they had asthma (OR 0.73, 95% CI 0.58–0.91) or (in those aged 75 years and older) to have been prescribed an ACE inhibitor or diuretic without a measurement of urea and electrolytes in the last 15 months (OR 0.51, 95% CI 0.34–0.78). The economic analysis suggests that the PINCER pharmacist intervention has a 95% probability of being cost-effective if the decision-maker's ceiling willingness to pay reaches £75 (6 months) or £85 (12 months) per error avoided. The intervention addressed an issue that was important to professionals and their teams and was delivered in a way that was acceptable to practices, with minimum disruption of normal work processes. Comparison of the trial findings with changes seen in QRESEARCH practices indicated that any reductions achieved in the simple feedback arm were likely, in the main, to have been related to secular trends rather than to the intervention.

Conclusions: Compared with simple feedback, the pharmacist-led intervention resulted in reductions in the proportions of patients at risk of prescribing and monitoring errors for the primary outcome measures and the composite secondary outcome measures at six months and (with the exception of the NSAID/peptic ulcer outcome measure) 12 months post-intervention. The intervention is acceptable to pharmacists and practices, and is likely to be seen as cost-effective by decision makers.
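
For readers unfamiliar with the reported statistics, the sketch below shows how an odds ratio and its 95% confidence interval are computed from a 2x2 table. The counts are entirely hypothetical, and the calculation ignores the clustering adjustment that a cluster randomised trial such as PINCER requires.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI from a 2x2 table:
    a = intervention patients at risk, b = intervention patients not at risk,
    c = control patients at risk,      d = control patients not at risk."""
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = odds_ratio * math.exp(-z * se_log_or)
    upper = odds_ratio * math.exp(z * se_log_or)
    return odds_ratio, (lower, upper)

# Entirely hypothetical counts, just to show how an OR near 0.58 (0.38, 0.89) can arise
print(odds_ratio_ci(40, 960, 66, 934))
```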

Relevance: 10.00%

Abstract:

We investigate the Arctic basin circulation, freshwater content (FWC) and heat budget by using a high-resolution global coupled ice–ocean model implemented with a state-of-the-art data assimilation scheme. We demonstrate that, despite a very sparse dataset, by assimilating hydrographic data in and near the Arctic basin, the initial warm bias and drift in the control run are successfully corrected, reproducing a much more realistic vertical and horizontal structure to the cyclonic boundary current carrying the Atlantic Water (AW) along the Siberian shelves in the reanalysis run. The Beaufort Gyre structure, FWC and variability are also more accurately reproduced. Small but important changes in the strait exchange flows are found which lead to more balanced budgets in the reanalysis run. Assimilation fluxes dominate the basin budgets over the first 10 years (P1: 1987–1996) of the reanalysis for both heat and FWC, after which the drifting Arctic upper water properties have been restored to realistic values. For the later period (P2: 1997–2004), the Arctic heat budget is almost balanced without assimilation contributions, while the freshwater budget shows reduced assimilation contributions compensating largely for surface salinity damping, which was extremely strong in this run. A downward trend in freshwater export at the Canadian Straits and Fram Strait is found in period P2, associated with Beaufort Gyre recharge. A detailed comparison with observations and previous model studies at the individual Arctic straits is also included.
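
Freshwater content of a water column is conventionally computed relative to a reference salinity (often 34.8 for the Arctic) as the vertical integral of (S_ref - S)/S_ref where S < S_ref. A minimal sketch with a hypothetical salinity profile, making no claim to match the model's exact definition or reference value:

```python
import numpy as np

def freshwater_content(salinity, depths, s_ref=34.8):
    """Liquid freshwater content of a column (m): integral of (S_ref - S)/S_ref
    over depth, counting only water fresher than the reference salinity."""
    salinity = np.asarray(salinity, dtype=float)
    depths = np.asarray(depths, dtype=float)
    frac = np.clip((s_ref - salinity) / s_ref, 0.0, None)
    # trapezoidal integration over the depth grid
    return float(np.sum(0.5 * (frac[1:] + frac[:-1]) * np.diff(depths)))

# Hypothetical Beaufort Gyre-like profile: fresh surface layer over saltier Atlantic Water
depths = np.array([0, 50, 100, 200, 400, 800])
sal    = np.array([28.0, 30.5, 32.5, 34.0, 34.7, 34.9])
print(f"FWC ~ {freshwater_content(sal, depths):.1f} m")
```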

Relevance: 10.00%

Abstract:

The orthodox approach for incentivising Demand Side Participation (DSP) programs is that utility losses from capital, installation and planning costs should be recovered under financial incentive mechanisms which aim to ensure that utilities have the right incentives to implement DSP activities. The recent national smart metering roll-out in the UK implies that this approach needs to be reassessed since utilities will recover the capital costs associated with DSP technology through bills. This paper introduces a reward and penalty mechanism focusing on residential users. DSP planning costs are recovered through payments from those consumers who do not react to peak signals. Those consumers who do react are rewarded by paying lower bills. Because real-time incentives to residential consumers tend to fail due to the negligible amounts associated with net gains (and losses) of individual users, in the proposed mechanism the regulator determines benchmarks which are matched against responses to signals and caps the level of rewards/penalties to avoid market distortions. The paper presents an overview of existing financial incentive mechanisms for DSP; introduces the reward/penalty mechanism aimed at fostering DSP under the hypothesis of smart metering roll-out; considers the costs faced by utilities for DSP programs; assesses linear rate effects and value changes; introduces compensatory weights for those consumers who have physical or financial impediments; and shows findings based on simulation runs on three discrete levels of elasticity.
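
A minimal sketch of the kind of capped, benchmark-matched reward/penalty settlement described above. The rate, cap and benchmark values are invented for illustration, and the paper's actual mechanism (regulator-set benchmarks, compensatory weights, elasticity scenarios) is not modelled here.

```python
def settlement(benchmark_kwh, actual_kwh, rate=0.15, cap=5.0):
    """Hypothetical per-period settlement for one household: consumers who cut
    peak-time use below the benchmark earn a reward, those who exceed it pay a
    penalty; both are capped to limit market distortions. Units: GBP per period."""
    delta = benchmark_kwh - actual_kwh          # positive = responded to the peak signal
    payment = rate * delta                      # linear rate on the deviation
    return max(-cap, min(cap, payment))         # reward (+) or penalty (-), capped

print(settlement(benchmark_kwh=3.0, actual_kwh=2.2))   # responsive user: small reward
print(settlement(benchmark_kwh=3.0, actual_kwh=4.5))   # unresponsive user: small penalty
```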

Relevance: 10.00%

Abstract:

Accurate replication of the processes associated with the energetics of the tropical ocean is necessary if coupled GCMs are to simulate the physics of ENSO correctly, including the transfer of energy from the winds to the ocean thermocline and energy dissipation during the ENSO cycle. Here, we analyze ocean energetics in coupled GCMs in terms of two integral parameters describing net energy loss in the system using the approach recently proposed by Brown and Fedorov (J Clim 23:1563–1580, 2010a) and Fedorov (J Clim 20:1108–1117, 2007). These parameters are (1) the efficiency c of the conversion of wind power into the buoyancy power that controls the rate of change of the available potential energy (APE) in the ocean and (2) the e-folding rate a that characterizes the damping of APE by turbulent diffusion and other processes. Estimating these two parameters for coupled models reveals potential deficiencies (and large differences) in how state-of-the-art coupled GCMs reproduce the ocean energetics as compared to ocean-only models and data assimilating models. The majority of the coupled models we analyzed show a lower efficiency (values of c in the range of 10–50% versus 50–60% for ocean-only simulations or reanalysis) and a relatively strong energy damping (values of a⁻¹ in the range 0.4–1 years versus 0.9–1.2 years). These differences in the model energetics appear to reflect differences in the simulated thermal structure of the tropical ocean, the structure of ocean equatorial currents, and deficiencies in the way coupled models simulate ENSO.
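
The two parameters can be illustrated with a toy APE budget of the form dAPE/dt = B - a*APE with buoyancy power B = c*W. The estimation below uses synthetic series and simple least squares; it is only a sketch of the idea, not the Brown and Fedorov diagnostics applied in the paper.

```python
import numpy as np

def energetics_parameters(wind_power, buoyancy_power, ape, dt=1.0):
    """Rough estimates of the efficiency c of wind-to-buoyancy power conversion and
    the e-folding APE damping rate a, assuming the budget dAPE/dt = B - a*APE."""
    c = np.mean(buoyancy_power) / np.mean(wind_power)
    dissipation = buoyancy_power - np.gradient(ape, dt)   # implied damping term D
    a = np.sum(dissipation * ape) / np.sum(ape ** 2)      # least-squares fit of D = a*APE
    return c, a

# Synthetic test: generate APE from known c_true, a_true and check that both are recovered
rng = np.random.default_rng(0)
dt, n, c_true, a_true = 1.0, 600, 0.4, 1.0 / 12.0   # monthly steps; a_true ~ 1/year
W = 2.0 + 0.3 * rng.normal(size=n)                  # wind power (arbitrary units)
B = c_true * W
ape = np.empty(n)
ape[0] = B[0] / a_true
for i in range(1, n):
    ape[i] = ape[i - 1] + dt * (B[i - 1] - a_true * ape[i - 1])
print(energetics_parameters(W, B, ape, dt))         # should be close to (0.4, 0.083)
```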

Relevance: 10.00%

Abstract:

The project consists of a trilogy of films and a live performance. The Future trilogy takes the IKEA riot of 2005 as the starting point for a speculative history of a fictional future, culminating in a choreographed re-enactment of the original event. Shot on 16mm and 8mm film, the series explores the possibility of collective action emerging from the capitalist relations inherent in the consumer riot. The live performance No Haus Like Bau, staged at the HAU 1 theatre in Berlin for the 5th Berlin Biennale, continues this research into re-enactment and post-1989 politics by dramatizing the rise and fall of the Soviet Union as a neo-Constructivist mime using a stage set made of flatpack furniture. Using the aesthetics of Modernism and the avant-garde, from Constructivist and Futurist costumes to biomechanics and Bauhaus theatre theory, the project transposes early twentieth-century utopian ideology to a present-day setting where mass uprisings are motivated by cheap commodities. These explorations of consumerism and revolution have been widely exhibited internationally and supported by Film London, Arts Council England, Collective Gallery and the Berlin Biennale. The Future Trilogy formed the basis of a solo exhibition at the Te Tuhi Art Centre in Auckland, New Zealand, and was screened as part of the Signal and Noise media art festival in Vancouver, as well as in other exhibitions and screenings including “Roll it to Me” at Collective Gallery, Edinburgh, and Apocatopia, Castlefield Gallery, Manchester.

Relevance: 10.00%

Abstract:

For decades regulators in the energy sector have focused on facilitating the maximisation of energy supply in order to meet demand through liberalisation and removal of market barriers. The debate on climate change has emphasised a new type of risk in the balance between energy demand and supply: excessively high energy demand brings about significantly negative environmental and economic impacts. This is because if a vast number of users is consuming electricity at the same time, energy suppliers have to activate dirty old power plants with higher greenhouse gas emissions and higher system costs. The creation of a Europe-wide electricity market requires a systematic investigation into the risk of aggregate peak demand. This paper draws on the e-Living Time-Use Survey database to assess the risk of aggregate peak residential electricity demand for European energy markets. Findings highlight in which countries and for what activities the risk of aggregate peak demand is greater. The discussion highlights which approaches energy regulators have started considering to convince users about the risks of consuming too much energy during peak times. These include ‘nudging’ approaches such as the roll-out of smart meters, incentives for shifting the timing of energy consumption, differentiated time-of-use tariffs, regulatory financial incentives and consumption data sharing at the community level.

Relevance: 10.00%

Abstract:

Climate simulations by 16 atmospheric general circulation models (AGCMs) are compared on an aqua-planet, a water-covered Earth with prescribed sea surface temperature varying only in latitude. The idealised configuration is designed to expose differences in the circulation simulated by different models. Basic features of the aqua-planet climate are characterised by comparison with Earth. The models display a wide range of behaviour. The balanced component of the tropospheric mean flow, and mid-latitude eddy covariances subject to budget constraints, vary relatively little among the models. In contrast, differences in damping in the dynamical core strongly influence transient eddy amplitudes. Historical uncertainty in modelled lower stratospheric temperatures persists in APE. Aspects of the circulation generated more directly by interactions between the resolved fluid dynamics and parameterized moist processes vary greatly. The tropical Hadley circulation forms either a single or double inter-tropical convergence zone (ITCZ) at the equator, with large variations in mean precipitation. The equatorial wave spectrum shows a wide range of precipitation intensity and propagation characteristics. Kelvin mode-like eastward propagation with remarkably constant phase speed dominates in most models. Westward propagation, less dispersive than the equatorial Rossby modes, dominates in a few models or occurs within an eastward propagating envelope in others. The mean structure of the ITCZ is related to precipitation variability, consistent with previous studies. The aqua-planet global energy balance is unknown but the models produce a surprisingly large range of top of atmosphere global net flux, dominated by differences in shortwave reflection by clouds. A number of newly developed models, not optimised for Earth climate, contribute to this. Possible reasons for differences in the optimised models are discussed. The aqua-planet configuration is intended as one component of an experimental hierarchy used to evaluate AGCMs. This comparison does suggest that the range of model behaviour could be better understood and reduced in conjunction with Earth climate simulations. Controlled experimentation is required to explore individual model behaviour and investigate convergence of the aqua-planet climate with increasing resolution.