953 results for deflection-compensated roll


Relevance:

10.00%

Publisher:

Abstract:

The conquest of Normandy by Philip Augustus of France effectively ended the ‘Anglo-Norman’ realm created in 1066, forcing cross-Channel landholders to choose between their English and their Norman estates. The best source for the resulting tenurial upheaval in England is the Rotulus de valore terrarum Normannorum, a list of seized properties and their former holders, and this article seeks to expand our understanding of the impact of the loss of Normandy through a detailed analysis of this document. First, it demonstrates that the compilation of the roll can be divided into two distinct stages: the first containing valuations taken before royal justices in June 1204 and enrolled before the end of July, and the second consisting of returns to orders for the valuation of particular properties issued during the summer and autumn, as part of the process by which these estates were committed to new holders. Second, study of the roll and other documentary sources permits a better understanding of the order for the seizure of the lands of those who had remained in Normandy, the text of which does not survive. This establishes that this royal order was issued in late May 1204 and, further, that it enjoined the temporary seizure rather than the permanent confiscation of these lands. Moreover, the seizure was not retrospective and covered a specific window of time in 1204. On the one hand, this means that the roll is far from a comprehensive record of terre Normannorum. On the other hand, it is possible to correlate the identities of those Anglo-Norman landholders whose English estates were seized with the military progress of the French king through the duchy in May and June, and thus to shed new light on the campaign of 1204. Third, the article considers the initial management of the seized estates and highlights the fact that, when making arrangements for these lands, John was primarily concerned to maintain his freedom of manoeuvre, since he was not prepared to accept that Normandy had been lost for good.

Relevance:

10.00%

Publisher:

Abstract:

A new autonomous ship collision-free (ASCF) trajectory navigation and control system has been introduced, built on a new recursive navigation algorithm based on analytic geometry and convex set theory for collision-free guidance. The underlying assumption is that geometric information about the ship's environment is available in the form of a polygon-shaped free space, which may be easily generated from a 2D image or from plots of physical hazards and other constraints such as collision-avoidance regulations. The navigation command is given as a heading command sequence, each command derived by generating a waypoint that falls within a small neighbourhood of the current position; convex set theory guarantees that the sequence of waypoints along the trajectory lies within a bounded, obstacle-free region. A neurofuzzy network predictor, which in practice uses only input/output data observed by on-board or external sensors (or a sensor-fusion algorithm) and controls the ship's heading angle through the rudder deflection angle, is utilised in the simulation of an ESSO 190,000 dwt tanker model to demonstrate the effectiveness of the system.
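
As a rough illustration of the waypoint idea in this abstract (a sketch, not the paper's algorithm), the following Python fragment tests candidate waypoints in a small neighbourhood of the current position against a convex polygonal free space and returns a heading command. The function names, step size and heading sweep are assumptions made for illustration.

```python
import math

def inside_convex(poly, p):
    """True if point p lies inside a convex polygon given as CCW vertices."""
    n = len(poly)
    for i in range(n):
        ax, ay = poly[i]
        bx, by = poly[(i + 1) % n]
        # For a CCW polygon, p is inside iff every edge cross product is >= 0.
        if (bx - ax) * (p[1] - ay) - (by - ay) * (p[0] - ax) < 0:
            return False
    return True

def next_waypoint(poly, pos, goal, step=0.5):
    """Pick a waypoint near pos that stays inside the convex free region,
    preferring the bearing of the goal (illustrative heading sweep)."""
    base = math.atan2(goal[1] - pos[1], goal[0] - pos[0])
    for d in (0, 15, -15, 30, -30, 45, -45, 60, -60, 90, -90):
        h = base + math.radians(d)
        wp = (pos[0] + step * math.cos(h), pos[1] + step * math.sin(h))
        if inside_convex(poly, wp):
            return wp, h  # waypoint and heading command (radians)
    return pos, base      # no admissible waypoint: hold position

# Usage: square free space, ship at the centre, goal near a corner.
free_space = [(0, 0), (10, 0), (10, 10), (0, 10)]
wp, heading = next_waypoint(free_space, (5, 5), (9, 9))
```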

Relevance:

10.00%

Publisher:

Abstract:

In this paper a new nonlinear digital baseband predistorter design is introduced based on direct learning, together with a new Wiener system modeling approach for high-power amplifiers (HPA) based on the B-spline neural network. The contribution is twofold. Firstly, by assuming that the nonlinearity in the HPA depends mainly on the input signal amplitude, the complex-valued nonlinear static function is represented by two real-valued B-spline neural networks, one for the amplitude distortion and another for the phase shift. The Gauss-Newton algorithm is applied for the parameter estimation, in which the De Boor recursion is employed to calculate both the B-spline curve and its first-order derivatives. Secondly, we derive the predistorter algorithm, calculating the inverse of the complex-valued nonlinear static function according to the B-spline neural network based Wiener models. The inverses of the amplitude and phase-shift distortions are then computed and compensated for using the identified models. Numerical examples are employed to demonstrate the efficacy of the proposed approaches.
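
The De Boor recursion mentioned above is the standard algorithm for B-spline curve evaluation; a minimal Python sketch is given below. The knot vector and coefficients are placeholders standing in for an amplitude-gain curve, not the identified HPA model.

```python
import numpy as np

def de_boor(x, t, c, p):
    """Evaluate a degree-p B-spline with knot vector t and coefficients c at x."""
    k = np.searchsorted(t, x, side='right') - 1   # knot span: t[k] <= x < t[k+1]
    d = [c[j + k - p] for j in range(p + 1)]
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            alpha = (x - t[j + k - p]) / (t[j + 1 + k - r] - t[j + k - p])
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[p]

# Placeholder cubic spline on [0, 1] standing in for an amplitude-gain curve.
p = 3
t = np.array([0, 0, 0, 0, 0.5, 1, 1, 1, 1], dtype=float)
c = np.array([0.0, 0.3, 0.8, 0.95, 1.0])   # hypothetical weights
gain = de_boor(0.42, t, c, p)
```

The first-order derivative needed by the Gauss-Newton update can be obtained from the same recursion, since the derivative of a degree-p B-spline is itself a degree p-1 B-spline with coefficients formed from scaled differences of c.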

Relevance:

10.00%

Publisher:

Abstract:

Research into the topic of liquidity has greatly benefited from the availability of data. When bid-ask spread data were inaccessible to researchers, Roll (1984) provided a conceptual model that estimates the effective bid-ask spread from regular time-series data recorded at daily or longer intervals. Later, data availability improved and researchers were able to address questions about the factors that influence spreads and the relationship between spreads and risk, return and liquidity. More recently, transaction data have been used to measure the effective spread directly, and researchers have been able to refine the concept of liquidity to include the impact of transactions on price movements on a trade-by-trade basis (Clayton and McKinnon, 2000). This paper uses techniques that combine elements from all three approaches and, by studying US data over a relatively long time period, aims to throw light on earlier research as well as to reveal changes in liquidity over the period, controlling for extraneous factors such as market conditions and the age and size of the REIT. It also reports some comparable results for the UK market over the same period.
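
Roll's estimator is compact enough to state here: bid-ask bounce makes successive price changes negatively autocorrelated, and the effective spread is s = 2 * sqrt(-cov(dp_t, dp_{t-1})). A minimal Python sketch follows, run on synthetic placeholder data rather than the paper's REIT series.

```python
import numpy as np

def roll_spread(prices):
    """Roll (1984) effective spread: 2 * sqrt(-cov(dp_t, dp_{t-1})),
    defined only when the serial covariance of price changes is negative."""
    dp = np.diff(np.asarray(prices, dtype=float))
    cov = np.cov(dp[1:], dp[:-1])[0, 1]
    return 2.0 * np.sqrt(-cov) if cov < 0 else np.nan

# Synthetic check: random-walk midquote plus bid-ask bounce of half-spread 0.05.
rng = np.random.default_rng(0)
mid = 100 + np.cumsum(rng.normal(0, 0.02, 5000))
trades = mid + 0.05 * rng.choice([-1.0, 1.0], size=5000)
print(roll_spread(trades))   # should land near 0.10 (= 2 * 0.05)
```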

Relevance:

10.00%

Publisher:

Abstract:

Following the attack on the World Trade Center on 9/11, the volatility of daily returns in the US stock market rose sharply. This increase in volatility may reflect fundamental changes in the economic determinants of prices, such as expected earnings, interest rates, real growth and inflation. Alternatively, it may simply reflect the effects of increased uncertainty in the financial markets. This study therefore sets out to determine whether the attack had a fundamental or a purely financial impact on US real estate returns. To do this, we compare pre- and post-9/11 returns for a number of US REIT indexes using an approach suggested by French and Roll (1986), as extended by Tuluca et al. (2003). In general we find no evidence that 9/11 had a fundamental effect on REIT returns. In other words, the attack had only a financial effect on REIT returns, and that effect was transitory.
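
As a simplified stand-in for the variance comparison underlying this design (not the full French and Roll (1986) trading/non-trading decomposition or the Tuluca et al. extension), the sketch below applies a two-sample F-test of equal return variance before and after an event date; the function and its interface are assumptions.

```python
import numpy as np
from scipy import stats

def variance_shift_test(pre_returns, post_returns):
    """Two-sided F-test of equal variance in pre- and post-event returns."""
    pre = np.asarray(pre_returns, dtype=float)
    post = np.asarray(post_returns, dtype=float)
    f = np.var(post, ddof=1) / np.var(pre, ddof=1)   # variance ratio
    dfn, dfd = post.size - 1, pre.size - 1
    p_value = 2 * min(stats.f.cdf(f, dfn, dfd), stats.f.sf(f, dfn, dfd))
    return f, p_value
```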

Relevance:

10.00%

Publisher:

Abstract:

High-quality wind measurements in cities are needed for numerous applications, including wind engineering. Such datasets are rare, and measurement platforms may not be optimal for meteorological observations. Two years of wind data were collected on the BT Tower, London, UK, showing an upward deflection on average for all wind directions. Wind tunnel simulations were performed to investigate flow distortion around two scale models of the Tower. Using a 1:160 scale model, it was shown that the Tower causes a small deflection (ca. 0.5°) compared with the lattice on top of it, on which the instruments were placed (ca. 0–4°). These deflections may have been underestimated owing to wind tunnel blockage. Using a 1:40 model, the observed flow pattern was consistent with streamwise vortex pairs shed from the upstream lattice edge. Correction factors were derived for different wind directions and reduced the deflection in the full-scale dataset by less than 3°. Instrumental tilt caused a sinusoidal variation in deflection of ca. 2°. The residual deflection (ca. 3°) was attributed to the Tower itself. The correction to the wind speeds was small (1% on average); it was therefore deduced that flow distortion does not significantly affect the measured wind speeds and that the wind climate statistics are reliable.
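
A minimal sketch of how such corrections might be applied to the observations, assuming per-sector deflection offsets from the wind-tunnel tests plus a sinusoidal instrument-tilt term; all numbers and names below are illustrative, not the paper's derived factors.

```python
import numpy as np

def correct_elevation(wind_dir_deg, elev_deg, sector_offsets,
                      tilt_amp=2.0, tilt_phase_deg=0.0):
    """Subtract a direction-dependent flow-distortion offset (one value per
    10-degree sector) and a sinusoidal tilt component from measured flow
    elevation angles. Offsets and tilt parameters are placeholders."""
    wd = np.asarray(wind_dir_deg, dtype=float) % 360.0
    sector = (wd // 10).astype(int)                  # 36 sectors of 10 degrees
    distortion = np.asarray(sector_offsets)[sector]  # wind-tunnel correction
    tilt = tilt_amp * np.sin(np.radians(wd - tilt_phase_deg))
    return np.asarray(elev_deg, dtype=float) - distortion - tilt

# Usage with hypothetical 0-4 degree lattice offsets by sector.
offsets = 2.0 + 2.0 * np.sin(np.radians(np.arange(36) * 10.0))
corrected = correct_elevation([45, 180, 300], [3.1, 2.5, 4.0], offsets)
```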

Relevance:

10.00%

Publisher:

Abstract:

Background: Medication errors in general practice are an important source of potentially preventable morbidity and mortality. Building on previous descriptive, qualitative and pilot work, we sought to investigate the effectiveness, cost-effectiveness and likely generalisability of a complex pharmacist-led IT-based intervention aiming to improve prescribing safety in general practice.
Objectives: We sought to:
• Test the hypothesis that a pharmacist-led IT-based complex intervention using educational outreach and practical support is more effective than simple feedback in reducing the proportion of patients at risk from errors in prescribing and medicines management in general practice.
• Conduct an economic evaluation of the cost per error avoided, from the perspective of the National Health Service (NHS).
• Analyse data recorded by pharmacists, summarising the proportions of patients judged to be at clinical risk, the actions recommended by pharmacists, and the actions completed in the practices.
• Explore the views and experiences of healthcare professionals and NHS managers concerning the intervention; investigate potential explanations for the observed effects; and inform decisions on the future roll-out of the pharmacist-led intervention.
• Examine secular trends in the outcome measures of interest, allowing for informal comparison between trial practices and practices that did not participate in the trial but contributed to the QRESEARCH database.
Methods: Two-arm cluster randomised controlled trial of 72 English general practices with embedded economic analysis and longitudinal descriptive and qualitative analysis. The trial findings were informally compared with a national descriptive study of secular trends undertaken using data from practices contributing to the QRESEARCH database. The main outcomes of interest were prescribing errors and medication monitoring errors at six and 12 months following the intervention.
Results: Participants in the pharmacist intervention arm practices were significantly less likely to have been prescribed a non-selective NSAID without a proton pump inhibitor (PPI) if they had a history of peptic ulcer (OR 0.58, 95% CI 0.38–0.89), to have been prescribed a beta-blocker if they had asthma (OR 0.73, 95% CI 0.58–0.91) or, in those aged 75 years and older, to have been prescribed an ACE inhibitor or diuretic without a measurement of urea and electrolytes in the preceding 15 months (OR 0.51, 95% CI 0.34–0.78). The economic analysis suggests that the PINCER pharmacist intervention has a 95% probability of being cost-effective if the decision-maker's willingness-to-pay ceiling reaches £75 (six months) or £85 (12 months) per error avoided. The intervention addressed an issue that was important to professionals and their teams and was delivered in a way that was acceptable to practices, with minimal disruption of normal work processes. Comparison of the trial findings with the changes seen in QRESEARCH practices indicated that any reductions achieved in the simple feedback arm were likely, in the main, to be related to secular trends rather than to the intervention.
Conclusions: Compared with simple feedback, the pharmacist-led intervention reduced the proportions of patients at risk of prescribing and monitoring errors for the primary outcome measures and the composite secondary outcome measures at six months and (with the exception of the NSAID/peptic ulcer outcome measure) at 12 months post-intervention. The intervention is acceptable to pharmacists and practices, and is likely to be seen as cost-effective by decision makers.

Relevance:

10.00%

Publisher:

Abstract:

The orthodox approach to incentivising Demand Side Participation (DSP) programs is that utility losses from capital, installation and planning costs should be recovered under financial incentive mechanisms which aim to ensure that utilities have the right incentives to implement DSP activities. The recent national smart metering roll-out in the UK implies that this approach needs to be reassessed, since utilities will recover the capital costs associated with DSP technology through bills. This paper introduces a reward and penalty mechanism focused on residential users. DSP planning costs are recovered through payments from those consumers who do not react to peak signals, while those who do react are rewarded with lower bills. Because real-time incentives for residential consumers tend to fail, owing to the negligible net gains (and losses) of individual users, in the proposed mechanism the regulator determines benchmarks which are matched against responses to signals, and caps the level of rewards and penalties to avoid market distortions. The paper presents an overview of existing financial incentive mechanisms for DSP; introduces the reward/penalty mechanism aimed at fostering DSP under the hypothesis of a smart metering roll-out; considers the costs faced by utilities for DSP programs; assesses linear rate effects and value changes; introduces compensatory weights for consumers with physical or financial impediments; and presents findings based on simulation runs at three discrete levels of elasticity.
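
A toy Python sketch of the settlement logic described here, assuming a per-kWh rate, a regulator benchmark and a symmetric cap; every name and number is illustrative rather than the paper's calibration.

```python
def settle(peak_reductions_kwh, benchmark_kwh, rate_per_kwh=0.10, cap=5.0):
    """Reward consumers whose peak-load reduction beats the regulator's
    benchmark; charge those who fall short, so that DSP planning costs are
    recovered from non-responders. Rewards/penalties are capped to limit
    market distortion. All parameter values are placeholders."""
    bill_adjustments = {}
    for user, reduction in peak_reductions_kwh.items():
        delta = (reduction - benchmark_kwh) * rate_per_kwh
        bill_adjustments[user] = max(-cap, min(cap, delta))  # capped credit/charge
    return bill_adjustments

# Usage: benchmark of 2 kWh peak reduction per event.
print(settle({'a': 3.5, 'b': 0.0, 'c': 2.0}, benchmark_kwh=2.0))
# 'a' earns a credit, 'b' pays toward cost recovery, 'c' is neutral.
```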