98 results for VAR errors


Relevance: 20.00%

Abstract:

An interface between satellite retrievals and the incremental version of the four-dimensional variational assimilation scheme is developed, making full use of the information content of satellite measurements. In this paper, expressions for the function that calculates simulated observations from model states (called “observation operator”), together with its tangent linear version and adjoint, are derived. Results from our work can be used for implementing a quasi-optimal assimilation of satellite retrievals (e.g., of atmospheric trace gases) in operational meteorological centres.
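
A standard check on such an operator is the dot-product (adjoint) test, which verifies that (H dx)·dy = dx·(Hᵀ dy) for the tangent linear H and its adjoint Hᵀ. Below is a minimal sketch for a purely linear, hypothetical operator; the matrix H is a random stand-in, not the retrieval operator derived in the paper.

    # A minimal sketch (not the paper's retrieval operator): a purely linear
    # observation operator H, its tangent linear and its adjoint, checked with
    # the standard dot-product test. Sizes and the random H are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    n_levels, n_obs = 20, 3                      # hypothetical model/retrieval sizes
    H = rng.standard_normal((n_obs, n_levels))   # stand-in for an averaging-kernel-type matrix

    def obs_operator(x):
        """Map a model state x to simulated observations."""
        return H @ x

    def tangent_linear(dx):
        """Tangent linear of the (here linear) observation operator."""
        return H @ dx

    def adjoint(dy):
        """Adjoint of the tangent linear: the transpose for a real linear map."""
        return H.T @ dy

    # Dot-product test: (H dx) . dy should equal dx . (H^T dy) to machine precision.
    dx, dy = rng.standard_normal(n_levels), rng.standard_normal(n_obs)
    print(abs(tangent_linear(dx) @ dy - dx @ adjoint(dy)))   # ~1e-15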

Relevance: 20.00%

Abstract:

The assimilation of Doppler radar radial winds for high-resolution NWP may improve short-term forecasts of convective weather. Using insects as the radar target, it is possible to provide wind observations during convective development. This study aims to explore the potential of these new observations with three case studies. Radial winds from insects detected by four operational weather radars were assimilated using 3D-Var into a 1.5 km resolution version of the Met Office Unified Model, using a southern UK domain and no convective parameterization. The effect on the analysis wind was small, with changes in direction and speed of up to 45° and 2 m s⁻¹ respectively. The forecast precipitation was perturbed in space and time but not substantially modified. Radial wind observations from insects show the potential to provide small corrections to the location and timing of showers, but not to completely relocate convergence lines. Overall, quantitative analysis indicated that the observation impact in the three case studies was small and neutral. However, the small sample size and possible ground-clutter contamination preclude an unequivocal impact estimate. The study shows the potential positive impact of insect winds; future operational systems using dual-polarization radars, which are better able to discriminate between insect and clutter returns, should provide a much greater impact on forecasts.
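
The forward model that links a model wind to a Doppler radial-wind observation is simply the projection of the wind vector onto the radar beam. The sketch below illustrates that projection under assumed angle conventions (azimuth clockwise from north, elevation from the horizontal); it is not the operational operator used in this study.

    # A minimal sketch of a radial-wind forward model: project the model wind
    # (u, v, w) onto the radar beam. The angle conventions are assumptions for
    # illustration, not the Met Office observation operator.
    import numpy as np

    def radial_velocity(u, v, w, azimuth_deg, elevation_deg):
        az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
        return (u * np.sin(az) + v * np.cos(az)) * np.cos(el) + w * np.sin(el)

    # Example: a 10 m/s westerly seen by a beam pointing due east at 1 degree elevation.
    print(radial_velocity(u=10.0, v=0.0, w=0.0, azimuth_deg=90.0, elevation_deg=1.0))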

Relevance: 20.00%

Abstract:

Little has so far been reported on the performance of near-far resistant CDMA detectors in the presence of synchronization errors. Starting with the general mathematical model of matched filters, this paper examines the effects of three classes of synchronization errors (i.e. time-delay errors, carrier phase errors, and carrier frequency errors) on the performance (bit error rate and near-far resistance) of an emerging type of near-far resistant coherent DS/SSMA detector, the linear decorrelating detector (LDD). For comparison, the corresponding results for the conventional detector are also presented. It is shown that the LDD can still maintain a considerable performance advantage over the conventional detector even when some synchronization errors exist. Finally, several computer simulations are carried out to verify the theoretical conclusions.
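
For readers unfamiliar with the two receivers being compared, the sketch below contrasts the conventional matched-filter detector with a synchronous linear decorrelating detector under ideal synchronization; the spreading codes, amplitudes and noise level are illustrative assumptions, not the paper's simulation settings.

    # A minimal sketch of the conventional detector versus the linear
    # decorrelating detector (LDD) for a synchronous CDMA uplink with perfect
    # synchronization. All parameters below are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    K, N = 4, 31                                             # users, chips per bit
    S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)    # unit-energy signature waveforms
    A = np.diag([1.0, 3.0, 3.0, 3.0])                        # near-far scenario: user 0 is weak
    b = rng.choice([-1.0, 1.0], size=K)                      # transmitted bits
    r = S @ A @ b + 0.1 * rng.standard_normal(N)             # received chip-rate signal

    y = S.T @ r                                              # matched-filter outputs
    R = S.T @ S                                              # signature cross-correlation matrix
    b_conventional = np.sign(y)                              # limited by multiple-access interference
    b_ldd = np.sign(np.linalg.solve(R, y))                   # LDD: decision on R^{-1} y
    print(b, b_conventional, b_ldd)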

Relevance: 20.00%

Abstract:

This paper investigates the effect of time-offset errors on the partial parallel interference canceller (partial PIC) and compares its performance against that of the standard PIC. The BER performance of the standard and partial interference cancellers is simulated in a near-far environment with varying time-offset errors. These simulations indicate that, whilst timing errors significantly affect the performance of both schemes, they do not diminish the gains realised by the partial PIC over the standard PIC.
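
The partial PIC differs from the standard PIC only in subtracting a fraction of the estimated multiple-access interference at each stage. A minimal sketch of one cancellation stage is given below, assuming a synchronous matched-filter front end; the partial-cancellation factor and signal model are illustrative, not the paper's simulator.

    # A minimal sketch of one stage of (partial) parallel interference
    # cancellation operating on matched-filter outputs. Setting lam = 1.0
    # recovers the standard (full) PIC stage; lam < 1.0 gives the partial PIC.
    import numpy as np

    def pic_stage(y, R, A, lam=0.7):
        """y: matched-filter outputs (K,); R: signature cross-correlation matrix
        with unit diagonal (K, K); A: diagonal matrix of received amplitudes;
        lam: partial cancellation factor."""
        b_hat = np.sign(y)                                   # tentative bit decisions
        interference = (R - np.eye(len(y))) @ A @ b_hat      # estimated multiple-access interference
        return y - lam * interference                        # partially cancelled decision statistics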

Relevance: 20.00%

Abstract:

Little has been reported on the performance of near-far resistant CDMA detectors in the presence of system parameter estimation errors (SPEEs). Starting with the general mathematical model of matched filters, the paper examines the effects of three classes of SPEEs, i.e., time-delay, carrier phase, and carrier frequency errors, on the performance (BER) of an emerging type of near-far resistant coherent DS/SSMA detector, i.e., the linear decorrelating detector. For comparison, the corresponding results for the conventional detector are also presented. It is shown that the linear decorrelating detector can still maintain a considerable performance advantage over the conventional detector even when some SPEEs exist.

Relevance: 20.00%

Abstract:

For a targeted observations case, the dependence of the size of the forecast impact on the targeted dropsonde observation error in the data assimilation is assessed. The targeted observations were made in the lee of Greenland; the dependence of the impact on the proximity of the observations to the Greenland coast is also investigated. Experiments were conducted using the Met Office Unified Model (MetUM), over a limited-area domain at 24-km grid spacing, with a four-dimensional variational data assimilation (4D-Var) scheme. Reducing the operational dropsonde observation errors by one-half increases the maximum forecast improvement from 5% to 7%–10%, measured in terms of total energy. However, the largest impact is seen by replacing two dropsondes on the Greenland coast with two farther from the steep orography; this increases the maximum forecast improvement from 5% to 18% for an 18-h forecast (using operational observation errors). Forecast degradation caused by two dropsonde observations on the Greenland coast is shown to arise from spreading of data by the background errors up the steep slope of Greenland. Removing boundary layer data from these dropsondes reduces the forecast degradation, but it is only a partial solution to this problem. Although only from one case study, these results suggest that observations positioned within a correlation length scale of steep orography may degrade the forecast through the anomalous upslope spreading of analysis increments along terrain-following model levels.
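
Forecast impact "measured in terms of total energy" is typically a norm of forecast-minus-analysis differences. The sketch below shows a simplified dry total-energy norm of that general kind; the reference temperature, the omission of surface-pressure and moisture terms, and the flat area weighting are assumptions, not the exact MetUM diagnostic.

    # A minimal sketch of a simplified dry total-energy difference norm used to
    # quantify forecast impact. du, dv, dT are forecast-minus-analysis
    # differences on the model grid; reference values are assumptions.
    import numpy as np

    CP = 1004.0   # specific heat of dry air at constant pressure, J kg^-1 K^-1
    TR = 300.0    # reference temperature, K

    def total_energy_norm(du, dv, dT):
        """0.5 * mean(u'^2 + v'^2 + (cp/Tr) * T'^2) over all grid points."""
        return 0.5 * np.mean(du**2 + dv**2 + (CP / TR) * dT**2)

    # Relative impact of an experiment versus a control forecast:
    # (E_control - E_experiment) / E_control, positive values meaning improvement.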

Relevance: 20.00%

Abstract:

Purpose – Expectations of future market conditions are acknowledged to be crucial for the development decision and hence for shaping the built environment. The purpose of this paper is to study the central London office market from 1987 to 2009 and test for evidence of rational, adaptive and naive expectations.
Design/methodology/approach – Two parallel approaches are applied to test for either rational or adaptive/naive expectations: a vector autoregressive (VAR) approach with Granger causality tests, and a recursive OLS regression with one-step forecasts.
Findings – Applying VAR models and a recursive OLS regression with one-step forecasts, the authors do not find evidence of adaptive or naive expectations among developers. Although the magnitude of the errors and the length of the time lags between market signal and construction starts vary over time and development cycles, the results confirm that developer decisions are explained, to a large extent, by contemporaneous and historic conditions in both the City and the West End; this is more likely to stem from the lengthy design, financing and planning-permission processes than from adaptive or naive expectations.
Research limitations/implications – More generally, the results of this study suggest that real estate cycles are largely generated endogenously rather than being the result of large demand shocks and/or irrational behaviour.
Practical implications – Developers may be able to generate excess profits by exploiting market inefficiencies, but this may be hindered in practice by the long periods necessary for planning and construction of the asset.
Originality/value – This paper focuses the scholarly debate on real estate cycles on the role of expectations. It is also one of very few spatially disaggregate studies of the subject matter.
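
The recursive OLS procedure re-estimates a forecasting regression on an expanding sample and examines the resulting one-step-ahead forecast errors. The sketch below is a numpy-only illustration under an assumed single-lag specification with synthetic data; it is not the authors' model of London office construction starts.

    # A minimal sketch of recursive OLS with one-step-ahead forecasts: the
    # regression y_t = a + b * y_{t-1} is re-estimated on an expanding window
    # and used to forecast the next observation. Data and lag choice are
    # illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(2)
    y = np.cumsum(rng.standard_normal(120))                       # synthetic series

    def recursive_one_step_forecasts(y, first_window=60):
        forecasts, actuals = [], []
        for t in range(first_window, len(y) - 1):
            X = np.column_stack([np.ones(t), y[:t]])              # regressors y_0 .. y_{t-1}
            coef, *_ = np.linalg.lstsq(X, y[1:t + 1], rcond=None) # fit pairs (y_{s-1}, y_s), s <= t
            forecasts.append(coef[0] + coef[1] * y[t])            # one-step forecast of y_{t+1}
            actuals.append(y[t + 1])
        return np.array(forecasts), np.array(actuals)

    f, a = recursive_one_step_forecasts(y)
    print(np.sqrt(np.mean((f - a) ** 2)))                         # out-of-sample RMSE of the forecasts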

Relevance: 20.00%

Abstract:

View-based and Cartesian representations provide rival accounts of visual navigation in humans, and here we explore possible models for the view-based case. A visual “homing” experiment was undertaken by human participants in immersive virtual reality. The distributions of end-point errors on the ground plane differed significantly in shape and extent depending on visual landmark configuration and relative goal location. A model based on simple visual cues captures important characteristics of these distributions. Augmenting visual features to include 3D elements such as stereo and motion parallax results in a set of models that describe the data accurately, demonstrating the effectiveness of a view-based approach.

Relevance: 20.00%

Abstract:

Background: Medication errors in general practice are an important source of potentially preventable morbidity and mortality. Building on previous descriptive, qualitative and pilot work, we sought to investigate the effectiveness, cost-effectiveness and likely generalisability of a complex pharmacist-led IT-based intervention aiming to improve prescribing safety in general practice.
Objectives: We sought to:
• Test the hypothesis that a pharmacist-led IT-based complex intervention using educational outreach and practical support is more effective than simple feedback in reducing the proportion of patients at risk from errors in prescribing and medicines management in general practice.
• Conduct an economic evaluation of the cost per error avoided, from the perspective of the National Health Service (NHS).
• Analyse data recorded by pharmacists, summarising the proportions of patients judged to be at clinical risk, the actions recommended by pharmacists, and the actions completed in the practices.
• Explore the views and experiences of healthcare professionals and NHS managers concerning the intervention; investigate potential explanations for the observed effects, and inform decisions on the future roll-out of the pharmacist-led intervention.
• Examine secular trends in the outcome measures of interest, allowing for informal comparison between trial practices and practices that did not participate in the trial but contributed to the QRESEARCH database.
Methods: Two-arm cluster randomised controlled trial of 72 English general practices with embedded economic analysis and longitudinal descriptive and qualitative analysis. Informal comparison of the trial findings with a national descriptive study investigating secular trends, undertaken using data from practices contributing to the QRESEARCH database. The main outcomes of interest were prescribing errors and medication monitoring errors at six and 12 months following the intervention.
Results: Participants in the pharmacist intervention arm practices were significantly less likely to have been prescribed a non-selective NSAID without a proton pump inhibitor (PPI) if they had a history of peptic ulcer (OR 0.58, 95% CI 0.38–0.89), to have been prescribed a beta-blocker if they had asthma (OR 0.73, 95% CI 0.58–0.91), or (in those aged 75 years and older) to have been prescribed an ACE inhibitor or diuretic without a measurement of urea and electrolytes in the last 15 months (OR 0.51, 95% CI 0.34–0.78). The economic analysis suggests that the PINCER pharmacist intervention has a 95% probability of being cost-effective if the decision-maker’s ceiling willingness to pay reaches £75 (6 months) or £85 (12 months) per error avoided. The intervention addressed an issue that was important to professionals and their teams and was delivered in a way that was acceptable to practices, with minimum disruption of normal work processes. Comparison of the trial findings with changes seen in QRESEARCH practices indicated that any reductions achieved in the simple feedback arm were likely, in the main, to have been related to secular trends rather than the intervention.
Conclusions: Compared with simple feedback, the pharmacist-led intervention resulted in reductions in the proportions of patients at risk of prescribing and monitoring errors for the primary outcome measures and the composite secondary outcome measures at six months and (with the exception of the NSAID/peptic ulcer outcome measure) 12 months post-intervention. The intervention is acceptable to pharmacists and practices, and is likely to be seen as cost-effective by decision makers.
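
For reference, the odds ratios and confidence intervals reported above follow from 2x2 counts of at-risk patients in the two arms. The sketch below shows the standard calculation with a Wald 95% confidence interval, using hypothetical counts rather than trial data.

    # A minimal sketch of an odds ratio with a Wald 95% confidence interval.
    # The 2x2 counts are hypothetical and are not PINCER trial data.
    import numpy as np

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """a/b: at-risk / not-at-risk in the intervention arm; c/d: control arm."""
        odds_ratio = (a * d) / (b * c)
        se_log = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower, upper = np.exp(np.log(odds_ratio) + np.array([-z, z]) * se_log)
        return odds_ratio, lower, upper

    print(odds_ratio_ci(a=30, b=970, c=50, d=950))   # roughly OR 0.59 (0.37 to 0.93)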

Relevance: 20.00%

Abstract:

Numerical weather prediction (NWP) centres use numerical models of the atmospheric flow to forecast future weather states from an estimate of the current state. Variational data assimilation (VAR) is used commonly to determine an optimal state estimate that minimizes the errors between observations of the dynamical system and model predictions of the flow. The rate of convergence of the VAR scheme and the sensitivity of the solution to errors in the data are dependent on the condition number of the Hessian of the variational least-squares objective function. The traditional formulation of VAR is ill-conditioned and hence leads to slow convergence and an inaccurate solution. In practice, operational NWP centres precondition the system via a control variable transform to reduce the condition number of the Hessian. In this paper we investigate the conditioning of VAR for a single, periodic, spatially-distributed state variable. We present theoretical bounds on the condition number of the original and preconditioned Hessians and hence demonstrate the improvement produced by the preconditioning. We also investigate theoretically the effect of observation position and error variance on the preconditioned system and show that the problem becomes more ill-conditioned with increasingly dense and accurate observations. Finally, we confirm the theoretical results in an operational setting by giving experimental results from the Met Office variational system.
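
The effect of the control variable transform on conditioning can be illustrated directly by comparing the condition number of the Hessian B^{-1} + H^T R^{-1} H with that of its preconditioned form I + U^T H^T R^{-1} H U, where B = U U^T and U is the transform. The periodic Gaussian background-error covariance, grid size and evenly spaced direct observations below are illustrative assumptions, not the Met Office configuration.

    # A minimal sketch comparing the conditioning of the original and
    # preconditioned variational Hessians for a single periodic state variable.
    # All sizes, length scales and error variances are illustrative assumptions.
    import numpy as np

    n, length = 50, 2.0
    x = np.arange(n)
    d = np.minimum(np.abs(x[:, None] - x[None, :]), n - np.abs(x[:, None] - x[None, :]))
    B = np.exp(-0.5 * (d / length) ** 2)                  # periodic Gaussian background covariance

    p = 10                                                # number of direct observations
    H = np.zeros((p, n))
    H[np.arange(p), np.arange(0, n, n // p)] = 1.0        # evenly spaced point observations
    R = 0.01 * np.eye(p)                                  # accurate observations

    hessian = np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H
    U = np.linalg.cholesky(B)                             # control variable transform, B = U U^T
    precond = np.eye(n) + U.T @ H.T @ np.linalg.inv(R) @ H @ U
    print(np.linalg.cond(hessian), np.linalg.cond(precond))   # the second is far smaller

In this sketch, shrinking the entries of R or adding further observation rows to H raises both condition numbers, consistent with the sensitivity to observation density and accuracy described above.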

Relevance: 20.00%

Abstract:

This paper considers the effect of GARCH errors on the tests proposed by Perron (1997) for a unit root in the presence of a structural break. We assess the impact of degeneracy and integratedness of the conditional variance individually and find that, apart from in the limit, the testing procedure is insensitive to the degree of degeneracy but does exhibit an increasing over-sizing as the process becomes more integrated. When we consider the GARCH specifications that we are likely to encounter in empirical research, we find that the Perron tests are reasonably robust to the presence of GARCH and do not suffer from severe over- or under-rejection of a correct null hypothesis.
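
The "integratedness" referred to above is governed by how close α + β is to one in a GARCH(1,1) specification for the errors. The sketch below generates the kind of data used in such size experiments, a true unit-root series driven by GARCH(1,1) errors; the parameter values are illustrative only and the Perron (1997) test itself is not implemented here.

    # A minimal sketch of a data-generating process for studying test size under
    # GARCH errors: a random walk (true unit root) whose innovations follow a
    # GARCH(1,1) process. alpha + beta close to one gives a near-integrated
    # conditional variance. Parameter values are illustrative assumptions.
    import numpy as np

    def simulate_unit_root_garch(T, omega=0.05, alpha=0.3, beta=0.65, seed=0):
        rng = np.random.default_rng(seed)
        y = np.zeros(T)
        h = omega / max(1.0 - alpha - beta, 1e-6)      # start at the unconditional variance
        eps = 0.0
        for t in range(1, T):
            h = omega + alpha * eps**2 + beta * h      # GARCH(1,1) conditional variance
            eps = np.sqrt(h) * rng.standard_normal()   # heteroskedastic innovation
            y[t] = y[t - 1] + eps                      # unit-root process with GARCH errors
        return y

    series = simulate_unit_root_garch(500)
    # Applying the Perron (1997) test to many such series and recording the
    # rejection frequency would give its empirical size under GARCH errors.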

Relevance: 20.00%

Abstract:

A developing polar low is targeted with dropsonde observations to improve the forecast of its landfall. Accurately forecasting a polar low's strength and location remains a challenge: polar lows form over the ocean in poorly observed regions, so initial-condition errors may contribute significantly to forecast error. The targeted polar low formed in the Norwegian Sea on 3 March 2008, during the Norwegian IPY-THORPEX field campaign. Two flights, six hours apart, released dense networks of dropsondes into a sensitive region covering the polar low and the Arctic front to its west. The impact of the targeted observations is assessed using the limited-area Met Office Unified Model and three-dimensional variational (3D-Var) data assimilation scheme. Forecasts were verified using ECMWF analysis data, which show good agreement with both dropsonde data from a flight through the mature polar low and 10 m QuikSCAT winds. The impact of the targeted data moved southwards with the polar low as it developed and then hit the Norwegian coast after 24 hours. The results show that the forecast of the polar low is sensitive to the initial conditions; targeted observations from the first flight did not improve the forecast, but those from the second flight clearly improved the forecast polar low position and intensity. However, caution should be applied in attributing the forecast improvement to the assimilation of the targeted observations from a single case study, especially as the forecast improvement is moderate relative to the spread of an operational ensemble forecast.

Relevance: 20.00%

Abstract:

Background: Medication errors are common in primary care and are associated with considerable risk of patient harm. We tested whether a pharmacist-led, information technology-based intervention was more effective than simple feedback in reducing the number of patients at risk of measures related to hazardous prescribing and inadequate blood-test monitoring of medicines 6 months after the intervention.
Methods: In this pragmatic, cluster randomised trial, general practices in the UK were stratified by research site and list size, and randomly assigned by a web-based randomisation service in block sizes of two or four to one of two groups. The practices were allocated to either computer-generated simple feedback for at-risk patients (control) or a pharmacist-led information technology intervention (PINCER), composed of feedback, educational outreach, and dedicated support. The allocation was masked to general practices, patients, pharmacists, researchers, and statisticians. Primary outcomes were the proportions of patients at 6 months after the intervention who had had any of three clinically important errors: non-selective non-steroidal anti-inflammatory drugs (NSAIDs) prescribed to those with a history of peptic ulcer without co-prescription of a proton-pump inhibitor; β blockers prescribed to those with a history of asthma; long-term prescription of angiotensin converting enzyme (ACE) inhibitors or loop diuretics to those 75 years or older without assessment of urea and electrolytes in the preceding 15 months. The cost per error avoided was estimated by incremental cost-effectiveness analysis. This study is registered with Controlled-Trials.com, number ISRCTN21785299.
Findings: 72 general practices with a combined list size of 480 942 patients were randomised. At 6 months' follow-up, patients in the PINCER group were significantly less likely to have been prescribed a non-selective NSAID if they had a history of peptic ulcer without gastroprotection (OR 0.58, 95% CI 0.38–0.89); a β blocker if they had asthma (0.73, 0.58–0.91); or an ACE inhibitor or loop diuretic without appropriate monitoring (0.51, 0.34–0.78). PINCER has a 95% probability of being cost-effective if the decision-maker's ceiling willingness to pay reaches £75 per error avoided at 6 months.
Interpretation: The PINCER intervention is an effective method for reducing a range of medication errors in general practices with computerised clinical records.
Funding: Patient Safety Research Portfolio, Department of Health, England.

Relevance: 20.00%

Abstract:

Aim: To determine the prevalence and nature of prescribing errors in general practice, to explore the causes, and to identify defences against error.
Methods: 1) Systematic reviews; 2) retrospective review of unique medication items prescribed over a 12-month period to a 2% sample of patients from 15 general practices in England; 3) interviews with 34 prescribers regarding 70 potential errors, 15 root cause analyses, and six focus groups involving 46 primary health care team members.
Results: The study involved examination of 6,048 unique prescription items for 1,777 patients. Prescribing or monitoring errors were detected for one in eight patients, involving around one in 20 of all prescription items. The vast majority of the errors were of mild to moderate severity, with one in 550 items being associated with a severe error. The following factors were associated with an increased risk of prescribing or monitoring errors: male gender, age less than 15 years or greater than 64 years, number of unique medication items prescribed, and being prescribed preparations in the following therapeutic areas: cardiovascular, infections, malignant disease and immunosuppression, musculoskeletal, eye, ENT and skin. Prescribing or monitoring errors were not associated with the grade of GP or whether prescriptions were issued as acute or repeat items. A wide range of underlying causes of error were identified relating to the prescriber, the patient, the team, the working environment, the task, the computer system and the primary/secondary care interface. Many defences against error were also identified, including strategies employed by individual prescribers and primary care teams, and making best use of health information technology.
Conclusion: Prescribing errors in general practices are common, although severe errors are unusual. Many factors increase the risk of error. Strategies for reducing the prevalence of error should focus on GP training, continuing professional development for GPs, clinical governance, effective use of clinical computer systems, and improving safety systems within general practices and at the interface with secondary care.