25 results for Point interpolation method
Abstract:
Objectives: To conduct an independent evaluation of the first phase of the Health Foundation's Safer Patients Initiative (SPI), and to identify the net additional effect of SPI and any differences in changes in participating and non-participating NHS hospitals. Design: Mixed method evaluation involving five substudies, before and after design. Setting: NHS hospitals in United Kingdom. Participants: Four hospitals (one in each country in the UK) participating in the first phase of the SPI (SPI1); 18 control hospitals. Intervention: The SPI1 was a compound (multicomponent) organisational intervention delivered over 18 months that focused on improving the reliability of specific frontline care processes in designated clinical specialties and promoting organisational and cultural change. Results: Senior staff members were knowledgeable and enthusiastic about SPI1. There was a small (0.08 points on a 5 point scale) but significant (P<0.01) effect in favour of the SPI1 hospitals in one of 11 dimensions of the staff questionnaire (organisational climate). Qualitative evidence showed only modest penetration of SPI1 at medical ward level. Although SPI1 was designed to engage staff from the bottom up, it did not usually feel like this to those working on the wards, and questions about legitimacy of some aspects of SPI1 were raised. Of the five components to identify patients at risk of deterioration - monitoring of vital signs (14 items); routine tests (three items); evidence based standards specific to certain diseases (three items); prescribing errors (multiple items from the British National Formulary); and medical history taking (11 items) - there was little net difference between control and SPI1 hospitals, except in relation to quality of monitoring of acute medical patients, which improved on average over time across all hospitals. Recording of respiratory rate increased to a greater degree in SPI1 than in control hospitals; in the second six hours after admission recording increased from 40% (93) to 69% (165) in control hospitals and from 37% (141) to 78% (296) in SPI1 hospitals (odds ratio for "difference in difference" 2.1, 99% confidence interval 1.0 to 4.3; P=0.008). Use of a formal scoring system for patients with pneumonia also increased over time (from 2% (102) to 23% (111) in control hospitals and from 2% (170) to 9% (189) in SPI1 hospitals), which favoured controls and was not significant (0.3, 0.02 to 3.4; P=0.173). There were no improvements in the proportion of prescription errors and no effects that could be attributed to SPI1 in non-targeted generic areas (such as enhanced safety culture). On some measures, the lack of effect could be because compliance was already high at baseline (such as use of steroids in over 85% of cases where indicated), but even when there was more room for improvement (such as in quality of medical history taking), there was no significant additional net effect of SPI1. There were no changes over time or between control and SPI1 hospitals in errors or rates of adverse events in patients in medical wards. Mortality increased from 11% (27) to 16% (39) among controls and decreased from 17% (63) to 13% (49) among SPI1 hospitals, but the risk adjusted difference was not significant (0.5, 0.2 to 1.4; P=0.085). Poor care was a contributing factor in four of the 178 deaths identified by review of case notes. The survey of patients showed no significant differences apart from an increase in perception of cleanliness in favour of SPI1 hospitals.
Conclusions: The introduction of SPI1 was associated with improvements in one of the types of clinical process studied (monitoring of vital signs) and one measure of staff perceptions of organisational climate. There was no additional effect of SPI1 on other targeted issues nor on other measures of generic organisational strengthening.
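For readers unfamiliar with the "difference in difference" odds ratio quoted above, the sketch below (Python) computes its unadjusted form directly from the respiratory-rate recording percentages given in the abstract. The published value of 2.1 comes from a risk-adjusted model, so this crude calculation is illustrative only and does not reproduce it.

```python
# Unadjusted "difference in difference" odds ratio from the quoted percentages.

def odds(p):
    """Convert a proportion into odds."""
    return p / (1.0 - p)

def did_odds_ratio(p_ctrl_before, p_ctrl_after, p_spi_before, p_spi_after):
    """Ratio of the before/after odds ratio in SPI1 hospitals to that in control hospitals."""
    or_ctrl = odds(p_ctrl_after) / odds(p_ctrl_before)
    or_spi = odds(p_spi_after) / odds(p_spi_before)
    return or_spi / or_ctrl

# Respiratory-rate recording in the second six hours after admission:
# controls 40% -> 69%, SPI1 hospitals 37% -> 78% (percentages from the abstract).
print(round(did_odds_ratio(0.40, 0.69, 0.37, 0.78), 2))  # about 1.8 unadjusted; the published 2.1 is risk adjusted
```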
Abstract:
We extend a meshless method of fundamental solutions, recently proposed by the authors for the one-dimensional two-phase inverse linear Stefan problem, to the nonlinear case. In this latter situation the free surface is also unknown, which is more realistic from a practical point of view. Building on the earlier work, the solution is approximated in each phase by a linear combination of fundamental solutions to the heat equation. The implementation and analysis are more complicated in the present situation, since one needs to deal with a nonlinear minimization problem to identify the free surface. Furthermore, the inverse problem is ill-posed, since small errors in the measured input data can cause large deviations in the desired solution. Therefore, regularization needs to be incorporated into the objective function that is minimized in order to obtain a stable solution. Numerical results are presented and discussed. © 2014 IMACS.
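As an illustration of the two ingredients named above (a linear combination of fundamental solutions of the heat equation, plus regularization of the resulting minimization), here is a minimal Python sketch. It covers only the regularized least-squares fit for fixed source points; the source and collocation locations and the regularization parameter lam are assumptions, and the nonlinear identification of the free surface is not shown.

```python
import numpy as np

def heat_fundamental(x, t, y, tau):
    """Fundamental solution of u_t = u_xx for a source at (y, tau); assumes t > tau,
    which holds here because the (assumed) sources sit at negative times."""
    dt = t - tau
    return np.exp(-(x - y)**2 / (4.0 * dt)) / np.sqrt(4.0 * np.pi * dt)

def mfs_coefficients(colloc_xt, data, sources_yt, lam=1e-8):
    """Tikhonov-regularised least squares for the coefficients c in
    u(x,t) ~ sum_j c_j * F(x,t; y_j, tau_j):  min ||A c - data||^2 + lam ||c||^2."""
    A = np.array([[heat_fundamental(x, t, y, tau) for (y, tau) in sources_yt]
                  for (x, t) in colloc_xt])
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ data)

def mfs_evaluate(x, t, coeffs, sources_yt):
    return sum(c * heat_fundamental(x, t, y, tau) for c, (y, tau) in zip(coeffs, sources_yt))

# Illustrative use: sources just outside the space interval [0,1] and at negative times,
# collocation points on the boundary where (noisy) data would be measured.
sources = [(-0.2, -0.1), (1.2, -0.1), (-0.2, -0.3), (1.2, -0.3)]
colloc = [(0.0, 0.2), (1.0, 0.2), (0.0, 0.5), (1.0, 0.5)]
data = np.array([1.0, 0.5, 0.8, 0.4])   # placeholder boundary temperatures
c = mfs_coefficients(colloc, data, sources)
print(mfs_evaluate(0.5, 0.4, c, sources))
```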
Abstract:
The focus of our work is the verification of tight functional properties of numerical programs, such as showing that a floating-point implementation of Riemann integration computes a close approximation of the exact integral. Programmers and engineers writing such programs will benefit from verification tools that support an expressive specification language and that are highly automated. Our work provides a new method for verification of numerical software, supporting a substantially more expressive language for specifications than other publicly available automated tools. The additional expressivity in the specification language is provided by two constructs. First, the specification can feature inclusions between interval arithmetic expressions. Second, the integral operator from classical analysis can be used in the specifications, where the integration bounds can be arbitrary expressions over real variables. To support our claim of expressivity, we outline the verification of four example programs, including the integration example mentioned earlier. A key component of our method is an algorithm for proving numerical theorems. This algorithm is based on automatic polynomial approximation of non-linear real and real-interval functions defined by expressions. The PolyPaver tool is our implementation of the algorithm and its source code is publicly available. In this paper we report on experiments using PolyPaver that indicate that the additional expressivity does not come at a performance cost when comparing with other publicly available state-of-the-art provers. We also include a scalability study that explores the limits of PolyPaver in proving tight functional specifications of progressively larger randomly generated programs. © 2014 Springer International Publishing Switzerland.
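The following Python sketch is not PolyPaver; under simplifying assumptions it only illustrates the kind of property described above: that a floating-point Riemann sum of f(x) = x*x + 1 lies inside an interval enclosure of the exact integral. The enclosure uses naive interval arithmetic on subintervals rather than polynomial approximation, and it ignores the outward rounding a real verifier must perform.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        p = [self.lo*other.lo, self.lo*other.hi, self.hi*other.lo, self.hi*other.hi]
        return Interval(min(p), max(p))
    def contains(self, x):
        return self.lo <= x <= self.hi

def f_interval(x):           # interval extension of f(x) = x*x + 1
    return x * x + Interval(1.0, 1.0)

def f_float(x):              # floating-point version of the same f
    return x * x + 1.0

def enclose_integral(a, b, n):
    """Enclose integral_a^b f(x) dx by summing width * (interval range of f) over n subintervals."""
    h = (b - a) / n
    total = Interval(0.0, 0.0)
    for i in range(n):
        sub = Interval(a + i*h, a + (i + 1)*h)
        total = total + Interval(h, h) * f_interval(sub)
    return total

def riemann_sum(a, b, n):
    """Floating-point midpoint Riemann sum, the 'program' whose result we check."""
    h = (b - a) / n
    return sum(f_float(a + (i + 0.5)*h) * h for i in range(n))

enclosure = enclose_integral(0.0, 1.0, 200)
approx = riemann_sum(0.0, 1.0, 10)
print(enclosure, approx, enclosure.contains(approx))   # the inclusion should hold
```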
Abstract:
Eight otherwise healthy diabetic volunteers took a daily antioxidant supplement consisting of vitamin E (200 IU), vitamin C (250 mg) and α-lipoic acid (90 mg) for a period of 6 weeks. Diabetic dapsone hydroxylamine-mediated methaemoglobin formation and resistance to erythrocytic thiol depletion were compared with those of age- and sex-matched non-diabetic subjects. At time zero, methaemoglobin formation in the non-diabetic subjects was greater at all four time points compared with that of the diabetic subjects. Resistance to glutathione depletion was initially greater in non-diabetic compared with diabetic samples. Half-way through the study (3 weeks), there were no differences between the two groups in methaemoglobin formation, and thiol depletion in the diabetic samples was now lower than in the non-diabetic samples at 10 and 20 min. At 6 weeks, diabetic erythrocytic thiol levels remained greater than those of non-diabetics. HbA1c values were significantly reduced in the diabetic subjects at 6 weeks compared with time zero values. At 10 weeks, 4 weeks after the end of supplementation, the diabetic HbA1c values significantly increased to the point where they were not significantly different from the time zero values. Total antioxidant status (TAS) measurement indicated that diabetic plasma antioxidant capacity was significantly improved during antioxidant supplementation. Conversion of α-lipoic acid to dihydrolipoic acid (DHLA) in vivo led to potent interference in a standard fructosamine assay kit, negating its use in this study. This report suggests that triple antioxidant therapy in diabetic volunteers attenuates the in vitro experimental oxidative stress of methaemoglobin formation and reduces haemoglobin glycation in vivo. © 2003 Elsevier Science B.V. All rights reserved.
Abstract:
Infantile Nystagmus Syndrome, or Congenital Nystagmus, is an ocular-motor disorder characterized by involuntary, conjugate, bilateral, to-and-fro ocular oscillations. Good visual acuity in congenital nystagmus can be achieved during the foveation periods, in which eye velocity slows down while the target image crosses the fovea. Visual acuity was found to be mainly dependent on the duration of the foveation periods. In this work a new approach is proposed for the estimation of foveation parameters: a cubic spline interpolation of the nystagmus recording is applied before localizing the start point of the foveation window and estimating its duration. The performance of the proposed algorithm was assessed in comparison with a previously developed algorithm, used here as the gold standard. The obtained results suggest that spline interpolation could be a useful tool for filtering eye movement recordings before applying an algorithm to estimate the foveation window parameters. © 2013 IEEE.
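A small Python sketch of the pre-processing step described above follows; it is not the authors' implementation. An eye-position recording is interpolated with a cubic spline onto a finer uniform grid, and low-velocity samples are flagged as candidate foveation samples. The sampling rate, the 4 deg/s velocity threshold and the synthetic trace are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

fs = 100.0                                    # assumed original sampling rate (Hz)
t = np.arange(0.0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
eye_pos = np.sin(2*np.pi*3*t) + 0.05*rng.standard_normal(t.size)   # synthetic nystagmus-like trace (deg)

spline = CubicSpline(t, eye_pos)
t_fine = np.arange(t[0], t[-1], 1.0 / (4*fs))   # resample on a 4x finer uniform grid
pos_fine = spline(t_fine)
vel_fine = spline(t_fine, 1)                    # analytic first derivative of the spline: eye velocity (deg/s)

foveation = np.abs(vel_fine) < 4.0              # candidate foveation samples (assumed velocity threshold)
print("fraction of samples below the velocity threshold:", foveation.mean())
```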
Abstract:
Cardiotocographic data provide physicians with information about foetal development and permit assessment of conditions such as foetal distress. An incorrect evaluation of the foetal status can, of course, be very dangerous. To improve the interpretation of cardiotocographic recordings, great interest has been devoted to spectral analysis of foetal heart rate (FHR) variability. It is worth remembering, however, that the foetal heart rate is intrinsically an unevenly sampled series, so zero-order, linear or cubic spline interpolation can be employed to produce an evenly sampled series. This is not ideal for frequency analyses, because interpolation introduces alterations in the foetal heart rate power spectrum. In particular, the interpolation process can produce alterations of the power spectral density that, for example, affect the estimation of the sympatho-vagal balance (SVB, computed as the low-frequency/high-frequency ratio), which represents an important clinical parameter. In order to estimate the frequency spectrum alterations of the foetal heart rate variability signal due to interpolation and cardiotocographic storage rates, in this work we simulated uneven foetal heart rate series with set characteristics and their evenly spaced versions (with different orders of interpolation and storage rates) and computed the sympatho-vagal balance values from the power spectral density. For power spectral density estimation, we chose the Lomb method, as suggested by other authors for studying uneven heart rate series in adults. In summary, the obtained results show that evaluating SVB values on the evenly spaced FHR series leads to its overestimation, due to both the interpolation process and the storage rate. However, cubic spline interpolation produces more robust and accurate results. © 2010 Elsevier Ltd. All rights reserved.
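As a pointer to the analysis step named above, the Python sketch below estimates the sympatho-vagal balance (LF/HF ratio) of an unevenly sampled series directly with the Lomb method, i.e. without interpolation. The band limits and the synthetic FHR-like series are assumptions; the paper's simulation protocol is not reproduced.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 300.0, 600))     # uneven sample times (s)
fhr = 140 + 3*np.sin(2*np.pi*0.1*t) + 1.5*np.sin(2*np.pi*0.3*t) + rng.normal(0.0, 0.5, t.size)  # bpm

freqs_hz = np.linspace(0.01, 1.0, 500)
pxx = lombscargle(t, fhr - fhr.mean(), 2*np.pi*freqs_hz)   # scipy expects angular frequencies

lf = (freqs_hz >= 0.04) & (freqs_hz < 0.15)   # assumed LF band (Hz)
hf = (freqs_hz >= 0.15) & (freqs_hz < 1.0)    # assumed HF band (Hz)
svb = np.trapz(pxx[lf], freqs_hz[lf]) / np.trapz(pxx[hf], freqs_hz[hf])
print("LF/HF (sympatho-vagal balance) estimate:", svb)
```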
Abstract:
Measuring and compensating the pivot point positional errors of five-axis machine tools is challenging and very time-consuming. This paper presents a newly developed approach for automatic measurement and compensation of pivot point positional errors on five-axis machine tools. Machine rotary axis errors are measured using a circular test. The method has been tested on five-axis machine tools with a swivel table configuration. Results show that up to 99% of the positional errors of the rotary axis can be compensated using this approach.
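The abstract does not detail the measurement model, but the following hedged Python sketch shows one generic way a rotary-axis pivot positional error could be extracted from circular-test data: fit a circle to the measured points by least squares and compare the fitted centre with the nominal pivot. It illustrates the idea only and is not the compensation procedure developed in the paper.

```python
import numpy as np

def fit_circle(x, y):
    """Least-squares (Kasa) circle fit: solve x^2 + y^2 = c0*x + c1*y + c2, return (xc, yc, r)."""
    A = np.column_stack([x, y, np.ones_like(x)])
    c, *_ = np.linalg.lstsq(A, x**2 + y**2, rcond=None)
    xc, yc = c[0] / 2.0, c[1] / 2.0
    return xc, yc, np.sqrt(c[2] + xc**2 + yc**2)

# Synthetic circular-test data: nominal pivot at (0, 0), true pivot offset by (dx, dy) mm.
rng = np.random.default_rng(0)
theta = np.linspace(0.0, 2*np.pi, 360, endpoint=False)
dx, dy, R = 0.012, -0.008, 100.0              # illustrative values (mm)
x = dx + R*np.cos(theta) + rng.normal(0.0, 1e-3, theta.size)
y = dy + R*np.sin(theta) + rng.normal(0.0, 1e-3, theta.size)

xc, yc, r = fit_circle(x, y)
print("estimated pivot positional error (mm):", xc, yc)   # should recover roughly (0.012, -0.008)
```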
Abstract:
Heterogeneous datasets arise naturally in most applications due to the use of a variety of sensors and measuring platforms. Such datasets can be heterogeneous in terms of the error characteristics and sensor models. Treating such data is most naturally accomplished using a Bayesian or model-based geostatistical approach; however, such methods generally scale rather badly with the size of the dataset, and require computationally expensive Monte Carlo-based inference. Recently, within the machine learning and spatial statistics communities, many papers have explored the potential of reduced rank representations of the covariance matrix, often referred to as projected or fixed rank approaches. In such methods the covariance function of the posterior process is represented by a reduced rank approximation which is chosen such that there is minimal information loss. In this paper a sequential Bayesian framework for inference in such projected processes is presented. The observations are considered one at a time, which avoids the need for the high-dimensional integrals typically required in a Bayesian approach. A C++ library, gptk, which is part of the INTAMAP web service, is introduced; it implements projected, sequential estimation and adds several novel features. In particular, the library includes the ability to use a generic observation operator, or sensor model, to permit data fusion. It is also possible to cope with a range of observation error characteristics, including non-Gaussian observation errors. Inference for the covariance parameters is explored, including the impact of the projected process approximation on likelihood profiles. We illustrate the projected sequential method in application to synthetic and real datasets. Limitations and extensions are discussed. © 2010 Elsevier Ltd.
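The gptk API is not reproduced here; instead, the Python sketch below illustrates the underlying idea of sequential inference in a projected (reduced-rank) Gaussian process: the latent function is represented by its values at a fixed set of support points, and each observation updates the Gaussian posterior over those values one at a time. Gaussian noise and an RBF covariance are assumed; generic observation operators and non-Gaussian errors are not shown.

```python
import numpy as np

def rbf(a, b, length=1.0, var=1.0):
    """RBF covariance between two 1D sets of inputs."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length)**2)

support = np.linspace(0.0, 10.0, 15)            # m support (projection) points, assumed
K_uu = rbf(support, support) + 1e-8*np.eye(support.size)
K_uu_inv = np.linalg.inv(K_uu)
mean = np.zeros(support.size)                   # prior over f at the support points
cov = K_uu.copy()

def update(mean, cov, x_obs, y_obs, noise_var=0.01):
    """One-observation Bayesian update of the projected process (Gaussian noise)."""
    h = (rbf(np.array([x_obs]), support) @ K_uu_inv).ravel()   # projection weights for x_obs
    s = h @ cov @ h + noise_var                                # predictive variance at x_obs
    gain = cov @ h / s
    mean = mean + gain * (y_obs - h @ mean)
    cov = cov - np.outer(gain, h @ cov)
    return mean, cov

# Feed synthetic observations in one at a time.
rng = np.random.default_rng(1)
xs = rng.uniform(0, 10, 50)
ys = np.sin(xs) + rng.normal(0, 0.1, xs.size)
for x, y in zip(xs, ys):
    mean, cov = update(mean, cov, x, y)

# Posterior mean of f at a new location.
x_star = 2.5
print(rbf(np.array([x_star]), support) @ K_uu_inv @ mean)
```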
Abstract:
Objectives: Hospital discharge is a transition of care, where medication discrepancies are likely to occur and potentially cause patient harm. The purpose of our study was to assess the prescribing accuracy of hospital discharge medication orders at a London, UK teaching hospital. The timeliness of the discharge summary reaching the general practitioner (GP, family physician) was also assessed, based on the 72 h target referenced in the Care Quality Commission report.1 Method: 501 consecutive discharge medication orders from 142 patients were examined and the following records were compared: (1) the final inpatient drug chart at the point of discharge, (2) the printed, signed copy of the initial to take away (TTA) discharge summary produced electronically by the physician, (3) the pharmacist's handwritten amendments on the initial TTA, (4) the final electronic patient discharge summary record, and (5) the patient's final take-home medication from the hospital. Discrepancies between the physician's order (6) and the pharmacist's change(s) (7) were classified into two types of failure – 'failure to make a required change' and 'change where none was required'. Once the patient was discharged, the patient's GP was contacted 72 h after discharge to see if the patient discharge summary, sent by post or via email, had been received. Results: Over half of the patients seen (73 out of 142) had at least one discrepancy that was made on the initial TTA by the doctor and amended by the pharmacist. Out of the 501 drugs, there were 140 discrepancies; 108 were 'failures to make a required change' (77%) and 32 were 'changes where none were required' (23%). The types of 'failure to make a required change' discrepancies found between the initial TTA and the pharmacist's amendments were paracetamol and ibuprofen changes (dose banding) 38 (27%), directions of use 34 (24%), incorrect formulation of medication 28 (20%) and incorrect strength 8 (6%). The types of 'change where none was required' discrepancies were omitted medication 15 (11%), unnecessary drug 14 (10%) and incorrect medicine, including spelling mistakes, 3 (2%). After contacting the GPs of the discharged patients 72 h post-discharge, 49% had received the discharge summary and 45% had not; the remaining 6% were patients who were discharged without a GP. Conclusion: This study shows that doctor prescribing at discharge is often not accurate, and the interventions made by pharmacists to reconcile medications are important at this point of care. It was also found that half the discharge summaries had not reached the patient's family physician (according to the GP) within 72 h.
Abstract:
We show that a set of fundamental solutions to the parabolic heat equation, in which each element corresponds to a point source located on a given surface and the source points are dense on this surface, constitutes a linearly independent and dense set with respect to the standard inner product of square-integrable functions, on both the lateral and time boundaries. This result leads naturally to a method of numerically approximating solutions to the parabolic heat equation, denoted a method of fundamental solutions (MFS). A discussion of the convergence of such an approximation is included.
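As a small numerical companion to the statement above (an assumption-laden sketch, not the paper's proof), the Python code below restricts fundamental solutions of the 1D heat equation, with sources on a surface outside the space-time domain, to collocation points on the lateral and time boundaries, inspects the resulting collocation matrix, and fits boundary data by least squares, which is the MFS idea.

```python
import numpy as np

def F(x, t, y, tau):
    """Fundamental solution of u_t = u_xx, with the convention F = 0 for t <= tau."""
    x, t, y, tau = np.broadcast_arrays(x, t, y, tau)
    dt = t - tau
    out = np.zeros(dt.shape)
    m = dt > 0
    out[m] = np.exp(-(x[m] - y[m])**2 / (4*dt[m])) / np.sqrt(4*np.pi*dt[m])
    return out

# Source points on a surface outside the domain (0,1) x (0,1): two lines x = -0.5 and x = 1.5.
n_src = 15
tau_src = np.linspace(-0.5, 0.9, n_src)
Y = np.concatenate([np.full(n_src, -0.5), np.full(n_src, 1.5)])
TAU = np.concatenate([tau_src, tau_src])

# Collocation points on the parabolic boundary: initial line t = 0 plus the two lateral sides.
xb = np.linspace(0, 1, 25); tb = np.linspace(0.0, 1.0, 25)
Xc = np.concatenate([xb, np.zeros(25), np.ones(25)])
Tc = np.concatenate([np.zeros(25), tb, tb])

A = F(Xc[:, None], Tc[:, None], Y[None, :], TAU[None, :])
s = np.linalg.svd(A, compute_uv=False)
print("singular values: max %.2e, min %.2e" % (s.max(), s.min()))   # severe ill-conditioning is typical of MFS

# Fit the boundary data of the caloric function u(x, t) = x (g = x on the initial line,
# 0 on the left lateral boundary and 1 on the right), which here equals Xc.
g = Xc.copy()
c, *_ = np.linalg.lstsq(A, g, rcond=None)
print("relative boundary-fit residual:", np.linalg.norm(A @ c - g) / np.linalg.norm(g))
```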