35 results for Errors and omission


Relevance:

80.00%

Publisher:

Abstract:

Derivational morphology proposes meaningful connections between words and is largely unrepresented in lexical databases. This thesis presents a project to enrich a lexical database with morphological links and to evaluate their contribution to disambiguation. A lexical database with sense distinctions was required. WordNet was chosen because of its free availability and widespread use. Its suitability was assessed through critical evaluation with respect to specifications and criticisms, using a transparent, extensible model. The identification of serious shortcomings suggested a portable enrichment methodology, applicable to alternative resources. Although 40% of the most frequent words are prepositions, they have been largely ignored by computational linguists, so the addition of prepositions was also required. The preferred approach to morphological enrichment was to infer relations from phenomena discovered algorithmically. Both existing databases and existing algorithms can capture regular morphological relations, but cannot capture exceptions correctly; neither provides any semantic information. Some morphological analysis algorithms are subject to the fallacy that morphological analysis can be performed simply by segmentation. Morphological rules, grounded in observation and etymology, govern the association and attachment of suffixes and contribute to defining the meaning of morphological relationships. Specifying character substitutions circumvents the segmentation fallacy. Morphological rules are prone to undergeneration, minimised through a variable lexical validity requirement, and overgeneration, minimised by rule reformulation and by restricting monosyllabic output. Rules take into account the morphology of ancestor languages through co-occurrences of morphological patterns. Where multiple rules apply to an input suffix, their precedence must be established. The resistance of prefixations to segmentation has been addressed by identifying linking vowel exceptions and irregular prefixes. The automatic affix discovery algorithm applies heuristics to identify meaningful affixes and is combined with morphological rules into a hybrid model, fed only with empirical data collected without supervision. Further algorithms apply the rules optimally to automatically pre-identified suffixes and break words into their component morphemes. To handle exceptions, stoplists were created in response to initial errors and fed back into the model through iterative development, leading to 100% precision, contestable only on lexicographic criteria. Stoplist length is minimised by special treatment of monosyllables and reformulation of rules. 96% of words and phrases are analysed. 218,802 directed derivational links have been encoded in the lexicon rather than in the wordnet component of the model because the lexicon provides the optimal clustering of word senses. Both links and analyser are portable to an alternative lexicon. The evaluation uses the extended gloss overlaps disambiguation algorithm. The enriched model outperformed WordNet in terms of recall without loss of precision. The failure of all experiments to outperform disambiguation by frequency reflects on WordNet's sense distinctions.
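As a hedged illustration of the rule formalism the abstract describes (character substitutions rather than naive segmentation, with a lexical-validity check on the output), the sketch below applies one hypothetical suffix-substitution rule. The rule format, the lexicon contents and the example words are assumptions for illustration, not the thesis's actual data structures.

```python
# Minimal sketch of a character-substitution morphological rule with a
# lexical-validity check; rule format and lexicon contents are illustrative.
lexicon = {"decide", "decision", "explode", "explosion", "revise", "revision"}

# Hypothetical rule: nouns in "-sion" relate to verbs in "-de"/"-se" via
# substitution, not by stripping a segment (which would wrongly yield "*deci").
rule = {"input_suffix": "sion", "output_suffixes": ["de", "se"]}

def apply_rule(word, rule, lexicon):
    """Return lexically valid derivational sources proposed by the rule."""
    if not word.endswith(rule["input_suffix"]):
        return []
    stem = word[: -len(rule["input_suffix"])]
    candidates = [stem + out for out in rule["output_suffixes"]]
    # Lexical-validity requirement: keep only attested words, which curbs
    # the overgeneration the abstract mentions.
    return [c for c in candidates if c in lexicon]

print(apply_rule("decision", rule, lexicon))   # ['decide']
print(apply_rule("explosion", rule, lexicon))  # ['explode']
```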

Relevance:

80.00%

Publisher:

Abstract:

We report the case of a neologistic jargonaphasic and ask whether her target-related and abstruse neologisms are the result of a single deficit, which affects some items more severely than others, or two deficits: one to lexical access and the other to phonological encoding. We analyse both correct/incorrect performance and errors and apply both traditional and formal methods (maximum-likelihood estimation and model selection). All evidence points to a single deficit at the level of phonological encoding. Further characteristics are used to constrain the locus still further. V.S. does not show the type of length effect expected of a memory component, nor the pattern of errors associated with an articulatory deficit. We conclude that her neologistic errors can result from a single deficit at a level of phonological encoding that immediately follows lexical access where segments are represented in terms of their features. We do not conclude, however, that this is the only possible locus that will produce phonological errors in aphasia, or, indeed, jargonaphasia.
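The abstract mentions formal methods (maximum-likelihood estimation and model selection). A minimal sketch of how a one-deficit and a two-deficit account of correct/incorrect data might be compared via AIC is given below, under assumed binomial likelihoods; the counts and model forms are hypothetical and are not V.S.'s data or the paper's actual models.

```python
import numpy as np
from scipy.stats import binom
from scipy.optimize import minimize

# Hypothetical correct/incorrect counts per item set (not V.S.'s data).
n_trials = np.array([40, 40, 40])
n_correct = np.array([22, 15, 9])

def neg_log_lik(p, k=n_correct, n=n_trials):
    p = np.clip(p, 1e-6, 1 - 1e-6)
    return -binom.logpmf(k, n, p).sum()

# Single-deficit account: one success probability governs all item sets.
fit1 = minimize(lambda x: neg_log_lik(np.repeat(x, 3)), x0=[0.5],
                bounds=[(0.01, 0.99)])
# Two-deficit account: an extra parameter lets one item set differ.
fit2 = minimize(lambda x: neg_log_lik(np.array([x[0], x[0], x[1]])),
                x0=[0.5, 0.3], bounds=[(0.01, 0.99)] * 2)

aic = lambda fit, k: 2 * k + 2 * fit.fun  # AIC = 2k - 2 log L
print("1-parameter AIC:", round(aic(fit1, 1), 2))
print("2-parameter AIC:", round(aic(fit2, 2), 2))
```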

Relevance:

80.00%

Publisher:

Abstract:

Mechanical, physical and chemical changes in the surface of commercial thin film metal evaporated magnetic recording media have been correlated with recording error and signal degradation measurements. Modified and adapted commercial Hi-8 video recorders have been used for sample generation, whilst analytical techniques such as SXPS, IMS and SEM have been employed in the surface characterisation. The durability of the media was assessed through stop motion (still frame) and cycling tests, where error growth and signal degradation were measured as a function of running time. The tests were performed under ambient (22°C, 40% RH) and high humidity (22°C, 80% RH) conditions. Characterisation of the lubricant layer on each tape was performed through models based on XPS and angle resolved XPS. The lubricant thickness can significantly affect the durability and signal output level of a thin film tape and thus it is important that reliable quantification can be achieved. Various models were considered for determining the lubricant thickness, although ultimately the most suitable technique was deemed to be a model that assumed a uniform layer structure. In addition to thin film metal evaporated media, equivalent durability tests and surface analysis experiments were performed using a commercial metal particle tape in order that comparisons could be made between the two types of recording media. The signal performance of the thin film metal evaporated media was found to be quite different from that of the metal particle tape, since dropout errors and signal degradation increased at a much earlier stage. Extensive surface analyses enabled the mechanisms responsible for media failure and error growth to be identified in the ME and MP tapes, and these were found to result from cyclic stressing and fatigue on the immediate substrate of the media.
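For readers unfamiliar with how a uniform-layer model yields a lubricant thickness from XPS intensities, a standard overlayer attenuation relation of the kind alluded to is sketched below; the symbols are generic (d thickness, lambda inelastic mean free path, theta emission angle) and the exact model used in the study may differ.

```latex
% Substrate signal attenuated by a uniform lubricant overlayer of thickness d;
% overlayer signal grows toward its bulk value (equal \lambda assumed):
I_s = I_s^{\infty} \exp\!\left(-\frac{d}{\lambda \cos\theta}\right), \qquad
I_o = I_o^{\infty} \left[1 - \exp\!\left(-\frac{d}{\lambda \cos\theta}\right)\right]
% taking the normalised ratio and solving for d:
\quad\Rightarrow\quad
d = \lambda \cos\theta \,\ln\!\left(1 + \frac{I_o / I_o^{\infty}}{I_s / I_s^{\infty}}\right).
```

Varying theta, as in the angle-resolved XPS mentioned above, gives several such equations from one sample and so lets the uniform-layer assumption be checked as well as the thickness estimated.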

Relevance:

80.00%

Publisher:

Abstract:

This thesis is devoted to the tribology of the head-to-tape interface of linear tape recording systems, with the OnStream ADR™ system used as the experimental platform. Combining experimental characterisation with computer modelling, a comprehensive picture of the mechanisms involved in a tape recording system is drawn. The work is designed to isolate the mechanisms responsible for the physical spacing between head and tape, with the aim of minimising spacing losses and errors and optimising signal output. Standard heads (used in current ADR products) and prototype heads (DLC- and SPL-coated heads, and dummy heads built from Al2O3-TiC and from alternative single-phase ceramics intended to constitute the head tape-bearing surface) are tested in a controlled environment for up to 500 hours (exceptionally 1000 hours). Evidence of wear on the standard head is mainly observable as preferential wear of the TiC phase of the Al2O3-TiC ceramic. The TiC grains are believed to delaminate due to a fatigue wear mechanism, a hypothesis further confirmed via modelling, which locates the maximum von Mises equivalent stress at a depth equivalent to the TiC recession (20 to 30 nm). Debris from the delaminated TiC residues is moreover found trapped within the pole-tip recession, and is therefore assumed to provide three-body abrasive particles, thus increasing the pole-tip recession. Iron-rich stain is found over the cycled standard head surface (preferentially over the pole-tip and, to a lesser extent, over the TiC grains) under all environmental conditions except high temperature/humidity, where mainly organic stain was apparent. Temperature (local or global) affects staining rate and appearance; stain transfer is generally promoted at high temperature. Humidity affects transfer rate and quantity; low humidity produces thinner stains at a higher rate. Stain generally forms preferentially on head materials with high electrical conductivity, i.e. Permalloy and TiC. Stains are found to decrease the friction at the head-to-tape interface, delay the hollowing-out of the TiC recession and act as a protective soft coating reducing the pole-tip recession. This is obviously at the expense of an additional spacing at the head-to-tape interface of the order of 20 nm. Two kinds of wear-resistant coating are tested: diamond-like carbon (DLC) and superprotective layer (SPL), 10 nm and 20 to 40 nm thick, respectively. The DLC coating disappears within 100 hours, possibly due to abrasive and fatigue wear. SPL coatings are generally more resistant, particularly at high temperature and low humidity, possibly in relation to stain transfer. 20 nm coatings are found to rely on the wear behaviour of the substrate, whereas 40 nm coatings are found to rely on the adhesive strength at the coating/substrate interface. These observations seem to locate the wear-driving forces 40 nm below the surface, and hence indicate that for coatings in the 10 nm thickness range (i.e. compatible with high-density recording) the substrate resistance must be taken into account. Single-phase ceramics, as candidates for a wear-resistant tape-bearing surface, are tested in the form of full-contour dummy heads. The absence of a second phase eliminates the preferential wear observed at the Al2O3-TiC surface; very low wear rates and no evidence of brittle fracture are observed.
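The location of the maximum von Mises stress a few tens of nanometres below the surface is consistent with classical contact mechanics. As a hedged aside (the thesis uses its own contact model, not necessarily the textbook sphere-on-flat case), the standard Hertzian quantities are:

```latex
% Sphere-on-flat Hertzian contact under load W, reduced radius R:
\frac{1}{E^{*}} = \frac{1-\nu_1^{2}}{E_1} + \frac{1-\nu_2^{2}}{E_2}, \qquad
a = \left(\frac{3WR}{4E^{*}}\right)^{1/3}, \qquad
p_0 = \frac{3W}{2\pi a^{2}}.
```

In the textbook solution the maximum shear and von Mises equivalent stresses do not occur at the surface but at a depth of roughly 0.48a, with a peak shear stress of about 0.31 p0 (for Poisson's ratio near 0.3), which is the kind of subsurface maximum the fatigue-delamination hypothesis above appeals to.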

Relevance:

80.00%

Publisher:

Abstract:

In this paper we develop a set of novel Markov chain Monte Carlo algorithms for Bayesian smoothing of partially observed non-linear diffusion processes. The sampling algorithms developed herein use a deterministic approximation to the posterior distribution over paths as the proposal distribution for a mixture of an independence and a random walk sampler. The approximating distribution is sampled by simulating an optimized time-dependent linear diffusion process derived from the recently developed variational Gaussian process approximation method. Flexible blocking strategies are introduced to further improve mixing, and thus the efficiency, of the sampling algorithms. The algorithms are tested on two diffusion processes: one with double-well potential drift and another with SINE drift. The new algorithm's accuracy and efficiency is compared with state-of-the-art hybrid Monte Carlo based path sampling. It is shown that in practical, finite-sample applications the algorithm is accurate except in the presence of large observation errors and low observation densities, which lead to a multi-modal structure in the posterior distribution over paths. More importantly, the variational approximation assisted sampling algorithm outperforms hybrid Monte Carlo in terms of computational efficiency, except when the diffusion process is densely observed with small errors, in which case both algorithms are equally efficient.
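A minimal sketch of the first test system, a diffusion with double-well potential drift, simulated by Euler-Maruyama; the drift constant, noise level and discretisation below are assumptions for illustration, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def drift(x):
    # Double-well potential U(x) = (x^2 - 1)^2, so drift = -U'(x) = 4x(1 - x^2).
    return 4.0 * x * (1.0 - x ** 2)

def euler_maruyama(x0=-1.0, sigma=0.7, dt=0.01, n_steps=5000):
    """Simulate dX_t = drift(X_t) dt + sigma dW_t on a fixed time grid."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        x[k + 1] = x[k] + drift(x[k]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

path = euler_maruyama()
print("fraction of time in right-hand well:", np.mean(path > 0))
```

This only simulates the latent process; in the smoothing problem the abstract addresses, such a path is observed noisily at a few time points and the MCMC sampler targets the posterior over the whole path given those observations.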

Relevance:

80.00%

Publisher:

Abstract:

Methods of dynamic modelling and analysis of structures, for example the finite element method, are well developed. However, it is generally agreed that accurate modelling of complex structures is difficult, and for critical applications it is necessary to validate or update the theoretical models using data measured from actual structures. The techniques of identifying the parameters of linear dynamic models using vibration test data have attracted considerable interest recently. However, no method has received general acceptance, due to a number of difficulties. These difficulties arise mainly from (i) the incomplete number of vibration modes that can be excited and measured, (ii) the incomplete number of coordinates that can be measured, (iii) inaccuracy in the experimental data and (iv) inaccuracy in the model structure. This thesis reports on a new approach to updating the parameters of a finite element model as well as a lumped parameter model with a diagonal mass matrix. The structure and its theoretical model are equally perturbed by adding mass or stiffness and an incomplete set of eigen-data is measured. The parameters are then identified by an iterative updating of the initial estimates, by sensitivity analysis, using eigenvalues or both eigenvalues and eigenvectors of the structure before and after perturbation. It is shown that with a suitable choice of the perturbing coordinates exact parameters can be identified if the data and the model structure are exact. The theoretical basis of the technique is presented. To cope with measurement errors and possible inaccuracies in the model structure, a well-known Bayesian approach is used to minimize the least squares difference between the updated and the initial parameters. The eigen-data of the structure with added mass or stiffness is also determined using the frequency response data of the unmodified structure by a structural modification technique. Thus, mass or stiffness do not have to be added physically. The mass-stiffness addition technique is demonstrated by simulation examples and laboratory experiments on beams and an H-frame.
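As a schematic of the sensitivity-based iterative updating and the Bayesian regularisation mentioned above (the notation is generic and the thesis's exact weighting matrices may differ):

```latex
% Linearised eigenvalue sensitivity about the current parameter estimate p_k:
\Delta\lambda \;\approx\; S_k \,\Delta p, \qquad
S_k = \left.\frac{\partial \lambda}{\partial p}\right|_{p_k},
% iterative update from measured eigenvalues \lambda_m of the (perturbed) structure:
p_{k+1} = p_k + S_k^{+}\bigl(\lambda_m - \lambda(p_k)\bigr),
% with noisy data, a Bayesian (regularised least-squares) step instead minimises
p_{k+1} = \arg\min_{p}\;
\bigl(\lambda_m - \lambda(p_k) - S_k (p - p_k)\bigr)^{\!\top} W_\lambda
\bigl(\lambda_m - \lambda(p_k) - S_k (p - p_k)\bigr)
+ (p - p_0)^{\!\top} W_p \,(p - p_0).
```

The second term penalises departure from the initial parameter estimates p_0, which is the "least squares difference between the updated and the initial parameters" referred to in the abstract.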

Relevance:

80.00%

Publisher:

Abstract:

Background: The controversy surrounding the non-uniqueness of predictive gene lists (PGLs), small selected subsets of genes drawn from the very large number of candidates available in DNA microarray experiments, is now widely acknowledged [1]. Many of these studies have focused on constructing discriminative semi-parametric models and as such are also subject to the issue of random correlations of sparse model selection in high dimensional spaces. In this work we outline a different approach based around an unsupervised patient-specific nonlinear topographic projection of predictive gene lists. Methods: We construct nonlinear topographic projection maps based on inter-patient gene-list relative dissimilarities. The Neuroscale, Stochastic Neighbor Embedding (SNE) and Locally Linear Embedding (LLE) techniques have been used to construct two-dimensional projective visualisation plots of 70-dimensional PGLs per patient. Classifiers are also constructed to identify the prognosis indicator of each patient using the resulting projections from those visualisation techniques, and we investigate whether, a posteriori, the two prognosis groups are separable on the evidence of the gene lists. A literature-proposed predictive gene list for breast cancer is benchmarked against a separate gene list using the above methods. Generalisation ability is investigated by using the mapping capability of Neuroscale to visualise the follow-up study, but based on the projections derived from the original dataset. Results: The results indicate that small subsets of patient-specific PGLs have insufficient prognostic dissimilarity to permit a distinction between the two prognosis groups. Uncertainty and diversity across multiple gene expressions prevents unambiguous or even confident patient grouping. Comparative projections across different PGLs provide similar results. Conclusion: The random correlation effect to an arbitrary outcome induced by small subset selection from very high dimensional interrelated gene expression profiles leads to an outcome with associated uncertainty. This continuum and uncertainty preclude any attempts at constructing discriminative classifiers. However, a patient's gene expression profile could possibly be used in treatment planning, based on knowledge of other patients' responses. We conclude that many of the patients involved in such medical studies are intrinsically unclassifiable on the basis of the provided PGL evidence. This additional category of 'unclassifiable' should be accommodated within medical decision support systems if serious errors and unnecessary adjuvant therapy are to be avoided.
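A hedged sketch of the visualisation step: projecting 70-dimensional per-patient PGL vectors to two dimensions with t-SNE (a descendant of SNE) and LLE from scikit-learn. Neuroscale has no standard scikit-learn implementation and is omitted, and the data here are random placeholders rather than the breast cancer expression profiles used in the study.

```python
import numpy as np
from sklearn.manifold import TSNE, LocallyLinearEmbedding

# Placeholder for the real data: one 70-dimensional PGL vector per patient.
rng = np.random.default_rng(1)
X = rng.normal(size=(97, 70))          # 97 patients x 70 genes (illustrative)
labels = rng.integers(0, 2, size=97)   # stand-in for the prognosis indicator

# Stochastic-neighbour-style embedding (t-SNE) of the patient profiles.
tsne_xy = TSNE(n_components=2, perplexity=15, random_state=0).fit_transform(X)

# Locally Linear Embedding of the same profiles.
lle_xy = LocallyLinearEmbedding(n_components=2, n_neighbors=12,
                                random_state=0).fit_transform(X)

# A downstream classifier on these 2-D projections would then test whether
# the two prognosis groups are separable on the evidence of the gene lists.
print(tsne_xy.shape, lle_xy.shape)  # (97, 2) (97, 2)
```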

Relevance:

80.00%

Publisher:

Abstract:

FULL TEXT: Like many people, one of my favourite pastimes over the holiday season is to watch the great movies offered on the television channels and the new releases in the movie theatres, or catching up on those DVDs that you have been wanting to watch all year. Recently we had the new 'Star Wars' movie, 'The Force Awakens', which is reckoned to become the highest grossing movie of all time, and the latest offering from James Bond, 'Spectre' (which included, for the car aficionados amongst you, the gorgeous new Aston Martin DB10). It is always amusing to see how vision correction or eye injury is dealt with by movie makers. Spy movies and science fiction movies have a free hand to design aliens with multiple eyes on stalks, retina-scanning door locks or goggles that can see through walls. Eye surgery is usually shown as some kind of simplified day-case laser treatment that gives instant results, apart from the great scene in the original 'Terminator' movie where Arnold Schwarzenegger's android character sustains an injury to one eye and then proceeds to remove the humanoid covering of this mechanical eye over a bathroom sink. I suppose it is much more difficult to try and include contact lenses in such movies, although you may recall the film 'Charlie's Angels', which did have a scene where one of the Angels wore a contact lens with a retinal image imprinted on it so she could bypass a retinal scan door lock, and the Eddie Murphy spy movie 'I-Spy', in which he wore contact lenses containing electronic gadgetry that allowed whatever he was looking at to be beamed back to someone else, a kind of remote video camera device. Maybe we aren't quite there in terms of the devices available, but these things are probably not the preserve of science fiction anymore, as the technology to put them together does exist. The technology to incorporate electronics into contact lenses is being developed and I am sure we will be reporting on it in the near future. In the meantime we can continue to enjoy the unrealistic scenes of eye swapping as in the film 'Minority Report' (with Tom Cruise). Much closer to home than a galaxy far, far away, in this issue you can find articles on topics from the much nearer future. More and more optometrists in the UK are becoming registered for therapeutic work as independent prescribers and the number is likely to rise in the near future. These practitioners will be interested in the review paper by Michael Doughty, who is a member of the CLAE editorial panel (soon to be renamed the Jedi Council!), on prescribing drugs as part of the management of chronic meibomian gland dysfunction. Contact lenses play an active role in myopia control, and orthokeratology has been used not only to help provide refractive correction but also in the retardation of myopia. In this issue there are three articles related to this topic. Firstly, an excellent paper looking at the link between higher spherical equivalent refractive errors and slower axial elongation. Secondly, a paper that discusses the effectiveness and safety of overnight orthokeratology with high-permeability lens material. Finally, a paper that looks at the stabilisation of early adult-onset myopia. Whilst we are always eager for new and exciting developments in contact lenses and related instrumentation, in this issue of CLAE there is a demonstration of a novel and practical use of a smartphone to assist anterior segment imaging, and suggestions of how this may be used in telemedicine.
It is not hard to imagine someone taking an image remotely and transmitting it back to a central diagnostic centre, with the relevant expertise housed in one place where the information can be interpreted and instructions given back to the remote site. Back to 'Star Wars': you will recall that in the film 'The Phantom Menace', when Qui-Gon Jinn first meets Anakin Skywalker on Tatooine, he takes a sample of his blood and sends a scan of it to Obi-Wan Kenobi for analysis, and they find that the boy has the highest midichlorian count ever seen. On behalf of the CLAE Editorial Board (or Jedi Council) and the BCLA Council (the Senate of the Republic) we wish you a great 2016 and 'may the contact lens force be with you'. Or let me put that another way: 'the CLAE Editorial Board and BCLA Council, on behalf of, a great 2016, we wish for you!'

Relevance:

80.00%

Publisher:

Abstract:

Data quality is a difficult notion to define precisely, and different communities have different views and understandings of the subject. This causes confusion, a lack of harmonization of data across communities and omission of vital quality information. For some existing data infrastructures, data quality standards cannot address the problem adequately and cannot fulfil all user needs or cover all concepts of data quality. In this paper we discuss some philosophical issues on data quality. We identify actual user needs on data quality, review existing standards and specifications on data quality, and propose an integrated model for data quality in the field of Earth observation. We also propose a practical mechanism for applying the integrated quality information model to a large number of datasets through metadata inheritance. While our data quality management approach is in the domain of Earth observation, we believe the ideas and methodologies can be applied to wider domains and disciplines to facilitate quality-enabled scientific research.
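One way to picture the metadata-inheritance mechanism (purely illustrative; the paper's actual model and encoding are not reproduced here): quality statements recorded once at collection level are inherited by member datasets unless a dataset overrides them locally.

```python
# Illustrative quality-metadata inheritance: dataset-level records override
# collection-level defaults; anything unspecified is inherited.
collection_quality = {
    "lineage": "Derived from Level-1 satellite radiances",      # hypothetical
    "validation": "Matched against an in-situ reference network",
    "uncertainty": "collection-wide estimate",
}

dataset_overrides = {
    "uncertainty": "scene-specific estimate (cloud-contaminated)",  # local refinement
}

def effective_quality(collection, dataset):
    """Merge inherited collection-level quality with dataset-level overrides."""
    merged = dict(collection)
    merged.update(dataset)
    return merged

print(effective_quality(collection_quality, dataset_overrides))
```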

Relevance:

80.00%

Publisher:

Abstract:

Golfers, coaches and researchers alike have keyed in on golf putting as an important aspect of overall golf performance. Of the three principal putting tasks (green reading, alignment and the putting action phase), the putting action phase has attracted the most attention from coaches, players and researchers. This phase includes the alignment of the club with the ball, the swing, and ball contact. A significant amount of research in this area has focused on measuring golfers' vision strategies with eye tracking equipment. Unfortunately this research suffers from a number of shortcomings, which limit its usefulness. The purpose of this thesis was to address some of these shortcomings. The primary objective of this thesis was to re-evaluate golfers' putting vision strategies using binocular eye tracking equipment and to define a new, optimal putting vision strategy associated with both higher skill and success. In order to facilitate this research, bespoke computer software was developed and validated, and new gaze behaviour criteria were defined. Additionally, the effects of training (habitual) and competition conditions on the putting vision strategy were examined, as was the effect of ocular dominance. Finally, methods for improving golfers' binocular vision strategies are discussed, and a clinical plan for the optometric management of the golfer's vision is presented. The clinical management plan includes the correction of fundamental aspects of golfers' vision, including monocular refractive errors and binocular vision defects, as well as the enhancement of their putting vision strategy, with the overall aim of improving performance on the golf course. This research has been undertaken in order to gain a better understanding of the human visual system and how it relates to the sport performance of golfers specifically. Ultimately, the analysis techniques and methods developed are applicable to the assessment of visual performance in all sports.

Relevance:

80.00%

Publisher:

Abstract:

Aims - To characterize the population pharmacokinetics of ranitidine in critically ill children and to determine the influence of various clinical and demographic factors on its disposition. Methods - Data were collected prospectively from 78 paediatric patients (n = 248 plasma samples) who received oral or intravenous ranitidine for prophylaxis against stress ulcers, gastrointestinal bleeding or the treatment of gastro-oesophageal reflux. Plasma samples were analysed using high-performance liquid chromatography, and the data were subjected to population pharmacokinetic analysis using nonlinear mixed-effects modelling. Results - A one-compartment model best described the plasma concentration profile, with an exponential structure for interindividual errors and a proportional structure for intra-individual error. After backward stepwise elimination, the final model showed a significant decrease in objective function value (−12.618; P < 0.001) compared with the weight-corrected base model. Final parameter estimates for the population were 32.1 l h−1 for total clearance and 285 l for volume of distribution, both allometrically modelled for a 70 kg adult. Final estimates for absorption rate constant and bioavailability were 1.31 h−1 and 27.5%, respectively. No significant relationship was found between age and weight-corrected ranitidine pharmacokinetic parameters in the final model, with the covariate for cardiac failure or surgery being shown to reduce clearance significantly by a factor of 0.46. Conclusions - Currently, ranitidine dose recommendations are based on children's weights. However, our findings suggest that a dosing scheme that takes into consideration both weight and cardiac failure/surgery would be more appropriate in order to avoid administration of higher or more frequent doses than necessary.
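To make the reported population estimates concrete, the sketch below computes an individual clearance and volume from the final model's parameters. The allometric exponents (0.75 for clearance, 1 for volume) are the conventional choices implied by "allometrically modelled for a 70 kg adult" and are an assumption here, as is the multiplicative form of the cardiac failure/surgery covariate.

```python
def ranitidine_parameters(weight_kg, cardiac_failure_or_surgery=False):
    """Individual CL (l/h) and V (l) from the reported population estimates.

    Population values from the abstract: CL = 32.1 l/h and V = 285 l for a
    70 kg adult; cardiac failure or surgery reduces CL by a factor of 0.46.
    The allometric exponents (0.75 and 1.0) are assumed conventions.
    """
    cl = 32.1 * (weight_kg / 70.0) ** 0.75
    if cardiac_failure_or_surgery:
        cl *= 0.46
    v = 285.0 * (weight_kg / 70.0)
    return cl, v

cl, v = ranitidine_parameters(weight_kg=10, cardiac_failure_or_surgery=True)
print(f"CL = {cl:.1f} l/h, V = {v:.1f} l, t1/2 = {0.693 * v / cl:.1f} h")
```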

Relevance:

80.00%

Publisher:

Abstract:

Objectives: To conduct an independent evaluation of the first phase of the Health Foundation's Safer Patients Initiative (SPI), and to identify the net additional effect of SPI and any differences in changes in participating and non-participating NHS hospitals. Design: Mixed method evaluation involving five substudies, before and after design. Setting: NHS hospitals in United Kingdom. Participants: Four hospitals (one in each country in the UK) participating in the first phase of the SPI (SPI1); 18 control hospitals. Intervention: The SPI1 was a compound (multicomponent) organisational intervention delivered over 18 months that focused on improving the reliability of specific frontline care processes in designated clinical specialties and promoting organisational and cultural change. Results: Senior staff members were knowledgeable and enthusiastic about SPI1. There was a small (0.08 points on a 5 point scale) but significant (P<0.01) effect in favour of the SPI1 hospitals in one of 11 dimensions of the staff questionnaire (organisational climate). Qualitative evidence showed only modest penetration of SPI1 at medical ward level. Although SPI1 was designed to engage staff from the bottom up, it did not usually feel like this to those working on the wards, and questions about legitimacy of some aspects of SPI1 were raised. Of the five components to identify patients at risk of deterioration - monitoring of vital signs (14 items); routine tests (three items); evidence based standards specific to certain diseases (three items); prescribing errors (multiple items from the British National Formulary); and medical history taking (11 items) - there was little net difference between control and SPI1 hospitals, except in relation to quality of monitoring of acute medical patients, which improved on average over time across all hospitals. Recording of respiratory rate increased to a greater degree in SPI1 than in control hospitals; in the second six hours after admission recording increased from 40% (93) to 69% (165) in control hospitals and from 37% (141) to 78% (296) in SPI1 hospitals (odds ratio for "difference in difference" 2.1, 99% confidence interval 1.0 to 4.3; P=0.008). Use of a formal scoring system for patients with pneumonia also increased over time (from 2% (102) to 23% (111) in control hospitals and from 2% (170) to 9% (189) in SPI1 hospitals), which favoured controls and was not significant (0.3, 0.02 to 3.4; P=0.173). There were no improvements in the proportion of prescription errors and no effects that could be attributed to SPI1 in non-targeted generic areas (such as enhanced safety culture). On some measures, the lack of effect could be because compliance was already high at baseline (such as use of steroids in over 85% of cases where indicated), but even when there was more room for improvement (such as in quality of medical history taking), there was no significant additional net effect of SPI1. There were no changes over time or between control and SPI1 hospitals in errors or rates of adverse events in patients in medical wards. Mortality increased from 11% (27) to 16% (39) among controls and decreased from 17% (63) to 13% (49) among SPI1 hospitals, but the risk adjusted difference was not significant (0.5, 0.2 to 1.4; P=0.085). Poor care was a contributing factor in four of the 178 deaths identified by review of case notes. The survey of patients showed no significant differences apart from an increase in perception of cleanliness in favour of SPI1 hospitals.
Conclusions: The introduction of SPI1 was associated with improvements in one of the types of clinical process studied (monitoring of vital signs) and one measure of staff perceptions of organisational climate. There was no additional effect of SPI1 on other targeted issues nor on other measures of generic organisational strengthening.
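For readers unfamiliar with the "difference in difference" odds ratio quoted in the results, its unadjusted form is sketched below; the published estimate of 2.1 comes from a risk-adjusted model, so a raw calculation from the percentages alone will not reproduce it exactly.

```latex
% Difference-in-difference odds ratio for an improvement over time:
\mathrm{OR}_{\mathrm{DiD}}
= \frac{\mathrm{odds}(\text{after, SPI1}) \,/\, \mathrm{odds}(\text{before, SPI1})}
       {\mathrm{odds}(\text{after, control}) \,/\, \mathrm{odds}(\text{before, control})},
\qquad \mathrm{odds}(p) = \frac{p}{1-p}.
```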

Relevance:

80.00%

Publisher:

Abstract:

The Electronic Patient Record (EPR) is being developed by many hospitals in the UK and across the globe. We class an EPR system as a type of Knowledge Management System (KMS), in that it is a technological tool developed to support the process of knowledge management (KM). Healthcare organisations aim to use these systems to provide a vehicle for more informed and improved clinical decision making, thereby delivering reduced errors and risks, enhanced quality and, consequently, enhanced patient safety. Finding an effective way for a healthcare organisation to practically implement these systems is essential. In this study we use the concept of the business process approach to KM as a theoretical lens to analyse and explore how a large NHS teaching hospital developed, executed and practically implemented an EPR system. This theory advocates the importance of taking into account all organisational activities, the business processes, in considering any KM initiatives. Approaching KM through business processes allows for a more holistic view of the requirements across a process: emphasis is placed on how particular activities are performed, how they are structured, and what knowledge is demanded, not just supplied, across each process. This falls in line with the increased emphasis in healthcare on patient-centred approaches to care delivery. We have found in previous research that hospitals are happy with the delivery of patient care being referred to as their 'business'. A qualitative study was conducted over a two and a half year period, with data collected from semi-structured interviews with eight members of the strategic management team, 12 clinical users and 20 patients, in addition to non-participant observation of meetings and documentary data. We believe that the inclusion of patients within the study may well be the first time this has been done in examining the implementation of a KMS. The theoretical propositions strategy was used as the overarching approach for data analysis. Here, initial theoretical research themes and propositions were used to help shape and organise the case study analysis. This paper will present preliminary findings about the hospital's business strategy and its links to the KMS strategy and process.

Relevance:

80.00%

Publisher:

Abstract:

This paper describes work carried out to develop methods of verifying that machine tools are capable of machining parts to within specification, immediately before carrying out critical material removal operations, and with negligible impact on process times. A review of machine tool calibration and verification technologies identified that current techniques were not suitable due to their requirements for significant time and skilled human intervention. A 'solution toolkit' is presented, consisting of a selection of circular tests and artefact probing which are able to rapidly verify the kinematic errors, and in some cases also the dynamic errors, of different types of machine tool, together with supplementary methods for tool and spindle error detection. A novel artefact probing process is introduced which simplifies data processing so that the process can be readily automated using only the native machine tool controller. Laboratory testing and industrial case studies are described which demonstrate the effectiveness of this approach.
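As a hedged illustration of the kind of data processing that circular tests and artefact probing rely on (not the authors' specific toolkit), the sketch below fits a circle to probed points by linear least squares, yielding centre offsets and a radius error that can be compared against the artefact's calibrated geometry; the artefact size, offsets and noise level are invented for the example.

```python
import numpy as np

def fit_circle(xy):
    """Algebraic least-squares circle fit (Kasa method) to probed points.

    Solves x^2 + y^2 = 2*a*x + 2*b*y + c for the centre (a, b) and
    radius sqrt(a^2 + b^2 + c).
    """
    x, y = xy[:, 0], xy[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b_vec = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, b_vec, rcond=None)
    return (a, b), np.sqrt(a * a + b * b + c)

# Simulated probing of a nominally 50 mm radius ring artefact with a small
# centre offset and measurement noise (numbers are illustrative only).
rng = np.random.default_rng(2)
angles = np.linspace(0, 2 * np.pi, 12, endpoint=False)
points = np.column_stack([0.02 + 50.0 * np.cos(angles),
                          -0.01 + 50.0 * np.sin(angles)])
points += rng.normal(scale=0.002, size=points.shape)

centre, radius = fit_circle(points)
print(f"centre offset = {centre}, radius error = {radius - 50.0:+.4f} mm")
```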

Relevance:

80.00%

Publisher:

Abstract:

In this paper we develop a set of novel Markov chain Monte Carlo algorithms for Bayesian smoothing of partially observed non-linear diffusion processes. The sampling algorithms developed herein use a deterministic approximation to the posterior distribution over paths as the proposal distribution for a mixture of an independence and a random walk sampler. The approximating distribution is sampled by simulating an optimized time-dependent linear diffusion process derived from the recently developed variational Gaussian process approximation method. The novel diffusion bridge proposal derived from the variational approximation allows the use of a flexible blocking strategy that further improves mixing, and thus the efficiency, of the sampling algorithms. The algorithms are tested on two diffusion processes: one with double-well potential drift and another with SINE drift. The new algorithm's accuracy and efficiency is compared with state-of-the-art hybrid Monte Carlo based path sampling. It is shown that in practical, finite-sample applications the algorithm is accurate except in the presence of large observation errors and low observation densities, which lead to a multi-modal structure in the posterior distribution over paths. More importantly, the variational approximation assisted sampling algorithm outperforms hybrid Monte Carlo in terms of computational efficiency, except when the diffusion process is densely observed with small errors, in which case both algorithms are equally efficient. © 2011 Springer-Verlag.