837 results for Errors and blunders, Literary.


Relevance:

100.00%

Publisher:

Abstract:

This thesis is devoted to tribology at the head-to-tape interface of linear tape recording systems, with the OnStream ADR™ system used as an experimental platform. Combining experimental characterisation with computer modelling, a comprehensive picture of the mechanisms at work in a tape recording system is drawn. The work is designed to isolate the mechanisms responsible for the physical spacing between head and tape, with the aim of minimising spacing losses and errors and optimising signal output. Standard heads (used in current ADR products) and prototype heads (DLC- and SPL-coated heads, plus dummy heads built from Al2O3-TiC and from alternative single-phase ceramics intended to constitute the head tape-bearing surface) are tested in controlled environments for up to 500 hours (exceptionally 1000 hours). Evidence of wear on the standard head is mainly observable as preferential wear of the TiC phase of the Al2O3-TiC ceramic. The TiC grains are believed to delaminate through a fatigue wear mechanism, a hypothesis further supported by modelling, which locates the maximum von Mises equivalent stress at a depth equivalent to the TiC recession (20 to 30 nm). Debris from the delaminated TiC is moreover found trapped within the pole-tip recession, and is therefore assumed to act as three-body abrasive particles, increasing the pole-tip recession. Iron-rich stain is found over the cycled standard head surface (preferentially over the pole tip and, to a lesser extent, over the TiC grains) in every environmental condition except high temperature/humidity, where mainly organic stain is apparent. Temperature (local or global) affects staining rate and appearance; stain transfer is generally promoted at high temperature. Humidity affects transfer rate and quantity; low humidity produces thinner stains at a higher rate. Stain preferentially targets head materials with high electrical conductivity, i.e. Permalloy and TiC.
Stains are found to decrease friction at the head-to-tape interface, to delay the hollowing-out of the TiC recession, and to act as a protective soft coating that reduces the pole-tip recession. This comes at the expense of additional spacing at the head-to-tape interface of the order of 20 nm. Two kinds of wear-resistant coating are tested: diamond-like carbon (DLC) and a superprotective layer (SPL), 10 nm and 20 to 40 nm thick, respectively. The DLC coating disappears within 100 hours, possibly due to abrasive and fatigue wear. SPL coatings are generally more resistant, particularly at high temperature and low humidity, possibly in relation to stain transfer. 20 nm coatings are found to depend on the wear behaviour of the substrate, whereas 40 nm coatings depend on the adhesive strength of the coating/substrate interface. These observations place the wear-driving forces about 40 nm below the surface, and hence indicate that for coatings in the 10 nm thickness range (i.e. compatible with high-density recording) the wear resistance of the substrate must be taken into account. Single-phase ceramics, as candidates for a wear-resistant tape-bearing surface, are tested in the form of full-contour dummy heads. The absence of a second phase eliminates the preferential wear observed at the Al2O3-TiC surface; very low wear rates and no evidence of brittle fracture are observed.

Relevance:

100.00%

Publisher:

Abstract:

In this paper we develop a set of novel Markov chain Monte Carlo algorithms for Bayesian smoothing of partially observed non-linear diffusion processes. The sampling algorithms developed herein use a deterministic approximation to the posterior distribution over paths as the proposal distribution for a mixture of an independence and a random walk sampler. The approximating distribution is sampled by simulating an optimized time-dependent linear diffusion process derived from the recently developed variational Gaussian process approximation method. Flexible blocking strategies are introduced to further improve mixing, and thus the efficiency, of the sampling algorithms. The algorithms are tested on two diffusion processes: one with a double-well potential drift and another with a sine drift. The new algorithms' accuracy and efficiency are compared with state-of-the-art hybrid Monte Carlo based path sampling. It is shown that in practical, finite-sample applications the algorithm is accurate except in the presence of large observation errors and low observation densities, which lead to a multi-modal structure in the posterior distribution over paths. More importantly, the variational-approximation-assisted sampling algorithm outperforms hybrid Monte Carlo in terms of computational efficiency, except when the diffusion process is densely observed with small errors, in which case both algorithms are equally efficient.
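The core proposal mechanism, a Metropolis-Hastings mixture of an independence sampler driven by a fixed approximating distribution and a local random walk, can be sketched on a toy one-dimensional target. The double-well density and Gaussian proposal below are illustrative stand-ins chosen for this sketch, not the paper's path-space construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the posterior: a 1-D double-well density
# (the paper's actual target is a distribution over whole diffusion paths).
def log_target(x):
    return -2.0 * (x**2 - 1.0)**2

# A fixed Gaussian plays the role of the deterministic (variational)
# approximation used as the independence proposal; mean/std are assumed.
q_mu, q_sd = 0.0, 1.2
def log_q(x):
    return -0.5 * ((x - q_mu) / q_sd)**2   # constant cancels in the ratio

def step(x, p_indep=0.5, rw_scale=0.3):
    """One move of the mixture sampler: independence or random walk."""
    if rng.random() < p_indep:
        x_new = rng.normal(q_mu, q_sd)
        # Independence-sampler MH ratio: p(x')q(x) / (p(x)q(x'))
        log_alpha = log_target(x_new) + log_q(x) - log_target(x) - log_q(x_new)
    else:
        x_new = x + rng.normal(0.0, rw_scale)
        log_alpha = log_target(x_new) - log_target(x)   # symmetric proposal
    return x_new if np.log(rng.random()) < log_alpha else x

x, chain = 1.0, []
for _ in range(20000):
    x = step(x)
    chain.append(x)
chain = np.asarray(chain)
# The target is symmetric, so both modes (near -1 and +1) should be visited.
```

The independence moves let the chain jump between the two modes (mimicking the global moves the variational proposal provides), while the random-walk moves refine locally.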

Relevance:

100.00%

Publisher:

Abstract:

Modern cosmopolitans are compulsive explorers in search of knowledge of world cultures; their role as translators of different languages enhances cross-cultural understanding. Defined as a "world citizen", the cosmopolitan emerges as a habitual city-dweller whose existence coincides with the emergence of the modern metropolis. Whether through Kant's blueprint for "world peace" or Goethe's "world literature", this study of cosmopolitanism introduces profiles of authors and intellectuals whose contribution to German and Austrian literary culture spans the globe.

Relevance:

100.00%

Publisher:

Abstract:

Methods of dynamic modelling and analysis of structures, for example the finite element method, are well developed. However, it is generally agreed that accurate modelling of complex structures is difficult, and for critical applications it is necessary to validate or update the theoretical models using data measured from actual structures. Techniques for identifying the parameters of linear dynamic models from vibration test data have attracted considerable interest recently. However, no method has gained general acceptance, owing to a number of difficulties. These difficulties stem mainly from (i) the incomplete number of vibration modes that can be excited and measured, (ii) the incomplete number of coordinates that can be measured, (iii) inaccuracy in the experimental data, and (iv) inaccuracy in the model structure. This thesis reports on a new approach to updating the parameters of a finite element model, as well as of a lumped-parameter model with a diagonal mass matrix. The structure and its theoretical model are equally perturbed by adding mass or stiffness, and an incomplete set of eigen-data is measured. The parameters are then identified by iteratively updating the initial estimates, via sensitivity analysis, using either the eigenvalues alone or both the eigenvalues and eigenvectors of the structure before and after perturbation. It is shown that, with a suitable choice of the perturbing coordinates, exact parameters can be identified if the data and the model structure are exact. The theoretical basis of the technique is presented. To cope with measurement errors and possible inaccuracies in the model structure, a well-known Bayesian approach is used to minimize the least-squares difference between the updated and the initial parameters. The eigen-data of the structure with added mass or stiffness can also be determined from the frequency response data of the unmodified structure by a structural modification technique, so the mass or stiffness does not have to be added physically.
The mass-stiffness addition technique is demonstrated by simulation examples and laboratory experiments on beams and an H-frame.
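The sensitivity-based iterative updating can be illustrated on a minimal lumped-parameter example. The two-degree-of-freedom spring chain and all numerical values below are invented for illustration, and the perturbation step is omitted because this tiny system is already identifiable from its two eigenvalues; the eigenvalue sensitivity used is the standard first-order result dλ_j/dk_i = φ_jᵀ(∂K/∂k_i)φ_j for mass-normalised modes:

```python
import numpy as np

# Illustrative 2-DOF fixed-free spring-mass chain (values not from the thesis)
M = np.diag([1.0, 1.0])

def K_of(k):
    """Stiffness matrix of a 2-spring chain with stiffnesses k1, k2."""
    k1, k2 = k
    return np.array([[k1 + k2, -k2],
                     [-k2,      k2]])

def eig(K):
    # With M = I the eigenvectors of K are already mass-normalised.
    return np.linalg.eigh(K)

k_true = np.array([3.0, 1.2])
lam_meas, _ = eig(K_of(k_true))          # "measured" eigenvalues

# Parameter sensitivities of the stiffness matrix
dK1 = np.array([[1.0, 0.0], [0.0, 0.0]])   # dK/dk1
dK2 = np.array([[1.0, -1.0], [-1.0, 1.0]]) # dK/dk2

k = np.array([2.0, 0.8])                 # initial (wrong) model estimate
for _ in range(30):
    lam, phi = eig(K_of(k))
    # Sensitivity matrix: S[j, i] = phi_j^T (dK/dk_i) phi_j
    S = np.array([[phi[:, j] @ dK @ phi[:, j] for dK in (dK1, dK2)]
                  for j in range(2)])
    k = k + np.linalg.solve(S, lam_meas - lam)   # Newton-type update
# k now recovers the "true" stiffnesses from eigenvalue data alone
```

With exact data and an exact model structure the iteration converges to the true parameters, echoing the thesis's exact-identification result; measurement noise would call for the Bayesian least-squares regularisation the abstract describes.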

Relevance:

100.00%

Publisher:

Abstract:

Background: The controversy surrounding the non-uniqueness of predictive gene lists (PGLs), i.e. small selected subsets of genes drawn from the very large number of candidates available in DNA microarray experiments, is now widely acknowledged [1]. Many of these studies have focused on constructing discriminative semi-parametric models, and as such are also subject to the random correlations that afflict sparse model selection in high-dimensional spaces. In this work we outline a different approach, based around an unsupervised, patient-specific, nonlinear topographic projection of predictive gene lists. Methods: We construct nonlinear topographic projection maps based on inter-patient gene-list relative dissimilarities. The Neuroscale, Stochastic Neighbor Embedding (SNE) and Locally Linear Embedding (LLE) techniques are used to construct two-dimensional projective visualisation plots of the 70-dimensional PGLs per patient. Classifiers are also constructed to identify the prognosis indicator of each patient from the resulting projections, and to investigate whether the two prognosis groups are separable a posteriori on the evidence of the gene lists. A literature-proposed predictive gene list for breast cancer is benchmarked against a separate gene list using the above methods. Generalisation ability is investigated by using the mapping capability of Neuroscale to visualise the follow-up study, based on the projections derived from the original dataset. Results: The results indicate that small subsets of patient-specific PGLs have insufficient prognostic dissimilarity to permit a distinction between the two prognosis groups. Uncertainty and diversity across multiple gene expressions prevent unambiguous, or even confident, patient grouping. Comparative projections across different PGLs give similar results.
Conclusion: Random correlation with an arbitrary outcome, induced by selecting small subsets from very high-dimensional, interrelated gene expression profiles, leads to outcomes with substantial associated uncertainty. This continuum of uncertainty precludes any attempt at constructing discriminative classifiers. However, a patient's gene expression profile could possibly be used in treatment planning, based on knowledge of other patients' responses. We conclude that many of the patients involved in such medical studies are intrinsically unclassifiable on the basis of the provided PGL evidence. This additional category of 'unclassifiable' should be accommodated within medical decision support systems if serious errors and unnecessary adjuvant therapy are to be avoided.
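The basic pipeline (inter-patient dissimilarities from per-patient gene lists, then a 2-D topographic embedding for visualisation) can be sketched with NumPy alone. Classical metric MDS stands in here for Neuroscale/SNE/LLE, and the 20-patient, 70-gene data are entirely synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for 70-dimensional PGL measurements on 20 "patients",
# with two loosely separated groups (invented data, not from the paper).
X = np.vstack([rng.normal(0.0, 1.0, (10, 70)),
               rng.normal(0.6, 1.0, (10, 70))])

# Inter-patient relative dissimilarities (Euclidean) drive the projection
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)

# Classical MDS: embed the dissimilarity matrix in 2-D via the
# double-centred Gram matrix B = -1/2 J D^2 J.
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D**2) @ J
lam, V = np.linalg.eigh(B)
idx = np.argsort(lam)[::-1][:2]                    # two largest eigenvalues
Y = V[:, idx] * np.sqrt(np.maximum(lam[idx], 0.0)) # 2-D coordinates

# Each patient now has a 2-D coordinate for a visualisation plot; the paper's
# point is that for real PGLs such plots do not separate the prognosis groups.
```

Swapping in SNE or LLE changes only how the dissimilarities are converted into coordinates; the diagnostic question (do the two prognosis groups separate?) is asked of the same kind of 2-D map.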

Relevance:

100.00%

Publisher:

Abstract:

FULL TEXT: Like many people, one of my favourite pastimes over the holiday season is to watch the great movies offered on the television channels and the new releases in the movie theatres, or to catch up on those DVDs that you have been wanting to watch all year. Recently we had the new 'Star Wars' movie, 'The Force Awakens', which is reckoned to become the highest-grossing movie of all time, and the latest offering from James Bond, 'Spectre' (which included, for the car aficionados amongst you, the gorgeous new Aston Martin DB10). It is always amusing to see how vision correction or eye injury is dealt with by movie makers. Spy movies and science fiction movies have a free hand to design aliens with multiple eyes on stalks, retina-scanning door locks or goggles that can see through walls. Eye surgery is usually shown as some kind of simplified day-case laser treatment that gives instant results, apart from the great scene in the original 'Terminator' movie where Arnold Schwarzenegger's android character sustains an injury to one eye and then proceeds to remove the humanoid covering of this mechanical eye over a bathroom sink. I suppose it is much more difficult to try and include contact lenses in such movies. Although you may recall the film 'Charlie's Angels', which did have a scene where one of the Angels wore a contact lens with a retinal image imprinted on it so she could bypass a retinal scan door lock, and an Eddie Murphy spy movie, 'I-Spy', in which he wore contact lenses with electronic gadgetry that allowed whatever he was looking at to be beamed back to someone else, a kind of remote video camera device. Maybe we aren't quite there in terms of the devices available, but these things are probably no longer the preserve of science fiction, as the technology does exist to put them together. The technology to incorporate electronics into contact lenses is being developed, and I am sure we will be reporting on it in the near future.
In the meantime we can continue to enjoy the unrealistic scenes of eye swapping, as in the film 'Minority Report' (with Tom Cruise). Much closer to home than a galaxy far, far away, in this issue you can find articles on topics much nearer to the present. More and more optometrists in the UK are becoming registered for therapeutic work as independent prescribers, and the number is likely to rise in the near future. These practitioners will be interested in the review paper by Michael Doughty, who is a member of the CLAE editorial panel (soon to be renamed the Jedi Council!), on prescribing drugs as part of the management of chronic meibomian gland dysfunction. Contact lenses play an active role in myopia control, and orthokeratology has been used not only to provide refractive correction but also to retard myopia. In this issue there are three articles related to this topic. Firstly, an excellent paper looking at the link between higher spherical equivalent refractive errors and the association with slower axial elongation. Secondly, a paper that discusses the effectiveness and safety of overnight orthokeratology with a high-permeability lens material. Finally, a paper that looks at the stabilisation of early adult-onset myopia. Whilst we are always eager for new and exciting developments in contact lenses and related instrumentation, in this issue of CLAE there is a demonstration of a novel and practical use of a smartphone to assist anterior segment imaging, with suggestions of how this may be used in telemedicine. It is not hard to imagine someone taking an image remotely and transmitting it back to a central diagnostic centre, with the relevant expertise housed in one place where the information can be interpreted and instruction given back to the remote site.
Back to 'Star Wars': you will recall in the film 'The Phantom Menace' that when Qui-Gon Jinn first meets Anakin Skywalker on Tatooine, he takes a sample of his blood and sends a scan of it back to Obi-Wan Kenobi for analysis, and they find that the boy has the highest midichlorian count ever seen. On behalf of the CLAE Editorial Board (or Jedi Council) and the BCLA Council (the Senate of the Republic), we wish for you a great 2016 and 'may the contact lens force be with you'. Or let me put that another way: 'the CLAE Editorial Board and BCLA Council, on behalf of, a great 2016, we wish for you!'

Relevance:

100.00%

Publisher:

Abstract:

Golfers, coaches and researchers alike have keyed in on golf putting as an important aspect of overall golf performance. Of the three principal putting tasks (green reading, alignment and the putting action phase), the putting action phase has attracted the most attention from coaches, players and researchers. This phase includes the alignment of the club with the ball, the swing, and ball contact. A significant amount of research in this area has focused on measuring golfers' vision strategies with eye tracking equipment. Unfortunately, this research suffers from a number of shortcomings, which limit its usefulness. The purpose of this thesis was to address some of these shortcomings. The primary objective was to re-evaluate golfers' putting vision strategies using binocular eye tracking equipment and to define a new, optimal putting vision strategy associated with both higher skill and greater success. To facilitate this research, bespoke computer software was developed and validated, and new gaze behaviour criteria were defined. Additionally, the effects of training (habitual) and competition conditions on the putting vision strategy were examined, as was the effect of ocular dominance. Finally, methods for improving golfers' binocular vision strategies are discussed, and a clinical plan for the optometric management of the golfer's vision is presented. The clinical management plan includes the correction of fundamental aspects of golfers' vision, including monocular refractive errors and binocular vision defects, as well as enhancement of their putting vision strategy, with the overall aim of improving performance on the golf course. This research was undertaken to gain a better understanding of the human visual system and how it relates to the sport performance of golfers specifically. Ultimately, the analysis techniques and methods developed are applicable to the assessment of visual performance in all sports.

Relevance:

100.00%

Publisher:

Abstract:

In the face of global population growth and the uneven distribution of water supply, a better knowledge of the spatial and temporal distribution of surface water resources is critical. Remote sensing provides a synoptic view of ongoing processes, which addresses the intricate nature of water surfaces and allows an assessment of the pressures placed on aquatic ecosystems. However, the main challenge in identifying water surfaces from remotely sensed data is the high variability of spectral signatures, both in space and time. In the last 10 years only a few operational methods have been proposed to map or monitor surface water at continental or global scale, and each of them shows limitations. The objective of this study is to develop and demonstrate the adequacy of a generic multi-temporal and multi-spectral image analysis method to detect water surfaces automatically and to monitor them in near-real time. The proposed approach, based on a transformation of the RGB color space into HSV, provides dynamic information at the continental scale. The validation of the algorithm showed very few omission errors and no commission errors, demonstrating the ability of the proposed algorithm to perform as effectively as human interpretation of the images. Validation of the permanent water surface product against an independent dataset derived from high-resolution imagery showed an accuracy of 91.5% and few commission errors. Potential applications of the proposed method are identified and discussed. The methodology that has been developed is generic: it can be applied to sensors with similar bands with good reliability and minimal effort. Moreover, this experiment at continental scale showed that the methodology is efficient for a large range of environmental conditions. Additional preliminary tests over other continents indicate that the proposed methodology could also be applied at the global scale without major difficulty.
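The RGB-to-HSV idea can be illustrated with the standard library's `colorsys` conversion and a crude fixed-threshold rule. The pixel values and thresholds below are assumptions chosen for illustration; the paper's method derives its decision rules from multi-temporal image statistics rather than fixed cut-offs:

```python
import colorsys

# Illustrative 0-1 RGB pixel values (invented; real inputs would be
# satellite band composites).
pixels = {
    "clear water": (0.10, 0.25, 0.45),   # dark, blue-dominant
    "vegetation":  (0.15, 0.40, 0.10),
    "bare soil":   (0.50, 0.40, 0.30),
}

def looks_like_water(r, g, b, hue_lo=0.45, hue_hi=0.75, v_max=0.6):
    """Crude HSV rule: bluish hue and low brightness (value).
    Thresholds are illustrative assumptions, not the paper's."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)   # h, s, v all in [0, 1]
    return hue_lo <= h <= hue_hi and v <= v_max

for name, rgb in pixels.items():
    print(name, looks_like_water(*rgb))
```

The advantage of HSV over raw RGB is that hue isolates the chromatic signature of water from its highly variable brightness, which is the property the abstract's transformation exploits.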

Relevance:

100.00%

Publisher:

Abstract:

Aims - To characterize the population pharmacokinetics of ranitidine in critically ill children and to determine the influence of various clinical and demographic factors on its disposition. Methods - Data were collected prospectively from 78 paediatric patients (n = 248 plasma samples) who received oral or intravenous ranitidine for prophylaxis against stress ulcers, gastrointestinal bleeding or the treatment of gastro-oesophageal reflux. Plasma samples were analysed using high-performance liquid chromatography, and the data were subjected to population pharmacokinetic analysis using nonlinear mixed-effects modelling. Results - A one-compartment model best described the plasma concentration profile, with an exponential structure for inter-individual error and a proportional structure for intra-individual error. After backward stepwise elimination, the final model showed a significant decrease in objective function value (−12.618; P < 0.001) compared with the weight-corrected base model. Final population parameter estimates were 32.1 l h⁻¹ for total clearance and 285 l for volume of distribution, both allometrically scaled to a 70 kg adult. Final estimates for the absorption rate constant and bioavailability were 1.31 h⁻¹ and 27.5%, respectively. No significant relationship was found between age and the weight-corrected ranitidine pharmacokinetic parameters in the final model, while the covariate for cardiac failure or surgery was shown to reduce clearance significantly, by a factor of 0.46. Conclusions - Currently, ranitidine dose recommendations are based on children's weights. However, our findings suggest that a dosing scheme that takes into consideration both weight and cardiac failure/surgery would be more appropriate, in order to avoid administration of higher or more frequent doses than necessary.
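The reported population estimates can be plugged into the standard one-compartment model with first-order absorption, C(t) = F·D·ka / (V·(ka − k)) · (e^(−k·t) − e^(−ka·t)) with k = CL/V. The parameter values below come from the abstract; the 150 mg dose is a hypothetical choice for illustration only:

```python
import math

# Population estimates from the abstract (scaled to a 70 kg adult)
CL, V = 32.1, 285.0   # total clearance (l/h), volume of distribution (l)
ka, F = 1.31, 0.275   # absorption rate constant (1/h), oral bioavailability
k = CL / V            # elimination rate constant (1/h)

def conc_oral(dose_mg, t_h):
    """Plasma concentration (mg/l) at time t_h hours after a single oral
    dose, one-compartment model with first-order absorption."""
    return (F * dose_mg * ka) / (V * (ka - k)) * (math.exp(-k * t_h) - math.exp(-ka * t_h))

# Hypothetical 150 mg oral dose (dose chosen only to illustrate the curve)
for t in (1, 2, 4, 8):
    print(t, round(conc_oral(150.0, t), 4))
```

The curve rises while absorption dominates and then decays at the elimination rate; with these estimates the peak falls around t = ln(ka/k)/(ka − k) ≈ 2 h.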

Relevance:

100.00%

Publisher:

Abstract:

Objectives: To conduct an independent evaluation of the first phase of the Health Foundation's Safer Patients Initiative (SPI), and to identify the net additional effect of SPI and any differences in changes in participating and non-participating NHS hospitals. Design: Mixed method evaluation involving five substudies, before and after design. Setting: NHS hospitals in United Kingdom. Participants: Four hospitals (one in each country in the UK) participating in the first phase of the SPI (SPI1); 18 control hospitals. Intervention: The SPI1 was a compound (multicomponent) organisational intervention delivered over 18 months that focused on improving the reliability of specific frontline care processes in designated clinical specialties and promoting organisational and cultural change. Results: Senior staff members were knowledgeable and enthusiastic about SPI1. There was a small (0.08 points on a 5 point scale) but significant (P<0.01) effect in favour of the SPI1 hospitals in one of 11 dimensions of the staff questionnaire (organisational climate). Qualitative evidence showed only modest penetration of SPI1 at medical ward level. Although SPI1 was designed to engage staff from the bottom up, it did not usually feel like this to those working on the wards, and questions about legitimacy of some aspects of SPI1 were raised. Of the five components to identify patients at risk of deterioration - monitoring of vital signs (14 items); routine tests (three items); evidence based standards specific to certain diseases (three items); prescribing errors (multiple items from the British National Formulary); and medical history taking (11 items) - there was little net difference between control and SPI1 hospitals, except in relation to quality of monitoring of acute medical patients, which improved on average over time across all hospitals. 
Recording of respiratory rate increased to a greater degree in SPI1 than in control hospitals; in the second six hours after admission, recording increased from 40% (93) to 69% (165) in control hospitals and from 37% (141) to 78% (296) in SPI1 hospitals (odds ratio for "difference in difference" 2.1, 99% confidence interval 1.0 to 4.3; P=0.008). Use of a formal scoring system for patients with pneumonia also increased over time (from 2% (102) to 23% (111) in control hospitals and from 2% (170) to 9% (189) in SPI1 hospitals); this change favoured controls and was not significant (0.3, 0.02 to 3.4; P=0.173). There were no improvements in the proportion of prescription errors and no effects that could be attributed to SPI1 in non-targeted generic areas (such as enhanced safety culture). On some measures, the lack of effect could be because compliance was already high at baseline (such as use of steroids in over 85% of cases where indicated), but even where there was more room for improvement (such as in the quality of medical history taking), there was no significant additional net effect of SPI1. There were no changes over time, or between control and SPI1 hospitals, in errors or rates of adverse events in patients on medical wards. Mortality increased from 11% (27) to 16% (39) among controls and decreased from 17% (63) to 13% (49) among SPI1 hospitals, but the risk-adjusted difference was not significant (0.5, 0.2 to 1.4; P=0.085). Poor care was a contributing factor in four of the 178 deaths identified by review of case notes. The survey of patients showed no significant differences apart from an increase in perceived cleanliness in favour of SPI1 hospitals. Conclusions: The introduction of SPI1 was associated with improvements in one of the types of clinical process studied (monitoring of vital signs) and in one measure of staff perceptions of organisational climate.
There was no additional effect of SPI1 on other targeted issues nor on other measures of generic organisational strengthening.
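The "difference in difference" odds ratio quoted for respiratory-rate recording can be approximately reproduced from the raw percentages in the abstract. Note the paper's reported 2.1 is risk-adjusted, so this crude calculation only approximates it:

```python
# Raw "difference in difference" odds ratio from the abstract's percentages:
# controls went from 40% to 69% recording, SPI1 hospitals from 37% to 78%.
def odds(p):
    return p / (1.0 - p)

control_before, control_after = 0.40, 0.69
spi_before, spi_after = 0.37, 0.78

or_control = odds(control_after) / odds(control_before)  # change in controls
or_spi = odds(spi_after) / odds(spi_before)              # change in SPI1 hospitals
did_or = or_spi / or_control   # net SPI1 effect over and above secular trend

print(round(did_or, 2))        # close to, but below, the adjusted 2.1
```

A DiD odds ratio above 1 means the SPI1 hospitals improved by more, on the odds scale, than the secular improvement seen in controls.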

Relevance:

100.00%

Publisher:

Abstract:

The Electronic Patient Record (EPR) is being developed by many hospitals in the UK and across the globe. We class an EPR system as a type of Knowledge Management System (KMS), in that it is a technological tool developed to support the process of knowledge management (KM). Healthcare organisations aim to use these systems to provide a vehicle for more informed and improved clinical decision making, thereby reducing errors and risks, enhancing quality and consequently offering enhanced patient safety. Finding an effective way for a healthcare organisation to implement these systems in practice is essential. In this study we use the concept of the business process approach to KM as a theoretical lens to analyse and explore how a large NHS teaching hospital developed, executed and practically implemented an EPR system. This theory advocates taking into account all organisational activities (the business processes) when considering any KM initiative. Approaching KM through business processes allows a more holistic view of the requirements across a process: emphasis is placed on how particular activities are performed, how they are structured, and what knowledge is demanded, not just supplied, across each process. This falls in line with the increased emphasis in healthcare on patient-centred approaches to care delivery. We have found in previous research that hospitals are happy for the delivery of patient care to be referred to as their 'business'. A qualitative study was conducted over a two-and-a-half-year period, with data collected from semi-structured interviews with eight members of the strategic management team, 12 clinical users and 20 patients, in addition to non-participant observation of meetings and documentary data. We believe that the inclusion of patients within the study may well be the first time this has been done in examining the implementation of a KMS.
The theoretical propositions strategy was used as the overarching approach for data analysis: initial theoretical research themes and propositions were used to help shape and organise the case study analysis. This paper presents preliminary findings about the hospital's business strategy and its links to the KMS strategy and process.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a novel error-free (infinite-precision) architecture for the fast implementation of the 8x8 2-D Discrete Cosine Transform. The architecture uses a new algebraic integer encoding of the 1-D radix-8 DCT that allows the separable computation of the 2-D 8x8 DCT without any intermediate number representation conversions. This is a considerable improvement on previously introduced algebraic integer encoding techniques for computing the DCT and IDCT, as it eliminates the need to approximate the transformation matrix elements: their exact representations are obtained, and the transcendental functions are therefore mapped without any errors. Besides being multiplication-free, the new mapping scheme fits the algorithm naturally, eliminating computational and quantization errors and resulting in a short-word-length, high-speed design.
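For context, a conventional floating-point 8-point DCT-II, computed separably for the 2-D case, is sketched below. The `cos(...)` coefficients are the transcendental values that the paper's algebraic integer encoding represents exactly; this sketch is the standard baseline that the error-free architecture improves on, not the architecture itself:

```python
import math

N = 8  # 8-point transform, as in the 8x8 2-D DCT

def dct_1d(x):
    """Orthonormal 1-D DCT-II of an 8-sample sequence."""
    X = []
    for k in range(N):
        s = sum(x[n] * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
                for n in range(N))
        c = math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
        X.append(c * s)
    return X

def dct_2d(block):
    """Separable 2-D DCT: 1-D transform along rows, then along columns."""
    rows = [dct_1d(r) for r in block]
    cols = [dct_1d(list(c)) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]

# A constant 8x8 block concentrates all energy in the DC coefficient.
flat = [[10.0] * N for _ in range(N)]
coeffs = dct_2d(flat)
```

In the floating-point version each coefficient carries rounding error from the irrational cosines; the algebraic-integer approach keeps those values exact throughout the separable computation, deferring any conversion to the very end.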

Relevance:

100.00%

Publisher:

Abstract:

This paper describes work carried out to develop methods of verifying that machine tools are capable of machining parts to within specification, immediately before carrying out critical material removal operations, and with negligible impact on process times. A review of machine tool calibration and verification technologies identified that current techniques are unsuitable because they demand significant time and skilled human intervention. A 'solution toolkit' is presented, consisting of a selection of circular tests and artefact probing which can rapidly verify the kinematic errors, and in some cases also the dynamic errors, of different types of machine tool, together with supplementary methods for tool and spindle error detection. A novel artefact probing process is introduced which simplifies data processing so that the process can be readily automated using only the native machine tool controller. Laboratory testing and industrial case studies are described which demonstrate the effectiveness of this approach.

Relevance:

100.00%

Publisher:

Abstract:

In this paper we develop a set of novel Markov chain Monte Carlo algorithms for Bayesian smoothing of partially observed non-linear diffusion processes. The sampling algorithms developed herein use a deterministic approximation to the posterior distribution over paths as the proposal distribution for a mixture of an independence and a random walk sampler. The approximating distribution is sampled by simulating an optimized time-dependent linear diffusion process derived from the recently developed variational Gaussian process approximation method. The novel diffusion bridge proposal derived from the variational approximation allows the use of a flexible blocking strategy that further improves mixing, and thus the efficiency, of the sampling algorithms. The algorithms are tested on two diffusion processes: one with a double-well potential drift and another with a sine drift. The new algorithm's accuracy and efficiency are compared with state-of-the-art hybrid Monte Carlo based path sampling. It is shown that in practical, finite-sample applications the algorithm is accurate except in the presence of large observation errors and low observation densities, which lead to a multi-modal structure in the posterior distribution over paths. More importantly, the variational-approximation-assisted sampling algorithm outperforms hybrid Monte Carlo in terms of computational efficiency, except when the diffusion process is densely observed with small errors, in which case both algorithms are equally efficient. © 2011 Springer-Verlag.

Relevance:

100.00%

Publisher:

Abstract:

This paper explains how Poisson regression can be used in studies in which the dependent variable describes the number of occurrences of some rare event, such as suicide. After pointing out why ordinary linear regression is inappropriate for dependent variables of this sort, we present the basic Poisson regression model and show how it fits into the broad class of generalized linear models. We then turn to a major problem of Poisson regression known as overdispersion and suggest possible solutions, including the correction of standard errors and negative binomial regression. The paper ends with a detailed empirical example, drawn from our own research on suicide.
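As a minimal sketch of the model described above, the snippet below fits a Poisson regression with a log link by iteratively reweighted least squares (the standard GLM fitting algorithm) and computes the Pearson dispersion statistic used to check for overdispersion. The count data are synthetic stand-ins, not the paper's suicide data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic counts: the log of the event rate is linear in one covariate,
# as in the basic Poisson regression model (data invented for illustration).
n = 500
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])   # intercept + covariate
beta_true = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ beta_true))

# Iteratively reweighted least squares (Newton-Raphson for the Poisson
# log-likelihood with log link): weights mu, working response eta + (y-mu)/mu
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)
    z = X @ beta + (y - mu) / mu
    beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))

# Overdispersion check: Pearson chi-square over residual degrees of freedom.
# Values well above 1 would call for corrected standard errors or a
# negative binomial model, as discussed in the paper.
mu = np.exp(X @ beta)
dispersion = np.sum((y - mu)**2 / mu) / (n - 2)
```

Because the data here really are Poisson, the dispersion statistic comes out close to 1; overdispersed real-world counts would push it well above 1.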