17 results for Errors analysis

in Aston University Research Archive


Relevance: 60.00%

Abstract:

It is known that distillation tray efficiency depends on the liquid flow pattern, particularly for large diameter trays. Scale-up failures due to liquid channelling have occurred, and it is known that fitting flow control devices to trays sometimes improves tray efficiency. Several theoretical models which explain these observations have been published. Further progress in understanding is at present blocked by lack of experimental measurements of the pattern of liquid concentration over the tray. Flow pattern effects are expected to be significant only on commercial size trays of a large diameter, and the lack of data is a result of the costs, risks and difficulty of making these measurements on full scale production columns. This work presents a new experiment which simulates distillation by water cooling and provides a means of testing commercial size trays in the laboratory. Hot water is fed onto the tray and cooled by air forced through the perforations. The analogy between heat and mass transfer shows that the water temperature at any point is analogous to liquid concentration and the enthalpy of the air is analogous to vapour concentration. The effect of the liquid flow pattern on mass transfer is revealed by the temperature field on the tray. The experiment was implemented and evaluated in a column of 1.2 m diameter. The water temperatures were measured by thermocouples interfaced to an electronic computerised data logging system. The "best surface" through the experimental temperature measurements was obtained by the mathematical technique of B-splines, and presented in terms of lines of constant temperature. The results revealed that in general liquid channelling is more important in the bubbly "mixed" regime than in the spray regime. However, it was observed that severe channelling also occurred for intense spray at incipient flood conditions. This is an unexpected result. A computer program was written to calculate point efficiency as well as tray efficiency, and the results were compared with distillation efficiencies for similar loadings. The theoretical model of Porter and Lockett for predicting distillation was modified to predict water cooling, and the theoretical predictions were shown to be similar to the experimental temperature profiles. A comparison of the repeatability of the experiments with an errors analysis revealed that accurate tray efficiency measurements require temperature measurements to better than ±0.1 °C, which is achievable with conventional techniques. This was not achieved in this work, and resulted in considerable scatter in the efficiency results. Nevertheless, it is concluded that the new experiment is a valuable tool for investigating the effect of the liquid flow pattern on tray mass transfer.
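A minimal sketch of the surface-fitting step described above, assuming scattered thermocouple readings on a 1.2 m tray: a bivariate cubic B-spline surface is fitted through the measurements with SciPy and the isotherms (lines of constant temperature) are drawn from it. The coordinates, temperatures and smoothing factor are illustrative, not the thesis's data.

```python
# Illustrative sketch: B-spline "best surface" through scattered tray temperatures.
import numpy as np
from scipy.interpolate import SmoothBivariateSpline
import matplotlib.pyplot as plt

# x, y: thermocouple positions on the tray (m); t: measured water temperature (degC)
rng = np.random.default_rng(1)
x = rng.uniform(-0.6, 0.6, 100)
y = rng.uniform(-0.6, 0.6, 100)
t = 60.0 - 10.0 * (x + 0.6) + rng.normal(0, 0.1, x.size)      # synthetic data

spline = SmoothBivariateSpline(x, y, t, kx=3, ky=3, s=len(t))  # cubic B-spline surface

xi = np.linspace(-0.6, 0.6, 80)
yi = np.linspace(-0.6, 0.6, 80)
ti = spline(xi, yi)                          # evaluate the surface on a regular grid

plt.contour(xi, yi, ti.T, levels=15)         # lines of constant temperature
plt.xlabel("x (m)"); plt.ylabel("y (m)")
plt.title("B-spline surface through tray temperature measurements")
plt.show()
```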

Relevance: 40.00%

Abstract:

We present an information-theory analysis of the tradeoff between bit-error rate improvement and the data-rate loss when skewed channel coding is used to suppress pattern-dependent errors in digital communications. Without loss of generality, we apply the developed general theory to the particular example of a high-speed fiber communication system with a strong patterning effect. © 2007 IEEE.
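The rate-loss side of this tradeoff can be pictured with a short sketch. Constraining the ones density of a skewed code to p < 0.5 caps the information per transmitted bit at the binary entropy H(p); the error model coupling BER to ones density below is purely an assumed illustration, not the paper's system model.

```python
# Rate cost of a skewed binary code vs. an assumed patterning-error model.
import numpy as np

def binary_entropy(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

p = np.linspace(0.05, 0.5, 10)       # ones density enforced by the skewed code
rate = binary_entropy(p)             # achievable information bits per transmitted bit
rate_loss = 1.0 - rate               # fraction of capacity given up

# Assumed, purely illustrative patterning model: errors grow with ones density
ber = 1e-9 * np.exp(8.0 * p)

for pi, ri, bi in zip(p, rate_loss, ber):
    print(f"ones density {pi:.2f}: rate loss {ri:.3f}, illustrative BER {bi:.2e}")
```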

Relevance: 30.00%

Abstract:

Derivational morphology proposes meaningful connections between words and is largely unrepresented in lexical databases. This thesis presents a project to enrich a lexical database with morphological links and to evaluate their contribution to disambiguation. A lexical database with sense distinctions was required. WordNet was chosen because of its free availability and widespread use. Its suitability was assessed through critical evaluation with respect to specifications and criticisms, using a transparent, extensible model. The identification of serious shortcomings suggested a portable enrichment methodology, applicable to alternative resources. Although 40% of the most frequent words are prepositions, they have been largely ignored by computational linguists, so addition of prepositions was also required. The preferred approach to morphological enrichment was to infer relations from phenomena discovered algorithmically. Both existing databases and existing algorithms can capture regular morphological relations, but cannot capture exceptions correctly; neither of them provides any semantic information. Some morphological analysis algorithms are subject to the fallacy that morphological analysis can be performed simply by segmentation. Morphological rules, grounded in observation and etymology, govern associations between and attachment of suffixes and contribute to defining the meaning of morphological relationships. Specifying character substitutions circumvents the segmentation fallacy. Morphological rules are prone to undergeneration, minimised through a variable lexical validity requirement, and overgeneration, minimised by rule reformulation and restricting monosyllabic output. Rules take into account the morphology of ancestor languages through co-occurrences of morphological patterns. Multiple rules applicable to an input suffix need their precedence established. The resistance of prefixations to segmentation has been addressed by identifying linking vowel exceptions and irregular prefixes. The automatic affix discovery algorithm applies heuristics to identify meaningful affixes and is combined with morphological rules into a hybrid model, fed only with empirical data, collected without supervision. Further algorithms apply the rules optimally to automatically pre-identified suffixes and break words into their component morphemes. To handle exceptions, stoplists were created in response to initial errors and fed back into the model through iterative development, leading to 100% precision, contestable only on lexicographic criteria. Stoplist length is minimised by special treatment of monosyllables and reformulation of rules. 96% of words and phrases are analysed. 218,802 directed derivational links have been encoded in the lexicon rather than the wordnet component of the model because the lexicon provides the optimal clustering of word senses. Both links and analyser are portable to an alternative lexicon. The evaluation uses the extended gloss overlaps disambiguation algorithm. The enriched model outperformed WordNet in terms of recall without loss of precision. Failure of all experiments to outperform disambiguation by frequency reflects on WordNet sense distinctions.
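A minimal sketch of the rule mechanism described above, assuming a toy lexicon and rule set: a suffix rule specifies a character substitution rather than a simple segmentation, and the candidate source word is accepted only if it passes the lexical validity requirement, which limits overgeneration. Rule order stands in for the precedence mentioned above.

```python
# Illustrative suffix-substitution rules with a lexical validity check.
LEXICON = {"happy", "decide", "nation", "create", "creation", "decision"}

# (derived suffix, replacement) pairs; order encodes rule precedence
RULES = [
    ("ation", "ate"),   # creation -> create
    ("sion", "de"),     # decision -> decide
    ("iness", "y"),     # happiness -> happy
]

def derive_source(word, lexicon=LEXICON, rules=RULES):
    """Return (source, rule) for the first rule whose output is in the lexicon."""
    for suffix, replacement in rules:
        if word.endswith(suffix):
            candidate = word[: -len(suffix)] + replacement
            if candidate in lexicon:          # lexical validity requirement
                return candidate, (suffix, replacement)
    return None, None

for w in ["creation", "decision", "happiness"]:
    print(w, "->", derive_source(w))
```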

Relevance: 30.00%

Abstract:

We present the case of two aphasic patients: one with fluent speech, MM, and one with dysfluent speech, DB. Both patients make similar proportions of phonological errors in speech production, and the errors have similar characteristics. A closer analysis, however, shows a number of differences. DB's phonological errors involve, for the most part, simplifications of syllabic structure; they affect consonants more than vowels; and, among vowels, they show effects of sonority/complexity. This error pattern may reflect articulatory difficulties. MM's errors, instead, show little effect of syllable structure, affect vowels at least as much as consonants, and affect all different vowels to a similar extent. This pattern is consistent with a more central impairment involving the selection of the right phoneme among competing alternatives. We propose that, at this level, vowel selection may be more difficult than consonant selection because vowels belong to a smaller set of repeatedly activated units.

Relevance: 30.00%

Abstract:

PURPOSE: To assess the repeatability of an objective image analysis technique to determine intraocular lens (IOL) rotation and centration. SETTING: Six ophthalmology clinics across Europe. METHODS: One hundred seven patients implanted with Akreos AO aspheric IOLs with orientation marks were imaged. Image quality was rated by a masked observer. The axis of rotation was determined from a line bisecting the IOL orientation marks. This was normalized for rotation of the eye between visits using the axis bisecting 2 consistent conjunctival vessels or iris features. The centers of ovals overlaid to circumscribe the IOL optic edge and the pupil or limbus were compared to determine IOL centration. Intrasession repeatability was assessed in 40 eyes and the variability of repeated analysis examined. RESULTS: Intrasession rotational stability of the IOL was ±0.79 degrees (SD) and centration was ±0.10 mm horizontally and ±0.10 mm vertically. Repeated analysis variability of the same image was ±0.70 degrees for rotation and ±0.20 mm horizontally and ±0.31 mm vertically for centration. Eye rotation (absolute) between visits was 2.23 ± 1.84 degrees (10% > 5 degrees rotation) using one set of consistent conjunctival vessels or iris features and 2.03 ± 1.66 degrees (7% > 5 degrees rotation) using the average of 2 sets (P = .13). Poorer image quality resulted in larger apparent absolute IOL rotation (r = -0.45, P < .001). CONCLUSIONS: Objective analysis of digital retroillumination images allows sensitive assessment of IOL rotation and centration stability. Eye rotation between images can lead to significant errors if not taken into account. Image quality is important to analysis accuracy.
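A minimal sketch of the geometric step, under assumed image coordinates: the IOL axis is taken through the two orientation marks, and the change between visits is normalised by the axis through two consistent conjunctival or iris landmarks, so rotation of the eye is not mistaken for rotation of the IOL.

```python
# Illustrative IOL rotation measurement normalised for eye rotation between visits.
import math

def axis_deg(p1, p2):
    """Angle (degrees) of the line through two image points, folded into [0, 180)."""
    ang = math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))
    return ang % 180.0

def iol_rotation(marks_v1, marks_v2, landmarks_v1, landmarks_v2):
    """Apparent IOL rotation between visits, corrected for eye rotation."""
    iol_change = axis_deg(*marks_v2) - axis_deg(*marks_v1)
    eye_change = axis_deg(*landmarks_v2) - axis_deg(*landmarks_v1)
    diff = iol_change - eye_change
    return (diff + 90.0) % 180.0 - 90.0      # wrap into (-90, 90]

# Example: IOL marks appear rotated ~3 deg, but the eye itself rotated ~2 deg
print(iol_rotation(((0, 0), (10, 0)), ((0, 0), (10, 0.52)),
                   ((0, 5), (10, 5)), ((0, 5), (10, 5.35))))   # ~1 deg true rotation
```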

Relevance: 30.00%

Abstract:

An intelligent agent, operating in an external world which cannot be fully described in its internal world model, must be able to monitor the success of a previously generated plan and to respond to any errors which may have occurred. The process of error analysis requires the ability to reason in an expert fashion about time and about processes occurring in the world. Reasoning about time is needed to deal with causality. Reasoning about processes is needed since the direct effects of a plan action can be completely specified when the plan is generated, but the indirect effects cannot. For example, the action 'open tap' leads with certainty to 'tap open', whereas whether there will be a fluid flow, and how long it might last, is more difficult to predict. The majority of existing planning systems cannot handle these kinds of reasoning, thus limiting their usefulness. This thesis argues that both kinds of reasoning require a complex internal representation of the world. The use of Qualitative Process Theory and an interval-based representation of time is proposed as a representation scheme for such a world model. The planning system which was constructed has been tested on a set of realistic planning scenarios. It is shown that even simple planning problems, such as making a cup of coffee, require extensive reasoning if they are to be carried out successfully. The final chapter concludes that the planning system described does allow the correct solution of planning problems involving complex side effects, which planners up to now have been unable to solve.
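A minimal sketch of an interval-based time representation of the kind referred to above, using a small subset of Allen-style interval relations; the tap/fluid-flow example and the relations chosen are illustrative, not the thesis's actual representation.

```python
# Illustrative qualitative temporal reasoning over intervals.
from dataclasses import dataclass

@dataclass
class Interval:
    start: float
    end: float

def relation(a: Interval, b: Interval) -> str:
    """Qualitative relation of interval a to interval b (subset of Allen's 13 relations)."""
    if a.end < b.start:
        return "before"
    if b.end < a.start:
        return "after"
    if a.start == b.start and a.end == b.end:
        return "equal"
    if a.start >= b.start and a.end <= b.end:
        return "during"
    if a.start <= b.start and a.end >= b.end:
        return "contains"
    return "overlaps"

tap_open = Interval(0.0, 10.0)          # direct effect of the plan action
fluid_flow = Interval(0.5, 10.0)        # indirect effect predicted by a process model
print(relation(fluid_flow, tap_open))   # -> "during"
```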

Relevance: 30.00%

Abstract:

For optimum utilization of satellite-borne instrumentation, it is necessary to know precisely the orbital position of the spacecraft. The aim of this thesis is therefore two-fold: firstly, to derive precise orbits with particular emphasis placed on the altimetric satellite SEASAT; and secondly, to utilize the precise orbits to improve upon atmospheric density determinations for satellite drag modelling purposes. Part one of the thesis, on precise orbit determination, is particularly concerned with the tracking data - satellite laser ranging, altimetry and crossover height differences - and how these data can be used to analyse errors in the orbit, the geoid and sea-surface topography. The outcome of this analysis is the determination of a low degree and order model for sea surface topography. Part two, on the other hand, mainly concentrates on using the laser data to analyse and improve upon current atmospheric density models. In particular, the modelling of density changes associated with geomagnetic disturbances comes under scrutiny in this section. By introducing persistence modelling of a geomagnetic event and solving for certain geomagnetic parameters, a new density model is derived which performs significantly better than the state-of-the-art models over periods of severe geomagnetic storms at SEASAT heights. This is independently verified by application of the derived model to STARLETTE orbit determinations.
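One way to picture the persistence idea is the toy model below: the density perturbation driven by a geomagnetic index is not allowed to vanish as soon as the index drops, but decays with a solved-for time constant. The choice of index (Kp), the functional form, the gain and the time constant are all assumptions for illustration only, not the fitted model of the thesis.

```python
# Illustrative persistence model of a storm-time density perturbation.
import numpy as np

def density_factor(kp_history, dt_hours=3.0, gain=0.05, tau_hours=9.0):
    """Multiplicative density perturbation from a Kp time series with persistence."""
    factor = np.ones(len(kp_history), dtype=float)
    perturbation = 0.0
    for i, kp in enumerate(kp_history):
        impulse = gain * max(kp - 2.0, 0.0)                          # storm-time forcing
        perturbation = max(perturbation * np.exp(-dt_hours / tau_hours), impulse)
        factor[i] = 1.0 + perturbation                               # decays after the event
    return factor

kp = np.array([2, 2, 7, 8, 6, 3, 2, 2, 2])   # a short geomagnetic storm, 3-hourly Kp
print(np.round(density_factor(kp), 3))
```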

Relevance: 30.00%

Abstract:

The aim of this research was to improve the quantitative support to project planning and control principally through the use of more accurate forecasting for which new techniques were developed. This study arose from the observation that in most cases construction project forecasts were based on a methodology (c.1980) which relied on the DHSS cumulative cubic cost model and network based risk analysis (PERT). The former of these, in particular, imposes severe limitations which this study overcomes. Three areas of study were identified, namely growth curve forecasting, risk analysis and the interface of these quantitative techniques with project management. These fields have been used as a basis for the research programme. In order to give a sound basis for the research, industrial support was sought. This resulted in both the acquisition of cost profiles for a large number of projects and the opportunity to validate practical implementation. The outcome of this research project was deemed successful both in theory and practice. The new forecasting theory was shown to give major reductions in projection errors. The integration of the new predictive and risk analysis technologies with management principles, allowed the development of a viable software management aid which fills an acknowledged gap in current technology.
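A minimal sketch of the growth-curve idea, assuming invented cumulative cost data: a cubic S-curve of the DHSS type is fitted by least squares to cumulative cost against elapsed-time fraction and used to project cost, with the residuals giving a rough projection-error estimate.

```python
# Illustrative cumulative cubic cost-model fit and forecast.
import numpy as np

t = np.linspace(0.1, 1.0, 10)                                  # fraction of project duration
cost = np.array([2, 6, 14, 26, 40, 55, 70, 83, 93, 100.0])     # cumulative cost (% of final)

coeffs = np.polyfit(t, cost, deg=3)                            # least-squares cubic fit
forecast = np.polyval(coeffs, 0.75)                            # predicted cumulative cost at 75% time
residuals = cost - np.polyval(coeffs, t)

print("cubic coefficients:", np.round(coeffs, 2))
print("forecast at t = 0.75:", round(float(forecast), 1), "%")
print("RMS projection error:", round(float(np.sqrt(np.mean(residuals**2))), 2), "%")
```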

Relevance: 30.00%

Abstract:

The number of fatal accidents in the agricultural, horticultural and forestry industry in Great Britain has declined from an annual rate of about 135 in the 1960s to its current level of about 50. Changes to the size and makeup of the population at risk mean that there has been no real improvement in fatal injury incidence rates for farmers. The Health and Safety Executive's (HSE) current system of accident investigation, recording, and analysis is directed primarily at identifying fault, allocating blame, and punishing wrongdoers. Relatively little information is recorded about the personal and organisational factors that contributed to, or failed to prevent, accidents. To develop effective preventive strategies, it is important to establish whether errors by the victims and others occur at the skills, rules, or knowledge level of functioning; are violations of some rule or procedure; or stem from failures to correctly appraise or control a hazard. A modified version of the Hale and Glendon accident causation model was used to study 230 fatal accidents. Inspectors' original reports were examined and expert judgement applied to identify and categorise the errors committed by each of the parties involved. The highest proportion of errors that led directly to accidents occurred whilst the victims were operating at the knowledge level. The mix and proportion of errors varied considerably between different classes of victim and kind of accident. Different preventive strategies will be needed to address the problem areas identified.

Relevance: 30.00%

Abstract:

Due to the failure of PRARE, the orbital accuracy of ERS-1 is typically 10-15 cm radially, compared with 3-4 cm for TOPEX/Poseidon. To gain the most from these simultaneous datasets it is necessary to improve the orbital accuracy of ERS-1 so that it is commensurate with that of TOPEX/Poseidon. For the integration of these two datasets it is also necessary to determine the altimeter and sea state biases for each of the satellites. Several models for the sea state bias of ERS-1 are considered by analysis of the ERS-1 single-satellite crossovers. The model adopted takes the sea state bias as a percentage of the significant wave height, namely 5.95%. The removal of ERS-1 orbit error and the recovery of a relative bias between ERS-1 and TOPEX/Poseidon are both achieved by analysis of dual-satellite crossover residuals. The gravitational-field-based radial orbit error is modelled by a finite Fourier expansion series, with the dominant frequencies determined by analysis of the JGM-2 covariance matrix. Periodic and secular terms to model the errors due to atmospheric density, solar radiation pressure and initial state vector mis-modelling are also solved for. Validation of the dataset unification consists of comparing the mean sea surface topographies and annual variabilities derived from both the corrected and uncorrected ERS-1 orbits with those derived from TOPEX/Poseidon. The global and regional geographically fixed/variable orbit errors are also analysed pre- and post-correction, and a significant reduction is noted. Finally, the use of dual/single-satellite crossovers and repeat pass data for the calibration of ERS-2 with respect to ERS-1 and TOPEX/Poseidon is shown by calculating the ERS-1/2 sea state and relative biases.
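A minimal sketch of two of the corrections named above, with invented data: the sea state bias is applied as a fixed percentage of significant wave height, and the radial orbit error is modelled as a truncated Fourier series fitted to crossover residuals by least squares. In the thesis the dominant frequencies come from the JGM-2 covariance matrix; here the harmonics are simply assumed.

```python
# Illustrative sea state bias correction and Fourier-series radial orbit error fit.
import numpy as np

def sea_state_bias(swh_m, fraction=0.0595):
    """Range correction (m) taken as a percentage of significant wave height."""
    return fraction * swh_m

def fit_radial_orbit_error(arg_rad, residual_m, n_harmonics=2):
    """Least-squares fit of residuals to a0 + sum_k (a_k cos k*u + b_k sin k*u)."""
    cols = [np.ones_like(arg_rad)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(k * arg_rad), np.sin(k * arg_rad)]
    design = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(design, residual_m, rcond=None)
    return coeffs, design @ coeffs

u = np.linspace(0, 2 * np.pi, 50)                     # argument of latitude, say
resid = 0.10 * np.cos(u) + 0.04 * np.sin(2 * u) + np.random.normal(0, 0.01, u.size)
coeffs, model = fit_radial_orbit_error(u, resid)

print("fitted Fourier coefficients (m):", np.round(coeffs, 3))
print("sea state bias for 3 m waves:", sea_state_bias(3.0), "m")
```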

Relevance: 30.00%

Abstract:

The extent to which the surface parameters of Progressive Addition Lenses (PALs) affect successful patient tolerance was investigated. Several optico-physical evaluation techniques were employed, including a newly constructed surface reflection device which was shown to be of value for assessing semi-finished PAL blanks. Detailed physical analysis was undertaken using a computer-controlled focimeter and, from these data, iso-cylindrical and mean spherical plots were produced for each PAL studied. Base curve power was shown to have little impact upon the distribution of PAL astigmatism. An increase in reading addition power primarily caused a lengthening and narrowing of the lens progression channel. Empirical measurements also indicated a marginal steepening of the progression power gradient with an increase in reading addition power. A sample of the PAL-wearing population was studied using patient records and questionnaire analysis (90% of questionnaires were returned). This subjective analysis revealed the reading portion to be the most troublesome lens zone and showed that patients with high astigmatism (> 2.00D) adapt more readily to PALs than those with spherical or low cylindrical (≤ 2.00D) corrections. The psychophysical features of PALs were then investigated. Both grating visual acuity (VA) and contrast sensitivity (CS) were shown to be reduced with an increase in eccentricity from the central umbilical line. Two sample populations (N = 20) of successful and unsuccessful PAL wearers were assessed for differences in their visual performance and their adaptation to optically induced distortion. The possibility of dispensing errors being the cause of poor patient tolerance amongst the unsuccessful wearer group was investigated and discounted. The contrast sensitivity of the successful group was significantly greater than that of the unsuccessful group. No differences in adaptation to or detection of curvature distortion were evinced between these presbyopic groups.

Relevance: 30.00%

Abstract:

Structural analysis in handwritten mathematical expressions focuses on interpreting the recognized symbols using geometrical information such as relative sizes and positions of the symbols. Most existing approaches rely on hand-crafted grammar rules to identify semantic relationships among the recognized mathematical symbols. They can easily fail when writing errors occur. Moreover, they assume the availability of the whole mathematical expression before being able to analyze its semantic information. To tackle these problems, we propose a progressive structural analysis (PSA) approach for dynamic recognition of handwritten mathematical expressions. The proposed PSA approach provides an analysis result immediately after each written input symbol. This has the advantage that users can detect any recognition errors immediately and correct only the mis-recognized symbols rather than the whole expression. Experiments conducted on the 57 most commonly used mathematical expressions show that the PSA approach achieves very good performance.
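A minimal sketch of the geometric reasoning described above, with assumed thresholds: the relation of each newly written symbol to the previous one is classified from the relative size and position of their bounding boxes. A full system would use a richer rule set or grammar.

```python
# Illustrative classification of symbol relationships from bounding boxes.
from dataclasses import dataclass

@dataclass
class Box:
    x: float; y: float; w: float; h: float   # top-left corner, width, height (y grows downward)
    @property
    def cy(self): return self.y + self.h / 2

def relate(prev: Box, new: Box) -> str:
    """Classify the newly written symbol relative to the previous one."""
    small = new.h < 0.7 * prev.h
    if small and new.cy < prev.y + 0.3 * prev.h:
        return "superscript"
    if small and new.cy > prev.y + 0.7 * prev.h:
        return "subscript"
    if new.cy > prev.y + prev.h:
        return "below"          # e.g. a denominator written under a fraction bar
    return "inline"

x_sym = Box(0, 10, 8, 12)
two_sym = Box(9, 4, 5, 6)       # smaller and raised -> exponent
print(relate(x_sym, two_sym))   # -> "superscript"
```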

Relevance: 30.00%

Abstract:

Data envelopment analysis (DEA), as introduced by Charnes, Cooper, and Rhodes (1978), is a linear programming technique that has been widely used to evaluate the relative efficiency of a set of homogeneous decision making units (DMUs). In many real applications, the input-output variables cannot be precisely measured. This is particularly important in assessing efficiency of DMUs using DEA, since the efficiency scores of inefficient DMUs are very sensitive to possible data errors. Hence, several approaches have been proposed to deal with imprecise data. Perhaps the most popular fuzzy DEA model is based on the α-cut. One drawback of the α-cut approach is that it cannot include all information about uncertainty. This paper aims to introduce an alternative linear programming model that can include some uncertainty information from the intervals within the α-cut approach. We introduce the concept of "local α-level" to develop a multi-objective linear programming model to measure the efficiency of DMUs under uncertainty. An example is given to illustrate the use of this method.
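A minimal sketch of the interval view at a single α-cut, with invented data: each fuzzy input/output reduces to an interval, and the optimistic (upper-bound) CCR efficiency of a DMU is obtained from a small linear programme that takes favourable interval ends for the evaluated DMU and unfavourable ends for the others. This follows the standard interval-DEA construction rather than the paper's specific multi-objective model.

```python
# Illustrative optimistic CCR efficiency with interval data at one alpha-cut.
from scipy.optimize import linprog

# interval data at a given alpha-cut: (lower, upper) for one input and one output
x = [(2.0, 2.5), (3.0, 3.6), (4.0, 4.4)]      # inputs of DMUs A, B, C
y = [(1.0, 1.2), (2.0, 2.3), (2.5, 2.8)]      # outputs of DMUs A, B, C

def optimistic_efficiency(o):
    """Upper-bound CCR efficiency of DMU o (multiplier form, variables [u, v])."""
    xo, yo = x[o][0], y[o][1]                  # favourable data for the evaluated DMU
    c = [-yo, 0.0]                             # maximise u*yo  ->  minimise -u*yo
    A_ub, b_ub = [], []
    for j in range(len(x)):
        if j == o:
            A_ub.append([yo, -xo])             # u*yo - v*xo <= 0
        else:
            A_ub.append([y[j][0], -x[j][1]])   # unfavourable data for the other DMUs
        b_ub.append(0.0)
    A_eq, b_eq = [[0.0, xo]], [1.0]            # v*xo = 1 (normalisation)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 2)
    return -res.fun

for o, name in enumerate("ABC"):
    print(f"DMU {name}: optimistic efficiency = {optimistic_efficiency(o):.3f}")
```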

Relevance: 30.00%

Abstract:

We have developed a new technique for extracting histological parameters from multi-spectral images of the ocular fundus. The new method uses a Monte Carlo simulation of the reflectance of the fundus to model how the spectral reflectance of the tissue varies with differing tissue histology. The model is parameterised by the concentrations of the five main absorbers found in the fundus: retinal haemoglobins, choroidal haemoglobins, choroidal melanin, RPE melanin and macular pigment. These parameters are shown to give rise to distinct variations in the tissue colouration. We use the results of the Monte Carlo simulations to construct an inverse model which maps tissue colouration onto the model parameters. This allows the concentration and distribution of the five main absorbers to be determined from suitable multi-spectral images. We propose the use of "image quotients" to allow this information to be extracted from uncalibrated image data. The filters used to acquire the images are selected to ensure a one-to-one mapping between model parameters and image quotients. To recover five model parameters uniquely, images must be acquired in six distinct spectral bands. Theoretical investigations suggest that retinal haemoglobins and macular pigment can be recovered with RMS errors of less than 10%. We present parametric maps showing the variation of these parameters across the posterior pole of the fundus. The results are in agreement with known tissue histology for normal healthy subjects. We also present an early result which suggests that, with further development, the technique could be used to successfully detect retinal haemorrhages.
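A minimal sketch of the image-quotient idea, with synthetic data: ratios between spectral bands suppress multiplicative factors common to the bands in an uncalibrated image, and six bands yield five quotients to match the five model parameters. Band values and image shapes are illustrative.

```python
# Illustrative computation of image quotients from a six-band image stack.
import numpy as np

def image_quotients(bands):
    """Return quotient images between consecutive spectral bands.

    bands: array of shape (n_bands, H, W), uncalibrated intensities.
    Six bands yield five quotients, matching the five model parameters.
    """
    bands = np.asarray(bands, dtype=float)
    eps = 1e-9                                   # guard against division by zero
    return bands[1:] / (bands[:-1] + eps)

rng = np.random.default_rng(0)
six_band_image = rng.uniform(0.1, 1.0, size=(6, 64, 64))
q = image_quotients(six_band_image)
print(q.shape)                                   # (5, 64, 64)
```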

Relevance: 30.00%

Abstract:

Background - It is well recognised that errors are more likely to occur during transitions of care, especially medicines errors. Clinic letters are used as a communication tool during a transition from hospital (outpatient clinics) to primary care (general practitioners). Little is known about medicines errors in clinic letters, as previous studies in this area have focused on medicines errors in inpatient or outpatient prescriptions. Published studies concerning clinic letters largely focus on perceptions of patients or general practitioners with respect to overall quality.

Purpose - To investigate medicines errors contained in outpatient clinic letters generated by prescribers within the Neurology Department of a specialist paediatric hospital in the UK.

Materials and methods - Single-site, retrospective, cross-sectional review of 100 clinic letters generated during March–July 2013 in response to an outpatient consultation. Clinic letters were conveniently selected from the most recent visit of each patient. An evaluation tool with a 10-point scale, where 10 was no error and 0 was significant error, was developed and refined throughout the study to facilitate identification and characterisation of medicines errors. The tool was tested for a relationship between scores and number of medicines errors using a regression analysis.

Results - Of 315 items related to neurology mentioned within the letters, 212 items were associated with 602 errors. Common missing information was allergy (97%, n = 97), formulation (60.3%, n = 190), strength/concentration (59%, n = 186) and weight (53%, n = 53). Ninety-nine letters were associated with at least one error. Scores were in the range 4–10, with 42% of letters scored as 7. Statistically significant relationships were observed between scores and number of medicines errors (R2 = 0.4168, p < 0.05) as well as between number of medicines and number of drug-related errors (R2 = 0.9719, p < 0.05).

Conclusions - Nearly all clinic letters were associated with medicines errors. The 10-point evaluation tool may be a useful device to categorise clinic letter errors.
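A minimal sketch of the reported regression check, using invented data in place of the study's letters: the 10-point score is regressed on the number of medicines errors per letter and R² is computed as quoted in the abstract.

```python
# Illustrative linear regression of letter score on number of medicines errors.
import numpy as np

errors_per_letter = np.array([0, 1, 2, 3, 4, 5, 6, 8, 10, 12])   # invented data
score = np.array([10, 9, 8, 8, 7, 7, 6, 5, 5, 4])                # invented 10-point scores

slope, intercept = np.polyfit(errors_per_letter, score, deg=1)
predicted = slope * errors_per_letter + intercept
ss_res = np.sum((score - predicted) ** 2)
ss_tot = np.sum((score - score.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

print(f"score ~ {slope:.2f} * errors + {intercept:.2f}, R^2 = {r_squared:.3f}")
```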