832 results for accuracy analysis


Relevance: 30.00%

Abstract:

This paper describes a biventricular model coupling the electrical and mechanical properties of the heart, and computer simulations of ventricular wall motion and deformation based on this model. In the constructed electromechanical model, the mechanical analysis was based on composite material theory and the finite-element method; the propagation of electrical excitation was simulated using an electrical heart model, and the resulting active forces were used to calculate ventricular wall motion. Regional deformation and Lagrangian strain tensors were calculated during the systole phase. Displacements, minimum principal strains and torsion angle were used to describe the motion of the two ventricles. The simulations showed that during systole, (1) the right ventricular free wall moves towards the septum while the base and middle of the free wall move towards the apex, which reduces the volume of the right ventricle; the minimum principal strain (E3) is largest at the apex, then at the middle of the free wall, and its direction approximately follows the epicardial muscle fibres; (2) the base and middle of the left ventricular free wall move towards the apex and the apex remains almost static; the torsion angle is largest at the apex; the minimum principal strain E3 is largest at the apex and its direction on the surface of the middle wall of the left ventricle roughly follows the fibre orientation. These results are in good accordance with results obtained from MR tagging images reported in the literature. This study suggests that such an electromechanical biventricular model has the potential to be used to assess the mechanical function of the two ventricles, and could also improve the accuracy of ECG simulation when used in heart-torso model-based body surface potential simulation studies.
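The minimum principal strain E3 reported above can be computed from the Green-Lagrange strain tensor once a deformation gradient is known; a minimal sketch (assuming F is given at a material point, using NumPy purely for illustration, not the paper's code):

```python
import numpy as np

def min_principal_strain(F):
    """Green-Lagrange strain E = (F^T F - I)/2 from deformation gradient F;
    returns the minimum principal strain E3 and its direction (unit vector)."""
    E = 0.5 * (F.T @ F - np.eye(3))
    w, v = np.linalg.eigh(E)      # eigenvalues in ascending order
    return w[0], v[:, 0]

# Illustrative case: uniaxial 10% shortening along x, so E3 = (0.9^2 - 1)/2 = -0.095
F = np.diag([0.9, 1.0, 1.0])
e3, d3 = min_principal_strain(F)
```

The eigenvector paired with the smallest eigenvalue gives the direction compared against fibre orientation in the abstract.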

Relevance: 30.00%

Abstract:

Theoretical analyses of air traffic complexity were carried out using the Method for the Analysis of Relational Complexity. Twenty-two air traffic controllers examined static air traffic displays and were required to detect and resolve conflicts. Objective measures of performance included conflict detection time and accuracy. Subjective perceptions of mental workload were assessed by a complexity-sorting task and subjective ratings of the difficulty of different aspects of the task. A metric quantifying the complexity of pair-wise relations among aircraft was able to account for a substantial portion of the variance in the perceived complexity and difficulty of conflict detection problems, as well as reaction time. Other variables that influenced performance included the mean minimum separation between aircraft pairs and the amount of time that aircraft spent in conflict.
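The minimum separation between aircraft pairs mentioned above can be illustrated with a closest-point-of-approach calculation; a sketch under the assumption of straight-line, constant-velocity trajectories (an illustrative simplification, not necessarily the study's metric):

```python
import math

def min_separation(p1, v1, p2, v2, horizon):
    """Minimum separation between two aircraft flying constant-velocity
    straight lines, over the look-ahead window [0, horizon]."""
    dp = [a - b for a, b in zip(p1, p2)]   # relative position
    dv = [a - b for a, b in zip(v1, v2)]   # relative velocity
    dv2 = sum(c * c for c in dv)
    # time of closest approach, clipped to the window
    t = 0.0 if dv2 == 0 else max(0.0, min(horizon, -sum(a * b for a, b in zip(dp, dv)) / dv2))
    return math.hypot(*[a + t * b for a, b in zip(dp, dv)])
```

A head-on pair at the same altitude reaches zero separation; a parallel pair keeps its lateral offset.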

Relevance: 30.00%

Abstract:

The Thames Estuary, UK, and the Brisbane River, Australia, are comparable in size and catchment area. Both are representative of the large and growing number of the world's estuaries associated with major cities. The principal differences between the two systems relate to climate and human population pressures. In order to assess the potential phytotoxic impact of herbicide residues in the estuaries, surface waters were analysed with a PAM fluorometry-based bioassay that employs the photosynthetic efficiency (photosystem II quantum yield) of laboratory-cultured microalgae as an endpoint measure of phytotoxicity. In addition, surface waters were chemically analysed for a limited number of herbicides. Diuron, atrazine and simazine were detected in both systems at comparable concentrations. In contrast, bioassay results revealed that whilst the detected herbicides accounted for the observed phytotoxicity of Brisbane River extracts with great accuracy, they consistently explained only around 50% of the phytotoxicity induced by Thames Estuary extracts. The unaccounted-for phytotoxicity in Thames surface waters is indicative of unidentified phytotoxins. The greatest phytotoxic response was measured at Charing Cross, Thames Estuary, and corresponded to a diuron equivalent concentration of 180 ng L-1. The study employs relative potencies (REP) of PSII-impacting herbicides and demonstrates that chemical analysis alone is prone to omission of valuable information. The results provide support for the incorporation of bioassays into routine monitoring programs, where bioassay data may be used to predict and verify chemical contamination data, alert to unidentified compounds and provide the user with information regarding the cumulative toxicity of complex mixtures. (c) 2005 Elsevier B.V. All rights reserved.
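A diuron equivalent concentration of the kind quoted above is conventionally a REP-weighted sum of detected herbicide concentrations; a minimal sketch, with the REP values below invented purely for illustration (not the study's calibrated potencies):

```python
def diuron_equivalents(concentrations, rep):
    """Diuron-equivalent concentration (ng/L) of a mixture: each herbicide's
    concentration weighted by its relative potency (REP, diuron = 1.0)."""
    return sum(concentrations[h] * rep[h] for h in concentrations)

# Hypothetical REP values and sample, for illustration only
rep = {"diuron": 1.0, "atrazine": 0.2, "simazine": 0.1}
sample = {"diuron": 50.0, "atrazine": 100.0, "simazine": 80.0}
deq = diuron_equivalents(sample, rep)   # 50 + 20 + 8 = 78 ng/L
```

Comparing such a chemically derived equivalent with the bioassay response is what reveals unaccounted-for phytotoxicity.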

Relevance: 30.00%

Abstract:

Government agencies responsible for riparian environments are assessing the combined utility of field survey and remote sensing for mapping and monitoring indicators of riparian zone health. The objective of this work was to determine if the structural attributes of savanna riparian zones in northern Australia can be detected from commercially available remotely sensed image data. Two QuickBird images and coincident field data covering sections of the Daly River and the South Alligator River - Barramundie Creek in the Northern Territory were used. Semi-variograms were calculated to determine the characteristic spatial scales of riparian zone features, both vegetative and landform. Interpretation of semi-variograms showed that structural dimensions of riparian environments could be detected and estimated from the QuickBird image data. The results also show that selecting the correct spatial resolution and spectral bands is essential to maximize the accuracy of mapping spatial characteristics of savanna riparian features. The distribution of foliage projective cover of riparian vegetation affected spectral reflectance variations in individual spectral bands differently. Pan-sharpened image data enabled small-scale information extraction (< 6 m) on riparian zone structural parameters. The semi-variogram analysis results provide the basis for an inversion approach using high spatial resolution satellite image data to map indicators of savanna riparian zone health.
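The semi-variograms described above can be illustrated with the standard empirical estimator along a one-dimensional image transect; a minimal sketch (not the authors' processing chain):

```python
import numpy as np

def semivariogram(values, lags):
    """Empirical 1-D semi-variogram along a transect of pixel values:
    gamma(h) = half the mean squared difference between samples lag h apart."""
    v = np.asarray(values, dtype=float)
    return {h: 0.5 * np.mean((v[h:] - v[:-h]) ** 2) for h in lags}
```

The lag at which gamma levels off (the range) indicates the characteristic spatial scale of the riparian feature being imaged, which is how the analysis links semi-variograms to structural dimensions.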

Relevance: 30.00%

Abstract:

Grid computing is an advanced technique for collaboratively solving complicated scientific problems using geographically and organisationally dispersed computational, data storage and other resources. Application of grid computing could provide significant benefits to all aspects of power system operation that involve computing. Based on our previous research, this paper presents a novel grid computing approach for probabilistic small signal stability (PSSS) analysis in electric power systems with uncertainties. A prototype computing grid was successfully implemented in our research lab to carry out PSSS analysis on two benchmark systems. Compared to traditional computing techniques, grid computing gave better performance for PSSS analysis in terms of computing capacity, speed, accuracy and stability. In addition, a computing grid framework for power system analysis is proposed based on this study.
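Independently of the grid implementation, probabilistic small signal stability can be illustrated as a Monte Carlo check on the eigenvalues of a sampled state matrix; the 2x2 system and its uncertain damping term below are purely illustrative assumptions, not the paper's benchmark systems:

```python
import numpy as np

def pr_small_signal_stable(sample_state_matrix, n_trials=500, seed=0):
    """Estimate the probability that all eigenvalues of the sampled state
    matrix A have negative real parts (small-signal stable)."""
    rng = np.random.default_rng(seed)
    stable = sum(
        np.max(np.linalg.eigvals(sample_state_matrix(rng)).real) < 0
        for _ in range(n_trials)
    )
    return stable / n_trials

def sample_A(rng):
    # Toy oscillator with an uncertain damping coefficient (illustrative only)
    d = rng.normal(0.5, 0.1)
    return np.array([[0.0, 1.0], [-1.0, -d]])

p = pr_small_signal_stable(sample_A)
```

Each trial is independent, which is exactly what makes the workload easy to distribute across a computing grid.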

Relevance: 30.00%

Abstract:

PURPOSE: To assess the repeatability of an objective image analysis technique for determining intraocular lens (IOL) rotation and centration. SETTING: Six ophthalmology clinics across Europe. METHODS: One hundred seven patients implanted with Akreos AO aspheric IOLs with orientation marks were imaged. Image quality was rated by a masked observer. The axis of rotation was determined from a line bisecting the IOL orientation marks. This was normalized for rotation of the eye between visits using the axis bisecting 2 consistent conjunctival vessels or iris features. The centers of ovals overlaid to circumscribe the IOL optic edge and the pupil or limbus were compared to determine IOL centration. Intrasession repeatability was assessed in 40 eyes and the variability of repeated analysis examined. RESULTS: Intrasession rotational stability of the IOL was ±0.79 degrees (SD) and centration was ±0.10 mm horizontally and ±0.10 mm vertically. Repeated analysis variability of the same image was ±0.70 degrees for rotation and ±0.20 mm horizontally and ±0.31 mm vertically for centration. Eye rotation (absolute) between visits was 2.23 ± 1.84 degrees (10% > 5 degrees rotation) using one set of consistent conjunctival vessels or iris features and 2.03 ± 1.66 degrees (7% > 5 degrees rotation) using the average of 2 sets (P = .13). Poorer image quality resulted in larger apparent absolute IOL rotation (r = -0.45, P < .001). CONCLUSIONS: Objective analysis of digital retroillumination images allows sensitive assessment of IOL rotation and centration stability. Eye rotation between images can lead to significant errors if not taken into account. Image quality is important to analysis accuracy.
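The normalization of IOL rotation for whole-eye rotation described above can be sketched as a difference of image axis angles; the point-pair representation below is an assumption for illustration, not the clinics' analysis software:

```python
import math

def axis_angle(p, q):
    """Axis (degrees, modulo 180) of the line through image points p and q."""
    return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0])) % 180.0

def iol_rotation(marks_a, marks_b, ref_a, ref_b):
    """Apparent IOL rotation between visits a and b, corrected by the
    rotation of a fixed ocular landmark axis (conjunctival vessels/iris)."""
    iol = axis_angle(*marks_b) - axis_angle(*marks_a)
    eye = axis_angle(*ref_b) - axis_angle(*ref_a)
    return iol - eye

# Marks rotated 5 degrees while the whole eye rotated 2 degrees -> net 3 degrees
a5 = (math.cos(math.radians(5)), math.sin(math.radians(5)))
a2 = (math.cos(math.radians(2)), math.sin(math.radians(2)))
net = iol_rotation(((0, 0), (1, 0)), ((0, 0), a5), ((0, 0), (1, 0)), ((0, 0), a2))
```

Without the landmark correction, the 2-degree eye rotation would be misread as IOL movement, which is the error source the study quantifies.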

Relevance: 30.00%

Abstract:

Respiration is a complex activity. If the relationship between all neurological and skeletomuscular interactions were perfectly understood, an accurate dynamic model of the respiratory system could be developed and the interaction between different inputs and outputs could be investigated in a straightforward fashion. Unfortunately, this is not the case and does not appear to be viable at this time. In addition, the provision of appropriate sensor signals for such a model would be a considerably invasive task. Useful quantitative information with respect to respiratory performance can be gained from non-invasive monitoring of chest and abdomen motion. Currently available devices are not well suited to spirometric measurement for ambulatory monitoring. A sensor matrix measurement technique is investigated to identify suitable sensing elements on which to base an upper body surface measurement device that monitors respiration. This thesis is divided into two main areas of investigation: model-based and geometry-based surface plethysmography. In the first instance, chapter 2 deals with an array of tactile sensors used as a progression of existing and previously investigated volumetric measurement schemes based on models of respiration. Chapter 3 details a non-model-based geometrical approach to surface (and hence volumetric) profile measurement. Later sections of the thesis concentrate upon the development of a functioning prototype sensor array. To broaden the application area, the study has been conducted as it would be for a generically configured sensor array. In experimental form, the system's performance on volume estimation compares favourably with existing systems, and in addition it provides continuous transient measurement of respiratory motion within an acceptable accuracy using approximately 20 sensing elements.
Because of the potential size and complexity of the system, it is possible to deploy it as a fully mobile ambulatory monitoring device which may be used outside the laboratory. It provides a means by which to isolate coupled physiological functions and thus allows individual contributions to be analysed separately, facilitating greater understanding of respiratory physiology and diagnostic capabilities. The outcome of the study is the basis for a three-dimensional surface contour sensing system that is suitable for respiratory function monitoring and has the prospect, with future development, of being incorporated into a garment-based clinical tool.
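One simple way to relate an array of surface sensors to lung volume (a generic sketch, not the thesis's actual model) is a least-squares calibration against a reference spirometer signal:

```python
import numpy as np

def calibrate(S, volume):
    """Least-squares weights mapping an (samples x sensors) displacement
    matrix S, plus an intercept, to a reference volume signal."""
    X = np.column_stack([S, np.ones(len(S))])
    w, *_ = np.linalg.lstsq(X, volume, rcond=None)
    return w

def predict(S, w):
    return np.column_stack([S, np.ones(len(S))]) @ w

# Synthetic 2-sensor example: volume = 1*s1 + 2*s2 + 3 (invented numbers)
S = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.], [2., 1.]])
vol = S @ np.array([1., 2.]) + 3.
w = calibrate(S, vol)
```

Once calibrated, the weighted sum runs continuously on sensor data alone, which is what enables ambulatory monitoring away from the spirometer.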

Relevance: 30.00%

Abstract:

Cellular mobile radio systems will be of increasing importance in the future. This thesis describes research work concerned with the teletraffic capacity and the computer control requirements of such systems. The work involves theoretical analysis and experimental investigations using digital computer simulation. New formulas are derived for the congestion in single-cell systems in which there are both land-to-mobile and mobile-to-mobile calls and in which mobile-to-mobile calls go via the base station. Two approaches are used: the first yields modified forms of the familiar Erlang and Engset formulas, while the second gives more complicated but more accurate formulas. The results of computer simulations to establish the accuracy of the formulas are described. New teletraffic formulas are also derived for the congestion in multi-cell systems. Fixed, dynamic and hybrid channel assignments are considered. The formulas agree with previously published simulation results. Simulation programs are described for the evaluation of the speech traffic of mobiles and for the investigation of a possible computer network for the control of the speech traffic. The programs were developed according to the structured programming approach, leading to programs of modular construction. Two simulation methods are used for the speech traffic: the roulette method and the time-true method. The first is economical but has some restrictions, while the second is expensive but gives comprehensive answers. The proposed control network operates at three hierarchical levels performing various control functions, which include: the setting-up and clearing-down of calls, the hand-over of calls between cells and the address-changing of mobiles travelling between cities. The results demonstrate the feasibility of the control network and indicate that small mini-computers inter-connected via voice-grade data channels would be capable of providing satisfactory control.
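The familiar Erlang formula mentioned above, in its Erlang B form for a single cell with blocked calls cleared, can be evaluated with the standard numerically stable recursion (the thesis's modified land-to-mobile/mobile-to-mobile formulas are not reproduced here):

```python
def erlang_b(traffic, channels):
    """Erlang B blocking probability for `traffic` erlangs offered to
    `channels` trunks, via the recursion
    B(0) = 1,  B(n) = A*B(n-1) / (n + A*B(n-1))."""
    b = 1.0
    for n in range(1, channels + 1):
        b = traffic * b / (n + traffic * b)
    return b

# 2 erlangs offered to 2 channels blocks 40% of call attempts
blocking = erlang_b(2.0, 2)
```

The recursion avoids the large factorials of the closed-form expression, which matters when evaluating congestion for cells with many channels.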

Relevance: 30.00%

Abstract:

In induction machines the tooth frequency losses due to permeance variation constitute a significant portion of the total loss. In order to predict and estimate these losses it is essential to obtain a clear understanding of the no-load distribution of the air gap magnetic field and the magnitude of flux pulsation in both stator and rotor teeth. The existing theories and methods by which the air gap permeance variation in a doubly slotted structure is calculated are either empirical or restricted. The main objective of this thesis is to obtain a detailed analysis of the no-load air gap magnetic field distribution and the effect of air gap geometry on the magnitude and waveform of the tooth flux pulsation. The detailed theoretical and experimental analysis of flux distribution in this thesis not only leads to a better understanding of the distribution of no-load losses but also provides a theoretical basis for calculating the losses with greater accuracy.
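Air gap permeance variation in a doubly slotted structure is often approximated, to first order, by one slotting harmonic per side; the sketch below uses this textbook-style approximation with arbitrary slot numbers and modulation depths chosen for illustration, not the thesis's analysis:

```python
import numpy as np

def gap_flux_density(theta, t, F=1.0, lam0=1.0, a_s=0.05, a_r=0.05,
                     Zs=36, Zr=28, omega=2 * np.pi * 25):
    """Flux density ~ MMF x permeance, with stator and rotor slotting each
    modulating the mean permeance lam0 by a single cosine harmonic.
    Zs, Zr: slot numbers; omega: rotor mechanical speed (rad/s)."""
    lam = lam0 * (1 + a_s * np.cos(Zs * theta)) \
               * (1 + a_r * np.cos(Zr * (theta - omega * t)))
    return F * lam

# Flux pulsation seen at a fixed stator angle as the rotor slots sweep past
t = np.linspace(0.0, 0.04, 1000)
b = gap_flux_density(0.0, t)
ripple = b.max() - b.min()
```

The ripple at the rotor tooth-passing frequency is the flux pulsation whose magnitude and waveform the thesis sets out to predict.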

Relevance: 30.00%

Abstract:

We assess the accuracy of the Visante anterior segment optical coherence tomographer (AS-OCT) and present improved formulas for measurement of surface curvature and axial separation. Measurements are made in physical model eyes. Accuracy is compared for measurements of corneal thickness (d1) and anterior chamber depth (d2) using built-in AS-OCT software versus the improved scheme. The improved scheme enables measurements of lens thickness (d3) and surface curvature, in the form of conic sections specified by vertex radii and conic constants. These parameters are converted to surface coordinates for error analysis. The built-in AS-OCT software typically overestimates (mean ± standard deviation [SD]) d1 by +62 ± 4 μm and d2 by +4 ± 88 μm. The improved scheme reduces d1 (-0.4 ± 4 μm) and d2 (0 ± 49 μm) errors while also reducing d3 errors from +218 ± 90 μm (uncorrected) to +14 ± 123 μm (corrected). Surface x coordinate errors gradually increase toward the periphery. Considering the central 6-mm zone of each surface, the x coordinate errors for the anterior and posterior corneal surfaces reached +3 ± 10 and 0 ± 23 μm, respectively, with the improved scheme. Those of the anterior and posterior lens surfaces reached +2 ± 22 and +11 ± 71 μm, respectively. Our improved scheme reduced AS-OCT errors and could, therefore, enhance pre- and postoperative assessments of keratorefractive or cataract surgery, including measurement of accommodating intraocular lenses. © 2007 Society of Photo-Optical Instrumentation Engineers.
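The conic sections specified by vertex radii and conic constants correspond to the standard sag equation used throughout ophthalmic optics; a minimal sketch of converting those parameters to surface coordinates:

```python
import math

def conic_sag(x, R, Q):
    """Sag z(x) of a conic surface with vertex radius R and conic constant Q:
    z = x^2 / (R * (1 + sqrt(1 - (1 + Q) * x^2 / R^2)))."""
    return x * x / (R * (1 + math.sqrt(1 - (1 + Q) * x * x / (R * R))))

# Q = 0 reproduces a sphere: the sag equals R - sqrt(R^2 - x^2)
z = conic_sag(2.0, 8.0, 0.0)
```

Evaluating the sag across the measurement zone yields the (x, z) surface coordinates against which the abstract's peripheral errors are assessed.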

Relevance: 30.00%

Abstract:

Distributed source analyses of half-field pattern onset visual evoked magnetic responses (VEMR) were carried out by the authors with a view to locating the source of the largest of the components, the CIIm. The analyses were performed using a series of realistic source spaces taking into account the anatomy of the visual cortex. Accuracy was enhanced by constraining the source distributions to lie within the visual cortex only. Further constraints on the source space yielded reliable, but possibly less meaningful, solutions.
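Distributed source analysis with a source space restricted to the visual cortex can be sketched as a regularised minimum-norm estimate over the allowed source locations only; the lead field and regularisation choice below are illustrative assumptions, not the authors' method:

```python
import numpy as np

def minimum_norm(L, b, lam=1e-2):
    """Tikhonov-regularised minimum-norm source estimate x for measurements
    b = L x + noise. Restricting the columns of the lead field L to permitted
    locations (e.g. visual cortex only) is what constrains the solution."""
    G = L @ L.T
    n = len(G)
    return L.T @ np.linalg.solve(G + lam * (np.trace(G) / n) * np.eye(n), b)

# Tiny illustrative lead field: 2 sensors, 3 candidate cortical sources
L = np.array([[1.0, 0.0, 0.5],
              [0.0, 1.0, 0.5]])
b = np.array([1.0, 0.0])
x_hat = minimum_norm(L, b, lam=1e-8)
```

Adding further constraints simply removes more columns from L, which tends to stabilise the fit at the risk of less meaningful solutions, as the abstract notes.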

Relevance: 30.00%

Abstract:

The work described in this thesis deals with the development and application of a finite element program for the analysis of several cracked structures. In order to simplify the organisation of the material presented herein, the thesis has been subdivided into two Sections. In the first Section, the development of a finite element program for the analysis of two-dimensional problems of plane stress or plane strain is described. The element used in this program is the six-node isoparametric triangular element, which permits the accurate modelling of curved boundary surfaces. Various cases of material anisotropy are included in the derivation of the element stiffness properties. A digital computer program is described and examples of its application are presented. In the second Section, on fracture problems, several cracked configurations are analysed by embedding into the finite element mesh a sub-region containing the singularities, over which an analytic solution is used. The modifications necessary to augment a standard finite element program, such as that developed in Section I, are discussed and complete programs for each cracked configuration are presented. Several examples are included to demonstrate the accuracy and flexibility of the technique.

Relevance: 30.00%

Abstract:

A detailed literature survey confirmed cold roll-forming to be a complex and little understood process. In spite of its growing value, the process remains largely un-automated, with few principles used in the set-up of the rolling mill. This work concentrates on experimental investigations of operating conditions in order to gain a scientific understanding of the process. The operating conditions are: inter-pass distance, roll load, roll speed and horizontal roll alignment. Fifty tests have been carried out under varied operating conditions, measuring section quality and longitudinal straining to give a picture of bending. A channel section was chosen for its simplicity and compatibility with previous work. Quality was measured in terms of vertical bow, twist and cross-sectional geometric accuracy, and a complete method of classifying quality has been devised. The longitudinal strain profile was recorded by the use of strain gauges attached to the strip surface at five locations. Parameter control is shown to be important in allowing consistency in section quality. At present, rolling mills are constructed with large tolerances on operating conditions. By reduction of the variability in parameters, section consistency is maintained and mill down-time is reduced. Roll load, alignment and differential roll speed are all shown to affect quality, and can be used to control it. Set-up time is reduced by improving the design of the mill so that parameter values can be measured and set without the need for judgment by eye. Values of parameters can be guided by models of the process, although elements of experience are still unavoidable. Despite increased parameter control, section quality is variable, if only due to variability in strip material properties. Parameters must therefore be changed during rolling; ideally this can take place by closed-loop feedback control. Future work lies in overcoming the problems connected with this control.

Relevance: 30.00%

Abstract:

The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. An extension is made in this thesis to the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised models have been developed to cope with the planning of productivity and productivity growth with reference to the changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution: two of them deterministic, called sensitivity analysis and deterministic appraisal, and the third stochastic, called risk simulation. The models are designed to be flexible and can be adjusted according to the available computer capacity, expected accuracy and presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and the existence of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required.
The results of applying these measurements and planning models to the British motor vehicle manufacturing companies are presented and discussed.
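The core step of a risk-simulation model of this kind, sampling independent, normally distributed component variables and forming the added-value productivity index, can be sketched as follows; the figures are invented for illustration and are not British Leyland data:

```python
import numpy as np

def simulate_productivity(sales, bought_in, employment_costs, n=10000, seed=1):
    """Monte Carlo distribution of the added-value productivity index
    (sales - bought-in materials/services) / employment costs.
    Each input is a (mean, sd) pair; variables are sampled as independent
    normals, matching the stochastic model's stated assumptions."""
    rng = np.random.default_rng(seed)
    s = rng.normal(*sales, n)
    b = rng.normal(*bought_in, n)
    e = rng.normal(*employment_costs, n)
    return (s - b) / e     # added value per unit of employment cost

# Illustrative inputs (mean, sd), e.g. in millions of pounds
p = simulate_productivity((100.0, 5.0), (60.0, 4.0), (25.0, 1.0))
```

The resulting distribution, rather than a single index value, is what supports planning against a class interval of productivity outcomes.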

Relevance: 30.00%

Abstract:

The finite element method is now well established among engineers as being an extremely useful tool in the analysis of problems with complicated boundary conditions. One aim of this thesis has been to produce a set of computer algorithms capable of efficiently analysing complex three-dimensional structures. This set of algorithms has been designed to permit much versatility. Provisions such as the use of only those parts of the system which are relevant to a given analysis, and the facility to extend the system by the addition of new elements, are incorporated. Five element types have been programmed; these are: prismatic members, rectangular plates, triangular plates and curved plates. The 'in and out of plane' stiffness matrices for a curved plate element are derived using the finite element technique. The performance of this type of element is compared with two other theoretical solutions as well as with a set of independent experimental observations. Additional experimental work was then carried out by the author to further evaluate the acceptability of this element. Finally, the analysis of two large civil engineering structures, the shell of an electrical precipitator and a concrete bridge, is presented to investigate the performance of the algorithms. Comparisons are made between the computer time, core store requirements and the accuracy of the analysis for the proposed system and those of another program.
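The assembly step shared by all the element types above can be sketched as a scatter-add of element stiffness matrices into the global matrix; the 1-D bar elements below are an illustrative stand-in for the thesis's member and plate elements:

```python
import numpy as np

def assemble(n_dof, elements):
    """Assemble a global stiffness matrix by scatter-adding each element's
    stiffness matrix into the rows/columns of its global DOF numbers."""
    K = np.zeros((n_dof, n_dof))
    for dofs, ke in elements:
        K[np.ix_(dofs, dofs)] += ke
    return K

# Two unit-stiffness bar elements in series across 3 DOF
ke = np.array([[1.0, -1.0],
               [-1.0, 1.0]])
K = assemble(3, [([0, 1], ke), ([1, 2], ke)])
```

Because only the element list changes, adding a new element type to such a system means supplying its stiffness matrix and DOF mapping, which is the extensibility the thesis's design aims for.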